A smart trap classifies mosquitoes by sex and genus based on their buzzing sound
The trap uses an optical sensor with an artificial intelligence system trained by recording the flight of more than 4,300 mosquitoes over two years
The trap has proven effective with tiger and common house mosquitoes, two species of particular public health concern because they can spread diseases such as dengue and West Nile fever
Developed by IRTA and the company Irideon, the trap is unique in terms of its technical capabilities and will speed up the mosquito-borne disease surveillance and control process
In conjunction with the company Irideon S.L., the specialized entomology and arbovirus team from the Animal Health Research Centre (CReSA) of the Institute of Agrifood Research and Technology (IRTA) has developed a prototype optical sensor that, when coupled to a trap, automatically classifies captured mosquitoes by genus and sex with an unprecedented level of accuracy. Published recently in the journal Parasites & Vectors, the results of laboratory tests show the sensor to be capable of distinguishing between mosquitoes of the Aedes and Culex genera with an accuracy of 94.2%. In the same study, conducted as part of the European VECTRACK project, the sensor also distinguished between male and female Aedes mosquitoes with an accuracy of 99.4%, and between male and female Culex mosquitoes without a single error.
The researchers worked with the Aedes and Culex genera because they include two of the species currently causing most concern in the European public health and mosquito-borne virus surveillance arena. They are the tiger mosquito (Aedes albopictus), an urban mosquito that can carry viruses such as dengue, Zika and chikungunya; and the common house mosquito (Culex pipiens), which lives in urban, rural and wetland environments and can spread diseases such as West Nile fever. “The priority is for sensors to identify the females of each species, because it is the females that can bite people and transmit viruses,” says Sandra Talavera, IRTA-CReSA researcher and leader of the VECTRACK project.
Over a two-year period, scientists in the IRTA-CReSA laboratory recorded the flight of more than 4,300 individual tiger and common house mosquitoes, which were reared on the premises, to train the optical sensor and enable it to detect the mosquitoes’ wingbeat frequency — basically their characteristic buzzing sound — in hertz (Hz). The wingbeat frequency of mosquitoes can range from 300 to 900 Hz. “We have worked on the basis of wingbeat frequency because it is very particular to each species and varies depending on a mosquito’s sex and other biological factors, including size, age and mating behaviour, as well as on environmental factors such as temperature,” explains Maria Isabel González, IRTA-CReSA predoctoral researcher and first author of the study. Previously available optical sensors have been capable of distinguishing between mosquitoes and other insects and counting specimens, but not of identifying the species, sex or other characteristics of each mosquito.
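To illustrate the idea of extracting a wingbeat frequency from a recorded signal, the following sketch estimates the dominant frequency of a short simulated trace with a Fourier transform. It is a toy illustration, not the project's actual pipeline; the sampling rate and the 500 Hz wingbeat are invented values chosen to fall inside the 300–900 Hz range mentioned above.

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the strongest frequency component (Hz) of a 1-D signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[0] = 0.0  # ignore the DC component (the steady light level)
    return freqs[np.argmax(spectrum)]

# Simulate a 30 ms recording: a light level modulated by a 500 Hz wingbeat.
sample_rate = 8000                      # Hz, hypothetical sensor sampling rate
t = np.arange(240) / sample_rate        # 240 samples = 30 ms at 8 kHz
recording = 1.0 + 0.2 * np.sin(2 * np.pi * 500 * t)

print(dominant_frequency(recording, sample_rate))  # ≈ 500 Hz
```

Even with only 30 ms of signal, a wingbeat in the hundreds of hertz completes enough cycles for a clear spectral peak.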
The sensor uses wing movement shadows
When a mosquito flies close to the trap’s inlet tube, it can be pulled into the trap by a suction fan. When that happens, the sensor detects the insect by means of an optical panel that emits light and another that receives it. As the mosquito crosses the sensing zone, it casts a shadow on the optical receiver. When it flaps its wings, the light falling on the receiver is modulated, causing changes in the amplitude of the light waveform recorded by the sensor. “Using the Python programming language, the sensor translates the optical signals into acoustic signals,” states João Encarnação, chief business officer of Irideon S.L.
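A minimal sketch of that optical-to-acoustic step, assuming the sensor delivers the received light intensity as a plain array of samples: centre the trace, scale it to the 16-bit audio range, and write it out as a mono WAV file that can be played back. The sampling rate, trace and filename are illustrative, not details from the study.

```python
import wave
import numpy as np

def optical_to_wav(samples, sample_rate, path):
    """Write a 1-D optical-intensity trace as a mono 16-bit WAV file,
    so the wingbeat modulation can be heard as sound."""
    centred = samples - np.mean(samples)          # remove the steady light level
    peak = max(np.max(np.abs(centred)), 1e-12)    # avoid division by zero
    pcm = np.int16(centred / peak * 32767)        # scale into 16-bit range
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)        # mono
        wav.setsampwidth(2)        # 16-bit samples
        wav.setframerate(sample_rate)
        wav.writeframes(pcm.tobytes())

# Hypothetical 30 ms trace: light modulated at 500 Hz by the wingbeat.
sample_rate = 8000
t = np.arange(240) / sample_rate
trace = 1.0 + 0.2 * np.sin(2 * np.pi * 500 * t)
optical_to_wav(trace, sample_rate, "mosquito.wav")
```

Played back, such a file sounds like the mosquito's buzz, since the light modulation and the sound share the same fundamental frequency.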
The mosquito flight recordings are 30 milliseconds long on average and can be downloaded from the sensor as playable, viewable audio files. To train the sensor, information from the recordings was combined with machine learning algorithms.
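As a toy stand-in for that training step, the sketch below reduces each recording to a single feature (its dominant wingbeat frequency) and assigns new recordings with a nearest-centroid rule. The frequency values are invented for illustration and are not measurements from the study, whose actual models are far richer than this.

```python
import numpy as np

# Illustrative training data: dominant wingbeat frequency (Hz) per recording,
# grouped by class. These numbers are hypothetical, not from the study.
training = {
    "Aedes female": [450, 470, 460],
    "Aedes male":   [700, 720, 710],
    "Culex female": [350, 360, 340],
    "Culex male":   [550, 560, 540],
}

# "Training" here is just computing the mean frequency of each class.
centroids = {label: np.mean(freqs) for label, freqs in training.items()}

def classify(wingbeat_hz):
    """Assign a recording to the class whose mean frequency is closest."""
    return min(centroids, key=lambda label: abs(centroids[label] - wingbeat_hz))

print(classify(465))  # → "Aedes female" under these illustrative centroids
```

Real classifiers use many features beyond a single frequency (harmonics, signal shape, environmental covariates), which is how the published system reaches its reported accuracy.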
A technological revolution in mosquito surveillance
With diseases transmitted by mosquitoes killing more than 700,000 people across the planet every year, swift species identification is essential when cases of such illnesses arise. Identification is carried out by professional entomologists through observation of insect morphology. “It is a very painstaking and urgent task, especially in emergencies when you are racing against time against the possible expansion of a mosquito-borne virus,” Talavera remarks.
It is hoped that the artificial intelligence system will enable the traps to perform remote mosquito identification in real time and immediately send results to the competent authorities to help them make decisions, thus speeding up the mosquito-borne virus surveillance and control process. One advantage is that it will be possible to prioritize the use of human resources and avoid sending people to perform an on-site entomological inspection if the species of interest has not been detected. For example, in the entomological inspection triggered by a suspected case of dengue, nobody needs to travel to the site to perform sampling if a trap at the relevant location shows there are no tiger mosquitoes present. Remote connectivity will also be useful for general analyses of trends and of the risk of mosquito-borne disease transmission.
The last few years have seen the launch of numerous studies aimed at developing technologies based on identifying the wingbeat of mosquitoes, sometimes in combination with body shape, but achieving a high level of sensor accuracy is challenging. “For now, our study has shown the optical sensor to be reliable in controlled laboratory conditions,” says Talavera. “Soon, though, we will have the results of tests carried out in the field, where environmental conditions vary and can have an influence,” she concludes.
González-Pérez, M.I., Faulhaber, B., Williams, M. et al. A novel optical sensor system for the automatic classification of mosquitoes by genus and sex with high levels of accuracy. Parasites Vectors 15, 190 (2022). https://doi.org/10.1186/s13071-022-05324-5