TY - JOUR
T1 - Species-level detection of thrips and whiteflies on yellow sticky traps using YOLO-based deep learning detection models
AU - Laekeman, Broes
AU - Bonte, Jochem
AU - Dermauw, Wannes
AU - Christiaens, Annelies
AU - Gobin, Bruno
AU - Van Huylenbroeck, Johan
AU - Dhooghe, Emmy
AU - Lootens, Peter
PY - 2025/11/18
Y1 - 2025/11/18
N2 - Pest insects such as thrips and whiteflies currently cause the loss of 20%-40% of global agricultural yield. To reduce chemical pesticide use while maintaining high-quality horticultural standards, early detection of pest infestations is essential. Although AI-assisted pest monitoring systems using sticky trap images exist, none currently enables effective species-level detection of thrips and/or whiteflies. Early species-level identification, however, would allow more targeted, species-specific control strategies, leading to reduced, localized, and more efficient pesticide application. In this study, we therefore evaluated the potential and limitations of real-time species-level detection of thrips (Frankliniella occidentalis and Echinothrips americanus) and whiteflies (Bemisia tabaci and Trialeurodes vaporariorum) using non-microscopic RGB yellow sticky trap images and recent YOLO-based deep learning detection models. To this end, a balanced, labelled image dataset of the studied pest species caught on one type of yellow sticky trap was gathered. Various versions of the YOLO11 and YOLO-NAS detection model architectures were then trained and tested on this dataset at several (digitally reduced) pixel resolutions. All models tested on the high-resolution dataset (pixel size: 5 µm) achieved species-level detection of the studied pests on an independent test dataset (mAP@50: 79%-89% | F1@50: 74%-87%). Even the smallest model (YOLO11n) delivered feasible macro-averaged (mAP@50: 80% | F1@50: 77%) and classwise performance scores (AP@50: 72%-85% | F1@50: 68%-82%). The minimum pixel resolution required for feasible species-level detection in greenhouse horticulture was identified as 80 µm for both the YOLO11n and YOLO11x models, enabling the use of modern smartphones, action cameras, or low-cost standalone camera modules. Combined with the low complexity and solid performance of the YOLO11n model, these results demonstrate the potential of feasible, real-time, automated species-level monitoring of (yellow) sticky traps in greenhouse horticulture. Future research should focus on extending this technology to additional pest species, sticky trap types, and ambient light conditions.
KW - B390-crop-protection
KW - Frankliniella occidentalis
KW - Echinothrips americanus
KW - Bemisia tabaci
KW - Trialeurodes vaporariorum
KW - automated pest monitoring
KW - integrated pest management (IPM)
KW - smart traps
KW - artificial intelligence (AI)
KW - B390-horticulture
KW - B432-ornamental-plants
UR - https://www.mendeley.com/catalogue/84644676-0e69-3930-8343-3fc19a09b61d/
U2 - 10.3389/fpls.2025.1668795
DO - 10.3389/fpls.2025.1668795
M3 - A1: Web of Science-article
SN - 1664-462X
VL - 16
JO - Frontiers in Plant Science
JF - Frontiers in Plant Science
ER -