4 December 2021

Deep Learning Based Automatic Multiclass Wild Pest Monitoring Approach Using Hybrid Global and Local Activated Features

Liu Liu, University of Science and Technology of China, Hefei, China

Chengjun Xie, Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei, China

Rujing Wang, Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei, China

Po Yang, Department of Computer Science, University of Sheffield, Sheffield, U.K.

Sud Sudirman, Department of Computer Science, Liverpool John Moores University, Liverpool, U.K.

Jie Zhang, Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei, China

Rui Li, Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei, China

Fangyuan Wang, University of Science and Technology of China, Hefei, China

Read the full paper

What is this paper about?

Specialised pest control has been a high-priority issue for the agriculture industry in many countries. A popular solution is the use of artificial intelligence (AI) techniques for automated, image-based identification of pests. However, these solutions suffer from reduced accuracy and robustness in real-world applications due to the multiplicity of crops and the variety of pests. To tackle this problem, this article proposes a novel deep learning based automatic approach using hybrid global and local activated features for pest monitoring. In the presented method, we exploit the global information from feature maps to build a global activated feature pyramid network that extracts pests' highly discriminative features across various scales, over both depth and position levels. The experimental results show that our solution achieves 75.03% mean average precision (mAP) in industrial circumstances, outperforming two state-of-the-art methods: Faster R-CNN (mAP up to 70%) and the feature pyramid network (mAP up to 72%).
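
The global activation described above draws on globally pooled feature-map statistics to emphasise discriminative channels. As a rough illustration only (not the paper's exact architecture, and all names below are illustrative), a minimal NumPy sketch of re-weighting feature-map channels by their global descriptors might look like:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def global_channel_activation(feature_map):
    """Re-weight each channel of a (C, H, W) feature map using its
    globally pooled statistic -- a simplified stand-in for applying
    global activation to one level of a feature pyramid."""
    c, h, w = feature_map.shape
    # Global average pooling: one scalar descriptor per channel.
    descriptor = feature_map.mean(axis=(1, 2))      # shape (C,)
    # Squash descriptors to (0, 1) gates, broadcast over spatial dims.
    gates = sigmoid(descriptor).reshape(c, 1, 1)
    # Channels with stronger global responses are emphasised.
    return feature_map * gates

fmap = np.random.randn(8, 16, 16)
out = global_channel_activation(fmap)
print(out.shape)  # (8, 16, 16)
```

In a full detector this kind of gating would be applied at every pyramid level before the region-proposal and detection heads, so that small pests with weak local evidence still benefit from global context.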

Why is the research important and/or novel?

To the best of our knowledge, the two-stage CNN-based pest monitoring approach using hybrid global and local activated features proposed in our article is one of the best automatic wheat pest recognition models in the world. One key contribution is our design of a global activated feature pyramid network that identifies features of tiny pests robust to rotation, scale and translation, and extracts intuitive pest features from complex backgrounds. The model was also trained and evaluated on one of the largest multi-class wheat pest datasets in the world, containing 88.6K images (16 types) with 582K labelled pest objects. This demonstrates that a deep learning-based pest recognition model can serve as a cost-effective solution for practical pest control applications. Researchers from the Wadhwani Institute for Artificial Intelligence in India extended our model to support cotton pest management, and received a $2M USD grant from Google to create technologies that help reduce crop losses in cotton farming through integrated pest management.

Anything else that you would like to highlight about the paper?

An earlier conference version of this work received the Best Paper Award at the 19th IEEE International Conference on Industrial Informatics (ranked first out of 428 papers). Through this work, Sheffield has successfully secured two InnovateUK farming innovation projects: “Integrating Visual and Context Information into a Mobile Intelligence Solution for Sustainable Management of Wheat Pests and Soil Health” with RSK ADAS and Mutus-Tech, and “Farmer-centred Interoperable Mobile-Cloud System: Integrating Data from Farming Activities and Environmental Information for Sustainable Fertiliser Management” with Velcourt and AntData.
