Leaf Classification for Crop Pests and Diseases in the Compressed Domain

Sensors (Basel). 2022 Dec 21;23(1):48. doi: 10.3390/s23010048.

Abstract

Crop pests and diseases are a major cause of reduced food production and a serious threat to food security, so detecting them efficiently and accurately is both urgent and important. Traditional neural networks must process the complete image data, whereas with compressed sensing only a fraction of the measurements needs to be processed, greatly reducing the amount of data handled by the network. In this paper, compressed sensing is combined with neural networks to classify and identify pest images in the compressed domain. A network model for compressed sampling and classification, CSBNet, is proposed, in which a learned network layer performs the compression in place of the sensing matrix of conventional compressed sensing (CS). Unlike traditional CS, no reconstruction of the image is performed; recognition is carried out directly in the compressed domain, and an attention mechanism is added to strengthen the extracted features. The experiments in this paper were conducted on different datasets at various sampling rates. With substantially fewer trainable parameters than the compared models, our model reached a maximum accuracy of 96.32%, higher than the 93.01%, 83.58%, and 87.75% achieved by the other models at a sampling rate of 0.7.
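
As a rough illustration of the idea described above (not the authors' published implementation), the sketch below shows a hypothetical PyTorch module in which a strided convolution plays the role of the learned sensing matrix, a squeeze-and-excitation-style channel attention block reweights the compressed features, and a classifier head operates directly on the compressed measurements with no reconstruction step. All class names, layer sizes, the attention design, and the way the sampling rate sets the number of measurements are assumptions for the sketch.

```python
# Hypothetical sketch of a compressed-domain classifier in the spirit of
# CSBNet: a learned sampling layer replaces the CS sensing matrix, and
# classification happens on the compressed measurements (no reconstruction).
# All names, layer sizes, and the attention design are illustrative assumptions.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style attention to reweight compressed features."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(x.mean(dim=(2, 3)))  # global average pool -> channel weights
        return x * w.view(b, c, 1, 1)


class CompressedDomainClassifier(nn.Module):
    def __init__(self, num_classes: int, block: int = 32, sampling_rate: float = 0.7):
        super().__init__()
        # Learned sampling: a strided conv emits roughly
        # sampling_rate * block * block measurements per image block,
        # standing in for the random sensing matrix of conventional CS.
        m = max(1, round(sampling_rate * block * block))
        self.sampling = nn.Conv2d(3, m, kernel_size=block, stride=block, bias=False)
        self.attention = ChannelAttention(m)
        # Classifier head works directly on the compressed measurements.
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(m, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.sampling(x)  # compressed measurements; no reconstruction step
        y = self.attention(y)
        return self.head(y)


# Example: classify a batch of 224x224 leaf images into 10 pest/disease classes.
model = CompressedDomainClassifier(num_classes=10, sampling_rate=0.7)
logits = model(torch.randn(2, 3, 224, 224))
print(logits.shape)  # torch.Size([2, 10])
```

Because the sampling layer is trained jointly with the classifier, the compressed representation can be optimized for recognition rather than for faithful image recovery, which is the design choice the abstract highlights.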

Keywords: agricultural images; compressed domain; image classification; neural networks.

MeSH terms

  • Data Compression* / methods
  • Neural Networks, Computer
  • Plant Leaves

Grants and funding

This research was funded by the National Natural Science Foundation of China (Grant Nos. 61861021 and 61863027), the Jiangxi Natural Science Foundation (Grant No. 20224BAB202038), and the Science and Technology Project of the Education Department of Jiangxi Province (Grant Nos. GJJ190194 and GJJ200426).