Detection and Disinfestation of Diseased Plants with YOLO Based ANFIS Controlled Unmanned Ground Vehicle

Yılmaz S., Polat D., Akboynuz E. B., Gedikli E. A., Yılmaz Z.

International Journal of Multidisciplinary Studies and Innovative Technologies, vol. 9, no. 1, pp. 151-161, 2025 (Peer-Reviewed Journal)

Abstract

Plant diseases remain a major challenge in modern agriculture, causing considerable reductions in both yield and crop quality. This study focuses on the development of an intelligent unmanned ground vehicle (UGV) capable of detecting plant diseases in real time and autonomously responding through targeted spraying. A camera mounted on the UGV continuously captures images of crop rows, and disease detection is carried out using the YOLO (You Only Look Once) algorithm, chosen for its speed and accuracy in real-time object recognition. To evaluate model performance, YOLOv7, YOLOv8, and YOLOv9 were trained on datasets of potato leaf diseases, including early and late blight. The YOLOv8 model was selected for deployment on a Raspberry Pi 4B based on its superior detection accuracy. Additionally, a servo motor-enhanced vision system was implemented to broaden the camera's coverage.
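As a rough illustration of the detection loop described above, the following sketch assumes the ultralytics YOLOv8 Python API, OpenCV frame capture, and the gpiozero library for the pan servo. The weight file, GPIO pin, class names, and sweep angles are hypothetical placeholders, not values reported in the study.

```python
# Minimal sketch of a servo-swept YOLOv8 detection loop on a Raspberry Pi.
# Assumptions: ultralytics YOLOv8 API, OpenCV camera capture, gpiozero servo
# on GPIO 18; weights, class names, and angles are illustrative only.
import time
import cv2
from ultralytics import YOLO
from gpiozero import AngularServo

model = YOLO("potato_leaf_yolov8n.pt")            # hypothetical trained weights
servo = AngularServo(18, min_angle=-60, max_angle=60)
camera = cv2.VideoCapture(0)

SWEEP_ANGLES = [-45, 0, 45]                       # pan the camera to widen coverage

def detect_disease(frame):
    """Run YOLOv8 on one frame; return True if a diseased-leaf class is detected."""
    results = model(frame, verbose=False)[0]
    names = results.names
    return any(names[int(c)] in ("early_blight", "late_blight")
               for c in results.boxes.cls)

while True:
    for angle in SWEEP_ANGLES:
        servo.angle = angle
        time.sleep(0.3)                           # let the servo settle before capturing
        ok, frame = camera.read()
        if ok and detect_disease(frame):
            print(f"Infected leaf detected at pan angle {angle} deg")
            # a targeted-spray trigger would be activated here
```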

The UGV's autonomous driving is enabled by a combination of five ultrasonic sensors and an ANFIS (Adaptive Neuro-Fuzzy Inference System)-based decision-making module, which governs navigation and motion planning. As the vehicle traverses the field, the onboard system identifies infected plants and activates a localized spraying mechanism to treat only the affected areas. This integrated approach significantly reduces pesticide use, minimizes environmental harm, and lowers the dependency on manual labor. The results demonstrate a promising application of artificial intelligence and embedded systems for sustainable and efficient disease management in precision agriculture.
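The following simplified sketch illustrates how an ANFIS-style (zero-order Sugeno) controller could map the five ultrasonic distances to a steering command via rule firing strengths and a weighted-average defuzzification. The membership functions, rule set, and consequent values shown are illustrative assumptions; in practice the ANFIS parameters would be trained offline and only evaluated on the vehicle.

```python
# Illustrative zero-order Sugeno (ANFIS-style) steering from five ultrasonic
# distances (cm). Membership parameters and rule consequents are placeholders,
# not the trained values from the study.
import math

def gauss(x, center, sigma):
    """Gaussian membership degree of x."""
    return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

def near(d): return gauss(d, 20.0, 15.0)     # obstacle close
def far(d):  return gauss(d, 120.0, 50.0)    # path clear

def anfis_steer(front, front_left, front_right, left, right):
    """Return a steering command: negative = turn left, positive = turn right."""
    # Rule firing strengths (product T-norm) paired with constant consequents
    rules = [
        (near(front) * far(left),  -1.0),    # obstacle ahead, left clear -> turn left
        (near(front) * far(right), +1.0),    # obstacle ahead, right clear -> turn right
        (near(front_left),         +0.5),    # obstacle on front-left -> veer right
        (near(front_right),        -0.5),    # obstacle on front-right -> veer left
        (far(front),                0.0),    # path clear -> go straight
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules) or 1e-6
    return num / den                          # weighted-average defuzzification

# Example: obstacle 25 cm ahead, right side open -> positive (turn right)
print(anfis_steer(front=25, front_left=90, front_right=140, left=60, right=150))
```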