Static laser weeding system based on improved YOLOv8 and image fusion

Published: 3 October 2024


Laser weeding is a promising method for weed management in organic agriculture. However, complex field environments lead to low weed detection accuracy, which makes it difficult to meet the requirements of high-precision laser weed control. To overcome this challenge and enable precise weeding by laser weeding robots in complex fields, this study proposes a dual-mode image fusion algorithm that combines visible-light (RGB) and near-infrared (NIR) images based on machine vision. By introducing infrared information into visible-light images, this approach improves weed detection accuracy and robustness to environmental factors. A new weed detection model for locating weed meristems is built on YOLOv8 by introducing the Swin Transformer module and the Slim-neck module. Experimental results show that, for fused images with a resolution of 640×640, the dual-scale fusion of RGB and NIR images on the improved network achieves a mean average precision (mAP) of 96.0% and a detection accuracy of 94.0%. This study also builds a laser weeding robot consisting of a mobile platform, a weed recognition module, and a laser galvanometer emission module. After the weed detection model is deployed on the robot platform, the ROS system is used to detect weeds and determine the positions of their geometric centers. In the weed detection and laser irradiation experiments, the laser galvanometer deflects the beam accurately to the weed growth position. The results show a weed detection accuracy of 82.1% and a laser weeding efficiency of 72.3%, demonstrating the feasibility of the proposed laser weeding method. Nevertheless, the fusion strategy for the two image modalities still leaves considerable room for improvement in detection accuracy and efficiency.
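The dual-scale fusion of RGB and NIR images mentioned in the abstract can be illustrated with a generic two-scale (base/detail) decomposition. This is a minimal sketch, not the paper's exact algorithm: the mean-filter base extraction, the averaging rule for base layers, and the max-magnitude rule for detail layers are all assumptions for illustration.

```python
import numpy as np

def box_blur(img, size):
    """Mean filter with edge padding (simple low-pass base extraction)."""
    pad = size // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    # accumulate all shifted windows, then normalize by window area
    for dy in range(size):
        for dx in range(size):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

def two_scale_fuse(vis, nir, size=31):
    """Fuse a grayscale visible image and a NIR image (same-shape float
    arrays in [0, 1]) by splitting each into a low-frequency base layer
    and a high-frequency detail layer, then recombining."""
    base_v = box_blur(vis, size)   # low-frequency base layers
    base_n = box_blur(nir, size)
    det_v = vis - base_v           # high-frequency detail layers
    det_n = nir - base_n
    fused_base = 0.5 * (base_v + base_n)                 # average bases
    fused_det = np.where(np.abs(det_v) >= np.abs(det_n),  # keep stronger detail
                         det_v, det_n)
    return np.clip(fused_base + fused_det, 0.0, 1.0)
```

In practice the fused single channel could replace, or be stacked with, the RGB channels fed to the detector; the actual fusion rules and scales used in the paper may differ.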
In future work, additional modalities could be exploited to further improve the efficiency of weed identification in the field.
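The targeting step described above, computing a detected weed's geometric center and deflecting the laser toward it, reduces to simple geometry. The sketch below is purely illustrative: the pixel scale, stand-off distance, and function names are hypothetical, and a real system would obtain them through camera and galvanometer calibration.

```python
import math

def box_center(x1, y1, x2, y2):
    """Geometric center of a detection box given as corner coordinates."""
    return (0.5 * (x1 + x2), 0.5 * (y1 + y2))

def galvo_angles(cx_px, cy_px, px_per_mm, img_center, standoff_mm):
    """Map a pixel target to two galvanometer deflection angles (degrees).

    Assumes the optical axis passes through img_center and the ground
    plane lies standoff_mm below the scan mirrors (hypothetical model).
    """
    # pixel offset from the optical axis, converted to millimetres
    dx_mm = (cx_px - img_center[0]) / px_per_mm
    dy_mm = (cy_px - img_center[1]) / px_per_mm
    # small-scan-field approximation: one mirror per axis
    theta_x = math.degrees(math.atan2(dx_mm, standoff_mm))
    theta_y = math.degrees(math.atan2(dy_mm, standoff_mm))
    return theta_x, theta_y
```

In a ROS deployment, a node subscribing to the detector's output would run this mapping per detection and publish the angle pair to the galvanometer driver.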



How to Cite

Du, X. (2024) “Static laser weeding system based on improved YOLOv8 and image fusion”, Journal of Agricultural Engineering. doi: 10.4081/jae.2024.1598.