Review Articles

A review of deep learning based agricultural remote sensing image segmentation

Publisher's note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
Published: 1 December 2025

Abstract

Agricultural remote sensing image segmentation, which classifies each pixel of an image into a specific category, has recently been driven forward by deep learning methods owing to their powerful feature extraction capabilities. This paper presents a systematic review of deep learning-based image segmentation techniques for agricultural remote sensing, along with an overview of current challenges and emerging research trends. First, it outlines the characteristics of agricultural remote sensing tasks and the requirements for remote sensing image acquisition and processing, providing an in-depth analysis of the nature of agricultural remote sensing data. Next, it systematically reviews the evolution of deep learning-based methods, focusing on segmentation network architectures, including convolution-based models, transformer-based models, hybrid architectures, lightweight models, and vision-language models. It then discusses deep learning paradigms designed for annotation-efficient scenarios, including semi-supervised, weakly supervised, self-supervised, and transfer learning. Key challenges, such as data annotation, computational cost, and model generalization, are examined in turn. Finally, it summarizes the latest advances in deep learning for agricultural remote sensing image segmentation and outlines potential future research directions, aiming to provide technical references that promote the practical application and successful deployment of deep learning in this critical domain.
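
To make the per-pixel classification task concrete, the sketch below (not taken from the review) shows a minimal fully convolutional encoder-decoder in PyTorch that maps a multi-band image tile to a class label for every pixel. The class `TinySegNet`, the four-band input, and the five-class output are illustrative placeholders, not choices made by the authors.

```python
# Illustrative sketch only: a toy encoder-decoder for per-pixel classification.
# Band count (4) and class count (5) are hypothetical placeholder values.
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Toy fully convolutional network producing per-pixel class logits."""
    def __init__(self, in_bands: int = 4, num_classes: int = 5):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_bands, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                           # halve spatial resolution
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 2, stride=2),   # restore full resolution
            nn.ReLU(inplace=True),
            nn.Conv2d(32, num_classes, 1),             # one score per class, per pixel
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

if __name__ == "__main__":
    tile = torch.randn(1, 4, 128, 128)       # fake 4-band (e.g., RGB + NIR) tile
    logits = TinySegNet()(tile)              # shape: (1, 5, 128, 128)
    label_map = logits.argmax(dim=1)         # one class index per pixel
    print(label_map.shape)                   # torch.Size([1, 128, 128])
```

The architectures surveyed in the review, whether convolution-based, transformer-based, hybrid, or lightweight, elaborate on this encoder-decoder idea with deeper backbones, skip connections, atrous convolutions, or attention mechanisms.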

How to Cite

“A review of deep learning based agricultural remote sensing image segmentation” (2025) Journal of Agricultural Engineering [Preprint]. doi:10.4081/jae.2025.1954.