Detection of early collision and compression bruises for pears based on hyperspectral imaging technology

Early detection of bruising is one of the major challenges in postharvest quality sorting of pears. In this study, visible/near-infrared (VIS/NIR) hyperspectral imaging (400–1000 nm) was used for early detection of pear bruise type and timing (1, 12, and 24 h post-bruise). Spectral images of non-bruised and mechanically bruised pears (collision and compression) were captured at these intervals for modeling. The spectral data were processed using principal component analysis (PCA) and uninformative variable elimination (UVE) to select optimal wavelengths. Classification models were then built using an extreme learning machine (ELM) and a support vector machine (SVM), and compared with a model combining a genetic algorithm, the sooty tern optimization algorithm, and an SVM (STOA-GA-SVM). The calibration set accuracies of the PCA-ELM, UVE-ELM, PCA-SVM, and UVE-SVM models were 98.99%, 98.98%, 96.94%, and 99.23%, respectively, and the corresponding validation set accuracies were 89.29%, 87.97%, 88.78%, and 88.78%. The STOA-GA-SVM model showed the best performance, with calibration and validation set accuracies of 97.19% and 92.86%, respectively. This study shows that VIS/NIR hyperspectral imaging combined with the STOA-GA-SVM algorithm is feasible for rapid, nondestructive identification of bruise type and time in pears.
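The workflow summarized above (dimensionality reduction on hundreds of spectral bands, followed by SVM classification into bruise classes) can be illustrated with a minimal sketch. This is not the authors' code: it uses scikit-learn and synthetic "spectral" data in place of the study's hyperspectral cubes, and all names and parameters here are illustrative assumptions.

```python
# Illustrative PCA + SVM pipeline for spectral classification.
# Synthetic data stands in for VIS/NIR hyperspectral measurements;
# class labels mimic the study's three groups.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, n_bands = 300, 200           # e.g. ~200 bands across 400-1000 nm (assumed)
X = rng.normal(size=(n_samples, n_bands))
y = rng.integers(0, 3, size=n_samples)  # 0 = non-bruised, 1 = collision, 2 = compression
X += y[:, None] * 0.5                   # shift band means per class so groups are separable

X_cal, X_val, y_cal, y_val = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# PCA compresses the correlated bands into a few components,
# then an RBF-kernel SVM classifies the bruise type.
model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
model.fit(X_cal, y_cal)
print(f"calibration accuracy: {model.score(X_cal, y_cal):.3f}")
print(f"validation accuracy:  {model.score(X_val, y_val):.3f}")
```

In the study, UVE-based wavelength selection and the STOA-GA hyperparameter search would replace or augment the plain PCA step and default SVM settings shown here.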
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.