Deep Learning-Enabled Hyperspectral Classification of Zhambons

Document Type : Research Article

Authors

1 Biomedical Engineering Group, Department of Electrical Engineering and Information Technology, Iranian Research Organization for Science and Technology (IROST)

2 Information Technology and Intelligent Systems Group, Department of Electrical Engineering and Information Technology, Iranian Research Organization for Science and Technology

3 Department of Chemical Technologies, Iranian Research Organization for Science and Technology

Abstract

Food authenticity is a crucial aspect of consumer protection, food safety, and quality assurance. Conventional methods for meat authentication often require destructive, time-consuming, or labor-intensive processes. Hyperspectral imaging, which combines imaging and spectroscopy, has emerged as a non-destructive alternative for food classification. This study investigates the application of hyperspectral imaging for differentiating between beef, chicken, and turkey zhambons using one-dimensional convolutional neural networks and long short-term memory networks. Following preprocessing—including segmentation, noise reduction, and spatial averaging—spectral signatures were extracted and classified using deep learning models, which were then compared with traditional machine learning approaches. The long short-term memory architecture demonstrated superior performance by effectively modeling sequential spectral dependencies, achieving 99.94% accuracy in the binary classification of chicken versus beef and 98.12% accuracy in the three-class problem (beef, chicken, and turkey zhambons). The findings highlight the potential of hyperspectral imaging combined with deep learning as an efficient tool for processed meat authentication.
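The spatial-averaging step described above can be sketched in a few lines: after segmentation yields a binary mask of sample pixels, the spectra of those pixels are averaged into one signature per region. The function name, array shapes, and toy data below are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def mean_spectrum(cube, mask):
    """Average the spectra of all masked (sample) pixels in a
    hyperspectral cube of shape (rows, cols, bands), producing one
    1-D spectral signature -- the spatial-averaging step of the
    preprocessing pipeline. Illustrative sketch, not the paper's code."""
    pixels = cube[mask]          # boolean indexing -> (n_pixels, bands)
    return pixels.mean(axis=0)   # (bands,)

# Toy cube: a 4x4 image with 5 spectral bands (hypothetical values)
cube = np.arange(4 * 4 * 5, dtype=float).reshape(4, 4, 5)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True            # pretend this 2x2 block is the segmented sample
signature = mean_spectrum(cube, mask)  # one signature of length 5
```

Each such signature would then be fed to the 1-D CNN or LSTM classifier as a band-ordered sequence.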




Articles in Press, Accepted Manuscript
Available Online from 23 December 2025
  • Receive Date: 23 November 2025
  • Revise Date: 20 December 2025
  • Accept Date: 23 December 2025
  • First Publish Date: 23 December 2025
  • Publish Date: 23 December 2025