Table 3. Applications of NIR spectroscopy for the classification and quality prediction of agricultural products.
Agricultural product | Spectral range | Software package | Number of samples | Accuracy | Findings | References |
---|---|---|---|---|---|---|
Bell pepper | 1,600–2,400 nm | Win ISI II v. 1.5, MATLAB 2015a | 394 | 88.28–91.37% | Preliminary screening based on SSC and dry matter was successful. The importance of SEP and SEL was discussed. | (54) |
Mulberry | 909–1,649 nm | PLS Toolbox v. 6.21, MATLAB R2009 | 468 | 84.1% | Dendrobium officinale Kimura et Migo (DOK) was distinguished from Dendrobium devonianum Paxt (DDP). | (7) |
Apple | 1,000–2,500 nm | MATLAB 7.11, Antaris II System | 180 | 74.44% | Among PCA, PCA+LDA, SDA, and DPLS, SDA showed the best performance for feature extraction. | (58) |
Apple | 1,000–2,500 nm | Fiber Optic Solids cell, NIRWare, Unscrambler | 410 | 77.9% | Apples were classified according to terrain type. | (51) |
Tangerine, red cabbage, cornichons, kale and applesauce | 1,100 nm and 2,100 nm | PAS LABS v. 1.2, SIMCA v. 14.1 | 15 | 99% | NIRS prediction was possible for commodities kept inside glass. OPLS-DA outperformed PCA and PLS-DA. | (59) |
Potato | 964.13–1,645.01 nm and 2,502.50–16,666.67 nm | SpectralCube, OPUS v. 7.2, PLS-toolbox v. 8.6, Unscrambler v. 10.1, MATLAB R2017b | 240 | Rp = 0.954, RMSEP = 0.421 | A PLSR model was used to find the degree of doneness and predict the variety. | (57) |
Apple | 300–1,100 nm | ModelBuilder, R Statistical software | 640 | R2 = 0.90 and 0.92, RMSE = 0.67% | Individual models for cultivars performed better than the combined model. | (53) |
Mango | 1,200–2,200 nm | Unscrambler | 1,310 | Alphonso and Banganapalli (99.07%, 99.58%), Dasheri and Malda (98.37%, 94%) | A distinct score plot allowed for more accurate classification. | (55) |
Apple | 400–1,021 nm | Ocean View, MATLAB R2014b | 300 | SPA-SVM 85.83%, SPA-ELM 95% | Among BPNN, SVM, and ELM models, ELM performed best. Feature selection with SPA combined with ELM produced better results than PCA. | (60) |
Pears | 350–1,800 nm; 350–1,000 nm; 1,000–1,800 nm | Unscrambler v. 9.7 | 110 | R2 = 0.90–0.92, RMSEP = 0.23–0.30 | CARS performed better for feature selection than MC-UVE and SPA. CARS-MLR and CARS-PLS accurately determined SSC. | (56) |
Apple | 1,000–2,500 nm | MATLAB R2014a | 208 | 98.1% | Geographical region had a significant effect on SSC. CARS feature selection and PLS-DA had good prediction accuracy. | |
SSC, soluble solids content; SEP, standard error of prediction; SEL, standard error of laboratory; PCA, principal component analysis; LDA, linear discriminant analysis; SDA, stacked denoising autoencoder; DPLS, dynamic partial least squares; OPLS, orthogonal partial least-squares; PLS-DA, partial least-squares discriminant analysis; PLSR, partial least squares regression; BPNN, back propagation neural network; SVM, support vector machine; ELM, extreme learning machine; SPA, successive projection algorithm; CARS, competitive adaptive reweighted sampling; MC-UVE, Monte Carlo–uninformative variable elimination; SPA-MLR, successive projection algorithm–multiple linear regression.
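Most of the regression studies summarized in Table 3 follow the same calibration/validation workflow: spectra and reference values are split into calibration and prediction sets, a PLSR model is fitted, and Rp and RMSEP are reported for the prediction set. The following is a minimal sketch of that workflow, assuming synthetic spectra and scikit-learn's PLSRegression rather than the MATLAB, Unscrambler, or PLS_Toolbox software used in the cited studies; the sample count, wavelength grid, and number of latent variables are illustrative assumptions, not values taken from Table 3.

```python
# Minimal PLSR calibration sketch for SSC prediction from NIR spectra.
# Synthetic data and scikit-learn are illustrative assumptions; the reviewed
# studies used tools such as MATLAB, Unscrambler, and PLS-toolbox.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)

# Placeholder spectra: 240 samples x 256 wavelength points (not real NIR scans)
X = rng.normal(size=(240, 256))
# Placeholder SSC reference values (e.g., degrees Brix) loosely tied to two bands
y = 12.0 + 0.8 * X[:, 50] + 0.5 * X[:, 120] + rng.normal(scale=0.2, size=240)

# Split into calibration and prediction (validation) sets
X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Fit a PLSR model; the number of latent variables (10) is an arbitrary choice
pls = PLSRegression(n_components=10)
pls.fit(X_cal, y_cal)

# Report the statistics used in Table 3: Rp (correlation) and RMSEP
y_pred = pls.predict(X_val).ravel()
rp = np.corrcoef(y_val, y_pred)[0, 1]
rmsep = np.sqrt(mean_squared_error(y_val, y_pred))
print(f"Rp = {rp:.3f}, R2 = {r2_score(y_val, y_pred):.3f}, RMSEP = {rmsep:.3f}")
```

In practice, spectral preprocessing and cross-validated selection of the number of latent variables (or of wavelengths, via methods such as SPA or CARS) would precede model fitting, and classification tasks such as cultivar or origin discrimination would use PLS-DA or a comparable classifier instead of regression.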