Enzymatic Browning in Fresh-Cut Apple Slices Measured by Different Kinds of Image Algorithms

Loredana Lunadei 1, Belén Diezma 1, Lourdes Lleó 2, Pámela Galleguillos 3

1 LPF-TAGRALIA, Departamento de Ingeniería Rural, ETSI Agrónomos, Universidad Politécnica de Madrid, Spain. E-mail: loredana.lunadei@upm.es
2 Departamento de Ciencia y Tecnologías Aplicadas a la Ingeniería Técnica Agrícola, E.U.I.T. Agrícolas, Universidad Politécnica de Madrid, Spain. E-mail: lourdes.lleo@gmail.com
3 Centro de Estudios Postcosecha CEPOC, Departamento de Producción Agrícola, Facultad de Ciencias Agronómicas, Universidad de Chile, Chile. E-mail: Pamela.galleguillos@gmail.com

Resumen

The final objective is the development of a multispectral vision system able to assign fresh-cut apple slices to classes of different browning level. A total of 24 IRRB and RGB images was analyzed, corresponding to 24 slices of Granny Smith apples (12 slices = Set 1; 12 slices = Set 2). Twenty-four slices were analyzed per day: at zero time and after 1, 3, 7 and 9 days of storage at 7.5 °C. A non-supervised classification procedure was applied to the virtual images obtained as combinations of the red and blue channels (R-B, B/R and (R-B)/(R+B)) and, in all cases, it generated three reference classes. An external validation was applied to the second series of samples (Set 2), subjected to the same treatments, obtaining a high percentage of correctly classified samples. The classifications obtained from the IRRB and RGB cameras were evaluated against colorimetric and sensory parameters, and the (R-B)/(R+B) and B/R virtual images showed the best sensitivity in reflecting the color change associated with browning.

Abstract

The main objective of this study was to develop a vision system able to classify fresh-cut apple slices according to the development of enzymatic browning. The experiment was carried out on Granny Smith apple slices stored at 7.5 °C (Set 1: n = 12). Twenty-four samples were analyzed per day: at zero time and after storage for 1, 3, 7 and 9 days. Digital images were acquired with an IRRB camera and with a cheaper vision system consisting of an RGB digital camera. A classification procedure was applied to the histograms of the following virtual images, computed from the IRRB and RGB images: (R-B)/(R+B), R-B and B/R. In all cases, a non-supervised classification procedure was able to generate three image-based browning reference classes. An internal and an external validation (Set 2: n = 12) were carried out, with a high percentage of correctly classified samples. The image-based classification was evaluated against reference parameters (colorimetric and sensory measurements), and the best results were obtained with the (R-B)/(R+B) and B/R virtual images.

Keywords: Fresh-cut apples; enzymatic browning; RGB image; IRRB image; multispectral images.

1. INTRODUCTION

The act of cutting fresh produce invariably accelerates a range of degradative changes, which present additional challenges to the fresh-cut industry in maintaining quality for an acceptable marketing period. An important factor causing loss of quality in many products is the development of browning on cut surfaces. Browning has a significant impact on the quality of apples and their products because it changes the appearance and organoleptic properties of the food, which can affect market value and, in some cases, result in exclusion of the food product from certain markets (Pristijono et al. 2006). The control of cut-surface browning is therefore critical to maintaining the quality and safety of fresh-cut produce.

Traditionally, enzymatic browning has been quantified either through a biochemical index, e.g., polyphenol oxidase activity, or through physical indicators such as surface color. Among physical indicators based on color, the CIE L*a*b* coordinates have been the most extensively used color space, and browning indicators for fruits have been developed from CIE L*a*b* or CIE XYZ coordinates (Lu et al. 2007). The browning index (BI), defined as brown color purity, is one of the most common indicators of browning in sugar-containing food products (Buera et al. 1986).

In order to carry out a detailed characterization of the color of a food item, and thus to evaluate its quality more precisely, it is necessary to know the color value of each point of its surface (León et al. 2006). However, commercial colorimeters do not allow a global analysis over the entire surface, and their measurements are not representative for heterogeneous materials such as food products (Papadakis et al. 2000; Mendoza et al. 2004). Computer vision systems (CVS), on the contrary, acquire digital images of entire samples that can be analyzed pixel by pixel, allowing an accurate measurement of the color coordinates at each point of the surface. León et al. (2006) demonstrated a CVS for measuring color in L*a*b* coordinates computed from the RGB space, and several studies have applied that approach to food (Pedreschi et al. 2007; Quevedo et al. 2008). When browning kinetics are described using color information, a mean L* value is generally assumed, i.e., an average of the L* values over the analyzed area is calculated with a CVS. In apple slices, however, the development of non-uniform color patterns during browning (specifically in L*) has been observed. Yoruk et al. (2004) adopted a sub-color space derived from the RGB space, in which each color axis (red, green, blue) was divided into eight intervals so that the colors were regrouped into 8 x 8 x 8 = 512 ranges.

The aim of this work was to classify fresh-cut apple slices on the basis of their browning state by employing a vision system endowed with three band-pass filters centered at 800 nm (infrared, IR), 680 nm (red, R) and 450 nm (blue, B) (IRRB images). The main objective was to identify suitable virtual images, computed as combinations of the monochromatic ones, able to detect the changes in color related to the browning process. A second goal was to evaluate the feasibility of assessing browning in apples with a cheaper and simpler vision system consisting of an RGB digital camera.

2. MATERIALS and METHODS

The experiment was carried out on two sets (Set 1, the calibration set, and Set 2, the validation set) of fifteen Granny Smith apples stored at 7.5 °C for 9 days. Apples were peeled and cut into eight equal slices, resulting in a total of 24 slices (Set 1: n = 12, Set 2: n = 12). Twenty-four samples of each set were analyzed per day: at zero time and after storage for 1, 3, 7 and 9 days, corresponding to treatments t0, t1, t3, t7 and t9 respectively (Figure 1).
Figure 1. A fresh-cut apple slice at zero time (t0) and after 1 (t1), 3 (t3), 7 (t7) and 9 (t9) days of storage (T = 7.5 °C).

Digital images were acquired from the samples by employing a 3-CCD camera centered at the infrared (IR, 800 nm), red (R, 680 nm) and blue (B, 450 nm) wavelengths (IRRB images) and by employing an RGB camera centered at 625 nm (red, R), 540 nm (green, G) and 475 nm (blue, B) (RGB images). In the case of the IRRB vision system, the light source was provided by six 220 V halogen lamps, and the distance between the lens system and the sample was 60 cm. The angle between the camera lens axis and the lighting source axis was 45°, because the diffuse reflection responsible for the color occurs at 45° from the incident light (Francis et al. 1975; Marcus et al. 1998). The images were acquired on a black background, and a black canvas was placed around the vision test station to create a uniform light field around the object. In the case of the RGB camera, a hemispheric cap with white reflecting walls was constructed to eliminate any effect of environmental light. The light source consisted of four 15 W incandescent lamps attached at equidistant points on the inside of the cap. The camera was set in a vertical position, with the lens 61 cm above the object of interest, and the images were again acquired on a black background. RGB images were also transformed into L*a*b* images (where L* is the luminance component, while a* and b* are color coordinates related respectively to the red/green and yellow/blue spectral ranges) by applying built-in MATLAB functions.

For each kind of digital image, the apple slice samples (the region of interest, ROI) were distinguished from the background through the Otsu method (Otsu 1979), a segmentation technique widely used in the literature, which computes the threshold level on the basis of the image histogram distribution. In both cases, the segmentation was performed on the R images, since they presented the greatest difference between the gray levels corresponding to the samples (the ROI) and to the background. This operation results in a binary image that can be considered as an image mask, in which only the pixels of the ROI are white, while the pixels corresponding to the background are black. This image mask was multiplied by the different types of images computed in this research. Further analyses were based on the relative histograms of the virtual images described below, computed as the relative frequency of pixels over the intensity range of the image.

In order to obtain reference values, the apple slices were evaluated visually according to a color scale from 1 to 5 (where 1 corresponds to fresh samples without any browning and 5 to samples with severe discoloration), obtaining a sensory evaluation index (ISE) for each sample. Finally, visible (VIS) relative reflectance spectra (360-740 nm), CIE L*a*b* color coordinates and a browning index (BI), one of the most common indicators of browning, calculated as reported in equation (1), where x is the chromaticity coordinate obtained from the XYZ values, were measured on the samples with a Minolta CM-5I portable spectrophotometer.

BI = 100 (x - 0.31) / 0.172     (1)

In order to identify the wavelengths most related to the evolution of enzymatic browning, an unsupervised pattern recognition analysis of the VIS reflectance spectra was performed by principal component analysis (PCA) on the autoscaled data. On the basis of the PCA results, appropriate combinations of the monochromatic images (virtual images) were computed.
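As an illustration of this segmentation and histogram step, a minimal MATLAB sketch is given below; the file names, the use of logical indexing instead of an explicit mask multiplication, and the number of intensity levels are assumptions for the example, not details taken from the authors' code.

```matlab
% Sketch: Otsu-based ROI mask from the R-channel image of one slice, then the
% relative histogram of a virtual image of the same slice restricted to the ROI.
R = im2double(imread('slice_R.tif'));        % red (680 nm) channel, hypothetical file name
V = im2double(imread('slice_virtual.tif'));  % a virtual image of the same slice

level = graythresh(R);          % Otsu threshold computed on the R image
mask  = imbinarize(R, level);   % binary mask: ROI pixels = 1 (white), background = 0 (black)

% Relative histogram (relative pixel frequency) over the ROI only
nLevels = 256;                  % assumed number of intensity levels
edges   = linspace(0, 1, nLevels + 1);
counts  = histcounts(V(mask), edges);
relHist = counts / sum(counts); % sums to 1; one such vector per slice feeds the classification
```

Restricting the histogram to the masked pixels plays the same role as multiplying each image by the mask, while avoiding a spurious background peak at zero.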
A non-supervised classification according to Ward's method was then applied to the histograms of the virtual images in order to define browning reference classes (BRC) based on the Set 1 histograms. All the intensity levels of the histograms were considered as the dimensions of a multidimensional space, in which each histogram was represented as a single point. The matrix of Euclidean distances between each pair of individuals (histograms) was computed in order to group the closest ones and to hierarchically merge the individuals whose combination gave the least Ward linkage distance (that is, the minimum increase in the within-group sum of squares of the newly formed group). Compared with other classification methods, Ward's method has the advantage of taking into account all the histograms of the data set at every level of the grouping, producing well-structured and homogeneous groups. Moreover, this method gave successful results in previous works investigating fruit ripeness (Herrero et al. 2011; Lunadei et al. 2011). A dedicated MATLAB code was developed to generate the groups automatically on the basis of an input maximum Ward linkage distance, derived from the analysis of the cluster tree. The average histogram was computed for each generated group and defined as a BRC.

Validation was carried out by assigning each anonymous individual to the previously generated BRC (each one defined by the average histogram of its class) for which it gave the minimum Euclidean distance (Ed). In order to test the robustness of the model based on the Set 1 data, an external validation procedure was performed: the observed classification of the Set 2 samples was compared with the classification predicted for each anonymous histogram through its minimum Ed to the BRC generated with the Set 1 population. The color reference parameters (BI and CIE L*a*b* coordinates) and the ISE were compared with the classification based on the histograms of each virtual image.

3. RESULTS and DISCUSSION

After performing PCA on the VIS reflectance spectra, the maximum PC1 loading values were observed in the blue and red regions of the spectrum, with PC1 explaining 82.3% of the variance (Figure 2).

Figure 2. On the left: PC1 scores plot for the autoscaled and normalized reflectance data with 95% limits; the X axis corresponds to the sample number, the Y axis to the sample scores for PC1, and vertical lines separate the samples according to treatment. On the right: PC1 loadings plot; the X axis corresponds to the wavelength (nm) and the Y axis to the loading values for PC1.

On the basis of the results obtained from the analysis of the VIS spectra, suitable virtual images were calculated as combinations of the red and blue images of the samples acquired by the IRRB and the RGB cameras. Since the reflectance values corresponding to the red range increased from t0 to t9, whereas those corresponding to the blue region decreased, the virtual images were calculated so as to amplify these differences. Therefore, the (R-B)/(R+B), R-B and B/R digital images were computed.
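As a sketch of how these quantities can be obtained, the MATLAB fragment below autoscales the reflectance spectra, extracts the PC1 loadings and explained variance, and builds the three virtual images from the ROI-masked red and blue channel images; the variable names (spectra, R, B) are illustrative assumptions.

```matlab
% Sketch: PCA on the autoscaled VIS spectra and computation of the virtual images.
% 'spectra' is an (nSamples x nWavelengths) matrix of relative reflectance;
% R and B are the red (680 nm) and blue (450 nm) channel images of one slice.
X = zscore(spectra);                  % autoscale: zero mean, unit variance per wavelength
[loadings, scores, latent] = pca(X);  % loadings(:,1) peaks in the red and blue ranges
explainedPC1 = 100 * latent(1) / sum(latent);   % share of variance captured by PC1

R = im2double(R);
B = im2double(B);
normDiff = (R - B) ./ (R + B + eps);  % (R-B)/(R+B), increases as browning develops
diffRB   = R - B;                     % R-B, also increases
ratioBR  = B ./ (R + eps);            % B/R, decreases as browning develops
```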

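A minimal sketch of the classification and validation procedure described in Section 2, applied to these virtual-image histograms, could look as follows; H1 and H2 are assumed matrices whose rows hold the Set 1 and Set 2 relative histograms, and the cutoff value is only an illustrative placeholder for the tree-derived maximum Ward linkage distance.

```matlab
% Sketch: Ward clustering of the Set 1 histograms, browning reference classes (BRC)
% as class-average histograms, and assignment of Set 2 by minimum Euclidean distance.
Z = linkage(H1, 'ward');          % Ward hierarchical clustering on Euclidean distances
maxWard = 0.3;                    % illustrative cutoff; the paper derives it from the cluster tree
labels1 = cluster(Z, 'cutoff', maxWard, 'criterion', 'distance');

nBRC = max(labels1);
BRC  = zeros(nBRC, size(H1, 2));
for k = 1:nBRC
    BRC(k, :) = mean(H1(labels1 == k, :), 1);   % average histogram of each class
end

% External validation: each Set 2 histogram is assigned to the closest class average
D = pdist2(H2, BRC);              % Euclidean distances to the BRC
[~, labels2] = min(D, [], 2);
```

Cutting the tree at a maximum linkage distance, rather than forcing a fixed number of clusters, is what lets the three browning reference classes emerge from the data.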
Figure 3. (a) (R-B)/(R+B), (b) R-B and (c) B/R virtual images of an apple slice sample computed at zero time (t0) and after storage for 1, 3, 7 and 9 days (t1, t3, t7 and t9).

IRRB images

Figure 3 shows an example of the virtual images calculated from the IRRB ones for one sample of each treatment. In all cases, from t0 to t9, changes in color were observed in the samples, which corresponded to changes in pixel intensity values. In the (R-B)/(R+B) and R-B images the pixel intensity values increased during the storage period, while in the B/R images they decreased. These changes in color did not occur uniformly within the analyzed samples, since the same sample presented regions whose pixels turned to higher (or lower) intensity values faster than others. Moreover, the irregular shape of the samples could in some cases cause shadows, affecting the accuracy of the acquisition process.

For each virtual image, the average of the twenty-four ROI histograms obtained for each of the treatments t0, t1, t3, t7 and t9 was calculated, giving five average histograms per image combination (Figure 4). The average histograms of the (R-B)/(R+B) and R-B images shifted to higher intensity values, while those of the B/R images shifted to lower intensity values. After applying the non-supervised classification to the IRRB images, three image-based browning reference classes (BRC) were generated: Cluster A (corresponding to the t0 samples), Cluster B (t1 and t3 samples) and Cluster C (t7 and t9 samples) (Figure 4). On the basis of the internal and external validation results, the best classifications were obtained with the (R-B)/(R+B) and B/R image histograms (internal validation: 99.2% of samples correctly classified for both virtual images; external validation: 84% with (R-B)/(R+B) and 81% with B/R). Moreover, for both validation phases a*, b*, BI and ISE increased while L* values decreased with the image-based class number, thereby reflecting the browning state of the classes. Figure 4 shows 3-D plots of the L*, BI and ISE values of the samples categorized in their corresponding image-based clusters.

RGB images

For each virtual image calculated from the RGB images, the trend of the change in pixel intensity values and of the average histograms agreed with the trend observed in the analysis of the IRRB images. Also in this case, after applying the non-supervised classification procedure to the histograms of the images, three reference classes were generated (Clusters A to C) (Figure 5). On the basis of the validation results, the best classifications were obtained with the (R-B)/(R+B) image histograms (internal validation: 75% of samples correctly classified; external validation: 68%). This result was somewhat less consistent than that obtained with the IRRB camera, probably because the IRRB images were acquired with a more appropriate illumination source (halogen lamps set at 45° to the camera lens axis) and under more stable conditions (the black canvas around the test station). Regarding the reference parameters, the trends observed with the IRRB images were confirmed, since for both validation phases a*, b*, BI and ISE increased while L* values decreased with the image-based class number.

Figure 4. Panels (a) plot the average histograms calculated for each treatment from the Set 1 IRRB images based on (R-B)/(R+B) (upper panel), R-B (central panel) and B/R (lower panel); panels (b) report the corresponding dendrograms generated by applying Ward's non-supervised classification, with horizontal lines representing the maximum Ward linkage distance within groups (pixel relative frequency = 0.3); panels (c) show 3-D plots of the L*, BI and ISE values of the samples categorized in their corresponding image-based cluster (A, B and C).
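As an illustration of how the image-based clusters can be compared with the reference measurements, the short fragment below computes per-cluster means of L*, BI and ISE; the vectors Lstar, BIref and Ise are assumed to hold one value per Set 1 slice, in the same order as the cluster labels obtained above.

```matlab
% Sketch: per-cluster means of the reference parameters, to verify that L* decreases
% and that BI and ISE increase from cluster A to cluster C.
for k = 1:max(labels1)
    idx = (labels1 == k);
    fprintf('Cluster %d: L* = %.1f, BI = %.1f, I_SE = %.2f\n', ...
            k, mean(Lstar(idx)), mean(BIref(idx)), mean(Ise(idx)));
end
```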

Figure 5. Average histograms (left) calculated for each treatment from the Set 1 RGB images based on (R-B)/(R+B), and the corresponding dendrogram (right) generated by applying Ward's non-supervised classification. Horizontal lines in the cluster tree represent the maximum Ward linkage distance within groups (pixel relative frequency = 0.3).

Lab images

Figure 6 reports an example of the L*, a* and b* images obtained from the L*a*b* images of a Set 1 sample for each treatment. In the a* and b* images, changes in color were observed from t0 to t9, corresponding to changes in pixel intensity values, while the gray level of the L* images did not exhibit any consistent variation. These results were confirmed by the image histograms (Figure 7): in the case of the a* and b* images, the average histograms shifted to higher intensity values, while those of the L* images showed a different trend. They did not shift to higher or lower intensity values, but from t0 to t9 the shape of the histograms changed: the peaks flattened and the profiles widened, owing to the wider range of intensities present in the object image. The trend of the a* and b* histograms agreed with the colorimetric measurements, since from t0 to t9 the histograms moved to higher intensity values, just as the a* and b* color coordinates increased with storage time. On the contrary, the trend of the L* histograms was not as consistent with that of the L* color parameter (which decreased from t0 to t9); this could be due to the illumination conditions used for acquiring the RGB images (from which the L*a*b* ones were calculated). In fact, since L* is the lightness component, it could be more sensitive than the a* and b* color coordinates (related to the red/green and blue/yellow axes) to variations of the light field around the object. After applying the classification process, no reference classes were generated, indicating that the analysis of this kind of image was not the most appropriate for the proposed classification.
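A minimal sketch of this L*a*b* analysis is given below, assuming rgbImg is one RGB image and mask is the ROI mask from the segmentation step; the a* and b* bin ranges are arbitrary choices for the example.

```matlab
% Sketch: RGB to L*a*b* conversion and ROI-restricted channel histograms.
lab = rgb2lab(im2double(rgbImg));   % L* in [0, 100]; a* and b* roughly in [-128, 127]
L = lab(:,:,1);  a = lab(:,:,2);  b = lab(:,:,3);

Lhist = histcounts(L(mask),   0:100, 'Normalization', 'probability');
ahist = histcounts(a(mask), -60:60, 'Normalization', 'probability');
bhist = histcounts(b(mask), -60:60, 'Normalization', 'probability');
```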

Figure 6. L*, a* and b* images of a fresh-cut apple slice at zero time (t0) and after 1 (t1), 3 (t3), 7 (t7) and 9 (t9) days of storage (T = 7.5 °C).

Figure 7. Average histograms of the L* (left), a* (center) and b* (right) images of Set 1 apple slice samples computed at zero time (t0) and after storage for 1 (t1), 3 (t3), 7 (t7) and 9 (t9) days.

CONCLUSIONS

In the present study an image algorithm was proposed to classify fresh-cut apple slices according to the evolution of enzymatic browning. Two vision systems were employed: the first was based on an IRRB multispectral camera and the second on a simpler system consisting of an RGB camera. The method used the relative histograms of virtual images, i.e., (R-B)/(R+B), R-B and B/R, computed as combinations of the red (R, 680 nm) and blue (B, 450 nm) images of the samples. The red and blue spectral ranges contained enough information for the proposed method to classify the sample images adequately. On the basis of the internal classification results, all the indexes were able to detect changes in browning by classifying the samples into three reference classes (A to C). In all cases, Clusters A to C presented decreasing lightness and increasing a*, b*, BI and ISE values. The robustness of the classification procedure was assessed by applying an external validation to a second set of samples: a high percentage of images of fruit in the second (testing) set was correctly classified with the model generated from the first set. The classification based on the (R-B)/(R+B) and B/R images exhibited the best sensitivity for reflecting the change in color associated with browning.

All these results confirmed the potential of the proposed method for characterizing fresh-cut apples according to their browning state. This method could be used as a criterion for establishing the optimal shelf-life of fresh-cut apple slices under refrigeration conditions, with or without additional inhibitory treatments. In addition, this method allows a more spatially detailed determination than other colorimetric techniques, which analyze only a small portion of a sample and can lead to errors and inaccurate results if the analysis is not repeated in different zones of the surface. Comparing the two vision systems, the IRRB camera gave more consistent results than the RGB one, probably due to the more stable illumination conditions used in the first case. Since RGB vision systems are considerably cheaper than IRRB ones, in future work it could be interesting to improve their performance by redesigning the light field around the object and by changing the position and type of the light source.

4. ACKNOWLEDGEMENTS

This research was carried out at the Universidad Politécnica de Madrid (UPM, Spain) and was supported and funded by the UPM through the project DURASFRUT II (AL11-P(I+D)-6).

5. REFERENCES

Buera, M. P., Lozano, R. D., Petriella, C. (1986). Definition of colour in the non enzymatic browning process. Die Farbe, Vol. 32, 318-322.
Francis, F. J., Clydesdale, F. M. (1975). Food Colorimetry: Theory and Applications. Westport: AVI Publ. Co.
Herrero, A., Lunadei, L., Lleó, L., Diezma, B., Ruiz-Altisent, M. (2011). Multispectral Vision for Monitoring Fruit Ripeness. Journal of Food Science, Vol. 76 (2), E178-E187.
León, K., Mery, D., Pedreschi, F., León, J. (2006). Color measurement in L*a*b* units from RGB digital images. Food Research International, Vol. 39 (1), 184-191.
Lu, S., Luo, Y., Turner, E., Feng, H. (2007). Efficacy of sodium chlorite as an inhibitor of enzymatic browning in apple slices. Food Chemistry, Vol. 104 (2), 824-829.
Lunadei, L., Galleguillos, P., Diezma, B., Lleó, L., Ruiz-Garcia, L. (2011). A multispectral vision system to evaluate enzymatic browning in fresh-cut apple slices. Postharvest Biology and Technology, Vol. 60 (3), 225-234.
Marcus, R. T. (1998). The measurement of color. In Nassau, K. (Ed.), Color for Science, Art and Technology (pp. 31-96). North-Holland.
Mendoza, F., Aguilera, J. M. (2004). Application of Image Analysis for Classification of Ripening Bananas. Journal of Food Science, Vol. 69 (9), E471-E477.
Papadakis, S. E., Malek, S. A., Emery, R. E., Yam, K. L. (2000). A Versatile and Inexpensive Technique for Measuring Color of Foods. Food Technology, Vol. 54, 48-51.
Pedreschi, F., Bustos, O., Mery, D., Moyano, P., Kaack, K., Granby, K. (2007). Color kinetics and acrylamide formation in NaCl soaked potato chips. Journal of Food Engineering, Vol. 79 (3), 989-997.
Pristijono, P., Wills, R. B. H., Golding, J. B. (2006). Inhibition of browning on the surface of apple slices by short term exposure to nitric oxide (NO) gas. Postharvest Biology and Technology, Vol. 42 (3), 256-259.
Quevedo, R., Aguilera, J. M., Pedreschi, F. (2008). Color of salmon fillets by computer vision and sensory panel. Food and Bioprocess Technology, doi:10.1007/s11947-008-0106-6.

Yoruk, R., Yoruk, S., Balaban, M. O., Marshall, M. R. (2004). Machine vision analysis of antibrowning potency for oxalic acid: A comparative investigation on banana and apple. Journal of Food Science, Vol. 69, E281-E289.