
Optimal lighting of RGB LEDs for oral cavity detection


Abstract

In this paper, optimal lighting for oral cavity detection is proposed. The illuminants consist of several LEDs with different intensity ratios and peak wavelengths, chosen to enhance the color difference between normal and abnormal regions in the oral cavity. An algorithm combining multi-spectral imaging (MSI) and a color reproduction technique is applied to find the lighting that best enhances this difference. The colored LEDs of the optimal lighting, the Color Rendering Index (CRI) of the illuminants, and a comparison with traditional illuminants are discussed. The calculations show that color enhancement ability in the oral cavity is not simply a function of a higher CRI: in the reddish oral environment, narrowband illuminants (LEDs) produce an image with greater contrast than traditional illuminants with broadband spectra and higher CRIs. Accordingly, an illuminant with a specific intensity ratio of red, green, and blue LEDs is proposed, which provides optimal color enhancement for oral cavity detection. Compared with the fluorescent lighting now in common use, the color difference between normal and inflamed tissues can be improved from 21.5732 to 30.5532, a 42% increase, making medical diagnosis more efficient and helping patients receive early treatment.

©2012 Optical Society of America

1. Introduction

In recent years, the development of LEDs has led to breakthroughs in the solid-state lighting industry, resulting in various new lighting applications. For example, in indoor lighting, D. Corell et al. used LEDs as an alternative ambient illumination in a photolithography environment [1]. They found that the LEDs rendered colors better despite a low color rendering index (CRI), and thus LEDs replaced the yellow fluorescent light tubes (YFT) then being used in photolithography rooms. In energy-saving applications, C. H. Tsuei et al. proposed an algorithm for combining a sunlight concentrator and LEDs, in which the concentrator uses the energy of sunlight more efficiently by using the visible rays for illumination and the non-visible rays to provide electrical power for the LEDs [2]. In biomedical applications, M. Rahman et al. developed a low-cost, multimodal, and portable screening system for early detection of oral cancer [3]. Blue LEDs, white LEDs and polarized LEDs were used in the system to obtain a fluorescence image, a reflectance image, and a low-scattering image, respectively. These new applications of LEDs demonstrate their potential to replace traditional lighting.

Despite the important breakthroughs that have already occurred in the solid-state lighting industry, studies of LEDs based on color science have only just begun. Color rendering is defined by the CIE (Commission internationale de l'éclairage) as the “effect of an illuminant on the color appearance of objects by conscious or subconscious comparison with their color appearance under a reference illuminant” [4]. The Color Rendering Index (CRI) is the only internationally recognized evaluation method for illuminants [5]; it was developed in the 1960s to evaluate the color rendering properties of the then-new fluorescent lamps. With the recent and expected progress of the solid-state lighting industry, LEDs will become the most common illuminants. Some studies indicate that the CRI is not suitable for evaluating narrowband illuminants with spiky spectral peaks such as LEDs [6], and researchers are constantly looking for other evaluation methods. One avenue of research is gamut-based computation methods, such as the Color Discrimination Index (CDI) and the Feeling of Contrast Index (FCI); the other is the search for improved methods based on the CRI, such as the Color Quality Scale (CQS) proposed in 2010 [7–9]. Most of these evaluation methods address only white light sources; studies of colored illuminants are sparse.

However, white light sources are not a good option in certain situations: archaeologists find it difficult to identify relics in an area covered with dust or in a dark cave using white light, divers need colored illuminants for their underwater tasks, and museum designers desire lighting that can reproduce the artists’ real visual experiences. For these requirements, optimal illuminants need to be designed for different environments and purposes. Narrowband illuminants have a well-known ability to enhance color contrast, bringing considerable benefits for the identification of a specific object [10]. The purpose of this study is to find an illuminant that enhances the color difference between normal and abnormal regions in the reddish oral cavity, making medical diagnosis easier and faster and leading to earlier, and therefore usually more successful, treatment.

The spectral characteristics of LEDs make them one of the best choices for fabricating a narrowband illuminant. The most direct way to achieve the optimal illumination design is to calculate the color difference of the oral cavity illuminated by various LEDs and compare the results; however, this is very time-consuming. Multi-spectral imaging (MSI), which uses optics, mathematics, and color science for spectral estimation, is a good solution that gives accurate results while saving considerable time. The colorimetric specification of an object can be reproduced by multiplying its estimated reflectance by the spectra of different illuminants [11]. Traditionally, a multi-spectral camera and a set of filters are required for capturing multi-band images in an MSI system, but the expensive equipment and its poor portability limit clinical application [12]. In contrast, a simple-to-operate, low-cost, small commercial digital camera is more suitable for use in the rapid diagnosis of specific lesions [13]. The difficulty is that, with only three RGB channels, the images produced by a digital camera can show color differences compared with real human vision, which affects the accuracy of the spectral estimation. Applying color correction overcomes these deficiencies, resulting in improved simulated images under different illuminants and thus producing simulation results that are clearer and more convincing.

This study is organized as follows: In Section 2, an algorithm combined with MSI and a color reproduction technique is introduced that is used to achieve the spectral estimation and the simulation of color performance under different illuminants. In Section 3, the images of oral herpangina caused by an enterovirus infection are demonstrated. The relationship between the CRI and color enhancement ability using white light sources for oral cavity detection is discussed in Section 4. A survey of optimal illumination is presented, and an optimal illuminant with the best color enhancement is proposed in Section 5. To address the benefits of the MSI system, we have developed a novel, open-source, easy-to-install software package to control this high-speed, multiple-function MSI system. The windows of this software, which integrates a number of modular functions through a graphical user interface (GUI), are shown in Section 6. Finally, the conclusions are given in Section 7.

2. Multispectral imaging system

An algorithm combining MSI and a color reproduction technique is applied in this study. First, the spectra of the 24 Macbeth color checkers are measured by a spectrophotometer (Konica Minolta CS1000A) under the illumination of a particular uniform artificial light, and the reflection spectrum of each color checker in the visible light region (380 nm to 780 nm) is obtained. These spectra are arrayed as a matrix $[D]_{401\times24}$, the rows of which are the intensities at the wavelengths sampled at 1 nm intervals, and the columns of which correspond to the color checkers. By determining the eigensystem and applying principal component analysis (PCA), the six eigenvectors that make the greatest contribution are selected as the basis for the spectral estimation and arrayed as a matrix $[E]_{6\times401}$. The corresponding eigenvalues of these six eigenvectors, $[\alpha]_{6\times24}$, are determined as follows:

$[\alpha]^{T} = [D]^{T}\,\mathrm{pinv}[E]$ (1)
where “pinv” denotes the pseudoinverse of a matrix. Simultaneously, the color checkers are captured by a digital camera under the same illumination condition, with the output in sRGB format (JPEG image files) [14]. The red, green and blue values (from 0 to 255) of each color checker's image are obtained using computer programs and are then rescaled to $R_{sRGB}$, $G_{sRGB}$ and $B_{sRGB}$ (from 0 to 1). These RGB values can be transferred into CIE XYZ tristimulus values by the following formula [14]:
$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = [T]\begin{bmatrix} f(R_{sRGB}) \\ f(G_{sRGB}) \\ f(B_{sRGB}) \end{bmatrix}$ (2)
where
$[T] = \begin{bmatrix} 0.4124 & 0.3576 & 0.1805 \\ 0.2126 & 0.7152 & 0.0722 \\ 0.0193 & 0.1192 & 0.9505 \end{bmatrix}$ (3)
$f(n) = \begin{cases} \left(\dfrac{n+0.055}{1.055}\right)^{2.4}, & n > 0.04045 \\[4pt] \dfrac{n}{12.92}, & \text{otherwise} \end{cases}$ (4)
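As an illustration of Eqs. (2)–(4), a minimal Python sketch of the sRGB-to-XYZ conversion is given below; the function name and the assumption that the input has already been rescaled to the 0–1 range are ours, not part of the authors' program.

```python
# Minimal sketch of the sRGB-to-XYZ conversion of Eqs. (2)-(4).
import numpy as np

T = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])   # the matrix [T] of Eq. (3)

def srgb_to_xyz(rgb):
    """Convert an sRGB triplet in [0, 1] to CIE XYZ (white Y = 1)."""
    rgb = np.asarray(rgb, dtype=float)
    # Eq. (4): inverse sRGB gamma, linear segment below 0.04045, power law above.
    linear = np.where(rgb > 0.04045, ((rgb + 0.055) / 1.055) ** 2.4, rgb / 12.92)
    return T @ linear                       # Eq. (2)

print(srgb_to_xyz([1.0, 1.0, 1.0]))         # ~[0.9505, 1.0, 1.089], the D65 white point
```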

Because the reference white of the sRGB color space corresponds to the CIE standard illuminant D65, which differs from the artificial light used for measuring the spectra of the color checkers, these RGB values are corrected for chromatic adaptation by applying CMCCAT2000 [15]. Taking the accuracy of the spectral estimation into account, color correction of the camera is also required. The reflection spectra measured by the spectrophotometer are transferred into CIE XYZ tristimulus values using Eqs. (5) to (8), in which $S(\lambda)$ is the relative spectral power distribution of the artificial light, $R(\lambda)$ is the spectral reflectance of the respective color checker, and $\bar{x}(\lambda)$, $\bar{y}(\lambda)$ and $\bar{z}(\lambda)$ are the color matching functions:

$X = k\int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} S(\lambda)\,R(\lambda)\,\bar{x}(\lambda)\,d\lambda$ (5)
$Y = k\int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} S(\lambda)\,R(\lambda)\,\bar{y}(\lambda)\,d\lambda$ (6)
$Z = k\int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} S(\lambda)\,R(\lambda)\,\bar{z}(\lambda)\,d\lambda$ (7)
where
$k = 100\Big/\int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}} S(\lambda)\,\bar{y}(\lambda)\,d\lambda$ (8)
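The integration of Eqs. (5)–(8) can likewise be sketched as follows, assuming the illuminant spectrum, the reflectance, and the color matching functions are supplied as 401-point arrays sampled at 1 nm; all names are illustrative.

```python
# Sketch of Eqs. (5)-(8): CIE XYZ of a surface with reflectance R(lambda) under an
# illuminant S(lambda). All inputs are 401-point arrays sampled at 1 nm intervals.
import numpy as np

def spectrum_to_xyz(S, R, xbar, ybar, zbar):
    k = 100.0 / np.trapz(S * ybar)      # Eq. (8): normalizes the illuminant to Y = 100
    X = k * np.trapz(S * R * xbar)      # Eq. (5)
    Y = k * np.trapz(S * R * ybar)      # Eq. (6)
    Z = k * np.trapz(S * R * zbar)      # Eq. (7)
    return np.array([X, Y, Z])
```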

After the chromatic adaptation transform, the RGB values corresponding to the new XYZ values are calculated by the inverse procedures of Eqs. (2) to (4) and set as the standard matrix [A]. The color relationship between the spectrophotometer and the camera is found by applying third-order polynomial regression to the red, green and blue components separately, and the regression matrix [C] is determined as follows:

$[C] = [A]\,\mathrm{pinv}[F]$ (9)
where
$[F] = \left[1, R, G, B, RG, GB, BR, R^{2}, G^{2}, B^{2}, RGB, R^{3}, G^{3}, B^{3}, RG^{2}, RB^{2}, GR^{2}, GB^{2}, BR^{2}, BG^{2}\right]^{T}$ (10)
and $R$, $G$, and $B$ are the respective RGB values of the color checkers captured by the camera. The corrected RGB values are obtained from Eq. (11), in which [K] represents the RGB values captured from any image, expanded into the same format as the original matrix [F]. The calculation of the CMCCAT2000-corrected RGB values of the digital camera follows reference [15]. The corrected RGB values of the color checkers are then transferred into CIE XYZ values and arrayed as a matrix, $[\beta]$.
$[\mathrm{Corrected\ RGB}] = [C][K]$ (11)
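A hedged sketch of the polynomial color correction of Eqs. (9)–(11) follows. Here camera_rgb stands for the 24 checker triplets captured by the camera and target_rgb for the corresponding CMCCAT2000-corrected reference triplets (the matrix [A]); both names are ours.

```python
# Sketch of the third-order polynomial color correction of Eqs. (9)-(11).
import numpy as np

def poly_features(rgb):
    """Expand (N, 3) RGB triplets into the 20-term vector of Eq. (10); returns (20, N)."""
    R, G, B = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    return np.stack([np.ones_like(R), R, G, B, R*G, G*B, B*R,
                     R**2, G**2, B**2, R*G*B, R**3, G**3, B**3,
                     R*G**2, R*B**2, G*R**2, G*B**2, B*R**2, B*G**2])

def fit_correction(camera_rgb, target_rgb):
    """Eq. (9): [C] = [A] pinv[F], with [A] the reference RGB and [F] the expanded camera RGB."""
    F = poly_features(np.asarray(camera_rgb, dtype=float))
    return np.asarray(target_rgb, dtype=float).T @ np.linalg.pinv(F)

def apply_correction(C, rgb):
    """Eq. (11): corrected RGB = [C][K], with [K] any image pixels expanded like [F]."""
    return (C @ poly_features(np.asarray(rgb, dtype=float))).T
```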
Finally, a transform matrix [M] between the spectrophotometer and the camera is obtained as follows:

$[M] = [\alpha]\,\mathrm{pinv}[\beta]$ (12)

For each pixel in any image captured by the camera, the RGB values are multiplied by the regression matrix [C], and the corresponding XYZ values are calculated using Eqs. (2) to (4). The estimated spectrum in the visible light region (380 nm to 780 nm) is obtained by

$[\mathrm{Spectra}]_{380\text{–}780\,\mathrm{nm}} = [E][M]\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}$ (13)
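Equation (13), together with the illuminant substitution described in the next paragraph, can be sketched as follows. In this sketch E holds the six basis spectra as columns (401 × 6, i.e., the transpose of $[E]_{6\times401}$ as written above), M is the 6 × 3 transform matrix, and the function names are illustrative.

```python
# Sketch of Eq. (13) and the relighting step: estimate a 380-780 nm spectrum from
# corrected XYZ values, divide out the capture illuminant, and apply a new one.
import numpy as np

def estimate_spectrum(E, M, xyz):
    """Eq. (13): E is 401 x 6 (basis spectra as columns), M is 6 x 3, xyz has length 3."""
    return E @ (M @ np.asarray(xyz, dtype=float))

def relight(estimated_spectrum, S_capture, S_new, eps=1e-6):
    """Rescale the estimated spectrum from the capture illuminant to a new illuminant."""
    reflectance = estimated_spectrum / np.maximum(S_capture, eps)
    return reflectance * S_new
```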
The spectrum of the artificial light is also measured by the spectrophotometer. By dividing the estimated spectra by the spectrum of the artificial light and then multiplying the result by the spectrum of a new light, the color reproduction under different illumination conditions is achieved. To verify the color reproduction ability of this algorithm, the color differences between each color checker before and after the estimation are calculated. Color difference is calculated as follows:

  • 1. Transfer the corresponding CIE XYZ values of the two objects into the coordinates (L*, a*, b*) of the CIE 1976 color space, where
    $L^{*} = 116\,f\!\left(\frac{Y}{Y_n}\right) - 16$ (14)
    $a^{*} = 500\left[f\!\left(\frac{X}{X_n}\right) - f\!\left(\frac{Y}{Y_n}\right)\right]$ (15)
    $b^{*} = 200\left[f\!\left(\frac{Y}{Y_n}\right) - f\!\left(\frac{Z}{Z_n}\right)\right]$ (16)
    $f(n) = \begin{cases} n^{1/3}, & n > 0.008856 \\ 7.787\,n + 0.137931, & \text{otherwise} \end{cases}$ (17)
  • 2. Calculate the Euclidean distance ΔE*ab between the two points in the CIE 1976 color space (a short sketch of this calculation follows the list) by
    $\Delta E^{*}_{ab} = \sqrt{(\Delta L^{*})^{2} + (\Delta a^{*})^{2} + (\Delta b^{*})^{2}}$ (18)
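The sketch referenced in step 2 is given below; it is a minimal implementation of Eqs. (14)–(18) with illustrative names, not the authors' code.

```python
# Minimal sketch of the CIELAB color difference of Eqs. (14)-(18).
import numpy as np

def f(t):
    # Eq. (17): cube root above the 0.008856 threshold, linear segment below.
    return np.where(t > 0.008856, np.cbrt(t), 7.787 * t + 0.137931)

def xyz_to_lab(xyz, white):
    """Convert XYZ to CIELAB given the reference white (Xn, Yn, Zn)."""
    x, y, z = np.asarray(xyz, dtype=float) / np.asarray(white, dtype=float)
    L = 116.0 * f(y) - 16.0                 # Eq. (14)
    a = 500.0 * (f(x) - f(y))               # Eq. (15)
    b = 200.0 * (f(y) - f(z))               # Eq. (16)
    return np.array([L, a, b])

def delta_e_ab(lab1, lab2):
    return float(np.linalg.norm(np.subtract(lab1, lab2)))   # Eq. (18)
```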

The average color difference between each color checker before and after the estimation is about 2.51. In the CIELAB color space, a color difference (ΔE*ab) smaller than three is within the tolerance for color difference, i.e., it cannot be distinguished by eye, while a value between three and six is an acceptable color difference, i.e., it can be distinguished by some observers [16]. These results indicate that the algorithm has acceptable accuracy for color reproduction, so it is used to estimate the color representation of any image. The camera is used at a color depth of 24 bits, which supports true color for the three RGB channels. A lower color depth would distort the color image reproduction, whereas a deeper color depth (30 or 36 bits) would make the color reproduction calculation very time-consuming.

3. Images of oral herpangina caused by enterovirus infection

Oral diseases are a significant health problem worldwide, and researchers are constantly looking for methods of early diagnosis. The traditional biopsy method is invasive, and it requires a time-consuming and high-cost pathological analysis. In addition, only doctors with a high level of surgical skill and experience are able to perform the procedure. For oral diseases that require early diagnosis for effective treatment, biopsy is not the best option. Oral cancer, for example, has been studied by many researchers for the possibility of using illuminants in diagnosis. These studies are based on confirmed cases of oral cancer; studies of illumination for early detection are few. Previous studies show that LEDs perform well not only for exciting autofluorescence, but also for enhancing the color contrast of the microvasculature in narrowband imaging [17–19].

A commercial digital camera (Fujifilm F100fd) was used to capture oral images at different times. Figure 1 shows oral images of a 4-year-old child suffering from herpangina caused by enterovirus infection. Figure 1(a) shows a normal oral cavity image with no infection, and (b) to (d) show oral cavity images on Days One, Two and Three of the infection, respectively. All images were captured under the same conditions (light source: fluorescent lamp; illuminance: 250 lux). Arrows indicate the pathological changes in the oral cavity. It is easy for physicians to recognize from Images (c) and (d) that the child is infected with an enterovirus. However, more than 80% of doctors will see Image (b) when they diagnose the child, and from Image (b) it is not clear to most physicians whether the patient has caught a cold or an enterovirus. There are two ways of solving this problem. The first is that Image (b) will look like Image (c) under a special colored illuminant for the oral cavity. The second is to estimate the development of the disease from reflection-spectrum analysis of the mucosal lesions. This is the key issue of this study. From these images, the locations of lesions and their color changes can be clearly identified. By using the color reproduction system described above, the reflection spectrum of any site in these images can be estimated rapidly. In addition, by dividing the estimated reflection spectrum by the spectrum of the original light source and then multiplying the result by the spectrum of a new light source, the images were reproduced, and the changes in color difference between normal and inflamed tissues in the oral cavity on Day Two were calculated under several illuminants: a tungsten lamp, a fluorescent lamp, W-LEDs (white LEDs using YAG phosphors) and RGB W-LEDs (white LEDs made by combining red, green, and blue LEDs) with various correlated color temperatures (CCTs). Normal, inflamed and mucosal lesion regions are defined as shown in Fig. 2, in which “Lesion*” refers to an infected site, inflamed on Day Two, but having developed mucosal lesions on Day Three.

Fig. 1 Oral images of a 4-year-old child suffering from herpangina caused by enterovirus infection. (a) normal oral cavity with no infection. (b) Day One, (c) Day Two and (d) Day Three of oral cavity infection.

Fig. 2 Different regions of the oral cavity defined for analysis, in which “Lesion*” refers to an infected site, inflamed on Day Two, but having developed mucosal lesions on Day Three.

4. Oral cavity detection by using white light sources

Figure 3(a) shows the CRI of various illuminants at given CCTs, where W-LEDs denote typical white LEDs using YAG phosphors with various color ranks, and RGB W-LEDs denote white LEDs made by combining red, green, and blue LEDs with appropriate intensity ratios [20]. The CRI is calculated by the following procedure:

  • 1. Find the chromaticity coordinates (u,v) of the test source in the CIE 1960 color space.
  • 2. Determine the CCT of the test source [21].
  • 3. If the CCT is under 5000K, use a black body with the same CCT as the reference illuminant, otherwise use CIE standard illuminant D with the same CCT.
  • 4. Multiply the spectra of the test source and reference illuminant by the reflection spectra of the standard color samples, then calculate the corresponding CIE XYZ values.
  • 5. Apply the chromatic adaptation using the von Kries transform, then transfer these CIE XYZ values into the coordinates (U*,V*,W*) in the CIE 1964 color space [22].
  • 6. Calculate the Euclidean distance ΔEi between the two points in the CIE 1964 color space of each sample illuminated by the test source and reference illuminant separately.
  • 7. By using the formula Ri = 100 − 4.6ΔEi, calculate the special CRI of each color sample.
  • 8. The general CRI is obtained by calculating the arithmetic mean of the special CRIs.
CIE XYZ values are transferred into the chromaticity coordinates (u, v) of the CIE 1960 color space as follows:
$u = \dfrac{4X}{X + 15Y + 3Z}$ (19)
$v = \dfrac{6Y}{X + 15Y + 3Z}$ (20)
The transformation between the CIE 1960 color space and the coordinates (U*, V*, W*) of the CIE 1964 color space is as follows:
$U^{*} = 13W^{*}(u - u_{0})$ (21)
$V^{*} = 13W^{*}(v - v_{0})$ (22)
$W^{*} = 25Y^{1/3} - 17$ (23)
where u0 and v0 are the CIE 1960 UCS chromaticity coordinates of a perfect reflecting diffuser.
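A partial sketch of this procedure, covering the transforms of Eqs. (19)–(23) together with the color-difference and special-index calculations of steps 6 and 7, is given below; the von Kries adaptation and the reflectance data of the CIE test color samples are omitted, and all names are illustrative.

```python
# Partial sketch of the CRI transforms: CIE 1960 (u, v), CIE 1964 (U*, V*, W*),
# and the special color rendering index Ri = 100 - 4.6*dEi.
import numpy as np

def xyz_to_uv(X, Y, Z):
    denom = X + 15.0 * Y + 3.0 * Z
    return 4.0 * X / denom, 6.0 * Y / denom          # Eqs. (19) and (20)

def to_uvw(u, v, Y, u0, v0):
    W = 25.0 * Y ** (1.0 / 3.0) - 17.0               # Eq. (23)
    return np.array([13.0 * W * (u - u0),            # Eq. (21)
                     13.0 * W * (v - v0),            # Eq. (22)
                     W])

def special_cri(uvw_test, uvw_ref):
    dE = np.linalg.norm(np.subtract(uvw_test, uvw_ref))   # step 6, Eq. (24) below
    return 100.0 - 4.6 * dE                               # step 7
```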

Fig. 3 (a) CRI and (b) average color difference at given CCTs under various illuminants.

The color differences ΔEi of the chromaticity coordinates (U*,V*,W*) between the test sources and the reference illuminants for each sample are calculated as follows:

$\Delta E_{i} = \sqrt{(\Delta U^{*})^{2} + (\Delta V^{*})^{2} + (\Delta W^{*})^{2}}$ (24)

The average color differences between normal and inflamed tissues in the oral cavities of ten patients under these illuminants are shown in Fig. 3(b). The results show that the CRIs of the illuminants have no direct relationship with color enhancement ability in the oral cavity, particularly for RGB W-LEDs. Furthermore, it can be seen that the maximum color difference appears when using the RGB W-LED with a CCT of about 4000 K to 5000 K. Under illumination with a CCT in this range, the image shows a color appearance between red and yellow. This result can be explained by the effect called crispening in color science [23]. In the reddish-colored oral cavity, when both normal and inflamed tissues are illuminated by a light source whose color lies between their colors, the color contrast will be enhanced. When the CCT of the light source is below 4000 K, a red-oversaturated image makes it difficult to differentiate normal and inflamed areas, whereas with a CCT above 5000 K the image becomes bluer, leading to the same result. It is evident that the color enhancement abilities of RGB W-LEDs are much better than those of the other light sources. Therefore, RGB W-LEDs are good options for optimal illumination for oral cavity detection. The color image reproductions of an oral cavity with various RGB W-LED CCTs are shown in Fig. 4.

Fig. 4 Color image reproductions of an oral cavity on Day Two under RGB W-LED light sources with different CCTs.

Figure 5 shows the simulated images of an oral cavity under the illumination of an RGB W-LED with a CCT of 6525 K (CRI: 60), a W-LED with a CCT of 7113 K (CRI: 79), CIE illuminant C (CRI: 98), and CIE D65 (CRI: 100), (a) to (d) respectively. The color differences between Fig. 5(a) and the others are not as great as the differences in their CRIs would suggest. This result shows that even when white lights with lower CRIs are used as illumination, the resulting visible color contrasts are not necessarily equally low, especially in certain environments.

Fig. 5 The simulated images of oral cavity under the illumination of (a) RGB W-LED with CCT of 6525K (CRI:60), (b) W-LED with CCT of 7113K (CRI:79), (c) CIE C light (CRI:98), and (d) CIE D65 (CRI:100).

5. A survey of optimal illumination with color difference enhancement

Compared with the other light sources tested, RGB W-LEDs are good options for optimal illumination for oral cavity detection. However, as indicated in Section 4, white light might not be the best choice, so multicolored LEDs should be tested in the same way, and a suitable mix of colored LEDs can be expected to be found. A computer program has been written to handle the complex computation process. Spectra of a commercial RGB LED, with peak wavelengths at 460 nm, 530 nm, and 617 nm for blue, green, and red respectively, are used to illuminate the oral cavity, and the color difference is calculated while the intensity ratio of each peak is varied. By using this method, the color enhancement ability of every single point in the CIE 1931 chromaticity diagram is calculated, and the distribution of the maximum color difference is found. After the computation of color difference for all illuminants, the maximum, minimum and median values are determined and denoted as ΔE*ab,max, ΔE*ab,min, and ΔE*ab,mid, respectively. Finally, a color difference distribution diagram is plotted by the program, which represents the average color differences as RGB digital counts (Rd, Gd, Bd) computed as follows:

$R_{d} = 255 - \left(\Delta E^{*}_{ab,\max} - \Delta E^{*}_{ab}\right)\left[255/\left(\Delta E^{*}_{ab,\max} - \Delta E^{*}_{ab,\min}\right)\right]$ (25)
$G_{d} = 255 - \left|\Delta E^{*}_{ab} - \Delta E^{*}_{ab,\mathrm{mid}}\right|\left[255/\left(\Delta E^{*}_{ab,\max} - \Delta E^{*}_{ab,\mathrm{mid}}\right)\right]$ (26)
$B_{d} = 255 - \left(\Delta E^{*}_{ab} - \Delta E^{*}_{ab,\min}\right)\left[255/\left(\Delta E^{*}_{ab,\max} - \Delta E^{*}_{ab,\min}\right)\right]$ (27)
where Rd, Gd, and Bd are determined by the degree of difference between a color difference ΔE*ab and ΔE*ab,max, ΔE*ab,mid, and ΔE*ab,min, respectively.
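The mapping of Eqs. (25)–(27) can be sketched as below; the clipping to the 0–255 range and the function name are ours.

```python
# Sketch of the pseudo-color mapping of Eqs. (25)-(27): the largest color
# differences appear red, values near the median green, and the smallest blue.
import numpy as np

def difference_to_rgb(dE, dE_min, dE_mid, dE_max):
    dE = np.asarray(dE, dtype=float)
    Rd = 255.0 - (dE_max - dE) * (255.0 / (dE_max - dE_min))          # Eq. (25)
    Gd = 255.0 - np.abs(dE - dE_mid) * (255.0 / (dE_max - dE_mid))    # Eq. (26)
    Bd = 255.0 - (dE - dE_min) * (255.0 / (dE_max - dE_min))          # Eq. (27)
    return np.clip(np.stack([Rd, Gd, Bd], axis=-1), 0.0, 255.0)       # digital counts 0-255
```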

The program is applied to the regions selected in Fig. 2. By changing the intensity ratio of the red, green, and blue peaks in intervals of 0.01, creating about a million different light-source spectra, the color differences between the normal, inflamed and impending-lesion (Lesion*) regions are calculated (a sketch of this sweep is given after this paragraph). Figure 6(a) shows the color difference distribution diagram representing the average color difference between normal and inflamed tissues in the oral cavities of ten patients under the illumination of commercial RGB LEDs, in which “x” and “y” are the chromaticity coordinates of the illuminant in the CIE 1931 standard colorimetric system. The color bar at the right side of the diagram refers to the color of the corresponding average color difference calculated using Eqs. (25) to (27); “R”, “G”, and “B” denote the chromaticity coordinates of the red, green, and blue LEDs, respectively. The chromaticity coordinates of any mixing ratio of these LEDs are bounded by the triangle formed by these three points. A large color difference appears red, a small one blue. It can be seen that the illuminant with a specific intensity ratio of red, green, and blue LEDs at (x, y) = (0.3668, 0.5612) has the best color enhancement ability, producing a color difference of 30.5532, as shown in the inset at the top right of the diagram. Figure 6(b) shows the average color difference distribution diagram of the inflamed and impending-lesion (Lesion*) regions, where the distribution of the maximum values appears at a similar position to that in Fig. 6(a), and the color difference is 9.6969. The result shows that blue LEDs are not necessary, owing to their negligible contribution; in other words, an illuminant consisting of only red and green LEDs has the best color enhancement ability. Similarly, about 2.5 million different light-source spectra are calculated for RGBY LEDs by varying the intensity ratio in intervals of 0.025. In these LEDs, a simulated peak wavelength at 586 nm for yellow is added [6]. Figures 6(c) and 6(d) show the results from RGBY LEDs corresponding to Figs. 6(a) and 6(b), respectively. The maximum color difference appears at (x, y) = (0.3651, 0.5625), and the values of the color differences are 30.5444 and 9.6918 respectively, which are close to the results for the RGB LEDs. These results show that, of the color components tested, only red and green make significant contributions to the color enhancement of the oral cavity.
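A hedged sketch of the exhaustive sweep referenced above follows. Here led_spectra is assumed to be a 3 × 401 array holding the individual red, green, and blue LED spectra, and delta_e_fn stands for any routine that returns the average ΔE*ab between the selected regions under a given illuminant spectrum; both are placeholders, not the authors' code.

```python
# Sketch of the intensity-ratio sweep: vary the red, green and blue peak weights
# in steps of 0.01 (about a million mixtures) and keep the mixture that maximizes
# the average color difference between the selected regions.
import itertools
import numpy as np

def sweep_rgb_ratios(led_spectra, delta_e_fn, step=0.01):
    ratios = np.arange(0.0, 1.0 + 1e-9, step)
    best_mix, best_dE = None, -np.inf
    for r, g, b in itertools.product(ratios, repeat=3):
        if r + g + b == 0.0:
            continue                                    # skip the all-off mixture
        spectrum = r * led_spectra[0] + g * led_spectra[1] + b * led_spectra[2]
        dE = delta_e_fn(spectrum)
        if dE > best_dE:
            best_mix, best_dE = (r, g, b), dE
    return best_mix, best_dE
```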

Fig. 6 Color difference distribution diagrams of normal, inflamed and impending-lesion (Lesion*) regions in the oral cavities of ten patients under the illumination of different types of LEDs, (a) and (b) are illuminated by commercial RGB LEDs, and (c) to (f) by RGBY LEDs. The left-side figures, (a), (c), and (e), are the results from the normal and inflamed tissues, and the right-side figures, (b), (d), and (f), are the results from the inflamed and impending-lesion (Lesion*) regions.

Different peak wavelengths are also taken into account. Figures 6(e) and 6(f) show the color difference distribution diagrams of the normal, inflamed and impending-lesion (Lesion*) regions under the illumination of simulated RGBY LEDs with peak wavelengths at 627 nm, 512 nm, 447 nm, and 573 nm, respectively. The maximum average color differences of these two diagrams appear at (x, y) = (0.2567, 0.5635) and (x, y) = (0.2543, 0.5649), with values of 28.3046 and 9.7335 respectively. These results indicate that the average color difference distribution shifts with the peak wavelengths of the LEDs, and the maximum average color difference is determined by the peak wavelengths of the red and green LEDs. All of this leads to the same conclusion: the chromaticity coordinates of the optimal illumination lie at the mixture of the red and green LEDs, regardless of whether the peak wavelengths of the LEDs are changed. That green light can enhance the color contrast in the reddish oral cavity, owing especially to the absorption and scattering of light by hemoglobin, has been proposed in previous literature [19]. That a small red component can increase this enhancement further can be explained by the crispening effect mentioned above [23].

Figures 7(a) to 7(c) show the simulated images of an oral cavity under the illumination of a fluorescent lamp, a blue LED, and a red-green LED light source, respectively. The white circles highlight the inflamed and impending-lesion (Lesion*) tissues in the oral cavity. Under the illumination of the fluorescent lamp, the color difference between the normal and inflamed tissues in the oral cavity is 21.5732, and between the inflamed and impending-lesion (Lesion*) tissues it is 5.3232. It is obvious from visual inspection that the color enhancement ability of the red-green LED in Fig. 7(c) is better than that of the blue LED in Fig. 7(b), coinciding with the results shown in Figs. 6(a) and 6(c).

Fig. 7 The simulated images of an oral cavity under the illumination of (a) a fluorescent lamp, (b) a blue LED, and (c) a red-green LED light source.

These simulations demonstrate that there exists an optimal illuminant which can enhance the color difference between normal and abnormal tissues in the oral cavity. Under this optimal illuminant, the red-green LEDs, the maximum color difference between normal and inflamed tissues in an oral cavity can be enhanced by 42%, from 21.5732 to 30.5532, and that between inflamed and impending-lesion (Lesion*) tissues by 87%, from 5.3232 to 9.6969, compared with the now commonly used fluorescent lamp.

To test whether the position of the maximum color difference changes with a different observed object, this procedure is applied to an image of a hand with inflammation caused by an allergy, as shown in Fig. 8(a), in which the finger joints (white circle areas) are redder than the normal tissue. Using the same process, an optimal illuminant is found, and its color difference distribution diagram is shown in Fig. 8(b). The result shows that the position of the maximum color difference shifts to the region between green light and white light.

Fig. 8 (a) An image of inflammation of a hand caused by allergies and (b) its color difference distribution diagram.

6. MSI control in the GUI interface

The computer program was developed in Microsoft Visual Studio 2010 Express to integrate modular functions into a seamless interface which allows the user to execute a range of spectral estimation and color reproduction routines [24]. The modularity of the program’s underlying routines enables researchers to add or remove functions as their experimental requirements demand. The following provides a short summary, referenced to the GUI panels of the program’s modular functions indicated in Fig. 9. The parts of the program are:

Fig. 9 The computer program written for spectral estimation and color reproduction in this study. The parts of the program are: (a) input of the original image, (b) selection of color difference comparison areas, (c) replacement of light sources, (d) spectral power distribution of the light sources, (e) the simulation results, and (f) calculation of color differences. The characteristics of the light source such as color rendering properties, correlated color temperature (CCT) and white point are also shown in the interface. When the simulation process is complete, the estimated spectrum can be viewed by clicking the button “Spectrum” as shown at the top of Part (b). The camera settings can change the color correction formula and its white balance mode, which can make the simulation more accurate. http://140.123.76.138/hcw_lab/colorfactoryen.zip.

  • (a) Input of the original image: In this subroutine, the original image can be loaded in Joint Photographic Experts Group (JPEG), Graphics Interchange Format (GIF), Tagged Image File Format (TIFF), Bitmap (BMP), and other commonly used photo formats. The area of interest is selected and enlarged from the loaded image.
  • (b) Selection of color difference comparison areas: In the upper left window, two areas of the loaded image are selected for the color difference calculation. These two areas, selected using a computer mouse, are shown in the “Area Comparison” panel. The “Exclude” option specifies parts of the two areas to be excluded from the calculation.
  • (c) Replacement of light sources: In this part, a new illuminant is selected for color image reproduction, which can be either built into the program or input by the user. The color rendering values of the newly specified illuminant, such as the color rendering index (CRI), color quality scale (CQS), and feeling of contrast index (FCI), are displayed on the computer monitor. The model of the camera used to capture the loaded image, its white balance, and the illuminance of the lighting can be chosen by the user. The camera settings can be used to change the color correction formula and the white balance mode, which can make the simulation more accurate.
  • (d) Spectral power distribution of the light sources: The spectral power distribution of an illuminant is shown at the bottom right.
  • (e) Simulation results: The color image reproduction of a new illuminant is shown in the upper right window by clicking on “Convert” at the top, just to the left of the window.
  • (f) Calculation of color differences: When the simulation process is complete, the estimated spectrum of the image can be obtained by clicking the “Spectrum” button shown at the top of Part (b). “Color Difference” calculates the color difference, in the color reproduction image, between the areas selected in Part (b). “Two Point Comparison” calculates the color difference between the original and the color reproduction images for a selected area.

7. Conclusion

In this study, RGB LEDs are found to be better than fluorescent light for oral cavity detection of inflamed and impending-lesion tissues. RGB W-LEDs have optimal color enhancement ability in the CCT range of 4000 K to 5000 K. When multicolored LEDs are used as illumination, an illuminant consisting of red and green LEDs in a specific ratio produces the best enhancement. Comparing normal and inflamed tissues, the color difference is increased from the 21.5732 of a common fluorescent lamp to 30.5532, an enhancement of 42%. Comparing inflamed and impending-lesion tissues, the color difference is increased from 5.3232 to 9.6969, an enhancement of 87%. These enhancements provide faster and more effective diagnosis by doctors. Future research will determine whether the algorithm can be used to develop optimal lighting for various other applications.

Acknowledgment

This research was supported by the National Science Council, The Republic of China (Taiwan), under Grant No. NSC 100-2221-E-194-043.

References and links

1. D. Corell, H. Ou, C. Dam-Hansen, P. M. Petersen, and D. Friis, “Light Emitting Diodes as an alternative ambient illumination source in photolithography environment,” Opt. Express 17(20), 17293–17302 (2009). [CrossRef]   [PubMed]  

2. C. H. Tsuei, W. S. Sun, and C. C. Kuo, “Hybrid sunlight/LED illumination and renewable solar energy saving concepts for indoor lighting,” Opt. Express 18(S4), A640–A653 (2010). [CrossRef]   [PubMed]

3. M. Rahman, P. Chaturvedi, A. M. Gillenwater, and R. Richards-Kortum, “Low-cost, multimodal, portable screening system for early detection of oral cancer,” J. Biomed. Opt. 13(3), 030502 (2008). [CrossRef]   [PubMed]  

4. CIE 17.4–1987, “International lighting vocabulary,” No.845–02–59 (1987).

5. CIE 13.3–1995, “Method of measuring and specifying colour rendering properties of light sources,” (1995).

6. Y. Ohno, “Spectral design considerations for white LED color rendering,” Opt. Eng. 44(11), 111302 (2005). [CrossRef]  

7. W. A. Thornton, “Color-Discrimination Index,” J. Opt. Soc. Am. 62(2), 191–194 (1972). [CrossRef]   [PubMed]  

8. K. Hashimoto, T. Yano, M. Shimizu, and Y. Nayatani, “New method for specifying color-rendering properties of light sources based on feeling of contrast,” Color Res. Appl. 32(5), 361–371 (2007). [CrossRef]  

9. W. Davis and Y. Ohno, “Color quality scale,” Opt. Eng. 49(3), 033602 (2010). [CrossRef]  

10. J. A. Worthey, “Color rendering: Asking the question,” Color Res. Appl. 28(6), 403–412 (2003). [CrossRef]  

11. H. C. Wang, Y. T. Chen, J. T. Lin, C. P. Chiang, and F. H. Cheng, “Enhanced visualization of oral cavity for early inflamed tissue detection,” Opt. Express 18(11), 11800–11809 (2010). [CrossRef]   [PubMed]  

12. M. B. Bouchard, B. R. Chen, S. A. Burgess, and E. M. C. Hillman, “Ultra-fast multispectral optical imaging of cortical oxygenation, blood flow, and intracellular calcium dynamics,” Opt. Express 17(18), 15670–15678 (2009). [CrossRef]   [PubMed]  

13. M. Yamaguchi, “Medical application of a color reproduction system with a multispectral camera,” Dig. Color Imaging Biomed., 33–38 (2001).

14. M. Anderson, R. Motta, S. Chandrasekar, and M. Stokes, “Proposal for a standard default color space for the internet: sRGB,” IS&T/SID 4th Color Imaging Conference Proc. 238 (1996), also see http://www.w3.org/Graphics/Color/sRGB.html

15. C. J. Li, M. R. Luo, B. Rigg, and R. W. G. Hunt, “CMC 2000 chromatic adaptation transform: CMCCAT2000,” Color Res. Appl. 27(1), 49–58 (2002). [CrossRef]  

16. P. Green and L. MacDonald, Colour Engineering: Achieving Device Independent Colour (Wiley, 2002).

17. E. Svistun, U. Utzinger, R. Jacob, R. K. Rebecca, A. El-Naggar, and A. Gillenwater, “Optimal visual perception and detection of oral cavity neoplasia reflectance and fluorescence,” Biomedical Topical Meeting TuA3 (2002).

18. P. M. Lane, T. Gilhuly, P. Whitehead, H. Zeng, C. F. Poh, S. Ng, P. M. Williams, L. Zhang, M. P. Rosin, and C. E. MacAulay, “Simple device for the direct visualization of oral-cavity tissue fluorescence,” J. Biomed. Opt. 11(2), 024006 (2006). [CrossRef]   [PubMed]  

19. D. Roblyer, C. Kurachi, A. El-Naggar, M. D. Williams, A. M. Gillenwater and R. Richards-Kortum, “Multispectral and hyperspectral in vivo imaging of the oral cavity for neoplastic tissue detection,” Biomedical Optics BTuD1 (2008).

20. C. S. McCamy, “Correlated color temperature as an explicit function of chromaticity coordinates,” Color Res. Appl. 17(2), 142–144 (1992). [CrossRef]  

21. A. Žukauskas, R. Vaicekauskas, and M. Shur, “Solid-state lamps with optimized color saturation ability,” Opt. Express 18(3), 2287–2295 (2010). [CrossRef]   [PubMed]  

22. J. von Kries, Chromatic Adaptation, (Festschrift der Albrecht-Ludwigs-Universität 1902). Translation from D.L. MacAdam, Colorimetry-Fundamentals (SPIE Milestone Series MS 77 1993).

23. M. D. Fairchild, Color Appearance Models (John Wiley & Sons, 2005) p. 114.

24. http://msdn.microsoft.com/zh-tw/express/aa718373.
