
Flame front tracking by laser induced fluorescence spectroscopy and advanced image analysis

Open Access

Abstract

This paper presents advanced image analysis methods for extracting information from high speed Planar Laser Induced Fluorescence (PLIF) data obtained from turbulent flames. The application of non-linear anisotropic diffusion filtering and of Active Contour Models (Snakes) is described to isolate flame boundaries. In a subsequent step, the detected flame boundaries are tracked in time using a frequency domain contour interpolation scheme. The implementations of the methods are described and possible applications of the techniques are discussed.

©2001 Optical Society of America

1 Introduction

The influence of fluid physics on the development of flames is of fundamental importance for the design of more efficient and environmentally friendly combustion devices [1]. Turbulence, for example, has a major effect on the shape of the flame surface, which is defined as the interface between reacted and unreacted components of the combustible mixture. Turbulence can wrinkle and stretch this surface, thus steepening concentration gradients and increasing the rate of diffusive mixing across this interface. The higher efficiency of mixing afforded by turbulence in turn leads to an increase in the burning rate of the mixture. This is often a desirable feature: turbulent mixing, for example, has a direct effect on the efficiency of an automotive engine. However, excessive turbulence can wrinkle and stretch the flame boundary to such an extent that molecular transport starts to compete with chemical reactions. At very high strain, the heat released by reactions does not suffice to support the strong gradients, and as a result the flame may be extinguished. A detailed understanding of these processes is of both fundamental and practical interest and, despite much progress over recent years, is far from complete.

Recently, useful tools have emerged which allow the study of flame front development in a direct way by comparing experimental data to numerical simulations. One of the most widely used and advanced experimental techniques in this respect is Planar Laser Induced Fluorescence (PLIF) imaging which allows images of thin cross-sectional slices through the flame to be obtained. In particular, by imaging flame generated chemical radicals which are directly produced in the reaction zone, PLIF can provide very accurate data on the flame front topology. Such data can be used to provide input for advanced theoretical descriptions of the same processes. Direct numerical simulations (DNS) and Large eddy simulations (LES) are particularly promising approaches in this respect, since data generated by these techniques is directly comparable to data provided by PLIF. Model assumptions which have to be made in these two approaches can thus be tested or developed based on PLIF imaging data.

In the past, PLIF flame visualizations have mostly been limited to two dimensions and/or single events in time, which do not resolve the true dynamic nature of turbulent reactive flows. Recently, however, the application of high repetition rate PLIF imaging has been reported [2]. This technique has been applied to the quantitative study of spark ignition and to detailed comparisons with model simulations [3]. Time resolved imaging of the type described in [2, 3] offers the possibility to track the velocity and the development of flame front topology in time, thus offering a more complete characterization of the process. In a variant of the technique, where the flame is rapidly sliced in space, even three-dimensional reconstructions of the reaction front can be made [4].

The resulting image data place large demands on image processing and data reduction techniques. A number of image processing techniques have been used for studying structures and velocities in flame images obtained by PLIF. Widely used tools for segmenting flame fronts include simple and adaptive thresholding. To set the threshold values, some authors resort to studying the intensity histograms of the images [5]. These rather simple approaches do not work well in complex cases, often resulting in loss of detail or the appearance of holes in the segmented structures. Others have used spatial image gradients rather than intensities for setting the threshold values [6]. Nonetheless, this still suffers from the problem of determining the correct thresholds and may also result in contour gaps and in noise falsely detected as signal.

Techniques for studying the properties of the segmented flame contours or surfaces include the use of fractal geometry concepts for describing the wrinkled flame fronts [7, 8]. These methods have been used to provide estimates of turbulent flame velocities [9]; nevertheless, they introduce difficulties such as the determination of the correct fractal parameters, for example the fractal dimension.

In this paper we focus on the extraction of flame topology data from high repetition rate PLIF image sequences using advanced image processing and image analysis methods. The image processing part involves a non-linear anisotropic diffusion filtering approach for image enhancement [10]. The image analysis part includes flame segmentation using Active Contour Models [11] for the identification of flame front boundaries. The contours evaluated for the discrete images in a measurement series are then interpolated in time using a frequency domain shape representation. These interpolations can be applied to time sampled data to provide information on flame front velocities, as well as to spatially sampled data to produce 3D renderings of reaction surfaces.

2 Experimental setup

All results presented here are based on PLIF images of OH radicals obtained from premixed, low turbulence flames (turbulent Reynolds number Re_T < 500). The principle of PLIF is to form a light sheet from a laser beam using suitable optics and to pass this sheet through the flame. If the wavelength is tuned to match a molecular resonance line of OH, then light from the sheet is inelastically scattered by the OH radicals present in the interaction region (see Fig. 1). This scattered light (fluorescence) is captured at right angles using a camera which is focused to image the illuminated flame cross-section. The local intensity in the recorded image is a function of the local OH concentration in the flame. Since OH is formed in the reaction zone of the flame and is rapidly quenched by cold unreacted gases, it is a good indicator of the flame front position in flames where the reaction zone is thin. In hot combusted gases, OH is removed more slowly, and a certain equilibrium concentration prevails depending on local temperatures and burnt gas composition. The purpose of the image processing techniques presented here is to extract the flame boundary marked by OH concentrations from such data as accurately as possible and to describe its dynamics as clearly as possible.

Detailed information about the experimental set-up used here is given in [3, 12]. Briefly, a multiple high power Nd:YAG laser cluster was used to pump a frequency doubled dye laser providing tunable radiation around 283 nm. The laser wavelength coincided with a temperature-insensitive line of OH in the A²Σ⁺–X²Π electronic band. The laser output was passed through sheet forming optics before traversing the combustion system of interest. A combustion cell featuring two opposing tungsten electrodes was used to ignite mixtures of methane and air. Controlled degrees of turbulence could be imposed on the mixture via four high speed rotating fans.

Fig. 1. Schematic setup for time resolved PLIF of turbulent spark ignition.

The images were captured by a high speed digital framing camera (Imacon 468, DRS Hadland, UK). Essentially, this camera consists of 8 independently gated CCD sensors aligned along a common optical axis. However, a maximum of 4 sequential images were captured in the experiments reported here to follow single ignition events (time separations typically 1 ms).

3 Image processing stage

As described above, our image data contained sequences of PLIF images reflecting OH concentrations of turbulent flames. Each sequence comprised 4 frames (corresponding to the 4 laser pulses) which were first processed to correct for some experimental factors and then enhanced to reduce existing noise.

3.1 Preprocessing of raw image data

Three steps are performed to correct experimental defects found in the raw recorded images. First, geometrical transformations are used to correct small shifts and rotations between images recorded with different CCD channels [3]. These distortions and displacements, which cause the imaged regions not to overlap perfectly, result from the imperfect alignment of the individual CCD modules with respect to the optical axis. They are corrected by means of a geometrical transform that maps the images from the different CCDs onto a common reference image. Second, the raw data images are corrected for background levels, which are determined by capturing an image without any illumination and subtracting it from the data image. Finally, laser profile referencing is applied: the laser beam profile is monitored online and saved with each PLIF image, allowing subsequent compensation for beam profile non-uniformities and shot-to-shot fluctuations. An example of preprocessed raw data is shown in Fig. 2.
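A minimal sketch of this preprocessing chain is given below. It assumes that the per-channel registration transform has already been estimated from calibration images and that a dark frame and a laser sheet profile are stored with each exposure; the function and variable names are illustrative and are not those of the original implementation (OpenCV is used here only as a convenient way to apply the geometric mapping).

```python
import numpy as np
import cv2  # OpenCV, assumed available for the geometric mapping


def preprocess_frame(raw, dark, sheet_profile, warp_matrix):
    """Correct one CCD channel: register, subtract background, normalize laser profile."""
    # Map this CCD channel onto the common reference frame (2x3 affine transform
    # estimated beforehand from calibration images).
    registered = cv2.warpAffine(raw.astype(np.float32), warp_matrix,
                                (raw.shape[1], raw.shape[0]),
                                flags=cv2.INTER_LINEAR)
    # Remove the background level recorded without laser illumination.
    corrected = registered - dark.astype(np.float32)
    # Compensate for the vertical laser sheet profile (one value per image row),
    # guarding against division by very small values.
    profile = np.clip(sheet_profile.astype(np.float32), 1e-6, None)
    return corrected / profile[:, np.newaxis]
```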

Fig. 2. An example showing a typical experimental image sequence. (a)-(d) Four images (381×291 pixels, 0.1408 mm/pixel) captured at 1.7 ms time increments.

3.2 Noise reduction using non-linear diffusion filtering

To improve the signal to noise ratios and simplify the segmentation of the OH boundaries by image smoothing, the images were processed using non-linear anisotropic diffusion filtering. The method is based on an original approach formulated by Perona and Malik [10]. The principle of the method is to smooth out noise locally by diffusive flow whilst preventing flow across physically important boundaries. By a proper choice of the diffusion kernel, object boundaries may be enhanced and physical gradients sharpened thus simplifying subsequent object segmentation. In our case, we filter the imaged data using the equation

\[ \partial_t I = \operatorname{div}\!\left( g\!\left( \left| \nabla (G_\sigma * I) \right| \right) \nabla I \right) \]

where I represents the intensity of the image under consideration and g(|∇(Gσ*I)|) represents a locally adaptive diffusive strength. The latter is a decreasing function of the gradient magnitude of the image (after smoothing with a Gaussian kernel Gσ of width σ, which is done for stability reasons [13]), so that diffusion is suppressed across strong edges. The numerical implementation of this scheme is described in [14]. An example is shown in Fig. 3 and in the related animation file nldf.mov.
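The following is a minimal sketch of one possible explicit implementation of this scheme, assuming the classical Perona-Malik conductivity g(s) = 1/(1 + s²/K²) and a simple nearest-neighbour finite-difference discretization; the contrast parameter K, time step and iteration count are illustrative choices and not necessarily those used in [14].

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def nonlinear_diffusion(img, n_iter=30, dt=0.15, K=10.0, sigma=1.0):
    """Regularized Perona-Malik diffusion (explicit scheme, 4-neighbour stencil)."""
    I = img.astype(np.float64).copy()
    for _ in range(n_iter):
        # Diffusivity from the gradient of the Gaussian-smoothed image (Catte et al. [13]).
        Is = gaussian_filter(I, sigma)
        gy, gx = np.gradient(Is)
        g = 1.0 / (1.0 + (gx**2 + gy**2) / K**2)   # small across strong edges
        # Nearest-neighbour differences of the evolving image.
        dN = np.roll(I, -1, axis=0) - I
        dS = np.roll(I, 1, axis=0) - I
        dE = np.roll(I, -1, axis=1) - I
        dW = np.roll(I, 1, axis=1) - I
        # Conductances sampled at the neighbouring pixels, as in Perona-Malik [10].
        gN, gS = np.roll(g, -1, axis=0), np.roll(g, 1, axis=0)
        gE, gW = np.roll(g, -1, axis=1), np.roll(g, 1, axis=1)
        # Explicit update approximating  dI/dt = div(g * grad I).
        I += dt * (gN * dN + gS * dS + gE * dE + gW * dW)
    return I
```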

Fig. 3. Sample image corresponding to a single shot PLIF image of OH. To the left the raw data is shown. The same data is shown to the right after 30 iterations using the non-linear diffusion algorithm described in the text. In the lower section of the figure, cross-sectional profiles corresponding to the horizontal green line in the images are shown. Contours are clearly enhanced and local noise is efficiently filtered out. An animation of the diffusion process is presented in the related animation file nldf.mov (202 KB).

4 Image analysis stage

In order to segment the flames in our measurement series we use Active Contour Models which results in the identification of flame front boundaries. These contours are then interpolated in time and the interpolations used to estimate the flame front velocities.

4.1 Segmentation using Active Contour Models (ACM)

Active Contour Models (also known as Snakes) refer to an advanced segmentation technique that guarantees the continuity and smoothness of the detected contours or boundaries [11]. In this technique a contour model (snake) is initialized on the image and then deformed in a way that minimizes its total energy by, for example, the application of forces on the contour in an iterative manner. The energy minimization process corresponds to moving the snake towards desired image features (e.g. edges) while maintaining its smoothness.

A snake in the continuous spatial domain is represented by a 2D parametric contour curve v (s)=(x (s), y (s)) where s ∊ [0, 1]. In a discrete setting the snake is defined as a set of N nodes, v i (n)=(xi (n), yi (n)) where xi (n) and yi (n) are the x- and y-coordinates of node n at iteration i, n=1, 2, …,N. Various forces act on the nodes yielding the following equation for updating their position

\[ v_i(n) = v_{i-1}(n) + w_1 F_i^{\mathrm{tens}}(n) + w_2 F_i^{\mathrm{flex}}(n) + w_3 F_i^{\mathrm{ext}}(n) + w_4 F_i^{\mathrm{inf}}(n) \]

where w_1, w_2, w_3 and w_4 are scalar factors weighting the different forces that we incorporate to deform the snake during its iterations [15]. F_i^tens(n) is a tensile force, which resists stretching of the snake, acting on node n at iteration i and is given in discrete form as

\[ F_i^{\mathrm{tens}}(n) = 2 v_i(n) - v_i(n-1) - v_i(n+1) \]

F_i^flex(n) is a flexural force which resists bending of the snake and is given as

\[ F_i^{\mathrm{flex}}(n) = 2 F_i^{\mathrm{tens}}(n) - F_i^{\mathrm{tens}}(n-1) - F_i^{\mathrm{tens}}(n+1) \]

F_i^inf(n) is an inflation force designed to move the snake nodes in a direction normal to the contour they form. In the cases where the snake is a closed contour, as in our application images, this means the nodes will move inwards or outwards. This will either inflate or deflate the snake towards the target boundary, which enables us to initialize the snake at locations far from the target object that we want to segment. F_i^inf(n) is defined as

\[ F_i^{\mathrm{inf}}(n) = F\!\left( I_s(x_i(n), y_i(n)) \right) \mathbf{n}_i(n) \]

where n_i(n) is the unit vector in the direction normal to the contour at node n and I_s(x_i(n), y_i(n)) is the intensity of the pixel (x_i(n), y_i(n)) in a smoothed version of the image. The binary function

\[ F(I(x,y)) = \begin{cases} +1, & I(x,y) \ge T \\ -1, & \text{otherwise} \end{cases} \]

links the inflation force to the image data by determining whether the snake is to be inflated or deflated, and T is an image intensity threshold. F_i^ext(n) is an external force that is derived from the image data in a way that causes the snake nodes to move towards regions of higher intensity gradient (mainly edges) in the image and is defined as

\[ F_i^{\mathrm{ext}}(n) = \nabla P\!\left( x_i(n), y_i(n) \right) \]

where P is the image gradient magnitude, reflecting the high intensity changes commonly present at boundary points.
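As an illustration only, the four forces above could be evaluated for a closed snake along the lines of the sketch below. It assumes the snake is stored as an N×2 array of (x, y) node coordinates, that a gradient-magnitude image is used for P, and that the inflation threshold T and smoothing width are supplied by the caller; none of these names come from the original implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates


def snake_forces(v, image, T, sigma=2.0):
    """Tensile, flexural, inflation and external forces for a closed snake.

    v : (N, 2) array of node coordinates (x, y)."""
    # Tensile force: second difference of position (resists stretching).
    F_tens = 2.0 * v - np.roll(v, 1, axis=0) - np.roll(v, -1, axis=0)
    # Flexural force: second difference of the tensile force (resists bending).
    F_flex = 2.0 * F_tens - np.roll(F_tens, 1, axis=0) - np.roll(F_tens, -1, axis=0)

    # Unit normals of the closed polygon (rotate the tangent by 90 degrees;
    # the outward direction depends on the node ordering, counter-clockwise assumed).
    tangent = np.roll(v, -1, axis=0) - np.roll(v, 1, axis=0)
    normal = np.stack([tangent[:, 1], -tangent[:, 0]], axis=1)
    normal /= np.linalg.norm(normal, axis=1, keepdims=True) + 1e-12

    # Inflation force: +/-1 along the normal depending on the smoothed intensity at the node.
    Is = gaussian_filter(image.astype(np.float64), sigma)
    vals = map_coordinates(Is, [v[:, 1], v[:, 0]], order=1)   # sample at (row, col) = (y, x)
    F_inf = np.where(vals >= T, 1.0, -1.0)[:, None] * normal

    # External force: gradient of a gradient-magnitude image P, pulling nodes towards edges.
    gy, gx = np.gradient(Is)
    P = np.hypot(gx, gy)
    Py, Px = np.gradient(P)
    F_ext = np.stack([map_coordinates(Px, [v[:, 1], v[:, 0]], order=1),
                      map_coordinates(Py, [v[:, 1], v[:, 0]], order=1)], axis=1)
    return F_tens, F_flex, F_inf, F_ext
```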

Our discrete active contour model also incorporates an adaptive subdivision scheme where the snake nodes are resampled to give the snake an appropriate resolution (node separation), thus allowing it to latch onto the varying levels of detail of the target's boundary. The resampling decision (whether nodes are inserted, removed, or left unaltered) is not only based on the distance between the nodes along the snake, as for example in [16], but also depends on the local curvature. More points are added when high snake curvature is detected, whereas nodes are removed when low curvature is detected, so as not to clutter the snake with nodes, which might cause problems such as self-crossing. In other words, our subdivision method is based on the distance between neighboring nodes and also on the angle between neighboring snake segments.
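A minimal sketch of such a distance- and angle-based resampling pass follows; the distance and angle thresholds are illustrative placeholders that would need tuning for real data.

```python
import numpy as np


def resample_snake(v, d_min=2.0, d_max=8.0, angle_split=0.5):
    """Insert/remove nodes based on neighbour spacing and local turning angle (radians)."""
    out = []
    N = len(v)
    for n in range(N):
        prev, cur, nxt = v[(n - 1) % N], v[n], v[(n + 1) % N]
        # Turning angle between the two segments meeting at this node.
        a, b = cur - prev, nxt - cur
        cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
        angle = np.arccos(np.clip(cosang, -1.0, 1.0))
        d = np.linalg.norm(b)
        # Drop the node where the contour is nearly straight and nodes are crowded.
        if d < d_min and angle < angle_split:
            continue
        out.append(cur)
        # Add a midpoint where a segment is long or the contour turns sharply.
        if d > d_max or angle > angle_split:
            out.append(0.5 * (cur + nxt))
    return np.array(out)
```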

Equation 2 is used to deform the snake nodes iteratively until the solution converges. This is achieved when the changes in the snake nodes’ locations between subsequent iterations become very small (i.e. below a certain predetermined threshold). An example showing the results of using ACM for segmenting the flame contours in one of the PLIF image sequences is shown in Fig. 4 and in the related animation file snake.mov.
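Putting these pieces together, the deformation loop might look like the following sketch, which reuses the hypothetical snake_forces and resample_snake helpers above; the force weights, iteration limit and convergence tolerance are placeholders rather than the values used for the results shown here.

```python
import numpy as np


def run_snake(v0, image, T, weights=(0.05, 0.05, 1.0, 1.0),
              max_iter=500, tol=1e-2, resample_every=10):
    """Iterate Equation 2 until the mean node displacement falls below tol (pixels)."""
    w1, w2, w3, w4 = weights
    v = v0.astype(np.float64).copy()
    for i in range(max_iter):
        F_tens, F_flex, F_inf, F_ext = snake_forces(v, image, T)
        # Node update of Equation 2.
        v_new = v + w1 * F_tens + w2 * F_flex + w3 * F_ext + w4 * F_inf
        # Periodically adapt the node density to the local level of detail.
        if (i + 1) % resample_every == 0:
            v = resample_snake(v_new)
            continue  # node count may have changed, skip the displacement test this pass
        if np.mean(np.linalg.norm(v_new - v, axis=1)) < tol:
            return v_new
        v = v_new
    return v
```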

Fig. 4. An example illustrating the progress of the snake iterations: (a) The original raw image. (b) The initial snake (in red) applied on the non-linear diffusion filtered image. (c)-(e) The snake after 1, 25, and 85 iterations, respectively. (f) The final result overlaid on the original raw image. See the related animation file snake.mov (1 MB).

4.2 Temporal interpolation of the flame contours

The detected flame contours can be used to study the flame front dynamics in time. In particular, we are interested in the extraction of local flame front velocities from discretely sampled data. To facilitate this task we interpolate between discrete contours in time as described below.

Recall that after applying the active contour segmentation to a PLIF frame, we end up with a single snake contour locating the flame boundary in that frame. Thus for each frame j we have a snake contour {v(n, j) = (x(n, j), y(n, j)), n = 1, 2, …, N} for j = 1, 2, …, F, where F is the number of frames (in the present case 4). In order to interpolate, we start by re-parameterizing each of the original flame contours with a new shape representation. This is most efficiently done by transforming from the spatial into the frequency domain. An advantage is that the need for a node-to-node correspondence between different contours (snakes) is avoided. The one-dimensional discrete cosine transform (DCT) of the sequence of x(n, j) contour coordinates (and similarly for the y(n, j) coordinates), n = 1, 2, …, N, is defined as follows [17]:

\[ X(k, j) = w(k) \sum_{n=1}^{N} x(n, j) \cos\!\left( \frac{\pi (2n-1)(k-1)}{2N} \right) \]

where

\[ w(k) = \begin{cases} \dfrac{1}{\sqrt{N}}, & k = 1 \\[6pt] \sqrt{\dfrac{2}{N}}, & 2 \le k \le N \end{cases} \]

and k=1, 2, …, N. Using the DCT as a new frequency domain shape parameterization has several advantages: it produces real coefficients, has excellent energy compaction properties, and yields coefficients that correspond between contours (as opposed to the spatial contour points, for which no point-to-point correspondence exists). Armed with these frequency coefficients as new curve parameters, we can directly perform the actual interpolation. In our implementation, cubic spline interpolation between corresponding frequency coefficients was used. Finally, the inverse discrete cosine transform (IDCT) is used to transform the interpolated components back into the spatial domain:

\[ x(n, j') = \sum_{k=1}^{N} w(k)\, X(k, j') \cos\!\left( \frac{\pi (2n-1)(k-1)}{2N} \right) \]

where n=1, 2, …,N and j′ spans the interpolated frames (including the original ones).
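A minimal sketch of this interpolation step is given below. It assumes that all snakes have been resampled to the same number of nodes N before the transform and relies on SciPy's orthonormal DCT/IDCT, whose scaling matches the w(k) weights above; the frame times and query times are illustrative inputs.

```python
import numpy as np
from scipy.fftpack import dct, idct
from scipy.interpolate import CubicSpline


def interpolate_contours(contours, t_frames, t_query):
    """Interpolate closed contours in time via their DCT coefficients.

    contours : (F, N, 2) array of snake nodes for F frames (same N per frame assumed)
    t_frames : (F,) acquisition times; t_query : (M,) times at which to reconstruct."""
    # Forward DCT (type II, orthonormal) of the x and y coordinate sequences.
    X = dct(contours[:, :, 0], type=2, norm='ortho', axis=1)
    Y = dct(contours[:, :, 1], type=2, norm='ortho', axis=1)
    # Cubic-spline interpolation of each frequency coefficient over time.
    Xq = CubicSpline(t_frames, X, axis=0)(t_query)
    Yq = CubicSpline(t_frames, Y, axis=0)(t_query)
    # Inverse DCT back to node coordinates in the spatial domain.
    xq = idct(Xq, type=2, norm='ortho', axis=1)
    yq = idct(Yq, type=2, norm='ortho', axis=1)
    return np.stack([xq, yq], axis=2)   # (M, N, 2)
```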

This interpolation method was tested on synthetically generated data for validation purposes. A single synthetic example consists of a shape sequence represented by a set of coordinates. Each sequence consists of F shapes and each shape contains L nodes. Both the x and y coordinates of each node move throughout the sequence (in time) according to sinusoidal functions with different amplitudes and frequencies, which causes spatial shape deformations in time. The coordinates are also scaled differently between frames according to sinusoidal functions in order to produce dynamic shapes that shrink and expand with time. To quantify the error (difference) between the original (known) synthetic sequences and the interpolated ones, we define the following error measure for each shape in the sequence

\[ \varepsilon = \frac{(A_i \cup A_o) - (A_i \cap A_o)}{A_o} \]

where Ao and Ai are the areas enclosed within the original and interpolated shapes, respectively. Fig. 5 illustrates some of these test examples and reports the corresponding average error values (over all frames in each test sequence) between the original and the interpolated sequences. Fig. 6 shows some further qualitative testing examples.
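Assuming ε is indeed the area of the symmetric difference between the interpolated and original shapes normalized by the original area, it could be evaluated with a polygon library such as Shapely, as in the illustrative check below (this is not the authors' validation code).

```python
from shapely.geometry import Polygon


def shape_error(original_nodes, interpolated_nodes):
    """Relative area of the symmetric difference between two closed contours."""
    A_o = Polygon(original_nodes)       # original shape
    A_i = Polygon(interpolated_nodes)   # interpolated shape
    return A_i.symmetric_difference(A_o).area / A_o.area
```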

The described interpolation scheme was finally applied to the PLIF image data sequences. An example is shown in Fig. 7a and the related animation file interp.mov, where ten contours were interpolated between each pair of original contours.

4.3 Flame front velocity estimation

Once the contours are temporally interpolated they can be used to estimate the flame front velocities. The direction of the velocity vector at each snake node in a particular contour (frame) is locally perpendicular to the contour curve at that node. The intersection of this normal vector with the contour in the next frame then defines the target location to which the node in question has moved during the time elapsed between the two frames. Results of applying our velocity estimation method to the time interpolated frames illustrated in Fig. 7a are shown in Fig. 7b and c. Note how local velocities are affected by turbulence, with a wide range of velocities present. One application of this method is to compare images such as Fig. 7c with mean velocities to obtain the instantaneous fluctuations due to turbulence.
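A minimal sketch of this normal-intersection estimate follows, reusing the contour arrays assumed in the interpolation sketch above; the intersection search along each node normal is done with Shapely purely for illustration, and the probe length is a placeholder.

```python
import numpy as np
from shapely.geometry import LineString, Point


def front_velocities(contour_a, contour_b, dt, search_len=20.0):
    """Speed of each node of contour_a along its local normal towards contour_b."""
    boundary_b = LineString(np.vstack([contour_b, contour_b[:1]]))  # closed ring
    tangent = np.roll(contour_a, -1, axis=0) - np.roll(contour_a, 1, axis=0)
    normal = np.stack([tangent[:, 1], -tangent[:, 0]], axis=1)
    normal /= np.linalg.norm(normal, axis=1, keepdims=True) + 1e-12

    speeds = np.full(len(contour_a), np.nan)
    for n, (p, nv) in enumerate(zip(contour_a, normal)):
        # Probe both directions along the normal and keep the closest crossing.
        ray = LineString([p - search_len * nv, p + search_len * nv])
        hit = ray.intersection(boundary_b)
        if hit.is_empty:
            continue
        pts = [hit] if isinstance(hit, Point) else list(getattr(hit, "geoms", []))
        if pts:
            speeds[n] = min(Point(p).distance(q) for q in pts) / dt
    # Speeds are in pixels per time unit; multiply by the mm/pixel scale for physical units.
    return speeds
```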

Fig. 5. Temporal interpolation: Three validation tests (a)-(c) using synthetic examples. Top: Original synthetic sequences comprising F=16 shapes (generated by evolving a shape in time using predetermined controlled deformations). Centre: Four original shapes extracted from the synthetic sequence to be input into the interpolation algorithm. These 4 curves can be seen overlaid on the top figures in magenta, red, blue and cyan. Bottom: The interpolation result (reconstruction of 16 frames from 4 only). (a) Elliptical shapes: Error=3.53%. (b) Star shapes: Error=5.25%. (c) Shapes based on deforming a real flame boundary: Error=0.75%.

5 Discussion and Conclusions

The capability to track the flame contour in time, demonstrated here for the first time, provides a unique way to study turbulent flame dynamics. In [18] we have reported on simultaneous flow field measurements by particle imaging velocimetry (PIV) and high speed PLIF of OH. In that work, the effects of local strain by velocity gradients and convective stress could be seen to disrupt the flame front and lead to local extinction. The technique described in Section 4.2 could be used to compare the flame front movement quantitatively with the mass flow field obtained by PIV, and thus turbulent and chemical time scales could be isolated. A further application lies in the rendering of three dimensional flame contour data obtained from discrete data slices through the flame [4], where the interpolation is performed in the spatial rather than the temporal domain.

The methods outlined in Section 3.2 and Section 4.1 are efficient for the extraction of flame front contours from large experimental data sets. This is useful in several ways for comparisons with numerical data. For example, the degree of flame wrinkling can be extracted from the flame contour. This is defined as the ratio of the flame surface area of the turbulent flame to the corresponding area of a laminar flame and can be obtained directly by integration of the flame contours evaluated with the present methods. The degree of flame wrinkling is in turn related to local reaction rates [1]. The reaction boundary can also be used to define a reaction progress variable c, with c=0 for fresh gases (corresponding to the region outside the detected flame boundaries in the examples presented here) and c=1 for burnt gases (inside the boundaries). The conservation equations for premixed flames may be expressed directly in terms of such a progress variable. For LES this equation may be filtered so that only large scale structures are preserved, and a model assumption is used to include the effects of small scale wrinkling in the calculation (a so-called subgrid scale model). Knikker et al. [6] have shown how experimentally determined progress variables may be used to validate such subgrid scale models.
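As an illustration of how such quantities could be extracted from the segmented contours, the sketch below computes a two-dimensional wrinkling proxy (contour length relative to an assumed laminar reference length) and a binary progress variable field c from a single closed snake; the reference length and pixel grid are placeholders and the length ratio is only a 2D stand-in for the surface area ratio.

```python
import numpy as np
from matplotlib.path import Path


def wrinkling_and_progress(contour, laminar_length, image_shape):
    """2D proxies: contour-length ratio and a binary progress variable field."""
    closed = np.vstack([contour, contour[:1]])
    # Flame contour length (2D analogue of the flame surface area).
    length = np.sum(np.linalg.norm(np.diff(closed, axis=0), axis=1))
    wrinkling = length / laminar_length
    # Progress variable: c = 1 inside the detected boundary (burnt), 0 outside (fresh).
    yy, xx = np.mgrid[0:image_shape[0], 0:image_shape[1]]
    pts = np.column_stack([xx.ravel(), yy.ravel()])
    c = Path(closed).contains_points(pts).reshape(image_shape).astype(np.uint8)
    return wrinkling, c
```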

Fig. 6. Temporal interpolation: (a-f) Different validation tests on synthetic examples. The original 4 contours are displayed in black and the interpolated ones in color.

Fig. 7. An example of temporal interpolation on PLIF data: (a) The flame contours in an image sequence. The four original contours are shown in thick black and the interpolated ones in between are shown in random colors. (b) The calculated velocity vectors overlaid on the original contours. See related animation file interp.mov (144 KB). (c) The color map representation of the flame velocity values. The image shows the magnitude (in m/s) of the flame front velocity at each boundary point in each frame using the colormap shown to the right.

The current ACM implementation segments single objects. However, at high turbulence levels where the flame becomes corrugated to an extent that isolated features appear, the need arises for incorporating modifications that enable the active contours to handle multiple objects. A solution would be to initialize multiple independent snakes. Other more elaborate methods include, for example, using topology adaptive snakes [15] or geodesic active contours [19].

In summary we present here various advanced image processing and analysis algorithms which can be used for quantitative extraction of reaction boundaries in turbulent flames. The techniques are fast and efficient and are suitable for reducing the large amounts of data obtained by sequential imaging of turbulent reactive flows with limited contrast and signal to noise ratios. An interpolation scheme is described which can be used to extract flame front velocity data from discretely sampled image sets. The method works well in medium to low turbulence premixed flames where the flame contours remain singly connected. Future work is aimed at adapting the method so that it can be used even at large turbulence intensities, where several separated objects may appear in the image.

References and links

1. J. Warnatz, U. Maas, and R.W. Dibble, Combustion - physical and chemical fundamentals, modeling and simulation, experiments, pollutant formation (Springer-Verlag, Heidelberg, 1996).

2. C.F. Kaminski, J. Hult, and M. Aldén, “High repetition rate planar laser induced fluorescence of OH in a turbulent non-premixed flame,” Appl. Phys. B 68, 757–760 (2000). [CrossRef]  

3. A. Dreizler, S. Lindenmaier, U. Maas, J. Hult, M. Aldén, and C.F. Kaminski, “Characterisation of a spark ignition system by planar laser-induced fluorescence of OH at high repetition rates and comparison with chemical kinetic calculations,” Appl. Phys. B 70, 287–294 (2000). [CrossRef]  

4. J. Hult, A. Omrane, J. Nygren, C.F. Kaminski, B. Axelsson, R. Collin, P.-E. Bengtsson, and M. Aldén, “Quantitative three dimensional imaging of soot volume fraction in turbulent non-premixed flames”, (in preparation).

5. G.J. Smallwood, O.L. Gulder, D.R. Snelling, B.M. Deschamps, and I. Gokalp, “Characterization of flame front surfaces in turbulent premixed methane/air combustion,” Combustion and Flame 101(4), 461–470 (1995). [CrossRef]  

6. R. Knikker, D. Veynante, J.C. Rolon, and C. Meneveau, “Planar Laser-Induced Fluorescence in a Turbulent Premixed Flame to analyze Large Eddy Simulation Models,” in Proceedings of the 10th international Symposium on Turbulence, Heat and Mass Transfer, Lisbon (2000). http://in3.dem.ist.utl.pt/downloads/lxlaser2000/pdf/26 3.pdf

7. B.D. Haslam and P.D. Ronney, “Fractal properties of propagating fronts in a strongly stirred fluid,” Phys. Fluids 7(8), 1931–1937 (1995). [CrossRef]  

8. Y.-C. Chen and M.S. Mansour, “Topology of turbulent premixed flame fronts resolved by simultaneous planar imaging of LIPF of OH radical and Rayleigh scattering,” Experiments in Fluids 26, 277–287 (1999). [CrossRef]

9. O.L. Gulder, G.J. Smallwood, R. Wong, D.R. Snelling, R. Smith, B.M. Deschamps, and J.-C. Sautet, “Flame front surface characteristics in turbulent premixed propane/air combustion,” Combustion and Flame 120(4), 407–416 (2000). [CrossRef]  

10. P. Perona and J. Malik, “Scale-space and edge detection using anisotropic diffusion,” IEEE Trans. on Pattern Analysis and Machine Intelligence 12(7), 629–639 (1990). [CrossRef]  

11. M. Kass, A. Witkin, and D. Terzopoulos, “Snakes: Active Contour Models,” International Journal on Computer Vision 1(4), 321–331 (1988). [CrossRef]  

12. C.F. Kaminski, J. Hult, M. Aldén, S. Lindenmaier, A. Dreizler, U. Maas, and M. Baum, “Complex turbulence/chemistry interactions revealed by time resolved fluorescence and direct numerical simulations,” Proc. Combust. Inst. 28, The Combustion Institute, Pittsburgh, in press (2000).

13. F. Catté, P.-L. Lions, J.-M. Morel, and T. Coll, “Image selective smoothing and edge detection by nonlinear diffusion,” SIAM J. Numer. Anal. 29, 182–193 (1992). [CrossRef]  

14. H. Malm, J. Hult, G. Sparr, and C.F. Kaminski, “Non-linear diffusion filtering of images obtained by planar laser induced fluorescence spectroscopy,” J. Opt. Soc. Am. A 17, 2148–2156 (2000). [CrossRef]

15. T. McInerney and D. Terzopoulos, “T-Snakes: Topology adaptive snakes,” Medical Image Analysis 4, 73–91 (2000). [CrossRef]   [PubMed]  

16. S. Lobregt and M. Viergever, “A discrete dynamic contour model,” IEEE Trans. on Medical Imaging 14(1), 12–24 (1995). [CrossRef]  

17. A. Jain, Fundamentals of digital image processing (Prentice Hall, 1989).

18. J. Hult, G. Josefsson, M. Aldén, and C.F. Kaminski, “Flame front tracking and simultaneous flow field visualization in turbulent combustion,” in Proceedings of the 10th International Symposium on Application of Laser Techniques to Fluid mechanics, Lisbon (2000). http://in3.dem.ist.utl.pt/downloads/lxlaser2000/pdf/26 2.pdf

19. V. Caselles, R. Kimmel, and G. Sapiro, “Geodesic active contours,” in Proceedings of the International Conference on Computer Vision, 694–699 (1995).

Supplementary Material (3)

Media 1: MOV (202 KB)     
Media 2: MOV (1019 KB)     
Media 3: MOV (144 KB)     
