Optica Publishing Group

Fast associative filtering based on two-dimensional discrete Walsh transform by a volume holographic correlator

Open Access

Abstract

An optical fast associative filtering method that uses a multichannel volume holographic correlator as a front-end filter for database searching is described. The features of the search query are extracted in parallel by the volume holographic correlator on the basis of the two-dimensional discrete Walsh transform and are used to measure the similarity between the query and all the database records. The best matches are picked out for further searching. An experiment is carried out, and the results confirm the validity of the method.

©2009 Optical Society of America

1. Introduction

A volume holographic correlator (VHC) is a multichannel optical correlator based on high-density holographic storage. The output correlation pattern of each channel is a suppressed correlation function [1]. With speckle modulation [2] and a 2D interleaving method [3], the output results can be made more accurate. The VHC inherently offers high speed, high parallelism, and multichannel processing, which give it potential in applications that need fast, real-time calculation, such as database searching [4], associative retrieval [5], and pattern recognition [6].

The traditional VHC-based database searching method [7] has two problems. First, all the database records have to be stored in the VHC, so the number of database records is limited by the storage capacity of the VHC. Second, the searching accuracy is limited by the analog nature of the optical process.

The first problem can be solved through a 2D orthogonal transformation. Transform functions (TFs) are encoded and stored in the VHC instead of all the database records. When a query is input, the inner products of the query and the TFs are calculated by the VHC and form the features of the query. The features are then used to measure the similarity between the query and the database records. In this case, the library images that need to be stored are reduced from all the database records to a few TFs. In other words, with the same storage capacity, this method can search a much larger database than the traditional one.

Several optical correlators have been reported for performing 2D orthogonal transformations. A 2D orthogonal transformation can be regarded as a set of inner products between the data array to be transformed and the TFs. When an optical correlator is used to perform 2D transformations, the calculation speed depends on two factors: the number of inner-product calculation channels and the resolution of the stored TFs. The Vander Lugt correlator (VLC) [8] can be used to perform 2D transformations, but it is a serial process, so the number of calculation channels is only one. Leger and Lee proposed an optical implementation of 2D transformations using a planar computer-generated hologram (CGH) [9, 10], but the number of calculation channels and the resolution of the TFs are restricted by the cross talk among the planar holograms and by the practical resolution of the CGH. A lenslet array can also be used to perform 2D transformations [11–13], but the resolution of the TFs is restricted by the image section width of the lens and is generally no better than 32×32 [13]. Using a VHC to perform 2D orthogonal transformations has clear advantages over the methods above: the number of calculation channels can exceed 2000, and the resolution of the TFs equals that of the SLM, which can now be better than 1024×1024.

The second problem can be solved by a two-step searching scheme consisting of a front-end approximate search and a back-end accurate search. In the first step, the VHC performs a fast approximate search of the whole database and picks out a small set of nearly matching records. In the second step, a digital search engine finds the best-matching record within this small set with much higher accuracy. With this two-step scheme, the accuracy required of the first step is reduced and can be satisfied by the VHC. The scheme thus takes full advantage of the parallelism and high speed of the VHC and of the high accuracy of the digital processor [14].
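The division of labor in the two-step scheme can be sketched in software. The following Python sketch is our illustration, not part of the paper: a random projection stands in for the optical feature extraction, and all names (`front_end_filter`, `back_end_search`, `proj`) are hypothetical.

```python
import numpy as np

def front_end_filter(query_feat, db_feats, k=2):
    """Approximate front end: rank all records by a normalized feature
    correlation (the role played optically by the VHC) and keep the top k."""
    scores = (db_feats @ query_feat) / (
        np.linalg.norm(db_feats, axis=1) * np.linalg.norm(query_feat))
    return np.argsort(scores)[::-1][:k]          # indices of best candidates

def back_end_search(query, db_records, candidates):
    """Accurate back end: exact digital comparison over the small candidate set."""
    errors = [np.sum((db_records[c] - query) ** 2) for c in candidates]
    return int(candidates[int(np.argmin(errors))])

# Usage: the query is record 37 of a toy 100-record database; the front end
# narrows the search to 5 candidates, the back end identifies the exact match.
rng = np.random.default_rng(0)
db = rng.random((100, 64))                       # toy database records
proj = rng.standard_normal((64, 8))              # stand-in feature extractor
feats = db @ proj
query = db[37]
cands = front_end_filter(query @ proj, feats, k=5)
print(back_end_search(query, db, cands))         # 37
```

Because the back end only compares a handful of candidates exactly, its cost is negligible even though it is fully accurate, mirroring the 2-of-2000 selection reported in the experiment.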

Fig. 1. System scheme

In this paper, we propose a fast associative filtering method based on the 2D discrete Walsh transform (DWT), implemented by a VHC, to search the database approximately. The system scheme is shown in Fig. 1. In the front-end searching process, the VHC serves as a fast optical filter that swiftly performs the 2D DWT to obtain the features of the input query. The features are then used to measure approximately the similarity between the query and the database records. Because the query is compressed into features with far fewer elements, the measuring time is negligible, so the front-end search can reach a very high processing speed. The best matches are then picked out and delivered to the back-end digital search engine for further accurate searching. The method has two advantages. First, the images that need to be stored are reduced from all the database records to a few Walsh functions; in other words, with the same storage capacity, this method can search a much larger database than the traditional one. Second, the two-step searching scheme takes full advantage of the parallelism and high processing speed of the multichannel VHC and of the high accuracy of the digital processor.

2. Feature extraction performed by VHC using 2D DWT

The DWT is characterized by its energy-packing capability. Furthermore, Walsh functions take only two values and are deterministic, so they possess many useful properties: they are easy to represent, and the DWT can be computed quickly. The DWT is therefore widely used in feature extraction [15].

The 2D DWT of a data array x_{i,j} of N×N points is [15]

X_{m,n} = \frac{1}{N^2} \sum_{i=0}^{N-1} \sum_{j=0}^{N-1} x_{i,j}\,\mathrm{WAL}(m,n,i,j), \qquad m,n = 0, 1, \cdots, N-1 \quad (1)

where WAL(m,n,i,j) is the 2D discrete Walsh function and m, n are the orders. Eq. (1) shows that the 2D DWT is essentially a set of inner products of the data array with a series of 2D discrete Walsh functions.
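Eq. (1) can be checked numerically. The sketch below is our illustration under one stated assumption: it builds Walsh functions in the natural (Hadamard) order via the Sylvester construction, whereas a sequency ordering differs only in the ordering of the rows. It exploits the separability WAL(m,n,i,j) = WAL(m,i)·WAL(n,j) to write the 2D DWT as two matrix products.

```python
import numpy as np

def hadamard(n):
    """Walsh functions of size n (a power of 2) as rows of a Hadamard
    matrix; Sylvester construction, natural (Hadamard) ordering."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def dwt2(x):
    """2D DWT of Eq. (1): X = (1/N^2) H x H^T, using the separability
    WAL(m,n,i,j) = WAL(m,i) * WAL(n,j) of the 2D Walsh function."""
    N = x.shape[0]
    H = hadamard(N)
    return H @ x @ H.T / N**2

x = np.arange(16.0).reshape(4, 4)
X = dwt2(x)
print(X[0, 0])   # lowest-order coefficient = mean of x: 7.5
```

Since H·Hᵀ = N·I, the transform is invertible via x = Hᵀ X H, so no information is lost; the energy packing lies in how few coefficients carry most of it.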

A VHC is often used to calculate in parallel the 2D inner products of the input image and all the stored images. The output of the VHC can be expressed as

g(x_c = x_m, y_c = y_m) \propto \iint \mathrm{d}x_0\,\mathrm{d}y_0\, f'(x_0, y_0)\, f_m(x_0, y_0), \qquad m = 1, 2, 3, \cdots \quad (2)

where f′(x_0, y_0) is the input image and f_m(x_0, y_0) is the mth stored image. Eq. (2) shows that the output of each channel is the inner product of f′(x_0, y_0) and f_m(x_0, y_0). Comparing Eq. (1) and Eq. (2), if x_{i,j} and WAL(m,n,i,j) are encoded to f′(x_0, y_0) and f_m(x_0, y_0), respectively, then the VHC can be used to perform the 2D DWT.

The amplitude-modulation SLM used to upload images can express only nonnegative real quantities. To encode WAL(m,n,i,j), it is therefore decomposed as

\mathrm{WAL}(m,n,i,j) = \mathrm{WAL}_0(m,n,i,j) + (-1)\,\mathrm{WAL}_1(m,n,i,j) \quad (3)

where WAL_0(m,n,i,j) and WAL_1(m,n,i,j) are both nonnegative. For simplicity, assuming x_{i,j} to be nonnegative, Eq. (1) can be expressed as

X_{m,n} = \frac{1}{N^2} \left[ x_{i,j} \bullet \mathrm{WAL}_0(m,n,i,j) - x_{i,j} \bullet \mathrm{WAL}_1(m,n,i,j) \right] \quad (4)

where the symbol ● represents the inner product calculation.

For each order m, n, WAL_0(m,n,i,j) and WAL_1(m,n,i,j), each of which has N² elements, can be encoded into two basis images f_{m,n}^0 and f_{m,n}^1, respectively. An important characteristic of Walsh functions is that their element values comprise only 1 and -1 [14]. Thus WAL_0(m,n,i,j) and WAL_1(m,n,i,j) are both composed of 1 and 0 and can be encoded as binary images: f_{m,n}^0 is divided equally into N² blocks f_{m,n}^0(i,j), i, j = 0, 1, ⋯, N-1, which represent the elements of WAL_0(m,n,i,j). Completely white blocks represent the element value 1 and completely dark blocks represent 0; f_{m,n}^1 is encoded in the same way. As an example, there are 16 Walsh functions when N=4; the 32 basis images obtained by encoding them are shown in Fig. 2.
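The encoding step above can be sketched as follows. This is our illustration: `hadamard` builds Walsh functions in natural order (an assumed stand-in for the paper's ordering), and the function name `basis_images` is hypothetical. WAL_0 marks the +1 elements and WAL_1 the -1 elements, so Eq. (3) holds elementwise.

```python
import numpy as np

def hadamard(n):
    # Sylvester construction; rows are Walsh functions in natural order
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def basis_images(m, n, N):
    """Encode the 2D Walsh function WAL(m,n,i,j) = WAL(m,i)*WAL(n,j)
    into two binary basis images: f0 has white (1) blocks where the
    element is +1, f1 has white blocks where it is -1, per Eq. (3)."""
    H = hadamard(N)
    W = np.outer(H[m], H[n])       # the +-1 valued 2D Walsh function
    f0 = (W > 0).astype(float)     # nonnegative image for WAL_0
    f1 = (W < 0).astype(float)     # nonnegative image for WAL_1
    return f0, f1, W
```

For N = 4 this produces 32 binary images analogous to those of Fig. 2 (up to row ordering); note f0 + f1 is all ones, since every element of W is either +1 or -1.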

Fig. 2. 32 basis images obtained by encoding Walsh functions with N=4: (a) f_{m,n}^0, (b) f_{m,n}^1.

The data array x_{i,j} can also be encoded into a data image f′, with the filling ratio of each block representing the normalized value of an element.

WAL_0(m,n,i,j), WAL_1(m,n,i,j), and x_{i,j} are encoded to f_{m,n}^0, f_{m,n}^1, and f′, respectively, in the way described above. Thus

x_{i,j} \bullet \mathrm{WAL}_0(m,n,i,j) \propto f' \bullet f_{m,n}^0, \qquad x_{i,j} \bullet \mathrm{WAL}_1(m,n,i,j) \propto f' \bullet f_{m,n}^1 \quad (5)

Substituting Eq. (5) into Eq. (4) gives

X_{m,n} \propto f' \bullet f_{m,n}^0 - f' \bullet f_{m,n}^1 \quad (6)

X_{m,n} are the transform coefficients. Eq. (6) shows that the 2D DWT can be performed by the VHC. Due to the energy-packing capability of the DWT, it can reduce information redundancy: only a subset of the transform coefficients is necessary to preserve the most important features, and this subset is enough to search the records approximately in the front-end stage. This means that only the corresponding Walsh functions need to be stored in the VHC instead of the whole database.
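Eq. (6) says each coefficient is the difference of two nonnegative inner products. The sketch below is our numerical check, under the same natural-order Walsh assumption as before, that this two-inner-product route reproduces the direct transform coefficient of Eq. (1).

```python
import numpy as np

def hadamard(n):
    # Sylvester construction of Walsh functions in natural order
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def coeff_via_inner_products(x, m, n):
    """X_{m,n} per Eq. (6): two nonnegative inner products (the step the
    VHC performs optically) followed by a digital subtraction, with the
    1/N^2 scaling of Eq. (4)."""
    N = x.shape[0]
    H = hadamard(N)
    W = np.outer(H[m], H[n])
    f0, f1 = (W > 0).astype(float), (W < 0).astype(float)
    return (np.sum(x * f0) - np.sum(x * f1)) / N**2

rng = np.random.default_rng(1)
x = rng.random((8, 8))                 # nonnegative data array, as assumed
H = hadamard(8)
X_direct = H @ x @ H.T / 64            # full 2D DWT of Eq. (1)
print(np.isclose(coeff_via_inner_products(x, 3, 5), X_direct[3, 5]))  # True
```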

The method of extracting features by a VHC-implemented 2D DWT has several advantages. First, it extracts the features of the data array in parallel, and the calculation time does not increase with the size of the data array, so the method can reach a high processing speed. Assuming 2000 images of resolution 1024×1024 are stored in the VHC and the frame rates of the SLM and CCD are both 500 fps, the equivalent computation speed of the VHC is about 1×10^12 MAC/s (MAC: multiply-accumulate calculation), which is 2–3 orders of magnitude higher than that of a general microprocessor. Second, the images that need to be stored are reduced from all the database records to a few Walsh functions, which means this method can search a much larger database than the traditional method with the same storage capacity. Furthermore, the method is flexible: features can be extracted using other transformations if the corresponding transform functions are encoded properly and stored in the VHC.

3. Front-end filtering experiment

The experimental setup is shown in Fig. 3. A diode-pumped solid-state laser (DPSSL, λ=532 nm) is the light source. A holographic diffuser with a 0.2° scattering angle, used as the speckle-modulation device [2], is inserted behind the SLM. The holographic recording material is an Fe:LiNbO3 crystal. A CCD camera (MINTRON MTV-1881EX) reads the output.

Fig. 3. Experimental setup for demonstrating the method.

To demonstrate the fast front-end parallel filtering method, we use a database containing 2000 records, which was also used in Ref. [7]. Fig. 4 shows one of the records. The energy of these records is concentrated at low frequencies, so 16 low-order transform coefficients X_{0,0}, X_{0,1}, …, X_{3,3} are used to measure approximately the similarity between the query and the database records in the front-end search. Generally speaking, the larger the number of database records, or the higher their spatial frequency, the more coefficients are necessary.

Fig. 4. One of the 2000 records.

The 16 Walsh functions WAL(0,0,i,j), WAL(0,1,i,j), …, WAL(3,3,i,j) are encoded into 32 basis images f_{m,n}^0 and f_{m,n}^1, m, n = 0, 1, 2, 3, which are then stored at a common location of the crystal by angle multiplexing, in the order f_{0,0}^0, f_{0,0}^1, f_{0,1}^0, f_{0,1}^1, …, f_{3,3}^0, f_{3,3}^1.

When a search query is input to the VHC, its output is detected by the CCD. The experimental output with the 1350th record as the query is shown in Fig. 5. The 32 spots (the brightness of the second spot is zero) are the correlation results of the query with the 32 basis images. The square root of each spot's brightness is taken to obtain the inner product of the query and the corresponding basis image. In the experiment, the error of the detected brightness is less than 5%.

Fig. 5. Experimental output with the 1350th record as the query.

The transform coefficients X_{m,n} can be calculated by Eq. (6) from the inner products obtained above and form the features of the query. The features are then used to measure the similarity between the query and each database record by the cross-correlation value R, defined as

R = \frac{\sum_{m,n} X_{m,n} \bar{X}_{m,n}}{\sqrt{\sum_{m,n} X_{m,n}^2 \sum_{m,n} \bar{X}_{m,n}^2}} \quad (7)

where X_{m,n} are the features of the query and X̄_{m,n} are those of a database record. Fig. 6 shows the similarity between the 1350th record and all the database records; the best two matches are marked by red circles.
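Eq. (7) is a normalized cross-correlation over the extracted coefficient arrays; a minimal sketch (the function name `similarity` is ours):

```python
import numpy as np

def similarity(Xq, Xr):
    """Cross-correlation value R of Eq. (7) between the query features
    Xq and a database record's features Xr (coefficient arrays of equal
    shape). R = 1 for identical or proportional feature sets."""
    return np.sum(Xq * Xr) / np.sqrt(np.sum(Xq**2) * np.sum(Xr**2))
```

Ranking all records by R and keeping the top few candidates is exactly the front-end selection step; note R is invariant to an overall scale of either feature set, which makes it robust to the analog gain of the optical readout.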

Fig. 6. Similarity between the 1350th record and all the database records in the front-end search.

The two best-matching records (the 1350th and 1412th) are picked out and delivered to the back end, where the digital search engine easily identifies the query as the 1350th record. Because the delivered set is much smaller than the whole database (2 of 2000), the time consumed by the back-end search is negligible. The processing speed of the method depends mainly on the frame rates of the SLM and CCD; in the experiment, the frame rate of the VHC is 25 fps, limited mainly by the CCD we used.

To test the system performance, we searched all 2000 records one by one. If the best-matching record picked out by the front end is used directly as the final result, 1991 records are found correctly and 9 errors occur. If the two or more best-matching records picked out by the front end are delivered to the back end for further searching, all the records are found correctly. This result is better than the 37 errors of the traditional method [7], while the number of library images that need to be stored is reduced from 2000 to 32.

4. Conclusion

A fast front-end associative filtering method based on the 2D DWT implemented by a VHC is presented. With the necessary Walsh functions stored in the VHC, the features of an input query are swiftly extracted and used to measure approximately the similarity between the query and the database records. The best-matching records are picked out and delivered to a back-end digital search engine for further accurate searching. With the same storage capacity, the method can search a much larger database than the traditional one. The two-step searching scheme takes full advantage of the parallelism and high speed of the VHC and of the high accuracy of the digital processor. In the experiment, all the records were found correctly, which confirms the validity of the method.

Acknowledgment

This work is sponsored by the National Natural Science Foundation of China (No. 60677037).

References and links

1. C. Gu, H. Fu, and J. R. Lien, “Correlation patterns and cross-talk noise in volume holographic optical correlators,” J. Opt. Soc. Am. A 12, 861–868 (1995). [CrossRef]  

2. C. Ouyang, L. C. Cao, Q. S. He, Y. Liao, M. X. Wu, and G. F. Jin, “Sidelobe suppression in volume holographic optical correlators by use of speckle modulation,” Opt. Lett. 28, 1972–1974 (2003). [CrossRef]   [PubMed]  

3. K. Ni, Z. Y. Qu, L. C. Cao, P. Su, Q. S. He, and G. F. Jin, “Improving accuracy of multichannel volume holographic correlators by using a two-dimensional interleaving method,” Opt. Lett. 32, 2972–2974 (2007). [CrossRef]   [PubMed]  

4. G. W. Burr, S. Kobras, H. Hanssen, and H. Coufal, “Content-addressable data storage by use of volume holograms,” Appl. Opt. 38, 6779–6784 (1999). [CrossRef]  

5. G. W. Burr, “Holography for information storage and processing,” Proc. SPIE 5181, 70–84 (2003). [CrossRef]  

6. A. Heifetz, J. T. Shen, J. K. Lee, R. Tripathi, and M. S. Shahriar, “ Translation-invariant object recognition system using an optical correlator and a super-parallel holographic random access memory,” Opt. Eng. 45, 025201 (2006). [CrossRef]  

7. Y. Liao, Y. B. Guo, L. C. Cao, X. S. Ma, Q. S. He, and G. F. Jin, “Experiment on parallel correlated recognition of 2030 human faces based on speckle modulation,” Opt. Express 12, 4047–4052 (2004). [CrossRef]   [PubMed]  

8. A. V. Lugt, “Signal detection by complex spatial filtering,” IEEE Transactions on Information Theory 10, 139–145 (1964). [CrossRef]  

9. J. R. Leger and S. H. Lee, “Coherent optical implementation of generalized two-dimensional transforms,” Opt. Eng. 18, 518–523 (1979).

10. J. R. Leger and S. H. Lee, “Hybrid optical processor for pattern recognition and classification using a generalized set of pattern functions,” Appl. Opt. 21, 274–287 (1982). [CrossRef]   [PubMed]  

11. I. Glaser, “Noncoherent parallel optical processor for discrete two-dimensional linear transformations,” Opt. Lett. 5, 449–451 (1980). [CrossRef]   [PubMed]  

12. I. Glaser, “Lenslet array processors,” Appl. Opt. 21, 1271–1280 (1982). [CrossRef]   [PubMed]  

13. I. Glaser, “Compact lenslet-array-based holographic correlator/convolver,” Opt. Lett. 20, 1565–1567 (1995). [CrossRef]   [PubMed]  

14. F. Grawert, S. Kobras, G. W. Burr, H. Coufal, H. Hassen, M. Riedel, C. M. Jefferson, and M. Jurich, “Content-addressable holographic database,” Proc. SPIE 4109, 177–188 (2000). [CrossRef]  

15. K. G. Beauchamp, Walsh Functions and Their Application (Academic Press, London, 1975).
