
Hands-on with optical tweezers: a multitouch interface for holographic optical trapping

Open Access

Abstract

We report the implementation of a multitouch console for control of a holographic optical tweezers system. This innovative interface enables the independent but simultaneous interactive control of numerous optical traps by multiple users, overcoming the limitations of traditional interfaces and placing the full power of holographic optical tweezing into the operators’ hands.

©2009 Optical Society of America

1. Introduction

Optical tweezers have been used extensively to investigate numerous areas within physics [1, 2, 3], chemistry [4] and biology [5, 6]. Having evolved rapidly from their original configuration, such devices are now an indispensable micro-manipulation tool in many labs. While some groups still use the original design [7], most have added optics to enable the control of numerous optical traps at once. Such additions include spatial light modulators for holographic optical tweezers [8, 9, 10] or “generalised phase contrast” based systems [11, 12, 13], acousto-optic modulators [14], and rapid scanning mirrors [15].

The widespread deployment of multiple-trap optical tweezers has led to experimentation with a variety of interfaces [16, 17, 18]. The most common method uses a PC and some form of control program to enable the real-time updating of trap locations. For example, Leach et al. [19] control their trap locations either by positioning a digital marker over a live video feed of the sample or by modifying simple coordinate values. While simple to develop and implement, these “point-and-click” interfaces may not be sufficiently user-friendly for non-specialists and seldom allow more than one trap to be controlled at a time. Alternatives such as a joystick “gripper” allow the user to perform a repetitive task such as the translation, rotation and scaling of a predefined group of optical traps [16]. Similarly, an interface using fiducial markers attached to the user’s fingertips, imaged and tracked by a standard web-camera, allowed more flexibility and limited control over the z-axis [17]. In a similar vein, animation gloves have been used to map optical traps to the user’s fingertips [18]. Unfortunately these systems are highly optimised for specific tasks and, for all their elegance, focus mainly on the manipulation of only a handful of particles.

Multitouch interfaces, such as those recently demonstrated by Han [20], provide the user with the ability to interact with a computer in multiple locations simultaneously. In this manner several digital objects can be controlled independently. The applications of this technology have so far been limited to areas such as computer games [21], digital image manipulation [22] and musical instruments [23]. In the remainder of this paper we show how the unique capabilities of this new interface give even an untrained user unprecedented control over a holographic optical tweezers system.

Fig. 1. (a) The basic principle behind frustrated total internal reflection on a multitouch table. A PMMA waveguide is edge-illuminated with 880 nm LEDs, and the light is confined within the sheet by total internal reflection. The blue layer represents a diffusive layer of paper coated with silicone rubber. When a user presses a finger on the surface, the diffusive layer scatters the light downward, where it is collected by a CCD camera. (b) An example of the image seen by the blob-tracking software. Scattered light from the user’s fingers appears as white spots against a dark background.

2. Experimental configuration

2.1. Hardware

Our multitouch console uses frustrated total internal reflection (FTIR), as originally used by Han [20], to track the positions of each user’s fingertips. The console is constructed around a sheet of poly(methyl methacrylate) (PMMA) measuring 100 cm × 80 cm × 1 cm (Aaron Plastics, UK), with countersunk holes for LEDs drilled into the edge of the sheet at 2.5 cm intervals. 880 nm LEDs (SFH486, Osram) illuminate the console uniformly, effectively making it a waveguide with minimal losses arising from scattering. An overlay of drafting paper, which acts as a diffuser for a data projector (MP771, BenQ), is coated with lightly textured silicone rubber to facilitate the coupling between the user’s fingers and the waveguide. When the user presses a hand onto the console, infrared light is scattered, as shown in Fig. 1, and collected by a 1.3 MP CCD camera (Dragonfly2, Point Grey Research). An 880 nm bandpass filter in the optical path immediately before the camera reduces noise from ambient light. As a result the camera detects a single “spot” of light at each location where a finger presses on the console.

This console is connected to a holographic optical tweezers system [24] designed around a commercially available inverted microscope (Axiovert 200, Zeiss), a 1.3 NA 100× Plan-Neofluar objective (Zeiss), and a motorised xyz translation stage (MS-2000, ASI). The optical traps are powered by a titanium-sapphire laser (Coherent 899) producing 4 W at 800 nm, pumped by a solid-state laser (Verdi-V18, Coherent). The wavefront is shaped by an optically-addressed spatial light modulator (X8267-14DB, Hamamatsu). A live video feed is provided by a CMOS camera (EC1280, Prosilica) or by a high-speed camera (microCam-640, Durham Smart Imaging) [25].

2.2. Software

The system has been designed so that the multitouch tracking and interface software operate on separate computers. This modular design facilitates “hot swapping” of multiple interfaces, such as a joystick, fiducial-marker tracking, or traditional point-and-click control. A further benefit is that the processing speed of each individual system is increased, with data passed between computers over the local network as UDP datagrams; a sketch of this datagram exchange is given after the figure. Figure 2 shows how the hardware and software combine to form the entire system.

Fig. 2. A schematic of the combined holographic optical tweezers and multitouch table. The components of the multitouch table are within the dashed line on the right-hand side of the image, while a brief optics layout is displayed on the left. An overlay in the upper right corner shows an example image detected by CCD2. This is processed so that the centres of the spots can be detected and their x, y coordinates passed to the interface computer. The interface computer displays the image obtained with CCD1, adds overlays showing where the user’s fingers were detected, and then calculates and stores the locations of the optical traps before passing control to the hologram calculation machine. A second overlay on the left of the image shows an example of the hologram generated. Visual feedback is immediately available to the user via the live microscope feed.
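As a concrete illustration of this modular design, the sketch below shows how per-frame touch coordinates might be passed from the tracking computer to the interface computer as UDP datagrams. The wire format, host address and port are our own assumptions for illustration; the paper does not specify them.

    import socket

    # Hypothetical wire format: one datagram per camera frame, containing
    # "id,x,y;id,x,y;...". The actual payload layout used in the paper is
    # not specified; this format is an assumption for illustration only.
    INTERFACE_ADDR = ("192.168.0.2", 5005)  # assumed address of the interface PC

    def send_touches(sock, touches):
        """Send a list of (touch_id, x, y) tuples as one UDP datagram."""
        payload = ";".join("%d,%.1f,%.1f" % t for t in touches)
        sock.sendto(payload.encode("ascii"), INTERFACE_ADDR)

    def parse_touches(datagram):
        """Decode one datagram back into (touch_id, x, y) tuples."""
        out = []
        for item in datagram.decode("ascii").split(";"):
            tid, x, y = item.split(",")
            out.append((int(tid), float(x), float(y)))
        return out

    # Tracking PC: fire-and-forget, one datagram per processed frame
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_touches(tx, [(0, 412.0, 233.5), (1, 870.2, 641.0)])

    # Interface PC: bind once, then read datagrams as they arrive
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("", 5005))
    # touches = parse_touches(rx.recvfrom(4096)[0])

Because a lost or stale coordinate frame is simply superseded by the next one at 15 Hz, UDP’s connectionless, fire-and-forget delivery is a natural fit: no retransmission or connection state is required.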

The multitouch software is designed to detect the positions of each user-generated event. Figure 1(b) shows a typical image captured by the multitouch console’s CCD camera. Five spots of infrared light can be clearly discerned above the background, corresponding to a user’s fingers pressing on the surface. The software subtracts the background signal and, after processing the image, determines the centre of each spot, passing its coordinates to the interface software. Tracking in this manner has been implemented using both the open-source library TouchLib (NUI Group) [26] and LabVIEW’s blob detection and tracking algorithms (National Instruments), on a dual-core PC operating at 3.4 GHz, at a rate of approximately 15 Hz. This rate is limited primarily by the time taken to transfer the camera images over FireWire.
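Neither TouchLib nor the LabVIEW implementation is reproduced in the paper, but the processing pipeline just described (background subtraction followed by spot-centre detection) can be sketched with NumPy and SciPy. The threshold and minimum blob size below are illustrative values, not those used in the actual system.

    import numpy as np
    from scipy import ndimage

    def find_fingertips(frame, background, threshold=30, min_area=20):
        """Return (x, y) centroids of bright fingertip blobs in one frame.

        frame, background: 2-D uint8 arrays from the multitouch CCD.
        threshold, min_area: illustrative values, not those from the paper.
        """
        # Remove the static background, keeping only bright residuals
        diff = frame.astype(np.int16) - background.astype(np.int16)
        mask = diff > threshold

        # Group contiguous bright pixels into labelled blobs
        labels, n_blobs = ndimage.label(mask)

        centroids = []
        for i in range(1, n_blobs + 1):
            blob = labels == i
            if blob.sum() >= min_area:          # reject single-pixel noise
                cy, cx = ndimage.center_of_mass(blob)
                centroids.append((cx, cy))      # image coordinates (x, y)
        return centroids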

The user’s gestures on the multitouch table are interpreted in real time by the interface software, operating on a dual-core 2.8 GHz PC, which determines the intended action at approximately 15 Hz. This is again limited by the presence of a FireWire camera obtaining images of the sample under investigation. The entire display is sensitive to user input, and is roughly divided into a “manipulation” area containing the microscope image and a “service” area housing the buttons and sliders necessary for more complex operations. Optical traps are represented by coloured circles in the manipulation area, and are created by pressing a finger against the display at the desired location. Once the finger is released, the trap is destroyed, allowing for a very intuitive interaction with the sample. Selected traps can be preserved or collected into groups using controls in the service area. Once designated as persistent, traps or groups of traps are not destroyed when a user removes their fingers, and may be reselected by a simple touch and manipulated as before. Translations are accomplished by dragging, while sliders in the service area allow control of the z-position and relative intensity of each trap.
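This trap lifecycle maps naturally onto three touch events: touch-down creates (or re-grabs) a trap, touch-move drags it, and touch-up destroys it unless it has been made persistent. The sketch below is our own reconstruction with hypothetical class and method names; the actual interface software is not published.

    class Trap:
        """One optical trap; z and intensity are set via service-area sliders."""
        def __init__(self, x, y, z=0.0, intensity=1.0):
            self.x, self.y, self.z = x, y, z
            self.intensity = intensity
            self.persistent = False

    class TrapManager:
        GRAB_RADIUS = 10.0  # assumed pick-up distance in screen pixels

        def __init__(self):
            self.held = {}        # touch_id -> Trap currently under a finger
            self.persistent = []  # traps that survive finger release

        def touch_down(self, touch_id, x, y):
            # Re-grab a persistent trap if the finger lands on one,
            # otherwise create a new temporary trap at the fingertip.
            for trap in self.persistent:
                if (trap.x - x) ** 2 + (trap.y - y) ** 2 < self.GRAB_RADIUS ** 2:
                    self.held[touch_id] = trap
                    return
            self.held[touch_id] = Trap(x, y)

        def touch_move(self, touch_id, x, y):
            # Traps follow the fingertip while it stays on the surface
            self.held[touch_id].x, self.held[touch_id].y = x, y

        def make_persistent(self, touch_id):
            # Bound to a button in the "service" area of the display
            trap = self.held[touch_id]
            trap.persistent = True
            if trap not in self.persistent:
                self.persistent.append(trap)

        def touch_up(self, touch_id):
            # Temporary traps are destroyed on release; persistent ones remain
            self.held.pop(touch_id)

        def all_traps(self):
            """Every live trap, to be passed on for hologram calculation."""
            return self.persistent + [t for t in self.held.values()
                                      if t not in self.persistent]

    # A finger press creates a trap that follows subsequent moves:
    mgr = TrapManager()
    mgr.touch_down(0, 120.0, 80.0)
    mgr.touch_move(0, 130.0, 85.0)
    mgr.make_persistent(0)   # service-area button pressed
    mgr.touch_up(0)          # trap survives because it is persistent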

Groups of traps are identified by a common colour, and can be manipulated as single entities by the selection of a visual marker. Transformations such as rotations, translations and scaling in 3D can be performed by moving any group member relative to the relevant marker, with the transformation applied to the group as a whole. Each trap within the group can still be manipulated individually by selecting it directly.
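For the in-plane case, the group transformation just described can be computed as a similarity transform about the marker, inferred from the motion of a single finger. The function below is a minimal sketch under that assumption; the paper’s exact gesture mapping, and its extension to 3D, may differ.

    import numpy as np

    def similarity_about_marker(marker, finger_old, finger_new):
        """Infer the rotation and scaling implied by one finger moving
        relative to a fixed group marker, and return a function that
        applies the same transform to any trap position (in-plane only)."""
        p = np.asarray(marker, dtype=float)
        v0 = np.asarray(finger_old, dtype=float) - p
        v1 = np.asarray(finger_new, dtype=float) - p

        scale = np.linalg.norm(v1) / np.linalg.norm(v0)
        dtheta = np.arctan2(v1[1], v1[0]) - np.arctan2(v0[1], v0[0])
        c, s = np.cos(dtheta), np.sin(dtheta)
        R = scale * np.array([[c, -s], [s, c]])  # combined rotate + scale

        def apply(point):
            # Transform the point about the marker position
            return p + R @ (np.asarray(point, dtype=float) - p)
        return apply

    # Dragging one finger from (1, 0) to (0, 2) about a marker at the origin
    # rotates the group by 90 degrees and doubles its size:
    transform = similarity_about_marker((0, 0), (1, 0), (0, 2))
    triangle = [(1, 0), (0, 1), (-1, -1)]
    print([transform(q) for q in triangle])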

The position of each optical trap is then passed to a third computer for hologram calculation. Holograms are calculated using the “Gratings and Lenses” algorithm [8, 19, 27] on a dual-core 3.0 GHz PC. Holograms are typically calculated at 5 Hz for more than 25 optical traps, with the rate rising as traps are removed until 15 Hz is reached for five traps, at which point the speed is again limited by the FireWire update rate. Future improvements to the real-time implementation of the algorithm are likely to yield hologram calculation at SLM update rates.
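The “Gratings and Lenses” algorithm itself is compact: each trap contributes a blazed grating, which displaces it laterally, plus a Fresnel lens, which displaces it axially, and the displayed hologram is the argument of the complex sum of these contributions. The NumPy sketch below illustrates the idea; the resolution, wavelength, focal length and pupil radius are illustrative values rather than the parameters of the system described here.

    import numpy as np

    def gratings_and_lenses(traps, n=512, wavelength=800e-9,
                            focal_length=1.8e-3, pupil_radius=2.5e-3):
        """Phase hologram for traps given as (x, y, z, amplitude), with
        x, y, z in metres in sample space.

        All optical parameters here are illustrative assumptions.
        """
        # SLM pixel coordinates mapped onto the objective pupil plane
        u = np.linspace(-pupil_radius, pupil_radius, n)
        u, v = np.meshgrid(u, u)

        field = np.zeros((n, n), dtype=complex)
        for x, y, z, amp in traps:
            # Blazed-grating terms displace the trap laterally;
            # the Fresnel-lens term displaces it along the optical axis.
            phase = (2 * np.pi / (wavelength * focal_length)) * (x * u + y * v) \
                  + (np.pi * z / (wavelength * focal_length ** 2)) * (u ** 2 + v ** 2)
            field += amp * np.exp(1j * phase)

        # The displayed hologram is the argument of the complex superposition
        return np.mod(np.angle(field), 2 * np.pi)

    # Three traps: two in the focal plane, one displaced 5 um along the axis
    holo = gratings_and_lenses([( 5e-6, 0.0, 0.0, 1.0),
                                (-5e-6, 0.0, 0.0, 1.0),
                                ( 0.0, 5e-6, 5e-6, 1.0)])

The cost of this direct evaluation scales with the number of traps multiplied by the number of SLM pixels, which is consistent with the quoted drop from 15 Hz for five traps to 5 Hz for more than 25.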

3. Demonstrations of the system

One of the limitations of conventional control interfaces for multiple optical tweezers is that they do not allow simultaneous and independent interactive control of multiple traps. In experiments where a rapid response to changing conditions is required, or where several optical traps must be manipulated at the same time, traditional interfaces fail. For example, achieving independent translation of two trapped silica beads in two dimensions is impossible with the traditional “point-and-click” interface. Figure 4 shows how simple this two-sphere problem becomes with the multitouch console.

To demonstrate another feature of the multitouch console, three microspheres are trapped using the above process and associated via the grouping functions. This group is then translated, rotated and scaled using two fingers. All three operations are performed first sequentially, as would be achieved using a traditional mouse interface, and then simultaneously. This is shown in Fig. 5 (Media 2).

Another major strength of the system is how intuitive and powerful it is when trapping non-spherical objects. This is demonstrated in Fig. 6, where a 300 nm diameter, 12 μm long cadmium sulphide (CdS) rod is trapped perpendicular to the optical axis and manipulated. The high refractive index of CdS (n = 2.26 at 800 nm) makes optical trapping more challenging, and the morphology of the rod is such that the optical traps need to be applied almost simultaneously at both ends to prevent the rod aligning its long axis with the beam. The ability of a user to create an optical trap at any position in the plane and make rapid adjustments immediately prior to trapping means that the rapid Brownian fluctuations of the rod are not overly troublesome. A typical point-and-click mouse interface requires traps to be switched off or positioned well away from the item of interest, with the user forced to wait until the rod is in exactly the right place before switching the traps back on or rapidly moving them into position. Such a series of operations is neither intuitive nor obvious. However, as can be seen from Fig. 6 (Media 3), the operation is trivial with the multitouch console.

Fig. 3. The multitouch console in action: two users interact with the system to reposition groups of SiO2 spheres. Further detail is given in Fig. 7.

Fig. 4. A user creates an optical trap by touching the screen at any desired location over the live video feed. Optical traps follow the user’s fingertips as they move over the surface; when the user removes his or her fingers the optical traps are destroyed. Pressing one of the buttons makes the traps persistent, or classifies them as a group. Here two 3 μm silica microspheres are trapped and manipulated, individually and then together, demonstrating both temporary and persistent operating modes (Media 1).

The grouping operations available in the system also simplify the movement of multiple optical traps. It is possible to transform multiple groups of traps at once and to operate the system with multiple users. An example of two users transforming three groups of microspheres is shown in Fig. 7 (Media 4). Again the video shows that working with multiple users and multiple groups of traps is routine on the system.

Fig. 5. The group transformation algorithms in action: a group of three 3 μm SiO2 spheres is optically trapped and the resulting triangle is transformed using two fingers. As seen, translations, rotations and scalings can all be performed sequentially or simultaneously, an advantage over virtually all other interfaces (Media 2).

Fig. 6. A CdS rod is optically trapped using the multitouch console. Being able to place a trap at each end of the rod allows the user to quickly trap and control it (Media 3).

Fig. 7. Three groups of microspheres are manipulated by two users simultaneously, showing functionality not possible with more traditional interfaces (Media 4).

4. Conclusions

We have implemented an innovative multitouch interface for the interactive, real-time control of a holographic optical tweezers system. Users with virtually no training in optical tweezers are able to control optical traps and perform complex operations with just the touch of a finger. We believe that the capacity for true real-time independent control of numerous simultaneous traps, coupled with visual feedback directly beneath the user’s fingertips, will not only increase experimental throughput and speed the training of operators but also open many doors in interdisciplinary research; for example, this system could be used in studies of motile cells or in single-cell microsurgery. To our knowledge, this is the first time a multitouch human-computer interface has been used to control a complex physical instrument. Such interfaces need not be limited to research in the physical sciences, and can be adapted for other devices and systems where interactive, simultaneous, multiple-point control is required.

Acknowledgments

We would like to thank Mr L Ikin for supplying the CdS nanorods. This project is funded by a Basic Technology Grant through the Research Councils of the United Kingdom.

References and links

1. M. E. J. Friese, A. G. Truscott, H. Rubinsztein-Dunlop, and N. R. Heckenberg, “Three-Dimensional Imaging with Optical Tweezers,” Appl. Opt. 38, 6597–6603 (1999). [CrossRef]  

2. G. M. Wang, J. C. Reid, D. M. Carberry, D. R. M. Williams, E. M. Sevick, and D. J. Evans, “Experimental study of the fluctuation theorem in a nonequilibrium steady state,” Phys. Rev. E 71, 046142 (2005). [CrossRef]

3. D. C. Benito, D. M. Carberry, S. H. Simpson, G. M. Gibson, M. J. Padgett, J. G. Rarity, M. J. Miles, and S. Hanna, “Constructing 3D crystal templates for photonic band gap materials using holographic optical tweezers,” Opt. Express 16, 13005–13015 (2008). [CrossRef]

4. J. P. Reid, H. Meresman, L. Mitchem, and R. Symes, “Spectroscopic studies of the size and composition of single aerosol droplets,” Int. Rev. Phys. Chem. 26, 139–192 (2007). [CrossRef]  

5. A. Ashkin and J. M. Dziedzic, “Internal cell manipulation using infrared laser traps,” Proc. Natl. Acad. Sci. USA 86, 7914–7918 (1989). [CrossRef]   [PubMed]  

6. C. Bustamante, J. C. Macosko, and G. J. L. Wuite, “Grabbing the cat by the tail: manipulating molecules one by one,” Nat. Rev. Mol. Cell. Biol. 1, 130–136 (2000). [CrossRef]  

7. A. Ashkin, J. M. Dziedzic, J. E. Bjorkholm, and S. Chu, “Observation of a single-beam gradient force optical trap for dielectric particles,” Opt. Lett. 11, 288 (1986). [CrossRef]   [PubMed]  

8. M. Reicherter, T. Haist, E. U. Wagemann, and H. J. Tiziani, “Optical particle trapping with computer-generated holograms written on a liquid-crystal display,” Opt. Lett. 24, 608–610 (1999). [CrossRef]  

9. J. E. Curtis, B. A. Koss, and D. G. Grier, “Dynamic holographic optical tweezers,” Opt. Commun. 207, 169–175 (2002). [CrossRef]  

10. P. Jordan, J. Leach, M. Padgett, P. Blackburn, N. Isaacs, M. Goksor, D. Hanstorp, A. Wright, J. Girkin, and J. Cooper, “Creating permanent 3D arrangements of isolated cells using holographic optical tweezers,” Lab Chip 5, 1224–1228 (2005). [CrossRef] [PubMed]

11. J. Glückstad, “Phase contrast image synthesis,” Opt. Commun. 130, 225–230 (1996). [CrossRef]  

12. I. R. Perch-Nielsen, P. J. Rodrigo, C. A. Alonzo, and J. Glückstad, “Autonomous and 3D real-time multi-beam manipulation in a microfluidic environment,” Opt. Express 14, 12199–12205 (2006). [CrossRef] [PubMed]

13. H. Ulriksen, J. Thøgersen, S. Keiding, I. Perch-Nielsen, J. Dam, D. Palima, H. Stapelfeldt, and J. Glückstad, “Independent trapping, manipulation and characterization by an all-optical biophotonics workstation,” J. Eur. Opt. Soc. Rapid Publ. 3, 08034 (2008).

14. K. C. Neuman and S. M. Block, “Optical trapping,” Rev. Sci. Instrum. 75, 2787–2809 (2004). [CrossRef]  

15. C. Mio, T. Gong, A. Terray, and D. W. M. Marr, “Design of a scanning laser optical trap for multiparticle manipulation,” Rev. Sci. Instrum. 71, 2196–2200 (2000). [CrossRef]  

16. G. Gibson, L. Barron, F. Beck, G. Whyte, and M. Padgett, “Optically controlled grippers for manipulating micron-sized particles,” New J. Phys. 9, 14 (2007). [CrossRef]  

17. G. Whyte, G. Gibson, J. Leach, M. Padgett, D. Robert, and M. Miles, “An optical trapped microhand for manipulating micron-sized objects,” Opt. Express 14, 12497–12502 (2006). [CrossRef] [PubMed]

18. I.-Y. Park, S.-Y. Sung, J.-H. Lee, and Y.-G. Lee, “Manufacturing micro-scale structures by an optical tweezers system controlled by five finger tips,” J. Micromech. Microeng. 17, N82–N89 (2007). [CrossRef]  

19. J. Leach, K. Wulff, G. Sinclair, P. Jordan, J. Courtial, L. Thomson, G. Gibson, K. Karunwi, J. Cooper, Z. J. Laczik, and M. Padgett, “Interactive approach to optical tweezers control,” Appl. Opt. 45, 897–903 (2006). [CrossRef]   [PubMed]  

20. J. Han, “Low-Cost Multi-Touch Sensing through Frustrated Total Internal Reflection,” 18th Annual ACM Symposium on User Interface Software and Technology (ACM, Seattle, Washington, USA, 2005).

21. E. Tse, S. Greenberg, C. Shen, C. Forlines, and R. Kodama, “Exploring True Multi-User Multimodal Interaction over a Digital Table,” in Proceedings of DIS, vol. 8, pp. 25–27 (2008).

22. Y. Gingold, P. Davidson, J. Han, and D. Zorin, “A direct texture placement and editing interface,” in Proceedings of the 19th Annual ACM Symposium on User Interface Software and Technology, pp. 23–32 (2006).

23. M. Kaltenbrunner, S. Jorda, G. Geiger, and M. Alonso, “The reacTable*: A collaborative musical instrument,” in 15th IEEE International Workshop on Enabling Technologies - Infrastructure for Collaborative Enterprises (WET ICE 2006), S. M. Reddy, ed., pp. 406–411 (Manchester, England, 2006).

24. G. Gibson, D. M. Carberry, G. Whyte, J. Leach, J. Courtial, J. C. Jackson, D. Robert, M. Miles, and M. Padgett, “Holographic assembly workstation for optical manipulation,” J. Opt. A 10, 044009 (2008). [CrossRef]

25. G. M. Gibson, J. Leach, S. Keen, A. J. Wright, and M. J. Padgett, “Measuring the accuracy of particle position and force in optical tweezers using high-speed video microscopy,” Opt. Express 16, 14561–14570 (2008). [CrossRef] [PubMed]

26. NUI Group, “TouchLib: A Multitouch Development Kit,” (2008), http://nuigroup.com/touchlib/.

27. J. Liesener, M. Reicherter, T. Haist, and H. J. Tiziani, “Multi-functional optical tweezers using computer-generated holograms,” Opt. Commun. 185, 77–82 (2000). [CrossRef]  

Supplementary Material (4)

Media 1: MOV (7683 KB)     
Media 2: MOV (5968 KB)     
Media 3: MOV (7996 KB)     
Media 4: MOV (5006 KB)     
