Scalable visible light communications with a micro-LED array projector and high-speed smartphone camera

Open Access

Abstract

Solid-state camera systems are now widely available in portable consumer electronics, providing potential receivers for visible light communications in every device. Typically, data rates with camera receivers are limited by the 60 fps frame rate of both the image sensor and the projector system. Recent developments in high-frame-rate microdisplays and slow-motion cameras for smartphones now permit high-speed, spatially structured signals to be transmitted and captured. Here, we present a method for transmitting data to a smartphone using a CMOS-controlled micro-LED projector system. Spatial patterns are projected onto a wall at a refresh rate of 480 Hz and captured by the smartphone’s 960 fps camera. Data transfer is performed over meter-scale distances, and the use of an alignment frame gives the system a level of tolerance to motion and misalignment. The current system allows data transmission at a peak rate of 122.88 kb/s using a 16 × 16 micro-LED array, which can be readily scaled to Mb/s rates with a higher-resolution transmitter.

Published by The Optical Society under the terms of the Creative Commons Attribution 4.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.

1. Introduction

Visible light communications (VLC) offers complementarity to radio frequency (RF) technology, with licence-free operation, low-cost components, improved security due to localised signals, and the potential for integration with solid-state lighting [1]. It is expected that VLC will be employed to ease the strain on RF communications caused by limited spectral availability and ever-increasing volumes of data traffic. While many optical communication systems utilise high-speed photodetectors, an alternative method known as optical camera communications (OCC) makes use of image sensors as receivers [2]. This is an attractive approach for communicating with consumer products such as smartphones and laptops, as such devices are already supplied with high resolution image sensors. Using these existing sensors as optical receivers, in a multi-functional manner, allows data communication over the optical spectrum without the need for additional hardware. However, a fundamental issue in OCC is the image sensor frame rate, typically 30 or 60 fps. If each exposure captures a single state of an optical signal, the frame rate limits the receiver’s sampling frequency. In a signal encoded with on-off keying (OOK), a maximum of one bit can be received for each frame, resulting in 30 or 60 b/s. Additionally, using lower frequency optical signals causes visible flickering of the optical transmitter, which may be unacceptable in practical scenarios. For OCC to perform competitively with other VLC systems, data rates must see significant improvement.

One approach to improve data rate and avoid flicker is to use undersampling methods with phase-shift keying (PSK) or quadrature amplitude modulation (QAM) [3, 4]. With this approach, the limiting factor is the shutter speed of the camera rather than the frame rate. However, even with high-order modulation schemes, achieved data rates have been less than 1 kb/s. An alternative method is to exploit the “rolling shutter” effect of the image sensor [5–8]. As each row of pixels is read out sequentially, higher frequency signals appear in the camera image as bright and dark bands, and data rates of several kb/s are achievable. Data throughput can be further increased by implementing wavelength division multiplexing (WDM) [9] or multi-level optical signals, reaching data rates of tens of kb/s [10].

In this paper, we follow a different approach that makes use of the spatial resolution of the image sensors to transmit data streams in a highly parallel manner. Instead of a single emitter, our transmitter is a display system showing dynamic visual codes, similar to quick response (QR) codes, where each pixel in the image transmits a data sequence [11]. Previous implementations of such a scheme with a high-density set of pixels, such as in a smartphone display, enabled data rates of hundreds of kb/s [12, 13], but were limited by the refresh rate of the display in addition to that of the camera. In order to push to higher data rates, additional dimensions such as colour, intensity, and cell shape can be exploited within the transmitted image. With this highly multiplexed, “high-density modulation” (HDM) approach, screen-to-camera communication from one smartphone to another has been demonstrated at a data rate of a few Mb/s [14]. However, such systems are still fundamentally limited by both the 60 fps frame rate of the transmitting display and the frame rate of the receiver camera. In addition, significant computational effort, including the use of a neural network, is required to process the images, and the tolerance to motion or misalignment is low.

This paper demonstrates how active matrix micro-light-emitting diode (micro-LED) displays provide a technological route to overcome these bottlenecks. Such high-brightness micro-displays are expected to become widely available as a new generation of display technology [15–17]. Independent control of every pixel in the array allows temporal and spatial structuring of the emitted light, enabling “smart lighting” technology, where additional functionality such as data communications and positioning can be performed through light fittings, displays or projector systems [8, 18–20]. Micro-LED devices are very compact structured light projection systems, able to generate spatial patterns at high frame rates from a mm-scale chip comparable in size to a typical single broad-area LED. The modulation bandwidth of individual micro-LED pixels in an array can reach hundreds of MHz [1], allowing extremely high frame rate display systems to be developed, with 30 kfps already demonstrated [21]. In parallel, consumer electronics such as smartphones are now often produced with slow-motion camera systems, routinely with frame rates up to 240 fps and a few offering 960 fps. While the highest speeds are currently available in only a few smartphones, this is indicative of a trend in consumer imaging systems, and higher frame rate systems are likely to become increasingly commonplace. The combination of these emerging technologies presents an opportunity for high data-rate OCC with consumer-level receiver hardware, as the combined transmitter and receiver frame rate scaling directly improves data rates. By optically projecting spatial patterns at a high refresh rate, data rate improvements can be obtained in both the spatial and temporal domains.

Here, we present an OCC system employing a bespoke, complementary metal-oxide-semiconductor (CMOS) controlled micro-LED array as a projector to display binary patterns on a wall. The projector is, for demonstration, operated at a refresh rate of 480 Hz, allowing a high speed, highly parallel optical communication link that can be received by a current generation smartphone camera capturing image data at 960 fps. A simple pair of pilot frames displayed before each data payload allows synchronisation and continuous re-alignment of the pattern image, meaning a user does not have to remain strictly stationary or tightly aligned to the image. Such a system has potential in applications where a user may require specific data transmission based on their current location. Additionally, it could be used for bi-directional communications, either with a suitable micro-LED projector system mounted on the smartphone, or in future devices which are expected to use micro-LED technology for displays.

Fig. 1 Photograph of the micro-LED projector system. (Inset) micrograph of the micro-LED array displaying a pseudo-random pattern.

Fig. 2 (a) Schematic of the experimental setup; (b), (c) single frames from a 960 fps video showing two exemplar pseudo-random patterns.

2. Methods

A photograph of the projector system is shown in Fig. 1. The transmitter is an array of gallium nitride micro-LEDs with 450 nm emission wavelength. The 99 × 99 μm² pixels are arranged on a 100 μm pitch, producing a 16 × 16 grid of tessellated squares. The LED array is bump-bonded to a set of matching CMOS control electronics, providing a digitally controlled driver for each pixel. This chip-scale, mm-size system is capable of displaying binary patterns by setting each pixel in an on or off state, an example of which is shown in the inset of Fig. 1. Details of the fabrication and performance of similar devices can be found in [22] and [23]. The experimental arrangement is shown schematically in Fig. 2. The micro-LED array is imaged onto a wall in the laboratory with a set of optics (Navitar MVL8M23) at a distance of 1 m, producing an image approximately 20 cm along the edge dimension. The image can be captured by a camera system, allowing the patterns to transmit data from the micro-LED projector to a portable device. This arrangement has been adopted as it demonstrates indoor-range communication to users entering a static location. If the array were to display a single, static pattern, it would function similarly to a QR code. However, the array is capable of updating the displayed pattern at high frame rates, allowing higher data rate transmission while appearing static to a human observer. Note that this frame rate is limited by the present drive chip electronics and could, in principle, reach MHz rates or higher in a future device.
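
For a rough sense of scale (an estimate derived only from the pitch, image size and throw quoted above, not a measured value), the projection geometry is approximately

\[ 16 \times 100\,\mu\mathrm{m} \approx 1.6\ \mathrm{mm}, \qquad M \approx \frac{20\ \mathrm{cm}}{1.6\ \mathrm{mm}} \approx 125, \qquad 125 \times 100\,\mu\mathrm{m} \approx 12.5\ \mathrm{mm}, \]

so each micro-LED pixel maps to a projected square on the order of a centimetre across on the wall, which is readily resolvable by the camera system described below.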

Image data is captured using a Samsung Galaxy S9 smartphone, recording 720p video at a frame rate of 960 fps. At this frame rate, the camera is able to resolve the individual binary patterns projected by the micro-LED array. The patterns are updated at a rate of 480 Hz in order to satisfy the Nyquist criterion with the camera sampling frequency of 960 Hz, allowing synchronisation and decoding to be performed. With this pattern update rate, each emitter in the 16 × 16 array effectively transmits binary data at 480 b/s, resulting in an aggregate data rate of 16 × 16 × 480 b/s = 122.88 kb/s. The camera can only record approximately 0.2 s of video at this high frame rate, limited by the internal buffer memory, so the maximum length of time data can be streamed before repeating is 0.1 s, in order to guarantee the full stream is captured. Future improvements in smartphone camera technology are likely to enable longer streams to be transmitted. The camera is held approximately 1 m away from the wall to capture the images, and no particular effort is made to align the camera with the projected image. Figures 2(b) and 2(c) show example pseudo-random frames from the high speed video.
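
As a compact summary of the link parameters (a worked restatement of the values above):

\[ f_{\mathrm{pattern}} = \frac{f_{\mathrm{camera}}}{2} = \frac{960}{2} = 480\ \mathrm{Hz}, \qquad R_{\mathrm{peak}} = 16 \times 16 \times 480 = 122\,880\ \mathrm{b/s} \approx 122.88\ \mathrm{kb/s}. \]

Because the start of the recording is unsynchronised with the transmitter, limiting the stream to half the 0.2 s capture window and repeating it continuously guarantees that at least one complete copy falls inside any recorded burst.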

Fig. 3 The block of frames transmitted for synchronisation, alignment and data transmission.

To facilitate data transmission, the binary frames are arranged in blocks of 10, as shown in Fig. 3. The first frame of each block has every pixel switched off, in order to provide a synchronisation point for decoding. The second frame enables only the corner pixels of the array and a small number of pixels in the lower right quadrant. This allows the receiver to locate the projection and produce a set of pixel coordinates for each LED element, as well as determine the correct orientation of the received patterns. In this way, pixel coordinates can be re-determined for every block of frames, so the system does not need to be strictly aligned, nor does the receiver need to remain in the same position, provided it does not move significantly in the time taken to transfer a single block. The remaining 8 frames are used to project pseudo-random patterns for data transfer. While alternative methods for synchronisation and alignment are possible, this sequence was chosen for simplicity, despite the fact that 2 frames of every 10 are not transmitting data, resulting in a 20% overhead. This overhead, combined with the 0.1 s of transmission time, results in transmission of 9.83 kb of data for each acquisition of high frame rate images. The data rate of 122.88 kb/s can be thought of as a peak rate, as it cannot currently be maintained indefinitely. This is due to the internal buffer memory on the smartphone limiting acquisition times. It is likely that future improvements in camera technology will lift this limitation, due to consumer demand for longer slow-motion capture. In principle, the data rate could also be maintained with custom digital signal processing hardware.
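
As an illustration only, the short MATLAB sketch below assembles one such 10-frame block. It is not the authors' drive software, and the exact corner/quadrant marker layout in the alignment frame (the indices chosen for the lower-right marker) is a hypothetical stand-in for the pattern described above.

n = 16;                                   % micro-LED array resolution
block = false(n, n, 10);                  % one block = 10 binary frames

% Frame 1: all pixels off, providing the synchronisation point for decoding.

% Frame 2: alignment frame -- four corner pixels plus a small marker in the
% lower-right quadrant to fix position and orientation (layout illustrative).
align = false(n, n);
align([1 n], [1 n]) = true;
align(n-3:n-2, n-3:n-2) = true;
block(:, :, 2) = align;

% Frames 3-10: pseudo-random data payloads, 8 x 256 = 2048 bits per block.
block(:, :, 3:10) = rand(n, n, 8) > 0.5;

% With 2 of every 10 frames used as overhead, a 0.1 s capture carries
% 0.8 * 122.88 kb/s * 0.1 s = 9.83 kb of payload, as quoted above.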

The video data is currently processed offline in MATLAB. Synchronisation is performed for each block of 10 frames. Corner points of the alignment image are selected by user input, followed by automatic determination of the pixel grid and decoding of individual data streams. The decoding method is not computationally intensive, with the current scripts performing an estimated 4n² + 8n floating point operations to process each block of 10 n × n frames, which could be processed rapidly with the hardware available on the smartphone.
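
For the 16 × 16 array used here, this estimate works out to a very light load:

\[ 4n^2 + 8n \,\big|_{n=16} = 1024 + 128 = 1152\ \text{operations per block}, \]

and at 480 patterns per second (48 blocks per second) this amounts to roughly 5.5 × 10⁴ operations per second, well within the capability of a smartphone processor.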

Fig. 4 Pixel addressing maps (left) and captured frames from the 960 fps video under dark (middle) and room lighting (right) conditions for the alignment frame (top) and an example pseudo-random frame (bottom). The LED co-ordinates determined from the alignment frame are shown in white.

3. Results

Figure 4 shows an example set of image frames from the high speed video recordings. The experiment was performed under low background conditions, with room lights switched off, and under normal office lighting conditions. The top row shows plots for the alignment frame, displaying the pixel activation map and the images captured with the camera under the two background conditions. A grid of LED pixel locations is generated by computing a linearly spaced set of co-ordinates between the locations of the corner pixels. Each LED pixel is therefore assigned a corresponding image pixel, shown in white in the top-middle plot as an example. This accounts for skew or tilt of the image that may be introduced by the viewing angle of the camera.
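
A minimal MATLAB sketch of this grid-generation step, assuming four corner coordinates have already been identified (the numerical values below are arbitrary placeholders, not measured data), is:

n = 16;                                       % LED array resolution
% Placeholder corner locations [x y] in image pixels: top-left, top-right,
% bottom-left, bottom-right (in practice these come from the alignment frame).
TL = [412 118]; TR = [860 130];
BL = [405 565]; BR = [855 572];

[u, v] = meshgrid(linspace(0, 1, n));         % normalised grid positions
% Bilinear blend of the four corners gives linearly spaced coordinates and
% absorbs skew or tilt introduced by the camera viewing angle.
X = (1-u).*(1-v)*TL(1) + u.*(1-v)*TR(1) + (1-u).*v*BL(1) + u.*v*BR(1);
Y = (1-u).*(1-v)*TL(2) + u.*(1-v)*TR(2) + (1-u).*v*BL(2) + u.*v*BR(2);
% X(i,j), Y(i,j) now give the image pixel assigned to LED (i,j).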

Fig. 5 Decoding of pixel [4, 8] across a transmitted block for dark and room lighting conditions. Thresholds are indicated by solid lines.

With pixel locations and image orientation determined from the alignment frame, the following pseudo-random frames can be processed. The lower plots in Fig. 4 show the active pixels for an example pattern and the corresponding camera frames. For each LED co-ordinate and frame, a greyscale value is determined from the camera as the sum of RGB values for the corresponding image pixel. This provides 256 parallel data streams, one for each LED element. Each of these streams can be processed individually, as shown for the example pixel (8,4) in Fig. 5. By setting a threshold value for each pixel individually, the data streams can be decoded while accounting for some of the non-uniformity in the array. In a real system, this threshold could be determined with a training sequence. Figure 5 also shows the received pixel values for typical indoor ambient lighting conditions. Both high and low logic levels are shifted up, as the sensor collects more background light, and the signal-to-noise ratio (SNR) is reduced. This can make it more difficult to decode data streams if the LED pixel does not emit enough light. The array used here has a brightness of 3.88 × 10⁵ cd m⁻², emitting 1.35 lm into the optics; however, we note that optimised micro-LED devices are capable of producing 10⁶–10⁷ cd m⁻² [16].
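
A hedged MATLAB sketch of this per-frame decoding step is shown below, with placeholder data standing in for a real video frame, coordinate grid and threshold map; it simply sums the RGB values at each assigned coordinate and compares the result with that pixel's threshold.

% Placeholder inputs (a real implementation would take these from the video
% and the alignment step): one 720p RGB frame, LED coordinates, thresholds.
frame  = randi(255, 720, 1280, 3, 'uint8');
X      = randi([1 1280], 16, 16);
Y      = randi([1 720],  16, 16);
thresh = 380 * ones(16, 16);                 % per-pixel thresholds (e.g. from training)

grey = zeros(16, 16);
for i = 1:16
    for j = 1:16
        % Greyscale value = R + G + B at the image pixel assigned to LED (i,j).
        grey(i, j) = sum(double(frame(Y(i, j), X(i, j), :)));
    end
end
bits = grey > thresh;                        % decoded binary frame (16 x 16 logical)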

Fig. 6 False colour map of bit error ratios by pixel in (a) dark conditions and (b) room lighting.

By comparing the decoded values to the binary frames that were initially transmitted, the bit error ratio (BER) for each pixel can be found. Figure 6 shows pixel BER for both dark and room-lit conditions, for a set of 5 data blocks. This corresponds to a 0.1 s burst of data transfer, the maximum that can be collected with an unsynchronised receiver collecting 0.2 s of frames. With the 16 × 16 array, this time window transmits a data payload of 10.24 kb. Under dark conditions, the only pixels with erroneously detected bits are located in a diagonal line in the bottom left portion of the array. All of these pixels are located beneath a crack in the packaging over the LED array, which distorts the projected image and causes optical crosstalk between the pixels beneath it, resulting in a total BER of 8.3 × 10⁻³. All data streams associated with other, undistorted pixels are decoded successfully, implying a BER of 0 is achievable with undamaged packaging.
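
Once decoded and transmitted bits are available, the per-pixel BER map of Fig. 6 reduces to simple bookkeeping. The MATLAB sketch below illustrates this with synthetic bit arrays and an arbitrary 1% flip rate; it is not the authors' analysis script.

% 5 blocks x 8 data frames = 40 decoded frames of 16 x 16 bits.
tx = randi([0 1], 16, 16, 40);               % known pseudo-random transmitted bits
rx = tx;
flips = rand(size(tx)) < 0.01;               % emulate ~1% bit errors for illustration
rx(flips) = 1 - rx(flips);

ber_map   = mean(tx ~= rx, 3);               % 16 x 16 map of per-pixel BER (Fig. 6 style)
ber_total = mean(tx(:) ~= rx(:));            % overall BER across the array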

Under ambient lighting, a significant number of additional pixels record bit errors. This is attributed to brightness non-uniformity in the array, as the pixels which record errors emit less light. As the background illumination level is higher, the SNR of these pixels is no longer sufficient to distinguish between on and off states. A future micro-LED array with improved uniformity and brightness would allow all pixels to be decoded correctly. Nevertheless, over 97% of the transmitted bits are successfully received in this case, giving a BER of 2.13 × 10⁻².

Table 1. Comparison of demonstrated methods for increasing data rate in OCC. Scaling variables for transmitter (Tx) and receiver (Rx) are shown, including frame rates (FR) and resolutions (Res).

4. Discussion

The proof-of-concept experiments presented above demonstrate that the high frame rate micro-LED projector and smartphone receiver combination can be used to increase potential data rates in OCC. However, it is important to consider how performance will scale with moderate improvements in future projector systems, and how this compares with alternative approaches. Table 1 summarises the methods used to overcome the limited frame rates of image sensors and increase OCC data rates, as discussed in the introduction. While achieved data rate is an important performance metric, the scalability, complexity, range, and tolerance to receiver motion should also be considered.

The 122.88 kb/s data rate achieved with the current micro-LED projector system compares well with other spatial coding methods [11, 13], though it falls short of the HDM system [14]. However, the data rate increase here has been realised primarily in the temporal domain, through frame rate, with only moderate exploitation of the spatial domain, as the transmitter used here has a resolution of only 16 × 16. Current micro-LED projectors can display frames at up to 30 kfps [21], and bare micro-LEDs show modulation bandwidths approaching 1 GHz [24]. The availability of high speed cameras allows this speed to be exploited, and as these cameras improve, micro-LED systems should be able to readily keep pace. Of course, such high frame rate receivers could also be used in the other spatial coding and HDM methods. However, as those image transmitters are based on current display technology, the transmitter frame rate of 30/60 fps rapidly becomes a bottleneck and data rate improvements will saturate quickly. In principle, high resolution colour micro-LED displays could also be used to employ the more complex spatial coding techniques at orders of magnitude greater frame rates.

In the spatial domain, the micro-LED system demonstrated here has only made use of a 16 × 16 array of pixels. Scaling to a higher resolution transmitter will rapidly improve data rates. For example, a micro-LED array with the same resolution as the QR codes used in [11] (93 × 93) would achieve a data rate of 4.15 Mb/s, and transmitted images would still be easily resolvable with the current camera system at a resolution of 1280 × 720. In comparison, applying a 960 fps camera to the work in [11] would only yield a 2–4 fold increase in data rate, resulting in 240–480 kb/s. Furthermore, the current micro-LED projector system is based on simple, on-off keyed transmission. Multiple wavelengths and intensities have not yet been exploited, and offer further potential in data rate scaling [11, 24, 25].
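
The quoted scaling figure follows directly from the pattern rate and the proposed resolution:

\[ 93 \times 93 \times 480\ \mathrm{b/s} = 4\,151\,520\ \mathrm{b/s} \approx 4.15\ \mathrm{Mb/s}. \]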

As a major advantage of OCC is the availability of camera systems in hand-held electronics, it may be important for the communication system to operate under poor alignment conditions, under motion, or at significant range, depending on the application area. Many of the previous demonstrations show point-to-point links [3, 4, 10], which have suitable ranges for indoor communications (meter scale) but have little tolerance for misalignment or motion of a receiver. Rolling shutter based communications have been performed using LED light reflected from a surface [5]. This suggests a high degree of flexibility for a moving receiver, especially considering that no spatial information is required. However, the intensity loss results in only a short-range demonstration of 20 cm. The spatially encoded systems rely on anchor or alignment symbols to accurately determine the position of the transmitted image [11, 13, 14]. This can account for some perspective distortion and misalignment effects, though motion blur may still be problematic. With the micro-LED projector system, motion blur will be reduced as the exposure time is shorter. So long as the alignment frame is repeated at a suitable rate, the projected image location should be easily tracked. Additionally, the demonstrated ranges in previous spatial methods are 20–30 cm. In particular, the high data rates of [14] fall to 300 kb/s at 0.5 m. The current micro-LED system operates at a total range of 2 m (transmitter to receiver), and longer potential ranges were not investigated at this time. Given the resolution of the received images, significantly longer ranges should be possible before the spatial patterns can no longer be resolved or decoded.

5. Conclusion

A data transmission method for optical camera communications has been demonstrated using the 960 fps camera available in a consumer smartphone and a CMOS-driven micro-LED projector system. The slow-motion camera mode allows capture of binary patterns which can be updated at a frame rate of 480 Hz. With a 16 × 16 array of micro-LEDs, the data rate is 122.88 kb/s. This data rate is enabled by improvements in the temporal domain, employing transmitter frame rates beyond those possible with current display technology. The primary bottleneck at this stage is transmitter resolution; increasing the number of pixels in the micro-LED array will allow this communication system to readily extend to Mb/s data rates by scaling in the spatial domain. In addition, the system operates at meter-scale ranges with significant tolerance to misalignment, and is based on simple, on-off keyed modulation. This proof-of-concept experiment demonstrates that highly parallel data transmission can be performed with a high frame rate projector system and received by currently available consumer electronics.

The underlying data for this work is available at: https://doi.org/10.15129/915590d1-b69a-4ddb-99fc-01750b04e2ca

Funding

UK Engineering and Physical Sciences Research Council (EPSRC) (EP/M01326X/1 “QuantIC”, EP/S001751/1).

Acknowledgments

The authors would like to acknowledge Prof. Robert K. Henderson for the development of the CMOS control chip under the EPSRC “HYPIX” program.

References

1. S. Rajbhandari, J. J. D. McKendry, J. Herrnsdorf, H. Chun, G. Faulkner, H. Haas, I. M. Watson, D. O'Brien, and M. D. Dawson, “A review of gallium nitride LEDs for multi-gigabit-per-second visible light data communications,” Semicond. Sci. Technol. 32, 1–44 (2017).

2. N. T. Le, M. A. Hossain, and Y. M. Jang, “A survey of design and implementation for optical camera communication,” Signal Process. Image Commun. 53, 95–109 (2017).

3. P. Luo, M. Zhang, Z. Ghassemlooy, H. Le Minh, H. M. Tsai, X. Tang, L. C. Png, and D. Han, “Experimental demonstration of RGB LED-based optical camera communications,” IEEE Photonics J. 7, 1–12 (2015).

4. P. Luo, M. Zhang, Z. Ghassemlooy, H. L. Minh, H. M. Tsai, X. Tang, and D. Han, “Experimental demonstration of a 1024-QAM optical camera communication system,” IEEE Photonics Technol. Lett. 28, 139–142 (2016).

5. C. Danakis, M. Afgani, G. Povey, I. Underwood, and H. Haas, “Using a CMOS camera sensor for visible light communication,” in IEEE Globecom Workshops (IEEE, 2012), pp. 1244–1248.

6. C. W. Chow, W. C. Wang, C. W. Chen, H. C. Hsieh, and Y. T. Chen, “Beacon jointed packet reconstruction scheme for mobile-phone based visible light communications using rolling shutter,” IEEE Photonics J. 9, 1–6 (2017).

7. C. W. Chow, R. J. Shiu, Y. C. Liu, C. H. Yeh, X. L. Liao, K. H. Lin, Y. C. Wang, and Y. Y. Chen, “Secure mobile-phone based visible light communications with different noise-ratio light-panel,” IEEE Photonics J. 10, 1–6 (2018).

8. X. Li, B. Hussain, J. Kang, H. S. Kwok, and C. P. Yue, “Smart μLED display-VLC system with a PD-based/camera-based receiver for NFC applications,” IEEE Photonics J. 11, 1–8 (2019).

9. K. Liang, C.-W. Chow, and Y. Liu, “RGB visible light communication using mobile-phone camera and multi-input multi-output,” Opt. Express 24, 9383–9388 (2016).

10. V. P. Rachim and W. Y. Chung, “Multilevel intensity-modulation for rolling shutter-based optical camera communication,” IEEE Photonics Technol. Lett. 30, 903–906 (2018).

11. T. Fath, F. Schubert, and H. Haas, “Wireless data transmission using visual codes,” Photonics Res. 2, 150 (2014).

12. R. Boubezari, H. Le Minh, Z. Ghassemlooy, and A. Bouridane, “Smartphone camera based visible light communication,” J. Light. Technol. 34, 4120–4126 (2016).

13. T. Hao, R. Zhou, and G. Xing, “COBRA: color barcode streaming for smartphone systems,” in MobiSys (2012), p. 85.

14. W. A. Cahyadi and Y. H. Chung, “Smartphone camera-based device-to-device communication using neural network-assisted high-density modulation,” Opt. Eng. 57, 1–9 (2018).

15. J. Day, J. Li, D. Y. Lie, C. Bradford, J. Y. Lin, and H. X. Jiang, “III-Nitride full-scale high-resolution microdisplays,” Appl. Phys. Lett. 99, 1–4 (2011).

16. J. Herrnsdorf, J. J. D. McKendry, S. Zhang, E. Xie, R. Ferreira, D. Massoubre, A. M. Zuhdi, R. K. Henderson, I. Underwood, S. Watson, A. E. Kelly, E. Gu, and M. D. Dawson, “Active-matrix GaN micro light-emitting diode display with unprecedented brightness,” IEEE Trans. Electron Devices 62, 1918–1925 (2015).

17. S. K. Moore, “The nextgen display: Microleds [News],” IEEE Spectr. 55, 9–10 (2018).

18. Y. Kang, J. Lee, J. Lee, and J. Cha, “A study for indoor position detection using LED QR code and smart-terminal,” in ICTC (IEEE, 2011), pp. 129–131.

19. X. Li, L. Wu, Z. Liu, B. Hussain, W. C. Chong, K. M. Lau, and C. P. Yue, “Design and characterization of active matrix LED microdisplays with embedded visible light communication transmitter,” J. Light. Technol. 34, 3449–3457 (2016).

20. J. Herrnsdorf, M. Strain, E. Gu, R. Henderson, and M. Dawson, “Positioning and space-division multiple access enabled by structured illumination with light-emitting diodes,” J. Light. Technol. 35, 2339–2345 (2017).

21. J. Herrnsdorf, J. J. McKendry, E. Xie, M. J. Strain, I. M. Watson, E. Gu, and M. D. Dawson, “High speed spatial encoding enabled by CMOS-controlled micro-LED arrays,” in IEEE Photonics Society Summer Topical Meeting Series (IEEE, 2016), pp. 173–174.

22. S. Zhang, S. Watson, J. J. D. McKendry, D. Massoubre, A. Cogman, E. Gu, R. K. Henderson, A. E. Kelly, and M. D. Dawson, “1.5 Gbit/s multi-channel visible light communications using CMOS-controlled GaN-based LEDs,” J. Light. Technol. 31, 1211–1216 (2013).

23. J. J. D. McKendry, D. Massoubre, S. Zhang, B. R. Rae, R. P. Green, E. Gu, R. K. Henderson, A. E. Kelly, and M. D. Dawson, “Visible-light communications using a CMOS-controlled micro-light-emitting-diode array,” J. Light. Technol. 30, 61–67 (2012).

24. R. Ferreira, E. Xie, J. McKendry, and S. Rajbhandari, “High bandwidth GaN-based micro-LEDs for multi-Gbps visible light communications,” IEEE Photonics Technol. Lett. 28, 1 (2016).

25. H. Chun, S. Rajbhandari, G. Faulkner, D. Tsonev, E. Xie, J. J. D. McKendry, E. Gu, M. D. Dawson, D. C. O’Brien, and H. Haas, “LED based wavelength division multiplexed 10 Gb/s visible light communications,” J. Light. Technol. 34, 3047–3052 (2016).
