Investigating the angular distortion impact on vehicular optical camera communication (OCC) systems

Open Access

Abstract

Optical camera communication (OCC) shows promise for optical wireless communication (OWC) in vehicular networks. However, vehicle mobility-induced angular distortions hinder system throughput by degrading the gain of the non-isotropic vehicular OCC channel. Few prior works have comprehensively analyzed this impact, especially in terms of the pixel value, which reflects the camera imaging characteristics. To address this knowledge gap, this paper presents a pixel value-described vehicular OCC system model that accounts for transmitter imaging location and intensity from both the geometry and radiometry perspectives, covering common types of offset and rotation angles. We integrate a MATLAB-based simulated vehicular OCC system with an experimentally designed testbed for validation and performance analysis. For a single-time snapshot, we investigate the impacts of common angular distortion types in vehicular OCC systems on maximum pixel value, imaging location, and communication-related metrics. Furthermore, we statistically analyze their influence under two driving scenarios with the corresponding angular distributions. The angular distortion characterization in this work is expected to serve as a stepping stone toward addressing mobility in vehicular OCC systems.

© 2024 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Optical Camera Communication (OCC), as one of the Optical Wireless Communication (OWC) technologies, can be integrated into vehicular communication networks by using vehicle lighting systems as transmitters and dash cameras as receivers [1,2]. With its advantages of unlicensed spectrum, freedom from radio frequency interference, convenient realization of Multiple-Input-Multiple-Output (MIMO) functionality, and joint sensing and communication ability, it is considered a promising complement to the predominantly Radio Frequency (RF)-based Intelligent Transportation Systems (ITS) to further enhance road safety, traffic efficiency, and driving comfort [3].

Investigations on performance analysis offer a foundation for communication system design and optimization. Analytical evaluation, field operational tests, and simulation are the three techniques commonly applied in existing vehicular OCC studies for system analysis [1]. Based on these methods, performance evaluations are carried out according to communication-related criteria such as received optical power, Signal-to-Noise Ratio (SNR), and Bit Error Rate (BER). On the transmitter side, realistic taillight illumination patterns have been measured from empirical data [4]. Optical path loss [5], weather impacts such as long distance-related fog and turbulence [6], link coherence time and link duration [7], as well as the unique characteristic of link asymmetry [8], have been analyzed in detail for the vehicular OCC channel. The impacts of the camera field of view and its internal noise are described in [9] and [10]. Besides, transceiver mobility-induced imaging location variations and performance degradations are discussed in [11] and [12].

Vehicle mobility presents a fundamental impediment to link maintenance and high throughput in vehicular OCC systems. Such mobility leads to two influential aspects, inter-vehicle distance variations and angular distortions, and the impacts of the latter are the focus of this study. Apart from vehicle mobility, road irregularities may also produce such distortions. Although some prior works have paid attention to the angular distortion impact in vehicular communication systems, several research gaps remain. Firstly, a systematic definition and understanding of the angular distortion categories that vehicular OCC systems may encounter is lacking. Some existing studies only consider angular distortions in two of the three directions, using either the horizontal and vertical angles or the longitudinal and lateral shifts for description. Moreover, neither the qualitative nor the quantitative impact on vehicular OCC systems, such as motion-induced photometric effects, has been comprehensively studied, and relationships between driving behavior and the induced angular distortions have seldom been addressed. Secondly, most performance investigations focus on Photo Detector (PD)-based vehicular systems instead of camera-based OCC ones [13]. As cameras record the received intensity as pixel values and provide imaging functionality, optical power and photocurrent-based SNR calculations cannot provide straightforward performance analysis for vehicular OCC systems. Pixel value-oriented research that jointly studies the angular distortion impact on imaging location and intensity, as well as the related communication performance, is therefore needed. Finally, limited studies have integrated theoretical models and experimental testbeds into OCC performance investigations. Since combining these two aspects enables systematic parameter characterization under a more realistic environment, such research should be pursued to benefit OCC's future development.

Motivated by the above-analyzed limitations, this study systematically investigates the angular distortion impact on vehicular OCC systems with the following four main contributions:

  • An analytical vehicular OCC system model which describes the impacts of both the offset angle and three categories of rotation angular distortion is proposed. Different from most existing models, which only consider either mobility-induced intensity or position variations, angular distortion-led changes in both imaging intensity and location are analyzed based on pixel values.
  • An experimentally designed OCC platform has been set up to validate the accuracy of our proposed model and to acquire captured images with the required empirical data. Besides, a MATLAB-based synthetic figure generation method for vehicular OCC systems is introduced based on the analytical model for comparison and further investigation.
  • Driving behavior and road irregularity-induced angular distortions are introduced, and such distortion-led variations of pixel values, Peak Signal-to-Noise Ratio (PSNR), BER, and channel capacity are studied in detail for a single-time snapshot.
  • Given the angular distributions arising from vehicle vibrations and road conditions over a short period in vehicular OCC systems, the pixel value and imaging location distributions are jointly presented based on the model to characterize the distortion impact from a statistical perspective.

The rest of this paper is organized as follows. The proposed pixel value-based vehicular OCC system model considering angular distortion is introduced in Section 2. Data acquisitions based on the OCC hardware platform and MATLAB-assisted simulations are presented in Section 3. Model accuracy validation and performance analysis with various angular distortion conditions are provided in Section 4. Finally, Section 5 concludes this paper.

2. System model

Vehicle-to-vehicle communication is the scenario we focus on in our research. Incoming and outgoing links are the two communication link categories in a vehicular system. As shown in Fig. 1(a), link a is the outgoing channel for B (vehicle B Tx ${\to}$ vehicle A Rx), and link b is the incoming channel for B (vehicle B Rx ${\leftarrow}$ vehicle A Tx) when the communication between vehicles A and B is considered. Similarly, for the communication between vehicles B and C, link c is the outgoing channel for B (vehicle B Tx ${\to}$ vehicle C Rx), and link d is the incoming channel for B (vehicle B Rx ${\leftarrow}$ vehicle C Tx). We take the outgoing link c as an example to develop the vehicular OCC system model, which can also be applied to the incoming link, and a Light Emitting Diode (LED) is used to denote the transmitter without ambiguity. The angular distortion impact on these two links differs due to the unique link asymmetry, which will be discussed in the performance analysis.

Fig. 1. Demonstrations of the vehicular OCC systems. (a) Two common types of communication links. (b) Four coordinate systems setup and common angular distortions.

The LED transmitters' imaging location and imaging intensity are both considered in our proposed vehicular OCC system model. Determining the imaging location supports OCC communication link setup and provides theoretical grounding for describing the geometry-based OCC characteristics. Calculating the imaging intensity is of great benefit in guiding threshold selection and quantization of the radiometry-based OCC features.

2.1 Imaging location determination

The pin-hole camera model is the most frequently applied method for determining the LED imaging location. Transceivers located on the front and rear vehicles are normally assumed to be perfectly parallel to each other and calculation of the LED imaging radius is sufficient to describe the OCC geometry-related features in such cases. However, when the angular distortion phenomenon appears, a more accurate method is needed to describe how the 3-dimensional (3D) LED world points get projected into 2D imaging points on the camera imaging plane.

Imaging geometry, which is a classical and frequently applied solution introduced in areas such as Computer Vision, is used in our proposed model to fulfill the target mentioned above. By referring to this method, we can determine the precise transmitter imaging location under OCC angular distortion conditions [14]. For the sake of the following calculations, the initial four coordinate systems are defined which include the World Coordinate System $(X, Y, Z)$, Camera Coordinate System $({X_c}, {Y_c}, {Z_c})$, Image (Film) Coordinate System $(x, y)$, and Pixel Coordinate System $(u, v)$ as shown in Fig. 1(b). For the World Coordinate System, the original point $O$ is located at the center of the LED array, the $Z$-axis is perpendicular to the LED array plane, the $X$-axis is vertically upward, and the $Y$-axis is facing inwards. For the Camera Coordinate System, the ${Z_c}$-axis is the optic axis, the ${X_c}$-axis is vertically upward, and the ${Y_c}$-axis is facing outwards. The image plane is located $f$ units out along the optics axis, and its origin ${O_i}$ is in the center of the imaging plane. The original point of the Pixel Coordinate System ${O_p}$ is in the upper left corner of the imaging plane.

Offset angle and rotation angle are the two angular distortion categories in vehicular OCC systems. The offset angle is brought by the translation of the transceiver position as shown in Fig. 1(b). Such offset angle doesn’t contribute to LED deformation and normally leads to obvious changes in the LED imaging position. Comparatively, the rotation angle is used to describe the transceiver rotation condition. To be more specific, there are three elemental rotation angles including yaw, pitch, and roll indicated as $\alpha$, $\beta$, and $\gamma$ respectively. These three angles correspond to the rotation about the $Z$($Z_c$), $X$($X_c$), and $Y$($Y_c$) axes and are shown in Fig. 1(b), where rotations of both the transmitter and receiver may lead to the existence of such rotation angles. Different from the general but rough descriptions of rotation phenomena such as the tilted receiver [15] and terminal jiggle [16] applied in some of the studies, various rotation categories are classified here for the sake of a systematic analysis. Impacts of both the offset and rotation angle are considered in our proposed model.

For simplicity, we take one point on the LED as an example to present the forward projection procedure from $({X_L}, {Y_L}, {Z_L})$ under the World Coordinate System to $({u_L}, {v_L})$ under the Pixel Coordinate System. The following calculations only consider the camera rotation impact as a calculation example, but the method can describe the rotation impact of both the transmitter and the receiver.

Rigid transformation is applied to describe the mapping relationship between $({X_L}, {Y_L}, {Z_L})$ and $(X{c_{L}}, Y{c_{L}}, Z{c_{L}})$ under the Camera Coordinate System as [14]

$$\begin{bmatrix} X{c_L} \\ Y{c_L} \\ Z{c_L} \\ 1 \end{bmatrix} = \begin{bmatrix} \mathbf{R} & \mathbf{T} \\ \mathbf{0} & 1 \end{bmatrix} \begin{bmatrix} X_L \\ Y_L \\ Z_L \\ 1 \end{bmatrix},$$
where ${\mathbf {T}}$ is the translation matrix which describes the offset angle with the dimension of 3 $\times$ 1, ${\mathbf {R}}$ is the rotation matrix which describes the rotation angle with the dimension of 3 $\times$ 3, and ${\mathbf {0}}$ is a 1 $\times$ 3 matrix of all zeros. Matrix ${\mathbf {R}}$ can be calculated as
$${\mathbf{R}} = {\mathbf{R}}_\alpha {\mathbf{R}}_\beta {\mathbf{R}}_\gamma = \begin{bmatrix} \cos\alpha & -\sin\alpha & 0 \\ \sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\beta & -\sin\beta \\ 0 & \sin\beta & \cos\beta \end{bmatrix} \begin{bmatrix} \cos\gamma & 0 & \sin\gamma \\ 0 & 1 & 0 \\ -\sin\gamma & 0 & \cos\gamma \end{bmatrix},$$
where $\alpha$, $\beta$, $\gamma$ in Eq. (2) are the rotation angles of yaw, pitch, and roll respectively. For the sake of matrix calculations, homogeneous coordinates are introduced here.

Perspective projection is used to provide the mathematical relationship between the $(X{c_{L}}, Y{c_{L}},$ $Z{c_{L}})$ and $({x_L}, {y_L})$ under the Image (Film) Coordinate System as [14]

$$Z{c_L} \begin{bmatrix} x_L \\ y_L \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X{c_L} \\ Y{c_L} \\ Z{c_L} \\ 1 \end{bmatrix},$$
where $f$ is the camera focal length. The digitization process from $({x_L}, {y_L})$ to $({u_L}, {v_L})$ is described by
$$\begin{bmatrix} u_L \\ v_L \\ 1 \end{bmatrix} = \begin{bmatrix} 1/p_u & 0 & u_0 \\ 0 & 1/p_v & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_L \\ y_L \\ 1 \end{bmatrix},$$
where $({u_0}, {v_0})$ is the coordinate of the Image (Film) Coordinate System original point under the Pixel Coordinate System. ${p_u}$ and ${p_v}$ are the physical length of a pixel along the $u$-axis and $v$-axis respectively.

Based on Eq. (1)–Eq. (4), given an arbitrary point of LED $({X_L}, {Y_L}, {Z_L})$ under the World Coordinate System, its corresponding imaging point $({u_L}, {v_L})$ can be calculated as [14]

$$Z{c_L} \begin{bmatrix} u_L \\ v_L \\ 1 \end{bmatrix} = \begin{bmatrix} 1/p_u & 0 & u_0 \\ 0 & 1/p_v & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} \mathbf{R} & \mathbf{T} \\ \mathbf{0} & 1 \end{bmatrix} \begin{bmatrix} X_L \\ Y_L \\ Z_L \\ 1 \end{bmatrix}.$$
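For concreteness, the forward projection of Eq. (1)–Eq. (5) can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation; the focal length, pixel pitch, and principal point below are hypothetical values chosen only to resemble the testbed in Section 3.

```python
import numpy as np

def rotation_matrix(alpha, beta, gamma):
    """R = R_alpha R_beta R_gamma of Eq. (2): yaw, pitch, and roll."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    R_alpha = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    R_beta = np.array([[1, 0, 0], [0, cb, -sb], [0, sb, cb]])
    R_gamma = np.array([[cg, 0, sg], [0, 1, 0], [-sg, 0, cg]])
    return R_alpha @ R_beta @ R_gamma

def world_to_pixel(P_world, R, T, f, p_u, p_v, u0, v0):
    """Forward projection of Eq. (5): world point -> (u, v) pixel point."""
    Xc, Yc, Zc = R @ P_world + T          # rigid transformation, Eq. (1)
    x, y = f * Xc / Zc, f * Yc / Zc       # perspective projection, Eq. (3)
    return x / p_u + u0, y / p_v + v0     # digitization, Eq. (4)

# Example: LED array origin 20 m away, camera yawed by alpha = 5 degrees.
R = rotation_matrix(np.deg2rad(5), 0, 0)
T = np.array([0.0, 0.0, 20.0])            # hypothetical translation (m)
u, v = world_to_pixel(np.zeros(3), R, T, f=0.05,
                      p_u=6.5e-6, p_v=6.5e-6, u0=512, v0=512)
print(f"imaging point: ({u:.1f}, {v:.1f})")
```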

2.2 Imaging intensity determination

Most of the existing OCC system models pay limited attention to the incidence and irradiance angle variations caused by the transceiver offset and rotation angles when calculating the LED imaging intensity; these two angles are usually assumed to be independent of the transceiver geometry. Our proposed calculation method builds partly on our previous research in [10] and further considers the angular distortion-induced pixel value variations to fill this research gap. The calculation of one example pixel value within the LED imaging area is illustrated in Fig. 2.

Fig. 2. Calculation process of one example pixel value.

Vehicle lighting systems, including headlamps and taillights, can serve as transmitters. Without loss of generality, the applied transmitter is abstracted into a Lambertian-pattern LED array in our vehicular OCC system. Each LED is assumed to operate in the linear region of its current-power relationship to avoid signal distortion [17]. The transmitted optical power of the $i$-th LED in the array, ${{P_T}_{_i}(t)}$, is given as

$${P_T}_{_i}(t) = k{I_i}(t),$$
where $k$ is the linear coefficient factor determined by the LED physical properties, and ${{I_i}(t)}$ is its driving current, which varies with time according to the modulated signal, as does ${P_T}_{_i}(t)$.

The vehicular OCC channel is not isotropic, and its gain is significantly affected by angular distortion. Here, we ignore the weather-induced path loss in the OCC channel and only consider its Line-Of-Sight (LOS) channel Direct Current (DC) gain. As the pixel is the basic unit of the camera receiver and its pixel value relates to the received optical power within its area, we take the optical channel from the $i$-th LED to the signal-striking area corresponding to the $({u_L},{v_L})$ pixel as the calculation example, whose channel DC gain ${{H_i}_{_{({u_L},{v_L})}}(0)}$ is given as [18]

$$H_{i_{(u_L,v_L)}}(0) = \frac{A_p R(\phi_i)}{D_{i_{(u_L,v_L)}}^2}\, c_{i_{(u_L,v_L)}},$$
where ${R(\phi _i)}$ denotes the Lambertian radiation pattern which is calculated by
$$R({\phi _i}) = \frac{{(m + 1)}}{{2\pi }}{\cos ^m}({\phi _i}).$$

In Eq. (8), ${\phi _i}$ is the irradiance angle (viewing angle) of the $i$-th LED and $m$ is its Lambertian emission order. ${A_p}$ in Eq. (7) is the signal collection area of the camera, which is calculated as [19]

$$A_p = \begin{cases} \left(\dfrac{f}{2F_{num}}\right)^{2}\pi\cos\psi_{i_{(u_L,v_L)}}, & 0 \le \psi_{i_{(u_L,v_L)}} \le \psi_l \\ 0, & \psi_{i_{(u_L,v_L)}} > \psi_l \end{cases}$$
where $f$ is the camera focal length, ${F_{num}}$ is the $F$-number of the camera aperture, ${\psi _{{{_i}_{_{({u_L},{v_L})}}}}}$ is the incidence angle of the signal from the $i$-th LED to the $({u_L},{v_L})$ pixel, and ${\psi _l}$ is the camera field of view (FoV). ${D_{{{_i}_{_{({u_L},{v_L})}}}}}$ in Eq. (7) is the distance from the $i$-th LED center to the $({u_L},{v_L})$ pixel, and ${c_{{{_i}_{_{({u_L},{v_L})}}}}}$ is the ratio between a single pixel area and the area covered by the $i$-th LED [20], given as
$${c_{{{_i}_{_{({u_L},{v_L})}}}}} = \frac{{{p_u}{p_v}}}{{{S_{i\_LED}}}},$$
where ${p_u}$ and ${p_v}$ are the physical lengths of a pixel along the $u$-axis and $v$-axis respectively, and ${S_{i\_LED}}$ is the imaging area of the $i$-th LED. Demonstrations of ${\phi _i}$ and ${\psi _{{{_i}_{_{({u_L},{v_L})}}}}}$ are shown in Fig. 2.

Taking Eq. (6)–Eq. (10) into account, the channel DC gain ${H_i}_{_{({u_L},{v_L})}}(0)$ can finally be written as

$$H_{i_{(u_L,v_L)}}(0) = \frac{(m+1)\cos^{m}(\phi_i)\, f^{2}\cos(\psi_{i_{(u_L,v_L)}})\, p_u p_v}{8 F_{num}^{2} D_{i_{(u_L,v_L)}}^{2} S_{i\_LED}}.$$

We pay particular attention to the calculations of ${\cos ({\phi _i})}$ and ${\cos ({\psi _{{{_i}_{_{({u_L},{v_L})}}}}})}$ in Eq. (11), as these two values capture the angular distortion-based channel gain variation. As shown in Fig. 2, $\cos ({\phi _i})$ can be calculated from the normal vector of the $i$-th LED and the vector $\overrightarrow {{d_{i,c}}}$ pointing from the center of the $i$-th LED to the center of the camera lens as

$$\cos ({\phi _i}) = \frac{{\overrightarrow {{n_{LED}}} \cdot \overrightarrow {{d_{i,c}}} }}{{\left\| {\overrightarrow {{n_{LED}}} } \right\| \times \left\| {\overrightarrow {{d_{i,c}}} } \right\|}},$$
where the initial value of $\overrightarrow {{n_{LED}}}$ is $\,{(0,0, - 1)^T}$ in our model. If the LED rotates $\alpha$, $\beta$, and $\gamma$ respectively, the vector of the LED after rotation $\overrightarrow {n{'_{LED}}}$ can be calculated as
$$\overrightarrow{n'_{LED}} = {\mathbf{R}}_\alpha {\mathbf{R}}_\beta {\mathbf{R}}_\gamma \begin{bmatrix} 0 \\ 0 \\ -1 \end{bmatrix} = \begin{bmatrix} -\cos\alpha\sin\gamma - \sin\alpha\sin\beta\cos\gamma \\ -\sin\alpha\sin\gamma + \cos\alpha\sin\beta\cos\gamma \\ -\cos\beta\cos\gamma \end{bmatrix}.$$

The new $\cos ({\phi _i}')$ can then be calculated from Eq. (12) and Eq. (13). The calculation of $\cos ({\psi _{{{_i}_{_{({u_L},{v_L})}}}}})$ is given as

$$\cos ({\psi _{{{_i}_{_{({u_L},{v_L})}}}}}) = \frac{f}{{\sqrt {{f^2} + {{\left\| {{r_c}_{_{({u_L},{v_L})}}} \right\|}^2}} }},$$
where ${r_c}_{_{({u_L},{v_L})}}$ is the distance from the center of the imaging plane to the $({u_L},{v_L})$ pixel. Note that $\cos ({\psi _{{{_i}_{_{({u_L},{v_L})}}}}})$ is determined by the pixel location rather than the camera posture, assuming the LED can be imaged completely even under angular distortion. It should also be noted that Eq. (11) gives the channel DC gain without considering the weather impact. To simulate realistic outdoor scenarios where the weather influence cannot be ignored, the expression of the channel DC gain needs to be modified; we take the characterization of fog as an example. The fog impact can be integrated into our proposed model by multiplying the channel DC gain (Eq. (11)) by the fog channel coefficient ${e^{ - {A_f}{D_{{i_{({u_L},{v_L})}}}}}}$, where ${A_f}$ is the fog attenuation, which depends on the visibility, transmission distance, light wavelength, and the size distribution of the scattering particles. More details, including an analytical expression, can be found in [10].
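The angular terms of Eq. (11)–Eq. (14) translate directly into code. The sketch below, under the same Lambertian assumptions, computes the channel DC gain for one LED-to-pixel link and the optional fog coefficient; all numeric parameters (LED orientation, pose, $m$, $S_{i\_LED}$, $A_f$) are hypothetical stand-ins, not calibrated values.

```python
import numpy as np

def cos_phi_rotated(alpha, beta, gamma, d_ic):
    """cos(phi) of Eq. (12) using the rotated LED normal of Eq. (13)."""
    n = np.array([-np.cos(alpha) * np.sin(gamma) - np.sin(alpha) * np.sin(beta) * np.cos(gamma),
                  -np.sin(alpha) * np.sin(gamma) + np.cos(alpha) * np.sin(beta) * np.cos(gamma),
                  -np.cos(beta) * np.cos(gamma)])
    return n @ d_ic / (np.linalg.norm(n) * np.linalg.norm(d_ic))

def cos_psi(f, r_c):
    """cos(psi) of Eq. (14): set by the pixel's offset r_c from the image centre."""
    return f / np.sqrt(f ** 2 + r_c ** 2)

def dc_gain(cphi, cpsi, m, f, F_num, p_u, p_v, D, S_led, A_f=0.0):
    """Channel DC gain of Eq. (11), optionally scaled by the fog coefficient."""
    H = (m + 1) * cphi ** m * f ** 2 * cpsi * p_u * p_v / (8 * F_num ** 2 * D ** 2 * S_led)
    return H * np.exp(-A_f * D)           # fog term exp(-A_f * D), Section 2.2

# LED pitched by beta = 30 deg, camera 20 m away along -Z (hypothetical pose).
cphi = cos_phi_rotated(0.0, np.deg2rad(30), 0.0, np.array([0.0, 0.0, -20.0]))
H = dc_gain(cphi, cos_psi(0.05, 1e-4), m=1, f=0.05, F_num=64,
            p_u=6.5e-6, p_v=6.5e-6, D=20.0, S_led=3e-9)
```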

Plugging Eq. (12)–Eq. (14) into Eq. (11), the channel DC gain can be derived by integrating with the OCC geometric characteristics. Following that, the received optical power ${P_{{R_{{{_i}_{_{({u_L},{v_L})}}}}}}}(t)$ by the $({u_L},{v_L})$ pixel is calculated as

$${P_{{R_{{{_i}_{_{({u_L},{v_L})}}}}}}}(t) = {P_T}_{_i}(t){H_i}_{_{({u_L},{v_L})}}(0).$$

The irradiance of the $({u_L},{v_L})$ pixel can then be calculated as

$${E_{{{_i}_{_{({u_L},{v_L})}}}}}(t) = \frac{{{P_{{R_{{{_i}_{_{({u_L},{v_L})}}}}}}}(t)}}{{{p_u}{p_v}}}.$$

Based on our previous investigation [10], the received irradiance at the camera receiver is transformed into photons, electrons, voltage, digital number, and pixel value sequentially. For an 8-bit Global Shutter (GS) camera, such a process can be modeled based on the pixel value of $P{V_i}_{_{({u_L},{v_L})}}$ as

$$PV_{i_{(u_L,v_L)}} = 255 \times \left\{ \frac{A_a A_c \left[ V_{a\_ref} - A_f \left( V_{ref} - Q_e A_n \frac{\int_{t_s}^{t_s + t_{\exp}} E_{i_{(u_L,v_L)}}(t)\,dt}{Q_p} \right) \right]}{raw_{\max}} \right\}^{1/g_{\rm a}},$$
where ${t_s}$ is the time point when the exposure of a frame starts, ${t_{\exp }}$ is the camera exposure time, ${Q_p}$ and ${Q_e}$ are the energy of a single photon and the quantum efficiency respectively. ${A_a}$, ${A_c}$, ${A_f}$, ${A_n}$ are the Analog-to-Digital Converter (ADC) gain, Correlated Double Sampling (CDS) gain, source follower gain, and sense node gain. ${V_a}_{\_ref}$ and ${V_{ref}}$ are the maximum voltage that can be quantified by ADC and the reference voltage. $ra{w_{\max }}$ is the maximum possible raw output value, and ${g_a}$ is the camera gamma value.

For an 8-bit Rolling Shutter (RS) camera, such a process can be modeled as

$$PV_{i_{(u_L,v_L)}} = 255 \times \left\{ \frac{A_a A_c \left[ V_{a\_ref} - A_f \left( V_{ref} - Q_e A_n \frac{\int_{t_s + (u_L - 1)t_H}^{t_s + (u_L - 1)t_H + t_{\exp}} E_{i_{(u_L,v_L)}}(t)\,dt}{Q_p} \right) \right]}{raw_{\max}} \right\}^{1/g_{\rm a}},$$
where ${t_H}$ is the exposure time interval between adjacent rows of the RS camera. The detailed calculation process can be found in our previous work [10]. For concise expression and to highlight our research focus, we do not include the camera's internal noise in our proposed model, as the pixel value increment brought by the internal noise is normally small [10]. However, a more precise analytical model with camera noise characterization can be obtained from our previous work [10] by referring to the modified imaging process description that accounts for the camera noise effects.
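As a sketch of Eq. (17), the GS pixel value can be computed as below for a constant irradiance over the exposure, so the integral reduces to $E \cdot t_{\exp}$; the RS case of Eq. (18) only shifts the integration window per row. All camera constants here are hypothetical placeholders, not the testbed's calibrated values.

```python
import numpy as np

def pixel_value_gs(E, t_exp, Q_p, Q_e, A_a, A_c, A_f, A_n,
                   V_a_ref, V_ref, raw_max, g_a):
    """8-bit GS pixel value of Eq. (17) for constant irradiance E."""
    photons = E * t_exp / Q_p                    # integral term of Eq. (17)
    V = V_ref - Q_e * A_n * photons              # sense node voltage
    raw = A_a * A_c * (V_a_ref - A_f * V)        # CDS / ADC chain
    raw = np.clip(raw, 0.0, raw_max)             # saturation
    return 255 * (raw / raw_max) ** (1 / g_a)    # gamma correction

# Illustrative constants only; Q_p corresponds to one 808 nm photon's energy.
pv = pixel_value_gs(E=5e-4, t_exp=1e-3, Q_p=2.46e-19, Q_e=0.6,
                    A_a=1.0, A_c=1.0, A_f=1.0, A_n=1e-12,
                    V_a_ref=3.3, V_ref=3.3, raw_max=3.3, g_a=2.2)
print(f"pixel value: {pv:.0f}")   # roughly 160 with these placeholder numbers
```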

Based on Eq. (17) and Eq. (18), the digital value of each pixel illuminated by the $i$-th LED can be calculated. Furthermore, the Point Spread Function (PSF) is applied to describe the spread due to the camera receiver's finite Numerical Aperture (NA) and the crosstalk that commonly exists between digital image sensor pixels [21,22]. A Gaussian filter realizes the PSF functionality in our model, the same as the one we use in [10]. Finally, either Eq. (5) with Eq. (17) or Eq. (5) with Eq. (18) can be used to model the OCC system in the presence of angular distortion. As we did not consider colour imaging in our research, the Bayer filter was not incorporated into our proposed model and simulations. However, the Bayer filter's impact can be incorporated by integrating its response into the channel DC gain in Eq. (11); some investigations have already studied this response in detail [23].

It should be mentioned that, although we refer to imaging geometry and some previously established conclusions to derive the final equations, our proposed model is unique in the following aspects. Firstly, different from works that either only consider individual LED-based OCC systems [10] or characterize OCC systems empirically without rigorous validation [13,20], our analytically derived MIMO LED-based OCC model can describe both the parallel and the angular distortion situations; its accuracy will be validated in Section 4. Secondly, by considering the pixel value-based camera imaging functionality in OCC systems, both the imaging location and intensity are described through pixel values in our model, which differentiates it from those using luminous intensity, optical power, or photocurrent as the criterion [5]. Finally, with vehicular communication considered, both the rotation and offset angles are systematically classified and integrated into our model under vehicle-to-vehicle communication scenarios.

3. Data acquisitions based on the OCC hardware platform and MATLAB-assisted simulations

Data acquisition, which includes both the captured images from the experimentally designed OCC hardware platform and the simulated ones from our model-based vehicular OCC system in MATLAB, is introduced in this section. These data are used for model accuracy validation and further vehicular OCC system performance analysis.

3.1 Data acquisitions based on the experimental testbed

Experimental testbed setup and the practical method we apply to realize LED detection and pixel value acquisition based on the captured images are presented.

Experimental testbed setup: We use an 8 $\times$ 8 808 nm LED array as the transmitter in our OCC system. The driving current of each LED is set to 65 mA, and its physical radius is 3 mm. The adjacent LED interval within the array is 3 cm. A Micro-Controller Unit (MCU) generates the transmitted data frame at the transmitter side. Besides, a triode-based circuit guarantees that the LEDs work in the linear region. At the receiver side, an sCMOS-based camera with an 808 nm optical filter and a variable focal length ranging from 11 mm to 197 mm is applied. Such an optical filter physically suppresses background interference, which lowers the requirements on the applied image processing algorithms. The resolution of the camera image sensor is 1024 $\times$ 1024 with a pixel size of 6.5 $\mu m$ $\times$ 6.5 $\mu m$. To avoid the blooming impact (overexposure-induced interference is outside our focus), the $F$-number of the camera is set to 64 in our experiments. The camera exposure time is $1.0 \times {10^{ - 3}}$ s. The applied camera is an 8-bit single-channel gray imaging camera, as color imaging is not considered in the proposed model, simulations, or field tests. Moreover, a computer with an image capture card supports higher image read rates and the Python-based processing algorithms. Key parameters of the hardware OCC platform are listed in Table 1. The experiments were conducted on the road beside the school building to emulate outdoor conditions, where a relatively long transmission distance can be achieved and weather impacts such as fog and turbulence can be reasonably ignored; Non-Line-Of-Sight (NLOS) components can likewise be neglected under the tested scenario. The overall hardware platform setup is shown in Fig. 3(a). Demonstrations of the LED rotation-based, camera rotation-based, and offset angle-based testing scenarios are presented in Fig. 3(b), Fig. 3(c), and Fig. 3(d).

Table 1. Typical parameters of the OCC experimental testbed

Fig. 3. Experimental testbed setup. (a) The overall testbed setup. (b) Demonstration of LED rotation-based testing scenario. (c) Demonstration of camera rotation-based testing scenario. (d) Demonstration of LED offset angle-based testing scenario.

Examples of the captured images based on the experimental testbed are presented in Fig. 4, where images taken under the transceiver parallel case, camera rotation case, and LED rotation case are shown in Fig. 4(a)-Fig. 4(c) respectively.

Fig. 4. Practically captured images (cropped) under 30 m. (a) Transceiver parallel. (b) Camera rotation $\alpha$ = $45^\circ$. (c) LED rotation $\beta$ = $45^\circ$.

Detection and pixel value acquisition of the captured LEDs: As we use a monochrome 808 nm LED array and an 808 nm optical filter-assisted camera, the complicated imaging background can be greatly suppressed by this simple but efficient physical filtering method. Python-based Hough circle detection is the LED detection method used on our OCC testbed, which is sufficient for processing our captured images with limited interference. The maximum pixel values within the detected LED imaging areas can then be acquired for further analysis.
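A minimal Python/OpenCV sketch of this detection-and-readout step is given below; the file name and the Hough parameters are hypothetical and would need tuning per setup.

```python
import cv2
import numpy as np

# Hypothetical grayscale frame on disk; parameters below are illustrative.
img = cv2.imread("captured_frame.png", cv2.IMREAD_GRAYSCALE)
blurred = cv2.medianBlur(img, 5)                   # suppress residual speckle
circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=10,
                           param1=100, param2=15, minRadius=2, maxRadius=15)
if circles is not None:
    for u, v, r in np.round(circles[0]).astype(int):
        # Maximum pixel value within the detected LED imaging area.
        mask = np.zeros_like(img)
        cv2.circle(mask, (u, v), r, 255, thickness=-1)
        print((u, v), img[mask == 255].max())
```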

3.2 Data acquisitions based on MATLAB simulations

Our MATLAB-based simulation process can be divided into the following steps which include simulation parameters selection, initial coordinate system and transceiver position determination, geometry-based imaging position calculation, radiometry-based imaging intensity calculation, simulated images formation, and acquisition of the simulated LED pixel values.

Parameters selection: To ensure a rational model accuracy validation and to facilitate the related comparisons, the transceiver parameters are selected to match those of the hardware platform.

Initial coordinate system and transceiver position determination: Before the calculation of the imaging LED geometry and radiometry features, the four initial coordinate systems which include the World Coordinate System $(X, Y, Z)$, Camera Coordinate System $({X_c}, {Y_c}, {Z_c})$, Image (Film) Coordinate System $(x, y)$, and Pixel Coordinate System $(u, v)$ need to be defined. The simulated transceiver position can then be determined to facilitate the following calculations.

The four initial coordinate systems are defined the same as those shown in Fig. 1(b) with the original point of the World Coordinate System being the center of the 8 $\times$ 8 808nm LED array panel. The translation matrix ${\mathbf {T}}$ and the rotation matrix ${\mathbf {R}}$ described in Eq. (1) are applied to indicate the transceiver position. In our simulation, the influences of both the LED and camera rotation are analyzed.

Geometry-based imaging position calculation: As we use 64 LEDs in an array as the OCC transmitter, the imaging locations of all these LEDs need to be determined. Here, we assume each LED in the array is circular, and we take the $i$-th LED as an example whose imaging position is determined by describing its contour. For accuracy, 3600 points at equal angular intervals are sampled on the physical LED contour under the World Coordinate System. Based on Eq. (5), their corresponding imaging locations under the Pixel Coordinate System and the resulting geometric features can be acquired, and the same process yields the imaging locations of all the LEDs within the array. Moreover, 64 image mask arrays of size $1024 \times 1024$ are used to record the imaging area of each LED in the array for the following analysis; the array elements located within each LED imaging area are set to 1.
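The contour sampling and mask recording can be sketched as follows for one LED, assuming a parallel transceiver pose and hypothetical optics; the projection follows Eq. (5) in the same way as the sketch in Section 2.

```python
import numpy as np
import cv2

f, p, c0 = 0.197, 6.5e-6, 512                     # hypothetical optics
R, T = np.eye(3), np.array([0.0, 0.0, 20.0])      # parallel pose, 20 m away

def project(P):
    """Eq. (5) for the parallel case: world point -> (u, v)."""
    Xc, Yc, Zc = R @ P + T
    return f * Xc / Zc / p + c0, f * Yc / Zc / p + c0

# Sample the physical contour of one 3 mm-radius LED centred at the origin.
ang = np.deg2rad(np.arange(3600) * 0.1)           # 3600 points on the circle
contour = np.stack([3e-3 * np.cos(ang), 3e-3 * np.sin(ang),
                    np.zeros_like(ang)], axis=1)
contour_px = np.array([project(P) for P in contour])

# Record the LED imaging area in a 1024 x 1024 binary mask.
mask = np.zeros((1024, 1024), dtype=np.uint8)
cv2.fillPoly(mask, [np.round(contour_px).astype(np.int32)], 1)
```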

Radiometry-based imaging intensity calculation: The pixel value variations induced by each LED in the array are first calculated individually. Depending on the applied camera type, Eq. (17) (GS cameras) or Eq. (18) (RS cameras) is used, together with the $i$-th image mask from the previous step, to obtain the pixel values caused by the $i$-th LED.

Simulated image formation: As the LED is an incoherent light source, the final pixel value on the simulated image is the sum of the pixel values contributed by each LED within the camera FoV, following the linear addition property of pixel values. By summing the per-LED pixel values correspondingly, one final simulated image is obtained. Based on the above procedures, multiple simulated images can be acquired by varying the transceiver positions and rotation angles. Here, for simplicity and generality, all 64 LEDs in the array are assumed to be in the ON state in our simulations. Four simulated images are provided as examples in Fig. 5(a)–Fig. 5(d), showing the transceiver offset and the $\alpha$-, $\beta$-, and $\gamma$-induced LED angular distortion cases respectively at a 20 m transmission distance. Note that the actual imaging area of each LED is larger than the geometrically calculated one due to the applied PSF [21]; the variance of the applied PSF is 10, the same value as used in [22].
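The image formation step then reduces to a sum over the per-LED layers followed by a Gaussian PSF, as sketched below; `led_layers` is a hypothetical stand-in for the 64 per-LED pixel-value arrays produced by the previous steps.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Placeholder for the 64 per-LED pixel-value layers (Eq. (17)/(18) x masks).
led_layers = np.zeros((64, 1024, 1024))

image = led_layers.sum(axis=0)                     # incoherent linear addition
image = gaussian_filter(image, sigma=np.sqrt(10))  # Gaussian PSF, variance 10
image = np.clip(image, 0, 255).astype(np.uint8)    # 8-bit saturation
```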

Fig. 5. MATLAB-assisted simulated images. (a) Transceiver offset. (b) LED rotation $\alpha$ = $45^\circ$. (c) LED rotation $\beta$ = $45^\circ$. (d) LED rotation $\gamma$ = $45^\circ$.

Acquisition of the simulated LED pixel values: Maximum pixel extraction [24] is the method we apply to measure the targeted LED imaging intensity. As we use a unique image mask array to record the imaging location of each LED, the maximum pixel value within the corresponding imaging area is easily determined for the simulated images.

4. Performance analysis considering angular distortion in vehicular OCC systems

Model accuracy validation and performance analysis are realized based on the maximum pixel values and imaging locations acquired from either the simulated images or practically captured ones.

4.1 Angular distortion impact for a single-time snapshot

Given the values of the rotation and offset angles for various single-time snapshots, their impacts on the imaging intensity, imaging location, and communication performance are investigated.

4.1.1 Angular distortion impact on imaging intensity and location

The impact of rotation angles is analyzed first, where the influences of $\beta$ from the transmitter LED on the front vehicle are presented in Fig. 6. Lane changing and vehicle overtaking can cause $\beta$ variations in practical driving scenarios. Each curve is obtained by averaging the 50 maximum pixel values of the first LED (the upper left one), acquired from either the 50 captured frames or the 50 simulated ones for each LED rotation angle, at distances of 10 m, 20 m, and 30 m respectively. Any LED can be selected as the reference object as long as the same one is used in all experiments.

Fig. 6. The impact of $\beta$ rotation from the front vehicle transmitter. (a) Maximum pixel value variations of both the MATLAB-based simulated LEDs and testbed-based captured ones with $\beta$ varying from $- 60^\circ$ to $60^\circ$ under distances of 10 m, 20 m, and 30 m. (b) Demonstrations of maximum pixel value fluctuations with variations of distance and LED rotation angle $\beta$.

Figure 6(a) presents the maximum pixel value variations of both the MATLAB-based simulated LEDs and the testbed-based captured ones with the LED rotation angle $\beta$ on the front vehicle at different distances. The data show good agreement between our proposed model and the experiment. Besides, several interesting results can be noticed. Firstly, the maximum pixel value distributions are symmetric with respect to the LED rotation angle $\beta$. This results from the Lambertian radiation pattern-based transceiver model, which has the same angle symmetry property; note that with more practical illumination patterns of vehicle lighting systems, the related pixel value distributions may differ. Secondly, maximum pixel values decline as the angular absolute value increases at distances of 20 m and 30 m, and the decline becomes sharper as the angular distortion grows. However, the 10 m case behaves differently: the maximum pixel value remains 255 until the absolute value of $\beta$ increases beyond $30^\circ$. This is caused by the pixel value saturation phenomenon due to the relatively short distance and large optical power; it also indicates that moderate pixel value saturation can compensate for the optical power loss caused by angular distortion. Thirdly, Fig. 6(a) shows that longer distances are less sensitive to angular distortion, as the pixel value decline under the same rotation angle becomes less obvious with increasing transmission distance. Three-dimensional demonstrations of the distance-pixel value and $\beta$-pixel value relationships are shown in Fig. 6(b).

Apart from $\beta$, $\alpha$ and $\gamma$ are two other rotation angles. $\alpha$ is normally caused by road irregularities, and $\gamma$ variations are brought by vehicle uphill and downhill processes. Comparisons of the impact on maximum pixel values among $\beta$, $\alpha$ and $\gamma$ of the transmitter on the front vehicle under 20m transceiver distance are presented in Fig. 7. It can be noticed that changes of both $\beta$ and $\gamma$ lead to significant but similar pixel value variations, whereas the $\alpha$-induced pixel value differences are limited. This is because variations of $\alpha$ don’t lead to fluctuations in the LED irradiance angle which then limits the differences in pixel values. Moreover, $\alpha$ only causes deviations in the LED imaging location without deformation. Comparatively, both $\beta$ and $\gamma$ influence the LED irradiance angle which then leads to obvious pixel value variations and possible LED imaging deformation.

Fig. 7. Impact of the front vehicle LED rotation angle $\alpha$, $\beta$, and $\gamma$ under 20 m distance.

Besides transmitter rotations on the front vehicle, angular distortions caused by the camera receiver on the rear vehicle in incoming links also exist in practical vehicular OCC systems. Figure 8 presents their impacts in detail.

Fig. 8. Maximum pixel value variations with changes of front vehicle LED rotation angle $\beta$ and rear vehicle camera rotation angle $\beta$ under 20 m distance.

Results show that camera rotation-induced pixel value variations can be reasonably neglected compared with those induced by LED rotations, especially when $\beta$ ranges from $- 45^\circ$ to $45^\circ$. This can be explained by Eq. (14): the camera rotation angle has a limited impact on the incidence cosine value, which leads to insignificant variations in the imaging intensity. In contrast, the imaging location is greatly influenced by camera-side rotations. When the absolute value of the camera rotation angle exceeds $60^\circ$, angular distortion causes an overlapping phenomenon: the original single-LED imaging area is illuminated by multiple LEDs, and the linear superposition of light intensity increases the imaging gray value correspondingly. This explains why the pixel value at $60^\circ$ is larger than that at $45^\circ$ in Fig. 8. These conclusions can be applied to the OCC link performance in Fig. 1(a): OCC communications between B and A over link b are less influenced by B-induced angular distortions than data transmissions over link a.

Impacts of the offset angles are demonstrated for 30 m, 40 m, and 50 m inter-vehicle longitudinal distances in Fig. 9, where the inter-vehicle lateral distance represents the offset angle. Offset angles lead to limited pixel value variations compared with those induced by rotation angles, and, as with rotation angles, longer distances are less sensitive to offset angular distortions. However, imaging locations are sensitive to offset angles, although such angular distortions do not deform the image.

Fig. 9. Impact of the offset angles under 30 m, 40 m, and 50 m inter-vehicle longitudinal distances.

4.1.2 Angular distortion impact on communication performance

As offset angles lead to limited pixel value variations as described above, only rotation angles’ impacts on communication aspects are presented in this part. Peak Signal-to-Noise Ratio (PSNR), Bit Error Rate (BER), Signal-to-Noise Ratio (SNR), and channel capacity ${C_{OCC}}$ are the four performance criteria commonly applied for vehicular OCC systems. PSNR is applied to evaluate captured images’ quality based on pixel values given as

$${\rm{PSNR}} = 10\log_{10} \frac{PV_{\max}^{2}}{\rm{MSE}},$$
where $PV_{\max}$ for an 8-bit camera is 255. ${\rm {MSE}}$ is the pixel luminance mean squared error, calculated as
$${\rm{MSE}} = \frac{1}{z}\sum_{w = 1}^z {{{(P{V_{Tx}}(w) - P{V_{Rx}}(w))}^2}},$$
where $P{V_{Tx}}$ are the average pixel values of the imaging area on the transmitter LED side. They can be obtained from both the simulated images and the practically captured ones at approximately zero distance similar to the method mentioned in [6]. $P{V_{Rx}}$ are the average pixel values of the imaging area on the camera receiver side. $w$ is the image frame index number, and $z$ is the total captured frame number. BER reflects the ratio of wrongly retrieved data bits at the receiver and the total transmitted data bits, which can be theoretically calculated as
$${\rm{BER}} = \frac{1}{2}erfc(\sqrt {\frac{{{E_b}}}{{2\sigma _n^2}}}),$$
where $erfc( \bullet )$ is the complementary error function, and $\frac {{{E_b}}}{{\sigma _n^2}}$ is the pixel SNR which can be calculated as
$$\frac{{{E_b}}}{{\sigma _n^2}} = \frac{{E[P{V^2}]}}{{E[{n^2}]}},$$
where $PV$ is the pixel value. $n$ is the camera internal noise whose variance can be acquired based on the received pixel value variance as described in [10].
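Eq. (19)–Eq. (22) can be evaluated directly from pixel values, as in the sketch below; the sample pixel values and noise variance are purely illustrative.

```python
import numpy as np
from scipy.special import erfc

def psnr(pv_tx, pv_rx, pv_max=255):
    """PSNR of Eq. (19) with the MSE of Eq. (20)."""
    mse = np.mean((np.asarray(pv_tx, float) - np.asarray(pv_rx, float)) ** 2)
    return 10 * np.log10(pv_max ** 2 / mse)

def ber(pv, noise_var):
    """Theoretical BER of Eq. (21) from the pixel SNR of Eq. (22)."""
    snr = np.mean(np.asarray(pv, float) ** 2) / noise_var
    return 0.5 * erfc(np.sqrt(snr / 2))

print(psnr([200, 205, 198], [180, 190, 185]),   # illustrative pixel values
      ber([180, 190, 185], noise_var=25.0))
```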

Figure 10(a) presents the PSNR and BER variations for different LED rotation angles $\beta$ based on our proposed system model at 20 m distance. The fog impact is included to simulate more realistic outdoor scenarios, with 20 km visibility considered. Transmitter angular distortions are observed to deteriorate the PSNR and BER-based OCC performance significantly.

Fig. 10. Communication performance impacts of $\beta$ rotation from the front vehicle transmitter. (a) PSNR and BER fluctuations with variations of LED rotation angle ${\beta }$ under 20 m distance. (b) Demonstrations of OCC channel capacity fluctuations with changes of distance and LED rotation angle $\beta$.

${C_{OCC}}$ reflects the OCC channel's potential to support reliable transmission at the maximum number of data bits per second. Similar to the method in [25], Eq. (23) is used to approximate the capacity of nonnegative channels for OCC systems as

$${C_{OCC}} \approx {W_{fps}}({W_s}{\log _2}(1 + {\rm{SINR}})),$$
where ${W_{fps}}$ is the camera frame rate, for which we use 500 fps for practical considerations [26]. ${W_s}$ is the spatial bandwidth, equal to the number of orthogonal or parallel channels in a MIMO system; as there are 64 LEDs in the transmitter LED array, its value is 64 in our study. ${\rm {SINR}}$ is the OCC signal-to-interference-plus-noise ratio, which can be calculated as
$${\rm{SINR}} = \frac{{PV_{_{avg}}^2}}{{\sigma _n^2}},$$
where $PV_{_{avg}}$ is obtained by averaging the pixel values within the LED imaging area, and $\sigma _n^2$ is calculated in the same way as for the ${\rm {BER}}$ analysis. Figure 10(b) presents the OCC channel capacity with respect to both the transmission distance and the LED rotation angle $\beta$, where increases in either lower the capacity.
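A corresponding sketch of Eq. (23)–Eq. (24), using the 500 fps and 64-LED values from the text and an illustrative pixel SNR:

```python
import numpy as np

def occ_capacity(pv_avg, noise_var, w_fps=500, w_s=64):
    """Approximate OCC capacity of Eq. (23) with the SINR of Eq. (24)."""
    sinr = pv_avg ** 2 / noise_var
    return w_fps * w_s * np.log2(1 + sinr)

print(occ_capacity(pv_avg=180.0, noise_var=25.0) / 1e6, "Mbit/s")
```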

4.2 Angular distortion impact from a statistical perspective

Based on the above investigations of individual angle impacts under various single-time snapshots, the influences of road irregularity and vehicle vibration-induced $\alpha$, $\beta$, and $\gamma$ are jointly studied from a statistical perspective.

According to [11], vehicle vibrations are mainly distributed at frequencies below 20 Hz, so the sum of 3 Hz and 10 Hz sinusoidal waveforms is applied to represent vehicle vibration. The road surface irregularity is simulated by Gaussian random variables. Two driving scenarios are analyzed in this part.
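Following this description, the 500 per-frame angle samples can be generated as below; the sinusoid amplitudes and road-noise standard deviation are hypothetical choices, as [11] is the source only for the frequency content.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(500) / 500.0                     # 500 frames within 1 s at 500 fps

def angle_samples(a3, a10, road_std):
    """Vehicle vibration as 3 Hz + 10 Hz sinusoids plus Gaussian road noise."""
    return (a3 * np.sin(2 * np.pi * 3 * t)
            + a10 * np.sin(2 * np.pi * 10 * t)
            + rng.normal(0.0, road_std, t.size))

alpha = angle_samples(0.5, 0.2, 0.1)           # degrees, illustrative amplitudes
gamma = angle_samples(0.3, 0.1, 0.1)
beta = np.zeros_like(t)                        # motion along Z only (Scenario 1)
```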

Scenario 1: Two moving vehicles with a constant spacing.

In this scenario, two vehicles are assumed to drive at 60 km/h while keeping a constant 30 m spacing. To investigate the statistical impact of angular distortions within 1 s using a 500 frame-per-second (fps) camera, 500 groups of $\alpha$, $\beta$, and $\gamma$ are generated according to the angle distributions in vehicular systems by summing sinusoidal waveforms of 3 Hz and 10 Hz as introduced before. The generated distributions of $\alpha$ and $\gamma$ are shown in Fig. 11. As the two vehicles are assumed to move only in the Z-direction of the World Coordinate System, $\beta$ is assumed to be zero. Based on the generated angles, the statistical distributions of the maximum pixel value and the LED imaging center u and v within 1 s are demonstrated in Fig. 12. When the two moving vehicles stay relatively still, vehicle vibrations and the uneven road have little impact on the pixel values (maximum difference of 5) and on the movement in the u and v directions (maximum distances of 3 and 1 pixels respectively).

Fig. 11. Angular distributions of $\alpha$ and $\gamma$ for Scenario 1, where $\beta$ is assumed to be zero. (a) $\alpha$ distributions. (b) $\gamma$ distributions.

Fig. 12. Statistical impacts of angular distortions for Scenario 1. (a) Maximum pixel value distributions. (b) LED imaging center u distributions. (c) LED imaging center v distributions.

Scenario 2: Two vehicles with a decreased spacing.

In this scenario, the rear vehicle approaches the front one at a relative speed of 60 km/h from an initial 30 m distance. A 500 fps camera is again used as the receiver, and the statistical impact of angular distortions within 1 s is studied. With the same generation method as in Scenario 1, 500 groups of $\alpha$, $\beta$, and $\gamma$ are generated for this scenario as shown in Fig. 13, and the corresponding statistical impacts are presented in Fig. 14. Compared with the relatively static condition, the distance variations together with the vibrations and road irregularities lead to obvious differences in the maximum pixel value and changes in the LED imaging location, which indicates that a tracking system is necessary for a moving vehicular OCC system.

Fig. 13. Angular distributions of $\alpha$ and $\gamma$ for Scenario 2, where $\beta$ is assumed to be zero. (a) $\alpha$ distributions. (b) $\gamma$ distributions.

Fig. 14. Statistical impacts of angular distortions for Scenario 2. (a) Maximum pixel value distributions. (b) LED imaging center u distributions. (c) LED imaging center v distributions.

5. Conclusion

This paper proposes a validated pixel value-described vehicular OCC system model considering MIMO-based transceiver angular distortion. Driving behavior and road irregularity-induced angular distortions, which include the offset angle and three categories of rotation angle (yaw, pitch, and roll), are described systematically. Model validation and the impacts of angular distortions on maximum pixel value, imaging location, and communication-related performance (PSNR, BER, and channel capacity) are realized based on the MATLAB-assisted vehicular OCC system and our designed experimental testbed. Apart from the performance analysis for various single-time snapshots, statistical impacts of angular distortions under two driving scenarios for vehicular OCC systems are also discussed.

Funding

Jilin Provincial Scientific and Technological Development Program (20200401122GX); National Natural Science Foundation of China (61601195); China Scholarship Council (202206170080).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. A. Memedi and F. Dressler, “Vehicular visible light communications: A survey,” IEEE Commun. Surv. Tutorials 23(1), 161–181 (2021). [CrossRef]  

2. K. Eöllős-Jarošíková, V. Neuman, C. M. Jurado-Verdú, et al., “Long-distance indoor optical camera communication using side-emitting fibers as distributed transmitters,” IEEE Commun. Surv. Tutorials 23(1), 161–181 (2021). [CrossRef]  

3. N.-N. Dao, T. H. Do, S. Cho, et al., “Information revealed by vision: A review on the next-generation OCC standard for AIoV,” IT Prof. 24(4), 58–65 (2022). [CrossRef]  

4. H. B. Eldeeb, E. Eso, M. Uysal, et al., “Vehicular visible light communications: The impact of taillight radiation pattern,” in Photonics Conference (IEEE, 2020), pp. 1–2.

5. P. Luo, Z. Ghassemlooy, H. Le Minh, et al., “Performance analysis of a car-to-car visible light communication system,” Appl. Opt. 54(7), 1696–1706 (2015). [CrossRef]  

6. E. Eso, Z. Ghassemlooy, S. Zvanovec, et al., “Performance of vehicular visible light communications under the effects of atmospheric turbulence with aperture averaging,” Sensors 21(8), 2751 (2021). [CrossRef]  

7. A. L. Chen, H. P. Wu, Y. L. Wei, et al., “Time variation in vehicle-to-vehicle visible light communication channels,” in Vehicular Networking Conference (IEEE, 2016), pp. 1–8.

8. H. Y. Tseng, Y. L. Wei, A. L. Chen, et al., “Characterizing link asymmetry in vehicle-to-vehicle visible light communications,” in Vehicular Networking Conference (IEEE, 2015), pp. 88–95.

9. M. Seminara, T. Nawaz, S. Caputo, et al., “Characterization of field of view in visible light communication systems for intelligent transportation systems,” IEEE Photonics J. 12(4), 1–16 (2020). [CrossRef]  

10. A. Liu, W. Shi, M. Ouyang, et al., “Characterization of optical camera communication based on a comprehensive system model,” J. Lightwave Technol. 40(18), 6087–6100 (2022). [CrossRef]  

11. T. Yamazato, M. Kinoshita, S. Arai, et al., “Vehicle motion and pixel illumination modeling for image sensor based visible light communication,” IEEE J. Select. Areas Commun. 33(9), 1793–1805 (2015). [CrossRef]  

12. P. Sharda, G. S. Reddy, M. R. Bhatnagar, et al., “A comprehensive modeling of vehicle-to-vehicle based VLC system under practical considerations, an investigation of performance, and diversity property,” IEEE Trans. Commun. 70(5), 3320–3332 (2022). [CrossRef]  

13. A. Islam, L. Musavian, and N. Thomos, “Performance analysis of vehicular optical camera communications: Roadmap to uRLLC,” in Global Communications Conference (IEEE, 2019), pp. 1–6.

14. R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision (Cambridge University Press, 2003).

15. S. R. Teli, S. Zvanovec, R. Perez-Jimenez, et al., “Spatial frequency-based angular behavior of a short-range flicker-free MIMO–OCC link,” Appl. Opt. 59(33), 10357–10368 (2020). [CrossRef]  

16. S. Chen and X. Chi, “Analysis of optical camera communications for terminal jiggle with quality of service guarantees,” J. Lightwave Technol. 41(19), 6280–6287 (2023). [CrossRef]  

17. K. Cui, G. Chen, Z. Xu, et al., “Traffic light to vehicle visible light communication channel characterization,” Appl. Opt. 51(27), 6594–6605 (2012). [CrossRef]  

18. J. M. Kahn and J. R. Barry, “Wireless infrared communications,” Proc. IEEE 85(2), 265–298 (1997). [CrossRef]  

19. K. Yokoo, Y. Kozawa, and T. Sawa, “Theoretical analysis of received optical intensity of underwater image sensor based visible light communications using RGB-LEDs,” in 24th International Symposium on Wireless Personal Multimedia Communications (IEEE, 2021), pp. 1–6.

20. M. S. Ifthekhar, M. A. Hossain, C. H. Hong, et al., “Radiometric and geometric camera model for optical camera communications,” in Seventh International Conference on Ubiquitous and Future Networks (IEEE, 2015), pp. 53–57.

21. M. Eghbal, F. S. Tabataba, and J. Abouei, “Investigating the effect of turbulence on IPI in a vehicular OCC system using PSF analysis,” Opt. Continuum 1(9), 2011–2029 (2022). [CrossRef]  

22. T. Yamamoto, T. Yamazato, H. Okada, et al., “Comparison of distance performances of modulation schemes in intelligent transport system image sensor communication,” IEICE Communications Express 10(8), 498–504 (2021). [CrossRef]  

23. D. Moreno, V. Guerra, J. Rufo, et al., “Multispectral optical camera communication links based on spectral signature multiplexing,” IET Optoelectron. 17(4), 91–100 (2023). [CrossRef]  

24. S. Kamegawa, M. Kinoshita, and T. Yamazato, “Performance evaluation of precoded pulse width modulation for image sensor communication,” in Globecom Workshops (IEEE, 2018), pp. 1–7.

25. M. K. Hasan, M. Z. Chowdhury, M. Shahjalal, et al., “Performance analysis and improvement of optical camera communication,” Appl. Sci. 8(12), 2527 (2018). [CrossRef]  

26. T.-H. Do and M. Yoo, “Multiple exposure coding for short and long dual transmission in vehicle optical camera communication,” IEEE Access 7, 35148–35161 (2019). [CrossRef]  
