Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences. 2020 Mar 2;378(2169):20190191. doi: 10.1098/rsta.2019.0191

Some practical constraints and solutions for optical camera communication

Weijie Liu 1, Zhengyuan Xu 1
PMCID: PMC7061997  PMID: 32114916

Abstract

Mobile wireless communication relies heavily on the radio frequency spectrum to convey messages and data. However, its limited spectrum can hardly meet the demands of future high data rate applications. Optical wireless communication, in particular visible light communication, opens up the vast optical spectrum for communication and can meanwhile retrofit the light sources in existing working or living environments as communication transmitters. In conjunction with the ubiquitous cameras in hand-held consumer electronics such as smartphones and tablets, optical camera communication (OCC) further takes advantage of image sensors as communication receivers and realizes low-cost communication systems. This article first provides an overview of OCC systems. It then addresses some practical constraints, ranging from low sensor frame rate and frame rate instability, through rolling shutter readout, to the visual quality of displayed images and videos, and link blockage between the transmitter and receiver. Accordingly, it introduces existing and new solutions to deal with those constraints through data modulation, newly developed camera structures, post-processing of sensed signals and non-line-of-sight OCC as a new form. In particular, indirect paths formed by either indoor surface reflection or outdoor atmospheric scattering are explored for link connectivity under blockage. Finally, some future research directions are suggested.

This article is part of the theme issue ‘Optical wireless communication’.

Keywords: optical camera communication, covert communication, rolling shutter, region of interest, non-line of sight, optical wireless communication

1. Introduction

Traditional radio frequency (RF) communication techniques have developed tremendously over recent decades. Although it is not yet entirely clear what the next-generation technologies will provide, they aim at higher transmission speeds, lower latency, improved spectral efficiency, higher network capacity, and lower power consumption and cost [1]. Undoubtedly, the demands placed on the RF spectrum will continue to grow. To alleviate the RF spectrum shortage, higher frequency bands including the millimetre wave [2], terahertz [3] and petahertz bands are all under consideration. Optical wireless communication (OWC) explores the infrared (IR) [4], ultraviolet (UV) [5] and visible light (VL) [6] frequencies for communication. Communication in both the visible and invisible light spectra can also be termed petahertz communication for convenience.

In the VL sub-band, visible light communication (VLC) technologies were developed many years ago [7] and continue to be improved [8–10]. By using off-the-shelf light-emitting diodes (LEDs), they can simultaneously serve lighting and communication applications, including indoor high-speed internet access, information downloading, positioning, outdoor vehicle-to-infrastructure networking and underwater OWC for autonomous underwater vehicles (AUVs). Typically, the light beam diverges in space, so a concentrator is usually employed in front of a photo-detector to collect more optical power and support transmission rates up to several Gbps [11]. However, the path loss increases dramatically with distance, restricting long-range communication and outdoor vehicular communication. Meanwhile, beam directionality creates difficulties for communication on the move, and dense light sources have to be distinguishable by a mobile terminal. Different from VLC, light fidelity (LiFi) can adopt IR, VL or UV in the backward path [12,13] while using VL in the forward path, as VLC does. Besides, LiFi can provide seamless terminal mobility and point-to-multipoint bidirectional communication services, and can be deployed easily within existing wireless networks. However, LiFi is vulnerable in outdoor scenarios and unable to support long-range communication. Free-space optical (FSO) communication therefore emerges as a good candidate for achieving long transmission distances, ultra-high data rates and flexible coverage through beamforming techniques [14,15]. Leveraging the lower attenuation of optical signals in the IR spectrum when penetrating the atmosphere, FSO communication typically adopts an IR laser diode (LD) or another type of laser as the transmitter, rather than the LEDs used in VLC and LiFi, and can reach several thousand kilometres over a point-to-point link [16]. However, it is sensitive to link conditions such as weather, atmospheric turbulence and physical obstructions.

The ubiquity of smart devices equipped with cameras in our daily life motivates the study of optical camera communication (OCC) [17–21]. OCC employs the image sensors assembled in pervasive consumer electronics, such as smartphones and tablets, as an alternative to photodiode (PD) or avalanche photodiode (APD)-based receivers. As a pragmatic version of VLC, it allows various services to be delivered more easily by smart devices. Compared with other OWC technologies, OCC has unique features such as a larger receiver field-of-view (FOV), spatial and wavelength separation ability, and low cost. In some applications, such as indoor positioning and navigation, vehicle steering and remote control, cost and accuracy may be the main factors to consider rather than data rate, so OCC built upon existing cameras is an appealing solution. Besides, the IEEE standards working group 802.15.7m is dedicated to revising the formerly established IEEE VLC standard to incorporate new physical layers that support OCC functionalities and medium access control (MAC) modifications [22]. Comparisons of different OWC technologies are summarized in table 1, showing their key characteristics.

Table 1. Comparisons of different OWC technologies. OOK, on-off keying; IM, intensity modulation; PM, pulse modulation; CDMA, code-division multiple-access; OFDM, orthogonal frequency division multiplexing; CSK, colour shift keying; OAM, orbital angular momentum.

technologies | VLC | LiFi | FSO | OCC
standard | IEEE 802.15.7 | IEEE 802.11 LC SG | well developed | IEEE 802.15.7m
transmitter | LED/LD | LED/LD | laser | LED/screen
receiver | PD/APD/PIN/camera | PD | PD/APD | camera
spectrum | VL | IR/VL/UV | IR/VL/UV | IR/VL
modulation | OOK, IM, PM, OFDM, CSK, etc. | OOK, IM, OFDM, CSK, etc. | OOK, IM, PM, OFDM, OAM, etc. | OOK, IM, PM, OFDM, CSK, etc.
data rate | Mbps ∼ Gbps | kbps ∼ Gbps | ∼Gbps | bps ∼ Mbps
distance | <100 m | <10 m | >1 km | <100 m
implementation complexity | moderate | moderate | high | low
computation complexity | low | low | moderate | high
path loss | moderate | moderate | high | low
robustness to interference | moderate | moderate | high |
additional function | illumination, localization | illumination | | imaging, localization
limitation | illumination constraints, limited range, vulnerable to mobility | limited outdoor use, limited range | sensitive to weather and turbulence | low data rate

2. Overview of the optical camera communication technologies

Figure 1 shows the basic principle of an OCC system. The bit stream can be modulated in intensity, colour or spatial position according to the requirements of an application scenario. The data streams transmitted from various light sources (e.g. a display screen or a digital traffic light) can be captured and distinguished simultaneously by an image sensor. An image sensor typically comprises a two-dimensional array of PD pixels that convert the captured incident light into an array of electrical signals. Typically, the pixel outputs due to sunlight or ambient light sources contain only low-frequency, nearly constant components compared with those generated by the signal light sources. Thus, background interference (e.g. solar radiation or a billboard) can easily be filtered out by a frame differential technique [23], in conjunction with narrowband optical filters, so that the desired signal can be recovered.

Figure 1. Operating principle of an OCC system. (Online version in colour.)

To perform colour imaging, a Bayer pattern or Foveon X3 pattern-based colour filter array is generally placed above an image sensor array [10]. Figure 2a shows the Bayer pattern-based colour filter, containing 25%, 50% and 25% of red, green and blue filters, respectively. Figure 2b shows the Foveon X3 pattern-based colour filter, where the red, green and blue components are measured directly after the light penetrates the corresponding colour filters; such an arrangement avoids the loss of colour information carried by the light. Moreover, since an image sensor can distinguish multiple light sources simultaneously owing to its spatial resolution, as illustrated in figure 1, it is a natural multi-element optical receiver with multi-colour channels for optical multiple-input multiple-output (MIMO) communication settings.

Figure 2. Colour filters of an image sensor. (a) Bayer arrangement. (b) Foveon X3 arrangement. (Online version in colour.)

The attractive features of image sensor receivers have prompted tremendous research and application activities to realize new forms of sensing, communication and mobile computing. Figure 3 shows a few application scenarios of the OCC technologies. A commercial camera-based visible light positioning system can provide high positioning accuracy and is free of electromagnetic interference compared with a conventional RF-based positioning system. Vehicle-to-vehicle, vehicle-to-infrastructure and vehicle-to-pedestrian communications can be realized by OCC in an intelligent transportation system (ITS). Secure data sharing and transfer can be achieved by OCC as well. Health data collected by wearable sensors can easily be analysed by a robot equipped with cameras, and viewers can acquire more information by scanning digital signage with their smartphones. Besides, postures and gestures can be detected and recognized through OCC.

Figure 3. Some OCC application scenarios. (Online version in colour.)

So far, there have been a few surveys on OCC in the literature, covering channel modelling, modulation and coding, and prospective business trends. In [17], the development, modulation schemes, strengths and shortcomings of OCC systems were briefly reviewed, and a new architecture was proposed to handle limitations including the camera sampling rate, frame rate variation and camera vibration. In [18], the authors provided a comprehensive overview of OCC modulation schemes and proposed an undersampled modulation scheme to achieve flicker-free transmission for human eyes. In [19], the key technologies in IEEE 802.15.7r1, the current research status and appealing application scenarios of OCC were discussed in detail. Issues in designing and implementing a practical OCC system were addressed in [20]. Recently, Saeed et al. [21] provided a comprehensive survey of the major OWC techniques and their applications in navigation, positioning and motion capture.

Although OCC shows great potential in a range of applications, critical challenges still limit its practical implementation. For example, most commercial cameras have frame rates ranging from 30 fps to several hundred fps, which, according to the Nyquist sampling theorem, results in a data rate of no more than several bps per pixel. Besides, driving an LED source at a low frequency to match the low frame rate of the sensor is unacceptable for illumination, because the resulting flicker can be detected by human eyes. Moreover, blockage of the optical link by buildings, and signal attenuation by rain and fog, will seriously degrade the communication quality.

In this work, we first describe the aforementioned challenges and review the corresponding solutions as comprehensively as possible. New application scenarios for both reflective and scattering non-line-of-sight (NLOS) OCC are then introduced. After that, we suggest some future research directions.

3. Frame rate constraints and solutions

As mentioned before, a commercial camera generally has a low frame rate of about 30 fps, which limits the maximum symbol rate to below 15 symbols per second. Note that the critical flicker frequency (CFF) can be treated as the frequency at which a flickering light becomes indistinguishable from a constant light. Transmitting slowly enough to match such a low frame rate also causes a serious flicker effect, since the switching frequency falls well below the CFF of human eyes, which is typically 100 Hz [24]. Additionally, a commercial camera undergoes rate fluctuation due to sampling rate instability and asynchrony between the transmitter and the camera sampling clock. Therefore, in this section, we summarize some key technologies to overcome the constraints of low frame rate and frame rate instability.

(a). Data rate increase by modulations

The IEEE 802.15.7-2018 standard adopted several modulation schemes for low-frame-rate OCC in PHY layers IV to VI. Among the various modulation schemes, Roberts from Intel first proposed undersampled frequency shift OOK (UFSOOK) [25–27], which uses square wave patterns at two different frequencies fs1 and fs0, both much higher than the CFF, to represent symbol '1' and symbol '0', respectively. Similarly, another undersampled scheme known as undersampled phase shift OOK (UPSOOK) was introduced by Luo et al. [28–30], in which the two bits are differentiated by the phase of the square wave. An example of UFSOOK and UPSOOK is shown in figure 4. Undersampled schemes are an innovative form of intensity modulation for OCC that avoids flicker to human eyes and can be used by off-the-shelf cameras with frame rates of 30 fps or lower. The modulation can easily be generalized to higher order by employing multiple frequencies or phases within a symbol period, generating a sequence of distinct states representing different symbols. Spatial two-phase shift keying (S2-PSK) can demodulate a randomly captured image by leveraging the spatial separation capability of the image sensor, providing a solution for variable frame rate cameras [31]. Twinkle variable pulse position modulation (VPPM) and hybrid spatial phase shift keying (HS-PSK), specified in PHY IV of the IEEE 802.15.7-2018 standard, were proposed by Intel as well. In Twinkle-VPPM, bits are mapped by VPPM into one of two duty cycles, while the twinkle is produced by alternating between two duty cycles to generate a low-rate amplitude change for region of interest (ROI) signalling. In HS-PSK, a high-rate data bit stream is modulated by dimmable spatial eight-phase shift keying (DS8-PSK) while the dimming level is controlled by S2-PSK. Several hybrid modulations standardized in PHY V and PHY VI include rolling shutter FSK (RS-FSK) [32], the compatible M-ary frequency shift keying series (CM-FSK) [33,34], compatible OOK (C-OOK) [32], mirror pulse modulation (MPM), asynchronous quick link (A-QL) [35], variable transparent amplitude shape code (VTASC) [35], sequential scalable two-dimensional colour (SS2DC), invisible data embedding (IDE) and hidden asynchronous quick link (HA-QL) [35].

Figure 4. Undersampled intensity modulation. (a) UFSOOK signal with fs1/fs0 = 5:8. (b) UPSOOK signal. (Online version in colour.)

Apart from these standardized modulation schemes, researchers have proposed other modulations to match low-frame-rate cameras. Table 2 lists experimental data rate results for various modulation and coding techniques. It can be seen that modulation and coding techniques have started to shift from the traditional intensity domain to the colour domain [44,62,63] and the spatial domain [31,61,64–71] to achieve higher spectral efficiency, although at the cost of higher implementation complexity and more interference. To gain a better understanding of how these two domains operate in OCC, we give a brief introduction to two special cases: optical spatial modulation (OSM) and colour shift keying (CSK).

Table 2. Comparisons of different OCC modulation schemes. WDM, wavelength division multiplexing; PWM, pulse width modulation; PAM, pulse amplitude modulation; CIM, colour intensity modulation; UPAMSM, under-sampled pulse amplitude modulation with subcarrier modulation; UQAMSM, under-sampled quadrature amplitude modulation with subcarrier modulation; SM-ST, spatially-modulated space-time; L-STC, layered space-time code; S-TCF, spatial-temporal complementary frames; PTM, pixel translucency modulation; SVM, spatial visual modulation; SDMT, spatial discrete multitone.

year | data rate | distance (m) | modulation | complexity | reference
2016 | 1.28 kbps | 0.05–0.2 | OOK | low | [36]
2018 | 1–3.1 kbps | 0.35 | OOK | low | [37]
2015 | 0.3 kbps | 0.5 | OOK | low | [38]
2013 | 10, 15, 20 Mbps | 0.5, 1.2 | OOK | low | [23]
2017 | 9 kbps | 0.1 | PWM | moderate | [39]
2016 | 84 kbps | 4 | PWM | low | [40]
2015 | 0.3–1.5 kbps | 0.5 | PWM | low | [41]
2016 | 0.62–1.35 kbps | 1.5–5.5 | hybrid OOK-PWM | moderate | [42]
2016 | 95 kbps | 1.2 | CIM and PAM | high | [43]
2016 | 126.7 kbps | 1.4 | CIM | high | [44]
2018 | 57.6 kbps | 0.65–1 | CIM | high | [45]
2014 | 12.8 kbps | 0.5 | CIM | high | [46]
2016 | 317.3 kbps | 0.2 | CIM | high | [47]
2016 | 11.52 kbps | 2 | CIM | high | [48]
2015 | 5.2 kbps | 0.3 | CSK | moderate | [49]
2014 | 0.24 kbps | 0.5 | CSK | moderate | [50]
2016 | 2.88 kbps | 0.1 | WDM | low | [51]
2015 | 0.12–0.96 Mbps | 0.12–0.24 | colour barcodes | high | [52]
2014 | 0.15 kbps | 12 | UPSOOK | low | [28]
2015 | 0.1 kbps | 50 | UPAMSM | moderate | [53]
2015 | 0.25 kbps | 1.5 | UPAMSM | moderate | [29]
2015 | 0.15 kbps | 0.6 | UPSOOK and WDM | high | [54]
2016 | 0.5 kbps | 1.5 | UQAMSM | high | [30]
2014 | 1 kbps | 30 | SM-ST | high | [55]
2015 | 1 kbps | 40–210 | L-STC | high | [56]
2016 | 240 kbps | 0.6 | S-TCF | high | [57]
2015 | 0.8–1.1 kbps | 0.3–1.5 | PTM | high | [58]
2017 | 16.67 kbps | 0.7–1.5 | SVM | high | [59]
2006 | 1.344 Mbps | 2 | SDMT | high | [60]
2010 | 12 Mbps | 10 | OFDM | high | [61]

(i). Optical spatial modulation

The spatial domain can be explored to further increase the data rate under the low frame rate constraint. OSM [31,64–68], typically used in LiFi [13], can be adopted in OCC as well [72]. Since an image sensor is composed of a large pixel array and an optical lens, its outputs, namely light intensity or luminance values, are arranged in a two-dimensional matrix to form a digital electronic representation of the scene. Each optical path between a light source and a single pixel of the image sensor can be regarded as a VLC link that uses a PD as the receiver. Each pixel records data instantaneously or successively depending on the camera's shutter type (global or rolling shutter). Thus, multiple light sources can be used to further increase the capacity of the OCC channel, and the spatial positions (also called the spatial domain) of these light sources can be used to convey more information. The positions of different light sources on the image plane are distinguishable thanks to the optical lens, which provides the spatial separation ability of an image sensor. Therefore, the two-dimensional PD architecture facilitates the detection of both light source position and luminance. In OSM, the data bits are mapped onto the spatial domain of the light source array, and an image sensor can fully demodulate the bits within a captured image based on its spatial separation capability.

In an OSM-based OCC system using LEDs at the transmitter, a bit sequence is first converted to parallel form and then passed to a modulator. The modulator maps the parallel bits to transmitted symbols according to the combination of input bits. Some bits control the selection of the LEDs, while the remaining bits control the intensity levels of the selected LEDs. Typically, only one LED is activated at any particular time instant while the other LEDs are off. Therefore, LED mapping and level mapping are performed jointly. Figure 5 depicts a typical OSM scheme providing four bits per symbol based on a group of four LEDs. Four bits are grouped to construct one symbol: two bits select one of the four LEDs and the other two bits set one of the four intensity levels for the selected LED. For example, in the first symbol period t0, the bit group '0110' means that LED2 is selected according to the LED-index bits '10' and transmits a pulse with optical power I1 according to the intensity bits '01', while the other LEDs remain off. At the receiver, the image sensor performs two steps to demodulate the transmitted bits. First, to demodulate the information from the transmitter light array, the coordinates of each light source have to be obtained by applying a frame differential technique [23] or a Hough circle detection method [44] to the label or training light sources at the head of a bit sequence. After that, the index and intensity of the activated light source in each time interval can be distinguished and measured. Only if both the index and the intensity level are detected correctly is the bit sequence decoded without errors. Similarly, one can easily find the activated LED and transmission level for the second symbol period corresponding to the bit group '1001' along the input bit sequence.

Figure 5. Demonstration of optical spatial modulation providing four bits per symbol. (Online version in colour.)

Obviously, OSM can entirely avoid inter-carrier interference (ICI) and provide a larger capacity while keeping a higher energy efficiency than conventional low-complexity methods for MIMO systems. For an array of N light sources with L luminance levels at the transmitter, and a camera with frame rate fcam at the receiver, the aggregate data rate can be calculated as (log2N + log2L) × fcam bps. However, the receiver requires perfect channel knowledge for data demodulation, which may impose a complexity constraint on the channel estimator. Besides, OSM offers only a logarithmic data rate increase with the number of transmit elements rather than a linear one, which might limit the spectral efficiency for a practical number of light elements. Certainly, to keep the flicker imperceptible, the intensity variation of the entire light source array should not fall below the CFF.
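A minimal sketch of this mapping and of the aggregate-rate calculation is given below in Python. The bit grouping, with the first bit pair setting the intensity level and the second pair selecting the LED, follows the example above; the number of LEDs, levels and the frame rate are assumed values for illustration.

```python
import numpy as np

# OSM mapping sketch for N LEDs with L intensity levels: each camera frame
# carries log2(N) + log2(L) bits. Here (matching the example above) the
# first bit pair sets the intensity level and the second selects the LED.
N_LED, N_LEVEL = 4, 4
BITS_PER_SYMBOL = int(np.log2(N_LED) + np.log2(N_LEVEL))

def osm_map(bits):
    """Group bits into (LED index, intensity level) symbols."""
    symbols = []
    for i in range(0, len(bits), BITS_PER_SYMBOL):
        chunk = ''.join(str(b) for b in bits[i:i + BITS_PER_SYMBOL])
        level = int(chunk[:2], 2)    # first bit pair: intensity level index
        led = int(chunk[2:], 2)      # second bit pair: which LED is active
        symbols.append((led, level))
    return symbols

def osm_demap(symbols):
    """Inverse mapping back to bits (assumes perfect detection)."""
    bits = []
    for led, level in symbols:
        bits += [int(b) for b in format(level, '02b') + format(led, '02b')]
    return bits

bits = [0, 1, 1, 0, 1, 0, 0, 1]      # '0110' -> (LED2, I1), '1001' -> (LED1, I2)
symbols = osm_map(bits)
assert osm_demap(symbols) == bits

f_cam = 30.0                         # assumed camera frame rate (fps)
print(symbols, BITS_PER_SYMBOL * f_cam, 'bps')   # [(2, 1), (1, 2)] 120.0 bps
```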

(ii). Colour shift keying

The VL spectrum contains rich colours. RGB-type LEDs combine three primary colours, namely red, green and blue, and emit the corresponding light simultaneously. For OCC, colourful LEDs offer a new degree of freedom to represent more information bits. Together with the natural colour filter architecture of an image sensor, which makes colour detection easy, CSK is an effective modulation scheme for cost-effective parallel OCC systems [49,50].

As a unique modulation scheme specified in PHY III of the IEEE 802.15.7 standard, CSK was proposed to operate with RGB LEDs in order to enable higher-order, spectrally efficient modulation and provide high data rates [73]. Unlike intensity modulation, CSK guarantees no intensity fluctuation of the luminary. Given the two-dimensional PD array architecture described above and the existing literature, CSK can be easily adopted in OCC without significantly increased complexity. Figure 6 depicts a typical OCC link using CSK modulation. Data are first mapped onto a constellation symbol ci = [xi, yi], a point in the CIE 1931 xy chromaticity diagram. The symbol set is then converted into the absolute luminous flux symbol set si = [ri, gi, bi], in which ri, gi and bi represent the absolute luminous flux of the red, green and blue colour bands, respectively. After that, the electric current levels required to generate the specific luminous flux of each symbol are determined to drive the RGB LEDs. At the receiver, light passing through the colour filters is recorded by the image sensor. Using the channel matrix estimated from a pilot or training sequence, the receiver performs maximum-likelihood detection on the received signal. The symbols are then converted back into their xy chromaticity coordinates and eventually decoded to bits.

Figure 6. Block diagram of an OCC system using CSK modulation. (Online version in colour.)

Clearly, CSK can entirely avoid intensity fluctuation of the luminary and improve the spectral efficiency. For an RGB LED with L constellation points in the colour space, and a camera with frame rate fcam at the receiver, the data rate can be calculated as fcam × log2L bps. The cross-talk between colour channels is typically assumed to be deterministic and invertible. However, this assumption may fail, especially under sensor saturation and nonlinear distortion in outdoor scenarios with strong background radiation, which imposes complexity constraints and imperfect channel estimation on the receiver. Besides, the spectral responsivity of the PD pixels generally differs from human colour perception, so a target intensity and colour designed to appear constant may still produce perceptible flicker for human eyes. Furthermore, extra complexity is imposed on the transceiver [74,75] to achieve a large, high-quality constellation.
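To make the mapping from a chromaticity constellation point to the driving fluxes of the three primaries concrete, the short Python sketch below solves the linear colour-mixing equations x = Pr·xr + Pg·xg + Pb·xb, y = Pr·yr + Pg·yg + Pb·yb with Pr + Pg + Pb = 1, the rule commonly quoted for IEEE 802.15.7-style CSK; the primary chromaticities used here are assumed values, not those of the standard's colour bands.

```python
import numpy as np

# CSK mapping sketch: solve for the relative fluxes (Pr, Pg, Pb) of the three
# primaries that reproduce a target chromaticity (x, y). The primary
# chromaticities below are assumed values for illustration only.
PRIMARIES = {
    'r': (0.700, 0.300),
    'g': (0.170, 0.700),
    'b': (0.140, 0.040),
}

def csk_fractions(x, y):
    """Relative flux of each primary that reproduces chromaticity (x, y)."""
    (xr, yr), (xg, yg), (xb, yb) = PRIMARIES['r'], PRIMARIES['g'], PRIMARIES['b']
    A = np.array([[xr, xg, xb],      # x = Pr*xr + Pg*xg + Pb*xb
                  [yr, yg, yb],      # y = Pr*yr + Pg*yg + Pb*yb
                  [1.0, 1.0, 1.0]])  # 1 = Pr + Pg + Pb
    p = np.linalg.solve(A, np.array([x, y, 1.0]))
    if np.any(p < -1e-9):
        raise ValueError('chromaticity lies outside the RGB triangle')
    return p

# Example: a constellation point near the centre of the colour triangle.
print(csk_fractions(0.33, 0.33))    # approx. [0.32, 0.31, 0.37]
```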

(b). Data rate increase by exploring rolling shutter

An image sensor's shutter determines how and when light is recorded during an exposure. Typically, there are two ways to capture a static picture or each frame of a video signal: one is called the rolling shutter and the other the global shutter. Although both technologies record light over a finite exposure interval, with a rolling shutter not every portion of the image starts and stops receiving light at the same time. As used in popular complementary metal oxide semiconductor (CMOS) cameras, a rolling shutter exposes the frame progressively, line by line from top to bottom.

Figure 7 illustrates the rolling shutter process for successive image recording. As can be seen from this figure, the start and end of exposure of each row (or column, or individual pixel) happen sequentially, so not all pixels are exposed at the same time. It can take up to 1/fcam for all of the pixels on the sensor to finish exposure, where fcam is the frame rate. The effect becomes noticeable if the object is moving fast, causing a jello effect and degrading image quality. However, one can take advantage of the rolling shutter mechanism in an OCC system to increase the sampling rate of the receiver [33,34,37,41,51,76–83].

Figure 7. Frame capturing process of a rolling shutter camera. (Online version in colour.)

Figure 8 demonstrates the operation of an OCC system with sequential data transmission by an LED transmitter and signal reception by a rolling shutter-based sensor. Figure 8a shows sequential data transmission by a commercial white LED at a kilohertz-level rate. When the LED is on, the first row of the image sensor starts to be exposed and is read out before the exposure of the next row; the second row is then exposed while the LED is off. Such a process creates a bright stripe followed by a dark one, and it continues until the last row. Finally, an image full of bright and dark stripes is obtained. The duration of the LED states, tLED, is typically set to a multiple of the row exposure time tROW. Therefore, a stripe image is observed as shown in figure 8b, in which the bright and dark stripes can easily be separated. Figure 8c plots the mean pixel output per row. The transmitted bits can then be recovered by applying equalization and threshold detection.

Figure 8. An operation process of rolling shutter. (a) Sequential data transmission by a commercial white LED at a rate of kHz. (b) A captured image (3968 × 2976). (c) The mean output of pixel values across rows of an image. (Online version in colour.)

Assume a camera with M × N pixels (rows × columns) and frame rate fcam at the receiver, and let n rows represent each bit '1' or '0' to increase the signal-to-noise ratio (SNR). The data rate can then be calculated as fcam × M/(2n) bps, since there must be at least two complete slots in a frame to recover a correct data stream. Clearly, the rolling shutter mechanism greatly improves the sampling rate compared with traditional sampling applied over the entire frame. Thanks to the high sampling rate, the light source can operate at a frequency of several kilohertz, which makes the modulation imperceptible to human eyes. Different techniques for improving the demodulation performance of the rolling shutter pattern have been investigated, such as blooming mitigation [79], extinction ratio enhancement [77] and optimal thresholding schemes [51]. Binary frequency shift keying and frequency division multiple access were also proposed to enable multiple access while avoiding collisions of signals from multiple LED transmitters [33].

However, due to the reset mechanism that prepares the next exposure from the last row back to the first row, the gap between two successive frames shown in figure 7 may complicate frame design and recovery, especially when the rolling shutter technique is used to receive a long packet spanning multiple frames [79,84,85]. Besides, for a typical OCC operating setting, spatial distortion is inevitably visible in rolling shutter mode, especially when the image projected on the sensitive area is relatively small and the scenario is mobile. Despite the higher sampling rate, the link range of a rolling shutter-based OCC system is severely limited, and the bit rate changes with the distance and the size of the optical footprint [86,87].
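The row-mean and threshold-detection steps described above are simple enough to sketch end to end; the Python snippet below builds a synthetic striped frame and recovers the bits. Frame size, stripe width and noise level are assumed values, not measurements.

```python
import numpy as np

# Rolling-shutter demodulation sketch on a synthetic striped frame:
# average each row, group n rows per bit and apply a midpoint threshold.
M, N, n = 120, 160, 10                           # rows, columns, rows per bit
bits_tx = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1]   # M/n = 12 bits per frame

# Bright rows (grey level ~200) for '1', dark rows (~50) for '0', plus noise.
rows = np.repeat(np.array(bits_tx) * 150.0 + 50.0, n)
frame = np.tile(rows[:, None], (1, N)) + np.random.normal(0.0, 10.0, (M, N))

row_mean = frame.mean(axis=1)                          # one value per sensor row
threshold = 0.5 * (row_mean.max() + row_mean.min())    # simple midpoint threshold
bit_means = row_mean.reshape(-1, n).mean(axis=1)       # average the rows of each bit
bits_rx = (bit_means > threshold).astype(int).tolist()

print(bits_tx)
print(bits_rx)    # matches bits_tx at this noise level
```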

(c). Data rate increase by high-speed cameras

For intensity modulation with N levels in general, each symbol carries log2N bits. If the camera frame rate is fcam, the bit rate becomes (log2N) × fcam/2 bps. Clearly, increasing the frame rate by directly using a high-speed camera linearly increases the data rate. Compared with normal cameras, high-speed cameras can capture more information about fast-moving and rapidly changing objects, opening a much expanded application space in the commercial, industrial and military sectors.

High-speed cameras at up to tens of thousands of fps can directly realize high-speed OCC and detection. For instance, Premachandra et al. [88] proposed an optical flow-based VLC tracking system using an on-vehicle high-frame-rate (HFR) camera working at 1000 fps together with an LED array switching at 500 Hz. Similarly, Ishii et al. [89] developed an HFR vision system operating in real time at 1000 fps for 1024 × 1024 pixel images, implementing an improved optical flow detection algorithm on a high-speed vision platform. Such an HFR vision system was designed for optical flow estimation rather than communication, but showed great potential for OCC. To overcome the distance dependence of the LED-to-camera channel frequency response, Arai et al. [90] proposed a hierarchical coding scheme based on the two-dimensional fast Haar wavelet transform for LED detection, and the communication system achieved a data rate of 128 kbps at a distance of 32 m with an HFR camera working at 4000 fps. Nagura et al. adopted a commercial camera with a maximum frame rate of 16 000 fps for LED array detection and communication, achieving data rates of 128 kbps and 16 kbps in static and driving modes, respectively [91]. Nagura et al. [70] also proposed two improved decoding methods to distinguish multi-valued data more reliably, leveraging an HFR camera (1000 fps) to achieve a data rate of 42.7 kbps at up to 65 m. Iwasaki et al. proposed a road-to-vehicle VLC system at an intersection using an LED traffic light as the transmitter and an on-vehicle high-speed camera (1000 fps) as the receiver [92]; the system achieved a maximum data rate of 4 kbps with 64 LEDs modulated at 250 Hz.

To achieve high-speed data acquisition from various transmitters simultaneously, a new hybrid OCC and PD system, typically called an optical communication image sensor (OCI), was proposed [23,69,93]. It realizes a new type of image sensor with communication PD cells integrated between the imaging pixels. Figure 9 shows an OCI schematic and its operation process. The image output from the imaging cells is used to detect and track the various light sources. To avoid the intolerable computational cost of processing the image sensor output when the imaging pixels run at full resolution, a 1-bit flag for each imaging pixel is generated by a comparator circuit. After an image processor acquires the flag image, the coordinates of the LEDs are determined by image processing techniques. The corresponding communication pixels are then activated and the communication links are established according to these central coordinates. The output signal is amplified, digitized and demodulated in an external circuit. Using this novel image sensor architecture, the frequency response of the PD cells has been characterized [69], and several PD communication methods have been implemented, including baseband modulation with line coding [23,93] and OFDM [69].

Figure 9. Structure of an OCI sensor and its operation process. (Online version in colour.)

Currently, these high-speed cameras and hybrid OCI techniques are relatively expensive, but their demonstrated reliable high-frame-rate processing performance places them in a unique position for future OCC applications.

(d). Tackling the frame rate instability

Frame rate variation always exists in smartphone cameras. With the rapid development of consumer electronics, operation at 60 fps or higher at 4K resolution has almost become a standard camera configuration, which facilitates further improvement of OCC techniques. However, due to the variation of the sampling interval, which depends on the manufacturing process and shutter mode of an image sensor, synchronization becomes a necessary technique for an OCC system to recover bits from captured images. Figure 10 shows the measured frame rate variation for various types of smartphones operating at 30 fps and 60 fps. It is clear that the practical frame rates depart from the configured frame rates. Such a mismatch causes sampling time errors and drives the receiver out of synchronization, as depicted in figure 10c.

Figure 10. Measured frame rate variation concerning various types of smartphones operating at (a) 30 fps, (b) 60 fps and (c) its impact on the sampling process. (Online version in colour.)

To overcome the issue of unstable frame rate, Tian et al. simply adopted four times oversampling by setting the camera frame rate to 330 fps while the LED array remained at an 82.5 Hz refresh rate [43]. The LightSync system was designed with a fast frame synchronization capability for a rolling shutter image sensor to minimize the synchronization error between an LCD screen and a camera [94]. Rajagopal et al. [33] proposed a preamble structure and a periodic packet transmission scheme to guarantee precise synchronization between the infrastructure and a camera operating at 30 fps. Hu et al. [49] investigated the impact of synchronization and proposed an error correction coding method to minimize inter-frame packet loss. Different from other methods, Shiraki et al. [95] proposed a novel algorithm to demodulate the captured signal even without knowledge of the transmission period. Kwon et al. [96] proposed an effective synchronization strategy that maintains colour uniformity; they further investigated the colour independence of visual MIMO and improved the synchronization performance [97].

Existing synchronization strategies mainly rely on an auxiliary reference signal or reference code. However, such a reference signal or code may cause perceptible flicker or reduce the achievable data rate. Therefore, more effective synchronization methods deserve further investigation and development for OCC systems.
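One simple receiver-side mitigation, sketched below in Python under an assumed jitter model, is to assign each captured frame to a symbol slot from its timestamp (given a start time and nominal symbol period, for example obtained from a preamble) rather than from its frame index, and to combine the samples in each slot by a majority vote; unlike frame counting, the timing error then does not accumulate when the camera runs slightly slow or fast.

```python
import numpy as np

# Frame-rate-deviation sketch: a camera configured for 30 fps actually runs
# near 29 fps with jitter. Frame counting accumulates a timing drift, while
# assigning frames to symbol slots by timestamp does not. All parameters are
# illustrative assumptions.
rng = np.random.default_rng(1)
f_nominal, oversampling = 30.0, 4
T_sym = oversampling / f_nominal                 # symbol period used by the LED

bits_tx = rng.integers(0, 2, 50)
n_frames = bits_tx.size * oversampling
intervals = (1 / 29.0) * (1 + 0.05 * rng.uniform(-1, 1, n_frames))
t = np.cumsum(intervals) - intervals[0]          # actual capture instants

slot_true = np.minimum((t / T_sym).astype(int), bits_tx.size - 1)
samples = bits_tx[slot_true]                     # what each frame actually sees

def majority_vote(slot_of_frame):
    out = np.zeros(bits_tx.size, dtype=int)
    for k in range(bits_tx.size):
        sel = samples[slot_of_frame == k]
        out[k] = int(sel.mean() > 0.5) if sel.size else 0
    return out

by_count = majority_vote(np.arange(n_frames) // oversampling)  # frame counting: drifts
by_time = majority_vote(slot_true)                              # timestamp-based slots
print('errors, frame counting :', int((by_count != bits_tx).sum()))   # typically > 0
print('errors, timestamp-based:', int((by_time != bits_tx).sum()))    # 0
```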

4. Visual constraints and solutions

Pervasive electronic displays exist everywhere, such as smartphone and tablet screens, computer monitors, electronic advertising boards and TVs, and humans are accustomed to acquiring information from them. With the ubiquity of these display devices and cameras, a new OCC architecture called camera-screen communication (CSC) has emerged as a competitive solution for quick information acquisition.

A well-known CSC example, and arguably its predecessor, is the quick response (QR) code [98], which embeds information into a two-dimensional barcode. QR codes have become common in consumer advertising. In the past few years, various new QR-code-like tag forms have emerged with improved communication capabilities [52,61,99] and enhanced reliability and security [47,94,100,101]. To explore more degrees of freedom, the COBRA system embeds five colours in a colour barcode to achieve higher capacity [99]; however, using additional colours may lead to more decoding errors because of colour cross-talk in the image sensor. In [52], the enhanced RainBar visual communication system was proposed, based on a colour barcode with a high-capacity layout design, flexible frame synchronization and accurate code extraction. By means of multiple tracking bars and per-line tracking within each frame, in conjunction with linear erasure coding, the LightSync system can recover lost frames and correct partial or mixed frames across the original frames.

However, the CSC systems mentioned above are perceptible to human eyes. From the perspective of visual quality, humans prefer a normal full-screen viewing experience while acquiring extra information from the screen-to-camera channel simultaneously. This kind of CSC is usually called covert CSC or hidden CSC, and is subject to a visual constraint for human eyes [32,58,59,102–108]. Unlike a traditional VLC system that provides proper illumination while transmitting data simultaneously, a covert CSC system focuses on transmission while providing a quality display rather than illumination. Such covert OCC protects data from unintended attention, relying on an eye channel that perceives the source content and a sensor channel that recovers the transmitted data.

A general description of covert CSC is illustrated in figure 11. When data are encoded by intensity, colour or transparency into a single image or a stream of pictures displayed on a screen, a device equipped with a camera can capture the frames and recover the transmitted data while an acceptable viewing experience is maintained. Such mutually conflicting requirements have to be fulfilled by jointly optimizing the communication performance and minimizing the image distortion perceived by humans. Tremendous efforts have been made in this regard, taking both visual content and communication performance into account.

Figure 11. The concept of covert camera-screen communication. (Online version in colour.)

HiLight [104], first proposed by Li et al., encodes bits into the orthogonal transparency channel of a smartphone screen and recovers them using another smartphone camera at a short distance (15 cm). Similarly, they realized a real-time covert CSC system for arbitrary screen content, and discussed the impact of transmission distance, ambient light, hand motion and viewing angle [58]. Yuan et al. [103] proposed a novel algorithm to extract photographic messages from received images with photometric and geometric distortions in a visual MIMO optical system. Wang et al. [102] proposed the InFrame++ system, which can support up to 360 kbps by leveraging the spatial-temporal flicker-fusion property of human perception. Nguyen et al. [32] adopted content-adaptive encoding techniques to achieve a high-throughput covert CSC system with unnoticeable flicker. Wengrowski et al. [105] modified each colour pixel by shifting the base colour along a specific colour gradient, which was then used to map pixels in an arbitrary image. Sato et al. [106] proposed a blue colour difference modulation scheme and verified its feasibility experimentally. In [107], the modulated signals were added to the R, G and B pixels of the video content and demodulated by a high-resolution camera. Recently, Wang et al. [59] proposed a spatial visual modulation scheme and conducted a quality assessment among 21 people to test the impact of the modulated images on human perception. A recent vectorized colour modulation work extended traditional colour modulation to a colour vector space by exploring all feasible combinations of colour vector pairs to represent more information [63]; the set achieving the best demodulation performance under colour cross-talk was found through an extensive search. Table 3 summarizes the key characteristics of these covert CSC systems.

Table 3. Comparisons of different covert CSC systems.

year | data rate | distance | BER | media | reference
2016 | 22 kbps | 0.7 m | 0.1 | video | [32]
2019 | 11.52 kbits frame−1 | 0.9 m | 8.6 × 10−5 – 8.5 × 10−2 | image | [63]
2015 | 0.2–5 kbps | 0.3–1.2 m | <0.1 | image, video | [58]
2017 | 16.67 kbps | 0.7–1.3 m | 0.1 | video | [59]
2015 | 150–360 kbps | 0.6–5 m | 0.1–0.2 | video | [102]
2012 | 6.22 kbps | | 5.4 × 10−2 | image | [103]
2015 | 9–11 kbps | 15 cm | <0.1 | video | [104]
2017 | 144 bits frame−1 | 0.5 m | 0.3 | image | [105]
2016 | 0.756 kbps | 0.5 m | <0.1 | video | [106]
2017 | 4.5 Mbps | | 2.8 × 10−3 – 0.3 | video | [107]
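As a toy illustration of the flicker-fusion idea used by several of these systems, the Python sketch below (with an arbitrary block size and modulation depth; it is not a reimplementation of any cited scheme) embeds a bit as a +δ/−δ perturbation on two consecutive frames: the pair averages back to the original content for the viewer, while differencing the two captured frames exposes the sign of δ and hence the bit.

```python
import numpy as np

# Complementary-frame embedding sketch: +delta on one frame, -delta on the
# next. The eye averages the pair back to the original content; the camera
# recovers the bit from the frame difference. Parameters are illustrative.
rng = np.random.default_rng(0)
content = rng.uniform(60.0, 200.0, (64, 64))   # one block of the displayed image
delta = 4.0                                    # small, visually unobtrusive step

def embed(bit):
    sign = 1.0 if bit else -1.0
    f1 = np.clip(content + sign * delta, 0.0, 255.0)   # first displayed frame
    f2 = np.clip(content - sign * delta, 0.0, 255.0)   # complementary frame
    return f1, f2

def recover(f1, f2):
    return int((f1 - f2).mean() > 0.0)                 # camera-side frame difference

bits = [1, 0, 1, 1, 0]
print([recover(*embed(b)) for b in bits])              # reproduces bits
f1, f2 = embed(1)
print(np.abs((f1 + f2) / 2 - content).max())           # ~0: invisible on average
```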

Currently, these covert or embedded CSC systems make a trade-off between visual experience and communication performance, including data rate and bit error rate (BER). However, with the booming demand for ultra-high-definition video and images, traditional embedding methods may not satisfy the visual quality requirements. Therefore, more efficient embedding techniques that preserve a good viewing experience need to be investigated in depth.

5. Link blockage and non-line of sight solutions

Most reported OCC systems rely on a line of sight (LOS) path from the transmitter to the receiver. However, there are scenarios where the transmitter is not within the receiver's FOV, for example when a bookshelf blocks the link between a lamp and a smartphone indoors, or when two vehicles approach a crossroad outdoors. In such cases, the connection cannot be established directly. Instead, an NLOS OCC system exploits either indoor surface reflections or outdoor atmospheric scattering. Figure 12 depicts two typical NLOS OCC scenarios: figure 12a for an indoor environment and figure 12b for an outdoor environment.

Figure 12. Typical NLOS OCC scenarios. (a) Indoor and (b) outdoor. (Online version in colour.)

(a). Indoor reflective non-line of sight

OCC has been widely studied for indoor LOS communication. However, owing to the limited FOV of a camera, an LOS-based OWC link typically suffers from shadowing and blockage, which limits mobility and performance [109–111]. Thus, OCC through an NLOS path emerges as a solution offering strong robustness to blockage and mobility. It mainly leverages light reflections from walls, ceilings and other diffusely reflecting surfaces to establish communication links, but may experience high path loss and multipath-induced inter-symbol interference, especially for links with high transmission rates.

For example, by detecting the rapid changes of LED radiation reflected from surfaces, Rajagopal et al. achieved indoor localization and error-free communication simultaneously at a data rate of 1.25 bytes s−1 and a lighting level above 600 lux [33]. In [112], an improved rolling shutter pattern demodulation algorithm based on light reflected from a wall was proposed; the system achieved an NLOS link distance of 1.5 m at a low illumination level of 145 lux in a dark environment. A novel 2 × N indoor NLOS OCC system was proposed and experimentally verified by Hassan et al. [113], with a specially designed packet structure and a detection methodology supporting data transmission at 30 fps over 5 m; the impacts of ISO level and exposure time on the received signal were investigated as well. Similarly, the NLOS space-time division multiplexing OCC system proposed by Hassan et al. performed well over a transmission distance of 10 m with a 12 mW transmit power, and its tolerance to ambient radiation was improved by mask matching and equal gain combining techniques [114].

Obviously, reflection-based NLOS OCC can maintain a stable communication link and enable new forms of sensing and mobile computing with strong robustness to blockage and mobility. However, the path loss is usually much higher than that of an LOS link, which limits the achievable data rate and transmission performance. Besides, owing to the NLOS nature, the output of each pixel is a superposition of contributions from all light sources, which introduces additive interference [115].

(b). Outdoor scattering non-line of sight

In outdoor scenarios, the transmitter beam sometimes lies beyond the receiver FOV. For instance, vehicles moving in perpendicular directions at a crossroad typically cannot see each other directly, and tall buildings often shadow communication links. Thus, NLOS scattering communication is necessary. With the ubiquity of smartphones equipped with cameras and flash LEDs, OCC-based NLOS scattering communication emerges as a good candidate for outdoor delivery of key information, such as steering information for vehicles, remote control instructions or an SOS signal from a person trapped in the wild. Here, we give a brief introduction to the feasibility of OCC-based NLOS scattering communication.

Figure 13 shows a typical OCC-based NLOS scattering link geometry. The pitch angles of the transmitter and the receiver are denoted θ1 and θ2, respectively; ϕ1 and ϕ2 are the beam width of the transmitter light source and the FOV of the receiver camera, respectively. R is the separation distance between the transmitter and the receiver, and V denotes the common interaction volume of the beam and the FOV. Note that the azimuth angles of the transceiver are not considered here.

Figure 13. OCC-based NLOS scattering communication geometry.

An NLOS channel model is usually analytically intractable, and the path loss η(λ) at a specific wavelength λ between the transmitter and the receiver can be obtained by a Monte Carlo method that simulates the propagation behaviour of each photon [116]. Incorporating the spectral responses of the LED, fLED(λ), and of the camera, fCam(λ), over the VL spectrum from 380 to 780 nm, the average number of photons λph arriving at the image sensor within the exposure time τ is given by

λph = (τP/(hc)) ∫ fLED(λ) fCam(λ) η(λ) λ dλ,  (5.1)

where P denotes the transmitted power corresponding to signal '1' for OOK modulation, h is Planck's constant and c is the speed of light. The received signal of an image sensor, incorporating a signal-dependent noise model, is given by Huang & Xu [117] as

Y = X + X Z2 + √X Z1 + Z0,  (5.2)

where X is the desired signal component; the second and third terms on the right are Gaussian noise terms with zero mean and variances proportional to X² and X, respectively; and the last term is a Gaussian noise term with zero mean and variance independent of X. In [116], an experiment in a dark room was conducted to verify the signal-dependent noise model in equation (5.2) for a practical grey-scale sensor-based camera (Hamamatsu C11440-52U). With 50 000 successive images taken at a fixed light intensity, a Gaussian distribution whose signal-dependent variance is a quadratic function of the signal was experimentally confirmed.
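To show how equations (5.1) and (5.2) combine into a link budget and a detection simulation, the Python sketch below integrates an assumed LED spectrum, a flat camera response and a wavelength-independent path loss over the visible band, and then simulates OOK detection under the signal-dependent noise model; none of the spectra, the path loss or the noise coefficients are taken from the experiment in [116].

```python
import numpy as np

# Link-budget sketch for eq. (5.1) and OOK detection under eq. (5.2).
# Spectra, path loss and noise coefficients are placeholder assumptions.
h, c = 6.626e-34, 3.0e8                     # Planck constant, speed of light
tau, P = 1.25e-3, 0.9                       # exposure time (s), optical power (W)
lam = np.linspace(380e-9, 780e-9, 401)      # visible band (m)
dlam = lam[1] - lam[0]

f_led = np.exp(-0.5 * ((lam - 550e-9) / 50e-9) ** 2)
f_led /= f_led.sum() * dlam                 # normalized LED spectrum (1/m)
f_cam = np.ones_like(lam)                   # assumed flat camera response
eta = 1e-12 * np.ones_like(lam)             # assumed wavelength-flat NLOS path loss

lam_ph = (tau * P / (h * c)) * np.sum(f_led * f_cam * eta * lam) * dlam
print(f'mean signal photons per exposure: {lam_ph:.0f}')

# OOK over Y = X + X*Z2 + sqrt(X)*Z1 + Z0 with zero-mean Gaussian Z terms.
rng = np.random.default_rng(0)
a2, a1, a0 = 1e-4, 1.0, 400.0               # assumed noise variance coefficients
bits = rng.integers(0, 2, 200_000)
X = bits * lam_ph                           # '1' -> lam_ph photons, '0' -> none
Y = (X + X * rng.normal(0.0, np.sqrt(a2), X.size)
       + np.sqrt(X) * rng.normal(0.0, np.sqrt(a1), X.size)
       + rng.normal(0.0, np.sqrt(a0), X.size))
bits_hat = (Y > 0.5 * lam_ph).astype(int)   # midpoint threshold detection
print('simulated BER:', np.mean(bits_hat != bits))   # effectively error-free here
```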

To further verify the feasibility of OCC-based NLOS scattering communication, an outdoor experiment was conducted on the roofs of two buildings to minimize the influence of ambient noise [116]. At the transmitter, a pseudo-random binary bit sequence was generated in Matlab and transferred to an arbitrary waveform generator (AWG) (RIGOL DG5252); a white LED driven by a DC bias then generated a total optical power of 900 mW. The data rate was set to 100 bps, mainly limited by the low sensor frame rate and the significant NLOS channel path loss. At the receiver, the frame rate of the camera was set to 800 Hz to achieve a better BER performance. Note that a higher frame rate implies a shorter exposure time, reducing the numbers of received signal photons and background photons simultaneously, whereas a lower frame rate decreases the achievable data rate but increases the received photon count; thus, there is a trade-off between data rate and SNR. The pitch angle of the transceiver was set to 15° to facilitate coarse alignment and obtain better performance. The BER was then tested at different distances (130 m and 300 m) with eight times oversampling at the camera. The measured pixel output values for captured '1' and '0' at different distances were found to fit Gaussian distributions well. The variance for captured '1' was larger than that for '0' and increased with the output mean value, indicating the signal-dependent nature of the noise. It was also observed that the probability distribution curves corresponding to '1' and '0' were well separated, leading to error-free reception over the observed long bit periods after downsampling and optimal detection of the captured signal.

To overcome the physical limitations of practical experiments, Monte Carlo simulations were also performed to predict range-dependent BERs for different system settings. A white LED with an optical power of 50 mW per 1 nm across the entire VL spectrum was assumed in the simulation. The NLOS scattering path loss over the entire LED spectrum for additional transceiver geometric parameter values was obtained by Monte Carlo simulation with the atmosphere parameters specified in [116]. At the receiver, the noise distribution parameters were derived from the experimental results. The BER performance was then predicted at different transmission distances under various transceiver pitch angles and sensor exposure times.

Figure 14 illustrates the predicted range-dependent BER for different system settings. Figure 14a provides BERs for distances ranging from 100 to 1000 m and transceiver pitch angle pairs (θ1, θ2) = (15°, 15°), (15°, 45°), (45°, 15°) and (45°, 45°) with a fixed exposure time τ = 1.25 ms. Clearly, the transmitter pitch angle has the greater impact on system performance, which implies that longer distances can be achieved by lowering the transmitter pitch angle. The distance can be extended beyond 500 m considering the forward error correction (FEC) limit, represented by a horizontal dashed line in figure 14a. Figure 14b depicts the range-dependent error probability for various exposure times with a fixed pitch angle pair (θ1, θ2) = (15°, 15°). The exposure time here mainly determines the achievable maximum frame rate: a shorter exposure time allows a higher frame refresh rate, but at the cost of fewer captured signal photons. It is clear that, for a fixed optical SNR (a fixed ratio of arriving signal photons to background noise photons), a longer integration time within a symbol duration yields better BER performance, although the improvement may be limited by pixel saturation in a practical imaging system.

Figure 14. Predicted error probability for different (a) transceiver elevation angles and (b) exposure time. (Online version in colour.)

6. Future directions

Despite the numerous appealing features of OCC technologies, several challenges remain for practical implementation. For example, existing works mainly focus on simplex communication while ignoring the full-duplex requirement of instant messaging or two-party interaction; when full-duplex communication is considered, OCC may introduce interference, making mobile transceivers more vulnerable. Besides, an OCC system may operate under strong ambient radiation from sources such as solar irradiance, street lamps or illuminated advertising boards. Such interference seriously degrades system performance and can even saturate and blind the image sensor. Furthermore, practical LEDs and screen pixels exhibit nonlinearity, and the nonlinear distortion has to be compensated in the system design, optimization and practical implementation of typical intensity modulation-based systems. Blur also occurs [118] when a camera is out of focus owing to the absence of an adaptive focusing mechanism, leading to SNR degradation and poor spatial separation of source signals. Additionally, ultra-high-frame-rate cameras remain unaffordable for general consumer electronics.

The above issues need to be tackled further, from devices to transmitter and receiver design, advanced signal processing, interference suppression, and system and network protocol design, to fully realize the benefits of OCC technologies. To further improve the data rate, joint modulations integrating various domains, such as intensity, colour, space, phase and frequency, should be investigated in depth. It is also necessary to explore higher-performance image sensors delivering higher frame rates, higher resolution, wider dynamic range and higher sensitivity.

As is commonly acknowledged, the current RF spectrum can hardly fulfil the exponentially growing demand for wireless capacity in next-generation communications. The optical spectrum will play an important role for communications in offices, shopping malls, industrial areas, public gathering places, public transportation hubs and many other environments. OCC appears as one of the most attractive options in terms of flexibility, mobility, popularity and cost.

Data accessibility

This article has no additional data.

Authors' contributions

All the authors contributed to the writing of the manuscript.

Competing interests

We declare we have no competing interests.

Funding

This work was supported by the Key Program of the National Natural Science Foundation of China (grant no. 61631018) and the Key Research Program of Frontier Sciences of CAS (grant no. QYZDY-SSW-JSC003).

References

1. Hanzo L, Haas H, Imre S, O'Brien DC, Rupp M, Gyongyosi L. 2012. Wireless myths, realities, and futures: from 3G/4G to optical and quantum wireless. Proc. IEEE 100, 1853–1888. (doi:10.1109/JPROC.2012.2189788)
2. Huang KC, Wang Z. 2011. Millimeter wave communication system. New York, NY: Wiley-IEEE Press.
3. Chen Z, Ma X, Zhang B, Niu Z, Kuang N, Chen W, Li L, Li S. 2019. A survey on terahertz communications. China Commun. 16, 1–35. (doi:10.12676/j.cc.2019.02.001)
4. Barry J, Kahn J, Lee E, Messerschmitt D. 1991. High-speed nondirective optical communication for wireless networks. IEEE Network 5, 44–54. (doi:10.1109/65.103810)
5. Xu Z, Sadler BM. 2008. Ultraviolet communications: potential and state-of-the-art. IEEE Commun. Mag. 46, 67–73. (doi:10.1109/MCOM.2008.4511651)
6. Elgala H, Mesleh R, Haas H. 2011. Indoor optical wireless communication: potential and state-of-the-art. IEEE Commun. Mag. 49, 56–62. (doi:10.1109/MCOM.2011.6011734)
7. Komine T, Nakagawa M. 2004. Fundamental analysis for visible-light communication system using LED lights. IEEE Trans. Consum. Electron. 50, 100–107. (doi:10.1109/TCE.2004.1277847)
8. Afgani MZ, Haas H, Elgala H, Knipp D. 2006. Visible light communication using OFDM. In 2nd Int. Conf. on Testbeds and Research Infrastructures for the Development of Networks and Communities, Barcelona, Spain, 1–3 March 2006. Piscataway, NJ: IEEE.
9. O'Brien DC, Zeng L, Le-Minh H, Faulkner G, Walewski JW, Randel S. 2008. Visible light communications: challenges and possibilities. In 2008 IEEE 19th Int. Symp. on Personal, Indoor and Mobile Radio Communications, Cannes, France, 15–18 September 2008. Piscataway, NJ: IEEE.
10. Wang Z, Wang Q, Huang W, Xu Z. 2017. Visible light communications: modulation and signal processing. New York, NY: John Wiley & Sons.
11. Tsonev D, Videv S, Haas H. 2015. Towards a 100 Gb/s visible light wireless access network. Opt. Expr. 23, 1627–1637. (doi:10.1364/OE.23.001627)
12. Lu HH, Li CY, Chen HW, Ho CM, Cheng MT, Yang ZY, Lu CK. 2016. A 56 Gb/s PAM4 VCSEL-based LiFi transmission with two-stage injection-locked technique. IEEE Photon. J. 9, 1–8. (doi:10.1109/JPHOT.2016.2637564)
13. Haas H, Yin L, Wang Y, Chen C. 2015. What is LiFi? J. Lightw. Technol. 34, 1533–1544. (doi:10.1109/JLT.2015.2510021)
14. Chan VW. 2006. Free-space optical communications. J. Lightw. Technol. 24, 4750–4762. (doi:10.1109/JLT.2006.885252)
15. Alzenad M, Shakir MZ, Yanikomeroglu H, Alouini MS. 2018. FSO-based vertical backhaul/fronthaul framework for 5G+ wireless networks. IEEE Commun. Mag. 56, 218–224. (doi:10.1109/MCOM.2017.1600735)
16. Khalighi MA, Uysal M. 2014. Survey on free space optical communication: a communication theory perspective. IEEE Commun. Surv. Tutor. 16, 2231–2258. (doi:10.1109/COMST.2014.2329501)
17. Bae JH, Le NT, Kim JT. 2017. Smartphone image receiver architecture for optical camera communication. Wireless Pers. Commun. 93, 1043–1066. (doi:10.1007/s11277-017-3971-3)
18. Luo P, Zhang M, Ghassemlooy Z, Zvanovec S, Feng S, Zhang P. 2018. Undersampled-based modulation schemes for optical camera communications. IEEE Commun. Mag. 56, 204–212. (doi:10.1109/MCOM.2018.1601017)
19. Saha N, Ifthekhar MS, Le NT, Jang YM. 2015. Survey on optical camera communications: challenges and opportunities. IET Optoelectron. 9, 172–183. (doi:10.1049/iet-opt.2014.0151)
20. Le NT, Jang YM. 2017. MIMO architecture for optical camera communications. J. Korean Inst. Commun. Inf. Sci. 42, 8–13. (doi:10.7840/kics.2017.42.1.8)
21. Saeed N, Guo S, Park KH, Al-Naffouri TY, Alouini MS. 2019. Optical camera communications: survey, use cases, challenges, and future trends. Phys. Commun. 37, 100900. (doi:10.1016/j.phycom.2019.100900)
22. Nguyen T, Islam A, Yamazato T, Jang YM. 2018. Technical issues on IEEE 802.15.7m image sensor communication standardization. IEEE Commun. Mag. 56, 213–218. (doi:10.1109/MCOM.2018.1700134)
23. Takai I, Ito S, Yasutomi K, Kagawa K, Andoh M, Kawahito S. 2013. LED and CMOS image sensor based optical wireless communication system for automotive applications. IEEE Photon. J. 5, 6801418. (doi:10.1109/JPHOT.2013.2277881)
24. Herrmann CS. 2001. Human EEG responses to 1–100 Hz flicker: resonance phenomena in visual cortex and their potential correlation to cognitive phenomena. Exp. Brain Res. 137, 346–353. (doi:10.1007/s002210100)
25. Roberts RD. 2013. Undersampled frequency shift ON-OFF keying (UFSOOK) for camera communications (CamCom). In 2013 22nd Wireless and Optical Communication Conf., Chongqing, China, 16–18 May 2013. Piscataway, NJ: IEEE.
26. Roberts RD. 2013. Space-time forward error correction for dimmable undersampled frequency shift ON-OFF keying camera communications (CamCom). In Proc. 5th IEEE Int. Conf. on Ubiquitous and Future Networks (ICUFN), Da Nang, Vietnam, 2–5 July 2013. Piscataway, NJ: IEEE.
27. Roberts RD. 2013. A MIMO protocol for camera communications (CamCom) using undersampled frequency shift ON-OFF keying (UFSOOK). In Proc. IEEE Global Commun. Conf. (GLOBECOM) Wkshps, Atlanta, GA, USA, 9–13 December 2013. Piscataway, NJ: IEEE.
28. Luo P, Ghassemlooy Z, Le Minh H, Tang X, Tsai HM. 2014. Undersampled phase shift on-off keying for camera communication. In 2014 6th Int. Conf. on Wireless Communications and Signal Processing (WCSP), Hefei, China, 23–25 October 2014. Piscataway, NJ: IEEE.
29. Luo P, Ghassemlooy Z, Le Minh H, Tsai HM, Tang X. 2015. Undersampled-PAM with subcarrier modulation for camera communications. In 2015 Opto-Electronics and Communications Conf. (OECC), Shanghai, China, 28 June–2 July 2015. Piscataway, NJ: IEEE.
30. Luo P, Zhang M, Ghassemlooy Z, Le H, Tsai HM, Tang X, Han D. 2016. Experimental demonstration of a 1024-QAM optical camera communication system. IEEE Photon. Technol. Lett. 28, 139–142. (doi:10.1109/LPT.2015.2487544)
  • 30.Luo P, Zhang M, Ghassemlooy Z, Le H, Tsai HM, Tang X, Han D. 2016. Experimental demonstration of a 1024-QAM optical camera communication system. IEEE Photon. Technol. Lett. 28, 139–142. ( 10.1109/LPT.2015.2487544) [DOI] [Google Scholar]
  • 31.Nguyen T, Islam A, Jang YM. 2017. Region-of-interest signaling vehicular system using optical camera communications. IEEE Photon. J. 9, 1–20. ( 10.1109/JPHOT.2016.2644960) [DOI] [Google Scholar]
  • 32.Nguyen V, Tang Y, Ashok A, Gruteser M, Dana K, Hu W, Wengrowski E, Mandayam N. 2016. High-rate flicker-free screen-camera communication with spatially adaptive embedding. In IEEE INFOCOM 2016-The 35th Annual IEEE Int. Conf. on Computer Communications, San Francisco, CA, USA, 10–14 April 2016. Piscataway, NJ: IEEE.
  • 33.Rajagopal N, Lazik P, Rowe A. 2014. Visual light landmarks for mobile devices. In IPSN-14 Proc. of the 13th Int. Symp. on Information Processing in Sensor Networks, Berlin, Germany, 15–17 April 2014. Piscataway, NJ: IEEE.
  • 34.Hong CH, Nguyen T, Le NT, Jang YM. 2017. Modulation and coding scheme (MCS) for indoor image sensor communication system. Wireless Pers. Commun. 93, 987–1003. ( 10.1007/s11277-017-3977-x) [DOI] [Google Scholar]
  • 35.Nguyen T, Le NT, Jang YM. 2015. Asynchronous scheme for optical camera communication-based infrastructure-to-vehicle communication. Int. J. Distrib. Sens. Netw. 11, 1–15. ( 10.1155/2015/908139) [DOI] [Google Scholar]
  • 36.Cahyadi WA, Kim YH, Chung YH, Ahn CJ. 2016. Mobile phone camera based indoor visible light communications with rotation compensation. IEEE Photon. J. 8, 1–8. ( 10.1109/JPHOT.2016.25456431) [DOI] [Google Scholar]
  • 37.Danakis C, Afgani M, Povey G, Underwood I, Haas H. 2012. Using a CMOS camera sensor for visible light communication. In 2012 IEEE Globecom Workshops, Anaheim, CA, USA, 3–7 December 2012. Piscataway, NJ: IEEE.
  • 38.Chen SH, Chow CW. 2015. Single-input multiple-output visible light optical wireless communications supporting quality of service. Electron. Lett. 51, 406–408. ( 10.1049/el.2014.3575) [DOI] [Google Scholar]
  • 39.Lee JW, Yang SH, Han SK. 2017. Optical pulse width modulated multilevel transmission in CIS-based VLC. IEEE Photon. Technol. Lett. 29, 1257–1260. ( 10.1109/LPT.2017.2716941) [DOI] [Google Scholar]
  • 40.Imai Y, Ebihara T, Mizutani K, Wakatsuki N. 2016. Performance evaluation of high-speed visible light communication combining low-speed image sensor, polygon mirror in an outdoor environment. In Proc. 8th IEEE Int. Conf. on Ubiquitous and Future Networks (ICUFN), Vienna, Austria, 5–8 July 2016. Piscataway, NJ: IEEE.
  • 41.Nguyen T, Hong CH, Le NT, Jang YM. 2015. High-speed asynchronous optical camera communication using led and rolling shutter camera. In 2015 Seventh Int. Conf. on Ubiquitous and Future Networks, Sapporo, Japan, 7–10 July 2015. Piscataway, NJ: IEEE.
  • 42.Hao J, Yang Y, Luo J. 2016. CeilingCast: Energy efficient and location-bound broadcast through LED-camera communication. In The 35th Annual IEEE Int. Conf. on Computer Communications (INFOCOM), San Francisco, CA, USA, 10–14 April 2016. Piscataway, NJ: IEEE.
  • 43.Tian P, Huang W, Xu Z. 2016. Design and experimental demonstration of a real-time 95 Kbps optical camera communication system. In 10th Int. Symp. on Communication Systems, Networks and Digital Signal Processing (CSNDSP), Prague, Czech Republic, 20–22 July 2016. Piscataway, NJ: IEEE.
  • 44.Huang W, Tian P, Xu Z. 2016. Design and implementation of a real-time CIM-MIMO optical camera communication system. Opt. Expr. 24, 24 567–24 579. ( 10.1364/OE.24.024567) [DOI] [PubMed] [Google Scholar]
  • 45.Nie X, Xu Z. 2018. A cloud-assisted 57.6 Kbps image sensor communication system using a smartphone camera. In 2018 11th Int. Symp. on Communication Systems, Networks & Digital Signal Processing (CSNDSP), Budapest, Hungary, 18–20 July, 2018. Piscataway, NJ: IEEE.
  • 46.Wang A, Peng C, Zhang O, Shen G, Zeng B. 2014. Inframe: Multiplexing full frame visible communication channel for humans and devices. In Proc. of the 13th ACM Workshop on Hot Topics in Networks, Los Angeles, CA, USA, 27–28 October 2014. New York, NY: ACM.
  • 47.Du W, Liando JC, Li M. 2016. Softlight: Adaptive visible light communication over screen-camera links. In IEEE INFOCOM 2016 The 35th Annual IEEE Int. Conf. on Computer Communications, San Francisco, CA, USA, 10–14 April 2016. Piscataway, NJ: IEEE.
  • 48.Cahyadi WA, Kim YH, Chung YH. 2016. Dual camera-based split shutter for high-rate and long-distance optical camera communications. Opt. Eng. 55, 110504 ( 10.1117/1.OE.55.11.110504) [DOI] [Google Scholar]
  • 49.Hu P, Pathak PH, Feng X, Fu H, Mohapatra P. 2015. Colorbars: Increasing data rate of LED-to-camera communication using color shift keying. In Proc. of the 11th ACM Conf. on Emerging Networking Experiments and Technologies, Heidelberg, Germany, 1–4 December 2015. New York, NY.
  • 50.Chen SH, Chow CW. 2014. Color-shift keying and code-division multiple access transmission for RGB-LED visible light communications using mobile phone camera. IEEE Photon. J. 6, 1–6. ( 10.1109/JPHOT.2014.2374612) [DOI] [Google Scholar]
  • 51.Liang K, Chow CW, Liu Y, Yeh CH. 2016. Thresholding schemes for visible light communications with CMOS camera using entropy-based algorithms. Opt. Expr. 24, 25 641–25 646. ( 10.1364/OE.24.025641) [DOI] [PubMed] [Google Scholar]
  • 52.Wang Q, Zhou M, Ren K, Lei T, Li J, Wang Z. 2015. Rain bar: Robust application-driven visual communication using color barcodes. In 2015 IEEE 35th Int. Conf. on Distributed Computing Systems, Columbus, OH, USA, 29 June–2 July 2015. Piscataway, NJ: IEEE.
  • 53.Song L, Luo P, Zhang M, Ghassemlooy Z, Han D, Minh HL. 2015. Undersampled digital PAM subcarrier modulation for optical camera communications. In Asia Communications and Photonics Conf. (ACP), Hong Kong, China, 19–23 November 2015. Washington, DC: OSA.
  • 54.Luo P, Zhang M, Ghassemlooy Z, Le Minh H, Tsai HM, Tang X, Png LC, Han D. 2015. Experimental demonstration of RGB LED-based optical camera communications. IEEE Photon. J. 7, 1–12. ( 10.1109/JPHOT.2015.2486680) [DOI] [Google Scholar]
  • 55.Ebihara K, Kamakura K, Yamazato T. 2014. Spatially-modulated space-time coding in visible light communications using 2 × 2 LED array. In 2014 IEEE Asia Pacific Conf. on Circuits and Systems (APCCAS), Ishigaki, Japan, 17–20 November 2014. Piscataway, NJ: IEEE.
  • 56.Ebihara K, Kamakura K, Yamazato T. 2015. Layered transmission of space-time coded signals for image-sensor-based visible light communications. J. Lightw. Technol. 33, 4193–4206. ( 10.1109/JLT.2015.2470091) [DOI] [Google Scholar]
  • 57.Wang X, Yang H, Song J. 2016. A VLC-based robust ID broadcasting indoor location system for mobile devices. In IEEE Int. Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), Nara, Japan, 1–3 June 2016. Piscataway, NJ: IEEE.
  • 58.Li T, An C, Xiao X, Campbell AT, Zhou X. 2015. Real-time screen-camera communication behind any scene. In Proc. of the 13th Annual Int. Conf. on Mobile Systems, Applications, and Services, Florence, Italy, 18–22 May 2015. New York, NY: ACM.
  • 59.Wang J, Huang W, Xu Z. 2017. Demonstration of a covert camera-screen communication system. In 2017 13th Int. Wireless Communications and Mobile Computing Conf. (IWCMC), Valencia, Spain, 26–30 June 2017. Piscataway, NJ: IEEE.
  • 60.Hranilovic S, Kschischang FR. 2006. A pixelated MIMO wireless optical communication system. IEEE J. Sel. Top. Quantum Electron. 12, 859–874. ( 10.1109/JSTQE.2006.876601) [DOI] [Google Scholar]
  • 61.Perli SD, Ahmed N, Katabi D. 2010. Pixnet: Interference-free wireless links using LCD-camera pairs. In Proc. of the 16th Annual Int. Conf. on Mobile Computing and Networking, Chicago, IL, USA, 20–24 September 2010. New York, NY: ACM.
  • 62.Ahn KI, Kwon JK. 2012. Color intensity modulation for multicolored visible light communications. IEEE Photon. Technol. Lett. 24, 2254–2257. ( 10.1109/LPT.2012.2226570) [DOI] [Google Scholar]
  • 63.Peng C, Xu Z. 2019. Vectorized color modulation for covert camera-screen communication. In 2019 IEEE Int. Conf. on Communications (ICC), Shanghai, China, 20–24 May 2019. Piscataway, NJ: IEEE.
  • 64.Mesleh R, Mehmood R, Elgala H, Haas H. 2010. Indoor MIMO optical wireless communication using spatial modulation. In 2010 IEEE Int. Conf. on Communications (ICC), Cape Town, South Africa, 23–27 May 2010. Piscataway, NJ: IEEE.
  • 65.Fath T, Haas H, Di Renzo M, Mesleh R. 2011. Spatial modulation applied to optical wireless communications in indoor LOS environments. In 2011 IEEE Globecom, Kathmandu, Nepal, 5–9 December 2011. Piscataway, NJ: IEEE.
  • 66.Mesleh R, Elgala H, Haas H. 2011. Optical spatial modulation. IEEE J. Opt. Commun. Netw. 3, 234–244. ( 10.1364/JOCN.3.000234) [DOI] [Google Scholar]
  • 67.Yamazato T. 2011. Visible light communications using led array and high-speed camera for its applications. IEICE ESS Fundam. Rev. 3, 2 (doi:10/cskrkp) [Google Scholar]
  • 68.Nishimoto S, Nagura T, Yamazato T, Yendo T, Fujii T, Okada H, Arai S. 2011. Overlay coding for road-to-vehicle visible light communication using led array and high-speed camera. In 2011 14th Int. IEEE Conf. on Intelligent Transportation Systems (ITSC), Washington, DC, USA, 5–7 October 2011. Piscataway, NJ: IEEE.
  • 69.Goto Y, Takai I, Yamazato T, Okada H, Fujii T, Kawahito S, Arai S, Yendo T, Kamakura K. 2016. A new automotive VLC system using optical communication image sensor. IEEE Photon. J. 8, 1–17. ( 10.1109/JPHOT.2016.2555582) [DOI] [Google Scholar]
  • 70.Nagura T, Yamazato T, Katayama M, Yendo T, Fujii T, Okada H. 2010. Improved decoding methods of visible light communication system for its using led array and high-speed camera. In 2010 IEEE 71st Vehicular Technology Conf., Taipei, Taiwan, 16–19 May 2010. Piscataway, NJ: IEEE.
  • 71.Okada H, Ishizaki T, Yamazato T, Yendo T, Fujii T. 2011. Erasure coding for road-to-vehicle visible light communication systems. In 2011 IEEE Consumer Communications and Networking Conf. (CCNC), Las Vegas, NV, USA, 9–12 January 2011. Piscataway, NJ: IEEE.
  • 72.Butala PM, Elgala H, Little TD. 2014. Performance of optical spatial modulation and spatial multiplexing with imaging receiver. In IEEE Wireless Communications and Networking Conf. (WCNC), Istanbul, Turkey, 6–9 April 2014. Piscataway, NJ: IEEE.
  • 73.Monteiro E, Hranilovic S. 2014. Design and implementation of color-shift keying for visible light communications. J. Lightw. Technol. 32, 2053–2060. ( 10.1109/JLT.2014.2314358) [DOI] [Google Scholar]
  • 74.Bai B, He Q, Xu Z, Fan Y. 2012. The color shift key modulation with non-uniform signaling for visible light communication. In 2012 1st IEEE Int. Conf. on Communications in China Workshops (ICCC), Bejing, China, 15–17 August 2012. Piscataway, NJ: IEEE.
  • 75.Monteiro E, Hranilovic S. 2012. Constellation design for color-shift keying using interior point methods. In 2012 IEEE Globecom Workshops, Anaheim, CA, USA, 3–7 December 2012. Piscataway, NJ: IEEE.
  • 76.Ji P, Tsai HM, Wang C, Liu F. 2014. Vehicular visible light communications with led taillight and rolling shutter camera. In 2014 IEEE 79th Vehicular Technology Conf. (VTC Spring), Seoul, South Korea, 18–21 May 2014. Piscataway, NJ: IEEE.
  • 77.Chow CW, Chen CY, Chen SH. 2015. Enhancement of signal performance in LED visible light communications using mobile phone camera. IEEE Photon. J. 7, 1–7. ( 10.1109/JPHOT.2015.2476757) [DOI] [Google Scholar]
  • 78.Aoyama H, Oshima M. 2015. Line scan sampling for visible light communication: theory and practice. In 2015 IEEE Int. Conf. on Communications (ICC), London, UK, 8–12 June 2015. Piscataway, NJ: IEEE.
  • 79.Chow CW, Chen CY, Chen SH. 2015. Visible light communication using mobile-phone camera with data rate higher than frame rate. Opt. Expr. 23, 26 080–26 085. ( 10.1364/OE.23.026080) [DOI] [PubMed] [Google Scholar]
  • 80.Liu Y, Liang K, Chen HY, Wei LY, Hsu CW, Chow CW, Yeh CH. 2016. Light encryption scheme using light-emitting diode and camera image sensor. IEEE Photon. J. 8, 1–7. ( 10.1109/JPHOT.2016.2519287) [DOI] [Google Scholar]
  • 81.Liu Y, Chen HY, Liang K, Hsu CW, Chow CW, Yeh CH. 2016. Visible light communication using receivers of camera image sensor and solar cell. IEEE Photon. J. 8, 1–7. ( 10.1109/JPHOT.2015.2507364) [DOI] [Google Scholar]
  • 82.Chow CW, Liu YC, Shiu RJ, Yeh CH. 2018. Adaptive thresholding scheme for demodulation of rolling-shutter images obtained in CMOS image sensor based visible light communications. IEEE Photon. J. 10, 1–6. ( 10.1109/JPHOT.2018.2876798) [DOI] [Google Scholar]
  • 83.Chen HW, Wen SS, Wang XL, Liang MZ, Li MY, Li QC, Liu Y. 2019. Color-shift keying for optical camera communication using a rolling shutter mode. IEEE Photon. J. 11, 1–8. ( 10.1109/JPHOT.2019.2898909) [DOI] [Google Scholar]
  • 84.Nguyen T, Hong CH, Le NT, Jang YM. 2015. High-speed asynchronous Optical Camera Communication using LED and rolling shutter camera. In 7th Int. Conf. on Ubiquitous and Future Networks, Sapporo, Japan, 7–10 July 2015. Piscataway, NJ: IEEE.
  • 85.Wang WC, Chow CW, Chen CW, Hsieh HC, Chen YT. 2017. Beacon jointed packet reconstruction scheme for mobile-phone based visible light communications using rolling shutter. IEEE Photon. J. 9, 1–6. ( 10.1109/JPHOT.2017.2762460) [DOI] [Google Scholar]
  • 86.Nguyen T, Islam A, Hossan T, Jang YM. 2017. Current status and performance analysis of optical camera communication technologies for 5G networks. IEEE Access 5, 4574–4594. ( 10.1109/ACCESS.2017.2681110) [DOI] [Google Scholar]
  • 87.Nguyen H, Pham TL, Nguyen H, Jang YM. 2019. Trade-off communication distance and data rate of rolling shutter OCC. In 11th Int. Conf. on Ubiquitous and Future Networks (ICUFN), Zagreb, Croatia, Croatia, 2–5 July 2019. Piscataway, NJ: IEEE.
  • 88.Premachandra HCN, Yendo T, Tehrani MP, Yamazato T, Okada H, Fujii T, Tanimoto M. 2010. High-speed-camera image processing based led traffic light detection for road-to-vehicle visible light communication. In 2010 IEEE Intelligent Vehicles Symp. San Diego, CA, USA, 16 August 2010. Piscataway, NJ: IEEE.
  • 89.Ishii I, Taniguchi T, Yamamoto K, Takaki T. 2012. High-frame-rate optical flow system. IEEE Trans. Circuits Syst. Video Technol. 22, 105–112. ( 10.1109/TCSVT.2011.2158340) [DOI] [Google Scholar]
  • 90.Arai S, Mase S, Yamazato T, Endo T, Fujii T, Tanimoto M, Kidono K, Kimura Y, Ninomiya Y. 2007. Experimental on hierarchical transmission scheme for visible light communication using led traffic light and high-speed camera. In 2007 IEEE 66th Vehicular Technology Conf., Baltimore, MD, 30 September–3 October 2007. Piscataway, NJ: IEEE.
  • 91.Nagura T, Yamazato T, Katayama M, Yendo T, Fujii T, Okada H. 2010. Tracking an led array transmitter for visible light communications in the driving situation. In 2010 7th Int. Symp. on Wireless Communication Systems, York, UK, 19–22 September 2010. Piscataway, NJ: IEEE.
  • 92.Iwasaki S, Premachandra C, Endo T, Fujii T, Tanimoto M, Kimura Y. 2008. Visible light road-to-vehicle communication using high-speed camera. In 2008 IEEE Intelligent Vehicles Symp., Eindhoven, The Netherlands, 4–6 June 2008. Piscataway, NJ: IEEE.
  • 93.Takai I, Harada T, Andoh M, Yasutomi K, Kagawa K, Kawahito S. 2014. Optical vehicle-to-vehicle communication system using led transmitter and camera receiver. IEEE Photon. J. 6, 1–14. ( 10.1109/JPHOT.2014.2352620) [DOI] [Google Scholar]
  • 94.Hu W, Gu H, Pu Q. 2013. Lightsync: Unsynchronized visual communication over screen-camera links. In Proc. of the 19th Annual Int. Conf. on Mobile Computing & Networking, Miami, FL, USA, 30 September–4 October 2013. New York, NY: ACM.
  • 95.Shiraki Y, Sato TG, Kamamoto Y, Moriya T. 2017. Flexible synchronization in optical camera communication with on-off keying. In 2017 IEEE Globecom Workshops (GC Wkshps), Singapore, 4–8 December 2017. Piscataway, NJ: IEEE.
  • 96.Kwon TH, Kim JE, Kim KD. 2017. Synchronization method for color-independent visual-MIMO communication. In 2017 IEEE Int. Conf. on Consumer Electronics (ICCE), Las Vegas, NV, USA, 8–10 January 2017. Piscataway, NJ: IEEE.
  • 97.Kwon TH, Kim JE, Kim KD. 2018. Time-sharing-based synchronization and performance evaluation of color-independent visual-MIMO communication. Sensors 18, 1553 ( 10.3390/s18051553) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 98.ISO I. 2006. IEC 18004: 2006 information technology–automatic identification and data capture techniques–QR code 2005 bar code symbology specification. See https://www.sis.se/api/document/preview/911067/.
  • 99.Hao T, Zhou R, Xing G. 2012. Cobra: color barcode streaming for smartphone systems. In Proc. of the 10th Int. Conf. on Mobile Systems, Applications, and Services, Low Wood Bay, Lake District, UK, 25–29 June 2012. New York, NY: ACM.
  • 100.Wang A, Ma S, Hu C, Huai J, Peng C, Shen G. 2014. Enhancing reliability to boost the throughput over screen-camera links. In Proc. of the 20th Annual Int. Conf. on Mobile Computing and Networking, Maui, HI, USA, 7–11 September 2014. New York, NY: ACM.
  • 101.Zhang B, Ren K, Xing G, Fu X, Wang C. 2016. Sbvlc: Secure barcode-based visible light communication for smartphones. IEEE Trans. Mobile Comput. 15, 432–446. ( 10.1109/TMC.2015.2413791) [DOI] [Google Scholar]
  • 102.Wang A, Li Z, Peng C, Shen G, Fang G, Zeng B. 2015. Inframe++: Achieve simultaneous screen-human viewing and hidden screen-camera communication. In Proc. of the 13th Annual Int. Conf. on Mobile Systems, Applications, and Services, Florence, Italy, 18–22 May 2015. New York, NY: ACM.
  • 103.Yuan W, Dana K, Ashok A, Gruteser M, Mandayam N. 2012. Dynamic and invisible messaging for visual MIMO. In 2012 IEEE Workshop on the Applications of Computer Vision (WACV), Breckenridge, CO, USA, 9–11 January 2012. Piscataway, NJ: IEEE.
  • 104.Li T, An C, Campbell AT, Zhou X. 2015. Hilight: Hiding bits in pixel translucency changes. ACM SIGMOBILE Mobile Comput. Commun. Rev. 18, 62–70. ( 10.1145/2721896.2721910) [DOI] [Google Scholar]
  • 105.Wengrowski E, Dana KJ, Gruteser M, Mandayam N. 2017. Reading between the pixels: photographic steganography for camera display messaging. In 2017 IEEE Int. Conf. on Computational Photography (ICCP), Stanford, CA, USA, 12–14 May 2017. Piscataway, NJ: IEEE.
  • 106.Sato S, Okada H, Kobayashi K, Yamazato T, Katayama M. 2016. Visible light communication systems using blue color difference modulation for digital signage. In 2016 IEEE 27th Annual Int. Symp. on Personal, Indoor, and Mobile Radio Communications (PIMRC), Valencia, Spain, 4–8 September 2016. Piscataway, NJ: IEEE.
  • 107.Kays R, Brauers C, Klein J. 2017. Modulation concepts for high-rate display-camera data transmission. In 2017 IEEE Int. Conf. on Communications (ICC), Paris, France, 21–25 May 2017. Piscataway, NJ: IEEE.
  • 108.Le NT, Jang YM. 2017. Invisible embedded techniques for optical camera communications. In 2017 Int. Conf. on Information Networking (ICOIN), Da Nang, Vietnam, 11–13 January, 2017. Piscataway, NJ: IEEE.
  • 109.Xiang Y, Zhang M, Kavehrad M, Chowdhury MIS, Liu M, Wu J, Tang X. 2014. Human shadowing effect on indoor visible light communications channel characteristics. Opt. Eng. 8, 086113 ( 10.1117/1.OE.53.8.086113) [DOI] [Google Scholar]
  • 110.Farahneh H, Mekhiel C, Khalifeh A, Farjow W, Fernando X. 2016. Shadowing effects on visible light communication channels. In 2016 IEEE Canadian Con. on Electrical and Computer Engineering (CCECE), Vancouver, BC, Canada, 15–18 May 2016. Piscataway, NJ: IEEE.
  • 111.Dong Z, Shang T, Gao Y, Li Q. 2018. Study on VLC channel modeling under random shadowing. IEEE Photon. J. 9, 987–1003. ( 10.1109/JPHOT.2017.2775664) [DOI] [Google Scholar]
  • 112.Wang WC, Chow CW, Wei LY, Liu Y, Yeh CH. 2017. Long distance non-line-of-sight (NLOS) visible light signal detection based on rolling-shutter-patterning of mobile-phone camera. Opt. Expr. 25, 10 103–10 108. ( 10.1364/OE.25.010103) [DOI] [PubMed] [Google Scholar]
  • 113.Hassan NB, Ghassemlooy Z, Zvanovec S, Luo P, Le-Minh H. 2018. Non-line-of-sight 2 × n indoor optical camera communications. Appl. Opt. 57, B144–B149. ( 10.1364/AO.57.00B144) [DOI] [PubMed] [Google Scholar]
  • 114.Hassan NB, Ghassemlooy Z, Zvanovec S, Biagi M, Vegni AM, Zhang M, Luo P. 2019. Non-line-of-sight MIMO space-time division multiplexing visible light optical camera communications. J. Lightw. Technol. 37, 2409–2417. ( 10.1109/JLT.2019.2906097) [DOI] [Google Scholar]
  • 115.Hassan NB, Ghassemlooy Z, Zvanovec S, Biagi M, Vegni AM, Zhang M, Huang Y. 2019. Interference cancellation in MIMO NLOS optical-camera-communication-based intelligent transport systems. Appl. Opt. 58, 9384–9391. ( 10.1364/AO.58.009384) [DOI] [PubMed] [Google Scholar]
  • 116.Liu W, Xu Z. 2018. Predicted and experimental performance of a long distance non-line of sight image sensor communication system. In 2018 IEEE Int. Conf. on Communications Workshops (ICC Workshops), Kansas City, MO, USA, 20–24 May 2018. Piscataway, NJ: IEEE.
  • 117.Huang W, Xu Z. 2017. Characteristics and performance of image sensor communication. IEEE Photon. J. 9, 1–19. ( 10.1109/JPHOT.2017.2681660) [DOI] [Google Scholar]
  • 118.Le NT, Hossain MA, Hong CH, Nguyen T, Le T, Jang YM. 2015. Effect of blur Gaussian in MIMO optical camera communications. In 2015 Int. Conf. on Information and Communication Technology Convergence (ICTC), Jeju, South Korea, 28–30 October 2015. Piscataway, NJ: IEEE.
