Abstract
Nocturnal and crepuscular fast-eyed insects often exploit multiple optical channels and temporal summation for fast and low-light imaging. Here, we report a high-speed and high-sensitivity microlens array camera (HS-MAC), inspired by the multiple optical channels and temporal summation of insect vision. HS-MAC features cross-talk–free offset microlens arrays on a single rolling shutter CMOS image sensor and performs high-speed, high-sensitivity imaging through channel fragmentation, temporal summation, and compressive frame reconstruction. The experimental results demonstrate that HS-MAC accurately measures the speed of a color disk rotating at 1950 rpm, recording fast sequences at 9120 fps with a low noise equivalent irradiance (0.43 μW/cm2). In addition, HS-MAC visualizes the necking pinch-off of a pool fire flame in dim light below one thousandth of a lux. This compact high-speed, low-light camera offers a distinct route to high-speed and low-light imaging in mobile, surveillance, and biomedical applications.
A biologically inspired low-light camera operating at 9120 frames per second has been developed.
INTRODUCTION
The vision of fast-eyed insects such as Drosophila and Odonata plays an essential role in evading threats, foraging, and navigating under assorted light conditions (1–7). Insect eyes distinctly use multiple optical channels called ommatidia, unlike single-chambered human eyes (8, 9). Visual stimuli from the individual optical channels are processed separately and in parallel via the lamina and medulla in the optic lobe. This parallel processing in the corresponding sensing neurons effectively increases the update rate for motion detection in the lobula complex (10, 11). Each small facet lens of the ommatidia nonetheless hinders the capture of sufficient photons (12). In particular, some nocturnal and crepuscular insects, such as Bombus, use temporal summation of visual signals to increase the light sensitivity of individual optical channels (13–16). They integrate visual signals over time to improve visual reliability in darkness at the cost of motion artifacts (17). Both unique features help fast-eyed insects thrive with high photon sensitivity in low-light conditions (18).
Compact cameras mimicking insect vision exhibit wide field-of-view (FOV), low aberration, high temporal resolution, and multifunctional imaging (19–22). Artificial compound eye cameras feature spherical arrangements of multiple optical channels for notably wide FOV (23–25). Ultrathin microlens array cameras (MACs) inspired by the vision of Xenos peckii demonstrate high-resolution and high-sensitivity imaging (26–28). Distinct optical features within the multiple channels of insect vision (21, 22) have also inspired polarization (29) and multispectral cameras (30–33). Multiple apertures of insect eyes further inspire high-speed cameras that sequentially acquire subframes from a single frame captured with a rolling shutter complementary metal-oxide semiconductor (CMOS) image sensor (28, 30, 34–36). This approach offers advantages over traditional high-speed cameras with bulky bodies or complex circuits (37–42) in machine vision, as well as in neuroscience and health care applications (43–45). However, these cameras face technical restrictions on increasing the frame rate by orders of magnitude and still suffer from rolling shutter distortions in high-speed imaging. They also struggle with photon capture in low-light conditions because of their small apertures. These features pose challenges for achieving high-speed and high-sensitivity imaging.
Here, we report a high-speed and high-sensitivity MAC (HS-MAC) inspired by insect vision. A fast-eyed insect detects swiftly moving objects at a distance in dim light, wherein reflected light sequentially activates photoreceptors in discrete optical channels (Fig. 1A). A series of visual stimuli collected from each channel undergoes parallel processing and temporal summation while traversing the lamina and medulla of the optic lobe (46). Parallel processing allows a high update rate for high-speed imaging, and temporal summation further enhances light sensitivity in low-light conditions. Similar to insect vision, HS-MAC performs high-speed and high-sensitivity imaging by combining channel fragmentation, temporal summation, and a compressive frame reconstruction process (Fig. 1B). Offset microlens arrays (oMLAs) packaged with a single rolling shutter CMOS image sensor enable the high-speed imaging. The oMLAs in the optical channels project swiftly moving objects onto offset patterned channels with a low signal-to-noise ratio (SNR) in dim conditions. Sequential row-wise exposure of the rolling shutter image sensor divides the individual channels into discrete fragmented arrays, which we define as channel fragmentation. The fragmented arrays preserve spatial information of an ultrashort event well beyond the conventional frame rate. Each fragmented array is then temporally summed during the period in which it overlaps with adjacent arrays, yielding high-SNR fragmented channel images owing to the superposition of temporal information. The fragmented channel images (a, b, and c) on a single fragmented array image contain different spatial information about the swiftly moving objects because of the oMLAs and are stitched together to form a single frame. The compressive image reconstruction process further deblurs the frames by segmenting the overlapped information. Microlenses with a small offset allow precise channel fragmentation and increase the frame rate by more than three orders of magnitude within the limited sensor bandwidth. Furthermore, temporal summation and compressive frame reconstruction enhance camera sensitivity in low light by overlapping temporal information, breaking the trade-off between SNR and frame rate.
Fig. 1. Schematic illustration of the high-speed and high-sensitivity MAC (HS-MAC).
(A) Vision in a fast-eyed insect. Reflected light from swiftly moving objects sequentially stimulates the photoreceptors along the individual optical channels called ommatidia, of which the visual signals are processed separately and in parallel via the lamina and medulla. Each neural response is temporally summed to enhance the visual signals. The parallel processing and temporal summation allow fast and low-light imaging in dim light. (B) High-speed and high-sensitivity MAC (HS-MAC). HS-MAC incorporates channel fragmentation, temporal summation, and compressive frame reconstruction. The oMLAs project swiftly moving objects onto a rolling shutter CMOS image sensor via the corresponding optical channels. Row-wise exposure horizontally fragments the channels into fragmented arrays. The temporal summation of each fragmented array enhances the SNR in low-light conditions, resulting in overlapping temporal information in adjacent frames. The frame components of a single fragmented array image are stitched into a single blurred frame, which is subsequently deblurred by compressive image reconstruction.
RESULTS
Fully packaged HS-MAC and comparison to offset-free MACs
HS-MAC was fully packaged with oMLAs (offset distance: 15.5 μm) and a single rolling shutter CMOS image sensor (IMX477, SONY Corp.) in a compact form factor (1.53 mm × 8.35 mm × 10.5 mm, 0.3 g) (Fig. 2A). oMLAs with various offset distances were also microfabricated and integrated into HS-MACs to capture array images (Fig. 2B). Each microlens was designed and fabricated to prevent image overlap (26). The image size decreases as the offset distance decreases, allowing precise channel fragmentation and enhancing the frame rate. The frame component stitching diagram, including the design parameters of the oMLAs and HS-MAC, is described in fig. S1. The performance of HS-MAC was compared with that of an offset-free MAC. No distortion correction or deblurring was applied, apart from minimal image processing for frame reconstruction, so that HS-MAC and MAC could be compared on equal terms; the required computational resources were not noticeably different. We also defined the working distance of the camera to capture images without disparity between microlenses (fig. S2), thereby disregarding depth information, and set both MAC and HS-MAC to the same working distance. The frame rates of HS-MAC and MAC, both with unit cell sizes of 750, 493, and 284 μm, are plotted in Fig. 2C. HS-MAC shows a markedly higher frame rate than MAC as the unit cell size decreases. Notably, HS-MAC achieves 9120 fps, exceeding MAC's frame rate (480 fps) by a factor of 19, and its frame rate reaches 91,200 fps when the offset distance equals the size of a single pixel. Both HS-MAC and MAC use rolling shutter image sensors, which often cause rolling shutter distortion; however, HS-MAC exhibits significantly lower distortion than MAC (Fig. 2D). A free-falling ball with a diameter of 78 mm was captured by HS-MAC and MAC at the same shutter speed, where the exposure time is the reciprocal of MAC's frame rate. The distortion was quantified as the ratio of the major to the minor axis of the captured ball. The major-to-minor axis ratio of HS-MAC is closer to the ideal value of one than that of MAC, and the ratio approaches one as the unit cell size decreases. The mean absolute percentage error of HS-MAC is 4% at a unit cell size of 284 μm, a decrease of more than 50% relative to the error of MAC (10%). This result implies that HS-MAC renders the ball as a nearly spherical shape, because HS-MAC captures a single frame in a shorter time than MAC by precisely fragmenting the image sensor using the offset. The image quality was also compared between HS-MAC and MAC. Still images of peppers were captured with both cameras at a unit cell size of 284 μm and a constant exposure time (2.08 ms) (Fig. 2E). The Pearson correlation coefficient (PCC) between the captured images is 0.99, indicating a high degree of similarity. The imaging performance of HS-MAC is also equivalent to that of MAC in terms of the measured modulation transfer function (MTF), as shown in Fig. 2F. HS-MAC thus demonstrates high-speed imaging with low distortion and image quality comparable to that of MAC (see figs. S3 and S4 and table S1 for detailed specifications of HS-MAC).
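For a back-of-the-envelope check of these frame rates, the readout geometry can be related to the microlens offset. The relation below is our inference from the reported numbers, not a formula stated in the text; it assumes a 30-fps, 3040-row full-resolution readout and a 1.55-μm pixel pitch for the IMX477, with one new fragmented array becoming available each time the rolling shutter advances by the offset distance Δ:

$$f_{\mathrm{HS\text{-}MAC}} \approx f_{\mathrm{sensor}} \times \frac{N_{\mathrm{rows}}\, p}{\Delta} = 30\ \mathrm{fps} \times \frac{3040 \times 1.55\ \mu\mathrm{m}}{15.5\ \mu\mathrm{m}} = 9120\ \mathrm{fps}$$

$$\text{and, for a one-pixel offset } (\Delta = p):\quad 30\ \mathrm{fps} \times 3040 = 91{,}200\ \mathrm{fps}$$

Under the same assumptions, the offset-free MAC is limited by the unit cell height rather than the offset, which is consistent with its much lower frame rate of 480 fps.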
Fig. 2. Fully packaged HS-MAC and comparison to an offset-free MAC.
(A) HS-MAC packaged with oMLAs having a 15.5-μm offset distance. An enlarged image of a portion of the oMLAs is displayed on the left side. White dashed lines indicate grids to identify the offset. (B) Microfabricated oMLAs and captured array images. Each scale bar provides a consistent reference across all images regardless of the offset. (C) The frame rates of HS-MAC and the offset-free MAC as a function of the reciprocal of the unit cell size. The solid line represents the calculated frame rate. HS-MAC exhibits a higher frame rate than MAC at equivalent unit cell sizes and achieves a remarkable frame rate of up to 91,200 fps when the offset distance is equivalent to the size of one pixel (corresponding unit cell size: 98.4 μm). (D) Photographs of a free-falling ball taken with MAC and HS-MAC (unit cell size = 284 μm). Both cameras exposed the scene for the reciprocal of MAC's frame rate. The shape of the ball appears vertically elongated due to rolling shutter distortion. The ratio of the major axis to the minor axis of the ball was analyzed for different unit cell sizes. Unlike MAC, HS-MAC maintains a nearly spherical shape owing to the small distortion. (E) Still pepper images captured by MAC and HS-MAC (unit cell size of 284 μm; MAC = 480 fps, HS-MAC = 9120 fps). The MAC image was obtained by averaging the channels along the same horizontal axis. The similarity between MAC and HS-MAC is consistently high (PCC = 0.99). (F) Comparison of the MTF between MAC and HS-MAC. HS-MAC exhibits high-speed imaging with low distortion and an MTF comparable to that of MAC. a.u., arbitrary unit.
HS-MAC–driven low-light imaging
HS-MAC uses temporal summation, which substantially improves light sensitivity at a constant frame rate. Conventional single-lens high-speed cameras restrict the maximum exposure time to the reciprocal of the frame rate, as the photodetector must stop the exposure before capturing the subsequent frame. In contrast, HS-MAC breaks this trade-off between frame rate and exposure time and thus enhances light sensitivity by overlapping the exposure time across subsequent frames for temporal summation. The temporal summation factor (TSF) is defined as the ratio of HS-MAC's exposure time to the maximum exposure time of a single-lens camera at the same frame rate (Fig. 3A). A compressive image reconstruction algorithm was used to restore the deblurred frames f from the blurred temporally summed (blurred TS) frames g, where each frame has size m × n and the numbers of deblurred and blurred TS frames are d and c, respectively. g is expressed by a linear forward model as follows
$$\mathbf{g} = \mathbf{H}\,\mathbf{f} \qquad (1)$$
where H is the sensing matrix of size mnc × mnd (Fig. 3B) (38). For more information about the sensing matrix, see fig. S5. Recovering f directly from g is challenging because the problem is ill-conditioned and unstable (47). Instead, an estimate of f is obtained by solving the following optimization problem:
$$\hat{\mathbf{f}} = \arg\min_{\mathbf{f}}\; \frac{1}{2}\left\|\mathbf{g} - \mathbf{H}\mathbf{f}\right\|_2^2 + \lambda R(\mathbf{f}) \qquad (2)$$
where R is the regularization function and λ is the regularization parameter. An L1-norm–based total variation (TV) was used for regularization (48–50). A constrained TV deblurring model solved with the alternating direction method of multipliers was used to address the optimization problem of Eq. 2, where the pixel values are constrained to lie within a given dynamic range [fmin, fmax] (47, 51). Introducing the discrete difference operator D for the TV term and the indicator function ιΩ of the feasible set Ω = [fmin, fmax]^mnd, the optimization problem is formulated as follows:
$$\min_{\mathbf{f},\,\boldsymbol{\theta},\,\mathbf{y}}\; \frac{1}{2}\left\|\mathbf{H}\mathbf{f} - \mathbf{g}\right\|_2^2 + \lambda\left\|\boldsymbol{\theta}\right\|_1 + \iota_{\Omega}(\mathbf{y}) \quad \text{subject to}\quad \boldsymbol{\theta} = \mathbf{D}\mathbf{f},\;\; \mathbf{y} = \mathbf{f} \qquad (3)$$
where θ and y are auxiliary variables. The augmented Lagrangian function of Eq. 3 is defined as follows
$$\mathcal{L}(\mathbf{f},\boldsymbol{\theta},\mathbf{y},\boldsymbol{\mu}_1,\boldsymbol{\mu}_2) = \frac{1}{2}\left\|\mathbf{H}\mathbf{f}-\mathbf{g}\right\|_2^2 + \lambda\left\|\boldsymbol{\theta}\right\|_1 + \iota_{\Omega}(\mathbf{y}) + \boldsymbol{\mu}_1^{T}(\mathbf{D}\mathbf{f}-\boldsymbol{\theta}) + \frac{\rho_1}{2}\left\|\mathbf{D}\mathbf{f}-\boldsymbol{\theta}\right\|_2^2 + \boldsymbol{\mu}_2^{T}(\mathbf{f}-\mathbf{y}) + \frac{\rho_2}{2}\left\|\mathbf{f}-\mathbf{y}\right\|_2^2 \qquad (4)$$
where the superscript T denotes the vector transpose, μ1 and μ2 are the Lagrange multipliers, and ρ1 and ρ2 are the penalty parameters. The detailed algorithm is documented in table S2 and fig. S6.
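To make the forward model and the role of the deblurring step concrete, the sketch below builds a toy temporal-summation operator H and recovers latent frames with a proximal-gradient TV solver. It is a simplified stand-in for the constrained ADMM algorithm of table S2, not the paper's implementation; the function and parameter names (build_ts_operator, deblur_tv, lam) are our own illustrative choices, and the TV proximal step is approximated with scikit-image's Chambolle denoiser.

```python
import numpy as np
from skimage.restoration import denoise_tv_chambolle

def build_ts_operator(num_deblurred, tsf):
    """Toy sensing matrix H acting on the frame axis: each blurred TS frame is
    the sum of tsf consecutive latent (deblurred) frames, mirroring the
    red/gray pattern of Fig. 3B at the level of whole frames."""
    num_blurred = num_deblurred - tsf + 1
    H = np.zeros((num_blurred, num_deblurred))
    for k in range(num_blurred):
        H[k, k:k + tsf] = 1.0
    return H

def forward(H, f):
    """Apply H along the frame axis: f (d, m, n) -> g (c, m, n)."""
    return np.tensordot(H, f, axes=(1, 0))

def deblur_tv(g, H, n_iter=200, lam=0.02):
    """Proximal-gradient TV deblurring (simplified stand-in for the ADMM
    algorithm): minimize 0.5*||g - Hf||^2 + lam*TV(f), clipping the pixel
    values to the dynamic range [0, 1] after every step."""
    d = H.shape[1]
    f = np.repeat(g.mean(axis=0, keepdims=True) / H.sum(axis=1).mean(), d, axis=0)
    step = 1.0 / np.linalg.norm(H, 2) ** 2        # 1 / Lipschitz constant of the data term
    for _ in range(n_iter):
        grad = np.tensordot(H.T, forward(H, f) - g, axes=(1, 0))
        f = denoise_tv_chambolle(f - step * grad, weight=lam * step)
        f = np.clip(f, 0.0, 1.0)                  # dynamic-range constraint
    return f

# toy usage: a bright square sweeping across 40 latent frames, summed with TSF = 5
f_true = np.zeros((40, 32, 32))
for k in range(40):
    f_true[k, 12:20, k % 24:k % 24 + 8] = 1.0
H = build_ts_operator(num_deblurred=40, tsf=5)
g = forward(H, f_true)
f_hat = deblur_tv(g, H)
print("mean absolute reconstruction error:", np.mean(np.abs(f_hat - f_true)))
```

The proximal-gradient form is chosen here only for brevity; the cited constrained ADMM approach (47) splits the TV term and the range constraint with the auxiliary variables θ and y exactly as in Eqs. 3 and 4.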
Fig. 3. Low-light imaging with temporal summation.
(A) Temporal summation for HS-MAC. TSF is the ratio of the exposure time of HS-MAC to the maximum exposure time of a single-lens camera at the same frame rate. HS-MAC overlaps the exposure time across subsequent frames for temporal summation, while a single-lens camera halts the exposure every frame. (B) Temporal summation sampling diagram. Deblurred frames f were calculated by compressive image reconstruction. g and f are the blurred temporally summed (blurred TS) and deblurred frames, each concatenated into a vector, where the frame size is m × n and the numbers of frames are c and d, respectively. H is the sensing matrix, where red and gray boxes indicate values of 1 and 0. (C) NEI of HS-MAC operating at 3040 and 9120 fps. A point light source was placed in a dark chamber to calculate the NEI. The NEI and TSF are inversely proportional. Both cameras show the minimum NEI of 0.43 μW/cm2 at the maximum exposure time of the CMOS image sensor at 30 fps. (D) Captured blurred TS (top) and deblurred (bottom) frames of a disk rotating at 1953 ± 3 rpm, taken by HS-MAC at 9120 fps with various TSFs. The brightness of all images was normalized for comparison. (E) Comparative full-reference image similarities before and after deblurring, depending on TSF. A stationary disk captured at the maximum exposure level served as the reference image. The similarity was calculated by the PCC. The blurred TS frame shows the maximum similarity at TSF = 10 (PCC = 0.93) and decreases noticeably above TSF = 10 due to the motion artifact resulting from excessive temporal summation. In contrast, the deblurred frames maintain relatively high and consistent similarity even beyond TSF = 10.
The low-light imaging performance of HS-MAC was analyzed by measuring the noise equivalent irradiance (NEI) (Fig. 3C). A small white light-emitting diode serving as a point light source was placed at a distance from the HS-MAC within a dark chamber, and the oMLAs project dotted patterns onto the image sensor. The NEI was defined as the irradiance at an SNR of 1, obtained by gradually reducing the intensity of the light source. The SNR was calculated from image intensities in the bright and dark regions. The irradiance was integrated over a spectral range of 245 to 812 nm, which extends beyond the emission band of the point source to prevent underestimation of the NEI. The NEIs of two HS-MACs operating at 3040 and 9120 fps decrease linearly with increasing TSF on a logarithmic scale, indicating an inversely proportional relationship between the TSF and the detectable light intensity. Both cameras achieve the minimum NEI of 0.43 μW/cm2 at the maximum exposure time of the 30-fps CMOS image sensor, i.e., 1/30 s, whereas the NEI without temporal summation was 9.68 μW/cm2 at 3040 fps and 16.99 μW/cm2 at 9120 fps. In other words, temporal summation lowered the NEI by factors of 23 and 40 at the respective frame rates, highlighting the outstanding sensitivity of HS-MAC.
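The SNR and NEI estimation can be made concrete with a short sketch. The paper states only that the SNR is computed from intensities in the bright and dark regions and that the NEI is the irradiance at SNR = 1; the specific estimator below (spot contrast over dark-region noise) and the log-log interpolation are assumptions, and the sweep values are hypothetical.

```python
import numpy as np

def snr_from_regions(frame, bright_mask, dark_mask):
    """One common convention: contrast of the projected spot over the
    background noise. The exact estimator used in the paper is not given,
    so this formula is an assumption."""
    signal = frame[bright_mask].mean() - frame[dark_mask].mean()
    noise = frame[dark_mask].std()
    return signal / noise

def nei_from_sweep(irradiances, snrs):
    """Interpolate the irradiance at which the SNR crosses 1 (log-log fit)."""
    logE, logS = np.log10(irradiances), np.log10(snrs)
    slope, intercept = np.polyfit(logS, logE, 1)
    return 10 ** intercept            # log10(E) evaluated at log10(SNR) = 0

# hypothetical sweep: irradiance in uW/cm^2 vs. measured SNR
E = np.array([0.3, 1.0, 3.0, 10.0])
S = np.array([0.7, 2.3, 7.0, 23.0])
print(f"estimated NEI ~ {nei_from_sweep(E, S):.2f} uW/cm^2")
```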
A rotating disk was captured by HS-MAC at 9120 fps with increasing TSF to validate the deblurring algorithm (Fig. 3D). The intensities of all captured images were normalized for comparison. Increasing the TSF gradually sharpens the disk's shape in the TS frames by capturing more light, whereas the frame without temporal summation (TSF = 1) is highly susceptible to noise because of insufficient photons. The TS frames nonetheless become blurred as the TSF increases, owing to the excessive exposure time. In contrast, the image of a bee printed on the disk remains distinguishable after deblurring by compressive image reconstruction, even at TSF = 100. The full-reference image similarity of the blurred TS and deblurred frames was assessed by the PCC (Fig. 3E) (52). A stationary disk was used as the reference image. The blurred TS frame exhibits the highest image similarity (PCC = 0.93) at TSF = 10, which notably decreases above TSF = 10 due to motion artifacts caused by the excessive temporal summation. In contrast, the deblurred frame maintains relatively high image similarity (PCC = 0.96) even above TSF = 10 and remains relatively constant at higher TSF values. These experimental results demonstrate that HS-MAC effectively captures detailed information about objects under dim light by using temporal summation and compressive frame reconstruction.
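The full-reference similarity metric used here is simply the Pearson correlation between a reference frame and a test frame; a minimal computation is sketched below with placeholder frames standing in for the stationary-disk reference and a deblurred frame.

```python
import numpy as np

def pcc(reference, test):
    """Pearson correlation coefficient between two frames of equal shape."""
    return np.corrcoef(reference.ravel(), test.ravel())[0, 1]

# placeholder frames (not measured data): a reference and a slightly corrupted copy
rng = np.random.default_rng(1)
ref = rng.random((64, 64))
test = ref + 0.05 * rng.normal(size=ref.shape)
print(f"PCC = {pcc(ref, test):.2f}")
```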
Experimental demonstration of high-speed and low-light imaging
HS-MAC captured a rotating disk at 9120 fps with TSF = 5 (Fig. 4, A to C). The disk, 135 mm in diameter with segments in six different colors, rotated at 1953 ± 3 rpm (Fig. 4A). A portion of a projected array image and a single channel image taken by HS-MAC are shown in Fig. 4B. The single channel image contains a curved black arrow due to the rolling shutter effect. Sequential frames were reconstructed from a set of fragmented array images (Fig. 4C). HS-MAC distinctly observed the black arrow rotating counterclockwise over time without rolling shutter distortion. The rotating speed of the disk was measured by analyzing the pixel intensities on the edge of the disk over time (Fig. 4D). The measured rotating speed is 1950 rpm, falling within an acceptable range of speed jitter (a minimal estimation sketch is given after this paragraph). HS-MAC also excelled in capturing the necking pinch-off process of a pool fire flame (0.88 lux) under attenuated lighting conditions (Fig. 4, E to G). The pool fire flame was captured 20 cm from HS-MAC to eliminate image disparity. Neutral density (ND) filters with optical densities (ODs) of 0, 1.3, and 3.0 were used to attenuate the light intensity of the pool fire flame (Fig. 4E). A cycle of the flame flickering was captured by HS-MAC at 1020 fps over 71.7 ms through the different ND filters (Fig. 4F). Each frame captured through a different OD was normalized for comparison. The SNR of the captured frames decreases as the OD increases at TSF = 1, resulting in noisy frames due to the lack of light, and the shape of the flame disappears entirely at OD 3.0. However, the temporal summation markedly improves the sensitivity by collecting photons over a longer period of time. HS-MAC clearly captures a set of time-lapse images of the flame pinch-off even at OD = 3.0 (880 μlux) with TSF = 33. The flame pocket detachment process (OD = 3.0, TSF = 33) is also captured from 59.28 to 65.02 ms, as displayed in Fig. 4G. The flame tip's pinching off leads to the separation of the flame pocket. Evidently, HS-MAC successfully captures the exact moment of flame pocket detachment (63.10 ms) even under a considerably reduced light condition.
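The rotation-speed measurement of Fig. 4D can be reproduced in principle by taking the dominant frequency of the intensity time series at a fixed point on the disk edge. The sketch below assumes one characteristic intensity dip per revolution (e.g., the black arrow passing the probed pixel); this is our simplification rather than the paper's stated procedure, and the trace is synthetic.

```python
import numpy as np

def rotation_speed_rpm(intensity, fps):
    """Estimate rotation speed from a periodic intensity trace sampled at fps,
    assuming one characteristic intensity feature per revolution."""
    x = intensity - intensity.mean()               # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)   # Hz
    f_rot = freqs[1:][np.argmax(spectrum[1:])]     # skip the zero-frequency bin
    return 60.0 * f_rot                            # Hz -> rpm

# synthetic edge-pixel trace: disk at 1950 rpm (32.5 Hz) sampled at 9120 fps for 2 s
fps, rpm_true = 9120, 1950
t = np.arange(18240) / fps
trace = 0.6 + 0.4 * np.cos(2 * np.pi * (rpm_true / 60.0) * t)
print(f"estimated speed ~ {rotation_speed_rpm(trace, fps):.0f} rpm")
```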
Fig. 4. Experimental demonstration of high-speed and low-light imaging.
(A) A high-speed rotating disk with segments in different colors at 1953 ± 3 rpm. (B) Photographs of (A) captured by HS-MAC. Left: A portion of a projected array image. Each channel is shifted by the microlens offset. Right: A single channel captured by HS-MAC with a 15.5-μm microlens offset (9120 fps). (C) Deblurred sequential frames fully reconstructed from a set of fragmented array images. White dashed lines indicate grids to distinguish the rotation of the disk. HS-MAC operates at 9120 fps and a TSF of 5. (D) Normalized pixel intensities of the red-marked region of the rotating disk over time. The measured rotating speed is 1950 rpm, which falls within the acceptable range of speed jitter. (E) Necking pinch-off of a pool fire flame (0.88 lux; 20 cm from HS-MAC) under intentionally dim conditions. (F) Optical images of the pool fire flame through different ND filters with ODs of 0, 1.3, and 3.0, taken at 1020 fps. Left: Captured images at TSF = 1 under the different dim conditions. Right: Optical images of a cycle of flame flickering over 71.7 ms with temporal summation. HS-MAC effectively visualizes the necking pinch-off of the pool fire flame under dim light conditions by increasing the TSF, whereas the flame is invisible due to low SNR above OD = 1.3 without it. (G) Sequential snapshots of a fast pinch-off (OD = 3.0, TSF = 33, time span = 5.74 ms). The flame tip is pinched off, resulting in the separation of the flame pocket.
DISCUSSION
We have reported a high-speed and high-sensitivity MAC (HS-MAC) inspired by insect vision. HS-MAC captures high-speed images at 9120 fps, outperforming conventional MACs by using channel fragmentation. Temporal summation successfully enhances the sensitivity, allowing HS-MAC to detect extremely low light levels (0.43 μW/cm2) while maintaining a high frame rate. Furthermore, the compressive frame reconstruction process eliminates the blur caused by the temporal summation and provides sharp high-speed frame sequences. In particular, HS-MAC not only allows high-speed and high-sensitivity imaging but also supports three-dimensional (3D) imaging and super-resolution imaging by simply replacing the image processing algorithms of conventional MACs. This versatility facilitates the development of biologically inspired multifunctional cameras capable of high-speed imaging at the far-field plane and 3D or super-resolution imaging at mid- to near-field planes. In addition, we anticipate developing a multifunctional camera by implementing a sophisticated image processing algorithm that allows large-FOV imaging at mid- or near-field planes while simultaneously conducting high-speed imaging within the overlapping FOV regions between microlenses. Despite the remarkable capabilities of HS-MAC, several limitations warrant further investigation. Disparity between microlenses at short distances may cause wavy distortions in video, requiring improvements in optical design or algorithmic corrections. In addition, various optical improvements can further enhance camera performance, potentially yielding sharper and higher-resolution image sequences. These strategies include the use of aspherical offset microlenses or achromatic lens designs (53–56), hexagonal microlens arrays, or packaging image sensors with a chief ray angle appropriate for the oMLAs. Another issue is the computational cost of compressive image reconstruction for deblurring. Further algorithm optimization will substantially enhance the utility of HS-MAC in various applications, including wireless imaging and real-time low-light imaging. Clearly, our work reports the first demonstration of a high-speed, high-sensitivity insect-eye camera that fully mimics the multiple channels and temporal summation of fast-eyed insect vision. This compact, high-speed, and high-sensitivity camera can provide a distinct route to diverse imaging applications in health care, smartphones, drones, and automobiles.
MATERIALS AND METHODS
Microfabrication of HS-MAC
The microfabrication process combines a metal-insulator-metal aperture layer and a microlens array layer (fig. S7). The metal-insulator-metal aperture layer was used to reduce optical cross-talk noise between individual microlenses (57). The oMLAs were fabricated using a standard large-area microfabrication technique. First, a negative photoresist (DNR L300-D1, Dong-Jin Semichem. Co. Ltd., Korea) was defined photolithographically with an offset-patterned mask on a 4-inch borosilicate wafer. Next, Cr (5 nm), SiO2 (100 nm), and Cr (100 nm) layers were sequentially deposited by electron beam evaporation and then lifted off with DNR stripper (DPS-7300, Dong-Jin Semichem. Co. Ltd., Korea). Another DNR photoresist (DNR L4615-D, Dong-Jin Semichem. Co., Ltd., Korea) was then photolithographically defined with the same offset-patterned mask. The offset-patterned DNR posts were thermally reflowed to form the oMLAs. Last, the oMLAs were integrated in an inverted configuration onto a rolling shutter image sensor (Sony IMX477), joined with submillimeter-thick microposts using a flip-chip bonder. We verified that the image sensor was free of pixel defects and operating properly prior to packaging.
Image acquisition and experimental setup
The fully packaged HS-MAC used a SONY IMX477 image sensor, which records video at full resolution at 30 fps. The captured data were transferred to a single-board computer (Jetson Nano Development Kit-B01, NVIDIA Corp.). All objects were positioned sufficiently far from the HS-MAC to minimize disparities between channel images.
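Based on the channel-fragmentation description above, one plausible way to slice a raw rolling-shutter frame from the 30-fps stream into timestamped fragmented arrays is sketched below. The band height (the offset in pixels) and the assumption that consecutive bands are separated by the uniform row readout time are our inferences; this is not code released with the paper.

```python
import numpy as np

def fragment_raw_frame(raw_frame, offset_px, frame_index, sensor_fps=30.0):
    """Slice one raw rolling-shutter frame (H x W) into row bands of offset_px rows.
    Each band is treated as one fragmented array with its own effective timestamp,
    assuming the rows are read out uniformly over the 1/sensor_fps frame period."""
    n_rows = raw_frame.shape[0]
    n_bands = n_rows // offset_px
    line_time = 1.0 / (sensor_fps * n_rows)            # assumed uniform row readout
    bands, timestamps = [], []
    for k in range(n_bands):
        bands.append(raw_frame[k * offset_px:(k + 1) * offset_px, :])
        timestamps.append(frame_index / sensor_fps + k * offset_px * line_time)
    return bands, np.array(timestamps)

# toy raw frame matching the IMX477 full-resolution format (3040 x 4056),
# with a 10-pixel band height (15.5 um offset / 1.55 um pixel pitch)
raw = np.zeros((3040, 4056), dtype=np.uint16)
bands, ts = fragment_raw_frame(raw, offset_px=10, frame_index=0)
print(f"{len(bands)} bands, {1.0 / (ts[1] - ts[0]):.0f} fragmented arrays per second")
```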
Acknowledgments
Funding: This work was financially supported by a KRIT (Korea Research Institute for defense Technology planning and advancement) grant funded by the Defense Acquisition Program Administration (DAPA) (11-102-103-021), the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (nos. 2022M3H4A4085645 and RS-2024-00438316), and the Technology Innovation Program (no. RS-2024-00432381) funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea), and the Ministry of Science and ICT, Republic of Korea (2021R1A2B5B03002428).
Author contributions: Conceptualization: H.-K.K., K.-H.J., and M.H.K. Data curation: H.-K.K. Formal analysis: H.-K.K., K.-H.J., and Y.-G.C. Funding acquisition: K.-H.J. and Y.-J.J. Investigation: H.-K.K., Y.-G.C., and J.-M.K. Methodology: H.-K.K., S.-I.B., K.K., and K.-W.J. Software: H.-K.K. and M.H.K. Validation: H.-K.K., K.-H.J., M.H.K., Y.-G.C., and J.-M.K. Visualization: H.-K.K., K.-H.J., and M.H.K. Supervision: K.-H.J. and M.H.K. Writing—original draft: H.-K.K., K.-H.J., and M.H.K. Writing—review and editing: H.-K.K., K.-H.J., and M.H.K.
Competing interests: The authors declare that they have no competing interests.
Data and materials availability: All data needed to evaluate the conclusions in the paper are present in the paper and/or the Supplementary Materials.
Supplementary Materials
The PDF file includes:
Figs. S1 to S7
Tables S1 and S2
Legends for movies S1 and S2
Other Supplementary Material for this manuscript includes the following:
Movies S1 and S2
REFERENCES AND NOTES
1. Egelhaaf M., Kern R., Vision in flying insects. Curr. Opin. Neurobiol. 12, 699–706 (2002).
2. Ruck P., Photoreceptor cell response and flicker fusion frequency in the compound eye of the fly, Lucilia sericata (Meigen). Biol. Bull. 120, 375–383 (1961).
3. Kelber A., Jonsson F., Wallen R., Warrant E., Kornfeldt T., Baird E., Hornets can fly at night without obvious adaptations of eyes and ocelli. PLOS ONE 6, e21892 (2011).
4. Spiewok S., Schmolz E., Changes in temperature and light alter the flight speed of hornets (Vespa crabro L.). Physiol. Biochem. Zool. 79, 188–193 (2006).
5. Frederiksen R., Wcislo W. T., Warrant E. J., Visual reliability and information rate in the retina of a nocturnal bee. Curr. Biol. 18, 349–353 (2008).
6. Giraldo Y. M., Dickinson M. H., Celestial navigation in Drosophila. Integr. Comp. Biol. 57, E273 (2017).
7. Bybee S. M., Johnson K. K., Gering E. J., Whiting M. F., Crandall K. A., All the better to see you with: A review of odonate color vision with transcriptomic insight into the odonate eye. Org. Divers. Evol. 12, 241–250 (2012).
8. Horridge G. A., The compound eye of insects. Sci. Am. 237, 108–120 (1977).
9. Seo M., Seo J. M., Cho D. I., Koo K. I., Insect-mimetic imaging system based on a microlens array fabricated by a patterned-layer integrating soft lithography process. Sensors 18, 2011 (2018).
10. Zhang L., Zhan H., Liu X., Xing F., You Z., A wide-field and high-resolution lensless compound eye microsystem for real-time target motion perception. Microsyst. Nanoeng. 8, 83 (2022).
11. Buschbeck E. K., Ehmer B., Hoy R. R., The unusual visual system of the Strepsiptera: External eye and neuropils. J. Comp. Physiol. A 189, 617–630 (2003).
12. Greiner B., Ribi W. A., Warrant E. J., Retinal and optical adaptations for nocturnal vision in the halictid bee Megalopta genalis. Cell Tissue Res. 316, 377–390 (2004).
13. Warrant E. J., Seeing better at night: Life style, eye design and the optimum strategy of spatial and temporal summation. Vision Res. 39, 1611–1630 (1999).
14. Warrant E., Porombka T., Kirchner W. H., Neural image enhancement allows honeybees to see at night. Proc. R. Soc. B Biol. Sci. 263, 1521–1526 (1996).
15. Warrant E., Dacke M., Vision and visual navigation in nocturnal insects. Annu. Rev. Entomol. 56, 239–254 (2011).
16. Stockl A. L., O'Carroll D. C., Warrant E. J., Neural summation in the hawkmoth visual system extends the limits of vision in dim light. Curr. Biol. 26, 821–826 (2016).
17. Baird E., Kreiss E., Wcislo W., Warrant E., Dacke M., Nocturnal insects use optic flow for flight control. Biol. Lett. 7, 499–501 (2011).
18. Reber T., Vahakainu A., Baird E., Weckstrom M., Warrant E., Dacke M., Effect of light intensity on flight control and temporal properties of photoreceptors in bumblebees. J. Exp. Biol. 218, 1339–1346 (2015).
19. Phan H. L., Yi J., Bae J., Ko H., Lee S., Cho D., Seo J.-M., Koo K.-I., Artificial compound eye systems and their application: A review. Micromachines 12, 847 (2021).
20. Cheng Y., Cao J., Zhang Y., Hao Q., Review of state-of-the-art artificial compound eye imaging systems. Bioinspir. Biomim. 14, 031002 (2019).
21. Labhart T., Meyer E. P., Detectors for polarized skylight in insects: A survey of ommatidial specializations in the dorsal rim area of the compound eye. Microsc. Res. Tech. 47, 368–379 (1999).
22. Awata H., Wakakuwa M., Arikawa K., Evolution of color vision in pierid butterflies: Blue opsin duplication, ommatidial heterogeneity and eye regionalization in Colias erate. J. Comp. Physiol. A 195, 401–408 (2009).
23. Cogal O., Leblebici Y., An insect eye inspired miniaturized multi-camera system for endoscopic imaging. IEEE Trans. Biomed. Circuits Syst. 11, 212–224 (2017).
24. Hu Z. Y., Zhang Y. L., Pan C., Dou J. Y., Li Z. Z., Tian Z. N., Mao J. W., Chen Q. D., Sun H. B., Miniature optoelectronic compound eye camera. Nat. Commun. 13, 5634 (2022).
25. Song Y. M., Xie Y., Malyarchuk V., Xiao J., Jung I., Choi K.-J., Liu Z., Park H., Lu C., Kim R.-H., Li R., Crozier K. B., Huang Y., Rogers J. A., Digital cameras with designs inspired by the arthropod eye. Nature 497, 95–99 (2013).
26. Kim K., Jang K. W., Ryu J. K., Jeong K. H., Biologically inspired ultrathin arrayed camera for high-contrast and high-resolution imaging. Light Sci. Appl. 9, 28 (2020).
27. Jang K.-W., Kim K., Bae S.-I., Jeong K.-H., Biologically inspired ultrathin contact imager for high-resolution imaging of epidermal ridges on human finger. Adv. Mater. Technol. 6, 2100090 (2021).
28. Kim K., Jang K.-W., Bae S.-I., Jeong K.-H., Multi-functional imaging inspired by insect stereopsis. Comms. Eng. 1, 39 (2022).
29. Zhang W., Cao Y., Zhang X., Liu Z., Sky light polarization detection with linear polarizer triplet in light field camera inspired by insect vision. Appl. Optics 54, 8962–8970 (2015).
30. K. Kagawa, N. Fukata, J. Tanida, paper presented at Optics and Photonics for Information Processing IV, San Diego, CA, USA, 7 September 2010.
31. Zhang Y., Xu H., Guo Q., Wu D., Yu W., Biomimetic multispectral curved compound eye camera for real-time multispectral imaging in an ultra-large field of view. Opt. Express 29, 33346–33356 (2021).
32. Hao Q., Song Y., Cao J., Liu H., Liu Q., Li J., Luo Q., Cheng Y., Cui H., Liu L., The development of snapshot multispectral imaging technology based on artificial compound eyes. Electronics 12, 812 (2023).
33. Kim K., Jang K. W., Bae S. I., Kim H. K., Cha Y., Ryu J. K., Jo Y. J., Jeong K. H., Ultrathin arrayed camera for high-contrast near-infrared imaging. Opt. Express 29, 1333–1339 (2021).
34. Kim K., Jang K.-W., Kim H.-K., Cho S. B., Jeong K.-H., Biologically inspired intraoral camera for multifunctional dental imaging. J. Opt. Microsyst. 2, 031202 (2022).
35. Miyazaki D., Ebata T., Moriguchi K., Mukai T., Regionally adaptive enhancement of frame rate and resolution with a multi-aperture image capturing system using rolling shutter effect. ITE Trans. Media Technol. Appl. 3, 240–244 (2015).
36. D. Miyazaki, H. Shimizu, Y. Nakao, T. Toyoda, Y. Masaki, in Sensors, Cameras, and Systems for Industrial/Scientific Applications X (SPIE, 2009), vol. 7249, pp. 205–212.
37. K. Kagawa, T. Kokado, Y. Sato, F. Mochizuki, H. Nagahara, T. Takasawa, K. Yasutomi, S. Kawahito, paper presented at the 2019 International Image Sensor Workshop, Snowbird, UT, USA, June 2019.
38. Guzmán F., Meza P., Vera E., Compressive temporal imaging using a rolling shutter camera array. Opt. Express 29, 12787–12800 (2021).
39. T. G. Etoh, Q. A. Nguyen, Evolution of high-speed image sensors, in The Micro-World Observed by Ultra High-Speed Cameras: We See What You Don't See (Springer, 2018), pp. 81–101.
40. Bub G., Tecza M., Helmes M., Lee P., Kohl P., Temporal pixel multiplexing for simultaneous high-speed, high-resolution imaging. Nat. Methods 7, 209–211 (2010).
41. O. Ait-Aider, A. Bartoli, N. Andreff, paper presented at the 2007 IEEE Conference on Computer Vision and Pattern Recognition, Minneapolis, MN, USA, 17–22 June 2007.
42. A. Johansson, F. Johansson, "Verification Method for Time of Capture of a Rolling Shutter Image," thesis, Linköping University, Linköping, Sweden (2023).
43. A. Osman, J. H. Park, D. Dickensheets, J. Platisa, E. Culurciello, V. A. Pieribone, paper presented at the 2011 IEEE Biomedical Circuits and Systems Conference (BioCAS), San Diego, CA, USA, 10–12 November 2011.
44. Zhu H., Isikman S. O., Mudanyali O., Greenbaum A., Ozcan A., Optical imaging techniques for point-of-care diagnostics. Lab Chip 13, 51–67 (2013).
45. C. J. Revello, R. G. Driggers, D. Brady, K. Renshaw, "Large area coverage using drone mounted multi-camera systems," in Infrared Imaging Systems: Design, Analysis, Modeling, and Testing XXXIII, Virtual, 6 to 12 June 2022 (SPIE, 2022), vol. 12106.
46. Chen J. W., Zhou Z., Kim B. J., Zhou Y., Wang Z., Wan T., Yan J., Kang J., Ahn J.-H., Chai Y., Optoelectronic graded neurons for bioinspired in-sensor motion perception. Nat. Nanotechnol. 18, 882–888 (2023).
47. Chan R. H., Tao M., Yuan X., Constrained total variation deblurring models and fast algorithms based on alternating direction method of multipliers. SIAM J. Imaging Sci. 6, 680–697 (2013).
48. Beck A., Teboulle M., Fast gradient-based algorithms for constrained total variation image denoising and deblurring problems. IEEE Trans. Image Process. 18, 2419–2434 (2009).
49. Jiang W., Cui H., Zhang F., Rong Y., Chen Z., Oriented total variation l1/2 regularization. J. Vis. Commun. Image Represent. 29, 125–137 (2015).
50. X. Yuan, in 2016 IEEE International Conference on Image Processing (ICIP) (IEEE, 2016).
51. I. W. Selesnick, I. Bayram, "Total variation filtering" (white paper, 2009); eeweb.engineering.nyu.edu/iselesni/lecture_notes/TV_filtering.pdf.
52. Starovoitov V. V., Eldarova E. E., Iskakov K. T., Comparative analysis of the SSIM index and the Pearson coefficient as a criterion for image similarity. Eurasian J. Math Comput. Appl. 8, 76–90 (2020).
53. Huang S., Li M., Shen L., Qiu J., Zhou Y., Fabrication of high quality aspheric microlens array by dose-modulated lithography and surface thermal reflow. Opt. Laser Technol. 100, 298–303 (2018).
54. Hu Y., Chen Y., Ma J., Li J., Huang W., Chu J., High-efficiency fabrication of aspheric microlens arrays by holographic femtosecond laser-induced photopolymerization. Appl. Phys. Lett. 103, 141112 (2013).
55. M. C. López-Bautista, M. Avendaño-Alejo, I. Velázquez-Gómez, L. Castañeda, Designing afocal achromatic doublet lenses, in International Optical Design Conference 2021, Washington, DC, 27 June to 1 July 2021 (Optica Publishing Group, 2021), vol. 12078, p. 1207816.
56. Vallerotto G., Askins S., Victoria M., Antón I., Sala G., A novel achromatic Fresnel lens for high concentrating photovoltaic systems. AIP Conf. Proc. 1766, 050007 (2016).
57. Bae S.-I., Kim K., Jang K.-W., Kim H.-K., Jeong K.-H., High contrast ultrathin light-field camera using inverted microlens arrays with metal–insulator–metal optical absorber. Adv. Opt. Mater. 9, 2001657 (2021).