Abstract
Lensless biological imaging systems are an emerging alternative to conventional microscope systems because they enable wide field of view imaging. While most microscope systems sacrifice field of view for magnification, lensless systems take advantage of small imaging pixel size, projection, digital magnification, and post-processing to compensate for diffracted images. Here, a new compound lens-based system is designed with the explicit aim of matching the wide field of view of a basic lensless setup. The characteristics of the two optical imaging setups (lensless and lens-based) are then compared at this level of complexity to determine the minimal system principles needed to achieve the biological imaging goals, informing simplified and less expensive future designs. For both imaging systems, images of biological entities are recorded with the same CMOS imaging device and computer software. The main contribution of this work is an exhaustive comparison of the performance characteristics of both systems using optical standards and biological images.
I. INTRODUCTION
Lensless imaging systems have recently gained wide popularity because they are portable, cost-effective, and lightweight, with numerous applications in the fields of biosensing, point-of-care diagnostics, and cytometry [1, 2]. One of the key advantages of lensless imaging is the wide field of view [3, 4]. However, the lack of a lens imposes limitations. Typically, a computational analysis program is used to enhance, substitute, and compensate for the lack of a lens and to add various features such as counting and detection. To improve resolution, Bishara et al. used multiple fiber-optic waveguides butt-coupled to light-emitting diodes [20]. The light-emitting diodes are controlled by a low-cost microcontroller to illuminate the sample in a time-varying sequence, producing lens-free holograms. The resulting holograms are captured by a digital sensor array and processed with a fast super-resolution algorithm to generate high-resolution holographic images (phase and amplitude) of the objects. This system showed good imaging performance but is complex.
Lensless imaging systems use integrated circuits that act as an imaging device, typically a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) pixel array. An alternative system is a scanning microscope that combines many small images into one large composite image, but our focus is on point-of-care systems that do not have such capabilities. Continuing improvements in semiconductors have greatly enhanced the CMOS sensors used in cell phone cameras over the last few years: pixel size has shrunk while the number of pixels has grown. This continued improvement benefits both lensless and lens-based systems. A smaller pixel size provides higher definition and allows for digital magnification, while a larger number of pixels yields a wider field of view [5].
In this paper, we describe two imaging systems, i.e., the lensless and lens-based systems, and compare their performance using various imaging standards. Lab-quality optics were used to demonstrate the principles, but a portable, low-cost product could easily be designed for mass production using the same principles. The main contribution of this work is the performance comparison of purposely designed and fabricated lens-based and lensless systems with exactly the same field of view, which to the best of our knowledge is the first such comparison.
We begin with a theoretical analysis of the two systems. It is important to note that different sets of equations are used to describe the system depending on whether the imaging subject is in the near or far-field. Our systems operate in the far-field of a plane wave due to the properties of our design.
A lensless system is designed and built, which allows a variable, near-monochromatic plane wave to be generated with an illuminating source (light-emitting diode, LED). The light source can easily be changed to provide different wavelengths. A three-dimensional x, y, and z fixture is constructed with distinct features:
To ensure good contact with the module’s surface,
To enable visual movement along the entire channel,
To increase the field of view.
The distance through the module to the imaging device effectively magnifies the image diffraction pattern by a factor of 10 to 20.
Wide area imaging is also beneficial to lens-based systems. Werley et al. designed a wide view microscope “Firefly” based on the Olympus MVPLAPO XC, an objective lens with a numerical aperture (NA) of 0.5 and a magnification of 2x [6]. This allowed them to observe 100 neurons in parallel with blue light while reading fluorescent voltage readings. The Firefly microscope is sophisticated and uses fluorescent sensors and optogenetic actuators.
To provide a fair comparison, we designed a lens-based system to have the same field of view as the lensless system. A compound lens design was found necessary to achieve the equivalent field of view, and this novel compound lens-based system design is one of the major contributions of this work. The working distance to the slide is about 50 mm. For the same field of view, a 1X magnification was used; increased magnification of this system is also explored. To compare the two systems, the same type of imaging camera is used. The comparison images comprise both optical standards and biological subjects, and the results are shown as direct image comparisons.
II. MATERIALS AND METHODS
A. EXPERIMENTAL DESIGN
Each imaging approach is subject to theoretical design considerations. In the lensless case, it is necessary to decide whether the object and detector are in the near or far-field to choose the proper design equations. Since the diffraction of light is an important parameter and is used as a tool, it is essential to calculate and study its effects. In the lens-based case, there is a trade-off between magnification and field of view. It is demonstrated that an equivalent field of view can be obtained while maintaining 1X magnification with the object in focus. With these requirements in view, a lens-based setup was designed and built. The two systems are then compared with standard photographic measurement techniques, and several biological images are recorded for comparison purposes.
B. LENSLESS DESIGN
A directional wave is needed to create a visual pattern of the object on the image plane. This works best when the object is close to the image plane, although some magnification can be obtained by separating the object, at the cost of increased diffraction; this can be used as a magnifying effect for small objects. With an LED or laser, the light can be structured as a plane wave, perhaps with the help of a pinhole, analogous to a flashlight casting an image or a pinhole camera [7]. A key issue for the lensless case is how close the object can get to the imaging device: it is separated by the thickness of the slide, the space to the imaging device, and any optical covering on the device. To realize any particular design, we need to identify whether the subject is in the near or far-field.
C. Diffraction, Near or Far Field
The Fraunhofer model best describes the far-field of an electromagnetic wave, while the Fresnel model best describes near-field effects. The Fresnel number for an electromagnetic wave through an aperture of 2.5 μm radius (a), at a distance of 1.5 mm (L) and a wavelength of 380 nm, is 0.011. Since this Fresnel number is less than one, the lensless setup is in the far-field. Fig. 1 shows the layout of the lensless system; the final version of the developed system is shown in Fig. 4.
FIGURE 1.

Block diagram of the lensless optical system with a micro-module and a CMOS detector measuring 6.14 mm × 4.6 mm.
FIGURE 4:

Lensless imaging setup (Thor LL); manual X, Y, and Z-axis stages allow fine-tuning of the image. The LED light source with isolation tube and iris ensures a plane wave.
A composite microfluidic chip module was designed in AutoCAD software.
The VLS 2.30 laser cutter (Versa Laser, Scottsdale, Arizona) was used to cut the poly(methyl methacrylate) (PMMA) and double-sided adhesive (DSA) sheets per the design specifications. This microfluidic chip module, assembled by a non-photolithographic process, contains the biological sample and is used for capturing images [5, 8–16]. The light is approximately monochromatic and arrives as a plane wave from a far source. The object to be imaged sits on either a coverslip or a standard glass slide at the bottom of the microfluidic chip. Given the frequency of light, it is of interest to find the number of wavelengths from the object to the detector and then determine whether a far-field or near-field model is appropriate for white and blue light.
λ_n = λ_0 / n   (1)

(n = refractive index of the glass, 1.52)

As the thickness of the glass slide is 1 mm, the number of wavelengths w required to pass from one side of the slide to the other at 520 nm (white light near the middle of the 380 to 700 nm spectrum) is then
w = t / λ_0 = 1 mm / 520 nm ≈ 1923   (2)
Inside a medium, the wavelength of an electromagnetic wave shortens with the refractive index as the speed of the wavefront is reduced; the frequency of the wave does not change. The number of wavelengths in a medium, w_n, is
w_n = n t / λ_0 = 1.53 × 1923 ≈ 2942   (3)

(where n = 1.53 is the refractive index of glass)
For a 0.13 mm thick coverslip, but at a wavelength of 380 nm,

w = t / λ_0 = 0.13 mm / 380 nm ≈ 342   (4)

and, given the refractive index n,

w_n = n t / λ_0 = 1.53 × 342 ≈ 523   (5)
Many hundreds of wavelengths therefore fit within the thickness of the microfluidic chip, even before the light reaches the CMOS sensor. Thus, anything within the microfluidic chip is in the far-field.
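As a sanity check, the wavelength counts of equations (1)–(5) can be reproduced with a few lines of Python (our illustration, not part of the original apparatus; the function name is ours):

```python
# Number of optical wavelengths spanned by the glass layers described
# in the text. In a medium of index n the wavelength shortens to
# lambda/n, so the count across a layer grows by a factor of n.

def wavelengths_in_medium(thickness_m, wavelength_m, n=1.0):
    return thickness_m * n / wavelength_m

# 1 mm glass slide at 520 nm: free space, then in glass (n = 1.53)
w  = wavelengths_in_medium(1e-3, 520e-9)          # ~1923
wn = wavelengths_in_medium(1e-3, 520e-9, n=1.53)  # ~2942

# 0.13 mm coverslip at 380 nm
w_cover  = wavelengths_in_medium(0.13e-3, 380e-9)          # ~342
wn_cover = wavelengths_in_medium(0.13e-3, 380e-9, n=1.53)  # ~523

print(round(w), round(wn), round(w_cover), round(wn_cover))
```

Every layer spans hundreds to thousands of wavelengths, consistent with a far-field treatment.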
In antenna terms, apertures larger than 0.5λ are considered large antennas; by this criterion, too, the CMOS sensor is in the far-field.
The Fresnel number indicates whether the developed lensless imaging setup is in the near or far-field: far-field if the Fresnel number is less than 1, near field if greater than 1. This calculation is for a circular aperture in a plane. We used the 1951 USAF target dimensions; the results below confirm our target is also in the far-field.
a = radius (here we use 4 μm, the average radius of a white blood cell)
L = distance from the object plane to the image plane = 1.5 mm for the 1951 USAF target
λ = 380 nm

F = a² / (L λ) = (4 μm)² / (1.5 mm × 380 nm) ≈ 0.028   (6)
Hence, our developed setup is in the Fraunhofer region, where the shadow of each cell on the image sensor is a two-dimensional Fourier transform of the initial cell distribution.
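The Fresnel-number criterion used above can be sketched as a short script (values from the text; the function name is ours):

```python
def fresnel_number(a_m, L_m, wavelength_m):
    """F = a**2 / (L * lambda); F < 1 indicates the Fraunhofer (far-field) regime."""
    return a_m ** 2 / (L_m * wavelength_m)

# 2.5 um aperture radius, 1.5 mm to the sensor, 380 nm  -> ~0.011
# 4 um radius (average white blood cell), same geometry -> ~0.028
F_pinhole = fresnel_number(2.5e-6, 1.5e-3, 380e-9)
F_cell    = fresnel_number(4e-6, 1.5e-3, 380e-9)
print(round(F_pinhole, 3), round(F_cell, 3))  # both well below 1
```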
D. IMAGE SIZE EXPECTED DUE TO AIRY DISKS.
When light passes through an aperture, diffraction occurs. The diffraction pattern, with a bright central region surrounded by a series of concentric rings of decreasing intensity, is called the Airy disk.
Due to the construction of the microfluidic chip, there is enough distance to the CMOS sensor to place it in the far-field, and therefore the Fraunhofer equations are appropriate.
Imaging subjects will appear as diffraction patterns, i.e., Airy disks, on the sensor plane.
The CMOS pixel measures 1.25 μm, which limits the resolution of items that can be viewed directly with the lensless imaging setup. It is of interest to calculate the size of the pattern for the cells we would like to image.
The Fraunhofer diffraction pattern of a circular opening is predicted by the Airy equation, which defines the first null point, or radius. It can be derived from a first-order Bessel function that roughly resembles a decaying sine wave; Eugene Hecht explains this derivation of the Airy disk from a circular aperture [17].
I(φ) = I(0) [2 J_1(k a sin φ) / (k a sin φ)]²,  where k = 2π/λ   (7)

a = radius of the aperture opening

sin φ = q / R   (8)

φ = angle of observation
q = radial distance from the optics axis
R = observation distance to the edge of the Airy pattern
J_1 = Bessel function of the first kind

The first null of J_1 gives the radius q_1 of the Airy disk:

q_1 = 1.22 R λ / (2a) ≈ 1.22 f λ / D   (9)

f is the focal length or approximate distance to the imaging device.
D is the diameter of the lens or optical field diameter of the system; D = 2a, twice the radius of the input to the system.
The diameter of the Airy disk d is given by equation (10).

d = 2 q_1 = 2.44 λ (f/D)   (10)

f/D = F, the f-stop.
R is the hypotenuse to the edge of d, the Airy disk:

R = √(f² + (d/2)²)   (11)
Supplementary Table 1 lists various Airy disk diameters, which describe the diffraction patterns expected for different f stops, the ratio of focal length to aperture opening. A light wave passing by a cell in a lensless system will also generate a diffraction pattern in the far-field, as shown in Fig. 2.
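Equation (10) makes such pattern sizes easy to estimate. The sketch below uses an illustrative f/8 geometry at 520 nm (example values of ours, not entries from Supplementary Table 1):

```python
def airy_disk_diameter(wavelength_m, f_number):
    """First-null diameter of the Airy pattern: d = 2.44 * lambda * (f/D)."""
    return 2.44 * wavelength_m * f_number

d = airy_disk_diameter(520e-9, 8)  # illustrative f/8 geometry at 520 nm
print(round(d * 1e6, 1), "um")     # ~10.2 um, roughly 8 of the 1.25 um pixels
```

At roughly eight sensor pixels across, such a pattern is readily detectable even though it is not a focused image.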
FIGURE 2.

A digitally enlarged picture of a diffraction pattern of neutrophils taken with lensless imaging system.
E. IMAGING DEVICES:
A physical lens performs a two-dimensional Fourier transform and, given the proper shape and system, will focus an image. Both the lens-based and lensless approaches use the same imaging device and a computer for recording and enhancement. An IDS imaging system, shown in Fig. 3(a), with an ON Semiconductor CMOS sensor of 18 megapixels and 1.25 μm pixel size, is used [18]. The same imaging software, “uEyeCockpit,” is used for both systems.
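As a consistency check (our arithmetic, assuming square pixels), the quoted 6.14 mm × 4.604 mm active area and 1.25 μm pixel size do imply roughly 18 megapixels:

```python
# Pixel counts implied by the sensor's active area and pixel pitch.
pixel_um = 1.25
width_px  = 6.14 * 1000 / pixel_um    # 4912 pixels across
height_px = 4.604 * 1000 / pixel_um   # ~3683 pixels down
megapixels = width_px * height_px / 1e6
print(round(width_px), round(height_px), round(megapixels, 1))  # 4912 3683 18.1
```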
FIGURE 3:

(a) IDS board with ON, CMOS device, 18 Megapixels, 1.25 μm pixel size, and image size of 6.14 mm × 4.604mm, UI 3592LE. (b) Sony IMX 219, 1.12 μm pixel, 8 Megapixel size.
The major cost difference between the two imaging systems is the price of the lens. Our early work used Raspberry Pi computers and cameras for lens-based and lensless experiments. Raspberry Pi cameras, as shown in Fig. 3b, cost $26, and together with a Raspberry Pi computer the total is relatively inexpensive with or without a lens system, at under $100 for all materials, as shown in Supplementary Table 6. A lab-quality, bench-equipment lensless setup costs considerably more and is shown in Fig. 4. This system was constructed using a Thorlabs lens-tube-based system capable of movement in the X, Y, and Z directions. The LED light source at the top can easily be changed to achieve illumination at different frequencies. The LED is driven by a variable power supply that allows the light intensity to be varied. The objective was to create a more physically stable system that can be used to record detailed images. The output of Fig. 4 feeds into a computer running the IDS optical program, “uEyeCockpit.” The system was built on a solid breadboard with a Cerna body from Thorlabs. Manual X, Y, and Z controls were coupled to the base; each is spring-loaded and adjustable.
A custom cradle holder was designed and 3D-printed to hold the IDS CMOS device, as shown in Fig. 3. It is designed to mount on the X, Y, and Z stages so that the imaging device can easily be moved as required. A fixed arm at the top holds the illuminating source. A custom holder, also 3D-printed, interfaces with this arm; it is removable, holds the pinhole that creates the plane wave, and keeps the cable in place. A second arm is spaced lower, just above the CMOS holder. An isolation tube with a variable iris is fitted between the arms. The top of the tube has a rubber gasket so that it fits easily; the bottom of the tube screws into the lower arm. Underneath the lower arm are slide-holder springs that hold the microfluidic chip or glass slide in place. The slide is held firmly in place, and the detector is moved back and forth relative to the slide. The detector is spring-loaded against the slide, and special care must be taken to find the most appropriate pressure for detector movement.
An adjustable power supply is used to bias the LED, providing a convenient way to adjust the intensity of the emitted light; LEDs of various wavelengths can also be used. Supplementary Table 2 lists the components used to build the lensless device, along with part numbers, quantities, and manufacturers. The lensless system was constructed with fine controls in X, Y, and Z, with 25 mm of travel in the X direction, allowing wide-view photography and the ability to stitch photographs together. The definition of the photo depends on pixel size, wavelength, and the distance from the object to the CMOS detector. The completed lensless setup is shown in Fig. 4.
F. LENS BASED SYSTEM DESIGN:
Standard microscopes directly trade off magnification against field of view. A lens-based optical system was designed with the explicit aim of matching (or exceeding) the field of view of the lensless setup. Both the lens-based and lensless systems used the same image recording device. In addition, a variable lens was added so that some lens magnification could be introduced, with a corresponding decrease in the viewing area. Since both devices have the same image recording device, both feed into the same computer application program.
The numerical aperture diagram is as shown in Fig. 5:
NA = n sin φ   (12)
FIGURE 5:

Numerical aperture of a two-lens system. The numerical aperture defines the cone of light that can enter the microscope. It is constant if the system is fixed but varies as the magnification and lens position varies.
where n is the index of refraction and φ is the half-angle of the marginal ray of the system.
NA defines the resolving power of the lens; for an approximately diffraction-limited lens system, the minimum resolvable separation is

r ≈ 0.61 λ / NA   (13)
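As a sketch of this resolution limit (assuming the Rayleigh form, r = 0.61 λ / NA), the diffraction-limited resolution at a given NA can be computed directly; NA = 0.5 matches the Olympus objective cited in the introduction, and 520 nm is an illustrative mid-spectrum wavelength:

```python
def rayleigh_resolution(wavelength_m, numerical_aperture):
    """Diffraction-limited resolution, r = 0.61 * lambda / NA (Rayleigh criterion)."""
    return 0.61 * wavelength_m / numerical_aperture

r = rayleigh_resolution(520e-9, 0.5)  # green light, NA = 0.5
print(round(r * 1e6, 2), "um")        # ~0.63 um
```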
G. OPTICAL DESIGN AND MATERIAL OF LENS SYSTEM
An optical design program (Optical Wizard) was used to select the optical components, verify the design, and visualize the results [19]. The design results are presented in Supplementary Table 3. To make the field of view of the lens-based system exactly the same as that of the lensless setup while retaining a focused image, a compound lens system was needed. We designed a variable lens system that allows up to 5X magnification, which then extends the camera or pixel resolution limit. Fig. 6 shows the built lens-based system: the top element is a camera connected to a coupler, then an adapter tube, followed by a zoom lens and a fixed lens. One of the objectives is to improve the resolution of wide-field images. Since the lens system can go as low as 0.7X, we can observe a field of view of 9.43 mm × 12.57 mm. If the definition of the CMOS imaging device is improved with a smaller pixel size [20], a larger number of pixels, and a shorter wavelength, the definition of the system can be further enhanced in the future.
FIGURE 6:

Navitar Lens System attached to a C mount IDS, CMOS imaging device, UI 3592LE.
The 18 MP camera used in the lens-based system is the same as in the lensless system, but with a lens. A coupler and adapter are connected, followed by a 6.5X zoom lens and then a 1.5X lens. Supplementary Table 4 presents the components of the lens-based system. Supplementary Table 5 compares the basic parameters of the wide field of view imaging systems, and Supplementary Table 6 shows current retail costs for assembling a small portable system (either lens-based or lensless).
III. RESULTS AND DISCUSSION
A. RESOLUTION OF OPTICAL STANDARDS AND PARTICLES OF INTEREST
Various methods of computer enhancement can be used with either system. The lensless system can use diffraction from an elevated slide to magnify the image of a cell; this works well when the subjects are known to be uniform in size. Various digital holography methods can be used to compensate for diffraction issues. A reverse diffraction program was used to enhance the 1951 USAF pattern; the results are presented in Table 1. A graduated scale, uniform 5 μm microbeads, and a 1951 USAF resolution target were used as fixed standards for comparison. Fig. 7 compares a scale imaged with both the lensless and lens-based systems. The scale has a 10 μm separation between consecutive lines and a line width of 2 μm.
Table 1:
Target Resolution Comparison in a Lensless (Thor LL) and Large Area Lens system (Navitar)
FIGURE 7:

Image on the left side is taken with the lensless system, and the right side with the lens-based setup. Both images have an equal field of view. The corresponding bottom images show the digital magnification of the central part of the top pictures.
The pictures were converted to 8-bit grayscale using Corel software. 5 μm microbeads were used as an imaging subject, as presented in Fig. 8. The lens-based system shows a focused image, while the lensless system shows an expanded Airy disk pattern; the focused image from our lens-based setup provides better clarity. These microbeads are approximately the same size as human red blood cells and slightly smaller than neutrophils, the cells of interest.
FIGURE 9:

(a) 1951 USAF resolution target on Lensless system (Thorlabs) Picture is cropped. 23 lp/mm. (b) Reverse processed (at a distance of 1.2 mm) image of USAF pattern with the resolution improvement from 23 to 29 line pairs/mm. (c) 1951 USAF Resolution, 143 line pairs per mm on the Navitar, 1X lens system.
B. USAF 1951 PATTERN
A negative wheel pattern, of the USAF 1951 layout, R2L1S4N, was purchased from Thorlabs. The glass substrate containing the pattern has dimensions of 76.2 mm × 25.4 mm × 1.5 mm. This pattern has resolution test targets that are made by plating low-reflectivity, vacuum-sputtered chrome on a soda-lime glass substrate. The 3” × 1” wheel pattern targets have 9 USAF 1951 targets, each with 6 groups (+2 to +7) enabling a maximum resolution of 228.0 line pairs per millimeter. Because these targets feature sets of three lines, they reduce the occurrence of spurious resolution and thus help prevent inaccurate resolution measurements. Fig. 9 presents the images of USAF 1951 pattern recorded by both lensless and lens-based setups.
The pattern is examined to find the group and the element in that group where two lines are not distinguishable from one another. Then resolution is calculated by the following formula:
Resolution (lp/mm) = 2^(Group + (Element − 1)/6)   (14)
A summary of the results is shown in Table 1. Images were taken on several different optical systems.
The lensless system (Thor LL), shown in Fig. 9(a) without any digital computer improvement, has a resolution of 23 line pairs per mm at a glass thickness (spacing) of 1.5 mm.
The reverse-diffracted pattern is shown in Fig. 9(b); here, the resolution is 29 line pairs per mm. Fig. 9(c) shows the result of using the Navitar lens system, with 143 line pairs per mm.
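Equation (14) can be checked numerically. Mapping the measured values back to specific chart elements is our inference, but the standard chart values line up with the reported 23, 29, and 143 lp/mm:

```python
def usaf_lp_per_mm(group, element):
    """USAF 1951 target resolution: lp/mm = 2 ** (group + (element - 1) / 6)."""
    return 2 ** (group + (element - 1) / 6)

print(round(usaf_lp_per_mm(4, 4), 1))  # 22.6  -> lensless (Thor LL), ~23 lp/mm
print(round(usaf_lp_per_mm(4, 6), 1))  # 28.5  -> reverse diffraction, ~29 lp/mm
print(round(usaf_lp_per_mm(7, 2), 1))  # 143.7 -> Navitar 1X, ~143 lp/mm
```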
C. IMAGING RESULTS OF BIOLOGICAL SPECIMENS
The biological images of drosophila and neutrophils, a type of white blood cell, were also recorded to compare the performance of both setups.
1. DROSOPHILA IMAGES
An image of a drosophila on a Levenhuk N20 slide 3 was taken using both the lens-based and lensless setups, and both showed good detail. Fig. 10 portrays the drosophila imaged with the lensless system on the left; the image on the right was taken with the lens-based setup.
FIGURE 10:

Drosophila image, left side image taken with lensless optical system. The right image is obtained using 1X lens-based system.
2. NEUTROPHIL IMAGING
Neutrophils are a type of white blood cell with a characteristic complex-shaped nucleus containing 3 to 5 lobes; the number of lobes increases with the age of the cell. Neutrophils have a diameter of 9 to 15 μm and play a key role in the fight against infections. A low count of neutrophils in the blood, known as neutropenia, may make a person more vulnerable to opportunistic infections. Chemotherapeutic drugs also contribute to neutropenia, which may cause various health-related problems along with a financial burden [21]. We isolated neutrophils from whole blood using the EasySep Direct Human Neutrophil Isolation Kit (Catalog #19666) from STEMCELL Technologies, obtaining neutrophils with high purity using magnetic beads conjugated with antibodies. This immunomagnetic negative-selection process eliminates preprocessing steps and centrifugation; the whole process takes only 30 minutes. It is possible to enumerate the neutrophils using both the lens-based and lensless setups. The wide field of view and small pixel size enable counting the cells even when the count per volume is small. Fig. 11 is magnified for human viewing.
FIGURE 11:

Neutrophils are shown as lensless on the left and 1X lens on the right. Both pictures are digitally magnified in order to see the small images which have a diameter of 9 to 15 μm.
Fig. 11 shows that the lensless system achieves magnification through projection of the image over a distance, while diffraction widens the image into Airy disks. The 1X lens-based setup produces a focused image. In either case, the sample should contain only the cells of interest, without other interfering entities.
The design of the lens-based system is capable of additional magnification. Although the field of view decreases, the image size stays the same, while pixel definition is enhanced by the lens magnification. This feature is present in commercial compound microscopes.
If the field of view is reduced to 1.5 mm × 2.0 mm, our lens-based system is capable of 4.7X magnification. Examples at this higher magnification are presented in Fig. 12.
FIGURE 12.

4.7X magnification of Drosophila image on a Levenhuk N20 slide 3 (image on left side). Enhanced view of lens-based system at 4.7X amplification of Neutrophils (image on right side).
IV. CONCLUSION
Wide field of view lens-based and lensless optical systems were designed and built to compare their optical properties. Both optical systems have the same field of view, for potential point-of-care biological imaging needs. Besides the advantages of a wider view, these systems also cover a larger sample area, enabling accurate counting and recognition of biological samples that normally have a low count per unit volume, at a lower cost per assay. The same type of imaging sensor was used in both systems, giving each a field of view of 6.14 mm × 4.604 mm, and the same software and computer were used for both. The small pixel size (1.25 μm) allows digital magnification of small objects such as microbeads or cells. A novel strategy was presented to enhance the field of view of a lens-based system, achieved with a compound lens design whose explicit aim was to match the field of view of the basic lensless setup; this enhancement of the lens-based setup is the novelty of this work. The performance of the two optical systems was then compared. The lens-based system exhibits better definition. The lensless system, utilizing a near-monochromatic plane wave, shows better depth of field and a degree of magnification due to projection of the image into the far-field, but less definition. Reverse diffraction software can improve the image recorded with the lensless setup; more advanced holographic and multiple-picture enhancement techniques could also be used, but at increased complexity and cost, often not suitable for a point-of-care location.
The lensless system, due to the effective magnification of the image, illuminates a larger number of pixels that are not well focused but are detectable. As semiconductor technology shrinks pixel size and increases pixel count, the definition and image size of image sensors will improve, further enhancing these systems and enabling improved, cost-effective point-of-care assays.
Supplementary Material
FIGURE 8:

Image on the left side is taken with a lensless setup while the image on the right side is captured using a lens-based setup (1X).
Highlights.
A lensless biological imaging system is developed to enable wide field of view imaging.
A new compound lens-based system is designed and developed to have exactly the same wide field of view as a basic lensless setup.
The characteristics of the two optical imaging setups (lensless and lens-based) are then compared at this level of complexity to determine the minimal system principles needed to achieve the biological imaging goals for simplified and less expensive future designs. An exhaustive comparison of the performance characteristics of both systems is carried out using optical standards and biological images.
ACKNOWLEDGMENT
We are thankful to Professor Rhodes for help on near and far-field theory, Professor Marques for computer analysis of images, Benjamin Coleman for technical discussions, and Charles Perry Wienthal for allowing us to use CEECS lab facilities to fabricate various 3D components.
Footnotes
Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.
Conflict of Interest:
Authors declare no conflict of interest.
REFERENCES
- [1]. Li W, Knoll T, and Thielecke H, “On-chip integrated lensless microscopy module for optical monitoring of adherent growing mammalian cells,” in 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology, 2010, pp. 1012–1015.
- [2]. Wu Y and Ozcan A, “Lensless digital holographic microscopy and its applications in biomedicine and environmental monitoring,” Methods, vol. 136, pp. 4–16, 2018.
- [3]. Ozcan A and McLeod E, “Lensless imaging and sensing,” Annual Review of Biomedical Engineering, vol. 18, pp. 77–102, 2016.
- [4]. Daloglu MU, Ray A, Gorocs Z, Xiong M, Malik R, Bitan G, et al., “Computational on-chip imaging of nanoparticles and biomolecules using ultraviolet light,” Scientific Reports, vol. 7, pp. 1–12, 2017.
- [5]. Fennel R and Asghar W, “Image sensor road map and solid-state imaging devices,” NanoWorld, vol. 1, pp. 10–14, 2017.
- [6]. Werley CA, Nagle SF, Ferrante JM, and Wasserman SC, “An ultra-widefield microscope for high-speed, all-optical electrophysiology,” in Optics and the Brain, 2017, p. BrM4B.3.
- [7]. Young M, “The pinhole camera: Imaging without lenses or mirrors,” The Physics Teacher, vol. 27, pp. 648–655, 1989.
- [8]. Coarsey C, Coleman B, Kabir MA, Sher M, and Asghar W, “Development of a flow-free magnetic actuation platform for an automated microfluidic ELISA,” RSC Advances, vol. 9, pp. 8159–8168, 2019.
- [9]. Rappa K, Samargia J, Sher M, Pino JS, Rodriguez HF, and Asghar W, “Quantitative analysis of sperm rheotaxis using a microfluidic device,” Microfluidics and Nanofluidics, vol. 22, p. 100, 2018.
- [10]. Sher M and Asghar W, “Development of a multiplex fully automated assay for rapid quantification of CD4+ T cells from whole blood,” Biosensors and Bioelectronics, vol. 142, p. 111490, 2019.
- [11]. Kabir MA, Zilouchian H, Sher M, and Asghar W, “Development of a Flow-Free Automated Colorimetric Detection Assay Integrated with Smartphone for Zika NS1,” Diagnostics, vol. 10, p. 42, 2020.
- [12]. Sher M, Zhuang R, Demirci U, and Asghar W, “Paper-based analytical devices for clinical diagnosis: recent advances in the fabrication techniques and sensing mechanisms,” Expert Review of Molecular Diagnostics, vol. 17, pp. 351–366, 2017.
- [13]. Ilyas S, Simonson AE, and Asghar W, “Emerging Point-of-Care Technologies for Sickle Cell Disease Diagnostics,” Clinica Chimica Acta, 2019.
- [14]. Iqbal SMA and Butt NZ, “Design and analysis of microfluidic cell counter using spice simulation,” SN Applied Sciences, vol. 1, p. 1290, 2019.
- [15]. Ilyas S, Sher M, Du E, and Asghar W, “Smartphone-based Sickle Cell Disease Detection and Monitoring for Point-of-Care Settings,” Biosensors and Bioelectronics, vol. 165, 2020.
- [16]. Asghar W, Sher M, Khan NS, Vyas JM, and Demirci U, “Microfluidic chip for detection of fungal infections,” ACS Omega, vol. 4, pp. 7474–7481, 2019.
- [17]. Hecht E, Optics, 5e: Pearson Education India, 2002.
- [18]. IDS Imaging, “UI-3592LE Rev. 2,” accessed April 5, 2020. Available: https://en.ids-imaging.com/store/ui-3592le-rev-2.html
- [19]. Navitar Optical Wizard. Available: https://www.opticalwizard.com/
- [20]. Bishara W, Sikora U, Mudanyali O, Su T-W, Yaglidere O, Luckhart S, et al., “Holographic pixel super-resolution in portable lensless on-chip microscopy using a fiber-optic array,” Lab on a Chip, vol. 11, pp. 1276–1279, 2011.
- [21]. Inan H, Kingsley JL, Ozen MO, Tekin HC, Hoerner CR, Imae Y, et al., “Monitoring Neutropenia for Cancer Patients at the Point of Care,” Small Methods, vol. 1, p. 1700193, 2017.