Abstract
Cameras are a crucial part of microscopes and are also built into many kinds of instruments. Making their output comparable takes standards.
The academics and company scientists in the group Quality Assessment and Reproducibility for Instruments & Images in Light Microscopy (QUAREP-LiMi) are developing standards for microscopy camera output.
As in other areas of standards development, working with companies is crucial; “after all they are the expert of the hardware they are producing,” says Caterina Strambio-de-Castillia, a researcher at the University of Massachusetts Medical School’s Program in Molecular Medicine and a Chan Zuckerberg Imaging Scientist, who spearheads this effort within QUAREP-LiMi. A separate story in this issue of Nature Methods covers emerging standards in microscopy.
Part of the work in developing standards for cameras in microscopy and imaging is about creating common definitions as a public resource. “The QUAREP-ers are moving on all that quite well,” says Jason Swedlow of the University of Dundee, who is a co-founder of the Open Microscopy Environment. It “would be great,” he says, to have definitions as public resources for the detector and the chip, the gain, offset, read noise, readout speed, analog-to-digital conversion and full well capacity, which refers to the charge an individual pixel can hold before it is saturated.
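One way such shared definitions could eventually surface is as machine-readable metadata attached to every image. Below is a purely hypothetical sketch of such a record; the field names and values are invented for illustration and are not drawn from any published QUAREP-LiMi or OME specification.

```python
# Hypothetical, illustrative camera-metadata record; field names and values
# are assumptions for demonstration, not an agreed QUAREP-LiMi/OME schema.
camera_metadata = {
    "detector_model": "ExampleCam-X",   # placeholder detector name
    "sensor_type": "sCMOS",             # chip technology
    "gain_e_per_adu": 0.46,             # conversion gain, electrons per analog-to-digital unit
    "offset_adu": 100,                  # baseline offset added to every pixel value
    "read_noise_e_rms": 1.6,            # read noise, electrons RMS
    "readout_speed_mhz": 272,           # readout speed
    "adc_bit_depth": 16,                # analog-to-digital conversion depth
    "full_well_capacity_e": 30000,      # charge a pixel can hold before it saturates
}
```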
Quantitative, standardized output
They make “precision scientific instruments,” which is why taking part in QUAREP-LiMi matters to her company, says Stephanie Fullerton, life science marketing manager at Hamamatsu, which manufactures cameras, light sources, optical sensors and other optical instruments. Her Hamamatsu colleague Sebastian Beer takes part in QUAREP. What motivates them is helping to make sure customers can get reproducible and reliable data.
Given that images are now thought of “as quantitative data rather than qualitative pictures,” it’s a necessity to categorize and capture the details of the entire imaging system, says Fullerton. “It’s just good scientific practice.”
Such metadata are an important aspect of experiments. But as Strambio-de-Castillia and others have pointed out, ways to record metadata well have not kept pace with technical advances. She and colleagues presented tiered specifications to extend the Open Microscopy Environment data model to include the needs of the 4D Nucleome Initiative Imaging Standards Working Group, BioImaging North America (BINA) and QUAREP-LiMi.
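The tiered approach can be pictured as metadata records that add detail from one tier to the next. The sketch below is purely illustrative; the field names and tier contents are invented and do not reproduce the published 4DN–BINA–OME specifications.

```python
# Purely hypothetical illustration of tiered metadata: each tier adds detail.
# Field names are invented and do not reproduce the 4DN-BINA-OME specifications.
tiered_metadata = {
    "tier_1": {                          # minimal description of the acquisition
        "microscope_manufacturer": "ExampleCo",
        "objective_magnification": 60,
        "detector_model": "ExampleCam-X",
    },
    "tier_2": {                          # adds quantitative detector settings
        "exposure_time_ms": 100,
        "gain_e_per_adu": 0.46,
        "pixel_pitch_um": 6.5,
    },
    "tier_3": {                          # adds calibration and quality-control results
        "illumination_power_mw": 2.1,
        "flat_field_correction_applied": True,
    },
}
```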
Imaging systems, with their many components and specifications, make it “daunting to conceptualize a way to organize all this metadata,” says Fullerton. The proposal’s tiered structure is an approachable way to get started. As with any new process, making it easy and intuitive for researchers to use will help with its acceptance. “I suspect there will be unexpected glitches along the way that will require iterative revisions but that is the nature of research anyway,” she says. “It’s important that we get started.”
At the company Phi Optics, originally a spinout of the University of Illinois’s Beckman Institute for Advanced Science and Technology that he co-founded, Catalin Chiritescu wears many hats. He reaches out to users and is involved with standards organizations. The company sells platforms for quantitative phase imaging, an imaging modality like fluorescence or brightfield. Phi Optics modules can be added to microscopes, including those made by any of the ‘Big Four’, as the companies Leica, Nikon, Olympus and Zeiss are sometimes called.
Data from the company’s module include the 3D shape of a specimen, refractive index, dry mass and other quantitative measures, which all need to be reproducible. A live specimen such as a bacterial culture can have different orientations, which change the readout. He recommends making a baseline measurement with standardized samples such as plastic or glass beads of known dimensions and properties.
Software is increasingly important in microscopy, says Chiritescu. He and his colleagues integrate the Phi Optics module with individual microscope types. “That’s where the magic behind the curtain works,” he says. He and his team adjust the Phi Optics module to work in tandem with the microscope’s electronics and optics so that the phase-imaging module delivers consistent measurements. All of today’s biology, chemistry, physics and materials science focuses on generating reproducible measurements, says Chiritescu. Standards, he says, help to weed out science that lacks proper follow-through with the scientific method.
“You can’t really punish the users for being curious,” says Catalin Chiritescu.
Sometimes, users modify instruments, which Phi Optics sees as a “two-way street,” he says. “You can’t really punish the users for being curious.” The modifications can lead to instrument improvements, and the company gladly offers to help a lab make modifications.
Standards organizations such as QUAREP are useful to Phi Optics, he says, to interact with researchers and learn about their needs and those of imaging facility managers. Those needs shape what Chiritescu and others in the industry do. “This is where they start,” he says. Standards and procedures developed in academia often later become established in the commercial world.
QUAREP and others are right to present companies with “a big ask,” he says. But as in any human endeavor, academics might not get all they want; some things might not be practicable, or there may be proprietary aspects to consider.
Committed to open file formats
From its founding, Phi Optics committed itself to open file formats. Data from its instruments can be opened with ImageJ or with Fiji tools, or any software that can handle TIFF files, says Chiritescu. The files contain the image properties and the associated metadata, such as date and time of image acquisition, software used for data acquisition, type of microscope to which the Phi Optics module is attached, magnification, sample temperature and other details. “The camera itself, it’s a microcosm of metadata in itself,” he says.
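Because the files are standard TIFFs, any general-purpose reader can expose the embedded tags. The following is a minimal sketch using the Python tifffile library; the file name is a placeholder, and which tags a given Phi Optics file actually carries is not specified here.

```python
# Minimal sketch: inspect TIFF tags and any ImageJ-style metadata in an image file.
# "example_qpi_stack.tif" is a placeholder path, not an actual Phi Optics file.
import tifffile

with tifffile.TiffFile("example_qpi_stack.tif") as tif:
    page = tif.pages[0]
    for tag in page.tags.values():      # standard TIFF tags: dimensions, datetime, software, ...
        print(tag.name, "=", tag.value)
    if tif.imagej_metadata:             # extra key-value metadata, if written by ImageJ/Fiji-style tools
        print(tif.imagej_metadata)
```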
Scientists want to document everything about the camera output that a user can modify, such as the exposure time, which sets how long the sensor collects light, and anything else that another lab needs to reproduce the work, he says. Slightly different conditions, such as an exposure of one versus two seconds, might lead to different observations. Labs want to be sure that what’s causing the shift is biology: are the bacteria changing their state, or is a different result due to a different fluorescence filter? As a scientist, he understands that academics need as much metadata as possible. As an industry person, he finds that “there’s only so much metadata that can be crammed into a file” before the effort to do so delivers diminishing returns.
Working on standards means getting precise definitions across camera types. He recalls one discussion about ‘camera gain’, which might seem straightforward. Yet, though he can explain what that means from the Phi Optics perspective, another company might define it differently. Talking such matters through in groups such as QUAREP is a way to agree on the most accurate way to describe a parameter. And camera gain may just be “one single line item in the pile of metadata,” he says.
Reaching agreement on standards involves syncing these definitions, so it’s in everyone’s interest to take part in such discussions. To collaborate and achieve reproducible science, scientists want to compare data across modalities and across instruments from different manufacturers. “We as industry would like to support that effort,” says Chiritescu.
“The camera itself, it’s a microcosm of metadata in itself,” says Catalin Chiritescu.
Cameras: CCD, CMOS, sCMOS
At PCO, Gerhard Holst has long directed the research department, where he also ran grant-funded research projects through which the company explored new approaches for its cameras. Those include complementary metal-oxide semiconductor (CMOS)-based cameras, charge-coupled device (CCD)-based cameras and high-speed cameras. In 2021, Excelitas Technologies acquired PCO. Excelitas works in photonics and sells detectors, light sources and optics supplies for the life sciences, materials sciences and avionics.
PCO’s cameras can be used on microscopes, and some cameras are built into instruments, such as a PCR instrument from a manufacturer he prefers not to name. The manufacturer approached PCO with a need for a camera with low drift, the term for when a camera’s image slowly shifts, or pans, in one direction or another. Demand for the product was expected to be a niche market. The instrument was launched, says Holst, right as a swine flu outbreak emerged that made PCR-based virus identification critical. Other public health emergencies, such as COVID-19, have taken PCR instruments far beyond a niche market, he says.
To take good measurements, a scientific camera needs low noise and a wide dynamic range, says Holst. The majority of scientific cameras are monochrome. Some labs use smartphones for their work and, in his view, “it’s nice for the effect or proof of principle, but not for measuring.” Those cameras have small pixels and tend to rely heavily on software to make up for the limited light capture, but that image processing gives scientists too little control, he says.
Developing standards means collaborating against the backdrop of competitive contexts. In 2010, says Holst, the camera world changed when an advance in transistor design—scientific complementary metal-oxide semiconductor (sCMOS) technology—led to a new type of camera. Three companies—Fairchild Imaging, later bought by BAE Systems; Andor Technology, now part of Oxford Instruments; and PCO, now part of Excelitas—developed and presented such cameras. They had different options for pixel structure, which the companies discussed with one another. Each chose a slightly different tack and began selling sCMOS cameras.
A bit later, Hamamatsu released an sCMOS camera with, to Holst’s knowledge, a different sensor. And, to his knowledge, Fairchild later sold some of its technology to Hamamatsu. Some back and forth between the companies ensued, he says, and in his recollection, Andor and PCO were given access to the image sensor from the Fairchild–Hamamatsu deal. Speaking more generally, he says, the interest in scientific CMOS-sensor-based cameras came because they have lower noise than CCD cameras and are faster.
Hamamatsu’s Fullerton says she cannot speak in detail about the company’s partners for its sCMOS technology. Overall, she says, in the past few years, advances have led to lower CMOS noise and to uniformity across the sensor. The company’s latest camera, the ORCA-Quest, “has low enough read noise to permit resolving of individual photons.”
Standards have long been part of the scientific camera world. According to the CMOS Data Book, a 1977 Fairchild Camera and Instrument Corporation publication, CMOS technology was introduced in the early 1970s, and “as each new generation of designs was developed, a large variety of functional and performance parameters were generated by the industry creating a great deal of customer confusion.”
In late 1976, the authors continue, “the CMOS vendor community accepted the formidable task of clearing this confusion via industry-wide standardization.” This led to the JEDEC Industry Standard ‘B’ Series CMOS specification. JEDEC is a microelectronics organization with more than 300 company members that connects suppliers and manufacturers to develop standards.
JEDEC grew out of the Joint Electron Tube Engineering Council, established in 1944. It later began to include solid-state devices. In 1958, it was renamed the Joint Electron Device Engineering Council (JEDEC), with a council for tubes and one for semiconductors. Many standards in the electronics industry, such as those for RAM chips, have been worked out by JEDEC.
Compare and contrast
“In the end, it’s always the application that defines what the best camera is,” says Holst. A scientist wants to trust the right tool for the questions he or she has. PCO has, for example, supported the standard for machine vision cameras. Machine vision describes cameras used, for instance, in manufacturing for quality control. These are usually non-cooled, smaller cameras, which PCO also sells, says Holst.
The first module of this machine vision standard took time to develop and was launched in 2005. There have been new releases since then. Development was and is led by the European Machine Vision Association (EMVA), a nonprofit that brings together the machine vision industry in Europe.
One EMVA standard is EMVA 1288, for the specification and measurement of machine vision sensors and cameras. It defines “reliable and exact measurement procedures.” Although this standard originated with the EMVA, which is not specifically focused on science applications, EMVA 1288 can be and is used for scientific cameras, says Holst. The standard defines how specs should be measured and presented but not which values have to be achieved. “The specs to be achieved are defined solely by the application,” he says. Another EMVA standard, GenICam, is for programming interfaces for cameras and devices.
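As an illustration of the kind of procedure such a standard pins down, the photon-transfer idea behind a gain measurement relates the temporal noise of flat-field images to their mean signal. Below is a rough sketch under simplifying assumptions, using paired flat-field frames at several exposures and two dark frames as NumPy arrays; the full EMVA 1288 procedure specifies much more than this.

```python
# Rough sketch of a photon-transfer-style gain estimate, in the spirit of EMVA 1288.
# Assumes flat_pairs is a list of (frame_a, frame_b) NumPy arrays taken at different
# exposure levels, plus two dark frames; not the full standard procedure.
import numpy as np

def photon_transfer_gain(flat_pairs, dark_a, dark_b):
    dark_mean = 0.5 * (dark_a.mean() + dark_b.mean())
    means, variances = [], []
    for frame_a, frame_b in flat_pairs:
        means.append(0.5 * (frame_a.mean() + frame_b.mean()) - dark_mean)
        # Temporal variance from a frame difference, which cancels fixed-pattern noise.
        variances.append(0.5 * np.var(frame_a.astype(float) - frame_b.astype(float)))
    # System gain K in DN per electron is the slope of variance vs. mean signal.
    k_dn_per_e = np.polyfit(means, variances, 1)[0]
    return 1.0 / k_dn_per_e   # conversion gain in electrons per DN (ADU)
```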
Hamamatsu follows EMVA 1288 to measure camera specs, says Fullerton, and “this has been especially useful for clarifying how we arrive at more subtle specs such as linearity.” Such calibration measurements are often not feasible for researchers to replicate in practice, “and in fact, we’ve seen noise introduced into analyses done with improper calibrations.” By following EMVA standards, she and her colleagues hope “we build trust with our users so they know they can rely on our cameras to be quantitative.”
Within standards organizations, PCO’s Holst has experienced intense discussion about how quality parameters of image sensors and cameras should be measured and presented. He also takes part in the discussions related to microscopy, including QUAREP.
Holst knows how much it matters to labs to describe the parameters of a microscope setup, and that what is captured has to be stored so that experiments can be repeated and the data can be trusted. When he joined QUAREP, he thought his experience in describing camera quality parameters would come in handy, “and they don’t need to invent the wheel a second time,” he says. He is happy to take part in the discussions for that reason and also to learn about customer needs and wants.
Some researchers develop new tools and are keen on all of a camera’s parameters, and want it made to exactly match their requirements, say, in terms of speed, noise or linearity. Others, he says, buy an instrument, such as from Horiba or Perkin Elmer, with a built-in camera, and those scientists are less preoccupied with the camera’s parameters.
When a PCO camera is built into a microscope, such as for total internal reflection microscopy, super-resolution or light-sheet microscopy, Holst and his colleagues are involved with the manufacturer.
For example, Zeiss’s Z7 light-sheet microscope and the Elyra Lattice SIM system use PCO cameras. Microscopes from other manufacturers use PCO cameras, too, but Holst says he is contractually not permitted to name them. PCO also helps labs build tailored systems. Overall, when it comes to standards, “for us, it’s not a big deal,” he says. “We are doing 1288 measurements for all of our cameras” to ensure the cameras meet the EMVA 1288 specifications. PCO gives these data to customers.
“In the end, it’s always the application that defines what the best camera is,” says Gerhard Holst.
Given the large amount of data that microscope experiments produce, he sees some companies developing data-compression algorithms. He works with these companies to make sure the readout from the PCO camera remains intact. In some cases, the algorithms are integrated into PCO firmware. How the information that the camera captures is transferred within a microscope system is not something PCO controls.
Software packages from microscope manufacturers have become huge, says Holst. Overall, the captured metadata have increased. “For them, the large amount of metadata is more work,” he says.
Some software packages look like they were “developed by biophysicists for biophysicists,” he says. He sympathizes with a desire for control over many software parameters. “As an engineer, I would like to control everything,” he says. But during his seven years at the Max Planck Institute of Marine Microbiology, where he developed optochemical microsensors and imaging systems, he learned that not every user needs to turn every knob. It would have sent many of his biology and geology colleagues “running out of the lab crying” if he had developed large, complex software packages with too many controls.
Scientists in core facilities want systems to perform reliably every day and need different kinds of cameras, perhaps a cooled camera or a non-cooled one, one with larger or smaller pixels, or they may have an application in which slightly higher readout noise is acceptable, says Holst.
Some technical details are worth keeping in mind for working scientists. Different voltages to a camera sensor will change the output signal in ways that might, to some, be surprising, says Holst. Comparing cameras thus takes careful thought. A small pixel size might seem to deliver less sensitivity, but “it’s simply geometry,” he says. Scientists can fall into a trap when they compare cameras in terms of sensitivity but only look at the number of pixels and omit pixel pitch, which is the size of the pixel itself. With more pixels, each pixel gets less signal, says Holst. Thus, comparison involves taking into account the number of photons that reach each pixel.
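The underlying arithmetic is simple: at the same photon density on the sensor, the photons collected per pixel scale with pixel area, that is, with the square of the pixel pitch. A toy comparison, with made-up numbers:

```python
# Toy comparison: photons per pixel at equal photon density, for two pixel pitches.
# The photon density and pitch values are made up for illustration.
photon_density = 50.0          # photons per square micrometer reaching the sensor
for pitch_um in (4.25, 6.5):   # two example pixel pitches in micrometers
    photons_per_pixel = photon_density * pitch_um ** 2
    print(f"{pitch_um} um pixel: ~{photons_per_pixel:.0f} photons per pixel")
```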
Sometimes he has to console scientists whose data seem to have gone entirely missing. They forget, he says, about dynamic range, which can lead a 16-bit image rendered on an 8-bit screen to appear entirely black.
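The effect is easy to reproduce: mapping a dim 16-bit image naively onto an 8-bit display range pushes every pixel toward zero, while rescaling to the data’s own minimum and maximum brings the image back. A minimal sketch with synthetic data:

```python
# Minimal sketch: why a 16-bit image can look black on an 8-bit display.
import numpy as np

img16 = np.random.poisson(1200, size=(256, 256)).astype(np.uint16)  # synthetic dim 16-bit image

naive_8bit = (img16 / 65535 * 255).astype(np.uint8)        # maps ~1200 counts to ~4: nearly black
lo, hi = img16.min(), img16.max()
rescaled_8bit = ((img16 - lo) / (hi - lo) * 255).astype(np.uint8)  # stretch to the data's own range

print(naive_8bit.max(), rescaled_8bit.max())  # e.g. ~4 vs 255
```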
Standards need clear definitions, says Fullerton. Specs such as pixel number and size are straightforward. Pixel gain, however, is an example of a critically important spec with a variety of definitions and is often a source of confusion, she says. Hamamatsu reports gain in units of electrons per analog-to-digital unit, and this value enables gray level intensity in a pixel to be converted to a non-arbitrary unit of photoelectrons. In her view, defining intensity in photoelectrons allows meaningful comparisons between images taken from different cameras. And, with knowledge of the quantum efficiency at the wavelength of interest, one can approximate pixel intensity in photons. She and her colleagues share with customers the, as she says, “importance and value of ‘thinking in photons’.”
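The conversion she describes amounts to a couple of lines of arithmetic. A sketch with illustrative numbers; the offset, gain and quantum efficiency values here are placeholders, not specs of any particular camera:

```python
# Sketch of converting a pixel's gray level (ADU) to photoelectrons and then photons.
# Offset, gain and quantum efficiency values are illustrative, not vendor specs.
gray_level_adu = 1500      # measured pixel intensity in analog-to-digital units
offset_adu = 100           # camera baseline offset
gain_e_per_adu = 0.46      # conversion gain, electrons per ADU
qe_at_wavelength = 0.8     # quantum efficiency at the wavelength of interest

photoelectrons = (gray_level_adu - offset_adu) * gain_e_per_adu
photons = photoelectrons / qe_at_wavelength   # approximate incident photons
print(f"{photoelectrons:.0f} e-, ~{photons:.0f} photons")
```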
Standards need clear definitions, says Stephanie Fullerton.
Each Hamamatsu sCMOS and qCMOS camera reaches a scientist with a quality-control document that provides the factory-measured gain value for that camera. “However, not every camera manufacturer does this and there are other types of gain, such as analog and EMCCD gain, that confound the issue,” she says, referring to electron-multiplying CCD gain.
Another example in defining specs is read noise, she says. For CCDs, read noise was always indicated as “electrons RMS,” or root mean square. With the advent of CMOS cameras, some manufacturers chose to specify read noise as the “median.” The median is generally a lower number than the RMS, which can lead to false expectations.
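The gap between the two summaries is easy to see on a synthetic per-pixel read-noise map with a skewed tail of noisier pixels, which pulls up the RMS but not the median; the distribution below is invented purely for illustration:

```python
# Small synthetic illustration: median vs. RMS read noise over a pixel noise map.
import numpy as np

rng = np.random.default_rng(0)
# Most pixels read out quietly, but a skewed tail of noisier pixels raises the RMS.
noise_map_e = rng.lognormal(mean=0.3, sigma=0.5, size=100_000)

median_noise = np.median(noise_map_e)
rms_noise = np.sqrt(np.mean(noise_map_e ** 2))
print(f"median: {median_noise:.2f} e-, RMS: {rms_noise:.2f} e-")  # median < RMS
```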
Comparing different cameras involves subtleties beyond data sheets. Even live, head-to-head demos can lead to erroneous interpretations without an understanding of sensors, says Fullerton. Echoing Holst, she says pixel size is an often-overlooked issue in cameras. Bigger pixels will collect more photons in a given time. By adjusting the optics, scientists can let the same photon density fall on pixels of different sizes. Or they can address differences due to pixel size and optics in post-processing. “We’d rather see optical methods,” she says. Otherwise “a camera with a larger pixel can appear to be more sensitive when all you’ve really got is a bigger ‘photon’ bucket,” she says.
Holst says that with the QUAREP camera standard in the works, “my hope is that we can help to make it a good standard.” It can be one that pertains to all the needed parameters, such as photon count and resolution, readout noise and pixel size, that are stored on the camera and more generally on the detection side of the microscope. Holst knows his colleagues at Hamamatsu, Andor and Photometrics and says relations between colleagues in the industry are cordial. He is happy that organizations such as QUAREP and others have set up a way for manufacturers and academics to work together.
Holst enjoys the QUAREP discussions and, he says, Strambio-de-Castillia does a “tremendous job” in the camera standards working group. “It’s a huge task,” he says, but a good one. All microscope users, he says, need ways to draw the best conclusions from their data without being a camera expert.