Author manuscript; available in PMC: 2011 Aug 14.
Published in final edited form as: Cold Spring Harb Protoc. 2009 Dec;2009(12):pdb.top65. doi: 10.1101/pdb.top65

Computational image analysis of cellular dynamics: A case study based on particle tracking

Khuloud Jaqaman 1, Gaudenz Danuser 1
PMCID: PMC3155779  NIHMSID: NIHMS307062  PMID: 20150102

Introduction

Obtaining quantitative data from live cell images is the key to testing mechanistic hypotheses of molecular and cellular processes. The importance of using computer vision-based methods to accomplish this task is well recognized (Eils and Athale 2003; Swedlow et al. 2003). However, in practice, investigators often encounter obstacles that render the application of computational image processing in cell biology far from routine: First, it is not always clear which measurements are necessary to characterize a molecular system, and whether these measurements are sufficient to characterize the cellular process investigated. Second, even if the requirements for measurements are well-defined, it is often difficult to find a software tool to extract these data. It is even more challenging to find software tools that can answer specific questions that are raised by the hypotheses underlying the experiments.

One solution is for investigators to develop their own software tools. This is feasible for some applications with the assistance of commercial and open source software packages that support the assembly and integration of custom-designed algorithms, even for users with limited computational expertise. Another solution is for investigators to develop interdisciplinary collaborations with computer scientists. Such collaborations require close interaction between the computer scientists and experimental biologists to jointly optimize the data acquisition and analysis procedures, which must be tightly coupled in any project applying computational analysis to biological image data.

This chapter aims to introduce basic concepts that make the application of computational image processing to live cell image data successful. While the concepts are general, examples will be taken from the case study of particle tracking (PT), one of the most frequently encountered problems in cell biology. For a broader discussion of computer vision in live cell imaging, we refer to (Dorn et al. 2008).

Why use computational image analysis?

Efficiency

Efficient extraction of quantitative measurements is a major motivation for the use of computational image analysis, especially in the context of screens. With the development of microscopes for live cell genome-wide screens (Smith and Eisenstein 2005; Bakal et al. 2007), it is possible to acquire vast amounts of data in ever shorter times. For example, even at low spatiotemporal sampling, a live cell siRNA screen of 49 mitotic genes generated over 100 GB of image data (Neumann et al. 2006). Such quantities of movies make data management challenging and manual data analysis unrealistic. Instead, these types of experiments require computational image analysis to extract image features for the classification of cell behavior in response to perturbations. For screens, robustness is vital. Thus, mostly simple algorithms that produce meaningful features without the need for manual validation of image analysis results have been applied (Abraham et al. 2004). Alternatively, robustness has been achieved by manually training the computer to recognize a small number of phenotypes (Conrad et al. 2004; Chen et al. 2006).

Consistency

Computational image analysis yields consistent data, i.e. different experiments are processed based on the same parameter settings and criteria for the validation of measurements. This eliminates uncertainty associated with subjective interpretations of image contents among investigators and even by one investigator in different instances. Furthermore, computational image analysis permits the quantification of measurement uncertainty that originates from noise in the raw imagery. High consistency and known uncertainty are particularly useful when the study of a certain cell function demands distinction between weak yet significant phenotypes (Dorn et al. 2005).

Completeness

Computational image analysis yields complete data, i.e. every image event that fulfills an objective set of criteria is considered. Humans have a tendency – by nature or necessity – to concentrate on the apparently interesting events. This may bias the analysis and may increase the risk of overlooking rare events associated with weaker phenotypes. In contrast, complete image measurements permit the statistical selection of obvious and less obvious events, including highly transient events. Image transients are particularly relevant for establishing functional linkages between the dominant image events.

Case study: Particle tracking (PT)

Live-cell images often consist of large numbers of punctate features (“particles”) representing single fluorophores tagging single molecules (Sako et al. 2000; Fujiwara et al. 2002; Groc et al. 2004), fluorophore clusters associated with sub-resolution molecular assemblies (Zenisek et al. 2000; Ewers et al. 2005; Danuser and Waterman-Storer 2006), or fluorophore blobs associated with vesicles or more extended organelles (Ehrlich et al. 2004; Tirnauer et al. 2004). To capture the full spatio-temporal complexity of sub-cellular particle dynamics and to link them to the underlying molecular processes, data must be extracted from live-cell images using automated PT techniques.

PT consists of two major steps: (1) particle detection in each frame of the time-lapse sequence, and (2) particle trajectory construction across the time-lapse sequence (Fig. 1). While in some frameworks particle detection and trajectory construction are coupled and feed back into each other (Ponti et al. 2005; Racine et al. 2007), in most computational analysis frameworks the information flow is one way from detection to trajectory construction. In either case, trajectory construction can be assisted by particle motion models that predict the particle positions in a frame based on the positions in the past and thus reduce the ambiguity of establishing particle correspondences between frames (Fig. 1). Furthermore, PT must generally include a trajectory diagnosis module that assesses the quality of the tracking results and through which the tracking parameters, as well as the motion modeling parameters, can be optimized (Fig. 1). Below, we discuss the detection, trajectory construction and motion modeling modules. In the following two sections, we discuss the design of experiments that yield image data optimized for automated image analysis, and the diagnosis of image analysis results to assess tracking quality and adjust analysis parameters.

Figure 1.


PT builds on essential steps (detection and trajectory construction) and optional but recommended steps (motion modeling and trajectory diagnosis). Image acquisition and analysis are tightly coupled.

Detection

The goal of particle detection is to obtain numerical representations of the location and properties of image features (Starck et al. 2000; Nixon and Aguado 2002). Image features are local intensity maxima whose intensity level is significantly different from that of their neighborhood. Consequently, particle detection techniques must define a meaning of “neighborhood” for the computation of a representative background intensity distribution, and a meaning of “significantly different”. The most rigorous approach is to cast the comparison of foreground to background intensity as a statistical test.

Sub-resolution features above a dark background, as encountered in single molecule imaging, can be detected by comparing the intensity of local maxima to the local background intensity distribution (Jaqaman et al. 2008). For low signal-to-noise (SNR) time-lapse sequences (SNR < 3, where SNR is defined as the ratio of signal above mean local background to local background variation), image time-averaging can be used to enhance detection efficiency (Jaqaman et al. 2008). If features are sub-resolution but lie above a sea of fluorescence, such as speckles marking dense macromolecular assemblies, more sophisticated algorithms that compare local intensity maxima to their neighboring local intensity minima, given a pre-calibrated model of camera noise, must be employed (Ponti et al. 2003).
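The statistical comparison underlying such detectors can be illustrated with a minimal sketch; the function name and the fixed threshold of k background standard deviations are our own simplifications, not the exact criteria of the cited implementations:

```python
from statistics import mean, stdev

def significant_maximum(peak, background, k=3.0):
    """Accept a local intensity maximum as a particle if it exceeds the
    local background mean by more than k background standard deviations
    (a simple one-sided significance test)."""
    mu, sigma = mean(background), stdev(background)
    return peak > mu + k * sigma

# Intensities sampled from the neighborhood of a candidate maximum
bg = [100, 104, 97, 102, 99, 101, 103, 98, 100, 96]
print(significant_maximum(180, bg))  # bright particle -> True
print(significant_maximum(103, bg))  # within background fluctuation -> False
```

In the cited frameworks the threshold is tied to a confidence level and, for speckle imaging, to a calibrated camera noise model rather than a fixed k.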

After the detection of significant local maxima, the sub-pixel positions and peak intensities of particles can be estimated via point spread function (PSF) fitting (Thomann et al. 2002; Yildiz and Selvin 2005; Jaqaman et al. 2008). For particles in isolation, the achieved positional precision depends only on the SNR; single nanometer precision can be achieved if sufficient photons are collected (Yildiz and Selvin 2005). For particles not in isolation, iterative PSF fitting can be used to obtain unbiased position estimates and at the same time enhance resolution in detecting closely juxtaposed particles (Thomann et al. 2002; Dorn et al. 2005; Jaqaman et al. 2008). Based on simulations and indirect experimental evidence, iterative PSF fitting was found to overcome the diffraction-limited resolution of a microscope by a factor of 2–3 (Thomann et al. 2002). Thus, distances of 100 nm can be measured without the use of super-resolution imaging (Bates et al. 2007; Shroff et al. 2007). The methods described here for the detection and localization of sub-resolution features are readily applicable in both two and three dimensions.
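As an illustration of sub-pixel localization, the following sketch refines a peak position by fitting a parabola to the log-intensities around the maximum, which is exact for a noiseless Gaussian approximation of the PSF; the cited methods fit the full 2D/3D PSF model and are far more robust to noise:

```python
import math

def subpixel_peak(profile, i):
    """Refine a local maximum at integer index i by fitting a parabola
    through the log-intensities at i-1, i, i+1; for a Gaussian profile
    the log-intensity is exactly parabolic, so the fit recovers the
    sub-pixel center."""
    la, lb, lc = (math.log(profile[j]) for j in (i - 1, i, i + 1))
    return i + 0.5 * (la - lc) / (la - 2 * lb + lc)

# Gaussian "PSF" centered at x = 5.3 (sigma = 1.2), sampled at integer pixels
x0, sigma = 5.3, 1.2
profile = [math.exp(-0.5 * ((x - x0) / sigma) ** 2) for x in range(11)]
i = max(range(1, 10), key=lambda j: profile[j])
print(round(subpixel_peak(profile, i), 3))  # -> 5.3
```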

PSF fitting cannot be applied to the detection of particles representing objects larger than the diffraction limit, especially if their size varies. For particles with variable size that are still relatively isotropic, wavelet-based algorithms can be applied (Olivo-Marin 2002). For particles with additional shape variations, (Tvarusko et al. 1999) employed an edge detection-based algorithm to find particle contours. While the only properties of sub-resolution features are position and intensity, larger particles can also be described by their size and shape. These additional characteristics provide invaluable information for supporting the construction of particle trajectories. Note that the detection of anisotropic larger image features in three dimensions is a very difficult problem with currently no general solution.

Trajectory construction

Arguably, the key step of PT is the establishment of the correspondence between particle images in a sequence of frames in order to construct particle trajectories throughout the time-lapse sequence. Establishing correspondence is complicated by various factors, most notably high particle density, particle motion heterogeneity, temporary particle disappearance (e.g. due to out-of-focus motion and detection failure), particle merging (i.e. two particles approaching each other within distances below the resolution limit), and particle splitting (i.e. two unresolved particles diverging to resolvable distances) (Meijering et al. 2006; Kalaidzidis 2007). Historically, many of these challenges have been overcome by diluting the fluorescent probes, resulting in a low particle density with almost unambiguous particle correspondence (Ghosh and Webb 1994; Crocker and Grier 1996). Under such conditions, PT is indeed reduced to a simple particle detection and localization problem. However, while low particle densities reveal motion characteristics, they do not allow probing of the interactions between particles. Also, the amount of data collected per experiment is low, limiting the observation of spatially and temporally heterogeneous particle behavior and hindering the capture of rare events. Furthermore, even with low particle density, low SNR and probe flicker complicate the search for particle correspondence. Therefore, for most cell biological studies, there is a great need for robust trajectory construction methods that address the challenges mentioned above.

In the very low density case, where the ratio between particle displacement and the mean nearest neighbor distance is ≪ 0.5, particle frame-to-frame assignment can be achieved via a simple local nearest neighbor (LNN) algorithm (Fig. 2A). Stepping through the list of particles in one frame, the algorithm links each particle to the closest particle in the next frame1.
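A minimal sketch of such an LNN linker follows; the function and variable names are ours, and real trackers add further bookkeeping for unmatched particles:

```python
def link_nearest_neighbor(frame_a, frame_b, max_radius=float('inf')):
    """Greedy local-nearest-neighbor linking: step through the particles
    of frame_a in order and link each to the closest unclaimed particle
    of frame_b within max_radius. The result is order-dependent."""
    taken, links = set(), {}
    for i, (xa, ya) in enumerate(frame_a):
        best, best_d2 = None, max_radius ** 2
        for j, (xb, yb) in enumerate(frame_b):
            if j in taken:
                continue
            d2 = (xa - xb) ** 2 + (ya - yb) ** 2
            if d2 <= best_d2:
                best, best_d2 = j, d2
        if best is not None:
            links[i] = best
            taken.add(best)
    return links

# Sparse particles: displacements much smaller than nearest-neighbor distances
frame1 = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
frame2 = [(0.4, 0.2), (10.3, -0.1), (0.1, 10.5)]
print(link_nearest_neighbor(frame1, frame2))  # -> {0: 0, 1: 1, 2: 2}
```

Because each particle claims its nearest neighbor in the order processed, the outcome is order-dependent, which is exactly the failure mode discussed next.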

Figure 2.


Trajectory construction via local nearest neighbor (LNN) assignment. (A) LNN succeeds when ρ = (average frame-to-frame displacement)/(average nearest neighbor distance) ≪ 0.5. (B) LNN fails when ρ > ~0.2.

The LNN approach breaks down when particle density is high enough that particles have more than one candidate assignment in the next frame (Fig. 2B). The outcome of an LNN algorithm in these situations depends on the order in which the assignments are made. In the example of Fig. 2B, if the triangle correspondence is assigned before the circle’s correspondence, then the triangle and circle will get the wrong assignments. In contrast, if the circle correspondence is assigned before the triangle’s, then the assignments for all three interfering particles will be correct. In general, the best order of particle assignments is undefined. In some cases, simple heuristics may be sufficient to remedy the situation (Ponti et al. 2003). However, in general, a global solution is required to achieve satisfactory tracking results.

The most accurate and globally optimal solution to PT is provided by the method of multiple-hypothesis tracking (MHT) (Reid 1979). In MHT, given the particle positions in every frame, all particle paths within the bounds of expected particle behavior are constructed throughout the whole movie. The largest non-conflicting ensemble of paths is then chosen as the solution (‘non-conflicting’ means that no two paths share the same particle in any frame). This solution is globally optimal in both space and time, i.e. it is the best solution that can be found by simultaneously accounting for all particle positions at all time points. Clearly, MHT is computationally prohibitive even for problems with a few tens of particles tracked over a few tens of frames.

Heuristic algorithms with higher computational efficiency have been proposed to approximate the MHT solution. Most of these algorithms are greedy, i.e. they seek to approach the globally optimal solution by taking a series of locally optimal solutions. Usually, this means that particle correspondence is determined step-by-step between consecutive frames, reducing computational complexity at the expense of temporal globality. Many tracking algorithms then solve the frame-to-frame correspondence problem in a spatially global manner, and are thus referred to as global nearest neighbor (GNN) approaches. GNN approaches have been developed in the fields of radar tracking and computer vision, and many have been recently applied to cell biological studies (Vallotton et al. 2003; Bonneau et al. 2005; Sage et al. 2005; Sbalzarini and Koumoutsakos 2005; Shafique and Shah 2005; Genovesio et al. 2006)2. Some algorithms deal with the additional factors complicating PT, namely temporary particle disappearance (Chetverikov and Verestoy 1999; Veenman et al. 2001; Bonneau et al. 2005; Sbalzarini and Koumoutsakos 2005; Shafique and Shah 2005; Genovesio et al. 2006), particle merging and splitting (Genovesio and Olivo-Marin 2004; Jiang et al. 2007), and particle motion heterogeneity (Genovesio et al. 2006).

(Jaqaman et al. 2008) describes the most recent tracking algorithm for cell biological applications3. It uses a single, efficient mathematical framework, the linear assignment problem (LAP) (Burkard and Cela 1999), to provide an accurate solution to all the PT challenges listed above. Given a set of detected particles throughout a time-lapse image sequence, the algorithm first links the detected particles between consecutive frames, and then links the track segments generated in the first step to simultaneously close gaps and capture particle merge and split events. Thus, the initial particle assignment is spatially global but temporally greedy, while the subsequent track segment assignment is accomplished via spatially and temporally global optimization, overcoming the shortcomings of algorithms relying solely on greedy assignment strategies. The algorithm is general, and can be applied to both two dimensional and three dimensional problems. Overall, this approach defines an accurate yet computationally feasible approximation to MHT, allowing the robust tracking of particles under high density conditions, as usually found in live cell images.
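The spatially global frame-to-frame step can be illustrated by a toy stand-in for the LAP solver: minimizing the total squared displacement over all one-to-one assignments. Brute force over permutations is only feasible for a handful of particles; the cited algorithm solves the same problem efficiently and adds the second, temporally global pass for gaps, merges and splits:

```python
from itertools import permutations

def link_globally(frame_a, frame_b):
    """Choose the one-to-one assignment between two frames that minimizes
    the total squared displacement (brute force over all permutations)."""
    def cost(perm):
        return sum((frame_a[i][0] - frame_b[j][0]) ** 2 +
                   (frame_a[i][1] - frame_b[j][1]) ** 2
                   for i, j in enumerate(perm))
    best = min(permutations(range(len(frame_a))), key=cost)
    return dict(enumerate(best))

# Two interfering particles: a greedy linker that processes the second
# particle first would grab (1.0, 0.0) for it; the global optimum is correct.
frame1 = [(0.0, 0.0), (1.4, 0.0)]
frame2 = [(1.0, 0.0), (2.0, 0.0)]
print(link_globally(frame1, frame2))  # -> {0: 0, 1: 1}
```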

Motion modeling

The robustness of GNN assignment under high density conditions can be increased by motion prediction. The assignment is no longer made based on particle positions in the target frame t+1 and source frame t, but based on particle positions in the target frame t+1 and the predicted positions of the particles from the source frame t to the target frame t+1. Possible approaches to particle motion prediction between frames are to estimate the global organization of particle motion iteratively from the available particle assignments (Ponti et al. 2005) or other tracking methods (Ji and Danuser 2005), or to formulate explicit motion models for each particle, whose parameters are inferred based on the already tracked particle paths (Genovesio et al. 2006; Jaqaman et al. 2008).
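The simplest per-particle motion model is a constant-velocity extrapolation of an already-constructed track; the cited work uses Kalman filtering and self-adaptive model selection, of which this sketch is only the prediction step in its crudest form:

```python
def predict_position(track):
    """Constant-velocity motion prediction: extrapolate the last
    displacement of an already-constructed track to frame t+1."""
    (x1, y1), (x2, y2) = track[-2], track[-1]
    return (2 * x2 - x1, 2 * y2 - y1)

# A particle drifting steadily to the right; the predicted position is then
# compared against the detections of frame t+1 during assignment.
track = [(0.0, 0.0), (1.0, 0.1), (2.1, 0.2)]
p = predict_position(track)
print(round(p[0], 3), round(p[1], 3))  # -> 3.2 0.3
```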

Acquisition of optimized fluorescent images

PT algorithms will fail to capture live cell dynamics unless image acquisition is adjusted to the process of interest in terms of spatial and temporal sampling, SNR, and the movie length necessary to capture all possible process states. However, spatiotemporal sampling, SNR, and observation length are interdependent and partially conflicting imaging parameters. For example, high spatiotemporal sampling implies fast acquisition at high magnification, which results in fewer photons reaching the imaging sensor and thus low SNR. SNR could be improved by prolonging the exposures or by increasing the power of the illumination; however, this in turn will increase photobleaching and phototoxicity, limiting the number of possible exposures and hence observation length.

Image analysis algorithms impose an additional layer of conflicting requirements on data acquisition. For example, tracking quality decreases as the ratio of particle frame-to-frame displacements to inter-particle distances increases. To improve tracking quality at the same particle density, images must be acquired faster. However, faster acquisition may reduce the image SNR, thus reducing detection quality and leading to more temporary particle disappearances. The occurrence of temporary particle disappearances increases the risk of erroneous particle linking between frames under high particle density conditions. Image acquisition and tracking parameters must thus be iteratively adjusted to optimize tracking quality and minimize tracking errors (Fig. 1). As a general rule, image acquisition and analysis are tightly coupled in a quantitative live cell imaging project.

To design a quantitative imaging experiment, the minimum requirements of spatiotemporal sampling, SNR and observation length need to be defined, and their compatibility with the available specimen and microscope hardware must be tested. For each of the three imaging parameters, the specimen and microscope hardware define a maximum performance point, which can be derived from the microscope specifications (fastest acquisition rate of the camera, highest magnification) or can be determined experimentally (acquisition time before bleaching or phototoxic damaging of the specimen; SNR obtained under very long exposures, e.g. in fixed specimens). Given the mutual interdependence between the parameters, the joint performance of an experimental setup can be conceptualized by the plane through the three maximum performance points (Fig. 3). If the minimum requirements of a specific experiment fall in a point beyond the performance plane of an experimental setup, it is impossible to acquire all the necessary image data using that setup.

Figure 3.


Performance triangle of an experimental setup, as determined by the specimen, microscope hardware, and image analysis software. Spatiotemporal sampling, SNR and observation length are interdependent and conflicting imaging parameters. Modified, with permission, from (Dorn et al. 2008).

There are two solutions to the problem of an insufficient experimental setup. First, the setup can be redesigned, for example by investing in better microscopy hardware or by improving the stability and efficiency of the fluorescent probes. Second, one can compromise at the level of individual movies and instead combine data from different experiments at the analysis level, under the assumption that cells imaged in different experiments are statistically equivalent. For example, in (Loerke et al. 2009), data from fast temporal sampling but short movies and data from slow temporal sampling but long movies were combined to obtain a comprehensive coverage of the wide distribution of clathrin-coated pit lifetimes.

In the following, we provide some general guidelines on how to determine the minimal requirements for a specific experiment.

Sampling

To allow any computational image analysis of the spatiotemporal dynamics of a live cell, the specimen must be sampled at least three times finer than the highest spatial and temporal frequency of interest (Stelzer 2000). Approximating the microscope 3D PSF by an ellipsoid with short (and equal) semi-axes in the lateral direction and a long semi-axis in the axial direction, sufficient sampling means that the magnification of the microscope must be selected such that (1) the pixel side length is at most one-third the PSF short semi-axis in the lateral direction, and (2) the z-slice thickness is at most one-third the PSF long semi-axis in the axial direction (Inoue and Spring 1997).
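The lateral sampling condition translates directly into a minimum magnification. The sketch below approximates the PSF short semi-axis by the Rayleigh radius 0.61λ/NA; the numbers (GFP-like emission, a 1.4 NA objective, a 6.45 µm camera pixel) are illustrative assumptions:

```python
def min_magnification(wavelength_nm, na, camera_pixel_um):
    """Minimum magnification such that the effective pixel size in the
    specimen plane is at most one-third of the lateral PSF semi-axis,
    here approximated by the Rayleigh radius 0.61*lambda/NA."""
    psf_semi_axis_um = 0.61 * wavelength_nm / na / 1000.0
    max_pixel_um = psf_semi_axis_um / 3.0
    return camera_pixel_um / max_pixel_um

# GFP-like emission (~510 nm), 1.4 NA objective, 6.45 um camera pixel
print(round(min_magnification(510, 1.4, 6.45), 1))  # -> 87.1
```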

For sampling in time, the characteristic time scale of the probed dynamics is either assumed a priori based on previous work or simulations of the molecular processes of interest, or determined by analyzing the dynamic data themselves. For example, to estimate the diffusion coefficient of a particle undergoing confined diffusive motion, one often plots the mean square displacement (MSD) of the particle over time lag. If the particle dynamics are well-sampled, the MSD first grows linearly with time lag and then reaches a plateau reflecting the confinement radius (Fig. 4A, black dots). The initial linear part of the MSD plot can yield a good estimate of the diffusion coefficient, obtained by fitting a straight line through the first few points of the MSD curve (Fig. 4A, black line) (Huet et al. 2006). In contrast, for under-sampled dynamics, the particle bounces off the boundaries many times within the sampling period. As a consequence, the linear phase of the MSD plot vanishes, precluding any accurate estimate of the diffusion coefficient (Fig. 4A, cyan and red symbols and lines). Similarly, if a particle is undergoing periodic or quasi-periodic movements, the observed particle positions are dictated by the number of motion reversals within a sampling interval. If the dynamics are under-sampled, the measured speed is inversely proportional to the sampling interval. The optimal sampling interval can thus be determined by first acquiring image data at the maximum affordable sampling rate, ignoring the limitations that too fast sampling imposes on the observation length. The image sequence can then be artificially downsampled, and the optimal sampling interval identified as the largest interval before the plot of measured speed versus sampling interval deviates from a horizontal line (Fig. 4B).
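The MSD-based estimate can be sketched for the simplest well-sampled case, a free 1D random walk with known diffusion coefficient; in the confined scenario of Fig. 4A only the first few lags, before the plateau, would enter the fit:

```python
import random

def mean_square_displacement(track, max_lag):
    """Time-averaged mean square displacement of a 1D track for lags 1..max_lag."""
    msd = []
    for lag in range(1, max_lag + 1):
        sq = [(track[i + lag] - track[i]) ** 2 for i in range(len(track) - lag)]
        msd.append(sum(sq) / len(sq))
    return msd

def diffusion_coefficient(msd, dt, n_points=3):
    """Estimate D from the initial MSD slope (MSD = 2*D*t in 1D), using a
    least-squares fit of a line through the origin to the first few lags."""
    ts = [(k + 1) * dt for k in range(n_points)]
    slope = sum(t * m for t, m in zip(ts, msd[:n_points])) / sum(t * t for t in ts)
    return slope / 2.0

# Synthetic 1D random walk with known diffusion coefficient D_true
random.seed(1)
D_true, dt = 0.5, 1.0
track, x = [0.0], 0.0
for _ in range(2000):
    x += random.gauss(0.0, (2 * D_true * dt) ** 0.5)
    track.append(x)
msd = mean_square_displacement(track, 4)
print(round(diffusion_coefficient(msd, dt), 2))  # close to D_true = 0.5
```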

Figure 4.


Data analysis to ensure proper temporal sampling of the measured dynamics. (A) Effect of sampling rate on diffusion coefficient estimation (line fits) for confined diffusive motion. (B) Effect of sampling rate on speed estimation for periodic and quasi-periodic movement.

Signal-to-noise ratio (SNR)

SNR requirements are determined entirely by the image analysis algorithm. Given a high SNR image of the specimen, for example taken with long exposures of a fixed sample, the detection fidelity and the breakdown point of an algorithm can be identified by simulating increasing noise levels on this image. Subsequently, the illumination conditions and exposure times that produce an SNR above the breakdown limit can be determined experimentally. These imaging parameters have to be defined such that the SNR conditions are still satisfied at the end of the time-lapse image sequence, where the effect of photobleaching is strongest.

Observation length

The observation length required to capture all possible states of a dynamic molecular system is the most difficult criterion to determine a priori. Data from multiple experiments must be pooled together until there is statistical evidence that all system states have been sampled, for example until parameter distributions converge to a fixed point, under the assumption that all cells behave equivalently. Because of the ability to pool data from multiple experiments, the observation length is oftentimes the least essential criterion to satisfy in a single experiment. In contrast, too slow sampling or too low SNR yields a loss of primary information that cannot be recovered by data pooling.

Adjustment of control parameters and diagnostics for track evaluation

Maximum efficiency, consistency and completeness in image measurements imply minimal user input for software control. Yet, it is impossible to design image analysis algorithms robust enough to achieve a complete understanding of image contents in all applications. User input is always required to supply an algorithm with application-specific prior knowledge.

However, the amount of information provided and the effect it has on the outcome of the image measurement have to be carefully analyzed with every experiment. Also, a practical compromise between total automation (which might be impossible) and complete dependence on user-specified parameters (which might bias the data) can be achieved by designing self-adaptive algorithms that learn parameters on-the-fly while analyzing the images, but where the user defines lower and upper bounds to prevent drifts in self-adaptation. For example, in (Jaqaman et al. 2008), instead of depending on a user-specified search radius for linking particles between frames, the software estimates the search radius from the constructed particle trajectories and relies on user-input only to define the very fastest speed that can be expected for a particle. Also, while the user specifies whether the algorithm is to consider only Brownian motion or Brownian and linear motion, the decision on whether an individual particle is undergoing linear motion or Brownian motion is determined by the software, as are the parameters characterizing each motion model.

Independent of the level of user input, software outputs always have to be benchmarked carefully. Importantly, while visual inspection is good first practice, in many cases of PT the visual impression of particle dynamics can be deceiving and many significant particle behaviors are simply missed. Thus, before manual measurements are accepted as the ground truth for benchmarking, the inter- and intra-operator variability of the manual dataset has to be determined and documented. We have encountered several PT projects where inter- and intra-operator agreement was as low as 50% and 70%, respectively (unpublished observations). Such data are insufficient to evaluate the results of computational PT. In the following we discuss two approaches that allow a more objective benchmarking of tracking outputs.

Simulation-based benchmarking

Simulation experiments provide a means for determining absolute measures of false positives, false negatives and other performance parameters of tracking under different conditions. For example, the selectivity of a point detector, which relies on a statistical test to distinguish particle signal from noise (Ponti et al. 2003; Jaqaman et al. 2008), is controlled via the confidence threshold required for accepting a local maximum as a particle. Lowering the threshold increases the number of false positives; raising the threshold increases the number of false negatives. The ratio between the two fractions is a nonlinear function of the threshold and the movie SNR. Thus, using simulated images of various SNR, the performance of the point detector, i.e. the number of false positives and negatives it generates, can be evaluated. Such a performance graph can then be used to determine the optimal threshold for a movie with a particular SNR. Similarly, using ground truth tracks, the quality of the trajectory construction algorithm can be evaluated as a function of particle density and movie SNR (Jaqaman et al. 2008). This identifies the breakdown point of the trajectory construction algorithm, and determines the expected quality of the constructed trajectories given the particle density in a movie and its SNR.
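The logic of such a benchmark can be sketched with a one-dimensional toy simulation: particles of known position are planted in synthetic noise, and the detector's false positives and negatives are counted as the acceptance threshold varies. All names and numbers here are illustrative, not those of the cited detectors:

```python
import random

def detect(signal, threshold):
    """Flag local maxima above `threshold` as detected particles."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > threshold
            and signal[i] >= signal[i - 1] and signal[i] >= signal[i + 1]]

def benchmark(true_positions, sigma, threshold, n_trials=200, n=100, amp=5.0):
    """Simulate noisy 1D images with particles of amplitude `amp` at known
    positions and count average false positives / false negatives per trial."""
    fp = fn = 0
    for _ in range(n_trials):
        signal = [random.gauss(0.0, sigma) for _ in range(n)]
        for p in true_positions:
            signal[p] += amp
        hits = set(detect(signal, threshold))
        fn += sum(1 for p in true_positions if p not in hits)
        fp += len(hits - set(true_positions))
    return fp / n_trials, fn / n_trials

random.seed(0)
fp_lo, fn_lo = benchmark([20, 50, 80], sigma=1.0, threshold=2.5)
fp_hi, fn_hi = benchmark([20, 50, 80], sigma=1.0, threshold=6.0)
print(fp_lo > fp_hi, fn_lo < fn_hi)  # lowering the threshold trades FNs for FPs
```

Repeating this over a range of simulated SNRs yields the performance graph from which an optimal threshold for a movie of a given SNR can be read off.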

Data-based diagnostics

In addition to simulation-based software benchmarking, diagnostics that analyze and evaluate the particle trajectories obtained from the experimental images can be used to optimize the trajectory construction parameters. For example, in determining the search radius for particle linking between frames, the distribution of particle displacements obtained from the tracking must be investigated for distortions. If the displacement histogram is cut off (Fig. 5A), the search radius was set too small. In contrast, if the histogram decays gradually to zero (Fig. 5B), the search radius is large enough to capture all possible displacements. Caution: When the trajectory construction algorithm employs motion propagation, one should not analyze the raw particle frame-to-frame displacements, but rather the distances between the propagated particle positions and the particles they get linked to. Thus, in Fig. 5A and B, the x-axis is labeled “frame-to-frame linking distance,” and not “frame-to-frame displacement.”
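A minimal sketch of this diagnostic follows; the binning scheme and the 5% tail criterion are arbitrary illustrative choices:

```python
def search_radius_too_small(linking_distances, radius, n_bins=10, tail_fraction=0.05):
    """Flag a cut-off linking-distance histogram: if the bin just below the
    search radius still holds more than `tail_fraction` of all links, the
    distance distribution was likely truncated by the radius."""
    width = radius / n_bins
    counts = [0] * n_bins
    for d in linking_distances:
        counts[min(int(d / width), n_bins - 1)] += 1
    return counts[-1] / len(linking_distances) > tail_fraction

truncated = [0.1, 0.3, 0.5, 0.7, 0.9, 0.95, 0.97, 0.99]   # piles up near r = 1
well_within = [0.05, 0.1, 0.15, 0.2, 0.3, 0.35, 0.4, 0.5]  # decays well below r = 1
print(search_radius_too_small(truncated, 1.0))    # -> True
print(search_radius_too_small(well_within, 1.0))  # -> False
```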

Figure 5.


Trajectory diagnosis for evaluation of tracking. (A) Cut-off histogram of frame-to-frame linking distances, a sign that the search radius is too small. (B) Slowly decaying histogram of frame-to-frame linking distances, indicating a sufficiently large search radius. (C) Sampling rate-independent distribution of gap lengths, expressed in frames. (D) Histogram of gap lengths with a too large gap closing time window (gray) and an appropriate time window (black).

Another critical parameter to optimize for trajectory construction is the time window for closing trajectory gaps resulting from temporary particle disappearance (Jaqaman et al. 2008). In live-cell time-lapse sequences, particles temporarily disappear either because of detection false negatives or because of random particle motion in and out of focus. Two consequences follow from this: First, trajectory gap length distributions, when measured in frames, should be independent of movie sampling rate (if the same exposure time is used) (Fig. 5C). Second, for any sampling rate, longer gaps should be encountered less frequently than shorter gaps. Thus, a plateau in the tail of the histogram of gap lengths indicates that the time window used for gap closing is too large, resulting in falsely closed gaps (Fig. 5D, gray histogram). In this case, the time window for gap closing must be reduced until there is no longer a plateau (Fig. 5D, black histogram).
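The plateau test on the gap-length histogram can be sketched as follows; the choice to inspect only the upper half of the histogram is an illustrative simplification:

```python
def gap_window_too_large(gap_counts):
    """Check the tail of a gap-length histogram (entry k = number of gaps
    of length k+1 frames): counts should decay monotonically, so a flat or
    rising tail suggests falsely closed gaps from a too-large window."""
    tail = gap_counts[len(gap_counts) // 2:]
    return any(b >= a for a, b in zip(tail, tail[1:]))

print(gap_window_too_large([120, 60, 28, 12, 5, 2]))       # decaying -> False
print(gap_window_too_large([120, 60, 28, 12, 11, 11, 12])) # plateau  -> True
```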

Conclusion

Computational image analysis is a complex yet increasingly central component of live-cell imaging experiments. Much remains to be done to make it broadly useful for cell biological investigation. First, algorithms have to be transparent, not necessarily at the level of the code, but in terms of their sensitivity to changing image quality and in terms of the effect that control parameters have on the output. Second, the design of imaging experiments must be tightly coupled to the design of the analysis software. All too often, images are taken without careful planning of the subsequent analysis and then forwarded to the computer scientist “to retrieve some information from the images”. To avoid these problems, communication has to be initiated early on and experiments must be built with the appreciation that data acquisition and analysis are equally important components. Third, software development and application require careful controls, as is customary for molecular cell biology experiments. This chapter provides a limited glimpse of ideas useful for conducting these controls. Within the cell biological literature, we hope to see a more extensive discussion of the measures taken to substantiate the validity of results from image analysis. On the other hand, manual image analysis should no longer be an option. As discussed in our chapter, manual analysis falls short in consistency and completeness, two essential criteria underlying the validity of a scientific model derived from image data.

Footnotes

1. The Crocker and Grier particle tracking package (one of the most widespread trackers using LNN) can be downloaded from http://www.physics.emory.edu/~weeks/idl/.

2. The Sbalzarini and Koumoutsakos particle tracking package can be downloaded from http://www.mosaic.ethz.ch/Downloads/ParticleTracker.

3. The Jaqaman et al. particle tracking package can be downloaded from http://lccb.scripps.edu, “Download” hyperlink.

References

  1. Abraham VC, Taylor DL, Haskins JR. High content screening applied to large-scale cell biology. Trends in Biotechnology. 2004;22(1):15–22. doi: 10.1016/j.tibtech.2003.10.012.
  2. Bakal C, Aach J, Church G, Perrimon N. Quantitative morphological signatures define local signaling networks regulating cell morphology. Science. 2007;316(5832):1753–1756. doi: 10.1126/science.1140324.
  3. Bates M, Huang B, Dempsey GT, Zhuang XW. Multicolor super-resolution imaging with photo-switchable fluorescent probes. Science. 2007;317(5845):1749–1753. doi: 10.1126/science.1146598.
  4. Bonneau S, Dahan M, Cohen LD. Single quantum dot tracking based on perceptual grouping using minimal paths in a spatiotemporal volume. IEEE T Image Process. 2005;14(9):1384–1395. doi: 10.1109/tip.2005.852794.
  5. Burkard KE, Cela E. Linear assignment problems and extensions. In: Du DZ, Pardalos PM, editors. Handbook of Combinatorial Optimization - Supplement Volume A. Dordrecht, NL: Kluwer Academic Publishers; 1999. pp. 75–149.
  6. Chen X, Murphy RF, Kwang WJ. Automated interpretation of protein subcellular location patterns. In: International Review of Cytology. Academic Press; 2006. pp. 193–227.
  7. Chetverikov D, Verestoy J. Feature point tracking for incomplete trajectories. Computing. 1999;62:321–338.
  8. Conrad C, Erfle H, Warnat P, Daigle N, Lorch T, Ellenberg J, Pepperkok R, Eils R. Automatic identification of subcellular phenotypes on human cell arrays. Genome Res. 2004;14(6):1130–1136. doi: 10.1101/gr.2383804.
  9. Crocker JC, Grier DG. Methods of digital video microscopy for colloidal studies. J Colloid Interf Sci. 1996;179(1):298–310.
  10. Danuser G, Waterman-Storer CM. Quantitative fluorescent speckle microscopy of cytoskeleton dynamics. Ann Rev Biophys Biomol Struct. 2006;35:361–387. doi: 10.1146/annurev.biophys.35.040405.102114.
  11. Dorn JF, Danuser G, Yang G. Computational processing and analysis of dynamic fluorescence image data. In: Fluorescent Proteins. Second Edition. San Diego: Elsevier Academic Press Inc; 2008. p. 497.
  12. Dorn JF, Jaqaman K, Rines DR, Jelson GS, Sorger PK, Danuser G. Yeast kinetochore microtubule dynamics analyzed by high-resolution three-dimensional microscopy. Biophys J. 2005;89:2835–2854. doi: 10.1529/biophysj.104.058461.
  13. Ehrlich M, Boll W, van Oijen A, Hariharan R, Chandran K, Nibert ML, Kirchhausen T. Endocytosis by random initiation and stabilization of clathrin-coated pits. Cell. 2004;118(5):591–605. doi: 10.1016/j.cell.2004.08.017.
  14. Eils R, Athale C. Computational imaging in cell biology. J Cell Biol. 2003;161(3):477–481. doi: 10.1083/jcb.200302097.
  15. Ewers H, Smith AE, Sbalzarini IF, Lilie H, Koumoutsakos P, Helenius A. Single-particle tracking of murine polyoma virus-like particles on live cells and artificial membranes. Proc Natl Acad Sci USA. 2005;102(42):15110–15115. doi: 10.1073/pnas.0504407102.
  16. Fujiwara T, Ritchie K, Murakoshi H, Jacobson K, Kusumi A. Phospholipids undergo hop diffusion in compartmentalized cell membrane. J Cell Biol. 2002;157(6):1071–1081. doi: 10.1083/jcb.200202050.
  17. Genovesio A, Liedl T, Emiliani V, Parak WJ, Coppey-Moisan M, Olivo-Marin JC. Multiple particle tracking in 3-D+t microscopy: method and application to the tracking of endocytosed quantum dots. IEEE T Image Process. 2006;15(5):1062–1070. doi: 10.1109/tip.2006.872323.
  18. Genovesio A, Olivo-Marin J-C. Split and merge data association filter for dense multi-target tracking. IEEE ICPR'04. 2004;4:677–680.
  19. Ghosh RN, Webb WW. Automated detection and tracking of individual and clustered cell-surface low-density-lipoprotein receptor molecules. Biophys J. 1994;66(5):1301–1318. doi: 10.1016/S0006-3495(94)80939-7.
  20. Groc L, Heine M, Cognet L, Brickley K, Stephenson FA, Lounis B, Choquet D. Differential activity-dependent regulation of the lateral mobilities of AMPA and NMDA receptors. Nat Neurosci. 2004;7(7):695–696. doi: 10.1038/nn1270.
  21. Huet S, Karatekin E, Tran VS, Fanget I, Cribier S, Henry JP. Analysis of transient behavior in complex trajectories: application to secretory vesicle dynamics. Biophys J. 2006;91(9):3542–3559. doi: 10.1529/biophysj.105.080622.
  22. Inoue S, Spring KR. Video Microscopy: The Fundamentals. New York and London: Plenum; 1997.
  23. Jaqaman K, Loerke D, Mettlen M, Kuwata H, Grinstein S, Schmid SL, Danuser G. Robust single particle tracking in live-cell time-lapse sequences. Nat Methods. 2008;5(8):695–702. doi: 10.1038/nmeth.1237.
  24. Ji L, Danuser G. Tracking quasi-stationary flow of weak fluorescent signals by adaptive multi-frame correlation. J Microscopy. 2005;220:150–167. doi: 10.1111/j.1365-2818.2005.01522.x.
  25. Jiang S, Zhou XB, Kirchhausen T, Wong STC. Tracking molecular particles in live cells using fuzzy rule-based system. Cytom Part A. 2007;71A(8):576–584. doi: 10.1002/cyto.a.20411.
  26. Kalaidzidis Y. Intracellular objects tracking. Eur J Cell Biol. 2007;86(9):569–578. doi: 10.1016/j.ejcb.2007.05.005.
  27. Loerke D, Mettlen M, Yarar D, Jaqaman K, Jaqaman H, Danuser G, Schmid SL. Cargo and dynamin regulate clathrin-coated pit maturation. PLoS Biology. 2009;7(3):e1000057. doi: 10.1371/journal.pbio.1000057.
  28. Meijering E, Smal I, Danuser G. Tracking in molecular bioimaging. IEEE Signal Proc Mag. 2006;23(3):46–53.
  29. Neumann B, Held M, Liebel U, Erfle H, Rogers P, Pepperkok R, Ellenberg J. High-throughput RNAi screening by time-lapse imaging of live human cells. Nat Methods. 2006;3(5):385–390. doi: 10.1038/nmeth876.
  30. Nixon M, Aguado A. Feature Extraction in Computer Vision and Image Processing. Oxford: Butterworth-Heinemann/Newnes; 2002.
  31. Olivo-Marin J-C. Extraction of spots in biological images using multiscale products. Pattern Recogn. 2002;35:1989–1996.
  32. Ponti A, Matov A, Adams M, Gupton S, Waterman-Storer CM, Danuser G. Periodic patterns of actin turnover in lamellipodia and lamellae of migrating epithelial cells analyzed by quantitative fluorescent speckle microscopy. Biophys J. 2005;89:3456–3469. doi: 10.1529/biophysj.104.058701.
  33. Ponti A, Vallotton P, Salmon WC, Waterman-Storer CM, Danuser G. Computational analysis of F-actin turnover in cortical actin meshworks using fluorescent speckle microscopy. Biophys J. 2003;84(5):3336–3352. doi: 10.1016/S0006-3495(03)70058-7.
  34. Racine V, Sachse M, Salamero J, Fraisier V, Trubuil A, Sibarita J-B. Visualization and quantification of vesicle trafficking on a three-dimensional cytoskeleton network in living cells. J Microsc - Oxford. 2007;225(3):214–228. doi: 10.1111/j.1365-2818.2007.01723.x.
  35. Reid DB. An algorithm for tracking multiple targets. IEEE T Automat Contr. 1979;24(6):843–854.
  36. Sage D, Neumann FR, Hediger F, Gasser SM, Unser M. Automatic tracking of individual fluorescence particles: application to the study of chromosome dynamics. IEEE T Image Process. 2005;14(9):1372–1383. doi: 10.1109/tip.2005.852787.
  37. Sako Y, Minoguchi S, Yanagida T. Single-molecule imaging of EGFR signalling on the surface of living cells. Nat Cell Biol. 2000;2:168–172. doi: 10.1038/35004044.
  38. Sbalzarini IF, Koumoutsakos P. Feature point tracking and trajectory analysis for video imaging in cell biology. J Struct Biol. 2005;151(2):182–195. doi: 10.1016/j.jsb.2005.06.002.
  39. Shafique K, Shah M. A noniterative greedy algorithm for multiframe point correspondence. IEEE T Pattern Anal. 2005;27(1):51–65. doi: 10.1109/TPAMI.2005.1.
  40. Shroff H, Galbraith CG, Galbraith JA, White H, Gillette J, Olenych S, Davidson MW, Betzig E. Dual-color superresolution imaging of genetically expressed probes within individual adhesion complexes. Proc Natl Acad Sci USA. 2007;104(51):20308–20313. doi: 10.1073/pnas.0710517105.
  41. Smith C, Eisenstein M. Automated imaging: data as far as the eye can see. Nat Methods. 2005;2(7):547–555.
  42. Starck JL, Murtagh F, Bijaoui A. Image Processing and Data Analysis: The Multiscale Approach. Cambridge: Cambridge University Press; 2000.
  43. Stelzer EHK. Practical limits to resolution in fluorescence light microscopy. In: Yuste R, Lanni F, Konnerth A, editors. Imaging Neurons. Cold Spring Harbor, NY: Cold Spring Harbor Press; 2000. pp. 12.11–12.19.
  44. Swedlow JR, Goldberg I, Brauner E, Sorger PK. Informatics and quantitative analysis in biological imaging. Science. 2003;300(5616):100–102. doi: 10.1126/science.1082602.
  45. Thomann D, Rines DR, Sorger PK, Danuser G. Automatic fluorescent tag detection in 3D with super-resolution: application to the analysis of chromosome movement. J Microsc - Oxford. 2002;208:49–64. doi: 10.1046/j.1365-2818.2002.01066.x.
  46. Tirnauer JS, Salmon ED, Mitchison TJ. Microtubule plus-end dynamics in Xenopus egg extract spindles. Mol Biol Cell. 2004;15(4):1776–1784. doi: 10.1091/mbc.E03-11-0824.
  47. Tvarusko W, Bentele M, Misteli T, Rudolf R, Kaether C, Spector DL, Gerdes HH, Eils R. Time-resolved analysis and visualization of dynamic processes in living cells. Proc Natl Acad Sci USA. 1999;96(14):7950–7955. doi: 10.1073/pnas.96.14.7950.
  48. Vallotton P, Ponti A, Waterman-Storer CM, Salmon ED, Danuser G. Recovery, visualization, and analysis of actin and tubulin polymer flow in live cells: a fluorescence speckle microscopy study. Biophys J. 2003;85:1289–1306. doi: 10.1016/S0006-3495(03)74564-0.
  49. Veenman CJ, Reinders MJT, Backer E. Resolving motion correspondence for densely moving points. IEEE T Pattern Anal. 2001;23(1):54–72.
  50. Yildiz A, Selvin PR. Fluorescence imaging with one nanometer accuracy: application to molecular motors. Accounts Chem Res. 2005;38(7):574–582. doi: 10.1021/ar040136s.
  51. Zenisek D, Steyer JA, Almers W. Transport, capture and exocytosis of single synaptic vesicles at active zones. Nature. 2000;406(6798):849–854. doi: 10.1038/35022500.