Summary
Here, we present a protocol to quantify and analyze multiplexed fluorescence in situ hybridization (mFISH) data using two open-source tools, FijiFISH and RUHi. FijiFISH, an ImageJ-based plugin, enables image registration, cell segmentation, and gene expression quantification. RUHi, an R-based package, supports dimensionality reduction, clustering, and visualization through both code and a Shiny app. The protocol also accommodates experimentally induced exogenous fluorophores, providing multimodal, single-cell resolution insights into spatial gene expression.
For complete details on the use and execution of this protocol, please refer to Sullivan et al.1
Subject areas: Bioinformatics, Microscopy, In Situ Hybridization
Graphical abstract

Highlights
• Steps for mFISH image registration, segmentation, and quantification with FijiFISH
• Instructions for dimensionality reduction and clustering using RUHi and the goFISH Shiny app
• Guidance for visualization and interpretation of mFISH data
Publisher’s note: Undertaking any experimental protocol requires adherence to local institutional guidelines for laboratory safety and ethics.
Before you begin
Single molecule fluorescence in situ hybridization (FISH)2 is a useful technique allowing for the visualization and quantification of gene expression with single-cell resolution. Exciting advances in both biochemistry and technology are continuously optimizing and expanding the experimental limits of FISH, providing the ability to create rich datasets at vast scales. With the development and growing accessibility of multiplexed FISH (mFISH) techniques (HiPlex,1,3,4,5,6,7 osmFISH,8 MERFISH,9 10× Xenium4), FISH experiments have transformed substantially from qualitative visualizations of 2–3 genes to quantitative analyses of up to thousands of genes – shifting mFISH’s utility from a validation technique to one with its own predictive potential.10,11 While these mFISH experiments provide the user with powerful and informative datasets, they come with a host of computational challenges, such as registration of multiple imaging rounds, cell segmentation, and meaningful big-data interpretation.
Here, we present two easy-to-use tools for the processing and analysis of mFISH data. First, a Fiji12 menu item (FijiFISH) for the handling and quantification of mFISH images. This point-and-click style plugin includes computational registration of multiple rounds of imaging, cell body segmentation, quantification of gene expression, and qualitative image production (Figure 1A). Second, a completely code-free Shiny13 app (goFISH) with a corresponding R14 package (R-based Utilities for HiPlex; RUHi) for the analysis of quantified mFISH data. These R-based tools encapsulate the analysis pipeline from quality control to dimensionality reduction and clustering (Figure 1B). Both tools are designed to require minimal or no coding experience and are accompanied by extensive documentation tailored to non-computational researchers.
Figure 1.
mFISH tools overview
(A) FijiFISH registers all images both linearly and nonlinearly according to their DAPI signal. Then, each nucleus is segmented by binarizing the DAPI signal and expanding by a given user-selected radius. Finally, fluorescent signal for each gene is binarized and quantified by adding up the pixels in a given ROI and normalizing to the area.
(B) RUHi and goFISH allow for the normalization, filtering, dimensionality reduction, clustering, and visualization of quantified gene expression.
The RNAscope HiPlex mFISH technique involves iterative rounds of imaging using cleavable fluorophores, as previously described in our past publications1,3,4,5,6,7 (see these publications for a step-by-step bench protocol). In general, the tissue first undergoes antigen retrieval (in the case of paraformaldehyde-fixed tissues) and protease treatment. Then, up to 12 complementary probes are hybridized to the tissue section and amplified. These probes are then visualized via iterative rounds of confocal imaging using cleavable fluorophores. Depending on the microscope’s constraints in terms of number of detectors, probes can be imaged 3 or 4 at a time, with imaging channels including 405 nm for DAPI and 488 nm, 550 nm, 647 nm, and an optional 750 nm channel for probes. Following these 3 rounds of imaging and cleaving the fluorophores, the probes can be stripped (via RNAscope HiPlexUP) and up to 12 more probes can be hybridized. Current capabilities allow this cycle to be repeated up to four times. After this, the images can be registered, segmented, quantified, and analyzed using FijiFISH and RUHi.
The analysis tools presented here were originally developed for RNAscope HiPlex and Multiplex images taken on confocal microscopes. However, we have worked to generalize these tools for different assays as well as purposes such as co-expression analysis in immunohistochemistry (IHC) or mixed modes of histology.
Innovation
This protocol introduces an integrated workflow for multiplexed fluorescent single molecule in situ hybridization (mFISH) that combines image analysis and downstream data interpretation via two open-source tools, FijiFISH and RUHi. While mFISH has emerged as a powerful approach for assessing spatial expression of multiple genes at single-cell resolution, analysis has been limited by existing tools. Current methods either require users to purchase analysis software or demand manual image registration, custom segmentation scripts, and complex coding expertise for data analysis, which can hinder adoption of the technique and limit reproducibility.
FijiFISH addresses these challenges by providing a user-friendly ImageJ-based plugin that automates key image processing steps, including multi-round registration, image cropping, and cell segmentation. FijiFISH consolidates these processes into an accessible graphical interface, minimizing manual intervention and standardizing output formats.
Complementing this, the RUHi R package streamlines downstream analysis by offering mFISH-specific functions for dimensionality reduction, clustering, and visualization of the analyzed imaging data. RUHi also includes a code-free Shiny application to make advanced analysis accessible to researchers with limited programming experience. Together, FijiFISH and RUHi create a framework that allows users to move from raw microscopy images to interpretable single-cell gene expression data.
This workflow is optimized for mFISH of endogenous gene expression but has also been tested with experimentally introduced fluorophores, supporting multimodal imaging experiments. By merging open-source image analysis with flexible computational tools, this protocol lowers the technical barrier to mFISH adoption and enhances experimental reproducibility.
Institutional permissions
Our protocol here is entirely computational, with experimental procedures associated with deposited data approved by the Animal Care Committee at the University of British Columbia.
Key resources table
| REAGENT or RESOURCE | SOURCE | IDENTIFIER |
|---|---|---|
| Bacterial and virus strains | ||
| AAVrg-CAG-tdT | Addgene | 59462-AAVrg |
| AAVrg-CAG-GFP | Addgene | 37825-AAVrg |
| Biological samples | ||
| Wild-type adult male C57BL/6 mice | Jackson Laboratory | RRID:IMSR_JAX:000664 |
| Critical commercial assays | ||
| RNAscope HiPlex | Advanced Cell Diagnostics (ACD) | Cat#324419 |
| Deposited data | ||
| Mouse data from spatially patterned excitatory neuron subtypes and projections of the claustrum | Erwin et al.6 | https://doi.org/10.7554/eLife.68967 |
| Step-by-Step video tutorials | Sullivan et al.1 | Zenodo: https://doi.org/10.5281/zenodo.17080282 |
| Down-sampled images for the FijiFISH tutorial | Erwin et al.6 | FigShare: https://doi.org/10.6084/m9.figshare.28910930 |
| Full-sized images for the FijiFISH tutorial | Erwin et al.6 | FigShare: https://doi.org/10.6084/m9.figshare.28910936 |
| Analyzed CSVs of down-sampled images for the RUHi tutorial | Erwin et al.6 | FigShare: https://doi.org/10.6084/m9.figshare.28911218 |
| Analyzed CSVs of full-sized images for the RUHi tutorial (these are the files used in the tutorial) | Erwin et al.6 | FigShare: https://doi.org/10.6084/m9.figshare.28911227 |
| Oligonucleotides | ||
| Mm-Cdh9-T1 | Advanced Cell Diagnostics | 443221-T1 |
| Mm-Ctgf-T2 | Advanced Cell Diagnostics | 314541-T2 |
| Mm-Slc17a6-T3 | Advanced Cell Diagnostics | 319171-T3 |
| Mm-Lxn-T4 | Advanced Cell Diagnostics | 585801-T4 |
| Mm-Slc30a3-T5 | Advanced Cell Diagnostics | 496291-T5 |
| Mm-Gfra1-T6 | Advanced Cell Diagnostics | 431781-T6 |
| Mm-Spon1-T7 | Advanced Cell Diagnostics | 492671-T7 |
| Mm-Gnb4-T8 | Advanced Cell Diagnostics | 460951-T8 |
| Mm-Nnat-T9 | Advanced Cell Diagnostics | 432631-T9 |
| Mm-Synpr-T10 | Advanced Cell Diagnostics | 500961-T10 |
| Mm-Pcp4-T11 | Advanced Cell Diagnostics | 402311-T11 |
| Mm-Slc17a7-T12 | Advanced Cell Diagnostics | 416631-T12 |
| Software and algorithms | ||
| Fiji (Fiji Is Just ImageJ) | Schindelin et al.12 | RRID:SCR_002285 |
| R | R Development Core Team14 | RRID:SCR_001905 |
| bUnwarpJ | Arganda-Carreras et al.15 | https://imagej.net/plugins/bunwarpj/ |
| UMAP | McInnes et al.16 | RRID:SCR_018217 |
| Custom FijiFISH code | O’Leary et al.7 | Zenodo: https://doi.org/10.5281/zenodo.17080223 |
| Custom RUHi R package | Sullivan et al.1 | Zenodo: https://doi.org/10.5281/zenodo.17080256 |
Step-by-step method details
Install FijiFISH and RUHi
Timing: ∼15 min
This step ensures proper installation of the tools for downstream analysis.
Note: Detailed instructions are also provided in the Video Tutorial Hub.
1. Installing FijiFISH.
a. Install Fiji.
Note: Users need only select one of the two installation methods below: permanent or temporary.
b. Permanently install FijiFISH (works in most cases).
i. Copy the code from the latest version of the FijiFISH.ijm file (from the GitHub repo kaitsull/FijiFISH or cembrowskilab/FijiFISH; as of publication: FijiFISH-v505.ijm).
ii. In Fiji, open Plugins > Macros > StartUp Macros…
iii. Copy-and-paste the code into the end of Fiji’s StartUpMacros.txt.
iv. Close and re-open Fiji.
v. The FijiFISH drop-down menu item should now appear permanently each time you open Fiji.
c. Temporarily install FijiFISH (works if the Fiji application is read-only).
i. Save the code from the latest version of FijiFISH (from the GitHub repo kaitsull/FijiFISH or cembrowskilab/FijiFISH; as of publication: FijiFISH-v505.ijm).
ii. In Fiji, open Plugins > Macros > Install…
iii. Select the saved FijiFISH .ijm file (FijiFISH-v505.ijm as of publication).
iv. A new drop-down menu item should now appear.
Note: This method keeps FijiFISH installed only for as long as Fiji is open; you will need to re-install FijiFISH each time Fiji is reopened.
Image pre-preparation
Timing: ∼20 min
This step ensures FijiFISH can properly read and analyze the provided microscopy images.
Use the full-sized or down-sampled images to follow along with the video tutorials.
2. Ensure images are:
a. Scaled in microns.
b. In 8-bit TIFF format.
Note: Higher bit depths (16- or 32-bit) increase memory usage and slow performance but will not affect image processing overall, since masks are forced into 8-bit after step 18.
CRITICAL: Channel images should be exported individually with their original look-up table (LUT). This means the GFP channel, for instance, will be green when opened. Retaining the original LUT can help with troubleshooting later by making incorrectly labeled images easier to identify.
c. Saved with each channel as a separate image, with its original LUT.
d. Acquired with the same settings across rounds:
i. Resolution.
ii. Objective magnification.
iii. Laser gain.
iv. Z-stack step size.
3. Place individual channels from a given experiment into a single folder.
Note: FijiFISH will generate nested folders to prevent overwriting data. See Figure 2 for folder structure.
4. Rename files to FijiFISH formatting: R#_X_GeneName.tif.
a. R# = imaging round number.
i. If only analyzing a single imaging round, all images would begin with ‘R1’.
b. X = fluorophore emission wavelength.
c. GeneName = the name of the mRNA (or endogenous fluorophore) visualized in the image.
d. Full example: R1_405_DAPI.tif.
CRITICAL: Incorrect file names and improper pixel-to-micron scaling can cause issues at multiple steps; see troubleshooting, problem 1 and problem 4.
CRITICAL: Files without the proper name structure will not be read into FijiFISH (see video tutorials for more; an illustrative renaming sketch in R follows these notes).
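Renaming can be done by hand or scripted. Below is a minimal R sketch (not part of FijiFISH) for batch-renaming exported channel TIFFs into the R#_X_GeneName.tif convention; the original export names and gene assignments are placeholders to adapt to your own microscope’s output.
# Minimal R sketch: batch-rename exported channel TIFFs to R#_X_GeneName.tif.
# The old names and gene assignments below are placeholders only.
setwd("~/path/to/experiment")                  # folder holding the raw exports
old_names <- c("round1_ch405.tif", "round1_ch488.tif", "round1_ch550.tif")
new_names <- c("R1_405_DAPI.tif", "R1_488_Gnb4.tif", "R1_550_Nnat.tif")
stopifnot(all(file.exists(old_names)))         # fail early if a file is missing
file.rename(old_names, new_names)              # returns TRUE for each success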
Figure 2.
FijiFISH file folder output structure
An example of the file folder output structure from FijiFISH for a single tissue section. The left column outlines folders from which to drag-and-drop for each step of the protocol, as well as the starting point for RUHi. The right column outlines the respective protocol step. Note that not every folder will appear: if all images are 2D, max will not appear, as there is no need to project a z-stack; if all images are the exact same dimensions across rounds, crop will not appear; and if you are only running 3 or fewer channels of FISH in a single imaging round, you can skip straight to Segmentation, meaning the only file structure will be the four folders from step 15 onward.
Image registration
Timing: Image size dependent (∼5–30 min)
In this step, multiple rounds of DAPI images are registered to the first-round image (Figure 3).
5. Initiate Registration.
a. Drag-and-drop R1_405_DAPI.tif into Fiji.
b. Select “Registration…” from the FijiFISH drop-down menu.
6. Maximum Intensity Projection (no user input).
a. A maximum intensity projection of every channel for each imaging round will be saved in a new nested folder called max.
Note: If your image is 2D (a single plane and not a z-stack) this step will be skipped, and the max folder will not be generated or populated.
7. Image Cropping (user selects Automatic or Manual).
Note: This step will be skipped if all imaging rounds are of the same XY dimension.
a. Automatic.
i. Proceed with this if your images cover relatively similar ROIs across rounds.
b. Manual.
i. Proceed with this if certain rounds are very askew from the first round.
ii. Drag the edge of the yellow box that appears to move it around.
CRITICAL: Do not change the shape of the yellow box, as this will cause issues downstream (see video tutorials for a demonstration of proper manual cropping). If the image is improperly cropped, restart Registration by drag-and-dropping the maximum intensity projection of R1_405_DAPI.tif from the max folder. See troubleshooting, problem 2 for more.
c. All images will be saved in a nested folder within max called crop, unless the image is 2D (in which case no max folder will appear) or the images across rounds are all the exact same dimensions (in which case no crop folder will appear).
8. Linear Registration (no user input).
a. A 2X by 2X window taken from the image center undergoes a fast Fourier transform cross-correlation (Figure 3B).
Note: Fast Fourier transform (FFT) cross-correlation computes the optimal translational shift in X and Y between two images by identifying the peak in their frequency-domain cross-power spectrum, enabling fast and precise alignment (a minimal sketch in R follows this step).
CRITICAL: FFT cross-correlation translates in X and Y only. It does not account for rotation, which should be minimal enough to be corrected through nonlinear registration. However, it is important to ensure the orientation of the tissue section is the same across imaging rounds prior to running registration.
b. The X and Y distance from the center coordinate of the resulting image to the highest pixel value is calculated, saved, and used for the affine registration (Figure 3B).
c. Images are saved with the suffix “_registered.tif” in a nested folder within crop called registeredImages.
d. A DAPI overlay demonstrating the accuracy of the linear registration will be saved in a folder called composite.
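For intuition, the sketch below reproduces the core idea of FFT-based translation estimation in plain R. FijiFISH performs the actual computation internally in ImageJ (including the windowing described above), so this is illustrative only.
# Illustrative R sketch of FFT cross-correlation for translation estimation.
# ref and mov are same-sized numeric matrices, e.g., two DAPI images.
fftShift <- function(ref, mov) {
  cc <- Re(fft(fft(ref) * Conj(fft(mov)), inverse = TRUE))  # cross-correlation
  peak <- which(cc == max(cc), arr.ind = TRUE)[1, ]         # brightest pixel
  shift <- peak - 1                                         # zero-based offset
  ifelse(shift > dim(ref) / 2, shift - dim(ref), shift)     # wrap to +/- shifts
}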
9. Nonlinear Registration (user can optionally select to run or skip).
a. The user is prompted to proceed with bUnwarpJ b-spline-based nonlinear unwarping (Figure 3C).
Note: bUnwarpJ performs nonlinear image registration by iteratively aligning local image features using elastic deformations, allowing for precise correction of non-uniform distortions between images. This is particularly useful in brain regions near white matter or near the edge of the tissue section, which tend to warp. This step can be lengthy and is not necessary if the rigid, linear registration suffices. If the user also wishes to skip Cell Segmentation, forego steps 10 and 11.
CRITICAL: There are certain instances where nonlinear registration may not yield ideal results. See troubleshooting, problem 3 for more.
b. All images are saved with the suffix “_NL.tif” in a folder nested within registeredImages called nonLinear.
c. A DAPI overlay demonstrating the accuracy of the nonlinear registration will be saved in a folder called composite.
Figure 3.
FijiFISH: Registration step-by-step
(A) Each round of imaging undergoes a maximum intensity projection and is cropped to be the same dimensions across all rounds.
(B) A square window is made and a Cross Correlation via the Fast Fourier Transform is run. The distance from the center to the highest intensity value pixel provides the coordinates for the rigid, linear transformation.
(C) bUnwarpJ is used to nonlinearly align the linearly registered images. While not always necessary, the nonlinear registration can be particularly useful when ROIs are near white matter tracts or tissue edges, which can warp during decoverslipping.
(D) Example images showing DAPI signal between 2 rounds before registration (left), following linear registration (middle) and after nonlinear registration (right) steps. All scale bars represent 150 microns except inset scale bars, which represent 20 microns.
Cell segmentation
Timing: variable and technique dependent
In this step, the DAPI signal is segmented, and the user can choose to expand the segmented ROIs to include the surrounding cytosol (Figure 4).
CRITICAL: Proper scaling in microns is vital for this step. See troubleshooting, problem 4 for more information.
CRITICAL: Over multiple decoverslipping rounds, DAPI can leach from the nucleus into the surrounding extracellular matrix. This can result in poor contrast and difficulty binarizing for cell segmentation. See troubleshooting, problem 5 for workarounds.
10. Optionally, use other popular segmentation algorithms that provide Fiji-readable ROI zip files as output:
a. Ensure that the images you use are from the correct registration folder.
b. Examples of other segmentation algorithms include:
i. Baysor17 (for improving upon the built-in segmentation).
ii. Cellpose18 (to replace the built-in segmentation).
iii. See troubleshooting, problem 5 for more on implementation.
c. When completed:
i. Name your file ROIset.zip.
ii. Create a folder within your registration folder called analyzedTables.
iii. Place ROIset.zip into this folder.
11. Initiate Segmentation.
a. From the registeredImages or nonLinear folder (depending on which the user has chosen to use going forward), drag-and-drop R1_405_DAPI.tif_registered.tif or R1_405_DAPI.tif_registered.tif_NL.tif into Fiji.
b. Select “Segmentation…” from the FijiFISH drop-down menu.
12. User selects rounds to use.
a. See troubleshooting, problem 5 on when to exclude rounds, as well as alternate options.
b. If using only a single round of imaging, FijiFISH will automatically proceed to the next step.
13. Image Binarization (user selects Manual or Automatic).
a. Automatic.
i. A Gaussian blur (sigma = 3; scaled in microns) is applied to all DAPI images.
ii. Triangle thresholding is applied to all DAPI images.
b. Manual.
i. A Gaussian blur (sigma = 3; scaled in microns) is applied to all DAPI images.
ii. The user defines the threshold.
CRITICAL: If an image is in the 750 channel, it will undergo the built-in linear ‘Smooth’ function. This is due to inherent noise in the detector of the Leica SP8 white-light laser confocal. If you are using a different microscope, or if you would like to perform noise removal or smoothing on all images, see troubleshooting, problem 4 and problem 6.
14. Image Multiplication (no user input; multi-round experiments only).
a. Each binarized DAPI image is multiplied with the others to retain only cells present across all imaging rounds.
b. ROIs with areas between 30 μm² and 250 μm² are created and populated in the ROI Manager (see the arithmetic sketch after this step for converting these limits to pixels).
c. See troubleshooting, problem 4 for how to change these limits.
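As a quick sanity check of these limits, the plain-R arithmetic below converts the 30–250 μm² area window into pixel counts for an assumed pixel size; the 0.6 μm/pixel value is an example only and should be replaced by your own calibration.
# Quick arithmetic check in plain R: what the 30-250 um^2 area limits
# correspond to in pixels for an example pixel size of 0.6 um/pixel.
um_per_px <- 0.6
px_area_um2 <- um_per_px^2                 # area of one pixel in um^2 (0.36)
c(min_px = 30 / px_area_um2,               # ~83 pixels for the smallest ROI
  max_px = 250 / px_area_um2)              # ~694 pixels for the largest ROI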
15. Region Selection for Segmentation (mandatory).
a. The user will be prompted to select an area of the image to be segmented.
i. To select the entire image: use the square tool to select the whole image window.
ii. To select only a specific tissue region (either for specificity or to avoid imaging abnormalities like dead tissue or autofluorescence): use the polygon tool to select the region you would like to segment.
CRITICAL: If you skip this step, segmentation will stop. Please create your ROI prior to pressing OK on the dialog prompt (see video tutorial for a detailed walkthrough).
16. ROI Expansion (user selects 0–15 μm).
a. Use the slider to select the expansion value.
i. A choice of 0 μm expansion radius will proceed directly to quantification.
ii. A choice of >0 μm expansion radius will have a runtime of O(n), where n = number of segmented nuclei.
17. ROIs will be saved as ROIset.zip in the analyzedTables folder.
Figure 4.
FijiFISH: Segmentation and quantification
(A) Each round of DAPI is binarized and multiplied to the first image to retain only nuclei found across all rounds.
(B) Overlapping cells are separated by a watershed algorithm and then expanded by a user-designated radius.
(C) Gene expression is binarized and total particles are normalized to the ROI area and saved in a CSV. All scale bars represent 150 microns except inset bars, which represent 20 microns.
Quantification
Timing: ∼5–10 min
In this step, mFISH signal is binarized and the ROIs from segmentation are loaded to quantify gene expression within each ROI.
18. Initiate Quantification.
a. From the registeredImages or nonLinear folder (depending on which the user has chosen to use going forward), drag-and-drop the first gene image (e.g., R1_488_Gnb4.tif_registered.tif or R1_488_Gnb4.tif_registered.tif_NL.tif).
b. Select “Quantification…” from the FijiFISH drop-down menu.
19. Image Binarization (user selects Manual or Automatic).
a. Automatic.
i. Built-in MaxEntropy autothresholding is applied to all mFISH probe images.
b. Manual.
i. The user defines the threshold of every probe with a slider.
CRITICAL: See troubleshooting, problem 6 for how to deal with background removal and other noise filtering.
Note: Once quantification has run, if the user is unsatisfied with the automated thresholding of certain gene probes:
ii. Drag-and-drop the registered image to re-threshold.
iii. Select Single Channel Quantification and adjust accordingly.
iv. This will overwrite the previous quantification files.
20. All files are saved in nested folders:
a. Binarized images in overlay.
b. Quantified CSVs in analyzedTables.
Note: Raw data from FijiFISH is the optical density (the Mean value from Fiji’s Analyze > Analyze Particles… > Measure), which is the sum of the gray values within the ROI (0 or 255) divided by the area of the ROI in pixels (see the toy calculation below).
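For intuition, the toy calculation below (plain R, illustrative only) shows how this optical density value relates to the percent-area-covered (“PAC”) normalization used later in RUHi.
# Toy example: for a binarized ROI the mean gray value equals
# 255 * (signal pixels / total pixels), so dividing by 255 and multiplying
# by 100 gives the percent of the ROI area covered by signal ("PAC" in RUHi).
roi <- c(rep(255, 40), rep(0, 60))   # toy ROI: 40 of 100 pixels are signal
mean_gray <- mean(roi)               # 102 = sum of gray values / ROI area
pac <- mean_gray / 255 * 100         # 40 (percent area covered)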
Overlay generation
Timing: ∼1–20 min
Here a user can automatically or manually create a qualitative image overlay.
CRITICAL: If you wish to run the Automatic Overlay, please first download the custom LUTs from the GitHub repo and place them in the luts folder of your Fiji app. See troubleshooting, problem 7 for more.
21. Automatic Overlay (no user input on colors or order).
a. Initiate Overlay.
i. Drag-and-drop the first gene image from the overlay folder into Fiji (ensure that all of the Quantification files have already been populated, particularly the overlay folder from which the images will be drawn).
ii. Select “HiPlex Overlay…”.
Note: Images will be loaded and opaquely overlaid from highest to lowest expression. The overlay image will be saved in the overlay folder.
CRITICAL: This drop-down menu item will not work unless you have downloaded the supplementary LUTs (see troubleshooting, problem 7).
22. Manual Overlay (user defines colors and order).
a. Open each registered and binarized image from the overlay folder.
b. Change each gene to the desired LUT color.
c. Select the highest-expressing gene image.
i. Select Image > Overlay > Add Image…
ii. Select the next highest-expressing gene.
iii. Repeat until all genes have been overlaid, then save as a PNG or JPG.
CRITICAL: Ensure you have clicked ‘Zero Transparent’ so that only the signal and not the background is overlaid (see video tutorials for more).
Install the RUHi package
Timing: ∼5 min
23. Install RUHi.
a. Install R.
b. Install RStudio.
c. Install and load RUHi in RStudio using the code below:
#make sure to have devtools installed:
#install.packages("devtools")
#remove old version if updating
#remove.packages("RUHi")
#reinstall from kaitsull/RUHi or cembrowskilab/RUHi
devtools::install_github("kaitsull/RUHi")
Combining quantified tables in R
Timing: ∼5 min
In this step, the user takes the quantification output from Fiji and combines all imaged genes into a single table (Figures 5A–5C). Here, multiple experiments can also be concatenated into one large dataset. Finally, the user creates an mFISH object (Figure 5D) to be used in downstream analysis.
24. Set up proper file structures.
a. Create a new analysis folder outside of the folder used for Fiji image analysis (see video tutorial for more).
b. Name each folder after the metadata, separated by underscores (see Figure 5 for more on file folder structure and naming conventions).
25. To conjoin the individual genes from one tissue section into a single table, run:
library(RUHi)
mydata <- ruRead("~/path/to/data",
#metadata variable arguments
region = "intermediate", anum = "123456", section = "1")
CRITICAL: Ensure the metadata arguments are filled out in order to accurately separate individual experiments from one another when plotting in space (see video tutorials for more).
26. To combine multiple experiments (see also the sketch after this step):
#let's say we have four experiments that we have already read in with ruRead()
experiments <- list(df1, df2, df3, df4)
#combine into one
#NOTE: this WILL error if metadata are missing or genes are misspelled
mydata <- ruCombine(experiments)
CRITICAL: Make sure all metadata and gene names are consistent throughout, or else this will error. See troubleshooting, problem 8 for more.
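If you have many sections, the loop below is one possible way to build the list for ruCombine() programmatically. It assumes each experiment folder name encodes the metadata in the order region_anum_section, as described in step 24; adjust the order to match your own naming.
# Illustrative sketch: read every experiment folder with ruRead() and collect
# the results for ruCombine(). Assumes folder names like "intermediate_123456_1".
folders <- list.dirs("~/path/to/analysis", recursive = FALSE)
experiments <- lapply(folders, function(f) {
  meta <- strsplit(basename(f), "_")[[1]]
  ruRead(f, region = meta[1], anum = meta[2], section = meta[3])
})
mydata <- ruCombine(experiments)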
27. Create an mFISH object (see Figure 5D).
myobj <- ruMake(mydata)
Figure 5.
RUHi mFISH object structure
(A) CSV files generated from Quantification will be populated in the analyzedTables folder. Each experiment should remain in their own separate folders.
(B) Following ruRead(), all CSVs will be bound together into a single object of class data.frame. There are mandatory arguments to include in order to adequately tell experiments apart including: section – the imaging section number, anum – the animal number, and region – A/P axis location.
(C) To combine multiple experiments, use ruCombine() to concatenate a list of data frames output from ruRead().
(D) The user can then use ruMake() on the final data.frame to create an object with slots for raw data, filtered and normalized data, metadata, and a list of attributes storing all the analysis variables used.
Code-free previewing with goFISH (optional)
Timing: ∼5–10 min (dataset dependent)
The user can preview analysis of an imaging round, or the entire dataset in a code-free manner. Note that the larger the dataset, the slower this code-free implementation will be.
28. Once you have created the mFISH object in the previous step, use the following code to launch the goFISH Shiny app:
goFISH(myobj,
#option to filter by gene expression
filter.by = "Slc17a7",
#option to select number of clusters
k = 5,
#option to specify normalization method
norm = "PAC")
29. Here, one can select various options for analysis, such as:
a. Number of PCs to use in UMAP and clustering.
b. Number of clusters to display.
c. Specific gene expression patterns in UMAP and geographic space, as well as boxplots by cluster.
d. Clustering in UMAP and geographic space.
e. Split data by metadata variables.
Data preprocessing
Timing: ∼5–10 min (dataset dependent)
Here, the user can filter, normalize, run dimensionality reduction, and cluster their data.
CRITICAL: Computational post-processing for noise removal can also occur at this step. See troubleshooting, problem 9.
30. Option to filter data by the expression of a given gene.
a. For example, to keep only neurons:
myobj <- ruFilter(myobj,
#filter by a gene
filter.by="Snap25",
#select gene threshold
threshold=0.1,
#the user can choose to exclude noisy or irrelevant genes
exclude = NA)
31. Normalize and run linear dimensionality reduction (PCA).
CRITICAL: Ensure you choose a normalization method that works with your data. Options include log normalization [log2(x+1)], sum-to-one normalization [(x/sum(x))*100; each cell’s genes are scaled to sum to a constant], and percent area covered [PAC; (x/255)*100, since raw data from FijiFISH is the mean binarized gray value (0–255) within the ROI]. A short sketch of these formulas follows the code below.
CRITICAL: If one only has 2–3 genes, dimensionality reduction and clustering are probably overkill. See troubleshooting, problem 10 for more on how to directly access data from mFISH objects and turn them back into data frames for bespoke analysis.
myobj <- ruProcess(myobj,
#user can choose normalization
norm = "PAC",
#user can choose to remove cells that express too few or too many genes; automatically set to FALSE
remove.outliers=F,
#if remove.outliers=T, the user can suggest the min and max number of genes a cell should express
outlier.thresh = c(1,11))
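The three normalization options named above boil down to the one-liners below, applied to one cell’s vector of per-gene values; this is illustrative only, as RUHi applies the chosen method internally and may differ in detail.
# Minimal sketch of the normalization formulas named above (toy values).
x <- c(Gnb4 = 102, Nnat = 25.5, Ctgf = 0)   # toy raw optical-density values
log_norm <- log2(x + 1)                     # log normalization
sum_norm <- x / sum(x) * 100                # sum-to-one: scaled to sum to 100
pac      <- x / 255 * 100                   # percent area covered per ROI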
32. Check the dimensions of the data.
Note: This will visualize the number of principal components contributing variance to the dataset; the red line is drawn where 95% of the variance is accounted for. The value at the red line is used automatically; if you wish to change it, use the npc argument (as shown with ruUMAP() in step 33).
plotVar(myobj)
33. Nonlinear dimensionality reduction (UMAP).
myobj <- ruUMAP(myobj, npc=6)
34. Hierarchical clustering (R’s hclust with method “ward.D2”; a conceptual sketch of the underlying calls follows this step).
myobj <- ruCluster(myobj, k=5, p=2)
#plot the dendrogram to ensure proper clustering
plotDendro(myobj) #change k value to match dendrogram
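For reference, the base-R calls that this step is described as wrapping look roughly like the sketch below. The exact input matrix used inside ruCluster() is not specified here, so `embedding` is a stand-in for the reduced-dimension data.
# Conceptual sketch of the clustering this step describes (hclust, "ward.D2").
d  <- dist(embedding)                 # pairwise distances between cells
hc <- hclust(d, method = "ward.D2")   # Ward-linkage hierarchical clustering
cl <- cutree(hc, k = 5)               # assign k = 5 cluster labels
plot(hc)                              # dendrogram, cf. plotDendro()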
Data visualization
Timing: ∼5–10 min (dataset dependent)
In this final step, the user can use a variety of built-in functions to plot the data in embedded and geographic space. Users are welcome to also access the data frames stored in the mFISH object directly for more bespoke analysis and visualization.
35. Plot in space (see Figures 6B, 6C, and 6F for output).
#automatically coloured by cluster
plotSpace(myobj)
#split graph by metadata variables
plotSpace(myobj, group.by="cluster")
#colour by normalized gene expression
plotSpace(myobj, colour.by="Ctgf")
#colour by gene and split by metadata
plotSpace(myobj, colour.by="Ctgf", group.by="cluster")
36. Plot UMAP (see Figure 6B for output).
#automatically coloured by cluster
plotDim(myobj)
#split graph by metadata variables
plotDim(myobj, group.by="cluster")
37. Gene expression boxplots (Figure 6E).
#plot gene expression by cluster
geneBoxPlot(myobj, "Ctgf")
#plot all genes expressed in one cluster
clusterBoxPlot(myobj, clus="5")
38. Calculate and plot spatial properties (see Figure 6G for output).
#extract section coordinates and mfish data
coordsSection_1 <- getCoords(myobj, section_id = "1",
rotate_coords = F)
#compute distances: select 5-10 coords around nucleus/ROI
section1data = analyzeDistance('section1',
coord_data = coordsSection_1)
#for distances to centre, subtract the min distance value from stored values
section1data$distance_to_centre <-
section1data$distance_to_boundary -
min(section1data$distance_to_boundary)
#create boxplot
ggplot(section1data, aes(x = factor(cluster),
y = distance_to_centre, fill = factor(cluster))) + geom_boxplot()
#add metadata of choice to mfish object
myobj <- metaAdd(mfish_object = myobj,
metadata_to_add = section1data,
metadata_variables_to_add = "distance_to_centre")
#plot cells in space coloured by distance values
plotSpace(myobj, colour.by = "distance_to_centre")
#plot UMAP coloured by distance values
plotDim(myobj, colour.by = 'distance_to_centre')
Figure 6.
RUHi plotting functionality
(A) A final qualitative mFISH image overlay from FijiFISH after registration, segmentation, and quantification. Each color represents an individual gene which is saved as a CSV that can be read into RUHi. Scale bars: overview: 200 μm, expansion: 50 μm.
(B) Representative output from RUHi for image in A, with dimensionality reduction, clustering, and spatial visualization of quantified gene expression.
(C) Example of plotting by metadata, in this case using viral expression from tracing images taken prior to mFISH and registered together via FijiFISH.
(D) Pie charts show injection metadata groups partitioned by cluster.
(E) Boxplots show quantified gene expression for corresponding clusters shown in (D) performed with RUHi.
(F) Cells in their registered spatial location and UMAP embedding values colored by Gnb4 and Nnat expression performed with RUHi.
(G) Example output of spatial analysis and visualization performed with RUHi. Cells in their spatial location and UMAP embeddings are colored by computed distance to CLA center. Example boxplots show computed distance colored by cluster.
All box plots represent the first to third quartile of the data, with horizontal lines representing the median and whiskers representing the lowest or highest point within 1.5x of the interquartile range.
Expected outcomes
The presented tools should take a user from a collection of microscopy images to a fully quantified and analyzed dataset. On our GitHub, we provide multiple datasets where users can follow along with step-by-step tutorials (see GitHub for our video-format tutorials) using data in the exact format and naming style expected.
Prior to beginning FijiFISH, the user should have their file folder set up akin to step 3. By the end of FijiFISH, the user should have a file folder structure akin to Figure 2. The Registration step will result in maximum intensity projections (Figure 3A), cropping (Figure 3A), linear registration (Figure 3B), and optional nonlinear registration (Figure 3C) applied and saved in nested folders (Figure 2). The Segmentation step can optionally be run using the built-in method or external segmentation tools (such as Baysor17 or Cellpose18) that provide Fiji-readable ROIset.zip files. If using the built-in segmentation, this step will result in segmentation of cells present across all rounds of imaging (Figure 4A), dilated to a chosen radius (Figure 4B). If using external segmentation, the ROIset.zip need only be placed in the registration folder of choice before continuing to quantification. The Quantification step will result in quantified CSVs for each gene and their corresponding binarized images being saved (Figure 4C).
Prior to beginning RUHi, the user should have their file folders set up akin to Figure 5A. First, the user will create their mFISH object by combining all genes from a given experiment into a single data frame (Figure 5B). Then, multiple experiments can be concatenated and transformed into an mFISH object (Figure 5C). By the end of RUHi, the user will have a filtered and normalized dataset (Figure 5D) with linear and nonlinear dimensionality reduction as well as clustering performed (Figures 6A and 6B). The expected final outcome will be visualizations in embedded (UMAP) space or geographic space of clusters, gene expression, and other meta properties of the data – such as long-range projection targets (Figures 6C–6F) or distance to unique tissue landmarks (Figure 6G).
Limitations
These tools provide an open-source and editable framework for facilitating the analysis of multiplexed histology data. However, certain analytical and technical limitations exist within this workflow. For example, nonlinear registration can become problematic and inaccurate when large shifts are required for the initial rigid alignment. Moreover, accurate segmentation of cell bodies remains a key challenge in spatial probe-based transcriptomics: mRNA primarily resides in the cell body but can also be present in distal cell processes, making it difficult to delineate from adjacent cells. Additionally, densely packed cell bodies present segmentation difficulties, leading to potential cross-contamination of mRNA quantification between neighboring cells. mFISH imaging of RNA molecules is diffraction-limited, meaning each RNA molecule appears as a point spread function rather than its true spatial extent; during binarization, closely spaced molecules within the diffraction limit may therefore appear as a single object. To address this, our method quantifies the optical area covered by each gene’s signal as a proxy for molecular abundance, and users should consider this when interpreting results. While autothresholding techniques have been employed for this binarization, they remain imperfect, and advancements such as artificial intelligence-based methods could provide more accurate solutions. Since these tools were developed and specifically tailored to RNAscope HiPlex experiments, small hurdles and some customization may be required to accommodate different assays, imaging hardware outputs, or end goals. Fortunately, the open-source nature of this work allows for ongoing improvements and customization, enabling future refinements and adaptations as technology and user needs evolve.
Troubleshooting
Problem 1
No images present error in Step 5, Step 11, Step 18, or Step 21.
When starting up any of the FijiFISH steps (Registration, Segmentation, Quantification, or Overlay), an initial error reading: “Incorrectly titled image or wrong image kind.” might appear.
Potential solution
There could be a few underlying reasons this occurs, including incorrect image naming or incorrect file structure. The key here will be checking all the files and folders and making sure everything is properly formatted.
• Check that the images follow these strict guidelines:
○ Naming structure: “R1_405_DAPI.tif”.
○ File type: 8-bit TIFF.
○ Scaling: microns.
• Check that all images are placed in the same file folder.
○ See Figure 2 for file structure.
• Ensure the correct image from the correct folder was drag-and-dropped into Fiji.
Problem 2
Image dimension errors thrown after manual cropping in step 7b.
If the yellow box is altered in dimensions during manual cropping, one will come across issues during the following linear registration step. The error will most likely state that some images cannot be merged due to differences in dimension and the registration code will halt.
Potential solution
To fix this problem, one must start Registration from the start, from either the max folder or the base experiment folder.
• Drag-and-drop the image from the appropriate folder.
○ The max folder will yield the fastest results, as there isn’t a need to re-run the projections.
• Start Registration.
• When prompted, select Manual Cropping.
○ Place the cursor on the yellow box.
○ Avoid the white circles placed on the yellow box.
○ Do not drag the yellow box if your cursor is shaped like a finger – this will alter the dimensions of the final image.
○ Do not drag the yellow box beyond the limits of the image – this will make the dimensions of the image too small.
Problem 3
Nonlinear registration results in strange and inaccurate image warping in Step 9a.
After linear registration, some images might have large blank spaces on the sides. This can affect the way in which the nonlinear registration works as bUnwarpJ can be sensitive to these large, global nonlinearities between the images. Similarly, if images contain blank tiles due to polygon-shaped ROIs during imaging, the nonlinear registration might try to warp these changes to the template image, rather than focusing on individual cell bodies.
Potential solution
There are many workarounds that involve either using, augmenting, or completely foregoing the nonlinear registration.
• If you are happy with the linear registration results:
○ First, check the composite folder to see how the DAPI overlay looks.
○ Proceed with Segmentation and Quantification working from the linearly registered (registeredImages) folder.
• If you are unhappy with the linear registration results:
○ Try Manual Cropping to make the cropped ROIs more similar, thus requiring smaller shifts during linear registration.
○ Try augmenting the variables fed into nonlinear registration:
- Open Plugins > Macros > StartUp Macros…
- Ctrl + F for “troubleshooting 3”.
- Augment the arguments sent to bUnwarpJ, such as: registration, initial_deformation, final_deformation, landmark_weight, etc.
- Re-start Fiji before proceeding.
Problem 4
No or very few cells segmented in Steps 11–17.
Some users may find that despite there being thousands of cells in their image, only a few – or no cells at all – are segmented.
Potential solution
This problem almost always has to do with improper scaling of images, since only cells with an area between 30 μm² and 250 μm² are selected for segmentation. In other cases, it can be due to poor signal-to-noise across DAPI rounds affecting the image multiplication, or poor registration affecting the multiplication.
• First, check the scaling of the image:
○ Is the scaling in microns?
○ Does the scaling make sense?
- Use the Free Selection tool and outline a nucleus.
- Press Ctrl + M.
- A results table should appear with the measured area of the cell.
- If this seems incorrect (too small or too large), check the pixel-to-micron scaling used during imaging via the metadata or by checking the settings on your microscope.
- Scaling can be set through Analyze > Set Scale…
• Some DAPI images are noisy and difficult to binarize, or were poorly registered, affecting the image multiplication.
○ There is an option to include only certain rounds of DAPI in the final segmentation.
○ Select only the rounds that are viable for image multiplication.
○ Poor registration might affect the downstream analysis; carefully consider re-running linear registration using some of the tips in troubleshooting, problem 3.
• If you wish to change these min/max limits:
○ Open Plugins > Macros > StartUp Macros…
○ Ctrl + F for “troubleshooting 4”.
○ Change the values below the commented lines.
Problem 5
Segmented cells are highly agglomerated or are fragmented into small pieces in Steps 11–17.
Potential solution
High cell density can lead to agglomeration of cells during segmentation, as closely packed cells limit the algorithm’s ability to resolve clear boundaries. Fragmentation into smaller pieces often occurs due to highly condensed chromatin, resulting in punctate DAPI signals that produce multiple smaller segmentations. This is particularly common in large neurons, like human neurons.
• Cell agglomeration in high-density scenarios:
○ When using the built-in segmentation:
- Use an expansion of 0 to avoid overlaps.
- Try to under-do the binarization so that the nuclei are smaller and more separated.
○ Using alternative methods that may be more robust:
- Train a model in Cellpose18 and output ImageJ-readable ROIs using the --save_rois flag (see Cellpose’s documentation on native ImageJ ROI archive output for more on this).
- Try improving the built-in segmentation by using the generated ROIset.zip in combination with the expression density from the CSVs generated during Quantification in Baysor17 (note: this may involve extensive fine-tuning of the input CSVs and of the output CSV used to create mFISH objects in RUHi afterward; see Figure 5 for the exact CSV structure required to run RUHi).
- Run these on the template (Round 1) image and drop the ROIset.zip file into the analyzedTables folder.
• Cells are fragmented:
○ Try augmenting the sigma level of the Gaussian blur in Fiji:
- Plugins > Macros > StartUp Macros…
- Ctrl + F for “troubleshooting 5”.
- Increase the value of sigma in the Gaussian blur argument.
- Re-start Fiji before proceeding.
○ This increase in blurring should help prevent the cell body from getting sliced up or oddly segmented.
Problem 6
Probe images are noisy and difficult to binarize in Steps 18–20.
This can occur due to suboptimal imaging settings (such as excessively high gain) or biological issues (poor sample perfusion), resulting in difficulty removing background noise and resolving probe signal from images.
Potential solution
There are many ways to get around this issue. It is extremely important to be clear about the types of transformations performed on your images and report them in your methods section.
• Try using filters:
○ Run Remove Background on all images.
○ If you don’t need single-molecule resolution, try the Smooth function on all images.
○ If there is single-pixel-width detector noise, try Remove Outliers.
○ To avoid doing this by hand:
- Plugins > Macros > StartUp Macros…
- Ctrl + F for “troubleshooting 6” (there should be two instances).
- Add in the function you would like automatically applied to each image (both of these options are present but commented out, so all the user needs to do is remove the ‘//’ in front of the code line).
- Re-start Fiji before proceeding.
Problem 7
HiPlex Overlay throws LUT color errors in Step 21.
Potential solution
Please make sure you have downloaded the individual LUTs from GitHub and put them in the luts folder for Fiji. Alternatively, try making an overlay manually following the instructions from Overlay Generation.
Problem 8
ruCombine() fails to run in Steps 24–26.
Potential solution
When ruCombine() fails, there could be multiple reasons (a quick consistency-check sketch follows this list).
• Check that each experiment has the same metadata variables.
• Check that each experiment has the same gene name spelling (including capitalization).
• Check that each experiment is stored in a list.
• Check that each experiment is a data frame and not yet an mFISH object.
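The snippet below is an optional pre-check, not part of RUHi: it confirms that every element of the experiment list is a data frame with identical column names (genes plus metadata) so that misspellings are caught before the combine step errors.
# Optional pre-check before ruCombine() (illustrative, not part of RUHi).
stopifnot(all(sapply(experiments, is.data.frame)))
cols <- lapply(experiments, names)
sapply(cols, setequal, cols[[1]])   # should all be TRUE; FALSE flags a mismatch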
Problem 9
Preventative and post-quantification noise removal strategies in Step 30.
In certain scenarios, high background noise might be present across all images or in certain channels, due to microscope detector issues, laser gain, or fluorescence accumulation from lipofuscin or unperfused blood vessels.
Potential solution
To prevent fluorescence bleed-through during mFISH due to endogenous fluorophores:
• Photobleach by placing tissue in a reflective box with a bright light for 12–24 h (tissue-dependent) prior to beginning the mFISH protocol (good for lipid-based fluorescence like lipofuscin or red blood cell autofluorescence).
• Boil tissue for target retrieval (good for endogenous fluorophores like virus or transgene expression).
To computationally remove this noise post-quantification in RUHi (see the sketch after this list):
• During ruProcess(), set remove.outliers=T and use the optional outlier.thresh=c(1,11) argument.
○ Change the upper limit of this numeric vector to remove cells that express every single gene, which likely denotes autofluorescence in every round.
○ This value is automatically set to 11 and should be altered based on your experimental requirements.
○ The set values of 1 and 11 reflect our use of 12-gene mFISH panels, wherein single-gene-expressing populations became strange outliers in dimensionally reduced space, and 11- or 12-gene expressors were specifically identified as autofluorescent cells rather than cells with true expression.
○ If you wish for no cells to be filtered based on the number of genes, simply make the lower limit 0 and the upper limit greater than your total number of genes (e.g., c(0,13) for a dataset containing 12 genes).
• If your UMAP has “confetti”-like cells that express only a single gene, you can also remove these cells from analysis by increasing the cut-off to c(1,12).
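A call consistent with the ruProcess() signature shown in step 31 would look like the sketch below; the threshold values are examples only and should be tuned to your panel.
# Sketch consistent with the ruProcess() call in step 31: enable outlier
# removal and drop cells expressing too few or too many genes.
myobj <- ruProcess(myobj,
                   norm = "PAC",
                   remove.outliers = TRUE,
                   outlier.thresh = c(1, 11))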
Problem 10
Accessing data from the mFISH object to perform downstream analysis in Steps 35–38.
Potential solution
If you wish to perform and plot different analyses beyond the basic functions provided, you can access the different types of data stored in the mFISH object using the examples below. See Figure 5D for more details on the structure of the mFISH object.
• Using the raw data with the metadata:
library(dplyr)
raw <- myobj@rawData
meta <- myobj@metaData
rawdata <- left_join(meta, raw, by="id")
• Using the normalized and filtered data with the metadata:
library(dplyr)
norm <- myobj@filteredData
meta <- myobj@metaData
#remove metadata that was filtered out
meta <- filter(meta, fil==T)
normdata <- left_join(meta, norm, by="id")
• Accessing the PCA information:
#to access the output of prcomp()
pca <- myobj@attributes$pca
Once accessed and saved as new variables, these structures will act like normal R data frames that can be analyzed and plotted using base R or classic packages like dplyr and ggplot2 (see the example after this section).
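As a minimal example of such bespoke plotting, the sketch below colors cells in space by one gene using ggplot2. The coordinate column names X and Y are assumptions; check names(normdata) and substitute whatever your metadata actually stores.
# Minimal bespoke-plotting sketch using the joined data frame from above.
library(ggplot2)
ggplot(normdata, aes(x = X, y = Y, colour = Gnb4)) +
  geom_point(size = 0.5) +
  coord_fixed() +          # keep the spatial aspect ratio
  theme_minimal()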
Resource availability
Lead contact
Requests for further information and resources should be directed to and will be fulfilled by the lead contact, Mark S. Cembrowski (mark.cembrowski@ubc.ca).
Technical contact
Technical questions on executing this protocol should be directed to and will be answered by the technical contact, Margarita Kapustina (margokap@student.ubc.ca).
Materials availability
All RNAscope HiPlex reagents are available from Advanced Cell Diagnostics. Fiji, R, and RStudio (a commonly used integrated development environment for R) are free tools available at the following links.
• Fiji - RRID:SCR_002285 - imagej.net/software/fiji/downloads.
• R - RRID:SCR_001905 - cran.r-project.org/mirrors.html.
• RStudio - RRID:SCR_000432 - posit.co/downloads/.
• FijiFISH - Zenodo: https://doi.org/10.5281/zenodo.17080223.
• RUHi - Zenodo: https://doi.org/10.5281/zenodo.17080256.
• Video tutorials - Zenodo: https://doi.org/10.5281/zenodo.17080282.
• Down-sampled images - FigShare: https://doi.org/10.6084/m9.figshare.28910930.
• Down-sampled CSVs - FigShare: https://doi.org/10.6084/m9.figshare.28911218.
• Full-size images - FigShare: https://doi.org/10.6084/m9.figshare.28910936.
• Full-size CSVs - FigShare: https://doi.org/10.6084/m9.figshare.28911227.
Data and code availability
• All data and code used in this protocol can be found on our GitHub at the repository links:
○ FijiFISH repository - GitHub: github.com/cembrowskilab/FijiFISH.
○ RUHi repository - GitHub: github.com/cembrowskilab/RUHi.
• Original publication source material and results for the mouse claustrum can be found at https://doi.org/10.7554/eLife.68967.
Acknowledgments
M.S.C. is supported by the University of British Columbia (Department of Cellular and Physiological Sciences, Djavad Mowafaghian Centre for Brain Health, and the Faculty of Medicine Research Office) and the Natural Sciences and Engineering Research Council of Canada (RGPIN-2019-04507 and RGPIN-2024-03916). K.E.S. is supported by a Royal Canadian Legion Masters Scholarship in Veteran Health Research from the Canadian Institute for Military and Veteran Health Research, a Canada Graduate Scholarship – Masters from the Canadian Institute of Health Research, and a Canada Graduate Scholarship – Doctoral from the Natural Sciences and Engineering Research Council. B.N.B. is supported by a Canada Graduate Scholarship – Masters from the Canadian Institute of Health Research, a Dorothy May Ladner Memorial Fellowship from the University of British Columbia, and a Canadian Graduate Scholarship – Doctoral from the National Sciences and Engineering Research Council. M.K. is supported by a Canada Graduate Scholarship – Masters from the Canadian Institute of Health Research and a Postgraduate Scholarship – Doctoral from the Natural Sciences and Engineering Research Council. This work was supported by resources made available through the NeuroImaging and NeuroComputation Centre at the Djavad Mowafaghian Centre for Brain Health (RRID: SCR_019086). We thank members of the Cembrowski lab for helpful discussions and Jeffrey LeDue for insight and guidance in image acquisition and analysis.
Author contributions
K.E.S. created combined FijiFISH macro, R Shiny app, RUHi package, and its base functions; created all documentation and video tutorials; created figures; edited manuscript text and figures; and wrote the manuscript. M.K. created RUHi advanced functions for distance calculations and adding metadata, edited manuscript text and figures, and continues to monitor GitHub Issues as a technical contact. B.N.B. performed tissue preparation and mFISH benchwork, generated original mFISH figures for original Erwin et al.,6 and edited manuscript text and figures. M.S.C. conceptualized the registration, segmentation, and quantification codes in FijiFISH; oversaw experiments and analysis; and edited manuscript text and figures.
Declaration of interests
The authors declare no competing interests.
Contributor Information
Margarita Kapustina, Email: margokap@student.ubc.ca.
Mark S. Cembrowski, Email: mark.cembrowski@ubc.ca.
References
- 1.Sullivan K.E., Kraus L., Kapustina M., Wang L., Stach T.R., Lemire A.L., Clements J., Cembrowski M.S. Sharp cell-type-identity changes differentiate the retrosplenial cortex from the neocortex. Cell Rep. 2023;42 doi: 10.1016/j.celrep.2023.112206. [DOI] [PubMed] [Google Scholar]
- 2.Femino A.M., Fay F.S., Fogarty K., Singer R.H. Visualization of single RNA transcripts in situ. Science. 1998;280:585–590. doi: 10.1126/science.280.5363.585. [DOI] [PubMed] [Google Scholar]
- 3.Kinman A.I., Merryweather D.N., Erwin S.R., Campbell R.E., Sullivan K.E., Kraus L., Kapustina M., Bristow B.N., Zhang M.Y., Elder M.W., et al. Atypical hippocampal excitatory neurons express and govern object memory. Nat. Commun. 2025;16:1195. doi: 10.1038/s41467-025-56260-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4.Kapustina M., Zhang A.A., Tsai J.Y.J., Bristow B.N., Kraus L., Sullivan K.E., Erwin S.R., Wang L., Stach T.R., Clements J., et al. The cell-type-specific spatial organization of the anterior thalamic nuclei of the mouse brain. Cell Rep. 2024;43 doi: 10.1016/j.celrep.2024.113842. [DOI] [PubMed] [Google Scholar]
- 5.O'Leary T.P., Kendrick R.M., Bristow B.N., Sullivan K.E., Wang L., Clements J., Lemire A.L., Cembrowski M.S. Neuronal cell types, projections, and spatial organization of the central amygdala. iScience. 2022;25 doi: 10.1016/j.isci.2022.105497. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.Erwin S.R., Bristow B.N., Sullivan K.E., Kendrick R.M., Marriott B., Wang L., Clements J., Lemire A.L., Jackson J., Cembrowski M.S. Spatially patterned excitatory neuron subtypes and projections of the claustrum. eLife. 2021;10 doi: 10.7554/eLife.68967. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7.O'Leary T.P., Sullivan K.E., Wang L., Clements J., Lemire A.L., Cembrowski M.S. Extensive and spatially variable within-cell-type heterogeneity across the basolateral amygdala. eLife. 2020;9 doi: 10.7554/eLife.59003. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8.Codeluppi S., Borm L.E., Zeisel A., La Manno G., van Lunteren J.A., Svensson C.I., Linnarsson S. Spatial organization of the somatosensory cortex revealed by osmFISH. Nat. Methods. 2018;15:932–935. doi: 10.1038/s41592-018-0175-z. [DOI] [PubMed] [Google Scholar]
- 9.Moffitt J.R., Zhuang X. RNA Imaging with Multiplexed Error-Robust Fluorescence In Situ Hybridization (MERFISH) Methods Enzymol. 2016;572:1–49. doi: 10.1016/bs.mie.2016.03.020. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.Cembrowski M.S. Single-cell transcriptomics as a framework and roadmap for understanding the brain. J. Neurosci. Methods. 2019;326 doi: 10.1016/j.jneumeth.2019.108353. [DOI] [PubMed] [Google Scholar]
- 11.Sullivan K.E., Kendrick R.M., Cembrowski M.S. Elucidating memory in the brain via single-cell transcriptomics. J. Neurochem. 2021;157:982–992. doi: 10.1111/jnc.15250. [DOI] [PubMed] [Google Scholar]
- 12.Schindelin J., Arganda-Carreras I., Frise E., Kaynig V., Longair M., Pietzsch T., Preibisch S., Rueden C., Saalfeld S., Schmid B., et al. Fiji: an open-source platform for biological-image analysis. Nat. Methods. 2012;9:676–682. doi: 10.1038/nmeth.2019. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 13.Kasprzak P., Mitchell L., Kravchuk O., Timmins A. Six Years of Shiny in Research - Collaborative Development of Web Tools in R. R J. 2021;12:155–162. [Google Scholar]
- 14.R Core Team . R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing; Vienna, Austria: 2024. https://www.R-project.org/ [Google Scholar]
- 15.Arganda-Carreras I., Fernández-González R., Muñoz-Barrutia A., Ortiz-De-Solorzano C. 3D reconstruction of histological sections: Application to mammary gland tissue. Microsc. Res. Tech. 2010;73:1019–1029. doi: 10.1002/jemt.20829. [DOI] [PubMed] [Google Scholar]
- 16.McInnes L., Healy J., Saul N., Großberger L. UMAP: Uniform Manifold Approximation and Projection. J. Open Source Softw. 2018;3:861. [Google Scholar]
- 17.Petukhov V., Xu R.J., Soldatov R.A., Cadinu P., Khodosevich K., Moffitt J.R., Kharchenko P.V. Cell segmentation in imaging-based spatial transcriptomics. Nat. Biotechnol. 2022;40:345–354. doi: 10.1038/s41587-021-01044-w. [DOI] [PubMed] [Google Scholar]
- 18.Stringer C., Wang T., Michaelos M., Pachitariu M. Cellpose: a generalist algorithm for cellular segmentation. Nat. Methods. 2021;18:100–106. doi: 10.1038/s41592-020-01018-x. [DOI] [PubMed] [Google Scholar]