How to Optimize Flow Cytometry Hardware For Rare Event Analysis

“Not everything that can be counted counts, and not everything that counts can be counted.” — William Bruce Cameron (often misattributed to Albert Einstein)

What does this quote mean in terms of flow cytometry? Flow cytometry can yield multi-parametric data on millions of cells, which makes it an excellent tool for the detection of rare biological events — cells with a frequency of less than 1 in 1,000.

With the development and commercialization of instruments such as the Symphony, the ZE5, and others that can measure 20 or more fluorescent parameters at the same time, researchers now have the ability to characterize minuscule population subsets that continue to inspire more and more complex questions.

When planning experiments to detect — and potentially sort — rare events using flow cytometry, we need to optimize our hardware to ensure that optimal signals are being generated and that rare events of interest are not lost in the system noise. This noise is also exacerbated by poor practices when running the flow cytometer.

There are 3 areas of hardware limitations that we need to consider when performing rare event flow cytometry.

1. Speed of the fluidics

The first step in running cells on the flow cytometer is setting up the fluidics to ensure the best flow possible while minimizing coincident events and data spread.

Hydrodynamic focusing is the process that confines our cells within the core stream, pushes them along, and spaces them out along the direction of flow, so that the cells line up single file and pass through the focal point of the laser beam.

But, if the differential pressure is increased, what happens?

An increase in differential pressure between the sheath fluid and the sample fluid being introduced to the flow cytometer causes the core stream to widen. And, as it widens, more cells can pass through the laser per unit time.

There are 2 reasons why this is a concern, especially for rare event analysis:

  1. 2 cells can pass through the laser at the same time, resulting in what is measured as a doublet, and therefore both must be excluded.

By having to exclude more cells, the chances of detecting a rare event decrease.

FIGURE 1: Impact of increasing differential pressure on flow cytometry data.

  2. As the core stream widens, the cells at the edge are more poorly illuminated, and therefore emit less intensely.

When we increase differential pressure, we increase the flow rate and core stream width, allowing the cells to move and meander within the core stream. Some of these cells will not be exposed to the full laser power.

Therefore, the CV of the data spreads and we lose resolution between 2 populations, as seen in the graph below, on the right.

FIGURE 2: Effects of differential pressure on flow cytometry data. Peak CVs spread at higher flow rates.

Thus, there is a trade-off between speed of acquisition and the quality of your resolution.

Best practice for rare event analysis is to run the system at low differential pressure so that the event rate is no more than 10,000 events per second (depending on your instrument).

It is often even better to run at a lower rate, such as 5,000 events per second. While this means that acquisition time will take twice as long, the quality of data will be improved. Is the trade-off worth it? For rare event analysis, it is almost a requirement.
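To put that trade-off in concrete terms, here is a minimal sketch (in Python) that estimates how long an acquisition must run to collect a target number of rare events at a given event rate. The frequency and target numbers are hypothetical examples, not recommendations for your experiment.

```python
def acquisition_time_seconds(target_rare_events, rare_frequency, events_per_second):
    """Estimate how long an acquisition must run to record a target
    number of rare events.

    target_rare_events: rare cells you want to record (e.g., 100)
    rare_frequency:     frequency of the rare population (e.g., 1e-4 = 1 in 10,000)
    events_per_second:  acquisition rate set on the cytometer
    """
    total_events_needed = target_rare_events / rare_frequency
    return total_events_needed / events_per_second

# Hypothetical example: a population at 1 in 10,000, aiming for 100 rare events.
for rate in (10_000, 5_000):
    minutes = acquisition_time_seconds(100, 1e-4, rate) / 60
    print(f"{rate} events/s -> {minutes:.1f} minutes of acquisition")
```

Halving the event rate doubles the acquisition time, but, as discussed above, it also tightens the CVs and reduces coincident events.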

Newer technology, like acoustic focusing from Thermo Fisher, is helping to diminish this effect. Acoustic focusing uses a standing acoustical wave that forces the cells into the center of the core stream, allowing you to run much, much faster than a traditional flow cytometer, without the data spreading found in conventional systems relying on hydrodynamic focusing alone.

2. Coincident events and aborts

What is a coincident event and how does this impact the data?

It all starts with the measurement of the electronic pulse. The schematic of pulse generation is shown in Figure 3.

As a cell passes the laser intercept, photons are received by the PMT, which converts them into photocurrent. When the cell is fully inside the laser, the maximal number of photons is being generated and the pulse reaches its peak (the height measurement) before falling back to 0.

Figure 3: Pulse generation as a cell passes through the laser.
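To make the height, width, and area measurements concrete, the short sketch below models an idealized pulse and extracts the three standard pulse parameters. The Gaussian shape and microsecond time scale are purely illustrative assumptions, not a model of any particular instrument.

```python
import numpy as np

# Simulate an idealized photocurrent pulse as a cell transits the laser beam.
# The Gaussian shape and time scale below are illustrative only.
t = np.linspace(0.0, 2.0, 2001)                        # time in microseconds
dt = t[1] - t[0]
pulse = np.exp(-((t - 1.0) ** 2) / (2 * 0.15 ** 2))     # normalized photocurrent

height = pulse.max()                                    # peak signal (H)
area = pulse.sum() * dt                                 # integrated signal (A)
width = (pulse > 0.5 * height).sum() * dt               # time above half-maximum (W)

print(f"Height: {height:.2f}, Area: {area:.3f}, Width: {width:.3f} microseconds")
```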

A problem arises if a second cell passes into the laser intercept before the first pulse has finished being processed: both events will be aborted, resulting in lost data.

Thus, the size of the pulse matters.

The size of the pulse is ultimately going to be the size of the cell plus the beam height.

A hypothetical 5-micron cell and a 20-micron laser beam yield a 25-micron pulse. The stream in a typical analyzer travels at about 5 meters per second, while in a sorter it travels at about 30 meters per second. Thus, on a cell sorter it takes roughly 0.83 microseconds for this typical pulse to be processed.
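The same arithmetic can be written as a small helper, assuming (as above) that the effective pulse length is the cell diameter plus the beam height:

```python
def pulse_duration_us(cell_diameter_um, beam_height_um, stream_velocity_m_per_s):
    """Time (in microseconds) for a cell to traverse the laser beam,
    assuming the effective pulse length is cell diameter plus beam height."""
    pulse_length_m = (cell_diameter_um + beam_height_um) * 1e-6
    return pulse_length_m / stream_velocity_m_per_s * 1e6

# The worked example from the text: a 5-micron cell and a 20-micron beam.
print(f"Analyzer (5 m/s): {pulse_duration_us(5, 20, 5):.2f} microseconds")   # ~5.00
print(f"Sorter (30 m/s):  {pulse_duration_us(5, 20, 30):.2f} microseconds")  # ~0.83
```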

On some instruments, such as those from BD, an additional period called the window extension is added to this processing time. This extension increases the time during which the system is looking for a pulse and is depicted in Figure 4.

Figure 4: The impact of window extension on pulse processing.

Imagine a cell has just passed through the laser intercept and the pulse is being processed. The next event cannot enter this extended window until the first cell leaves it; otherwise, it is considered a coincident event and excluded. This window can be increased or decreased based on the size of the cell.
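A rough way to reason about how window length and event rate interact is to treat cell arrivals as a Poisson process, in which case the chance that a second cell arrives during the processing window of the first is approximately 1 − exp(−rate × window). The sketch below uses that approximation; the window durations are hypothetical, not values from any specific instrument.

```python
import math

def coincidence_probability(event_rate_per_s, window_us):
    """Approximate probability that another event arrives while a given
    event's processing window (pulse plus window extension) is still open,
    assuming Poisson-distributed arrivals."""
    return 1.0 - math.exp(-event_rate_per_s * window_us * 1e-6)

# Hypothetical window lengths (microseconds) and event rates (events/s).
for window in (1.0, 5.0, 10.0):
    for rate in (5_000, 10_000, 25_000):
        p = coincidence_probability(rate, window)
        print(f"window {window:4.1f} us at {rate:6d} events/s -> ~{p:.1%} coincidences")
```

Under this approximation, lengthening the window or raising the event rate both increase the fraction of events flagged as coincident, which is consistent with the abort rates shown in Figure 5.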

The consequence of altering window extension is shown in Figure 5. Window extension was increased and the number of electronic aborts was measured using 2 different sort masks after approximately 600,000 events.

At higher window extensions, there can be as much as a 13% loss of events.

Figure 5: The effect of window extension on abort rate based on 2 different sort masks.

3. Electronic limitations set by manufacturers

The final piece of the hardware puzzle is the set of limitations imposed by the vendors. These limitations can include the maximal number of events allowed in a file, the number of events per second that can be acquired, the flow rate, the number of gates in a gating hierarchy, and more.

It is critical to understand these limitations while planning the details of the experiment. For analytical flow, you may need to acquire multiple files of the same tube to ensure collection of sufficient event numbers. With sorting, gating hierarchy limitations require careful thought on how to identify the target cells.
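As a simple planning aid, the sketch below estimates how many files you would need to acquire when the software caps events per file. The cap, frequency, and target numbers are hypothetical; check your own instrument's limits.

```python
import math

def files_needed(target_rare_events, rare_frequency, max_events_per_file):
    """Number of acquisition files needed to capture a target number of
    rare events when each file is capped at max_events_per_file."""
    total_events = target_rare_events / rare_frequency
    return math.ceil(total_events / max_events_per_file)

# Hypothetical example: 200 events of a 1-in-100,000 population,
# with a software cap of 10 million events per file.
print(files_needed(200, 1e-5, 10_000_000))  # -> 2 files
```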

Preparing for rare event analysis requires an understanding of the power and limitations of the instrument to be used. From how fast to run the fluidics, to how the signal is processed, to the number of gates that can be used in the sorting experiment, each factor impacts the outcome of the experiment. With these hardware limitations understood, the next step is to understand how to address sample preparation and identification of the target cells.

To learn more about How to Optimize Flow Cytometry Hardware For Rare Event Analysis, and to get access to all of our advanced materials including 20 training videos, presentations, workbooks, and private group membership, get on the Flow Cytometry Mastery Class wait list.

Tim Bushnell, PhD

Tim Bushnell holds a PhD in Biology from the Rensselaer Polytechnic Institute. He is a co-founder of, and the didactic mind behind, ExCyte, the world's leading flow cytometry training company, which boasts a veritable library of in-the-lab resources on sequencing, microscopy, and related topics in the life sciences.
