Bringing the Cell Image into Focus


Tuesday, November 2, 2010

Improvements in transmission electron microscope (TEM) technology increase the power of this imaging tool for the study of cell biology.

The first report demonstrating the improved resolving power of the transmission electron microscope (TEM) over light microscopy for imaging cell ultrastructure was published in 1945 by Keith Porter, Albert Claude, and Ernest Fullam (Porter K, Claude A, Fullam E. A study of tissue culture cells by electron microscopy. J Exp Med. 1945;81:233-246. doi:10.1084/jem.81.3.233). Just four years later, in 1949, Philips Electron Optics, which continues in operation today as FEI Company, unveiled its first commercial TEM, the EM100 (figure 1). Since then, TEM has become a powerful tool for scientists, especially cell biologists. Its use as a probe into the molecular architecture of cells has a long history that continues today. That history, however, has not been without challenges, and meeting those challenges with continued improvements has made TEM the instrument it is today.

Figure 1: Philips EM100 TEM. (Source: FEI Company)
In addition to having a key role in the material and physical sciences, TEM has many life science applications, especially in molecular cell biology. TEM can be used to perform detailed microscopic examinations of a wide range of cell types, and more recently, on fresh cell or tissue samples.

Significant technical developments incorporated into the modern TEM, together with powerful computer programs that guide and assist users, now mean that biological samples can be imaged easily for either two-dimensional (2D) or three-dimensional (3D) investigations. In parallel with significant improvements in light microscopy over the past few years, recent advances in TEM hardware and software now afford significant improvements in the level of structural detail revealed by electron microscopic imaging and elemental analysis.

Cryo-TEM
TEM alone combines the power to resolve biological structures with the ability to visualize them in their functional, three-dimensional context, and native, fully-hydrated state. Conventional TEM sample preparation is a lengthy, cumbersome process that typically consists of several steps: fixing, dehydration, staining, embedding in plastic resin, sectioning of cells/tissue, and then transferring the sections to a metal support (‘grid’) covered with a thin layer of electron transparent material. These protocols are well established and routinely performed, but the negative impact they may have on biological structure in some instances—especially at the molecular level—cannot be dismissed. Cryogenic sample preparation avoids many of these concerns by immobilizing molecules, cells and tissues intact at the instant they are fast-frozen, thus capturing them in their native state without chemical cross-linking. These features make Cryo-TEM an excellent tool for examining virus particles, for example (figure 2).

Figure 2: Negatively stained H1N1 virus particles imaged using a TEM reveal key ultrastructural features of the virus. (Source: Dr. Cynthia Goldsmith, Infectious Disease Pathology Branch, Centers for Disease Control and Prevention)

Faster, more reproducible sample preparation, using the Vitrobot from FEI Company, for example, is another improvement in TEM investigations. The Vitrobot is a vitrification robot that can be used to provide a complete solution for Cryo-TEM, and typically yields more reproducible and successful results than previous manual plunge-freezing devices. Vitrification cools the sample so rapidly that water molecules do not have time to crystallize, forming instead an amorphous solid that does little or no damage to the sample structure.

3D Structure Modeling
The ability to see all three dimensions in biological specimens is an invaluable aid to understanding the relationships between structure and function throughout the range of scales and across all levels of the organizational hierarchy. However, it becomes especially important at smaller scales where the structures of interest are much smaller than the thickness of the sample and the occurrence of overlapping structures in the projected TEM image becomes problematic. Structural information along the electron beam direction is averaged for each point in the image, making overlapping structures difficult or impossible to interpret. TEM resolution in this direction is essentially limited by the thickness of the sample (≥30 nm). This is a significant limitation for three-dimensional molecular structures where the scale of information is a nanometer (roughly 10 atomic diameters) or less. Advanced computational techniques, which combine 2D TEM images to generate detailed 3D models, have resolved structural details as small as a few tenths of a nanometer (nm) (figure 3). Two different approaches to 3D modeling, electron tomography (ET) and single particle analysis (SPA), are widely used.
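The loss of depth information in a single projection can be seen with a toy calculation. This is an illustrative sketch only, with made-up density values, not microscope software:

```python
# Toy illustration of projection averaging (not microscope software).
# Densities are made up. A TEM image pixel records the integral of
# density along the beam (z), so depth information is lost.
column_two_features = [0.0, 2.0, 0.0, 3.0, 0.0]  # two structures, at z=1 and z=3
column_one_feature = [0.0, 0.0, 5.0, 0.0, 0.0]   # one structure at z=2

pixel_a = sum(column_two_features)
pixel_b = sum(column_one_feature)

# Both columns project to the same pixel value: a single 2D image
# cannot distinguish them, which is why multiple views are needed.
print(pixel_a == pixel_b)  # -> True
```

This is exactly the ambiguity that tilting the sample and acquiring multiple views resolves.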

Figure 3: Keyhole Limpet Hemocyanin (KLH1) structure elucidation by Cryo-TEM and angular reconstitution. (Source: Dr. Felix de Haas, FEI NanoPort)

Electron Tomography
Electron Tomography (ET) creates a detailed, virtual 3D model of the specimen from a series of 2D images acquired over a large angular range as the sample is tilted in known increments about an axis perpendicular to the direction of the beam. It is similar to tomographic imaging techniques used in clinical medicine (e.g., the CAT scan) and in industrial inspection. Software packages simplify the acquisition of image “tilt series” for tomographic reconstruction by providing an efficient, streamlined, fully integrated workflow. In addition to integrated commercial software, a number of academic software packages are available to assist in image data collection and processing for ET.
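The idea behind tomographic reconstruction can be sketched in a few lines of Python. This toy example (not any vendor's ET software) reduces the tilt series to just its two orthogonal extreme views and reconstructs by unfiltered back-projection; real ET uses many tilt increments and filtered or iterative algorithms:

```python
# Toy two-view tomographic reconstruction (illustration only).
# The "specimen" is a 2D density grid; each projection sums density
# along the beam direction, as a TEM tilt image does.
grid = [
    [0, 0, 0, 0],
    [0, 5, 1, 0],
    [0, 1, 5, 0],
    [0, 0, 0, 0],
]
n = len(grid)

# Two extreme views of a tilt series: beam along x (row sums) and,
# after a 90-degree tilt, beam along y (column sums).
proj_rows = [sum(row) for row in grid]
proj_cols = [sum(grid[r][c] for r in range(n)) for c in range(n)]

# Unfiltered back-projection: smear each 1D projection back along its
# beam direction and average the two views.
recon = [[(proj_rows[r] + proj_cols[c]) / (2 * n) for c in range(n)]
         for r in range(n)]

# The reconstruction is blurred, but density concentrates where the
# object actually is (the central 2x2 block).
peak = max(max(row) for row in recon)
print(peak)  # -> 1.5
```

Adding more tilt angles sharpens the reconstruction, which is why the angular range limitation discussed next matters.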

ET is limited by its inability to collect data at high tilt angles, where the apparent thickness of the sample becomes excessive; the resulting unsampled region is known as the “missing wedge” problem. Dual-axis tomography addresses this limitation by acquiring two tilt series about two axes orthogonal to each other and to the beam, reducing the missing wedge to a missing pyramid.

Scanning transmission electron microscopy (STEM) has advantages over TEM in some biological tomography applications. STEM can produce clear images from stained specimens that are too thick (up to 1 μm) for conventional TEM. Dynamic focusing in STEM keeps the probe focused across large fields of view on tilted specimens, so high-tilt images in the tilt series contain much better information than the corresponding TEM images, where dynamic focusing is not possible.

A simplified and efficient approach for mapping and analyzing entire mammalian cells in 3D using ET has been developed within the past few years; it provides precise quantitative information about cellular organization and ultrastructure on a global scale at resolutions approaching ~10 nm. The technique has yielded unique insights into fundamental structure-function relationships among key cellular organelles on a timescale of months rather than years. It has been applied to image, reconstruct, and analyze entire pancreatic beta cells in 3D at an order of magnitude better resolution than standard light microscopy (figure 4).

Figure 4: The upper panels show an X-Z view of a 3D reconstruction at intermediate resolution (~10 nm) of a whole cell assembled from serial section tomograms of semi-thick (350 nm) plastic sections cut from high-pressure frozen/freeze-substituted pancreatic islet cells after plastic-embedding. The panel on the right shows a zoomed view of the Golgi region revealing the complexity of spatial relationships between membranes of the Golgi ribbon (main ribbon, grey; trans-Golgi, red), mitochondria (green, thick) and microtubules (green, thin). The lower panels show an image slice from a high-resolution (~5 nm) tomogram of the Golgi region in a similarly prepared pancreatic islet cell. The pseudo-colored panel on the right shows the organization of the stacked Golgi cisternae of the ribbon together with insulin secretory granules (dark blue) and microtubules (green) in the region. Bar, 500 nm. (Source: Dr. Brad Marsh, Institute for Molecular Bioscience & ARC Centre of Excellence in Bioinformatics, The University of Queensland)

Single Particle Analysis
Single Particle Analysis (SPA) is a powerful technique that uses the high resolving power of the TEM to visualize single molecules and macromolecular assemblies for detailed structural analysis. The name is somewhat of a misnomer: SPA actually relies on a large number of images of many separate, but structurally identical, particles to compute a 3D model of the particle’s structure. A large number of arbitrarily oriented particles are first sorted into classes of similar orientation. A ‘class-averaging’ approach then generates a high-quality, low-noise composite image for each class. Finally, the composite images are combined to compute a 3D model. SPA is particularly well suited to structures with a high degree of symmetry, such as the icosahedral symmetry exhibited by some viruses. Because it operates on the assumption that the majority of particles are identical, it is challenging to study variations among individual particles that might reflect the range of molecular conformations associated with different functional states. However, software is currently being developed to take sample heterogeneity into account during the processing of SPA data.
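The class-averaging step can be illustrated with a toy example (a 1D stand-in for 2D particle images, with made-up density values): averaging N noisy members of one orientation class suppresses the noise by roughly 1/√N.

```python
import random

random.seed(0)  # reproducible toy data

# Toy class averaging (a 1D stand-in for 2D particle images; values
# are made up). All "images" show the same particle in the same
# orientation, each corrupted by independent noise.
particle = [0.0, 1.0, 3.0, 1.0, 0.0]
n_images = 500

noisy = [[p + random.gauss(0.0, 1.0) for p in particle]
         for _ in range(n_images)]

# Class average: pixel-wise mean over all members of the class.
class_avg = [sum(img[i] for img in noisy) / n_images
             for i in range(len(particle))]

# Per-image noise of sigma = 1.0 falls to roughly 1/sqrt(500), about
# 0.045, in the average, so the underlying particle emerges clearly.
errors = [abs(a - p) for a, p in zip(class_avg, particle)]
print(class_avg.index(max(class_avg)))  # -> 2, the particle's true peak
```

In real SPA, each class average corresponds to one viewing direction, and the set of averages is combined into a 3D map.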

Usability
For decades, extensive structural studies of cells and tissues by TEM faced a number of major challenges. In general, TEM systems developed before 1990 were not very user-friendly: operating one required extensive training, often amounting to a PhD in electron microscopy. With little automation, warm-up and alignment procedures were lengthy and complicated, and imaging was often difficult and cumbersome, making the instruments inefficient, especially for experiments requiring high throughput. Lengthy and complex sample preparation added further inefficiency for the TEM user.

There have been many improvements in TEM technology over the years. Newer TEM systems are highly automated, thus enabling users in the life sciences (e.g. cell biologists) to perform experiments with less manpower and in less time than with traditional TEM. Extensive automation also allows users at all levels to concentrate on the science of the specimen rather than the operation of the microscope, enhancing speed, reducing operator fatigue, and improving the repeatability and reproducibility of results.

Most biological materials are sensitive to beam damage, and it can be challenging to acquire images with sufficient contrast and signal-to-noise ratio without destroying the sample. Automated low-dose imaging techniques minimize the exposure of the imaged region, ensuring that the maximum information is extracted from the specimen within the beam exposure it can tolerate.
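The underlying trade-off reduces to simple dose-budget arithmetic; the numbers below are purely illustrative, not instrument specifications:

```python
# Hypothetical dose-budget arithmetic for a low-dose tilt series.
# The numbers are illustrative only, not instrument specifications.
total_dose = 100.0   # tolerable cumulative exposure, electrons per sq. angstrom
n_tilts = 121        # e.g. -60 to +60 degrees in 1-degree steps

# Splitting the tolerated dose evenly across the tilt series caps the
# exposure each image may use.
dose_per_image = total_dose / n_tilts
print(round(dose_per_image, 2))  # -> 0.83
```

The fewer electrons each image may use, the noisier it is, which is one reason automated targeting (focusing and tracking on areas adjacent to the region of interest) is so valuable.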

Newer TEM systems allow single-button warm-up, eliminating the wait associated with the initial operating steps in older models. In some new systems, a single button click automatically switches on the high tension and the filament, opens the column valves, and presents a low-magnification image of the sample at appropriate intensity on the fluorescent screen or digital display. Digital control and imaging systems allow users to view images on a display screen while the computer drives the instrument, a welcome change from the knob-based controls of older systems.

The use of digital cameras to capture images makes image data more manageable and facilitates follow-on analysis. This is much faster and easier than earlier systems that relied on film, and images stored in a digital file format are also easier to access and share.

Some of the most exciting developments are occurring in the area of correlative workflow: the ability to move samples and information between instruments, typically between optical microscopes and TEM, or between a dual-beam system (a combined focused ion beam and scanning electron microscope) and a scanning or transmission electron microscope. Correlative workflow provides seamless navigation between platforms, allowing the investigator to leverage the unique capabilities of each, e.g., applying the high resolution of TEM to structures first localized by fluorescent tags in a light microscope.

It is becoming increasingly important that life science researchers integrate TEM data with data produced by other structural biology tools, such as nuclear magnetic resonance (NMR) and X-ray diffraction (XRD). Until recently, these tools were the primary methods used to determine 3D structure at the atomic and molecular scale. Despite their capability of resolving structure at the atomic scale, both XRD and NMR have significant limitations: NMR is limited to relatively small molecules, and XRD requires that the analyzed molecules first be isolated and crystallized. Increasingly, TEM is being used to provide an overall view at the macromolecular level, with XRD and NMR adding atomic-scale detail when available.

Conclusion
TEM is a powerful tool for today’s life scientist. Over the years it has undergone continual improvement, and that progress continues. Advances in automation, software, and sample preparation have led to an ever-expanding range of applications: Cryo-TEM, single particle analysis, electron tomography, and 3D structure modeling, to name a few. These technological improvements have enabled life scientists to perform these applications and more, making the TEM an invaluable research tool.

About the Author
Wim Voorhout joined FEI Company in 2000 and most recently served as product marketing manager for cellular biology solutions.

This article was published in Bioscience Technology magazine: Vol. 34, No. 10, October, 2010, pp. 16-19.
