Separating samples and then analyzing them often requires liquid chromatography (LC) and mass spectrometry (MS), respectively. To explore the ongoing advances in LC/MS technology, we talked with experts from AB SCIEX, JEOL, and Shimadzu.
Getting the best results from LC/MS depends on the entire system. “One challenge is running LC/MS with non-volatile buffers, like phosphate buffers,” says Chip Cody, PhD, product manager for mass spectrometry at JEOL. In some cases, these buffers suppress the analyte signal. Christian Klampfl of Johannes Kepler University in Linz, Austria, and his colleagues resolved this problem by combining HPLC with direct analysis in real time (DART) MS. “This is another tool in the toolbox,” says Cody, “and it answers a key challenge in LC/MS.”
JEOL provides another new tool, the SpiralTOF MALDI-TOF-TOF mass spectrometer. “This provides the highest resolution you can get on a TOF system,” Cody says. “Our MALDI-TOF-TOF system lets you select a single peak in a cluster and get an MS-MS spectrum for it.” In a protein digest, for example, LC separates the peptides, and MS-MS captures the structural data. “The combination of these technologies gives you the best of both worlds,” Cody explains.
Enhanced Acquisition & Analysis
To ensure that everything in a sample gets measured, AB SCIEX introduced its SWATH Acquisition technology, a form of data-independent analysis. Christie Hunter, PhD, the company’s director of omics applications, says, “Data-independent analysis in an LC timeframe is really enabled by the performance capabilities of the TripleTOF MS: MS-MS speed, sensitivity, and resolution.” She adds, “You then generate libraries of MS-MS data and use them to mine the comprehensive data sets for quantitative information.”
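Mining data-independent data against a spectral library generally comes down to comparing each acquired MS-MS spectrum with reference spectra and scoring the match; a common score is the normalized dot product. The Python sketch below illustrates that idea only. It is not AB SCIEX’s SWATH algorithm; the greedy peak matching, the 0.02 Da tolerance, and the data layout are all assumptions made for illustration.

```python
import math

def cosine_score(query, library, tol=0.02):
    """Score a query MS-MS spectrum against a library spectrum.

    Each spectrum is a list of (mz, intensity) pairs. Peaks are
    matched greedily within an m/z tolerance (Da), then scored with
    a normalized dot product. Illustrative only; real DIA tools use
    more elaborate matching and scoring.
    """
    matched = []
    used = set()
    for q_mz, q_int in query:
        for i, (l_mz, l_int) in enumerate(library):
            if i not in used and abs(q_mz - l_mz) <= tol:
                matched.append((q_int, l_int))
                used.add(i)
                break
    if not matched:
        return 0.0
    dot = sum(q * l for q, l in matched)
    q_norm = math.sqrt(sum(q * q for q, _ in query))
    l_norm = math.sqrt(sum(l * l for _, l in library))
    return dot / (q_norm * l_norm)

# Hypothetical fragment spectra as (m/z, intensity) pairs
query = [(175.12, 40.0), (262.15, 100.0), (375.23, 55.0)]
library = [(175.12, 35.0), (262.14, 100.0), (375.24, 60.0)]
print(f"similarity: {cosine_score(query, library):.3f}")
```

Normalizing over the full spectra, rather than just the matched peaks, penalizes spectra with many unexplained fragments, which is one simple way such scores keep false matches down.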
A scientist can also get more information from a sample with AB SCIEX’s SelexION Technology, which provides differential mobility separations. As Hunter explains, “This gives you an additional dimension of separation as part of an LC/MS experiment.” This feature is available on the company’s QTRAP LC/MS systems and will soon come out on the TripleTOF systems. “It’s part of the sample-introduction interface,” Hunter says, “and you can tune it to filter on the compound of interest to improve data quality through the additional selectivity.”
High-Sensitivity Analysis
Proteomics often requires the analysis of very small sample quantities. As Remco van Soest, product specialist at AB SCIEX, says, “Many people start out doing nano LC for discovery in proteomics, and they use nano LC for the high sensitivity that it provides. At some point, the scientists might want to work on biomarker verification or validation and still need good sensitivity, but also the robustness and throughput of micro LC.”
Previously, performing micro and nano LC required two instruments. Now, the Eksigent nanoLC 425 and 415 can do both. “These let users change the flow rate from nano to micro,” van Soest explains. “Some customers need the ability to switch back and forth maybe every few months.” To do that with the 425 or 415, the user just changes the flow module. “It’s like a cassette that you swap out,” van Soest says. “The instrument detects that the part has been changed and the software knows right away what gradients the system can run.” He adds, “No calibration is necessary.”
These systems even provide internal quality control. “We measure the flow that the pump delivers and provide constant correction,” van Soest says.
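Measuring the delivered flow and constantly correcting it can be pictured as a simple feedback loop: compare the sensor reading with the setpoint and nudge the pump command toward closing the gap. The sketch below is a generic proportional controller, not Eksigent’s actual control scheme; the function names, the gain, and the simulated pump bias are assumptions for illustration.

```python
def correct_flow(setpoint_nl_min, measured_nl_min, command, gain=0.5):
    """One step of a proportional flow-correction loop (illustrative).

    setpoint_nl_min : desired flow rate (nL/min)
    measured_nl_min : flow reported by the flow sensor (nL/min)
    command         : current pump command (nL/min equivalent)
    Returns an adjusted pump command.
    """
    error = setpoint_nl_min - measured_nl_min
    return command + gain * error

# Example: pump commanded at 300 nL/min but delivering slightly less
command = 300.0
for _ in range(5):
    measured = command * 290.0 / 300.0   # hypothetical pump bias
    command = correct_flow(300.0, measured, command)
    print(f"command={command:.1f} nL/min, measured={measured:.1f} nL/min")
```

Run repeatedly, the command creeps up until the measured flow sits on the setpoint, which is the behavior van Soest describes, whatever the real controller looks like internally.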
Data Dealings
“Currently, the greatest challenge in LC/MS is dealing with the massive amount of data being generated,” says Scott Kuzdzal, PhD, life science business manager at Shimadzu. “The issue of informatics is bigger than gigantic.” He adds, “We find that researchers are eager to jump into proteomics or metabolomics data mining, but they need a team of informatics people to crunch the data and answer biologically relevant questions.”
The data challenge includes three phases: storage, analysis and querying. “Biologists and MDs must be able to ask questions about the data,” says Kuzdzal. “That’s difficult when multiple vendor instrument data and processing programs are spread across disparate PCs throughout the labs.”
To help with these challenges, Shimadzu partnered with Integrated Analysis to launch the i3D Enterprise Informatics Service. “This cloud-based service integrates data storage, workflow processing and querying/reporting, allowing researchers to take control of their data,” Kuzdzal says. “All the end user needs is a browser—like Explorer or Chrome—and then you have access to i3D.”
This service sends data from various instruments, including LC, gas chromatography (GC), and LC/MS, to a secure, private cloud, where the data can be analyzed and shared. “You don’t need to rely on one PC crunching your data,” Kuzdzal says. “You can have 40 or 100 CPUs tearing at that data. Results can be shared instantly with collaborators around the globe.”
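The point about 40 or 100 CPUs is ordinary data parallelism: split the spectra into independent pieces and process them concurrently. The sketch below uses Python’s multiprocessing module as a stand-in for a cloud cluster; the article does not describe the actual i3D architecture, so the task and data layout here are assumptions.

```python
from multiprocessing import Pool

def process_spectrum(spectrum):
    """Placeholder per-spectrum task, e.g. peak picking or a library search."""
    mz_values, intensities = spectrum
    total = sum(intensities)                       # total ion current
    base_peak = max(zip(intensities, mz_values))[1]  # m/z of tallest peak
    return {"tic": total, "base_peak_mz": base_peak}

if __name__ == "__main__":
    # Hypothetical batch of spectra: ([m/z...], [intensity...]) pairs
    spectra = [([100.1, 200.2, 300.3], [10.0, 50.0, 25.0])] * 1000
    with Pool(processes=8) as pool:   # scale to however many cores you have
        results = pool.map(process_spectrum, spectra)
    print(len(results), results[0])
```

Because each spectrum is processed independently, the same pattern scales from one multicore PC to a cloud of machines; only the scheduling layer changes.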
As this article shows, the best results depend on sophisticated approaches to sample separation, data collection and analysis.