Insights

Two are better than one: new technologies for proteomics

Traditional mass spectrometry-based technologies have recently been joined by a wave of affinity-based and digestion-based “next-generation proteomics” technologies. Together, they offer unprecedented insight, but there is scope to make them more accessible in clinical and academic research settings by streamlining workflows, particularly for sample preparation.

Proteomics now provides more comprehensive insights into the complex world of proteins within our cells than ever before – from their fundamental roles in every aspect of human health to their critical functions as architects of disease.  

In 2024, the global proteomics market was valued at over USD 33 billion and is forecast to grow to USD 60 billion by 2029, driven largely by the drug discovery and clinical diagnostics sectors, rising demand for personalised medicines, and advances in proteomics technologies [1].

Traditional methods to detect and infer the abundance of proteins in the proteome extend to a variety of complex mass spectrometry-based techniques requiring detailed expertise in sample preparation, instrument operation and data analysis.  

Scientists can now use mass spectrometry to perform exquisite proteome analyses even at the single-cell level, with the power to robustly detect thousands of different protein variants and post-translational modifications in a single sample.

But in the past few years a wave of affinity-based and digestion-based “next-generation proteomics” technologies has hit the market, providing scientists with a platter of novel options to assess different aspects of the proteome with varying sample throughputs and sensitivities.  

While some of these next-generation proteomics technologies currently lack the throughput of modern mass spectrometry-based approaches, they were designed with ease of use and affordability at their core, following the democratisation of next-generation sequencing for genomics and transcriptomics.

In this article, we explore the state of the art in mass spectrometry and next-generation proteomics, and the opportunities for technological advances.

The power of mass spectrometry

Since the inception of mass spectrometry over 100 years ago, extensive collaboration between biochemists, physicists and engineers has led to profound increases in the sensitivity, throughput and accuracy of modern instruments.

The development of ionization methods for macromolecules provided scientists with thousands of mass-to-charge ratios – so-called protein mass “fingerprints” – that could be matched to proteins and peptides in databases to help identify novel targets with striking accuracy. Ambitious consortia, such as The Human Proteoform Atlas, now aim to identify the vast universe of different variants of proteins born from the same gene, with over 60,000 proteoforms detected to date.

Until recently, researchers struggled with the relatively low sample throughput of conventional mass spectrometers for proteomics research, and the number of peptides or proteins detected rarely reached proteome scale. Detecting 12,000 proteins required extensive fractionation, enzymatic digestion and more than a day of instrument time per sample, making mass spectrometry prohibitively time-consuming and expensive for large-scale proteomic cohorts, drug discovery studies or those assessing post-translational modifications.

Today, however, mass spectrometry throughputs are beginning to match those we have come to take for granted in next-generation sequencing (NGS) for genomics. The need for improved throughput and detection capabilities has culminated in instruments such as the Orbitrap™ Astral™ mass spectrometer from Thermo Fisher Scientific.

This instrument allows users to boost sample throughput to up to 180 samples per day, with unbiased detection of 8,000 proteins per sample in only eight minutes. Detecting 12,000 proteins per sample now takes one hour instead of a day.
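These headline figures are internally consistent, as a quick back-of-the-envelope calculation shows (a sanity check on the numbers quoted above, not a vendor specification):

```python
# An 8-minute run per sample implies, at continuous operation:
# 24 h x 60 min / 8 min = 180 samples per day.
minutes_per_day = 24 * 60
run_minutes_per_sample = 8
samples_per_day = minutes_per_day // run_minutes_per_sample
print(samples_per_day)  # 180

# Moving a 12,000-protein analysis from a full day to a one-hour
# run is roughly a 24x reduction in instrument time per sample.
speedup = minutes_per_day / 60
print(speedup)  # 24.0
```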

The increased sensitivity also allows users to detect over 5,000 proteins from a single-cell input equivalent, making mass spectrometry-based single-cell proteomics a reality. Similarly, the RapidFire 400 from Agilent now increases sample throughput with the integration of 1536-well plates, eight-second sample extractions and two-second measurements and is increasingly used in specific applications like probing lipids or metabolites in the clinic.

Notably, protein post-translational modifications (PTMs) are crucial for protein participation in molecular interaction networks and have attracted much interest for their potential as disease biomarkers in the clinic. To date, mass spectrometry is the only established method for high-sensitivity PTM profiling. PTM profiling requires challenging, low-throughput sample preparation workflows involving protein purification and enrichment of the target PTM. Nevertheless, owing to the high sensitivity of modern instruments, mass spectrometry remains indispensable in PTM research.

Mass spectrometry instruments need extensive infrastructure and expertise to maintain and operate, typically requiring core facilities or outsourcing to other laboratories. Due to the wide variety of options available at each stage, there is also no universal solution for the entire process, from sample preparation to mass spectrometry data analysis. While ease of use has improved, these shortcomings could still be barriers for researchers, clinicians or diagnostic laboratories without access to a core facility, potentially hindering novel biological discoveries, clinical diagnoses or drug development.

Next-generation proteomics: fulfilling an unmet need

Enter next-generation proteomics.

While mass spectrometer-based proteomics requires considerable expertise and substantial financial investment from labs, novel approaches to proteomics continue to develop at pace, with an eye firmly on ease of use and affordability from sample preparation to data analysis.  

Genomics and transcriptomics laboratories are increasingly embracing a holistic “multi-omic” approach to intricate biological questions, where familiar, next-generation-sequencing-like proteomic technologies complete the narrative from DNA to RNA all the way to the functional proteins most likely to influence phenotypes.

For instance, the benchtop Platinum® Next-Generation Protein Sequencer™ from Quantum-Si sequences individual peptides by determining the binding kinetics, fluorescence lifetime and intensity of recognizers that bind to N-terminal amino acids. Subsequent cleavage of the N-terminal amino acid by aminopeptidases exposes the next residue for recognition, and the cycle repeats until whole peptides are assayed at the single-molecule level, followed by automated data analysis.
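The recognise-then-cleave cycle can be sketched in a few lines of code. This is a deliberately simplified toy model: the recognizer set, peptide and single-letter read-out below are illustrative only, not Quantum-Si's actual chemistry, which reports kinetic and fluorescence signatures rather than direct residue calls.

```python
# Toy model of iterative N-terminal recognition and cleavage.
# Real recognizers cover a subset of amino acids and report binding
# kinetics; here each cycle simply "calls" the exposed N-terminal
# residue if a recognizer for it exists, then cleaves it.

RECOGNIZED = {"L", "I", "F", "Y", "R", "W"}  # illustrative recognizer coverage

def sequence_peptide(peptide: str) -> str:
    """Return the partial read produced by cycles of recognition and cleavage."""
    read = []
    for residue in peptide:       # aminopeptidase exposes one residue per cycle
        if residue in RECOGNIZED:
            read.append(residue)  # recognizer binds: residue is called
        else:
            read.append("x")      # no recognizer: position left uncalled
    return "".join(read)

print(sequence_peptide("LFAYR"))  # -> "LFxYR"
```

The key point the sketch captures is that coverage is bounded by the recognizer repertoire: residues without a recognizer pass through the cycle uncalled.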

The platform has also been developed to distinguish between post-translational modifications of the same amino acid, such as dimethylation and citrullination, which are challenging to resolve with mass spectrometry. Despite these advances, throughput is currently limited, with up to 10 peptides targeted per sample. Oxford Nanopore Technologies is also developing a workflow for protein sequencing using its nanopore technology.

Attracting commercial interest

Other affinity-based discovery proteomics platforms have attracted commercial interest from key players in the mass spectrometry industry: the Explore HT platform from Olink®, recently acquired by Thermo Fisher Scientific, and SomaScan® from SomaLogic, now merged with Standard BioTools. Both can be run directly on plasma, making them an attractive option for those concerned about overall sample preparation quality.

Antibody company Abcam was also recently acquired by Danaher, and Bruker has acquired the gene expression profiling and spatial multi-omics capabilities of NanoString, suggesting extensive interest in omics technologies and sample/assay preparation tools like antibodies used in affinity-based panels.  

Affinity-based proteomics platforms are increasingly popular choices for benchtop proteome analyses in laboratories more familiar with genomic technologies. They offer excellent dynamic range for detecting low-abundance proteins that are difficult to capture with standard mass spectrometry approaches, but both rely on preselected target panels.

The Olink® platform uses proximity extension assay technology, in which pairs of antibodies carrying DNA tags identify and bind to their target proteins; the tags of co-bound antibody pairs are then extended to form unique sample- and protein-specific DNA barcodes, which are read out by next-generation sequencing. Assays now include antibodies for over 5,000 proteins, but the need for validated antibody targets means the technology falls short of the unbiased, proteome-wide analyses possible with mass spectrometry. In return, it offers ease of use and scalability previously difficult to achieve with mass spectrometry for biomarker panels, targeted proteomic studies or high-throughput sample analyses.
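The downstream read-out of such an assay reduces to counting barcode pairs. The sketch below shows the idea under invented names: the read format, sample and protein identifiers are hypothetical and do not reflect Olink's actual data format.

```python
from collections import Counter

# Each sequencing read pairs a sample barcode with a protein-specific
# DNA barcode, produced only when both proximity probes bound the same
# protein molecule. Tallying barcode pairs gives relative protein
# abundance per sample. Identifiers below are illustrative.
reads = [
    ("sample_01", "IL6"),
    ("sample_01", "IL6"),
    ("sample_01", "TNF"),
    ("sample_02", "IL6"),
]

counts = Counter(reads)  # (sample, protein) -> matched-probe events

for (sample, protein), n in sorted(counts.items()):
    print(sample, protein, n)
# sample_01 IL6 2
# sample_01 TNF 1
# sample_02 IL6 1
```

This NGS-style tabular output is part of what makes these platforms feel familiar to genomics laboratories: the quantification step is barcode counting rather than spectral interpretation.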

The SomaScan® 11K assay from SomaLogic boosts the number of human proteins measured to around 11,000 via its DNA-based aptamers, which have specific binding shape complementarity to their target proteins. A series of washes then releases the bound nucleic acid for next-generation sequencing and quantification. Over 1,000 clinical samples can be assayed in one day via automation by core laboratories.  

A common theme of these next-generation proteomics technologies is that they are designed for ease of data analysis and for integration with genomic and transcriptomic data generated by next-generation sequencing. For multi-omic studies, such data compatibility is an attractive proposition that is currently difficult for mass spectrometry to achieve.

Complementary technologies will advance proteomics

So, does this mean that next-generation proteomic approaches could supersede mass spectrometry as the proteomic method of choice? At present, no instrument can analyse the entire proteome, but not all of a researcher's questions will require such comprehensive readouts. Ultimately, the most appropriate technology and instrument will depend on the job at hand.

Combining the depth of mass spectrometry with the ease and affordability of next-generation proteomics technologies undoubtedly gives researchers unprecedented power to test diverse hypotheses in the most effective, robust and reproducible way.

The entry of next-generation proteomics technologies into the market represents another string to the biologist’s bow in the ultimate quest to understand the myriad complexities of the proteome.

However, there is much scope to make these powerful technologies more accessible in clinical and academic research settings, for example through streamlining and increasing standardisation of sample preparation, or end-to-end automation, as mentioned above.  

Lack of standardisation in sample preparation is a problem for both mass spectrometry and, to a lesser extent, next-generation proteomics due to countless applications and workflows depending on the instrument used, the cell type or tissue sample available and the experimental question.

A universal sample preparation pipeline similar to RNA-extraction-free cell lysis-based approaches used in large-scale pharmacotranscriptomics or ultra-high-throughput single-cell RNA-seq studies could accelerate proteomic technologies, ultimately providing bench-to-bedside analyses of the proteome and improving ease of use in mass spectrometry.  

However, the size of proteins combined with their complex folding and charges make this prospect much more challenging than RNA sample preparation, and it’s not obvious that this could become a reality in the near future.

Ultimately, technological advances will come from extensive cross-fertilisation of ideas between experts in the next-generation sequencing and proteomics fields, accelerated by multidisciplinary organisations like TTP, where we already have proteomics and next-generation sequencing expertise. Novel solutions to upstream sample preparation or instrument design could be built collaboratively from the ground up, or anywhere in between.

Overall, the environment of proteomic technologies has never been richer. Combining the constellation of options now available to probe the proteome will undoubtedly allow researchers to answer unprecedented questions about how global protein landscapes are affected in disease or after treatment with therapeutic compounds, inevitably improving outcomes for patients and human health.

References

[1] MarketsandMarkets (2025) Proteomics Market by Product (Spectroscopy, Chromatography, Electrophoresis, X‑Ray Crystallography), Reagent, Service (Core Proteomics, Bioinformatics), Application (Diagnostic, Drug Discovery), End User (Hospital, Labs, Biopharma) & Region – Global Forecast to 2029. Report code BT 2341, February 2025. Available at: https://www.marketsandmarkets.com/Market-Reports/proteomics-market-731.html

Last Updated
June 18, 2025
Greta Sneideriene