Quantum entanglement is the scientific basis for “quantum 2.0” technologies being developed all over the world. But how do we reduce risk and uncertainty for adopters? Technology standards can help to abstract away the complex science, but it’s crucial that they don’t stifle innovation.

The quantum revolution is here. The United States has committed an annual budget of $153 million to the development of quantum technologies for 2023-2027, in line with similar initiatives in the UK, Europe and China. Perhaps the most interesting aspect of this bill is the investment in quantum standards, which represents around 10% of the programme.

Standards are fundamental in science and technology. They give a framework for development which reduces uncertainty and risk for the end user, as well as protecting the technology developer. This in turn facilitates the development and market penetration of new technologies.

Quantum technology has the potential for wide market disruption, with scope well beyond quantum computing. Although some applications may not advertise ‘quantum’ as a feature, these technologies will permeate a wide range of sectors and applications, including medical imaging, end-of-line testing, inertial navigation and underground detection.

This is the result of many years of research – and many PhDs – in quantum science and ambitious technology transfer through initiatives such as the UK’s National Quantum Technology Programme. Standards will help us get the most out of these breakthrough technologies by addressing several challenges: driving adoption across the board (push and pull); integrating quantum into existing markets; and directing development in the most synergistic ways.

However, this is not an easy task, as the science is fundamentally complex – a fact that can be overplayed when ‘quantum’ carries connotations of ‘weird’ and ‘spooky’ effects, hardly common selling terms in industrial technology. Hopeful adopters and interested observers may have no background in science – let alone quantum physics – yet are constantly faced with a myriad of articles, ranging from reassuringly optimistic to excessively cynical about the future of quantum tech.

While these voices are all crucially important as the collective ‘hive mind’ explores the potential of quantum technologies, industry will soon question authenticity, as happened with graphene, artificial intelligence and CRISPR, among others. In other words, manufacturers will ask: ‘How do we know that this device is the real deal? Should we really invest in it?’ Enter both standards and industry conventions.

Interface standards

Two main classes of standards are relevant – interface standards and performance standards. The most obvious parallel to existing standards is in quantum communications, which we can compare to the 3GPP standards in telecoms. In both cases, networks are complex and involve many components from different suppliers, so some form of collaboration or standardisation is a prerequisite for their success.

The makeup of the 3GPP standards committees follows the system architecture of telecom networks, and one might expect the quantum communications standards being developed at ETSI to show a similar pattern – for example, distinguishing between free-space and fibre links, and between discrete-variable and continuous-variable QKD protocols.
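To make the idea of an interface standard concrete, the sketch below shows one way a vendor-neutral key-delivery contract between a QKD module and a consuming application could be expressed in code. Everything here – the class names, fields and method signatures – is an illustrative assumption, not the ETSI specification.

```python
# Hypothetical sketch only: a vendor-neutral key-delivery contract that a QKD
# module might expose to applications. All names and fields are illustrative
# assumptions, not taken from any published quantum communications standard.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class QuantumKey:
    key_id: str       # identifier shared with the peer, so both ends retrieve the same key
    key_bytes: bytes  # the secret material handed to the application


class KeyDeliveryInterface(ABC):
    """Contract any compliant QKD module would implement, regardless of supplier."""

    @abstractmethod
    def request_key(self, peer_id: str, length_bits: int = 256) -> QuantumKey:
        """Ask the module for fresh key material shared with the named peer."""

    @abstractmethod
    def retrieve_key(self, peer_id: str, key_id: str) -> QuantumKey:
        """Fetch the matching key on the receiving side, using its identifier."""
```

With a contract like this agreed across suppliers, an application written against one vendor’s module could, in principle, switch to another without rework – exactly the kind of interoperability that interface standards aim to deliver.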

Historically, leading companies in telecoms such as Qualcomm and Ericsson have been a few years ahead of the pack, allowing them to take the lead in forming new telecoms standards and integrating their own technologies. In the area of quantum communications we observe a similar phenomenon: if a few companies move out ahead, they will be able to shape the playing field and embed their own technologies in the standards. At this early stage, there is also an opportunity for nations or larger geographical markets to take a lead in forming new standards and secure their position in the emerging quantum market.

Interface standards also find applications in complex devices, as quantum components become increasingly interwoven with state-of-the-art classical components to deliver new capabilities. In this sense, quantum subcomponents will need robust and modular packages, ready for a range of different purposes. Timing is key, though – putting interface standards in place before these use cases have been proved out could be a case of putting the cart before the horse.

Is it too early for performance standards?

Performance standards, on the other hand, serve a different purpose. Firstly, they ensure safety and reliability in critical functions; secondly, they provide a useful reference point for end users, allowing them to develop trust in new technologies and compare competing products. In this second case, conventions agreed across an industry are also very powerful, sometimes even more so than prescriptive formal standards. Here, standards bodies need to walk the line between maximising the upside of increased investor confidence and mitigating the risk of stifling innovation by narrowing the playing field.

In light of this, what is the best policy for creating new performance and quality standards for quantum technologies? Quantum technologies, and hybrid technologies containing quantum elements, should of course comply with existing relevant standards such as ISO 13485 for medical applications and ISO 15288 for safety-critical functions. Although not technology-specific, these standards play a vital role in promoting safety and security.

However, generating new quantum-specific performance standards, at least at this stage, could hinder innovation. Take qubit metrics, for instance: quantum volume is widely considered a useful metric for quantum computers, and rightly so, but if it were formalised as the only metric, the market might become distorted. High-fidelity, low-qubit-count systems would be favoured over lower-fidelity, high-qubit-count systems, even though the latter may be more capable of outperforming a classical computer; and questions of long-term scalability might be forgotten entirely.
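To see how a single formalised metric could skew incentives, here is a deliberately crude toy model – an illustrative assumption, not the formal quantum volume benchmark – that estimates a quantum-volume-style score from just two numbers: qubit count and two-qubit gate fidelity.

```python
# Toy model only: estimate a quantum-volume-style score from qubit count and
# two-qubit gate fidelity. Assumes a "square" circuit of width n and depth n
# needs roughly n*n/2 two-qubit gates and must succeed with probability > 2/3.
# This is an illustrative simplification, not the formal benchmark protocol.

def toy_quantum_volume(num_qubits: int, gate_fidelity: float) -> int:
    best_n = 0
    for n in range(1, num_qubits + 1):
        gate_count = n * n / 2                    # rough gate count for an n-by-n circuit
        success_prob = gate_fidelity ** gate_count
        if success_prob > 2 / 3:
            best_n = n                            # largest square circuit that still "passes"
    return 2 ** best_n

print(toy_quantum_volume(20, 0.999))   # small, high-fidelity device: 1048576 (2**20)
print(toy_quantum_volume(100, 0.99))   # larger, noisier device: 256 (2**8)
```

In this toy model the 20-qubit, higher-fidelity machine scores roughly four thousand times higher than the 100-qubit machine, even though the larger device may be the more interesting route towards outperforming classical computers – exactly the distortion a prematurely formalised metric could lock in.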

Communication and collaboration – the pathway to fluid yet secure standards in quantum technologies

Through strong and collaborative communication, the quantum tech industry should develop best practices and common languages which can remain sufficiently fluid to adapt to new ideas. This will allow faster and more efficient progress than formal performance standards, which risk stifling what is currently a flourishing field of innovation. Growing this culture will be the mantle (and advantage) of the more ambitious companies, which are willing to break new ground before others have proven the way.

If you’re interested in evaluating new quantum technologies and the impact they might have on your industry, please reach out to TTP.

Ben Walker