A new era of quantum technology

January 13, 2023





It has been 25 years since the first quantum bits, or qubits, were connected to create a simple quantum computer. Qubits are the fundamental building blocks of machines that could significantly outperform today's computers on certain kinds of problem, thanks to their capacity to represent the ones and zeros of conventional computing simultaneously. Since then, progress has depended more on applied engineering than on hard science: making qubits stable enough to hold their quantum state for longer than a tiny fraction of a second, linking them into larger systems, and developing new programming languages that exploit the technology's distinctive features.
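
To make the idea of superposition concrete, here is a minimal sketch that simulates a single qubit classically with NumPy; the state vector and Hadamard gate are standard textbook objects, not anything specified in this article.

```python
import numpy as np

# A qubit's state is a two-component complex vector of amplitudes for |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)   # the conventional bit 0

# The Hadamard gate rotates |0> into an equal superposition of 0 and 1.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0
print(state)               # [0.707+0.j 0.707+0.j]: "both" values at once

# Measurement collapses the state; outcome probabilities are squared amplitudes.
print(np.abs(state) ** 2)  # [0.5 0.5]: an even chance of reading 0 or 1
```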

This is comparable to the early years of classical computing, which followed the invention of the transistor in the 1940s and the integrated circuit in 1958. In retrospect, the steady, exponential increase in computing power described by Moore's Law, which carried computers into everyday life, looks inevitable.

The quantum era is unlikely to unfold with the same sense of metronomic inevitability. On the upside as well as the downside, it could produce significant shocks. The worldwide race to find new ways to control and exploit quantum phenomena, and to devise significantly more efficient algorithms, raises the prospect of unexpected leaps in performance.

One such surprise came with the disclosure of Chinese research suggesting a method for breaking the most widely used form of internet encryption with a quantum computer comparable to those that already exist. That feat, a potential "Sputnik moment", had been expected to require far more sophisticated quantum systems that still lie several years in the future.
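
The encryption at stake is RSA, whose security rests on the difficulty of factoring a large public modulus into its two secret primes; a sufficiently powerful quantum computer running Shor's algorithm would make that factoring efficient. The toy sketch below, using deliberately tiny illustrative numbers rather than anything from the research itself, shows why recovering the factors is all an attacker needs.

```python
# Toy RSA with a tiny modulus. Real keys use ~2048-bit N, where trial
# division is hopeless; the quantum claim concerns factoring at that scale.
p, q = 61, 53                 # secret primes (illustrative values)
N = p * q                     # public modulus: 3233
e = 17                        # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, needs p and q

msg = 65
cipher = pow(msg, e, N)       # anyone can encrypt with the public (N, e)

# An attacker who can factor N recovers p and q, and hence the private key.
f = next(i for i in range(2, N) if N % i == 0)
p2, q2 = f, N // f
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
print(pow(cipher, d2, N))     # 65: the attacker reads the message
```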

A number of cyber security experts later concluded that the approach was highly unlikely to work. If China had really found a way to expose most of the world's secret communications, why would it have allowed the method to be published? The paper nonetheless caused a jolt, and it should serve as a wake-up call for anyone, particularly in the US, worried about the danger of China becoming the dominant technological superpower.

Many businesses in sectors such as chemicals, finance and carmaking have invested in learning how to develop for quantum systems in anticipation of applications that may not be far off. As soon as quantum machines are even slightly cheaper or faster than today's computers, they could gain an edge in modelling complex financial risks, designing new molecules and speeding up data processing in machine learning systems.

This "quantum advantage"—when the systems show a real, if little, edge over some issues—remains tantalisingly just out of grasp. Even if long-term potential appears to be stable, there is a strong possibility for short-term disappointment as investment and expectations rise.



It remains difficult to keep qubits in their quantum state long enough to carry out useful computations. The next step is to develop error correction techniques that dedicate some of the qubits to suppressing the "noise" caused by this lack of coherence. Recent research suggests this challenge is being overcome faster than expected.
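
The article does not name a particular scheme, but the core idea of spending extra qubits to suppress errors can be illustrated with its simplest classical analogue: a three-way repetition code with majority voting. Real quantum codes, such as the surface code, must also protect superpositions, which this sketch does not attempt.

```python
import random

def encode(bit):
    return [bit] * 3              # spend three physical bits per logical bit

def add_noise(bits, p=0.1):
    return [b ^ (random.random() < p) for b in bits]  # flip each bit w.p. p

def decode(bits):
    return int(sum(bits) >= 2)    # majority vote corrects any single flip

trials = 100_000
raw_errors = sum(add_noise([0])[0] for _ in range(trials))
coded_errors = sum(decode(add_noise(encode(0))) for _ in range(trials))
print(raw_errors / trials)        # ~0.10 error rate without coding
print(coded_errors / trials)      # ~0.028 (3p^2 - 2p^3): noticeably lower
```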

The possibility of advances in areas such as error correction has increased the likelihood of a quantum shock, the moment when the machines turn from interesting science experiment into game-changing technology. It would be hasty to conclude from the apparently flawed Chinese encryption paper that this moment has already arrived. But given the massive global effort to harness quantum mechanics for computing, it would be riskier still to put off a careful assessment of the benefits and the dangers.

