
Is 2025 the year of quantum computing?

Wednesday, February 5, 2025, 04:35 PM, from InfoWorld
The year was 1900 and Max Planck was a young physicist working on the problem of blackbody radiation. This was an intense area of research because the experimental data—the radiation emitted by different types of matter as they were heated—disagreed with what had been predicted by classical physics. 

Physics at that time was riding a wave of enormous success, and its future in explaining much of the natural world seemed certain. There were just a few loose ends to tie up.

One such loose end was the research showing that, when heated, solid bodies did not emit gradually increasing amounts of radiation across all frequency bands. Worse, classical theory predicted that they should, which implied an infinite amount of radiated energy at high frequencies: the ultraviolet catastrophe. Desperate for a solution, Planck used experimental data to construct a mathematical model that worked, even though it violated centuries of scientific thinking.

The result was the idea that energy comes in discrete levels or packets, what Planck called quanta. This produced his famous constant and gave birth to an entirely new kind of physics: quantum mechanics.

It would be several decades yet before computing began to take shape, so there was no way for Planck to foresee its strange and almost mystical intersection with quantum science. Nonetheless, quantum computing, as we know it today, is indebted to his work.

The limits of traditional computing

Classical computing really took off with the von Neumann architecture in the 1940s. Since then, it has pressed onward with ever smaller transistors and ever more clever designs for wringing performance and efficiency out of matter. Although this has produced astonishing advances and hitherto unimaginable capabilities, certain classes of problems remain resistant, even in theory.

For example, take the ancient and easily understood problem of prime factorization for large numbers. Factoring a sufficiently large integer, say one with 2,048 bits, would take even a top-of-the-line supercomputer several billion years to crack.
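To get a feel for why, consider the simplest factoring method, trial division (a toy Python sketch; the function name and numbers are illustrative, not a real attack). The loop bound grows with the square root of N, so the work grows exponentially with the bit length, and a 2,048-bit number is far beyond reach:

    def trial_division(n: int) -> list[int]:
        """Factor n by testing every candidate divisor up to sqrt(n)."""
        factors = []
        d = 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)  # whatever remains is prime
        return factors

    print(trial_division(65537))   # [65537]: a prime, found quickly
    # A 2,048-bit n would mean up to ~2**1024 candidate divisors,
    # which no classical machine can enumerate.

Smarter classical algorithms exist, but every known one still scales superpolynomially with the size of the number.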

Prime factorization isn’t just a mathematical puzzle of interest to theorists. Much of the security of the Internet rests on problems of this kind.

Modeling uncertainty

Traditional computers are Boolean constructs at heart: fundamentally, they acknowledge only yes/no states. Much of the power of modern AI systems comes from layering stochastic, probabilistic engines on top of this base. The reasoning is that much of the world is fuzzy, and therefore only amenable to statistical approximation.

But for hard mathematical problems, nothing short of changing the basic hardware model will suffice. What's needed is a computer that does not just say yes or no, but gives a likelihood. Such a computer would be able to model otherwise unknowable areas of the universe. Quantum computing does this by replacing transistor bits with qubits, quantum bits built on the states of subatomic particles.
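As a rough sketch of the difference (a toy model, not a real quantum device), a single qubit can be represented by two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1:

    import numpy as np

    # A qubit state |psi> = alpha|0> + beta|1> is a unit vector of
    # two complex amplitudes, with |alpha|**2 + |beta|**2 == 1.
    alpha = beta = 1 / np.sqrt(2)              # an equal superposition
    state = np.array([alpha, beta], dtype=complex)

    # Reading the qubit doesn't yield a fixed yes or no: it samples
    # 0 or 1 with probabilities given by the squared amplitudes.
    probs = np.abs(state) ** 2                 # [0.5, 0.5]
    outcome = np.random.choice([0, 1], p=probs)
    print(probs, outcome)

Simulating n qubits this way requires tracking 2**n amplitudes, which is one way to see why classical hardware can't keep up.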

To understand how this works, we need to follow Planck's idea of quanta along its evolutionary trajectory, through Einstein's ideas about light quanta and on to Heisenberg's uncertainty principle, where we begin to wrangle with interpreting the experimental data of subatomic physics.

When faced with the uncertainty principle, even hard-nosed scientists are forced to confront a philosophical question: Does the uncertainty found in experiments stem from the physical world, from our relationship to it, or from our senses and minds? In other words, is such uncertainty ontological or epistemological? (Heisenberg himself preferred the term indeterminate.)

The Copenhagen interpretation of quantum mechanics essentially says: let’s not try to decide, let’s just continue to do good science.

The promise of quantum computing

One of the best ways to appreciate the scientific predicament of quantum mechanics is in the split-beam or double-slit experiment. Remaining in the realm of thought experiments, but trusting that experimental physicists are expressing valid truths, the basic idea is that a photon (a subatomic particle) will exhibit different characteristics, depending on how you look at it. If you look at a photon before it hits the detectors, it’ll act like a particle; otherwise, it’ll act like a wave.

What is good for photons goes for all subatomic particles and all their properties. The uncertainty principle is a framework, a circumscribed line around our understanding that tells us what it is possible to know. One cannot know both the position and the momentum of a subatomic particle: pinning down one value makes the other impossible to measure accurately. Push and pull as we might, and scientists have pushed and pulled mightily, the world of the subatomic remains obscure. Wherever it enters the realm of observable phenomena, a subatomic particle's behavior becomes probabilistic.
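In standard notation, with ħ being Planck's constant divided by 2π, the principle puts a hard floor under the product of the two uncertainties:

    \Delta x \, \Delta p \ge \frac{\hbar}{2}

No measurement strategy can push both uncertainties below that bound at once.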

But perhaps these strange properties could be useful. What if we could use the nature of such particles to make vast statistical calculations, relying on the physical properties of matter itself?

That is the promise of quantum computing. The difficulty is in nailing down these particles in a way that makes sense. Merely holding them in a usable state is a huge engineering challenge. Some particles, like neutrinos, are very stable but hard to bring into a usefully interactive state. Others, like protons and neutrons, are interactive but highly volatile and susceptible to decoherence.

Decoherence is the transition of phenomena from the subatomic to the macro world. Some interpretations of quantum mechanics draw a firm line along this boundary, essentially dividing the universe into two realms. What quantum computing must do is somehow put the question to the particles while they remain coherent in the quantum world, and then guide them to decohere into our macro world in a way that yields sensible output.

This is the challenge of creating interfaces between the developing quantum computing infrastructure on one side and macro-scale detectors and traditional computing resources on the other.

Quantum computing in the real world

As quantum computing research gradually inches toward real-world usability, you might wonder where we’ll see the impacts of this technology, both short- and long-term.

One of the most immediately important areas is cryptography. Since a quantum computer can take on many states simultaneously, something like factoring large numbers can proceed in parallel, relying on the superposition of particle states to explore many possible outcomes at once.
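The algorithm usually meant here is Shor's, which reduces factoring N to finding the period of a^x mod N; period finding is the step a quantum computer performs over all candidate exponents in superposition. A classical toy sketch of the reduction, with a brute-force loop standing in for the quantum step (illustrative numbers, not production code):

    from math import gcd

    def find_period(a: int, n: int) -> int:
        """Brute-force the period r of a**x mod n. This is the step
        Shor's algorithm accelerates with superposition."""
        x, r = a % n, 1
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def factor_via_period(n: int, a: int = 2) -> tuple[int, int]:
        """Classical skeleton of Shor's reduction (assumes the happy
        path: gcd(a, n) == 1, r even, a**(r//2) != -1 mod n)."""
        r = find_period(a, n)
        half = pow(a, r // 2, n)
        return gcd(half - 1, n), gcd(half + 1, n)

    print(factor_via_period(15))   # (3, 5): the period of 2**x mod 15 is 4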

There is also a tantalizing potential for cross-over between machine learning and quantum computing. Here, the probabilistic nature of neural networks and AI in general seems to lend itself to being modeled at a more fundamental level, and with far greater efficiency, using the hardware capabilities of quantum computers. How powerful would an AI system be if it rested on quantum hardware?

Another area is the development of alternative energy sources, including fusion. Using matter itself to model reality opens up possibilities we can’t yet fully predict. Drug discovery and material design are also areas of interest for quantum calculations. At the hardware level, quantum systems allow us to use matter itself to model the complexity of designing useful matter.

These and other exciting developments, especially in error correction, seem to indicate quantum computing’s time is finally coming. But there’s a huge amount of work still to be done to move towards usable systems—and then there will be even more work to make it all practical for everyday users. What we will likely see is the emergence of hybrid computing models, where cloud systems rent quantum computing resources that contribute their unique powers to the overall system.
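A plausible shape for that hybrid model, sketched with Qiskit (this assumes the qiskit and qiskit-aer packages; a real deployment would submit the job to a rented cloud backend rather than a local simulator):

    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    # Classical code builds a small circuit: a Bell pair, the
    # "hello world" of entanglement.
    qc = QuantumCircuit(2)
    qc.h(0)        # put qubit 0 into superposition
    qc.cx(0, 1)    # entangle qubit 1 with qubit 0
    qc.measure_all()

    # In the hybrid model, this is where the job would be shipped off
    # to a quantum cloud service; a local simulator stands in here.
    counts = AerSimulator().run(qc, shots=1000).result().get_counts()
    print(counts)  # roughly {'00': ~500, '11': ~500}: entangled outcomes

The classical host does everything else: preparing inputs, queuing jobs, and post-processing the measurement counts.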

The quantum disruption is coming

To understand where we are in this tale, consider Google's recently announced breakthrough with the Willow chip. The critical takeaway is that engineers have successfully built a quantum system that reduces errors as the number of qubits is scaled up. Error checking is essential in all computer systems, but it's especially so in quantum systems, where decoherence is always nipping at particles, threatening to destroy their entanglement through interaction with the macro environment.
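The intuition behind that result can be sketched with the simplest error-correcting code there is, the classical repetition code (a toy stand-in for the surface-code behavior Willow demonstrates, not Google's method): if the per-copy error rate is below one half, majority voting over more copies fails less often, not more.

    from math import comb

    def majority_error(p: float, n: int) -> float:
        """Probability that a majority vote over n noisy copies,
        each independently wrong with probability p, is wrong."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i)
                   for i in range(n // 2 + 1, n + 1))

    # With a 1% per-copy error rate, redundancy suppresses errors fast:
    for n in (1, 3, 5, 7):
        print(n, majority_error(0.01, n))
    # 1 -> 1e-2, 3 -> ~3e-4, 5 -> ~1e-5, 7 -> ~3.4e-7

Quantum error correction is far subtler, since qubits can't simply be copied, but the scaling Willow demonstrated has the same flavor: more qubits, fewer logical errors.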

It’s also important to note that Willow scaled up to an array of 7×7 physical qubits. That’s still a pretty modest size, even though it does open up impressive capabilities. There will come a moment when quantum computing makes the leap from experimental projects to being a useful computing resource. It probably won’t be 2025, but the quantum disruption is coming. It is such an exploratory area of research that it’s hard to predict its impact.

On the philosophical side, you might wonder what quantum mechanics says about the nature of reality. Not only do quantum experiments point to something entirely different from causation as we know it, but it's possible to build machines that exploit this characteristic. A quantum computer essentially takes an object while it's in an unknown condition (framed by probabilities) and poses a question to it. We then bring the object back to a known state, where it's possible to derive useful data from that “trip into the unknown.”

Quantum hasn’t fully arrived, but there’s every reason to believe it’s coming. Technologists should keep an eye on the quantum space. Watch for where and when the quantum industry makes itself felt in the areas you are engaged with. Be ready to leap when it comes.
https://www.infoworld.com/article/3812575/is-2025-the-year-of-quantum-computing.html
