A small mathematical revision to quantum mechanics could effectively limit the purported infinite capacities of quantum computers—if validated, that is.
The entire spiel of quantum computers is that the odd principles of quantum mechanics allow them to exponentially outperform their classical counterparts.
But what if the very foundation of this claim is wrong? Tim Palmer, a physicist at the University of Oxford in the United Kingdom, proposes a slight tweak to the underlying math of quantum theory. The framework, dubbed "Rational Quantum Mechanics," would effectively place an upper bound on quantum hardware capacity. If validated, that means quantum capacity won't grow without limit, which would dampen much of the excitement, or fear, we derive from these machines' potential. For instance, they wouldn't pose as much of a threat to today's encryption.

But all this is a big "if." For one, quantum mechanics is one of the most successful theories in the history of science. Sure, there is still much we don't understand about the quantum world, but it's an ambitious move to suggest the theory needs tweaking. Palmer agrees, yet he believes some of its mathematical machinery can be revised to better represent reality. What's more, his idea could be testable with existing quantum technologies within the next five years.

Specifically, Palmer focuses on a concept called Hilbert space, the standard vector space used to describe most quantum systems. Compared to classical physics, quantum mechanics is "more vitally dependent on the continuum of real numbers… nature abhors a continuum," Palmer explained in a statement.

In conventional quantum mechanics, the number of dimensions in a Hilbert space grows exponentially with the number of qubits. As one column put it, this "exponential scaling is critical for the fulfillment of the promise of quantum computing, enabling algorithms such as Shor's method for factoring large numbers far faster than classical machines." Palmer's suggestion is as follows: for practical purposes, physical space more closely resembles a collection of discrete, not continuous, elements. "Rational" quantum mechanics subscribes to this view of geometrical space, and as a result the information content of the quantum state grows only linearly with the number of qubits.
"Above a critical number of entangled qubits, there simply isn't enough information in the quantum state to allocate even one bit of information to each dimension of Hilbert space," Palmer explained. "When this happens, quantum algorithms that utilize all of Hilbert space will stop having a quantum advantage over classical algorithms." According to the paper, quantum computers would lose their advantage once a system exceeds approximately 1,000 qubits.

That matters because one big selling point of quantum computers is their ability to factor extremely large numbers in ways classical computers cannot, the capability behind claims that they could crack the RSA algorithm. If Palmer is right, there's a hard limit on how many useful qubits engineers can cram into even the most "powerful" quantum computer: past roughly 1,000 qubits, the system taps out long before reaching the scale such code-breaking would require.

While a fascinating proposition, rational quantum mechanics remains highly speculative, and only time and scrutiny will tell how much, if at all, this proposal changes things. In the paper, Palmer proposes an experimental test: entangle many qubits according to a specific algorithm and check for any signs of degrading performance. Then again, quantum mechanics remains one of the most empirically tested theories in science. Palmer is right that Hilbert space is more of an "idealization," as he says in the statement, but no experiment has yet indicated the kind of discrete physical space his proposal describes.

Personally, I don't want to discredit the new idea too much. It's unwise to assume that something is "impossible" when quantum physics is involved.
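The scaling argument above can be sketched with toy arithmetic. This is not Palmer's actual model: the constant `BITS_PER_QUBIT` is an assumption invented purely for illustration, and the crossover point the sketch finds depends entirely on that constant. (Palmer's own analysis puts the threshold near 1,000 qubits.) The point is only to show how any linear budget is eventually dwarfed by exponential growth.

```python
# Toy comparison: exponential Hilbert-space dimension vs. a linearly
# growing information budget. BITS_PER_QUBIT is a hypothetical constant
# chosen for illustration, not a value from the paper.

BITS_PER_QUBIT = 100


def hilbert_dim(n_qubits: int) -> int:
    """Dimension of the Hilbert space for n qubits grows as 2^n."""
    return 2 ** n_qubits


def linear_budget(n_qubits: int) -> int:
    """In the 'rational' picture, information grows only linearly."""
    return BITS_PER_QUBIT * n_qubits


def crossover() -> int:
    """Smallest qubit count at which the linear budget can no longer
    assign even one bit to each dimension of Hilbert space."""
    n = 1
    while linear_budget(n) >= hilbert_dim(n):
        n += 1
    return n


print(crossover())  # → 10 with this illustrative constant
```

Raising `BITS_PER_QUBIT` only nudges the crossover up by a few qubits, because doubling per qubit always overtakes any straight line; that is the shape of the argument behind the claimed ~1,000-qubit ceiling.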
But big claims require big evidence, and if something of that sort arises from this theory, I'd be first in line to learn more.