From CNET: In the potentially revolutionary new technology of quantum computing, the number of qubits a machine uses to process data isn't the only factor that matters. But it's a big deal, and Intel believes its strategy -- staying as close to conventional computers as possible -- will pay off in the long run by enabling large qubit counts.
By some measures, Intel lags rivals in developing quantum computers. It hopes to leapfrog them with processors that eventually will have enough capacity to fulfill the technology's promise in jobs like developing new battery or solar panel materials, making fertilizer cheaper to manufacture, optimizing financial investments, developing better waterproof clothing, and the somewhat scarier prospect of cracking today's encryption. Quantum computers also show promise for accelerating AI.
Quantum computing relies on the weird physics of the ultrasmall. Conventional computers store data in bits that hold either a zero or a one, but the fundamental element quantum computers use to store and manipulate data, the qubit, can hold a peculiar combination of zero and one through a phenomenon called superposition. And multiple qubits can be entangled, intertwining their computing fates in a way that stands to dramatically accelerate some computation tasks.
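The superposition and entanglement described above can be illustrated with a toy state-vector simulation. This is purely an explanatory sketch, not Intel's hardware or any real quantum SDK: a qubit is modeled as a pair of amplitudes, and measurement probabilities come from squaring them.

```python
import math

# Toy model: a qubit's state is a pair of amplitudes (a, b) for |0> and |1>;
# measuring it yields 0 with probability |a|^2 and 1 with probability |b|^2.

zero = (1.0, 0.0)                            # classical-like: always reads 0
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))  # equal superposition of 0 and 1

def probabilities(state):
    """Return (P(measure 0), P(measure 1)) for a single-qubit state."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

print(probabilities(zero))   # certain 0
print(probabilities(plus))   # 50/50 between 0 and 1

# Entanglement: the Bell state (|00> + |11>)/sqrt(2) cannot be split into
# two independent qubits -- measuring one fixes the other's outcome.
bell = {"00": 1 / math.sqrt(2), "01": 0.0, "10": 0.0, "11": 1 / math.sqrt(2)}
joint = {outcome: abs(amp) ** 2 for outcome, amp in bell.items()}
print(joint)  # only "00" and "11" ever occur, each half the time
```

The Bell-state dictionary makes the acceleration claim concrete in miniature: the joint state of n entangled qubits needs 2^n amplitudes to describe, which is exactly what a classical simulator must track and what a quantum machine gets natively.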
Qubits are flighty creatures, easily perturbed by outside forces that derail computations. One approach to that problem is to gang multiple physical qubits together into a single larger error-corrected qubit that doesn't lose the thread as fast. But error correction means quantum computers will need even more qubits.