What is Quantum Computing and How Will It Alter Computers?

Quantum computing is set to change the world in the future and promises computing power far in excess of every computer ever built.

By Tim Trott • How Stuff Works • August 29, 2012

Quantum computing's immense potential has yet to be fully harnessed, and it promises exciting and intriguing possibilities.

Firstly, I will explain a little bit about how traditional computers work (no pun intended!) and how things compare with quantum computers. I'll also define a few terms.

What are Bits in Computing?

In computing, a 'bit' is the smallest unit of information a computer can store. It represents an electrical state, either on or off, and is commonly written as 1 or 0. A 1-bit system can therefore represent only two possible values. To make things easier to understand, let's imagine that a bit is a coin - the coin can only be a head (1) or a tail (0). This is similar to a light switch, which can be either on or off. In contrast, a qubit, the basic unit of information in quantum computing, can be in a superposition of both states, like a light switch that is both on and off simultaneously.

A 2-bit system consists of two coins (bits). With two coins, there are four combinations - head+head, tail+head, head+tail, tail+tail. A 3-bit system has 2^3 = 8 possible values, and so on: each extra bit doubles the number of combinations.

A 'byte' represents 8 bits and has 256 possible combinations. Still, a single byte can only represent one combination at any given time.
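
As a quick Python sketch of that growth (nothing quantum here, just counting), an n-bit system can represent 2^n distinct values:

    for n_bits in (1, 2, 3, 8):
        # an n-bit system can represent 2**n distinct values,
        # but only holds one of them at any given moment
        print(f"{n_bits}-bit system: {2 ** n_bits} possible values")
    # 1-bit system: 2 possible values
    # 2-bit system: 4 possible values
    # 3-bit system: 8 possible values
    # 8-bit system: 256 possible values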

What are Qubits? Quantum Computing Bits?

A Quantum Bit (or qubit) possesses a unique property that differentiates it from traditional bits. It can hold a 1, a 0, or any superposition of the two. This means that a qubit can be heads, tails or any ratio of head:tail, allowing it to hold every value simultaneously. A qubyte (eight qubits), therefore, can hold every value between 0 and 255 simultaneously. Just 100 qubits can hold a superposition over 2^100, roughly 1×10^30, different values - many trillion times the storage capacity of all the computer storage ever made.
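
To check that figure, a register of 100 qubits spans 2^100 basis states; a couple of lines of Python give the order of magnitude:

    n_qubits = 100
    states = 2 ** n_qubits           # number of basis states the register spans
    print(f"{float(states):.3e}")    # ~1.268e+30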

Now, thanks to Schroedinger and his feline friend, once a qubit is measured, it takes on one of the two states: heads or tails.
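
A minimal way to picture this collapse is to treat a single qubit as two amplitudes and simulate a measurement. The sketch below uses plain NumPy as a classical stand-in, not a real quantum SDK:

    import numpy as np

    rng = np.random.default_rng()

    # an equal superposition: "heads and tails at the same time",
    # written as amplitudes for the |0> and |1> states
    state = np.array([1, 1]) / np.sqrt(2)

    def measure(state):
        # collapse the state: return 0 or 1 with probability |amplitude|**2
        probabilities = np.abs(state) ** 2
        return rng.choice([0, 1], p=probabilities)

    print([measure(state) for _ in range(10)])   # a random mix of 0s and 1s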

Building Blocks of Data in Quantum Computing

Let's have a look at exactly what a bit is. I've already told you it is a 1 or a 0, a head or a tail, on or off, but what is this in physical terms?

In the earliest non-electronic information processing devices, such as Babbage's Analytical Engine, a bit was 'stored' as the position of a mechanical lever or gear, and later in the presence or absence of a punched hole at a specific point on a paper card or tape. Later still, bits were represented by magnetic polarity. Bits are easy to make; they can be large or small and stored in permanent or temporary media.

Qubits, on the other hand, are far harder to build and remain a subject of active research. Various approaches are being explored, involving trapped ions, electrons or other tiny particles, superconducting circuits, or photons. The main challenge lies in scaling these techniques up.

The problems don't end once you have made a few qubits. As Schroedinger pointed out, quantum systems must be isolated from the rest of the world to work. If qubits interact with the external world, they decohere, collapsing into a definite binary state just like a traditional computer's bits.

Current thinking is that it is best to decide on an acceptable error rate and design around it. With enough qubits, a machine with a small error rate can still outperform conventional computers. The trouble is that qubits are hard to produce, making them very expensive and, for now, only found in research machines.

Using a Quantum Computer

Let's say you have a working quantum computer with all its quantum power at your fingertips. You might expect to boot Windows in a flash and launch multiple applications instantly and easily. But you would be very wrong. It would be a nightmare.

Google's Quantum Computing Sycamore (2018) Chipset

The first problem you will encounter is that there isn't any way of knowing if it works properly, since observing a quantum system changes the outcome (thanks to Schroedinger again). The quantum computers available today aren't verified to work as they should: they are built on sound theory, a degree of finger-crossing, and judged by their output.

How would you go about programming a quantum computer? Our current programming logic would have to be scrapped, as any answer you get will be based on probability, not a definitive value. Even a variable comparison could be ambiguous. For example, if (variable1 == variable2) is the most basic programming comparison, and traditionally the result is either true or false. What would a programmer do if the comparison returned possibly true, possibly false, or something else?

A calculation must be repeated multiple times to obtain a "most probable" answer, and every repetition adds to the overall running time. Is it still worth using a quantum computer for that kind of task?
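
As a purely classical toy of both problems (a comparison whose single-run answer is only probably right, and the need to repeat it and keep the most common result), consider the sketch below. The noisy_equal function and its p_correct parameter are illustrative inventions, not part of any real quantum programming toolkit:

    import random
    from collections import Counter

    def noisy_equal(a, b, p_correct=0.8):
        # return whether a == b, but give the wrong answer
        # with probability 1 - p_correct
        truth = (a == b)
        return truth if random.random() < p_correct else not truth

    # a traditional 'if' acting on a single run takes the
    # wrong branch about one time in five
    print("single run says:", noisy_equal(7, 7))

    # repeating the calculation and accepting the majority answer is far
    # more reliable, but every repetition adds to the total running time
    votes = Counter(noisy_equal(7, 7) for _ in range(101))
    print("most probable answer:", votes.most_common(1)[0][0])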

How do you test the resulting answer from a quantum computer?

Depending on the problem, you may have no way of knowing whether the answer is correct! Theoretically, a quantum computer can solve in hours a calculation that would take a normal computer thousands of years to work out. So, is the answer correct if there is no way of verifying it?

Quantum Computers and Encryption

Quantum computers, with their ability to run certain algorithms exponentially faster than classical computers, pose a significant threat to modern encryption. Public-key schemes such as RSA and elliptic-curve cryptography rely on the difficulty of factoring large numbers or solving discrete logarithms, problems that Shor's algorithm can solve efficiently on a sufficiently large quantum computer. Symmetric ciphers such as AES are less exposed, since Grover's algorithm only halves their effective key length. For everyday internet users, this means that once-secure passwords, credit card information, and personal data could become vulnerable to decryption by quantum computers. This underscores the urgent need to develop and implement quantum-resistant encryption methods to protect sensitive information and maintain trust in digital security systems.
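
As a rough back-of-envelope sketch of that asymmetry (the key sizes below are common examples, not tied to any particular system): Grover's search over 2^k keys takes roughly 2^(k/2) steps, so a symmetric key keeps only about half its bits of security, while RSA falls to Shor's algorithm outright.

    aes_key_bits = 128
    rsa_modulus_bits = 2048

    # Grover searches N = 2**k keys in about sqrt(N) = 2**(k/2) steps,
    # so k bits of symmetric key give roughly k/2 bits of quantum security
    grover_effective_bits = aes_key_bits // 2
    print(f"AES-{aes_key_bits}: ~{grover_effective_bits}-bit security against Grover's algorithm")
    print(f"RSA-{rsa_modulus_bits}: factorable in polynomial time by Shor's algorithm on a large enough machine")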

Quantum-resistant encryption, also known as post-quantum cryptography, involves cryptographic algorithms designed to remain secure against the potential capabilities of quantum computers. Examples include Lattice-Based Cryptography (the Learning With Errors (LWE) and Ring-LWE problems), Code-Based Cryptography (the McEliece cryptosystem), Multivariate Quadratic Equations (the Rainbow signature scheme) and Supersingular Elliptic Curve Isogeny Cryptography (SIKE). These schemes are built on problems believed to be hard for both quantum and classical computers to solve, although some candidates, notably Rainbow and SIKE, have since been broken by classical attacks.
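
To give a flavour of the lattice-based approach, here is a deliberately tiny and insecure sketch of the Learning With Errors idea: the public key hides a secret vector behind small random noise, and decryption works because the accumulated noise stays small. The parameters are chosen only to make the arithmetic easy to follow, so treat it as an illustration rather than anything resembling a real implementation:

    import numpy as np

    rng = np.random.default_rng(0)
    n, m, q = 8, 20, 97                        # secret length, samples, modulus

    # key generation: secret s, public pairs (A, b = A*s + e mod q) with small noise e
    s = rng.integers(0, q, n)                  # private key
    A = rng.integers(0, q, (m, n))             # public matrix
    e = rng.integers(-1, 2, m)                 # small noise in {-1, 0, 1}
    b = (A @ s + e) % q                        # public vector

    def encrypt(bit):
        # encrypt a single bit by summing a random subset of the public samples
        subset = rng.integers(0, 2, m).astype(bool)
        u = A[subset].sum(axis=0) % q
        v = (b[subset].sum() + bit * (q // 2)) % q
        return u, v

    def decrypt(u, v):
        # v - u.s is close to 0 for a 0 bit and close to q/2 for a 1 bit
        noise = (v - u @ s) % q
        return int(min(noise, q - noise) > q // 4)

    for bit in (0, 1, 1, 0):
        assert decrypt(*encrypt(bit)) == bit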

Final Words on Quantum Computing

Quantum computing is still in its infancy, with our understanding at the abacus stage compared to traditional computing. We are far from a fully functional quantum computer, but the rapid progress underscores the need for further research and development.

Update, May 2013
Google announced that the Quantum Artificial Intelligence Lab, based in NASA's Ames Research Center, would study how quantum computing might advance machine learning.
