KEY CONCEPT
What is Quantum Computing?
Computers currently work using tiny silicon transistors as on/off switches to encode bits of data. Each bit can have one of two values: one (on) or zero (off) in binary code. Traditional computing power is measured by the amount of information these zeros and ones can hold. A bit is either a zero or a one, never both at the same time, and this limits the speed at which computation can occur.
A quantum computer is not limited to this either/or way of thinking. Its memory is made up of quantum bits, or qubits: tiny particles of matter (such as atoms, ions, photons, or even electrons) that serve as the units of quantum computing. A qubit can hold both zero and one at once, meaning a register of qubits can be in a superposition of all possible combinations of zeros and ones; in other words, it can occupy all of those states simultaneously.
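The idea of superposition can be sketched in a few lines of Python. This is a simplified classical model, not real quantum hardware: a qubit's state is described by two amplitudes, one for zero and one for one, and the squared magnitude of each amplitude gives the probability of seeing that value when the qubit is measured.

```python
import math

# A qubit's state can be written as alpha*|0> + beta*|1>, where the
# amplitudes satisfy |alpha|^2 + |beta|^2 = 1. The equal superposition
# puts the same weight on both values.
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)

# Born rule: measurement yields 0 with probability |alpha|^2
# and 1 with probability |beta|^2.
p_zero = abs(alpha) ** 2
p_one = abs(beta) ** 2

print(p_zero, p_one)  # each outcome is equally likely
```

Until the measurement happens, neither value is "the" answer; the amplitudes jointly describe the state.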
A qubit can adopt a value representing zero, one, or any quantum superposition of those two states, weighted in any proportion. This behavior arises purely from the laws of quantum physics.
Qubits can be made in different ways, but the rule is the same: two qubits can be both in state A, both in state B, or one in each state (in either order), giving four possibilities in total. The state of a qubit is not known until you measure it.
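The four two-qubit possibilities, and the fact that measurement settles on exactly one of them, can be illustrated with a short stdlib-only sketch (again a classical simulation, with a seeded random generator standing in for the measurement):

```python
import itertools
import random

# Two qubits span four basis states: 00, 01, 10, 11.
basis = ["".join(bits) for bits in itertools.product("01", repeat=2)]
print(basis)  # ['00', '01', '10', '11']

# In an equal superposition each state has amplitude 1/2, so each is
# observed with probability (1/2)^2 = 1/4.
probabilities = {state: 0.5 ** 2 for state in basis}

# Measuring picks one state at random according to those probabilities;
# before the measurement, the outcome is genuinely undetermined.
rng = random.Random(0)
outcome = rng.choices(basis, weights=list(probabilities.values()))[0]
print(outcome)  # one of the four basis states
```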
In theory, a quantum computer would process all the states of its qubits at the same time, and with every qubit added to its memory, its computational power increases exponentially. So, for three qubits, there are eight states to work with simultaneously; for four, 16; for ten, 1,024; and so on. It does not take many qubits to surpass the memory banks of the most powerful modern supercomputers, so for specific tasks, a quantum computer can find a solution much faster than any regular computer could.
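The doubling described above is just powers of two: each added qubit doubles the number of basis states a register can hold in superposition. A one-line check reproduces the figures in the text:

```python
# Number of basis states for an n-qubit register: 2**n.
# Each added qubit doubles the count, which is why the growth
# is called exponential.
for n in (3, 4, 10):
    print(n, "qubits ->", 2 ** n, "states")  # 8, 16, and 1024 states
```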
[Figure: a classical bit is either 0 or 1; a qubit can be in the superposition (|0⟩ + |1⟩)/√2.]
28 | THE DOPPLER | SPRING 2019