A Global Village
Issue 6 » Atoms

Quantum Vision 2050

Dr. Herbert Wiklicky, Imperial College London

In 1982 Richard Feynman proposed to utilise ‘quantum computers’, employing quantum mechanical phenomena such as superposition and entanglement, in order to perform resource-intensive computational tasks. This was followed in the early 1990s by breakthroughs from Peter Shor and Lov Grover, who described quantum algorithms for factorisation and search that are substantially faster than their classical counterparts.

The conceptual and practical problems with quantum computation are mainly related to the process of extracting information via measurements – the uncertainty in measuring the system has important implications, as one has to distinguish between what is computed by the device and the computational result extracted. Looking to the future, it is to be expected that at some stage quantum effects will start to play a role in integrated circuits such as the ones we use in a normal PC. At this stage we need to decide whether we consider this to be a bug or a positive feature: do we reject a device that does not behave according to classical Boolean logic as faulty, or do we adapt to accommodate this new quantum logic?

The year 2012 will be celebrated by many in the computing community as Turing Year, in commemoration of the 100th birthday of Alan Turing. Turing is commonly regarded as the mathematician who laid the theoretical foundations of computer science by introducing what is now called the Turing Machine. This theoretical model provides the basic ideas of what constitutes a ‘computation’ as a mechanised, step-by-step process. Some people have criticised various aspects of this model as providing only hypothetical ‘paper machines’ rather than real physical devices.

If one follows this argument then one has to accept that computing is fundamentally a physical process and thus limited by physical constraints. One such limitation is the necessary ‘heating up’ of computational devices – a real and major problem for large server farms – which is essentially due to the irreversibility of information erasure. In 1871 James Clerk Maxwell introduced a thought experiment intended to demonstrate that the Second Law of Thermodynamics, which asserts that entropy cannot decrease, has only statistical certainty [1]. While investigating the implications of ‘Maxwell’s Demon’, scientists such as Leo Szilard, Leon Brillouin, Rolf Landauer and Charles Bennett established the thermodynamic limitations of computation. In particular, Landauer’s Principle states that the erasure of information must be accompanied by a release of heat into the environment, whose entropy increases to compensate for the information that is lost.
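
To give a sense of scale – the worked numbers below follow from Landauer’s bound itself, not from anything stated in this article – erasing a single bit at temperature T releases at least k_B T ln 2 of heat, which at room temperature amounts to roughly:

    \[
      E_{\min} = k_B T \ln 2
               \approx (1.38\times10^{-23}\ \mathrm{J/K}) \times (300\ \mathrm{K}) \times 0.693
               \approx 2.9\times10^{-21}\ \mathrm{J}
    \]

A tiny amount per bit, but one that adds up when trillions of bits are erased every second in a data centre.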

Other physical constraints on what can be computed in the real world stem from various fundamental physical theories. For example, special relativity postulates that the maximal communication speed is equal to the speed of light. Similarly, general relativity imposes a particular structure on a computational system, e.g. related to the ‘loss of information’ in black holes, but this seems to be of rather minor importance at the moment, perhaps due to the fact that we do not use solar systems or galaxies to perform computational tasks. What is, however, undeniably an open issue in computing is the effect that quantum physics has on computational machines.

In 1982 Richard Feynman proposed quantum computing in order to perform resource-intensive computational tasks, not least in order to simulate quantum physical systems themselves. In the following years scientists like David Deutsch developed these ideas further, and in the early 1990s a breakthrough came when Peter Shor and Lov Grover described fast quantum algorithms for factorising and search. At about the same time, Charles Bennett and Gilles Brassard, and later Artur Ekert, developed quantum cryptographic protocols which utilise the properties of quantum systems in order to protect information exchange from eavesdropping.

If we look at the practical implications and potential of quantum computers and quantum computation now, and in the future, we have to understand the particular features of quantum systems as compared to classical systems. Without doubt, one can say that quantum physics is a mathematically well-understood theory – yet it is often said that nobody really understands quantum physics intuitively, a sentiment famously voiced by Feynman himself. Indeed, there are legions of popular science books discussing the question of ‘what constitutes quantum reality?’ Intriguing Gedanken experiments like Schrödinger’s cat or the Einstein-Podolsky-Rosen paradox tend to confuse more than they explain when discussed in a mathematics-free form.

Heisenberg’s Uncertainty
Perhaps the most important conceptual issue in quantum theory is the distinction between the state of a system and observations of that system. In classical physics we can observe whatever we like about a system and, if we are careful, the system will not care about whether we observe it or not. Thus, we are used to identifying what we see with what there is. In quantum physics the situation is completely different: we have to distinguish between the state of a system and its observations. Even though we might know everything about the state of a system – for example, because we know how it was created – we still cannot measure all of its properties.

The relationship between state and observables is also complicated by the fact that, when we measure a quantum system, the measurement results are in general not uniquely determined by the state; instead both are related to each other statistically. There are well-defined probabilities that a particular state will give rise to certain measurements but this is not a one-to-one relation. Furthermore – no matter how careful we are – in general measuring a system changes the state of the system.
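
As a rough illustration of this statistical relation between state and measurement, here is a hand-rolled Python/NumPy sketch of my own (not code from any system mentioned in the article):

    import numpy as np

    rng = np.random.default_rng(0)

    # A single-qubit state |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
    alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
    state = np.array([alpha, beta])

    # Born rule: each outcome occurs with probability equal to the squared
    # magnitude of its amplitude -- here [0.5, 0.5].
    probs = np.abs(state) ** 2

    # Measuring picks one outcome at random and irreversibly collapses
    # the state onto it.
    outcome = rng.choice([0, 1], p=probs)
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1.0

    print(probs, outcome, collapsed)

The state determines only the probabilities of the outcomes, and once the measurement has been made the original state is gone.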

In quantum computing this has important implications for quantum memory. Today’s computers, like a Turing machine, work by manipulating bits that exist in one of two states: a 0 or a 1. Quantum computers aren’t limited to two states; they encode information or memory as quantum bits, or qubits, which can exist in combination or superposition, and thus allow us to compute with many different inputs at the same time. If we build quantum memory from qubits, then its capacity to store information increases not linearly but exponentially with the number of qubits, thus increasing computational power greatly.
The reason for this is that qubits are, or can be, entangled, i.e. they are not kept as separate entities – like classical bits – but instead they can only be understood as being part of a greater unit. This gives rise to an increase in information capacity as we record information not as the individual properties of qubits but as properties of the interaction of a whole group of qubits.
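
A minimal way to see both points is the toy NumPy sketch below – my own illustration, not a description of real quantum hardware: describing n qubits requires 2^n complex amplitudes, and simple gate operations can entangle them so that the information no longer resides in any single qubit.

    import numpy as np

    # n classical bits hold one of 2**n values; n qubits need 2**n complex
    # amplitudes to describe their joint state.
    n = 3
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0                                  # start in |000>

    # Elementary gates as matrices.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard: creates superposition
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    # Put the first qubit into superposition, then entangle it with the second.
    state = np.kron(np.kron(H, I), I) @ state
    state = np.kron(CNOT, I) @ state

    # The result (|000> + |110>)/sqrt(2) cannot be written as a product of
    # single-qubit states: the first two qubits are entangled, and the
    # information lives in their joint configuration.
    print(np.round(state, 3))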

Qubits explained

The conceptual and practical problems associated with quantum computation are mainly related to the process of extracting information via measurements, i.e. their interaction with classical systems. For example, the uncertainty in the measurement of the system has important implications, as one has to distinguish what is computed by the device from the computational result we extract: what we see on the output display is fundamentally different from the contents of the quantum memory. Furthermore, if we produce or request an output we change the contents of the quantum memory irreversibly, and the results we get are only probabilistically related to what the device has computed.

Hence, measurements both reduce what we can actually compute and are irreversible. Both of these conceptual problems can, to a certain degree, be overcome and exploited. We can, for example, compute internally a large number of potential results but ultimately check only a single relation, or a small set of relations, between them.
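
Deutsch’s algorithm is the textbook illustration of this idea: a one-bit function f is evaluated on both inputs in a single call, yet the only thing extracted is a single global property – whether f is constant or balanced. Below is a toy NumPy sketch of my own, not something taken from the article:

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)

    def oracle(f):
        # U_f maps |x>|y> to |x>|y XOR f(x)>, written as a 4x4 permutation matrix.
        U = np.zeros((4, 4))
        for x in (0, 1):
            for y in (0, 1):
                U[2 * x + (y ^ f(x)), 2 * x + y] = 1
        return U

    def deutsch(f):
        state = np.zeros(4)
        state[0b01] = 1.0                  # start in |0>|1>
        state = np.kron(H, H) @ state      # put the input qubit into superposition
        state = oracle(f) @ state          # one single call to f, on both inputs at once
        state = np.kron(H, I) @ state      # interference concentrates the answer in qubit 1
        p_one = abs(state[0b10]) ** 2 + abs(state[0b11]) ** 2
        return "balanced" if p_one > 0.5 else "constant"

    print(deutsch(lambda x: 0))   # constant function -> "constant"
    print(deutsch(lambda x: x))   # balanced function -> "balanced"

The device ‘knows’ both values of f internally, but the measurement reveals only the relation between them.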

The observational sensitivity of quantum systems – often referred to as the decoherence problem – however makes them very unstable or, perhaps more precisely, delicate physical systems. This instability can be understood, in one view, as the unintentional observation of a quantum computer or memory by completely unrelated objects, perhaps in another, far-away galaxy. Alternatively, one can argue that the whole universe is described by one big quantum state in which everything is entangled with everything else. Isolating qubits from external influences, or compensating for such interactions, e.g. via error correction, is perhaps the biggest engineering problem currently facing quantum computation.

State of the Art
After the announcement of Shor’s factorisation algorithm and Grover’s quantum search in the 1990s, much hope was placed on a ‘quantum speedup’ that would allow improvements in algorithmic design. Not least, it seemed that quantum computation could provide cryptographic attacks, for example against common public key systems such as RSA encryption. However, despite some new results, including those on so-called hidden subgroup problems, there have been disappointingly few new developments. Quantum computers cannot compute anything that classical machines cannot – undecidable problems remain undecidable in the quantum world – but it had been hoped that an exponential speedup could be achieved for many problems that are classically computationally hard. Unfortunately this remains an open issue.
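
To put the quadratic speedup of Grover’s search into numbers – a back-of-the-envelope comparison of my own, not figures from the article:

    import math

    N = 2 ** 40                          # ~10^12 unsorted items, one of them 'marked'
    classical_expected = N / 2           # expected number of classical lookups
    grover_calls = math.floor(math.pi / 4 * math.sqrt(N))   # optimal Grover iterations

    print(f"classical: ~{classical_expected:.1e} queries")   # ~5.5e+11
    print(f"Grover:    ~{grover_calls:.1e} oracle calls")    # ~8.2e+05

A quadratic gain of this kind is already substantial, but it falls short of the exponential speedups that were once hoped for.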

Perhaps the most active development in quantum computation is with regard to various computational models or schemes. These proposals try to provide frameworks for describing and analysing quantum algorithms while exploiting certain quantum features. The starting point was the quantum circuit model, which describes a computation as a sequence of basic steps employing what are termed quantum gates. Other proposals include one-way or measurement-based quantum computation, which exploits entanglement in combination with appropriate basic measurements in order to perform certain quantum transformations, and topological quantum computation, which aims to construct a framework that is largely immune to external influences. Other developments in this area investigate various description formalisms and programming languages that enable a precise description of quantum computational processes.

Perhaps the most practically successful development relates to quantum cryptographic protocols. These exploit measurement sensitivity in such a way that interference with a transmitted message is detected, so that countermeasures can be taken. Already today some products based on these ideas, such as quantum key distribution, are commercially available.
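
The flavour of such protocols can be conveyed with a toy intercept-and-resend simulation – a plain-Python sketch of my own of the idea behind BB84-style key distribution, not a description of any commercial product:

    import random

    random.seed(1)
    N = 2000                         # number of transmitted qubits in this toy run

    def measure(bit, prep_basis, meas_basis):
        # Same basis: the encoded bit is read faithfully;
        # mismatched basis: the outcome is random.
        return bit if prep_basis == meas_basis else random.randint(0, 1)

    def error_rate(with_eve):
        alice_bits  = [random.randint(0, 1) for _ in range(N)]
        alice_bases = [random.choice("+x") for _ in range(N)]
        sent = list(zip(alice_bits, alice_bases))

        if with_eve:
            # Eve measures each qubit in a random basis and resends what she saw;
            # whenever her basis differs from Alice's, the qubit is disturbed.
            eve_bases = [random.choice("+x") for _ in range(N)]
            sent = [(measure(b, pb, eb), eb) for (b, pb), eb in zip(sent, eve_bases)]

        bob_bases = [random.choice("+x") for _ in range(N)]
        bob_bits = [measure(b, pb, bb) for (b, pb), bb in zip(sent, bob_bases)]

        # Keep only positions where Alice's and Bob's bases agree, then compare
        # a sample in public: a high error rate betrays the eavesdropper.
        kept = [(a, b) for a, ab, b, bb in
                zip(alice_bits, alice_bases, bob_bits, bob_bases) if ab == bb]
        return sum(a != b for a, b in kept) / len(kept)

    print(f"error rate without eavesdropper: {error_rate(False):.1%}")   # ~0%
    print(f"error rate with eavesdropper:    {error_rate(True):.1%}")    # ~25%

The eavesdropper cannot help but disturb the transmission, and that disturbance is exactly what the legitimate parties look for.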

Despite a number of successful experimental setups (and unconfirmed claims) it seems that nobody has yet successfully constructed a quantum computer of reasonable size. This seems to be mainly due to the decoherence problem discussed above. Looking at the history of classical computing, one could argue that without semiconductors it would have been impossible to reach the current level of development, despite the theoretical work of Turing and his colleagues on vacuum-tube-based computers. However, it is hard to forecast any parallel development of ‘quantum transistors’.

Quantum Computing of the Future
Clearly, the future in a historical situation such as ours, with accelerated technological development, is hard to foresee. Indeed, if previous forecasts had turned out to be correct London would be buried under a huge pile of horse manure or suffocating from fumes and pollution. Maybe we can always just see a step ahead rather than look towards 2050. What will happen depends not only on ingenuity, luck and unexpected discoveries, but also on what is possible and what is needed given the laws of nature, economics – and politics.

While physicists start with a quantum dynamical system as it exists in nature and try to describe it and to forecast its behaviour, we have a different, or opposing, situation when it comes to computing. Here we need to describe a desired behaviour in order to achieve a certain result. Given that our existing, classical specification and programming languages fail in principle when it comes to describing quantum computational processes, it will be necessary to develop such calculi and languages. The task is to do this in relation to the right quantum computational model, ideally also reflecting the actual hardware situation. This is essential also when it comes to analysing and modifying quantum algorithms/programmes.

Although Moore’s Law, describing the rapid miniaturisation of classical computing devices, remains accurate, the pace is slowing, and it is to be expected that at some stage quantum effects will start to play a role in integrated circuits such as the ones we use in a normal PC. It remains to be seen whether we will view this uncertainty as a bug or a positive feature in the future: will we adapt to accommodate this new quantum logic?

It seems that adaptation of PCs is a more plausible way of introducing quantum computation, rather than via purpose-built quantum computers. The latter seem to be currently too difficult to build. However, unforeseen discoveries, like superconducting materials at high temperatures, might change the situation drastically.

The ‘over-sensitivity’ of quantum systems that makes it difficult to isolate individual qubits, yet allows us to design successful quantum cryptographic protocols, could play an important role in a highly controlled society. Quantum observations are by their nature irreversible and thus any intrusion can be detected. This could provide a very effective tool to protect privacy and trust – or to support surveillance – since any intrusion leaves a trace (like the hair placed across Winston’s journal in Orwell’s 1984). Schemes like those used in quantum communication, or Stephen Wiesner’s quantum money [2], could provide the first steps in this direction. Possible applications which exploit the fact that quantum systems necessarily possess a kind of intrusion detection mechanism could include cloud computing, enabling users to detect if service providers interfere with data in a non-agreed way – even in the case where they just copy or clone it.

In 2007, US company D-WAVE claimed to have built the world’s first quantum computer yet many in the scientific community remain sceptical. [D-Wave Systems Inc]

However, there might be legal challenges to this approach related to, for example, the Regulation of Investigatory Powers Act 2000 (RIP) in the UK. If quantum data were required to be revealed to a court or the police, this could cause two kinds of problem. On the one hand, as we have seen, it is in principle impossible to measure or extract all properties of a quantum memory, so some information always remains unknown; no power in the world can order its disclosure, and the owner of such data would therefore inevitably fall foul of the RIP requirements. On the other hand, the information is to some extent destroyed by the very act of disclosure, which raises the question of damages and compensation. Legal requirements like RIP might thus turn out to be a major obstacle in the development of quantum computation, in a similar way perhaps to the ancient requirement that any carriage not drawn by a horse had to be preceded by a man waving a red flag.
 
Quantum computing applications to privacy protection could well be the most important development in this area by mid-century. If the only thing worth stealing is information, then the ability to detect observation is of utmost importance. However, if in 2050 we live in an over-heated, drowning and starving world, then an old AK47 might be more valued than any smart device equipped with quantum intrusion detection.

Dr. Herbert Wiklicky is Reader at the Department of Computing of Imperial College London. He holds degrees in Mathematical Physics, Mathematics and Computer Science. His current work focuses on probabilistic program analysis, quantitative models of computation, and semantics.

[1]    Leff, H. S. & Rex, A. F. (2002) Maxwell’s Demon 2: Entropy, Classical and Quantum Information, Computing. CRC Press.
[2]    Singh, S. (1999) The Code Book: The Evolution of Secrecy from Mary, Queen of Scots, to Quantum Cryptography (1st ed.). Doubleday, New York, NY, USA.