- The transistor is a fundamental component of all modern electronics, used to amplify and switch electrical signals.
- The first transistor was invented in 1947 by John Bardeen, Walter Brattain, and William Shockley. It was based on semiconductors, that is, materials with an electrical conductivity between that of conductors and insulators.
- The vast majority of all semiconductor components are based on the element silicon. This technology forms the basis for integrated circuits (chips), where thousands of transistors together form microprocessors that can process digital information.
In 1965, Gordon Moore, who was later one of the founders of the Intel Corporation, predicted that the number of transistors in integrated circuits would double every year.
He subsequently adjusted this to a doubling approximately every two years, and this prediction of an exponential development has held true with remarkable precision.
PlayStations are 1,000 times faster than my first computer
At the age of 7, I made my first acquaintance with computers: an IBM Personal System/2 with an Intel 8086 CPU with a clock frequency in the MHz range. The smallest structures in the processor were 3.2 microns—I was, of course, blissfully unaware of this at the time.
Now my own children play games on a PlayStation with an 8-core CPU that runs FIFA and other good stuff at 3.5 GHz and is built with 7-nanometre fabrication technology.
In other words: roughly 1,000 times finer manufacturing technology and roughly 1,000 times higher clock speed in just over 30 years.
And, of course, this is just the technology for private users. If you instead take supercomputers such as Fugaku in Japan and IBM’s Summit in the United States, these are transistor-based computers optimized to the limit. They crunch data at a rate of several hundred petaFLOPS (peta = 1,000,000,000,000,000; FLOPS = floating point operations per second).
It is easy to get the impression that this is an endless development and that computers are so powerful that they can handle anything.
- In the 19th century, the mathematician Charles Babbage designed the first programmable mechanical computing machines, and Ada Lovelace, who was also a mathematician, had the idea of developing algorithms for machines of this type.
- In the 1930s, Alan Turing developed an abstract general understanding of what a computer is.
- In the late 1930s and early 1940s, the world’s first electronic digital computer, known as the Atanasoff-Berry computer, saw the light of day.
- After a few decades of development, in 1971 Intel launched the first commercial microprocessor, the Intel 4004, on the market—a CPU with no less than 4 bits.
- In the 2020s, we are now at the beginning of the exascale computing era, where computers are capable of processing data with more than 10¹⁸ operations per second.
But it is actually not that hard to find a problem that can checkmate even the most powerful computers.
The key is to realize that the number of operations required of the computer to perform a calculation scales very differently for various types of problems.
For example, if you add up numbers, the number of operations grows linearly with the number of digits in the numbers (n).
The calculation 317,456 + 467,084 requires roughly twice as many operations as 317 + 467.
If we instead multiply the same numbers by each other, the number of operations for the common schoolbook method grows quadratically (n²) with the number of digits in the numbers.
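To make the scaling concrete, here is a minimal Python sketch (our own illustration, not from the text) of the schoolbook method that counts the single-digit multiplications it performs:

```python
def schoolbook_multiply(a: int, b: int):
    """Multiply two non-negative integers with the schoolbook method,
    counting the single-digit multiplications performed."""
    digits_a = [int(c) for c in str(a)][::-1]  # least significant digit first
    digits_b = [int(c) for c in str(b)][::-1]
    ops, result = 0, 0
    for i, x in enumerate(digits_a):
        for j, y in enumerate(digits_b):
            result += x * y * 10 ** (i + j)  # shift by the digit positions
            ops += 1
    return result, ops

# Doubling the number of digits quadruples the work:
print(schoolbook_multiply(317, 467)[1])        # 9 digit multiplications
print(schoolbook_multiply(317456, 467084)[1])  # 36 digit multiplications
```

With n-digit numbers the inner loop runs n x n times, which is exactly the quadratic growth described above.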
The travelling salesman problem
It is thus a more complex task to multiply numbers than add up numbers, and it is therefore also a more time-consuming calculation for the computer.
Computers can keep up and perform the task efficiently as long as the number of operations does not grow too quickly with the size of the problem.
But for some types of calculations, the number of operations increases exponentially (aⁿ) or worse, and this is where even the most powerful computers fall short: the calculation time becomes dramatically longer as the problems grow.
An example is ‘The travelling salesman problem’: The computer is fed a list of cities and their distances from each other, and is asked to find the shortest route that goes through all cities once and returns to the point of departure.
If there are only 10 cities to visit, there are 10 x 9 x 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1 = 3,628,800 options for the computer to examine and choose from.
It takes time, but it can be done. If you add just three more cities to the route, there are over six billion options. The travelling salesman problem is relevant for, for example, logistics companies, and they typically have much more than 13 destinations on their routes, making the problem extremely complicated to solve.
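A brute-force search over all routes can be sketched in a few lines of Python; the distance table below is invented purely for illustration:

```python
import math
from itertools import permutations

def shortest_tour(dist):
    """Brute-force travelling salesman: fix city 0 as the starting point
    and try every ordering of the remaining cities."""
    n = len(dist)
    best_len, best_route = float("inf"), None
    for perm in permutations(range(1, n)):
        route = (0, *perm, 0)  # start and end in city 0
        length = sum(dist[a][b] for a, b in zip(route, route[1:]))
        if length < best_len:
            best_len, best_route = length, route
    return best_len, best_route

# Four cities with made-up pairwise distances:
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 8],
        [10, 4, 8, 0]]
print(shortest_tour(dist))  # (23, (0, 1, 3, 2, 0))
print(math.factorial(10))   # 3628800 -- the count from the text for 10 cities
print(math.factorial(13))   # 6227020800 -- over six billion for 13 cities
```

Four cities are checked instantly, but because the number of orderings grows factorially, each extra city multiplies the work by the number of cities so far, and the loop soon becomes hopeless.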
The quantum computer constitutes both new technology and a completely new way of looking at the world
What makes the quantum computer special is precisely its potential to solve some of the problems that, due to their complexity, are practically unsolvable with ordinary computers.
This is possible because it works with a more advanced information concept than ordinary computers.
The basis of our current digital information technology, and thus the computer as we know it, is the unit of information called a ‘bit’ (binary digit), which can assume the values 0 or 1. In the quantum computer, the counterpart is a ‘qubit’ (quantum bit)—a two-level quantum mechanical system.
A qubit has two states at the same time, perhaps?
In the hardware of an ordinary computer, the abstract bit is realized as a small electronic circuit, and, correspondingly, the qubit is also a small physical system in the quantum computer. But where the bit can only be in the states corresponding to ‘0’ and ‘1’, the qubit has more options. Its more complex state is described by the formula:

|ψ⟩ = α|0⟩ + β|1⟩
If you measure the qubit, you always get one result or the other, |0⟩ or |1⟩.
But before that, it is in a superposition of the two.
In popular terms, it is often said that the qubit can be in both one and the other state at the same time.
We can only say something about the probabilities
Quantum mechanics gives rise to many puzzles, and the superposition states so essential for the qubit are a particular source of confusion.
The Copenhagen interpretation, developed in particular by Niels Bohr and Werner Heisenberg, avoids some of the problems by understanding a quantum state, such as the one in the formula above, solely as an expression of the probabilities of the possible measurement results.
The quantum state contains everything there is to say about the system, but it does not represent the physical object itself. Therefore, we are not to understand it as if the qubit is in both states at the same time. It would be more correct to say that it is neither one nor the other until it has been measured. Only at that moment does the state randomly collapse into one or the other.
And the only things we can predict are the probabilities of the two measurement results, which are found by calculating |α|² and |β|², respectively.
Since α and β are complex numbers, it is necessary to calculate the absolute square to obtain an ordinary (real) number.
One can think of a complex number as an arrow that has both a length and a direction. When calculating the absolute square, you get a measure of the length, but lose all knowledge about the direction. But we will not concern ourselves with this any further here.
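How these probabilities play out can be simulated in a few lines of Python; this little sketch (the function name and shot count are our own choices) repeatedly ‘measures’ a qubit in the state α|0⟩ + β|1⟩:

```python
import random

def measure_qubit(alpha, beta, shots=100_000):
    """Simulate `shots` measurements of the state alpha|0> + beta|1>.
    Each measurement yields 0 with probability |alpha|^2, else 1."""
    p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
    assert abs(p0 + p1 - 1) < 1e-9, "the state must be normalized"
    zeros = sum(random.random() < p0 for _ in range(shots))
    return zeros / shots, (shots - zeros) / shots

# An equal superposition, alpha = beta = 1/sqrt(2):
f0, f1 = measure_qubit(2 ** -0.5, 2 ** -0.5)
print(round(f0, 1), round(f1, 1))  # 0.5 0.5 (up to statistical noise)
```

Note that measure_qubit(2 ** -0.5, -(2 ** -0.5)) gives the same statistics: the ‘direction’ of the complex amplitudes is lost in the absolute square, just as described above.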
The algorithms make the quantum computer special
The fact that a quantum computer works with qubits instead of bits means that information is encoded in its processor in a completely different way.
In fact, our whole concept of what information is gets expanded—generalized—as we move from the digital 0s and 1s to the superpositions of the qubit.
It also makes it possible to develop completely new algorithms that exploit the possibilities the generalized concept of information opens up.
In the same way as the right recipe is crucial for how good ingredients are turned into tasty food, algorithms are completely essential for utilizing the resources in the quantum computer hardware.
In some cases, the quantum computer makes the difficult problems palatable.
It is therefore not ‘just’ another step up the technological ladder of evolution, but also a leap to a whole new way of looking at the world.
Rotating national symbols and alternative Cola brands
Because, ultimately, the algorithms are what enable the computer to solve problems for us, it also became a focal point for our book on the quantum computer.
But it is also a very abstract topic, and instead of just scratching the surface of Peter Shor’s algorithm for prime number factorization, we put our effort into visually explaining a different and much simpler quantum algorithm from end to end.
- When we make payments online or need to identify ourselves using the MitID password, our data is secured by means of so-called asymmetric encryption. One of the basic techniques for this is RSA encryption, and its security rests on the fact that it is practically impossible for ordinary computers to figure out which two prime numbers multiplied by each other give a known third number, provided the numbers are large enough. It is easy enough to figure out that 15 = 3 x 5, but what two prime numbers do you have to multiply by each other to get 32,278,157?
- RSA encryption is used for more than 90 per cent of all communications via the Internet.
- In 1994, American mathematician Peter Shor developed a quantum algorithm that makes prime number factorization a manageable task for a quantum computer. In other words, Shor’s algorithm will enable a large enough quantum computer to break RSA encryption.
- Shor’s algorithm is a powerful driver for the development of quantum computers.
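To get a feel for the asymmetry, here is a minimal trial-division sketch in Python; the two primes in the example are our own illustrative choices, not taken from the article:

```python
def smallest_prime_factor(n: int) -> int:
    """Find the smallest prime factor of n by trial division.
    The loop runs up to sqrt(n), so for an RSA-sized n with hundreds
    of digits this approach is utterly hopeless -- which is exactly
    what the encryption relies on."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n is itself prime

# Multiplying is instant; undoing the multiplication takes work:
n = 7907 * 7919           # two illustrative primes
p = smallest_prime_factor(n)
print(p, n // p)          # 7907 7919
```

At eight digits, trial division is still fast; but the running time grows roughly with the square root of n, so every extra pair of digits makes it about ten times slower, and RSA moduli have hundreds of digits.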
In this way, qubits and their superpositions ended up taking the form of rotating national symbols and alternative Cola brands, and the intermediate calculations that steadily unfold the potential of the algorithm were translated into the same figurative language.
We still have not seen it solve the hardest problems
Although it is theoretically possible to build a universal quantum computer, that is, a computer that can run any conceivable algorithm, it is by no means certain that it can do so efficiently.
In fact, so far we know of only a few examples of problems where a quantum computer can perform better or faster than an ordinary computer.
And, so far, it has only been demonstrated for problems of very limited practical relevance.
But the potential is so great that there are still strong minds and deep pockets that are determinedly pushing the development forward.
We just have to arm ourselves with patience—perhaps for another couple of decades—and make sure to steer clear of the worst of the hype.
New technologies are always surrounded by some degree of hype, which is probably also necessary to mobilize the enthusiasm and financial support required to realize them.
But if expectations become too high and it takes too long to meet them, this may result in so-called ‘tech winter’ (where the development grinds to a halt), and that is not in anyone’s interest.
The development of powerful quantum technologies requires patience and patient investments.
Four and a half years and 30 pages later
Back to the textbook. As with all our previous projects, Jan Egesborg and I also adopted a goal-oriented and ambitious approach to this textbook.
But, just as resoundingly, we also ran into a large, heavy, classic wall that left us little chance of tunnelling out on the other side.
The textbook format simply turned out to be a very poor match for us and our penchant for quirky narratives.
Getting past it required countless failed attempts, long pauses for reflection, and the eye-opening experience of rereading physicist Richard Feynman’s iconic lecture from 1982.
In the end, we accepted that we simply had to give ourselves the freedom to follow our own ideas about what a textbook can also be, and that broke the deadlock.
New book about quantum computers for primary and lower secondary schools
We then proceeded at our usual high pace, and a cute little book of about 30 pages materialized before us.
The book, which is simply entitled ‘Kvantecomputeren’ (The Quantum Computer), was published at the end of March, and the schools have shown much greater interest in it than we had dared to hope for.
Now we eagerly await hearing how it works in the classrooms and whether, over time, it will make more young people say: “Of course I know what a quantum computer is—I learned that in school!”
‘Kvantecomputeren’ by Jan Egesborg and Ulrich Busk Hoff has been published by the publishing house Nybrogade Press.
Quantum technology is an area of rapid growth. Researchers at DTU are focusing on three areas of technology: quantum communication and data security; ultra-sensitive quantum sensors; and the development of quantum computers. This is done through both basic research and the development of technologies that can be used by businesses and government alike, both of which are showing strong interest in the field. Read more in our special topic about quantum technology.