Unconventional Computing

Computing in new and creative ways

While most of us are probably not familiar with all the details of how computers work, we are generally pretty familiar with conventional computing. In fact, since this lesson is on the internet, everyone reading it must be at least somewhat familiar with computers. The conventional computers we use typically work by having a central processing unit (CPU) that performs logical operations on information stored in memory. While the details of how these components actually function are immensely complicated, the basic concept is not: a set of instructions encoded in ones and zeros is executed according to well-defined rules. Combining these instructions gives predictable behaviors that can be used to display things like Wikipedia articles about birds, or to do more mathematical things like solving optimization problems. The point here is that information is processed centrally and sequentially based on mathematical and logical rules, and this is all done using electricity in semiconductor systems.
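To make the idea of well-defined rules concrete, here is a tiny sketch in Python (a language chosen purely for illustration) of a half adder: two logical operations on bits that always produce the same output for the same input.

```python
# A minimal sketch of deterministic, rule-based computation: a half adder
# built from two logical operations on bits. Every input pair always
# produces the same output -- there is no randomness anywhere.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits, returning (sum, carry)."""
    return a ^ b, a & b  # XOR gives the sum bit, AND gives the carry bit

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```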

Unconventional computing is any type of computing done in a way that is not the usual way. For the present discussion it makes sense to keep the definition of unconventional somewhat informal. The term may be used in different ways elsewhere, but for our purposes, we mean things that differ substantially from the conventional method of computing. For example, having a few processors that work together isn’t that unconventional; in fact, it is something that most modern computers do. Likewise, for our purposes, having a computer that functions in the usual way but uses trinary variables (0, 1, or 2) instead of binary 0 or 1 wouldn’t be very unconventional. The basic concept is the same; only a detail is different. What we are most interested in here is the idea of computing in ways that are significantly different, where, for example, it is hard to even define what the instructions or logic are in the conventional sense, and where there is no clear CPU processing the information. Systems designed to act directly as neural networks are one example: there is clear structure and logic, but it doesn’t fit into the conventional framework for computing.

It should go without saying that this lesson will not cover the entire field of unconventional computing. The field is vast and contains many efforts where the goal is not so much to improve performance but to push the boundaries of our understanding of what a computer can be (see this special issue for some examples and a discussion of different topics). We will instead focus on the approaches that are relevant to our products and that may be practically useful.

Why unconventional?

[Figure: An example realization of entropy quantum computing, a form of unconventional computing]

This is certainly a valid question. After all, computers were designed the way they are for a reason, and that design was refined over many decades by very smart people. Modern computers can do many tasks very well. The answer is that for most tasks conventional computers work just fine, but there are some tasks where the conventional approach struggles, and that is where a new way of thinking about computing comes in. There is no point in reinventing the wheel for what our computers already do well.

A good historical example here is the graphics card. A lot of calculations have to go into displaying complex three-dimensional images, and they have to be done quickly, particularly for video games, where a bit of lag in what you see can make you lose. It is hard for a single processor, even a very fast one, to keep up. The solution is to calculate different parts of the image in parallel using a chip that has many processors, each with a small amount of memory. This is exactly what graphics cards do. While graphics cards (known as graphics processing units, or GPUs) have found uses beyond displaying graphics, such as in machine learning, modern computers still have more conventional central processing units to perform all of the other tasks that are not well suited to massive parallel processing. Given how widespread they are now, it is probably debatable whether GPU computing can really still be considered “unconventional”, but it was certainly an unconventional idea at one time.
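As an illustrative sketch (not actual GPU code), the snippet below uses NumPy to apply the same per-pixel operation to every pixel of a fake image at once; that per-pixel independence is exactly what a GPU exploits by splitting the work across many simple processors.

```python
# A toy illustration of the data-parallel idea behind GPUs: the same small
# operation (here, a brightness adjustment) is applied independently to
# every pixel, so the work can be split across many simple processors.
# NumPy runs this on the CPU, but it expresses the per-pixel independence
# that a GPU exploits. The image here is random placeholder data.
import numpy as np

image = np.random.rand(1080, 1920, 3)       # a fake 1080p RGB frame
brighter = np.clip(image * 1.2, 0.0, 1.0)   # one rule applied to ~6M values at once

print(brighter.shape)  # (1080, 1920, 3)
```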

QCi’s unconventional computers address tasks where more conventional techniques struggle. These include solving hard optimization problems with the EQC (entropy quantum computing) paradigm, and performing machine learning with minimal training in the reservoir computing paradigm.

What is computing?

[Figure: The QCi team working on unconventional computing hardware]

Similar to how we needed to define what information is in this lesson, it is worth thinking about what it actually means to compute, especially in an unconventional context. Conventional computers are pretty easy to define just by listing a fairly narrow set of conditions that they all satisfy (for example, they are electronic and based on transistors, have central processors and memory, etc.). This notion does not work in unconventional computing, where the goal is to look for computers outside the traditional idea of what a computer is. Another option would be to require a computer to be universal, capable in principle of performing any computation given enough time. However, even this is too limiting, as an unconventional computer that is good at solving one particularly important problem could be useful without being universal. A historical example is the analog tide-predicting machine, which could perform only a very specific but highly important computation.

We do need to limit the definition of what a computer is somehow, though; otherwise we could say that a rock in the front yard is a type of computer that does a very efficient job of calculating the thermal equilibrium behavior of a rock (but isn’t very good at computing anything else). Fortunately, we are not the first people to consider the question of “When does a physical system compute?” The answer, according to researchers on the topic, is that computation relies on having an abstract model of what the system is doing, where information is programmed into the system and eventually read out, with the physical system acting as a stand-in for the model. In this idea of computing, almost anything can be a computer if it is used as one, but just existing doesn’t make a system a computer. This holds up well in real applications of unconventional computing. For example, some of the earliest ideas for reservoir computers considered the reservoir to be a literal bucket of water. We offer a reservoir computer as one of our products, but it uses a photonic system that is somewhat more complex than a bucket of water.
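To make the reservoir idea concrete, here is a minimal software sketch of reservoir computing: a fixed random dynamical system is driven by an input, and only a simple linear readout is trained. Everything here (sizes, weights, the toy prediction task) is an arbitrary illustrative choice, and it says nothing about how our photonic hardware is actually built.

```python
# A minimal software sketch of the reservoir computing idea: a fixed,
# random dynamical system (the "reservoir") is driven by an input signal,
# and only a simple linear readout is trained. The bucket of water or a
# photonic system plays the role that the random matrix W plays here.
# All sizes and the task (predicting sin(t) one step ahead) are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n_reservoir = 200
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, 1))          # input weights (fixed)
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))   # reservoir weights (fixed)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))               # scale for stable dynamics

t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)                # input signal
target = np.roll(u, -1)      # task: predict the next value

# Drive the reservoir and record its states.
x = np.zeros(n_reservoir)
states = np.zeros((len(u), n_reservoir))
for i, u_i in enumerate(u):
    x = np.tanh(W @ x + W_in[:, 0] * u_i)
    states[i] = x

# Train only the linear readout (ridge regression), discarding a warm-up period.
warm = 100
A = states[warm:-1]
y = target[warm:-1]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_reservoir), A.T @ y)

pred = A @ W_out
print("readout RMS error:", np.sqrt(np.mean((pred - y) ** 2)))
```

The key design point, mirrored in hardware reservoirs, is that the expensive-to-train part (the recurrent dynamics) is never trained at all; only the cheap linear readout is fitted.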

Probabilistic Computers

Conventional computers, like the one you are probably using to read this article, are deterministic. This means that a predetermined series of events will happen for a given input, and any deviation from this behavior is an error (errors do happen, but they are exceedingly rare in a properly functioning conventional computer). An alternative is to design a computer where the actions of the computer are well-defined, but there is an element of randomness in how it behaves. We will call this a probabilistic computer, while intentionally avoiding the term “non-deterministic”, since that has a particular meaning in complexity theory which we want to avoid. While this may seem like a step backward, there are many cases even in conventional computing where making a random choice is useful. In fact, the entire Monte Carlo family of computational methods was named after a casino in Monaco for this reason.
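As a classic illustration of useful randomness, the sketch below estimates π by Monte Carlo sampling: random points in the unit square are counted according to whether they fall inside the quarter circle.

```python
# A classic example of useful randomness in computation: estimate pi by
# sampling random points in the unit square and counting how many land
# inside the quarter circle. More samples give a better estimate, but
# any single run is random -- deliberately so.
import random

def estimate_pi(n_samples: int) -> float:
    inside = sum(
        1 for _ in range(n_samples)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples

print(estimate_pi(1_000_000))  # close to 3.14159, varies run to run
```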

The most “conventional” way to perform probabilistic computation is to use a deterministic computer but feed it randomly generated numbers so that some choices are made randomly. One of our products is a device that provides exactly this: a random number generator. A more unconventional approach is to use a computer where the randomness is built into the hardware itself. Since quantum mechanical processes are fundamentally probabilistic (thanks to how quantum measurement works), our EQC paradigm is one example of fundamentally probabilistic computing.

Analog and Continuous-time Computers

Conventional computers are digital, meaning that the whole computational process can be described in terms of discrete 1 and 0 variables, which undergo discrete operations. Without quantum mechanics, discrete, separate states imply discrete operations: bit flips that happen at specific times. While nature itself is analog, conventional computers define ones and zeros as voltages within certain ranges, separated by a range of voltages that are considered “forbidden”. A classical computer where the allowed states are continuous would be an analog computer, with values that change continuously over time. If we are being very technical, even a digital computer has to have a voltage pass through the forbidden values when switching, but this happens quickly, in a way that does not interfere with other operations; the model can therefore remain discrete and treat the change as instantaneous.
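As a toy illustration (with made-up thresholds, not values from any real logic family), the snippet below shows the digital abstraction sitting on top of analog voltages: two valid ranges and a forbidden band in between.

```python
# A toy version of how digital logic sits on top of analog voltages:
# anything below V_LOW reads as 0, anything above V_HIGH reads as 1, and
# the band in between is "forbidden" -- a properly functioning circuit
# only passes through it briefly while switching. The thresholds here
# are illustrative, not taken from any real logic family.
V_LOW, V_HIGH = 0.8, 2.0  # volts (hypothetical thresholds)

def read_bit(voltage: float) -> int | None:
    if voltage <= V_LOW:
        return 0
    if voltage >= V_HIGH:
        return 1
    return None  # forbidden range: not a valid logic level

for v in (0.1, 1.4, 3.1):
    print(f"{v} V -> {read_bit(v)}")
```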

For quantum systems the distinction between digital and analog is more subtle: quantum superposition means that it is possible to transition between discrete states in a continuous way, with the weights in the superposition changing gradually over time. The closest analogy to digital computing in a quantum setting is gate-model quantum computing, where discrete quantum operations (gates) are defined similarly to the operations performed in conventional digital computers. While our core technology could be used to build gate-model computers, there are also numerous more unconventional ways in which it can be used. These more unconventional models can be less demanding in terms of noise levels and technology, so they are our current focus.

Computing on a discrete state space, but in a way that is continuous in time, is a uniquely quantum proposition. Classically, if a bit is in the state 1, then it either flips to state 0 or it doesn’t; having a state between 0 and 1 is contrary to the definition of a bit, and the computer would be analog in that case. Quantum mechanically, however, it is possible to continuously change a superposition over time. For instance, if I define

$$|\psi(t)\rangle = a(t)\,|0\rangle + b(t)\,|1\rangle$$

then I can imagine changing $a$ and $b$ continuously (note that normalization requires $|a(t)|^2 + |b(t)|^2 = 1$ at all times), such that initially $a(0)=1$ and $b(0)=0$, and after some time $b(1)=1$ and $a(1)=0$, effectively flipping a bit over time. By the rules of quantum mechanics, I can measure at any time and I will always find a 0 outcome with probability $|a(t)|^2$ and a 1 outcome with probability $|b(t)|^2$, with no chance of any other outcome. This is fundamentally different from a classical probability distribution, which may change continuously overall while the underlying system still undergoes discrete jumps; such hidden-variable models are not possible in general for quantum systems. An extension of this is that many operations can be performed simultaneously. The simplest example is a continuous-time quantum walk, where Hamiltonian terms that flip bit values and terms that describe an optimization problem are active at the same time. In one sense such a system is digital, because there are two discrete outcomes; in another sense it is analog, because the operation is continuous and the state is well-defined at all points during the operation. By the traditional way of thinking, continuous-time quantum computing is somewhere between analog and digital: it has a combination of properties of both which would not be allowed in a non-quantum setting.
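As a numerical sketch of this continuous bit flip, the snippet below uses the arbitrary choice $a(t) = \cos(\pi t/2)$, $b(t) = \sin(\pi t/2)$, which satisfies the boundary conditions above and keeps the state normalized at every intermediate time.

```python
# A numerical sketch of the continuous bit flip described above, with the
# specific (and arbitrary) choice a(t) = cos(pi*t/2), b(t) = sin(pi*t/2),
# which satisfies a(0)=1, b(0)=0 and a(1)=0, b(1)=1. At every intermediate
# time the state is a valid superposition: the two measurement
# probabilities are well-defined and sum to exactly 1.
import numpy as np

for t in np.linspace(0.0, 1.0, 5):
    a = np.cos(np.pi * t / 2)
    b = np.sin(np.pi * t / 2)
    p0, p1 = abs(a) ** 2, abs(b) ** 2
    print(f"t={t:.2f}  P(0)={p0:.3f}  P(1)={p1:.3f}  total={p0 + p1:.3f}")
```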

Conclusion

While conventional computers are very powerful and can handle many tasks very well, there are some tasks where computing capabilities are still limited. For these tasks, such as those in optimization and machine learning, unconventional ways of thinking about computing are needed. The computing technology we are developing at QCi showcases some of the ways in which unconventional computing can be performed. In particular, quantum systems provide new ways to compute, based on the continuous nature of quantum superpositions.