What is Unconventional Computing: The weird and unexpected

If you have ever wondered what’s behind computing, it might surprise you to learn that people built computers by other means long before the electronic machines we see everywhere today.

Unconventional computing is an interdisciplinary research area with the main goal of finding computing models that go beyond or enrich the standard models we use today. This research effort can also involve alternative ways of implementing a computer that go beyond electronics.

Some of the technologies being researched today suggest some exciting developments may be in our future.

What is alternative computing?

Some folks use “alternative computing” to describe the hacker and computer-cracker culture, also known as hacktivism, that has developed over the years. It’s also another moniker for unconventional computing, which is where I’m going to focus. In that sense, it describes any new or unconventional method of computing that is in development.

The general theory of computation allows for many different computational models. Several are in use today, implemented by electronic means, and those implementations are the most advanced we have at the moment. Before electronic computing, computational models were implemented mechanically.

As research and development continue, new and exciting methods of implementing a computational model will likely come along, expanding the capabilities we’ve developed. Some of these may seem very strange or mysterious compared with the way computing is done today.

What is mechanical computing?

Mechanical computing is the process by which mechanical systems are designed and implemented to carry out computation. That means programs can be made, loaded, and run on a mechanical computing machine. One of the simplest examples is an old music box that you crank by hand: the drum in the center is the program, you crank it to run it, and the machine plays a tune. A designer can arrange much more advanced operations, however.
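
To make that concrete, here is a minimal sketch in Python of a music-box drum treated as a program (the drum layout and note names are invented purely for illustration):

    # Illustrative sketch: a music-box drum as a mechanical "program".
    # Each row is one crank step; a "*" is a pin that plucks a tine.
    drum = [
        "*...",  # step 1: pin under the C tine
        ".*..",  # step 2: pin under the E tine
        "..*.",  # step 3: pin under the G tine
        "*.*.",  # step 4: two pins strike at once (a chord)
    ]
    tines = ["C", "E", "G", "C'"]  # the comb, one tine per drum column

    def crank(drum):
        # One full turn of the handle: each row of pins strikes its tines.
        for row in drum:
            struck = [tines[i] for i, pin in enumerate(row) if pin == "*"]
            print(" ".join(struck) or "(rest)")

    crank(drum)

Swap in a different drum and the same machine plays a different tune, which is exactly what makes the drum a program.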

Mechanical calculating machines have a significant legacy in history, from the abacus to the slide rule. Invented in the early 1600s, the slide rule was the go-to tool for engineers and mathematicians for centuries, and it is so effective for quick calculations that it is still sometimes used today. Even though these devices never really took off with the general public, they were an important part of developing the modern world we live in.
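
The trick behind the slide rule is that its scales are logarithmic, so multiplication reduces to adding two lengths: log(a × b) = log a + log b. A quick sketch of that principle (illustrative only):

    import math

    def slide_rule_multiply(a, b):
        # Sliding one log scale along another adds the two distances...
        distance = math.log10(a) + math.log10(b)
        # ...and reading the answer off the scale undoes the logarithm.
        return 10 ** distance

    print(slide_rule_multiply(3, 7))  # ~21.0, to slide-rule precision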

More advanced machines, such as adding machines and mechanical counters, use levers, gears, and other mechanical components in place of electronic ones. Mechanical computing was all the rage until the 1960s, when electronic computing methods began to spread and take over the market. Mechanical computing continued into the ’70s and was eventually phased out through the ’80s.

What is electronic computing?

Electronic computing started as a hybrid of mechanical and electrical computing. Electro-mechanical computers used switches and relay logic, but a substantial number of mechanical components were still involved. It wasn’t long before vacuum tubes were introduced and, subsequently, the transistor.

Once the transistor became mainstream, electronic computing capability was refined and has been growing ever since. This is due to the transistor’s ability to function as a switch; from that simple function, you can build the basic binary logic structures.
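
As a rough sketch of how that works, treat each transistor as an ideal on/off switch: that gives you a NAND gate, and NAND alone is enough to compose every other logic gate. Python is used here purely as a modeling language:

    # Model the transistor as an ideal switch. Two switches in series give
    # (roughly) a NAND gate, and NAND is universal: the rest compose from it.
    def nand(a, b):
        return 0 if (a and b) else 1

    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))
    def xor_(a, b): return nand(nand(a, nand(a, b)), nand(b, nand(a, b)))

    # Truth-table check:
    for a in (0, 1):
        for b in (0, 1):
            print(f"{a} {b} | AND={and_(a, b)} OR={or_(a, b)} XOR={xor_(a, b)}")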

With the basic binary logic structures, you can build complex binary computational machines: machines so complex that the entire modern computational environment we enjoy today is built on top of them. Increases in speed and a continued march toward miniaturization have unlocked the incredible applications of this technology all around us.
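
For example, wiring a handful of those gates together already yields arithmetic. Here is a hedged sketch of a one-bit full adder chained into a 4-bit adder, the same pattern (at far greater width) found inside a real ALU:

    # A full adder from basic gates, chained into a 4-bit ripple-carry adder.
    def full_adder(a, b, carry_in):
        partial = a ^ b                             # XOR gate
        total = partial ^ carry_in                  # XOR gate
        carry_out = (a & b) | (partial & carry_in)  # AND/OR gates
        return total, carry_out

    def add_4bit(x, y):
        carry, result = 0, 0
        for i in range(4):  # least significant bit first
            bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= bit << i
        return result

    print(add_4bit(0b0101, 0b0011))  # 5 + 3 = 8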

What computing models are used today?

Originally, computing was done by purpose-built machines that could only run one program. To change the program, you usually had to rebuild the whole machine or, at a minimum, significantly modify it. The stored-program computer was proposed to give computing machines more flexibility.

One of the most prevalent computing models for a stored-program computer in use today is the Von Neumann architecture. The Von Neumann model includes the following properties:

  • Facilities for input and output
  • External storage
  • Memory that stores data and instructions
  • A processing unit that has processor registers and an arithmetic logic unit
  • A control unit that has a program counter and an instruction register

Because the Von Neumann model uses a shared bus to access instructions and data, it can only reach one of them at a time: the processor cannot fetch its next instruction while it is reading or writing data. This is what’s known as the Von Neumann bottleneck.
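
To see the shared path in action, here is a toy stored-program machine in the Von Neumann style (the tiny instruction set is invented for this sketch): instructions and data live in one memory, and every instruction fetch and every data access goes through the same access step.

    # Toy Von Neumann machine: one memory holds both instructions and data.
    memory = [
        ("LOAD", 8),    # 0: acc = memory[8]
        ("ADD", 9),     # 1: acc = acc + memory[9]
        ("STORE", 10),  # 2: memory[10] = acc
        ("HALT", 0),    # 3: stop
        None, None, None, None,  # 4-7: unused
        5,              # 8: data
        7,              # 9: data
        0,              # 10: result lands here
    ]

    pc, acc = 0, 0  # program counter and accumulator
    while True:
        op, addr = memory[pc]  # instruction fetch: memory access #1
        pc += 1
        if op == "LOAD":
            acc = memory[addr]        # data access: memory access #2
        elif op == "ADD":
            acc = acc + memory[addr]  # data access again
        elif op == "STORE":
            memory[addr] = acc
        else:  # HALT
            break

    print(memory[10])  # 12

Note that the program itself is just data in memory: a STORE aimed at addresses 0 through 3 would rewrite the running program, which is both the power and the peril of the design.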

The Harvard architecture is another stored-program computing model similar in design to the Von Neumann model, but it has a separate bus for instructions and data. That means it doesn’t have the same bottleneck present in the Von Neumann model.
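
Rearranged in the Harvard style, the same toy machine keeps instructions and data in separate memories with separate paths (again an invented sketch, not any specific chip):

    # Harvard arrangement: separate instruction and data memories, so an
    # instruction fetch never competes with a data read or write.
    program = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", 0)]
    data = [5, 7, 0]

    pc, acc = 0, 0
    while True:
        op, addr = program[pc]  # instruction memory
        pc += 1
        if op == "LOAD":
            acc = data[addr]    # data memory, independent of the fetch
        elif op == "ADD":
            acc = acc + data[addr]
        elif op == "STORE":
            data[addr] = acc
        else:  # HALT
            break

    print(data[2])  # 12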

The Harvard model is most commonly found at the specialized ends of the spectrum: high-performance digital signal processors on one side and tiny devices like microcontrollers on the other. The deciding factors often include the cost and power savings the Harvard model offers versus the penalty incurred on the programming side.

What’s next for computational methods?

Almost all areas of science are trying to advance computing within their own discipline. At the moment, the physics folks appear to be ahead of the game with optical computing and quantum computing. Optical computing chips are already at the prototype stage, and there are quantum computer prototypes as well.

A major driver of these efforts is that Moore’s Law is running out of headroom, increasingly constraining what electronic circuits can do. That is a big deal for AI research and the immense computing requirements that AI has. So definitely look for some new computing technologies soon!
