"Why Iteration is not Innovation"

Watch our recorded WEBINAR!

What is quantum computing and what could it mean?

We spend a lot of time here at ENO8 talking about the newest technologies poised to impact humanity in the coming months and years. From machine learning and neural networks to AI and next-generation software, we concern ourselves with the new and next. As a custom software developer that specializes in innovation integration, we have to (and, luckily for us, we love to). But one of the technologies we’re most excited about seems to come straight from the pages of science fiction, yet is, in reality, one of the most promising things being worked on in the world today:

Quantum computing.

In traditional computing, as most readers are likely aware, the transistors on a microchip have two possible states based on the flow of electrical current: binary, if you will. Either the switch is open/off, “0”, or it’s closed/on, “1”. The smallest unit of data is the bit, and those binary bits encode and dictate all of the computer’s operations.
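For the programmers in the audience, here’s a toy illustration in plain Python (nothing quantum about it, purely for intuition) of how everything a classical machine does bottoms out in those 1s and 0s:

    # A classical byte is just eight binary switches.
    n = 42
    print(format(n, "08b"))  # -> 00101010

    # Logic gates operate directly on those bits,
    # e.g. a bitwise AND of two 4-bit patterns:
    print(format(0b1100 & 0b1010, "04b"))  # -> 1000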

Given the physical nature of those bits on the microchip(s), there’s a physical limit to the number of bits that can fit in any given area. Transistor density has doubled roughly every two years for basically the last 50, a phenomenon known as Moore’s Law. But the Law may be showing signs of slowing as materials science runs up against its known physical limits.
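To put that doubling in perspective, a quick back-of-the-envelope calculation in Python shows what the compounding over that 50-year window works out to:

    # Doubling every 2 years for 50 years means 25 doublings.
    years = 50
    doublings = years // 2
    growth = 2 ** doublings
    print(f"{doublings} doublings -> roughly {growth:,}x the density")
    # -> 25 doublings -> roughly 33,554,432x the density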

Enter the realm of quantum computing. Quantum computing relies on a fundamental branch of physics, quantum mechanics, which describes the nature and behavior of very small things in our universe. Things get pretty weird at the infinitesimal level: particles can be in two states at once (superposition), and they can be entangled across great distances. By leveraging these strange properties, quantum computers might be able to tackle problems far more complex than classical computers can handle, and in a much smaller physical architecture.
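To make superposition a bit more concrete, here’s a minimal sketch that simulates a single qubit’s state vector with NumPy. This is a classical simulation for intuition only; real quantum hardware works very differently:

    import numpy as np

    # A qubit's state is a 2-element complex vector a|0> + b|1>,
    # normalized so that |a|^2 + |b|^2 = 1.
    ket0 = np.array([1, 0], dtype=complex)

    # The Hadamard gate puts |0> into an equal superposition.
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    psi = H @ ket0

    # Measurement probabilities are the squared amplitudes.
    print(np.abs(psi) ** 2)  # -> [0.5 0.5]: a 50/50 chance of reading 0 or 1

The “weirdness” shows up in that last line: until you measure it, the qubit genuinely occupies both answers at once.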

The theoretical basis for quantum computers was laid out decades ago, but a working realization of this nascent technology might be within our immediate grasp. To understand what’s at stake, here is some background from an article published in the prestigious science journal Nature:

The conceptual foundations of quantum computing were laid during the 1970s and early 1980s — most notably by the late US physicist Richard Feynman, whose lecture on the subject, published in 1982, is widely credited with launching the field. The basic insight is that conventional computers are ‘either–or’ machines, meaning that the tiny silicon circuit that encodes a given bit of information acts like a switch that is either open or closed. This means that it can represent choices such as ‘true’ or ‘false’, or the 1s and 0s of binary arithmetic. But in the quantum realm, ‘either–or’ gives way to ‘both–and’: if binary 1s are represented by, say, electrons that are spinning clockwise, and 0s by electrons spinning counterclockwise, then the subatomic laws that govern those particles make it possible for a given quantum bit to be both 1 and 0 at the same time.

By extension, the set of qubits [quantum bits] comprising the memory of a quantum computer could exist in every possible combination of 1s and 0s at once. Where a classical computer has to try each combination in turn, a quantum computer could process all those combinations simultaneously — in effect, carrying out calculations on every possible set of input data in parallel. And because the number of combinations increases exponentially with the size of the memory, the quantum computer has the potential to be exponentially faster than its classical counterpart.

Scientific American offers further context on what qubits entail in comparison to classical computers:

The more-than-binary ability to occupy multiple states at once allows qubits to perform many calculations simultaneously, vastly magnifying their computing power. That power grows exponentially with the number of qubits. So at somewhere around 49 or 50 qubits, quantum computers reach the equivalent of about 10 quadrillion bits and become capable of calculations no classical computer could ever match, says John Preskill, a theoretical physicist at California Institute of Technology. “Whether they will be doing useful things is a different question,” he says.
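A quick calculation shows why roughly 50 qubits is where classical machines give out. Simulating n qubits classically means storing 2^n complex amplitudes; the sketch below assumes the standard 16 bytes per double-precision complex number:

    # Memory needed to hold the full state vector of n qubits,
    # assuming 16 bytes per complex amplitude (complex128).
    def state_vector_bytes(n_qubits):
        return (2 ** n_qubits) * 16

    for n in (10, 30, 50):
        print(f"{n} qubits: {state_vector_bytes(n):,} bytes")
    # 10 qubits: 16,384 bytes (16 KiB)
    # 30 qubits: 17,179,869,184 bytes (16 GiB)
    # 50 qubits: 18,014,398,509,481,984 bytes (~18 petabytes)

No realistic amount of classical memory keeps pace with that exponential curve, which is exactly the point Preskill is making.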

The technology that allows qubits to exist together in these kinds of architectures is still experimental. The ability to create qubits doesn’t guarantee we can harness them the way we harness classical computers. Qubit-based systems are also incredibly sensitive to outside influence: because they depend on fragile superposition and entanglement, even the slightest vibration or electrical disturbance can interrupt a calculation and render the entire chip useless (a problem known as decoherence). Scientists are getting better at correcting for those disruptions, but it shows how much difficulty these systems still present to researchers at the cutting edge. That said, the immense potential of these computers makes them worth the huge investment of time and resources, because they very well could change the world.
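Error correction is the main weapon against that fragility. Real quantum error correction is far more involved, but a classical repetition code conveys the core idea in a few lines (a simplified sketch, not how actual quantum hardware does it):

    import random

    # Encode one logical bit as three physical copies; a majority
    # vote recovers it even if noise flips any single copy.
    def encode(bit):
        return [bit, bit, bit]

    def noisy_channel(bits, flip_prob=0.1):
        return [b ^ (random.random() < flip_prob) for b in bits]

    def decode(bits):
        return int(sum(bits) >= 2)  # majority vote

    received = noisy_channel(encode(1))
    print(received, "->", decode(received))

Quantum codes can’t simply copy qubits (the no-cloning theorem forbids it), so they spread one logical qubit across many entangled physical qubits instead, which is part of why the engineering is so hard.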




