This article was published on September 17, 2020

Quantum and classical computers handle time differently. What does that mean for AI?

As humans, we take time for granted. We’re born with an innate sense of the passage of events because it’s essential to our survival. But AI suffers from no such congenital condition. Robots do not understand the concept of time.

State-of-the-art AI systems only understand time as an implicit construct (we program them to read and output clock time) or as an explicit mathematical representation (we use the time certain calculations take to inform their model of the passage of events). But an AI has no way of grasping the concept of time itself as we do.
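To see how thin that representation is, consider a minimal sketch (the event fields and function name here are hypothetical, not drawn from any particular system) of how a typical machine learning pipeline treats a timestamp: it gets flattened into just another number in a feature vector, with no privileged status over temperature or price.

```python
from datetime import datetime, timezone

def featurize_event(event: dict) -> list[float]:
    """Flatten an event into a feature vector.

    The timestamp is reduced to a Unix-epoch float -- to the model,
    indistinguishable from any other scalar feature.
    """
    ts = event["timestamp"].replace(tzinfo=timezone.utc).timestamp()
    return [ts, event["temperature"], event["price"]]

event = {
    "timestamp": datetime(2020, 9, 17, 12, 0),
    "temperature": 21.5,
    "price": 99.0,
}
print(featurize_event(event))  # [1600344000.0, 21.5, 99.0]
```

Nothing in that vector tells the system that the first number flows, that it moves in only one direction, or that it orders causes before effects.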

Time doesn’t exist in our classical reality in any physical, tangible form. We can check our watch, look at the sun, or try to remember how long it’s been since we last ate, but those are all just measurements. The actual passage of time, in the physics sense, is far less settled.

In fact, researchers have shown that time’s arrow – a bedrock concept in the classical view of time – doesn’t carry over to quantum computers. Classical models of physical processes exhibit a property called causal asymmetry. Basically, if you throw a bunch of confetti in the air and take a picture when each piece is at its apex, it’ll be easier for a classical computer to determine what happens next (where the confetti is going) than what happened before (the paths the confetti would trace running backwards through time).

Quantum computers can perform both calculations with equal ease, indicating they do not suffer from causal asymmetry. Time’s arrow appears to be relevant only to classical systems – which the human mind, on the surface, seems to be, though some researchers speculate our brains may ultimately rely on quantum effects.
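To make the classical half of that claim concrete, here’s a minimal sketch (a toy three-state Markov chain, not the confetti scenario or the model from the research itself) of what a statistical arrow of time looks like: the process’s reverse-time transition probabilities, recovered via Bayes’ rule, differ from its forward ones, so modeling the past is a genuinely different problem from modeling the future.

```python
import numpy as np

# A classical stochastic process biased to cycle 0 -> 1 -> 2 -> 0.
# Forward transitions: P[i, j] = Pr(next = j | current = i).
P = np.array([
    [0.1, 0.8, 0.1],
    [0.1, 0.1, 0.8],
    [0.8, 0.1, 0.1],
])

# Stationary distribution pi (solves pi @ P = pi, entries summing to 1).
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()

# Reverse-time transitions via Bayes' rule:
# P_rev[j, i] = Pr(previous = i | current = j) = pi[i] * P[i, j] / pi[j]
P_rev = (pi[:, None] * P).T / pi[:, None]

print("forward:\n", P.round(3))
print("reverse:\n", P_rev.round(3))
# The matrices differ: run backwards, the chain cycles 0 -> 2 -> 1 -> 0.
```

A classical model that predicts this chain forward can’t simply be run in reverse; the research above suggests quantum models escape that asymmetry and handle both directions with equal ease.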

Things get most interesting when you add artificial intelligence to the mix. As mentioned previously, AI has neither a classical nor a quantum understanding of time: time is simply irrelevant to a machine.

But experts such as Gary Marcus and Ernest Davis believe an understanding of time is essential to the future of AI, especially as it relates to “human-level” artificial general intelligence (AGI). The duo penned an op-ed for The New York Times in which they stated:

In particular, we need to stop building computer systems that merely get better and better at detecting statistical patterns in data sets — often using an approach known as deep learning — and start building computer systems that from the moment of their assembly innately grasp three basic concepts: time, space and causality.

While the statement is intended as a sweeping indictment of relying on bare-bones deep learning systems and brute force to achieve AGI, it also serves as a litmus test for where the computer science community stands on AI.

Currently, we’re building classical AI systems in the hope that they’ll one day be robust enough to mimic the human mind. This is a technology endeavor, meaning computer experts are continuously pushing the limits of what modern hardware and software can do.

The problem with this approach is that it’s creating a copy of a copy. Quantum physics tells us that, at the very least, our classical understanding of time likely differs from whatever the ultimate universal reality turns out to be.

How close can robots ever come to imitating humans if they, like us, only think in classical terms? Perhaps a better question is: what happens when AI learns to think in quantum terms while we humans are still stuck with our classical interpretation of reality?
