The CPU has been the brain of computers for decades. Much like a body, a computer’s parts work in tandem to produce a collective outcome. But over the past 5-10 years, the GPU has steadily climbed the pecking order as more tasks benefited from its different architecture and computing focus: big physics problems, image recognition computations… you get the idea. Then GPUs became the hubs for neural networks, machine learning and the first mass-market incarnation of A.I. And as we push Moore’s Law to its physical limits, it was only a matter of time before niche chips designed for specific tasks cropped up. Enter the IPU.

What’s an IPU?

…and who’s building them?

It’s not really news that big technology companies like Google are designing and developing their own chip architectures for A.I. applications. Many futurists and technology prognosticators would agree it’s the future of computing (from an output perspective as well as a market/$$$ opportunity perspective).

But, the juicy prospect of purpose-built A.I. hardware has V.C. money flying around and no shortage of startups looking to put it to work. Graphcore, for one, dropped a 2,000 teraflop A.I. supercomputer that’s about the size of a gaming P.C. And they call their proprietary chip architecture IPU, for ‘Intelligence’ Processing Unit.

Great name. Great branding. Also, it makes great sense for what they’re actually building.

The future of computing?

GPUs do so much more than graphics processing nowadays, but graphics is what they were designed for at heart. IPUs as a class of chip really are purpose-built with A.I. in mind. It’s not only a strong branding play; I would wager it’s legitimately a more descriptive term for what’s being built.

Graphcore recently took on more than $50 million from Sequoia, following a $30 million capital raise as recently as July. And it makes sense! Now that physical space inside a computer is no longer the limiting factor on its capabilities, why not pack multiple purpose-built chips onto one motherboard? If you can get them working in concert, you could revolutionize modern computing (at least at the cloud/enterprise scale). So of course it follows that venture capitalists want to put money into fledgling niche chip-makers that show real promise of a viable hardware product.

IPUs could be just the start. Other companies are talking about DPUs (dataflow processing units), NPUs (neural processing units) and EPUs (emotion processing units). And why not? The computers of the future will require some combination of these skills to remain relevant — why not purpose-build the hardware tasked with providing them?



Rishi Khanna

Rishi Khanna is a serial entrepreneur and high-growth CEO. He works closely with clients and internal leaders to think 10X, enabling business growth and improving operating efficiencies and profits by leveraging emerging technologies and digital transformation strategy. Avid about sharing knowledge, Rishi has written for and been featured in Inc. Magazine, Entrepreneur Magazine, USA Today, Dallas Business Journal, Dallas Morning News, IndUS, and various other publications. He likes to use his time to guide, mentor and assist others in following their passion and purpose, in hopes of being a catalyst for innovation.
