The CPU has been the brain of computers for decades. Much like a body, a computer’s parts work in tandem to produce a collective outcome. But over the past 5-10 years, the GPU has steadily climbed the pecking order as more and more tasks benefited from its different architecture and specialized computing focus. Big physics simulations, image recognition… you get the idea. Then GPUs became the hubs for neural networks, machine learning and the first mass-market incarnation of A.I. And as we push Moore’s Law to its physical limits, it was only a matter of time before niche chips designed for specific tasks cropped up. Enter the IPU.
…and who’s building them?
It’s not really news that big technology companies like Google are designing and developing their own chip architectures for A.I. applications. Many futurists and technology prognosticators would agree it’s the future of computing (both from an output perspective and from a market/$$$ opportunity perspective).
But, the juicy prospect of purpose-built A.I. hardware has V.C. money flying around and no shortage of startups looking to put it to work. Graphcore, for one, dropped a 2,000 teraflop A.I. supercomputer that’s about the size of a gaming P.C. And they call their proprietary chip architecture IPU, for ‘Intelligence’ Processing Unit.
Great name. Great branding. Also, it makes great sense for what they’re actually building.
GPUs do so much more than graphics processing nowadays, but graphics is what they’re designed to do at heart. IPUs as a class of chip really are purpose-built with A.I. in mind. It’s not only a strong branding play; I would wager it’s legitimately a more descriptive term for what’s being built.
Graphcore recently took on more than $50 million from Sequoia, following a $30 million capital raise as recently as July. And it makes sense! With physical space inside computers no longer the limiting factor on a machine’s capabilities, why not pack multiple purpose-built chips onto one motherboard? If you can get them working in concert, you could revolutionize modern computing (at least at the cloud/enterprise scale). So of course it follows that venture capitalists want to put money into fledgling niche chip-makers that show real promise for a viable hardware product.
IPUs could be just the start. Other companies are talking about DPUs (dataflow processing units), NPUs (neural processing units) and EPUs (emotion processing units). And why not? The computers of the future will require some combination of these skills to remain relevant — why not purpose-build the hardware tasked with providing them?