"Why Iteration is not Innovation"

Watch our recorded WEBINAR!

Google brings artificial intelligence to the masses

It’s no secret that the future of computing is likely to unfold in two places above all else: artificial intelligence (AI) and the cloud. Amazon, one of the tech world’s primary darlings, didn’t really start turning a profit until Amazon Web Services (AWS) began powering a huge portion of the Internet. Jeff Bezos and co. realized what a powerful resource fast, reliable cloud computing would become for 21st-century organizations and made a massive move in that direction. Never one to cede market dominance in the digital world to a rival, Google has now made a game-changing move of its own, one that layers AI on top of its cloud offering.

To that end, Google will soon launch a cloud computing service that provides exclusive access to its new artificial-intelligence chip, which Google’s engineers designed themselves. What’s more, the chip is purpose-built to train neural networks.

CEO Sundar Pichai announced the new chip and service this morning at Google I/O, the company’s annual developer conference.

Neural networks are simply large arrays of computers (or parts of computers; we’ll get to that in a moment) crunching away at enormous amounts of data in the hopes of recognizing patterns and learning how to "think" on their own. For the most part, companies developing their own neural networks build farms of GPUs (graphics processing units) to train them. GPUs were originally designed to render graphics for computationally intensive tasks like video games and video editing, encoding and exporting, but they have since been repurposed for applications like these. Nvidia, a Silicon Valley chipmaker, has owned this market for some time now.
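To make that concrete, here is a minimal sketch of the kind of training loop that gets farmed out to GPU hardware. It uses PyTorch purely as an illustration (the framework, model and data below are stand-ins, not anything Google announced); the point is that the same few lines run on a CPU or a GPU, and the GPU simply churns through the underlying matrix math far faster.

    # Minimal sketch of GPU-backed neural network training (PyTorch, illustrative only).
    import torch
    import torch.nn as nn

    # Use a GPU if one is available; otherwise fall back to the CPU.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # A tiny feed-forward network standing in for a real model.
    model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10)).to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    # Synthetic inputs and labels, purely for illustration.
    inputs = torch.randn(256, 64, device=device)
    labels = torch.randint(0, 10, (256,), device=device)

    for step in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), labels)   # forward pass
        loss.backward()                         # backward pass: the math GPUs accelerate
        optimizer.step()                        # update the network's weights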

According to a Wired article summarizing the announcement, Google is now “providing some serious competition with a chip specifically designed to train neural networks. The TPU 2.0 chip can train them at a rate several times faster than existing processors, cutting times from as much as a day down to several hours, says Jeff Dean, who oversees Google Brain, the company’s central AI lab.”

Google christened the new chip “TPU 2.0,” or the Cloud TPU; it’s version 2 of the custom-built processor Google has been using for the last two years to drive its own AI services, most notably image recognition and machine translation.

The Wired article goes on to compare the newest Google offering to those of its competitors:

Amazon and Microsoft offer GPU processing via their own cloud services, but they don’t offer bespoke AI chips for both training and executing neural networks. But Google could see more competition soon. Several companies, including chip giant Intel and a long list of startups, are now developing dedicated AI chips that could provide alternatives to the Google TPU. “This is the good side of capitalism,” says Chris Nicholson, the CEO and founder of a deep learning startup called Skymind. “Google is trying to do something better than Amazon—and I hope it really is better. That will mean the whole market will start moving faster.”

These moves could have a massive impact on custom software development, and they could completely transform the way businesses use digital tools. The applications of the future will almost certainly include AI integrations to make immediate, actionable sense of your data. And the more data you and your clients/customers create, the more the neural network learns, adapts and improves. But instead of having to outlay huge amounts of capital to build GPU arrays capable of marshaling a neural network’s resources, you can simply rent the capacity from Google for a monthly fee.
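As a rough sketch of what renting that capacity can look like in practice, here is how a TensorFlow training job can be pointed at a Cloud TPU using the tf.distribute API. The TPU name below is a placeholder for one you would provision in Google Cloud, and the exact APIs have evolved since the original announcement, so treat this as illustrative rather than definitive.

    # Rough sketch: training on a rented Cloud TPU with TensorFlow's tf.distribute API.
    import tensorflow as tf

    # "my-cloud-tpu" is a placeholder for a TPU you have provisioned in Google Cloud.
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-cloud-tpu")
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)
    strategy = tf.distribute.TPUStrategy(resolver)

    # Any model built inside the strategy's scope is replicated across the TPU's cores.
    with strategy.scope():
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(128, activation="relu", input_shape=(64,)),
            tf.keras.layers.Dense(10),
        ])
        model.compile(
            optimizer="sgd",
            loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        )

    # model.fit(...) then runs on hardware you rent from Google rather than own.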

Our digital assistants will get better and better. Speech and facial recognition will grow by leaps and bounds. Operational efficiencies will skyrocket. Business intelligence will be smarter and more insightful than ever. The number of applications for such a service is mind-bogglingly large, and soon it could be built into every digital tool you and your company rely on.

Regardless of whether Google ultimately wins this battle for neural-network supremacy, the competition is great for the technological ecosystem. It might be that Amazon or Nvidia figures out how to train neural networks faster and more efficiently through some other process or hardware, but that doesn’t really matter to us as developers or consumers. What matters is that the industry is moving ever faster toward an AI future, and with Google throwing its hat in the ring, the pace of innovation could heat up something fierce indeed.



Jeff Francis

Jeff Francis is a veteran entrepreneur and founder of Dallas-based digital product studio ENO8. Jeff founded ENO8 to empower companies of all sizes to design, develop and deliver innovative, impactful digital products. With more than 18 years working with early-stage startups, Jeff has a passion for creating and growing new businesses from the ground up, and has honed a unique ability to assist companies with aligning their technology product initiatives with real business outcomes.
