
Google has a speedy new AI chip it doesn’t really want to talk about

Google’s once-secret chip raises more questions than it answers.

Google yesterday confirmed rumors that it has been working on a custom chip designed to speed up computing related to its artificial intelligence efforts.

The result, it said at its I/O developer conference, is a chip it calls a Tensor Processing Unit. It’s designed to work with TensorFlow, an open source software library for developing AI applications.

The TPU chips, Google says, are designed to be built into its existing computing infrastructure and are already in use boosting the performance of services like Street View and voice recognition. They also played a part in Google’s AlphaGo software that defeated the human champion at the board game Go.

Naturally, engineers and chip experts around the world have a lot of questions about the new chip. We asked Google for answers but were told there are “more details coming later this year.” For now, we can only pose the questions and try to answer them ourselves.

1. Is the TPU pre-trained?

Artificial intelligence is closely linked to a discipline called machine learning, which is exactly what it sounds like: It takes millions of examples to train a computer to recognize patterns. “For example, if you want a computer to recognize pictures, you have to show it literally millions of pictures, and a human has to check the answers,” said Pat Moorhead, head of research firm Moor Insights and Strategy. Once the training is done, the AI system makes decisions based on what it has “learned.” Examples of chips built to run such pre-trained models include IBM’s TrueNorth and the Fathom, developed by the startup Movidius.
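The train-then-infer split Moorhead describes can be sketched in a few lines of Python. This is a toy illustration of the general workflow, not Google’s code: real systems train on millions of examples, while this one averages a handful of labeled 2-D points and then classifies a new point.

```python
# Toy illustration of the machine-learning workflow: a "training" phase
# that digests labeled examples, and a cheap "inference" phase that
# applies what was learned.

def train(examples):
    """Average the feature vectors seen for each label (a nearest-centroid model)."""
    sums, counts = {}, {}
    for features, label in examples:
        s = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            s[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in s] for label, s in sums.items()}

def predict(model, features):
    """Label a new point by its closest learned centroid."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(model, key=lambda label: dist(model[label]))

examples = [([1.0, 1.0], "cat"), ([1.2, 0.9], "cat"),
            ([5.0, 5.0], "dog"), ([4.8, 5.2], "dog")]
model = train(examples)            # the slow, data-hungry step
print(predict(model, [1.1, 1.0]))  # inference: fast enough for a "pre-trained" chip → cat
```

The open question for the TPU is which half of this workflow it accelerates: the data-hungry training step, the lightweight inference step, or both.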

2. Can the TPU be re-programmed if AI algorithms change?

Google referred to the chip as an ASIC (pronounced A-sick), which in the nomenclature of the chip industry stands for Application-Specific Integrated Circuit. ASICs are like the Kentucky Fried Chicken of chips: They are designed to do one thing, and to do that one thing really well. Those functions are hard-coded directly into the circuitry of the chip. That means that if the required functions change, you have to redesign the chip itself and manufacture new ones, which can take months. Networking giant Cisco Systems uses ASICs in its routers, and ASICs can often be found in smartphones handling video and audio functions.

An ASIC is also one step more specialized on the taxonomy of chips than an FPGA, or field-programmable gate array, which is essentially a chip that can be reprogrammed to carry out specialized tasks. (Last year, Intel spent nearly $17 billion to buy an FPGA company, Altera.) Logically, the algorithms associated with AI applications at Google will change over time. Microsoft has been using FPGAs to enhance the AI capabilities of its Bing search engine. So why not use an FPGA?
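A loose software analogy may help with the ASIC/FPGA distinction above (an analogy only, chosen for illustration, not how either chip actually works): an ASIC-like function has its behavior baked in at build time, while an FPGA-like one is assembled from a configuration that can be swapped out without rebuilding anything.

```python
# "ASIC": one fixed computation, hard-coded. Changing it means
# shipping new "hardware."
def asic(x):
    return x * 3 + 1

# "FPGA": generic machinery that is configured by loading a new
# "bitstream" (here, a list of named operations).
def fpga(bitstream):
    ops = {"mul": lambda x, n: x * n, "add": lambda x, n: x + n}
    def run(x):
        for op, n in bitstream:
            x = ops[op](x, n)
        return x
    return run

v1 = fpga([("mul", 3), ("add", 1)])  # matches the ASIC's behavior today
v2 = fpga([("mul", 5), ("add", 2)])  # "re-flashed" when the algorithm changes
print(asic(2), v1(2), v2(2))         # → 7 7 12
```

The trade-off mirrors the hardware one: the hard-coded version leaves nothing to interpret at run time, which is why ASICs are typically faster and more power-efficient than FPGAs doing the same job.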

3. Will TPUs work only with TensorFlow?

TensorFlow is one of several AI software libraries. Is this chip open only to one?

4. Could several TPUs be connected in a system to work together?

This is common for other chips. Could several TPUs work on especially complicated AI problems together, or even teamed up with other chips?

5. In the server rack, why does the TPU sit in a hard drive slot?

Google says the TPU fits into a hard drive slot in its data center racks. Putting the chip in a drive bay, rather than closer to a server’s main computing engine, typically an Intel Xeon processor, seems to place the TPU away from where the computing action is.

6. Where is the memory?

There’s probably a lot obscured by the large metal heat sink that is the TPU’s most prominent visual feature, there to conduct heat away from the chip itself. Given the apparent size of the component Google has displayed, there doesn’t seem to be room for much memory, Moorhead said. “If you’re doing any training, you need a lot of memory,” which sends us back to question No. 1.

7. Where is the chip being built?

Google isn’t a chip company, and unless it’s been hiding one, it doesn’t have a chip factory — typically called a fab — where this chip could have been built. Google has the resources to design the chip, but it would have farmed out the manufacturing to a foundry that builds chips under contract, probably Taiwan Semiconductor Manufacturing or GlobalFoundries. So which is it?

The Google rep remained mum.

This article originally appeared on Recode.net.
