Science & Technology

Google’s New Custom Chip May Not Live Up to the Hype

Google last week introduced the Tensor Processing Unit, a custom application-specific integrated circuit, at Google I/O.

Built for machine learning applications, the TPU has been running in Google’s data centers for more than a year.

Google’s AlphaGo software, which thrashed an 18-time international Go champion in a match earlier this year, ran on servers using TPUs.

These server racks house the TPUs used in the AlphaGo matches with Lee Sedol.

TPU is tailored for TensorFlow, Google’s software library for machine intelligence, which it turned over to the open source community last year.
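To give a sense of what the library looks like in practice, here is a minimal sketch of a TensorFlow model of the kind a TPU is built to accelerate. It uses today’s tf.keras API rather than the 2016-era graph-and-session interface, and the data, layer sizes and training settings are placeholders chosen only for illustration.

```python
# Minimal, illustrative TensorFlow example: a small dense classifier.
# The dataset is random placeholder data; shapes and hyperparameters
# are arbitrary and only meant to show the shape of TensorFlow code.
import numpy as np
import tensorflow as tf

# Placeholder training data: 1,000 samples, 64 features, 10 classes.
x_train = np.random.rand(1000, 64).astype("float32")
y_train = np.random.randint(0, 10, size=(1000,))

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(64,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, batch_size=32)
```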

Moore Still Rules

For machine learning, TPUs provide an order of magnitude better-optimized performance per watt, Google said. That is comparable to fast-forwarding technology about seven years, or three generations of Moore’s Law.

That claim is misleading, according to Kevin Krewell, a principal analyst at Tirias Research.

“It only works on 8-bit math,” he told TechNewsWorld. “It’s basically like a Z80 microprocessor in that regard. All that talk about it being three generations ahead refers to processors a year ago, so they’re comparing it to 28-nm processors.”
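The “8-bit math” Krewell refers to is the practice of representing network weights as 8-bit integers instead of 32-bit floats, trading a little precision for much cheaper arithmetic. The sketch below shows generic linear quantization to make the idea concrete; it is not Google’s actual TPU scheme, and the weights are random placeholder values.

```python
# Illustrative sketch of 8-bit quantization: float32 values are mapped
# onto signed 8-bit integers via a scale factor. Generic technique only,
# not a description of the TPU's internal number format.
import numpy as np

def quantize_int8(weights):
    """Map float32 values onto the signed 8-bit range [-127, 127]."""
    scale = np.max(np.abs(weights)) / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float32 values from the 8-bit representation."""
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)  # placeholder weights
q, scale = quantize_int8(w)
print("max quantization error:", np.max(np.abs(w - dequantize(q, scale))))
```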

Taiwan Semiconductor Manufacturing reportedly has been working on a 10-nanometer FinFET processor for Apple.

“By stripping out most capabilities and using only the essential math, Google has a chip that acts as if it were a more complex processor from a couple of generations ahead,” Krewell said.

Moore’s law focuses on transistor density and “tends to be tied to factors that are targeted at calculation speed,” pointed out Rob Enderle, principal analyst at the Enderle Group. The TPU “is more focused on calculation efficiency, so it likely won’t push transistor density. I don’t expect it to have any real impact on Moore’s law.”

Still, the board design “has a very big heat sink, so it’s a relatively large processor. If I’m Google and I’m building this custom chip, I’m going to build the biggest one I can put into the power envelope,” Krewell noted.

Potential Impact

“Clearly, hyperscale cloud operators are gradually becoming more vertically integrated, so they move more into designing their own equipment,” said John Dinsdale, chief analyst at Synergy Research Group.

That would “help them strengthen their game,” he told TechNewsWorld.

The processor could make Google “a much stronger player with AI-based products, but ‘could’ and ‘will’ are very different words, and Google has been more the company of ‘could but didn’t’ of late,” Enderle told TechNewsWorld.

The TPU will let Google scale up its query engine considerably, providing for higher-density servers that can concurrently handle a higher volume of queries, he said. However, Google’s efforts “tend to be underresourced, so it’s unlikely to meet its potential unless that practice changes.”

There Isn’t Only One

The TPU isn’t the first processor designed for machine learning.

Intel’s Xeon Phi processor product line is part of that company’s Scalable System Framework, which aims at bringing machine learning and high-performance computing into the exascale era.

Intel’s goal is to create systems that converge HPC, big data, machine learning and visualization workloads within a common framework that can run in either the cloud or data centers, the latter ranging from smaller workgroup clusters to large supercomputers.

A Case of Overkill?

While the TPU “may have a big effect and impact in data-intensive research, most business problems and tasks can be solved with simpler machine learning approaches,” Francisco Martin, CEO of BigML, pointed out. “Few companies have the amount of data that Google manages.”

Historically, custom chips for machine learning algorithms “never turned out to be very successful,” he told TechNewsWorld.

“First, custom architectures require custom development, which makes adoption difficult,” Martin noted. “Second, by Moore’s law, standard chips are going to be more powerful every two years.”

TPU is “tailored to very specific machine learning applications based on Google’s TensorFlow,” he said. Like other deep learning frameworks, it “requires tons of fine-tuning to be useful.” That said, Amazon and Microsoft “will probably need to offer something similar to compete for customers with advanced research projects.”