
Amazon’s Inferentia Chip to Handle Some Alexa Voice Assistant Services as Company Moves Away From Nvidia

Amazon said the shift to the Inferentia chip for some of its Alexa work has resulted in better latency at lower cost.


Amazon said that its cloud-based facial recognition service has started to adopt its own Inferentia chips.

Highlights
  • Amazon previously handled that computing using chips from Nvidia
  • The service has come under scrutiny from civil rights groups
  • The Amazon chip is designed to speed up machine learning tasks

Amazon on Thursday said it shifted part of the computing for its Alexa voice assistant to its own custom-designed chips, aiming to make the work faster and cheaper while moving it away from chips supplied by Nvidia.

When users of devices such as Amazon's Echo line of smart speakers ask the voice assistant a question, the query is sent to one of Amazon's data centres for several steps of processing. When Amazon's computers spit out an answer, that reply is in a text format that must be translated into audible speech for the voice assistant.

Amazon previously handled that computing using chips from Nvidia but now the "majority" of it will happen using its own Inferentia computing chip. First announced in 2018, the Amazon chip is custom designed to speed up large volumes of machine learning tasks such as translating text to speech or recognising images.

Cloud computing customers such as Amazon, Microsoft, and Alphabet's Google have become some of the biggest buyers of computing chips, driving booming data centre sales at Intel, Nvidia, and others.

But major technology companies are increasingly ditching traditional silicon providers to design their own chips. Apple on Tuesday introduced its first Mac computers with its own central processors, moving away from Intel chips.

Amazon said the shift to the Inferentia chip for some of its Alexa work has resulted in 25 percent better latency, which is a measure of speed, at a 30 percent lower cost.

Amazon has also said that Rekognition, its cloud-based facial recognition service, has started to adopt its own Inferentia chips. However, the company did not say which chips the facial recognition service had previously used or how much of the work had shifted to its own chips.

The service has come under scrutiny from civil rights groups because of its use by law enforcement. Amazon in June put a one-year moratorium on its use by police after the killing of George Floyd.

© Thomson Reuters 2020

