
Amazon cranks up AI competition against Microsoft, Google with new cloud tools



Amazon Web Services (AWS) has jumped into the AI race with a suite of its own proprietary technologies, but it is taking a different approach. Photo: File

Amazon.com Inc’s cloud computing division this week released a suite of technologies aimed at helping other companies develop their own chatbots and image-generation services backed by artificial intelligence.

Microsoft Corp and Alphabet Inc are adding AI chatbots to consumer products like their search engines, but they are also eyeing another huge market – selling the underlying technology to other companies via their cloud operations.

Amazon Web Services (AWS), the world’s biggest cloud computing provider, on Thursday jumped into that race with a suite of its own proprietary AI technologies, but it is taking a different approach.

AWS will offer a service called Bedrock, which lets businesses customise what are called foundation models – the core AI technologies that do things such as respond to queries with human-like text, or generate images from a prompt – with their own data to create a unique model.

ChatGPT creator OpenAI, for example, offers a similar service, letting customers fine-tune the models behind ChatGPT to create a custom chatbot.

The Bedrock service will let customers work with Amazon’s own proprietary foundation models, called Amazon Titan, but it will also include a menu of models from other companies. The first third-party options will come from the start-ups AI21 Labs, Anthropic and Stability AI.

The Bedrock service lets AWS customers test-drive those technologies without having to deal with the underlying data centre servers that power them.
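As a rough illustration only (not drawn from the article): invoking a hosted foundation model through AWS’s Python SDK, boto3, follows the general shape sketched below. The model ID, payload field names and generation parameters here are assumptions about how such a request might look, not details confirmed by Amazon.

```python
import json

# Hypothetical request payload for a text-generation foundation model.
# Field names ("inputText", "textGenerationConfig") and values are
# illustrative assumptions, not a documented Bedrock schema.
payload = {
    "inputText": "Summarise our Q3 support tickets in two sentences.",
    "textGenerationConfig": {
        "maxTokenCount": 256,   # cap on generated tokens (assumed parameter)
        "temperature": 0.5,     # sampling randomness (assumed parameter)
    },
}
body = json.dumps(payload)

# With AWS credentials configured, the actual call would look something like:
# import boto3
# client = boto3.client("bedrock-runtime")
# response = client.invoke_model(
#     modelId="amazon.titan-text-express-v1",  # hypothetical model ID
#     body=body,
# )
print(body)
```

The point of the abstraction Philomin describes below is that the customer only ever sends a request like this; provisioning and scaling the servers behind it is handled by AWS.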

“It’s unneeded complexity from the perspective of the user,” Vasi Philomin, vice president of generative AI at AWS, told Reuters. “Behind the scenes, we can abstract that away.”

Those underlying servers will use a mix of Amazon’s own custom AI chips and processors from Nvidia Corp, the biggest supplier of chips for AI work, whose products have been in tight supply this year.

“We’re able to land tens of thousands, hundreds of thousands of these chips, as we need them,” Dave Brown, vice president of Elastic Compute Cloud at AWS, said of the company’s custom chips. “It is a release valve for some of the supply-chain concerns that I think folks are worried about.”

– REUTERS
