Amazon has unveiled a suite of new AI tools, including Titan, its family of large language models (LLMs), and Bedrock, a cloud service for building generative AI applications. The move signals Amazon’s entry into the AI race, as rivals Microsoft and Google have already integrated AI chatbots into their search engines and cloud operations.
LLMs, such as OpenAI’s GPT-4 and ChatGPT, have become a hot topic in the tech space, prompting heavy investments in AI labs and disrupting the business operations of major companies. Amazon Web Services (AWS), Amazon’s cloud computing arm, has responded by designing a new AI technology that allows companies to develop their own chatbots and image generation services, similar to OpenAI’s DALL-E.
A core service provided by AWS, called Amazon Bedrock, lets companies customize a foundation model with their own data. The foundation model serves as a starting point that companies can fine-tune, bringing in their own data to tailor the model to their needs.
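To make the customization flow concrete, here is a minimal sketch of how a request to a Titan text model through Bedrock might be assembled. The parameter names (`inputText`, `textGenerationConfig`, `maxTokenCount`, `temperature`) follow the Titan request format as publicly documented, but since Bedrock was in limited preview at the time of writing, treat them as assumptions rather than a guaranteed API:

```python
import json

def build_titan_request(prompt: str, max_tokens: int = 256) -> str:
    """Build the JSON body for a hypothetical Bedrock InvokeModel call
    against a Titan text model (parameter names are assumptions)."""
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": 0.5,
        },
    })

# The resulting body would be passed to the Bedrock runtime client,
# e.g. boto3's invoke_model, along with a Titan model identifier.
request_body = build_titan_request("Summarize our Q3 sales report.")
```

The key design point Bedrock advertises is that this same request shape works whether the model is the stock Titan foundation model or one fine-tuned on a company's own data.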
Users can customize the Amazon Titan models with their own data, but that data will not be used to train the underlying Titan models, so it cannot benefit other customers, including rivals. Amazon has not disclosed the Titan models’ size or the data used to train them.
Bedrock is currently in limited preview, and Amazon has not disclosed the cost of the service, though customers can join a waiting list to try it. Amazon says a key part of the Bedrock offering is letting AWS customers test-drive these new AI technologies without having to manage servers in their own data centres. The servers underlying the Bedrock service will use a combination of Amazon’s custom AI chips (AWS Trainium and AWS Inferentia) and GPUs from Nvidia.
The introduction of Bedrock and Titan will allow companies to develop their own AI applications without heavy investment in AI labs of their own. With this launch, Amazon has firmly joined the generative AI race.