The revelation of Amazon’s massive commitment to one AI provider signifies a deviation from the AWS (Amazon Web Services) differential advantage of supporting numerous AI models. This development came to light during the recent DTW telecoms trade show, where the spotlight was on hyperscaler-enabled AI. Given the exceptional computing power AI demands, and its complementary relationship with operations already running on the public cloud, it is clear why operators are ceding yet another slice of their operations to one of the large US tech players that lead this sector.

In this space, we find Google, owner of Bard. It is commonly assumed that Google guides its cloud customers towards Bard for their AI needs. Similarly, Microsoft, the prominent investor in OpenAI, would likely steer its Azure customers down the OpenAI path. AWS, by contrast, has maintained a position of neutrality, supporting whichever platform its customers choose.

However, with the announcement of Amazon’s intention to invest up to $4 billion in Anthropic, owner of the Claude LLM (large language model) that competes directly with similar models from OpenAI and Google, AWS’s position appears to be evolving. The announcement highlights Anthropic’s resulting commitment to AWS, though it is hard not to wonder about the reciprocal commitment.

Citing the Anthropic team’s competence and its superior foundation models, Amazon CEO Andy Jassy expressed enthusiasm for the collaboration. He expects numerous customer experiences to improve tremendously, in both the short and long term, through this deeper partnership.

Anthropic’s CEO, Dario Amodei, also looks forward to the collaboration with AWS, expressing excitement about using AWS’s Trainium chips to develop future foundation models. Since Claude’s support for Amazon Bedrock was announced in April, uptake of the model by AWS customers has been significant.

Before Amazon’s announcement, Anthropic was estimated to be worth around $5 billion. Amazon’s initial investment is reported to be a comparatively modest $1.25 billion, giving the company only a minority stake. This raises the question: why announce the $4 billion figure if there is no intention to realize it fully? Doing so would seemingly make Amazon the majority owner, creating an inherent bias favoring Claude above other LLMs.

While several supportive quotes were received from AWS customers, none came from the telecoms industry. This is noteworthy because Light Reading confirms that these LLMs are not currently being trained on telecoms data. Even so, to partake in the generative AI revolution, operators will increasingly need to rely on one or more of the large public cloud providers, and their choice of provider will dictate which LLM they use, further aligning with the global trend.
