
Customers are expressing frustration with Amazon Web Services over constraints in its AI platform Bedrock, according to a report from The Information.


Despite its investment in Anthropic, AWS reportedly struggles to deliver Anthropic's models reliably through Bedrock. Several users cite arbitrary usage limits and missing features, including prompt caching, functionality that has long been available when accessing Anthropic's models directly through the company's own API. As a result, some startups, such as Lovable and Praxis AI, are switching to Anthropic's own interface.
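For context, prompt caching lets developers mark large, reusable parts of a prompt (such as a long system message) so they can be stored and reused across requests, reducing cost and latency. A minimal sketch of what this looks like against Anthropic's own API is shown below; the model name and prompt text are purely illustrative:

```python
import anthropic

# Assumes the ANTHROPIC_API_KEY environment variable is set.
client = anthropic.Anthropic()

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",  # illustrative model name
    max_tokens=1024,
    system=[
        {
            "type": "text",
            # A large, reusable block of context (documentation, policies, etc.).
            # Caching only applies once the prefix exceeds a minimum token length.
            "text": "You are a support assistant. <long reference material goes here>",
            "cache_control": {"type": "ephemeral"},  # mark this block for caching
        }
    ],
    messages=[{"role": "user", "content": "Summarize the open support tickets."}],
)

print(response.content[0].text)
```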

According to the report, internal AWS discussions have described the platform's current capacity issues as a "disaster." However, AWS spokesperson Kate Vorys publicly rejected this characterization, stating that the usage restrictions are not caused by capacity constraints. Instead, she described the limits as a standard industry measure to ensure "fair access" amid "unprecedented demand."

Cloud fuels AI fuels cloud

Amazon and Anthropic have formed a strategic partnership centered on AI and cloud infrastructure. Amazon has invested up to $8 billion in Anthropic, making it the startup’s largest backer. In return, Anthropic uses AWS as its primary cloud provider and runs its Claude AI models on AWS’s custom chips (Trainium and Inferentia). Amazon integrates these models into its Bedrock AI platform, aiming to attract enterprise and government customers.


Amazon’s core interest lies in boosting AWS cloud revenue by powering Anthropic’s compute-heavy AI workloads and attracting more clients to its ecosystem. It's a circular investment strategy: Amazon invests billions in AI startups like Anthropic, which in turn spend that capital on AWS cloud services. Amazon’s financial backing thus drives demand for its own infrastructure, creating a self-reinforcing cycle.

This circular investment strategy is not unique to Amazon—Microsoft and Google follow similar approaches in the AI space. Microsoft, for example, has invested over $13 billion in OpenAI. In return, OpenAI runs its models on Microsoft’s Azure cloud and integrates them into Microsoft products like Copilot and Azure OpenAI Service. Google has employed the same model with companies like Anthropic and Runway.
