AWS Bedrock
AWS managed service providing access to foundation models from multiple providers (Anthropic, Meta, Mistral) via API, without managing ML infrastructure.
seed#aws#bedrock#llm#ai#foundation-models#serverless
What it is
Amazon Bedrock is a serverless service providing access to foundation models from multiple providers through a unified API. There is no infrastructure to manage — you call the API and pay per token consumed.
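A minimal invocation sketch with boto3, assuming AWS credentials are configured and the Claude 3.5 Sonnet model is enabled in the account; the model ID and region are illustrative, and the request body follows the Anthropic Messages format that Bedrock expects for Claude models:

```python
import json


def build_claude_request(prompt: str, max_tokens: int = 512) -> dict:
    """Build the Anthropic Messages API body Bedrock expects for Claude models."""
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": prompt}]}
        ],
    }


def invoke_claude(prompt: str, region: str = "us-east-1") -> str:
    """Call Bedrock's InvokeModel API (requires AWS credentials and model access)."""
    import boto3  # deferred so the payload builder above works offline

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.invoke_model(
        modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example model ID
        body=json.dumps(build_claude_request(prompt)),
    )
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]
```

Note the two clients Bedrock exposes: `bedrock-runtime` for inference calls like this one, and `bedrock` for control-plane operations (listing models, fine-tuning jobs).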
Available models
| Provider | Models |
|---|---|
| Anthropic | Claude 3.5 Sonnet, Claude 3 Opus/Haiku |
| Meta | Llama 3.1, Llama 3.2 |
| Mistral | Mistral Large, Mixtral |
| Amazon | Titan Text, Titan Embeddings |
| Cohere | Command R, Embed |
| Stability AI | Stable Diffusion |
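Availability varies by region and account, so the catalog is best queried at runtime. A sketch using the control-plane `ListFoundationModels` API, assuming credentials are configured; the pure filter is separated out so it works without an AWS call:

```python
def filter_text_models(summaries: list[dict]) -> list[str]:
    """Keep only model IDs whose output modality is text (vs. embeddings/images)."""
    return [
        m["modelId"]
        for m in summaries
        if "TEXT" in m.get("outputModalities", [])
    ]


def available_text_models(provider: str = "Anthropic",
                          region: str = "us-east-1") -> list[str]:
    """List text-generation models for one provider (requires AWS credentials)."""
    import boto3  # deferred so the pure filter above works offline

    bedrock = boto3.client("bedrock", region_name=region)  # control-plane client
    resp = bedrock.list_foundation_models(byProvider=provider)
    return filter_text_models(resp["modelSummaries"])
```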
Features
| Feature | Function | Integration |
|---|---|---|
| Knowledge Bases | Managed RAG | S3 as source, OpenSearch as vector store |
| Agents | AI agents with tools and memory | Lambda functions as tools |
| Guardrails | Content and PII filters | Applicable to any model |
| Fine-tuning | Model customization | Own data in S3 |
| Model evaluation | Compare models on your data | Automatic + human metrics |
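Knowledge Bases can be queried end to end (retrieval plus generation) through the `RetrieveAndGenerate` API on the `bedrock-agent-runtime` client. A hedged sketch, assuming a knowledge base already exists; `kb_id` and `model_arn` are placeholders you would supply:

```python
def build_rag_config(knowledge_base_id: str, model_arn: str) -> dict:
    """Configuration shape expected by the RetrieveAndGenerate API."""
    return {
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": knowledge_base_id,
            "modelArn": model_arn,
        },
    }


def ask_knowledge_base(question: str, kb_id: str, model_arn: str,
                       region: str = "us-east-1") -> str:
    """Managed RAG query: retrieve from the KB, then generate (needs credentials)."""
    import boto3  # deferred so the config builder above works offline

    runtime = boto3.client("bedrock-agent-runtime", region_name=region)
    resp = runtime.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration=build_rag_config(kb_id, model_arn),
    )
    return resp["output"]["text"]
```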
Agent integration
Bedrock Agents lets you create AI agents that call tools (Lambda functions) and query knowledge bases, with orchestration, session state, and memory fully managed by the service.
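Calling a deployed agent goes through the `InvokeAgent` API, which returns a streaming event sequence rather than a single response. A sketch assuming an agent and alias already exist (the IDs are placeholders); the chunk-joining helper is pure so it can be exercised without AWS:

```python
def join_agent_chunks(completion) -> str:
    """Concatenate the text chunks from an InvokeAgent event stream."""
    parts = []
    for event in completion:
        if "chunk" in event:  # other event types (e.g. traces) are skipped
            parts.append(event["chunk"]["bytes"].decode("utf-8"))
    return "".join(parts)


def ask_agent(prompt: str, agent_id: str, alias_id: str, session_id: str,
              region: str = "us-east-1") -> str:
    """Invoke a Bedrock Agent and collect its streamed reply (needs credentials)."""
    import boto3  # deferred so the chunk joiner above works offline

    runtime = boto3.client("bedrock-agent-runtime", region_name=region)
    resp = runtime.invoke_agent(
        agentId=agent_id,
        agentAliasId=alias_id,
        sessionId=session_id,  # reuse the same ID to keep conversation memory
        inputText=prompt,
    )
    return join_agent_chunks(resp["completion"])
```

Reusing the same `sessionId` across calls is what gives the agent its conversational memory.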
Why it matters
Bedrock democratizes access to foundation models without the complexity of managing GPU infrastructure. For teams already operating on AWS, it is the most direct way to integrate generative AI into existing applications, with the security and compliance of the AWS ecosystem.
References
- Bedrock Documentation — Official documentation.
- Bedrock Supported Models — AWS, 2024. Models available in Bedrock.
- Bedrock Pricing — AWS, 2024. Per-token pricing model.