One-stop shop for running AI/ML on AWS
Docs · Available Images · Tutorials
AWS Deep Learning Containers (DLCs) are pre-built Docker images for running AI/ML workloads on AWS. Each image is tested and patched for security vulnerabilities. For more details, visit our documentation.
- [2026/04/28] vLLM v0.20.0 — EC2: `0.20-gpu-py312-ec2` · SageMaker: `0.20-gpu-py312` · Introduces support for DeepSeek V4.
- [2026/04/20] vLLM v0.19.1 — EC2: `0.19-gpu-py312-ec2` · SageMaker: `0.19-gpu-py312` · Upgrades Transformers to 5.5.4, enabling Gemma 4 support.
- [2026/04/07] SGLang v0.5.10 — EC2: `0.5.10-gpu-py312-ec2` · SageMaker: `0.5.10-gpu-py312`
- [2026/04/07] vLLM v0.19.0 — EC2: `0.19-gpu-py312-ec2` · SageMaker: `0.19-gpu-py312`
- [2026/03/26] vLLM v0.18.0 — EC2: `0.18-gpu-py312-ec2` · SageMaker: `0.18-gpu-py312`
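The tags listed above resolve to full ECR image URIs. As a minimal sketch of pulling one of them — the registry account `763104351884` is the AWS-owned DLC registry, but the `vllm` repository name and the region here are assumptions; confirm the exact URI for your region in the Available Images list:

```shell
# Build the ECR URI for a DLC image tag from the release notes above.
# ASSUMPTIONS: repository name "vllm" and region "us-west-2" are
# illustrative placeholders -- verify both in the Available Images list.
ACCOUNT=763104351884
REGION=us-west-2
REPO=vllm
TAG=0.20-gpu-py312-ec2

URI="${ACCOUNT}.dkr.ecr.${REGION}.amazonaws.com/${REPO}:${TAG}"
echo "${URI}"

# To pull the image (requires Docker and AWS credentials):
#   aws ecr get-login-password --region "${REGION}" \
#     | docker login --username AWS --password-stdin "${ACCOUNT}.dkr.ecr.${REGION}.amazonaws.com"
#   docker pull "${URI}"
```

The `docker login` step is needed because DLC images are hosted in a private ECR registry that authenticates with a short-lived token from `aws ecr get-login-password`.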
- [2026/04/28] We cannot guarantee security patching on Ubuntu-based vLLM and SGLang images due to the lack of Ubuntu Pro licensing. Customers may continue using these images at their own discretion and risk. We recommend migrating to our Amazon Linux-based images.
- [2026/02/10] Extended support for PyTorch 2.6 Inference containers until June 30, 2026
- PyTorch 2.6 Inference images will continue to receive security patches and updates through end of June 2026
- For complete framework support timelines, see our Support Policy
- Distributed Training on Amazon EKS - Configure and validate a distributed training cluster with DLCs on Amazon EKS.
- DLCs with Amazon SageMaker AI & MLflow - Use DLCs with SageMaker AI managed MLflow for experiment tracking and model management.
- LLM Serving on Amazon EKS with vLLM - Deploy and serve LLMs on Amazon EKS using vLLM DLCs.
- Fine-tuning Meta Llama 3.2 Vision - Fine-tune and deploy Llama 3.2 Vision for web automation using DLCs, Amazon EKS, and Amazon Bedrock.
- DLCs with Amazon Q Developer and MCP - Streamline deep learning environments with Amazon Q Developer and Model Context Protocol.
- LLM Deployment on Amazon EKS - Deploy and optimize LLMs on Amazon EKS using vLLM DLCs. See also: Sample Code
This project is licensed under the Apache-2.0 License.