This project uses Amazon Bedrock, AWS Lambda, API Gateway, and Amazon S3 to generate production-ready Dockerfiles with generative AI models. All infrastructure is provisioned through Terraform Cloud.
- Generate best-practice Dockerfiles for any programming language using Amazon Bedrock (Llama 3).
- Store the generated Dockerfile in S3.
- Return a presigned URL to download the file.
- Full infrastructure-as-code using Terraform Cloud.
Here's how the whole system works:
- Postman sends a POST request to API Gateway with the programming language.
- API Gateway triggers the Lambda Function.
- Lambda Function:
  - Sends a prompt to Amazon Bedrock
  - Receives the generated Dockerfile content
  - Stores the Dockerfile in Amazon S3
  - Generates a presigned S3 URL
  - Returns the URL back to API Gateway
- API Gateway responds to Postman with the final download URL.
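The round trip above can be exercised from any HTTP client, not just Postman. A minimal Python sketch is below; the API URL placeholder and the `download_url` response key are illustrative assumptions, not values taken from this repo:

```python
import json
import urllib.request

# Placeholder: substitute the invoke URL of your deployed API Gateway stage.
API_URL = "https://<api-id>.execute-api.<region>.amazonaws.com/prod/generate"


def make_payload(language: str) -> bytes:
    """Build the JSON body the Lambda expects, e.g. {"language": "python"}."""
    return json.dumps({"language": language}).encode("utf-8")


def request_dockerfile(language: str) -> str:
    """POST the language to the API and return the presigned download URL."""
    req = urllib.request.Request(
        API_URL,
        data=make_payload(language),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    # Assumption: the Lambda returns the URL under a "download_url" key.
    return body["download_url"]
```

Calling `request_dockerfile("python")` would then print a time-limited S3 link to the generated Dockerfile.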
The Lambda function:
- Accepts `{ "language": "python" }` as input
- Generates a Dockerfile using Bedrock
- Saves it to S3
- Returns a presigned download URL

See `lambda/app.py` for details.
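A rough sketch of how such a handler could look. The model ID, bucket name, prompt template, and response keys here are assumptions for illustration, not taken from the repo; `boto3` is imported inside the handler only so the module stays importable outside the Lambda runtime:

```python
import json
import os

# Illustrative values - the real model ID and bucket come from the Terraform config.
MODEL_ID = "meta.llama3-8b-instruct-v1:0"
BUCKET = os.environ.get("OUTPUT_BUCKET", "dockerfile-output-bucket")


def build_prompt(language: str) -> str:
    """Wrap the request in the Llama 3 instruction format."""
    return (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n"
        f"Generate a best-practice, production-ready Dockerfile for {language}. "
        "Return only the Dockerfile content."
        "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n"
    )


def handler(event, context):
    import boto3  # available by default in the AWS Lambda Python runtime

    language = json.loads(event["body"])["language"]

    # 1. Ask Bedrock (Llama 3) for the Dockerfile.
    bedrock = boto3.client("bedrock-runtime")
    response = bedrock.invoke_model(
        modelId=MODEL_ID,
        body=json.dumps({"prompt": build_prompt(language), "max_gen_len": 1024}),
    )
    dockerfile = json.loads(response["body"].read())["generation"]

    # 2. Store the result in S3.
    s3 = boto3.client("s3")
    key = f"{language}/Dockerfile"
    s3.put_object(Bucket=BUCKET, Key=key, Body=dockerfile.encode("utf-8"))

    # 3. Return a presigned download URL (valid for 1 hour).
    url = s3.generate_presigned_url(
        "get_object", Params={"Bucket": BUCKET, "Key": key}, ExpiresIn=3600
    )
    return {"statusCode": 200, "body": json.dumps({"download_url": url})}
```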
- Amazon Bedrock - Gen-AI model invocation
- AWS Lambda - Serverless backend
- Amazon S3 - File storage
- API Gateway - HTTP interface
- Terraform Cloud - Infrastructure provisioning
Read the full blog post with prompt debugging lessons, Bedrock quirks, and architecture breakdown here:
👉 Medium Blog
Licensed under the MIT License.
Maintained by Zlash65 (GitHub).
