- The Architecture Diagram: The most important image; it shows how S3, Lambda, DynamoDB, and SNS connect.
- The AWS Lambda Console: A screenshot of your Python code inside the AWS Lambda console. This proves you actually deployed the function.
- Amazon S3 Source Bucket: A screenshot showing the input bucket where JSON files are uploaded to trigger the automated pipeline.
- The Terraform Plan/Apply: A screenshot of your terminal showing a successful terraform apply. This proves you used Infrastructure as Code.
Inspiration
In modern business, manual data entry is a "silent killer" of productivity. I was inspired to build CloudFlow to prove that even complex data workflows can be fully automated with a serverless architecture, eliminating manual-entry errors while keeping operational costs near zero.
What it does
CloudFlow is an event-driven data pipeline that automates the transition from raw data to database storage. It automatically detects new file uploads in an Amazon S3 bucket. Once a file is detected, it triggers an AWS Lambda function to process the data. The processed information is stored in Amazon DynamoDB, and a notification is sent via Amazon SNS.
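The handler at the center of that flow can be sketched roughly as below. This is a minimal illustration, not the actual CloudFlow code: the table name, the `TOPIC_ARN` environment variable, and the `build_item` helper are all assumptions made for the example.

```python
import json
import os
import urllib.parse

# Illustrative table name -- the real deployment would supply its own.
TABLE_NAME = "cloudflow-records"


def build_item(record: dict) -> dict:
    """Normalize one parsed JSON record into a DynamoDB item."""
    return {
        "id": str(record["id"]),          # DynamoDB partition key as a string
        "payload": json.dumps(record),    # keep the raw record for auditing
    }


def lambda_handler(event, context):
    """Triggered by S3 ObjectCreated events; writes each uploaded JSON
    record to DynamoDB and publishes an SNS notification."""
    import boto3  # imported lazily so build_item can be unit-tested without AWS

    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table(TABLE_NAME)
    sns = boto3.client("sns")

    for rec in event["Records"]:
        bucket = rec["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(rec["s3"]["object"]["key"])
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        table.put_item(Item=build_item(json.loads(body)))
        sns.publish(TopicArn=os.environ["TOPIC_ARN"],
                    Message=f"Processed {key}")
    return {"statusCode": 200}
```

Keeping `build_item` pure makes the transformation step testable without mocking any AWS services.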
How we built it
- Infrastructure: Defined the entire cloud environment using Terraform to allow for single-command deployments.
- Compute: Developed the backend logic in Python 3.9 on AWS Lambda.
- Database: Utilized Amazon DynamoDB for NoSQL data storage.
- Storage: Configured Amazon S3 as the primary event trigger for the pipeline.
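The S3-to-Lambda wiring could be expressed in Terraform along these lines; resource names (`aws_lambda_function.processor`, `aws_s3_bucket.input`) are illustrative placeholders, not the project's actual configuration.

```hcl
# Allow the input bucket to invoke the processing function.
resource "aws_lambda_permission" "allow_s3" {
  statement_id  = "AllowS3Invoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.processor.function_name
  principal     = "s3.amazonaws.com"
  source_arn    = aws_s3_bucket.input.arn
}

# Fire the function whenever a .json object lands in the bucket.
resource "aws_s3_bucket_notification" "uploads" {
  bucket = aws_s3_bucket.input.id

  lambda_function {
    lambda_function_arn = aws_lambda_function.processor.arn
    events              = ["s3:ObjectCreated:*"]
    filter_suffix       = ".json"
  }

  depends_on = [aws_lambda_permission.allow_s3]
}
```

The `depends_on` matters here: the notification must be created after the permission, or S3 will reject the subscription.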
Challenges we ran into
- Data Consistency: Handled the issue of inconsistent data types within incoming JSON files.
- Validation: Built a custom validation layer inside the Lambda function to verify data before it is saved to the database.
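A validation layer of that kind can be sketched as a single coercion pass; the schema below (`id`, `amount`, `timestamp`) is hypothetical, chosen only to illustrate the type-normalization idea.

```python
# Illustrative schema: field name -> expected Python type.
REQUIRED_FIELDS = {"id": str, "amount": float, "timestamp": str}


def validate(record: dict) -> dict:
    """Coerce inconsistent JSON types to a canonical schema.

    Raises ValueError for missing fields or uncoercible values, so the
    caller can reject the file before anything reaches the database.
    """
    clean = {}
    for field, expected in REQUIRED_FIELDS.items():
        if field not in record:
            raise ValueError(f"missing field: {field}")
        try:
            clean[field] = expected(record[field])  # e.g. "42.5" -> 42.5
        except (TypeError, ValueError):
            raise ValueError(f"bad type for {field}: {record[field]!r}")
    return clean
```

Coercing rather than strictly type-checking lets the pipeline accept files where, say, `amount` arrives as the string `"42.5"` in one upload and the number `42.5` in the next.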
Accomplishments that we're proud of
- Cost Management: Engineered the system to run within the AWS Free Tier, resulting in $0 cost when idle.
- Automation: Created a fully "zero-touch" workflow that requires no manual intervention after the initial file upload.
What we learned
- Serverless Architecture: Learned how to connect multiple AWS services using event-driven triggers.
- Infrastructure as Code: Gained experience managing cloud resources programmatically rather than through the manual console.
What's next for CloudFlow — Track 1
- AI Integration: Plan to add Amazon Bedrock to generate predictive insights from the stored data.
Built With
- amazon-lambda
- amazon-web-services
- dynamodb
- python
- s3
- sns
- terraform