Inspiration
Shopping online often feels overwhelming. With so many brands and endless product listings, users waste time searching for items that match their unique style. We wanted to create a smarter, simpler way to discover clothes tailored to personal preferences — almost like Tinder, but for fashion.
What it does
Our project, Atelier, is a swipe-based clothing discovery app. Users swipe right if they like an item and left if they don’t. In the background, the system learns from these interactions, using embeddings and recommendation models to better predict and display items that align with the user’s taste.
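The learning loop can be sketched roughly like this (a minimal sketch with toy 4-dimensional vectors standing in for CLIP embeddings; the running-mean update rule and function names are illustrative, not Atelier's exact model): keep a profile vector built from the embeddings of liked items, and rank unseen items by cosine similarity to it.

```python
import numpy as np

def update_profile(profile, item_embedding, n_likes):
    """Running mean of liked-item embeddings (hypothetical update rule)."""
    if profile is None:
        return item_embedding.copy()
    return profile + (item_embedding - profile) / (n_likes + 1)

def rank_candidates(profile, candidates):
    """Return candidate indices sorted by cosine similarity to the profile."""
    p = profile / np.linalg.norm(profile)
    c = candidates / np.linalg.norm(candidates, axis=1, keepdims=True)
    return np.argsort(c @ p)[::-1]

# Two right-swipes update the profile...
liked = [np.array([1.0, 0.0, 0.0, 0.0]), np.array([0.8, 0.2, 0.0, 0.0])]
profile = None
for i, emb in enumerate(liked):
    profile = update_profile(profile, emb, n_likes=i)

# ...then candidates similar to liked items rank first.
candidates = np.array([
    [0.9, 0.1, 0.0, 0.0],   # close to the liked items
    [0.0, 0.0, 1.0, 0.0],   # dissimilar
])
print(rank_candidates(profile, candidates))  # candidate 0 ranks above candidate 1
```

Because CLIP embeds images into a shared vector space, "taste" reduces to geometry: items the user will probably like are simply the nearest unseen vectors.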
How we built it
Backend: We used Python, FastAPI, and CLIP (Contrastive Language-Image Pretraining) embeddings to represent clothing images in a vector space. We precomputed embeddings for datasets from multiple clothing brands and exposed them through REST APIs.
Frontend: We built a simple React interface that displays items and captures user swipes (left or right). Axios connects the frontend with the backend API to fetch the “next item.”
Data: Clothing datasets in CSV format (with image URLs) were processed, and embeddings were stored locally to make serving fast.
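The storage layout could be sketched as follows (a hypothetical sketch: `fake_embed` stands in for the CLIP image encoder, and the file names are illustrative). Embedding the catalog once and saving a single `.npy` matrix, with row *i* mapped back to row *i* of the CSV, keeps serving fast:

```python
import csv
import hashlib
import io
from pathlib import Path

import numpy as np

def fake_embed(image_url: str) -> np.ndarray:
    """Stand-in for the CLIP image encoder (deterministic pseudo-random vector)."""
    seed = int.from_bytes(hashlib.md5(image_url.encode()).digest()[:4], "little")
    return np.random.default_rng(seed).random(512).astype(np.float32)

def precompute(csv_text: str, out_dir: Path) -> None:
    """Read a catalog CSV (id,image_url), embed each row, store matrix + index."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    embs = np.stack([fake_embed(r["image_url"]) for r in rows])
    out_dir.mkdir(parents=True, exist_ok=True)
    np.save(out_dir / "embeddings.npy", embs)     # row i = catalog row i
    (out_dir / "index.csv").write_text(csv_text)  # keeps row -> item mapping

catalog = "id,image_url\n1,https://example.com/a.jpg\n2,https://example.com/b.jpg\n"
precompute(catalog, Path("embeddings_demo"))
loaded = np.load("embeddings_demo/embeddings.npy")
print(loaded.shape)  # (2, 512)
```

At request time the backend only does a `np.load` (or memory-maps the file) and a matrix multiply, so no GPU or model inference is needed while serving.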
Challenges we ran into
Handling broken or missing image URLs during embedding generation.
Managing dependencies and library installations (Torch, CLIP, Pandas, etc.) during the hackathon crunch.
Debugging path errors when connecting datasets, embeddings, and backend routes.
Syncing the backend API with the frontend so swipes updated correctly in real time.
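For the broken-URL problem, one defensive pattern is to treat every download as fallible and skip rows that fail instead of aborting the whole embedding batch. A minimal sketch using only the standard library (not necessarily the exact code we wrote):

```python
import urllib.error
import urllib.request

def fetch_image_bytes(url, timeout=5):
    """Download an image, returning None instead of crashing on bad URLs."""
    if not url or not url.startswith(("http://", "https://")):
        return None  # missing or malformed URL in the CSV
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read()
    except (urllib.error.URLError, TimeoutError, ValueError):
        return None  # broken link, 404, DNS failure, timeout, ...

# Rows that return None are simply logged and skipped during embedding.
print(fetch_image_bytes(""))           # None: missing URL
print(fetch_image_bytes("not-a-url"))  # None: malformed URL
```

`urllib.error.HTTPError` is a subclass of `URLError`, so 404s are caught by the same clause.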
Accomplishments that we're proud of
Successfully connecting a working backend and frontend within hackathon time.
Implementing CLIP embeddings to personalize clothing recommendations.
Creating a clean swipe-based UI that feels familiar and intuitive.
Overcoming setup hurdles to get the full pipeline working end-to-end.
What we learned
How to generate and use CLIP embeddings for real-world recommendation tasks.
The importance of structuring backend/frontend directories cleanly.
How to quickly debug dataset, path, and dependency errors under time pressure.
The value of building MVPs (minimum viable products) that demonstrate core functionality first.
What's next for Atelier
Expanding beyond demo datasets to include larger, real-time fashion catalogs.
Adding authentication so users can save swipes and build long-term style profiles.
Improving the recommendation algorithm with collaborative filtering + embeddings.
Building features like “style boards” where users can curate outfits.
Eventually scaling into a plug-in for fashion retailers to improve customer discovery.
Built With
- axios
- CLIP (OpenAI)
- custom REST API endpoints built with FastAPI
- FastAPI
- JavaScript
- Node.js
- npm/yarn
- NumPy
- pandas
- pip/virtualenv
- precomputed embeddings stored as .npy files (local storage)
- Python
- React
- requests (for image downloading)
- Tailwind CSS
- Torch
- Visual Studio Code