Inspiration
Coming from the opposite side of the world, Nepal, I found international cuisines fascinating. However, not having immediate access to ingredients like cilantro and paprika made it difficult to follow along, since I had to search for spices that were equivalent or interchangeable. I dreamt of a companion that would look at what I had and find recipes that were possible, instead of my having to halt midway because of that one special ingredient that made a world of difference in the final taste. And now Dachef, a first version of that dream, has been created.
What it does
Dachef is a recipe finder/tracker that is comparatively more convenient because you capture a picture of all your ingredients at once. Previously, recipe finding meant typing or selecting ingredients from a long list, or minutes of Google searching; Dachef intends to turn those minutes into seconds. Dachef takes pictures, called snapshots, typically of your refrigerator, and scans the image to figure out which ingredients the user has on hand. Based on that, the app scours the web using grounded Gemini 2.5 Pro to retrieve recipes primarily involving those ingredients. Furthermore, Dachef is focused on a greener future, so it calculates the carbon footprint of each ingredient based on the WRAP dataset. Finally, through the Knot API, the app, with the user's permission, can track the user's food delivery habits and suggest healthier, greener homemade alternatives.
My main motto with Dachef is that the user should interact with the app fewer than ten times before they already have a recipe they can work towards, with or alongside Dachef.
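The core loop above, ranking candidate recipes by how many on-hand ingredients they use and tallying a rough carbon footprint, can be sketched roughly like this. This is a minimal illustration, not Dachef's actual code: the `Recipe` shape, function names, and the emission factors are all hypothetical placeholders (the real app derives footprints from the WRAP data).

```typescript
// Hypothetical shapes for illustration; the real app's types differ.
interface Recipe {
  name: string;
  ingredients: string[];
}

// Illustrative kg CO2e per kg values, NOT taken from the WRAP dataset.
const CARBON_FACTORS: Record<string, number> = {
  beef: 27.0,
  chicken: 6.9,
  rice: 2.7,
  tomato: 1.4,
};

// Rank recipes so those using mostly on-hand ingredients come first.
function rankByCoverage(onHand: string[], recipes: Recipe[]): Recipe[] {
  const have = new Set(onHand.map((i) => i.toLowerCase()));
  const coverage = (r: Recipe) =>
    r.ingredients.filter((i) => have.has(i.toLowerCase())).length /
    r.ingredients.length;
  return [...recipes].sort((a, b) => coverage(b) - coverage(a));
}

// Rough footprint: sum the per-kg factor for each known ingredient.
function estimateFootprint(ingredients: string[]): number {
  return ingredients.reduce(
    (total, i) => total + (CARBON_FACTORS[i.toLowerCase()] ?? 0),
    0
  );
}
```

In the app itself the ingredient list comes from Gemini's reading of the snapshot rather than being typed in.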
How I built it
The app is primarily built with React Native + Expo + TypeScript, with Supabase as the Backend-as-a-Service platform for data persistence. The LLMs behind Dachef are Gemini 2.5 Pro (exp-03-25) for the initial recipe finding and Gemini 2.0 Flash for the remaining actions, because of its speed.
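The two-model split described above amounts to a small routing decision: only the initial recipe search goes to the slower Pro model, while everything else uses Flash for latency. A minimal sketch, with a hypothetical `Task` type and function name (the model ID strings are the ones named above):

```typescript
// Hypothetical task labels for illustration.
type Task = "find-recipes" | "chat" | "classify-snapshot";

// Route the heavyweight grounded search to Pro; everything else to Flash.
function pickModel(task: Task): string {
  return task === "find-recipes"
    ? "gemini-2.5-pro-exp-03-25"
    : "gemini-2.0-flash";
}
```

The returned ID would then be passed to whatever Gemini client the app uses when issuing the request.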
Challenges I ran into
The initial WRAP dataset was approximately 6,000 rows long, so I cleaned the data and reduced it to approximately 140 rows, a size appropriate for Gemini to consume. Additionally, during the first hours of the hackathon, I tried different classification methods such as YOLO and Grounding DINO before landing on Gemini's visual inference. Gemini was the correct choice given the scenario and constraints because the app required the model to be descriptive of the object: a green apple and a red apple have different use cases when cooking.
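One way to collapse ~6,000 item-level rows into ~140 prompt-sized ones is to average per category. The sketch below shows that idea only; the `WrapRow` shape and column names are hypothetical, not the actual WRAP schema or the cleaning I did:

```typescript
// Hypothetical row shape; the real WRAP dataset's columns differ.
interface WrapRow {
  category: string; // e.g. "Fruit & veg"
  foodItem: string;
  kgCo2ePerKg: number;
}

// Collapse item-level rows into one average footprint per category,
// small enough to paste into a Gemini prompt.
function summarize(rows: WrapRow[]): Map<string, number> {
  const sums = new Map<string, { total: number; n: number }>();
  for (const row of rows) {
    const acc = sums.get(row.category) ?? { total: 0, n: 0 };
    acc.total += row.kgCo2ePerKg;
    acc.n += 1;
    sums.set(row.category, acc);
  }
  const averages = new Map<string, number>();
  for (const [cat, { total, n }] of sums) averages.set(cat, total / n);
  return averages;
}
```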
Accomplishments that I'm proud of
I am happy to have made an app that I would use on a daily basis. Although I had more ideas to integrate, I am very glad to have completed the project on time.
What I learned
I learned that Cursor is a great tool to work with when you do not have to think about anything specific. However, it is astronomically annoying when I want to write code my own way and it keeps autocompleting it. I also learned about Juice, a GPU-over-IP tool, which will prove really helpful for me in the long run as I get more involved in making 3D animations.
What's next for Dachef
Immediately, I plan to add a hands-free mode to Dachef for use while cooking a recipe. After the user starts a cooking session, they will be able to converse freely with the AI model and show their progress through video capture for more personalized assistance. In the long run, Dachef will aim to be more accurate with object detection, carbon footprint calculation, web search, and so on, along with many quality-of-life features.
Built With
- figma
- ngrok
- reactnative
- supabase
- typescript