Inspiration

Inspired by our personal experiences with older relatives and grandparents, we were struck by the surprising absence of voice-automated tools designed to support their daily lives. In response, we created Yaad—the Hindi word for "remember"—as a means to unlock and preserve memories through intuitive recitation and retrieval. We believe this approach not only enhances accessibility but also offers a dignified way to maintain personal connections, and we hope you will share in our vision.

What it does

Yaad accepts voice input for two functions: storing memories and retrieving them. The user selects a mode, speaks to the interface, and has the most relevant tasks and memories retrieved for them.
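The two-mode flow can be sketched roughly like this. This is a hypothetical illustration, not our actual implementation: in the real app the transcript comes from Deepgram speech-to-text and retrieval is semantic, while here storage is an in-memory list and matching is a simple keyword check so the flow is runnable end to end.

```typescript
// Illustrative sketch of Yaad's two modes (names and shapes are made up).
type Mode = "store" | "retrieve";

interface Result {
  mode: Mode;
  response: string;
}

class Yaad {
  private memories: string[] = [];

  // Dispatch on the mode the user selected, then act on their transcript.
  handle(mode: Mode, transcript: string): Result {
    if (mode === "store") {
      this.memories.push(transcript);
      return { mode, response: "Memory saved." };
    }
    // Naive relevance: return the first memory sharing a meaningful word
    // with the query (a stand-in for the real embedding-based search).
    const hit = this.memories.find((m) =>
      transcript
        .toLowerCase()
        .split(/\s+/)
        .some((w) => w.length > 3 && m.toLowerCase().includes(w))
    );
    return { mode, response: hit ?? "No matching memory found." };
  }
}

const app = new Yaad();
app.handle("store", "My medication is in the top cabinet");
console.log(app.handle("retrieve", "where is my medication").response);
// → "My medication is in the top cabinet"
```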

How we built it

Yaad was built using Deepgram, LangChain, Groq, OpenAI, React.js, and TypeScript.

Challenges we ran into

Given the scope of our project, our most difficult challenge was integrating the backend logic with our frontend UI.

Accomplishments that we're proud of

This is the first time team Yaad has worked together on a hackathon project, and it was a great experience thanks to our complementary skills in full-stack, backend, and UI/UX development.

What we learned

In this project, we learned a great deal about expressing information in new and creative ways, from generating vector embeddings and storing them in Pinecone to retrieving them for retrieval-augmented generation (RAG) and presenting the results.
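The retrieval half of that pipeline boils down to comparing a query vector against stored memory vectors. Below is a minimal, self-contained sketch of the idea: the learned embedding model and Pinecone are replaced by a toy bag-of-words embedder and an in-process cosine-similarity search, so none of this reflects the real models or APIs, only the shape of the computation.

```typescript
// Build a vocabulary from the stored memories.
function buildVocab(texts: string[]): string[] {
  const vocab = new Set<string>();
  for (const t of texts)
    for (const w of t.toLowerCase().split(/\s+/)) vocab.add(w);
  return [...vocab];
}

// Toy embedder: a bag-of-words count vector (stand-in for a real embedding model).
function embed(text: string, vocab: string[]): number[] {
  const vec = new Array(vocab.length).fill(0);
  for (const w of text.toLowerCase().split(/\s+/)) {
    const i = vocab.indexOf(w);
    if (i >= 0) vec[i] += 1;
  }
  return vec;
}

// Cosine similarity: the relevance score a vector database computes at query time.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  const denom = Math.sqrt(na) * Math.sqrt(nb);
  return denom === 0 ? 0 : dot / denom;
}

// Return the most relevant stored memory for a query; in a full RAG pipeline
// this retrieved context would then be passed to the LLM to ground its answer.
function retrieve(query: string, memories: string[]): string {
  const vocab = buildVocab(memories);
  const q = embed(query, vocab);
  let best = memories[0];
  let bestScore = -Infinity;
  for (const m of memories) {
    const s = cosine(q, embed(m, vocab));
    if (s > bestScore) {
      bestScore = s;
      best = m;
    }
  }
  return best;
}

const memories = [
  "My doctor appointment is on Friday at 10 am",
  "The spare house key is in the blue drawer",
];
console.log(retrieve("when is my doctor appointment", memories));
// → "My doctor appointment is on Friday at 10 am"
```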

What's next for Yaad

We want to expand our feature set to include image classification and interpretation, and to add more accessibility features that improve our target audience's quality of life.
