Inspiration

In a world where AI is quickly entering creative and emotional spaces, we asked ourselves: Can a machine ever really understand how we feel?

We’ve seen AI rewrite heartfelt journals into polished summaries. We’ve watched sketches full of raw emotion be “cleaned up” into something slick but soulless. Moments like these sparked a question: What gets lost in translation when emotion meets machine?

MoodTrace was born from that question, a space to explore where AI aligns with us… and where it misses the mark! It's not just about creation, it's about reflection. What feels real? What feels human?

What it does

MoodTrace is a space where you explore the emotional gap between people and AI through creativity.

  • In the Drawing Challenge, you sketch what a feeling looks like. The AI then tries to interpret that drawing in its own way. The result always feels... off, and that’s the point.

  • In the Sketch Challenge, you’re given a strange, AI-generated doodle. Your job is to turn it into something meaningful, something only a human could feel. It’s a test of imagination, not perfection.

  • In the Text Challenge, you write something honest, maybe a journal entry, a poem, or just how your day felt. The AI rewrites it in its own voice, often missing the tone or stripping away what made it personal. You get to see both versions side by side.

  • In the Gallery, people share their work, vote on which version feels more real, and reflect on how emotion gets lost in translation. MoodTrace is part art, part experiment, and a quiet reminder that machines don’t feel the way we do.

How we built it

We built MoodTrace using a full-stack setup that helped us move fast and stay creative.

On the frontend, we went with React 18 with TypeScript so we could write clean, organized code that’s easier to manage. Tailwind CSS made styling super quick, while Lucide React icons gave the app a simple, polished feel. We relied on Vite to speed things up during development. Drawing functionality was powered by p5.js, allowing users to sketch their emotions directly in-browser.

The backend runs on FastAPI, which worked great for building lightweight APIs. We paired it with Pydantic for request and response validation and used Uvicorn to serve everything. Even though we connected the Groq API, we used mock responses while testing so we didn’t run into API limits.
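As a rough sketch of that mock-response pattern (the names here are hypothetical, not our actual code): the text-rewrite helper can fall back to a canned reply whenever no Groq API key is configured, so local testing never burns through rate limits.

```python
import os
from dataclasses import dataclass


@dataclass
class RewriteResult:
    """What the Text Challenge could send back to the frontend."""
    original: str
    rewritten: str
    source: str  # "groq" when live, "mock" while testing


def rewrite_text(text: str) -> RewriteResult:
    """Rewrite a journal entry in the AI's voice.

    Falls back to a deterministic mock response when GROQ_API_KEY
    is unset, so the frontend can be developed without API access.
    """
    api_key = os.environ.get("GROQ_API_KEY")
    if not api_key:
        # Mock path: flatten the text the way a summarizer might,
        # deliberately stripping line breaks and trimming it.
        flattened = " ".join(text.split())
        return RewriteResult(
            original=text,
            rewritten=f"Summary: {flattened[:80]}",
            source="mock",
        )
    # Live path (not shown): call the Groq chat completions API here.
    raise NotImplementedError("wire up the Groq client here")
```

In the real app this kind of helper would sit behind a FastAPI endpoint, with Pydantic models playing the role of the dataclass above.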

Using this stack helped us stay focused on what we really cared about: creating an experience that explores the space between how we feel things and how machines try to make sense of it.

Challenges we ran into

  • Deploying the backend was harder than expected, especially while learning how to use FastAPI and the Groq API for the first time.

  • Working with teammates across different time zones made syncing up a bit challenging.

  • Figuring out how to structure and connect everything in FastAPI took some trial and error.

  • We spent a lot of time fixing small but frustrating UI bugs that showed up during testing.

What we learned

  • AI is smart... but feelings? That’s a whole other level. It really struggles with the messy, beautiful chaos of human emotion.

  • Turns out, making a machine "feel" is harder than expected (spoiler: it still doesn’t).

  • We learned how to hook up live AI with fun things like doodles and deep thoughts: drawings, texts, and games.

  • We got better at frontend + backend magic (FastAPI, Groq, lots of debugging tears).

  • We realized designing something that makes people reflect, not just click, takes time, thought, and a bit of soul.

  • Most importantly: humans are complicated, emotional, and wildly creative and that’s kind of wonderful.

What's next for MoodTrace

  • We want to teach our AI to “feel” better (or at least fake it more convincingly) by training custom models on anonymous user input.

  • We’re working on a smarter dashboard so you can actually see how your emotions are being interpreted over time, like your own personal mood map.

  • An API is in the pipeline so other projects can plug into MoodTrace and play with emotion too.

  • We're adding support for more languages, because feelings don’t only happen in English.

  • And we’re excited to explore how different cultures express emotions and how badly machines misunderstand all of them (which is kind of the point).

Built With

  • canvas-api
  • eslint
  • fastapi
  • github
  • groqapi
  • lucide-react
  • netlify
  • p5.js
  • postcss
  • pydantic
  • react
  • tailwindcss
  • typescript
  • uvicorn
  • vite