Inspiration
Modern games are driven by engagement, yet every year game studios lose billions to player churn, the rate at which players stop playing. Studios know when and where players quit, but they have never known why, because they cannot directly measure how players feel while playing. Emotion has never been treated as a primary, measurable factor in game design. We built Emotica to change that.
What it does
Emotica is a real-time emotional telemetry system embedded directly into gameplay, built in two layers.
For players: The game reads your face every 2 seconds using on-device facial recognition. When it detects sustained frustration, it adapts: enemies slow down, health packs appear, and difficulty scales back. You stay engaged, stress-free, and keep playing.
For developers: Every session feeds a live cloud dashboard powered by Supabase PostgreSQL. Studios see mood distributions, Ragebait triggers, frustration hotspots, retention scores, and whether adaptive interventions actually worked. Instead of guessing why players quit, developers can now directly correlate emotional data with game events to optimise level design, tutorials, and retention strategies.
How we built it
Emotion Engine: Browser-based facial landmark detection classifies Engaged, Bored, Frustrated, and Surprised states every 2 seconds using streak-based heuristics to filter noise
Ragebait Detection: A frustration score combining emotion streak length, performance drop rate, and reaction variability identifies churn-risk moments before they happen
Adaptive Difficulty: Game parameters such as enemy spawn count, enemy speed, spawn rate, and health pack and ticket frequency adjust in real time based on the detected emotional state, stabilising engagement without removing challenge
Cloud Telemetry: Session data is pushed to Supabase PostgreSQL on game end. Metrics captured include Emotional Distribution, Frustration Intensity Index, Ragebait Trigger Frequency, Emotional Stability Curve, and Adaptive Intervention Effectiveness, which are accessible by any studio, anywhere, and ready to feed UX research or churn modeling pipelines
Music Reactivity: Audio volume and intensity shift dynamically based on detected mood
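The streak-based heuristic above can be sketched as a small filter. This is an illustrative reconstruction, not Emotica's actual code; the class name, the default streak length of 3, and the starting state are all assumptions.

```javascript
// Hypothetical sketch of the streak-based filter: a raw emotion label only
// becomes the "confirmed" state after it persists for N consecutive readings
// (one reading every 2 seconds), so a blink or head tilt never triggers a response.
class EmotionStreakFilter {
  constructor(requiredStreak = 3) {
    this.requiredStreak = requiredStreak; // readings needed to confirm a state
    this.candidate = null;                // emotion currently on a streak
    this.streak = 0;                      // consecutive matching readings
    this.confirmed = "Engaged";           // last confirmed state (assumed default)
  }

  // Feed one raw classification ("Engaged" | "Bored" | "Frustrated" | "Surprised").
  // Returns the confirmed, noise-filtered emotion.
  update(rawEmotion) {
    if (rawEmotion === this.candidate) {
      this.streak += 1;
    } else {
      this.candidate = rawEmotion;
      this.streak = 1;
    }
    if (this.streak >= this.requiredStreak) {
      this.confirmed = rawEmotion;
    }
    return this.confirmed;
  }
}
```

With a required streak of 3, a single "Frustrated" blip between "Engaged" readings resets the streak counter and never changes the confirmed state.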
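The end-of-session push to Supabase could look roughly like this. The table name, column names, and metric formulas below are illustrative assumptions, not the project's actual schema; the `supabase` client would come from `createClient()` in `@supabase/supabase-js`.

```javascript
// Collapse the per-reading emotion log into session-level metrics.
// Field names are hypothetical stand-ins for the dashboard's metrics.
function buildSessionPayload(readings, interventions) {
  const counts = {};
  for (const r of readings) counts[r.emotion] = (counts[r.emotion] || 0) + 1;
  const frustrated = counts["Frustrated"] || 0;
  return {
    emotion_distribution: counts,                    // Emotional Distribution
    frustration_index: frustrated / readings.length, // Frustration Intensity Index
    ragebait_triggers: interventions.length,         // Ragebait Trigger Frequency
    ended_at: new Date().toISOString(),
  };
}

// Push once, on game end. supabase-js insert() resolves to { error }.
async function pushSession(supabase, readings, interventions) {
  const { error } = await supabase
    .from("sessions") // hypothetical table name
    .insert(buildSessionPayload(readings, interventions));
  if (error) console.error("telemetry push failed:", error.message);
}
```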
Challenges we ran into
The hardest problem was false positives. A blink or head tilt would spike frustration readings. We solved this with streak filtering: an emotion must persist across multiple consecutive readings before triggering any system response.
The second challenge was adaptive balance. Difficulty changes that feel sudden break immersion. We tuned gradual parameter shifts so the game feels responsive and natural.
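One common way to get gradual shifts like this is exponential easing: each tick moves a parameter a small fraction of the remaining distance to its target. This is a sketch of that idea, not Emotica's actual tuning; the parameter names, rate, and targets are assumptions.

```javascript
// Hypothetical sketch of a gradual difficulty shift: instead of jumping to a
// new value, each parameter eases a fixed fraction of the way per tick, so
// the change feels continuous rather than sudden.
function easeToward(current, target, rate = 0.05) {
  return current + (target - current) * rate; // exponential approach
}

// Illustrative parameter: slow enemies down while the player is frustrated.
let enemySpeed = 1.0;
const calmTarget = 0.7; // assumed target, not the game's real value

function onTick() {
  enemySpeed = easeToward(enemySpeed, calmTarget);
}
```

Because each step covers only 5% of the remaining gap, the parameter converges smoothly and never overshoots.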
The third challenge was real-time performance. Running facial detection in-browser while maintaining smooth gameplay caused frame drops early on. We optimised the detection loop to run asynchronously from the game loop, keeping both stable.
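The decoupling described above can be sketched as a shared-state handoff: the slow classifier only ever writes, and the per-frame game loop only ever reads, so neither blocks the other. The function names (`classifyFaceAsync`, `updateGame`) are stand-ins, not the project's API.

```javascript
// Hypothetical sketch of decoupling detection from rendering.
const shared = { emotion: "Engaged" };

// Detection side: runs on its own timer, only ever writes shared state.
// The slow ML call stays entirely off the frame path.
async function detectionTick(classifyFaceAsync) {
  shared.emotion = await classifyFaceAsync();
}

// Game side: called once per frame, only ever reads shared state (no await).
function frameTick(updateGame) {
  updateGame(shared.emotion);
}

// Wiring in the browser would look roughly like:
//   setInterval(() => detectionTick(realClassifier), 2000);
//   (function loop() { frameTick(realUpdate); requestAnimationFrame(loop); })();
```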
Accomplishments that we're proud of
We are most proud of showcasing a working demo of our idea implemented in a real game, because we believe that when similar models are adapted and integrated into other games, this approach is the future of making gaming better, more engaging, and more personalised.
We also successfully deployed on-device emotion recognition using facial landmark detection with streak-based filtering to eliminate false positives, achieving reliable mood classification without any external API.
We also built a Ragebait detection system that identifies churn-risk moments before they happen by combining emotion persistence, performance drop rate, and reaction variability into a single frustration score. We then connected these metrics to a full telemetry pipeline feeding a live Supabase PostgreSQL cloud database, so developers can gain insights and improve their games easily.
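A composite score like the one described above is typically a weighted sum of normalised signals. This is an illustrative guess at the shape of that combination; the weights, the saturation point, and the 0.7 threshold are assumptions, not Emotica's tuned values.

```javascript
// Hypothetical sketch of the composite frustration score. All three inputs
// are assumed to be normalised to 0..1 before they arrive here.
function frustrationScore({ streakLength, perfDropRate, reactionVariability }) {
  const streakTerm = Math.min(streakLength / 5, 1); // saturate after 5 readings (~10 s)
  return (
    0.5 * streakTerm +          // sustained frustrated readings dominate
    0.3 * perfDropRate +        // e.g. deaths or failed attempts per minute
    0.2 * reactionVariability   // jitter in input timing
  ); // 0..1; a value above some threshold (e.g. 0.7) flags a Ragebait moment
}
```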
What we learned
Emotion isn't binary; it's a signal over time. The most predictive churn moments aren't single frustration spikes; they're sustained patterns.
That insight shaped everything: our detection logic, our intervention timing, and our analytics model.
We also learned that player comfort and responsiveness are as important as technical accuracy.
What's next for Emotica
Emotion is the last unmeasured variable in game design. Next steps include integrating Emotica as a drop-in SDK for existing games, expanding emotion classification beyond facial detection to include controller input patterns and performance volatility, and building predictive churn models trained on aggregated emotional telemetry. This implementation could be the difference between a game that succeeds and a game that doesn't.
Built With
- canva
- chart.js
- css
- github
- html
- javascript
- mediapipe (facial landmark detection)
- supabase (postgresql cloud database)