Sentira

See what you're feeling. Show who you love.


What Inspired Us

We didn't start with a concept. We started with a feeling we couldn't name.

The three of us have all experienced some version of the same moment: you walk into a room and something is off. A friend goes quiet and you know — before a single word — that something happened. You hang up the phone and feel lighter, or heavier, without being able to explain why. The emotional atmosphere between people is constantly communicating. We're just never taught to read it.

When we saw the prompt — design a tool to track and influence human sensory experience — we kept coming back to interoception: the body's internal sense of its own state. Neuroscientists define it as the sensory system that maps your heartbeat, your tension, your felt emotion. It's the physiological basis of emotion itself. And yet we have no tool for it. We track our steps, our sleep, our heart rate. But the thing we actually navigate every day — what we're feeling, and whether the people we love know we're there — remains invisible.

We wanted to change that by making the signal that already exists inside you visible — and shareable, on your own terms.

The story that grounded everything was simple: two college students, 1,200 miles apart, who love each other but have lost the texture of closeness. Nadia carries a lot and can't find the words. Priya hides when she's struggling. Neither of them needs more messages. They need presence. Sentira was built for them.


What We Built

Sentira is a mobile app — with an Apple Watch companion — that externalizes your internal emotional state as a living, abstract blob. The blob is always yours. You choose what it looks like, who sees it, and when.

The visual language is intentionally minimal. Three variables carry all the meaning:

  • Color → what you're feeling
  • Size → how much (emotional intensity)
  • Opacity → how present you are (open or withdrawn)

This gives us an expressive system without labels, without diagnosis, without demand. The blob communicates without requiring the user to perform.
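The three-variable system above can be sketched as a simple mapping from internal state to visual parameters. This is an illustrative sketch only — the type names (`BlobState`, `FEELING_COLORS`, `toBlobVisual`), the feeling palette, and the size/opacity ranges are our assumptions, not the actual Sentira prototype code:

```typescript
// Hypothetical sketch of the three-variable blob language:
// feeling -> color, intensity -> size, presence -> opacity.
// All names and values here are illustrative assumptions.

type Feeling = "calm" | "joy" | "sadness" | "anger" | "anxiety";

interface BlobState {
  feeling: Feeling;  // what you're feeling      -> color
  intensity: number; // 0..1 emotional intensity -> size
  presence: number;  // 0..1 open vs. withdrawn  -> opacity
}

interface BlobVisual {
  color: string;   // hex color for the feeling
  radius: number;  // rendered size in points
  opacity: number; // 0..1 alpha
}

// One feeling, one color: the only labeled part of the system.
const FEELING_COLORS: Record<Feeling, string> = {
  calm: "#7FB8A4",
  joy: "#F2C94C",
  sadness: "#6B8CC7",
  anger: "#E2654F",
  anxiety: "#A078C9",
};

const MIN_RADIUS = 40;
const MAX_RADIUS = 120;

function toBlobVisual(state: BlobState): BlobVisual {
  const clamp = (x: number) => Math.min(1, Math.max(0, x));
  return {
    color: FEELING_COLORS[state.feeling],
    radius: MIN_RADIUS + clamp(state.intensity) * (MAX_RADIUS - MIN_RADIUS),
    // A withdrawn blob fades but never fully vanishes.
    opacity: 0.25 + clamp(state.presence) * 0.75,
  };
}
```

The floor on opacity reflects the design intent: even a withdrawn blob stays faintly visible to the people you've chosen, because absence is itself a signal.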

The app has three surfaces:

You — your own blob, live. Tap Understand Why to trace your state to its physical root: proprioception, sound sensitivity, thermoception. Each explanation comes with an Insight to Action — a single, gentle thing you can do right now.

Friends — the people you've chosen to share with. You see their blob. They see yours. A connection line between two blobs shows shared presence. When Nadia holds Priya's blob, a warm pulse travels that line — not a message, not a notification. Just presence sent.

Watch — your blob lives as ambient wallpaper on your wrist. When someone sends you a hug, your watch glows softly with their blob and one line: "Nadia is thinking of you." Priya sees it. She picks up the phone.

The weekly view maps your emotional states across time as a scatter plot — clustered by feeling, labeled by pattern. Not to judge. To show you what you lived.
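The weekly view's grouping step could work roughly like this — a minimal sketch, assuming snapshots are bucketed by feeling and the largest cluster surfaces as the week's pattern. The data shapes (`Snapshot`, `Cluster`, `clusterWeek`) are hypothetical, not the prototype's real model:

```typescript
// Hypothetical sketch of the weekly view's clustering step: group a week
// of blob snapshots by feeling, summarize each cluster, and sort so the
// dominant pattern comes first. Names and shapes are assumptions.

interface Snapshot {
  feeling: string;   // e.g. "calm", "anxiety"
  intensity: number; // 0..1
  day: number;       // 0 = Monday … 6 = Sunday
}

interface Cluster {
  feeling: string;
  count: number;
  meanIntensity: number;
}

function clusterWeek(snapshots: Snapshot[]): Cluster[] {
  const byFeeling = new Map<string, Snapshot[]>();
  for (const s of snapshots) {
    const group = byFeeling.get(s.feeling) ?? [];
    group.push(s);
    byFeeling.set(s.feeling, group);
  }
  return Array.from(byFeeling.entries())
    .map(([feeling, group]) => ({
      feeling,
      count: group.length,
      meanIntensity:
        group.reduce((sum, s) => sum + s.intensity, 0) / group.length,
    }))
    .sort((a, b) => b.count - a.count); // dominant pattern first
}
```

Sorting by cluster size rather than intensity keeps the label descriptive, not evaluative: the view reports what recurred, not what was "worst."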


How We Built It

We designed in Figma over a single weekend, using Figma Make to prototype core interactions.

Our process followed three phases:

1. Story first. Before we touched a frame, we wrote Nadia and Priya's story end to end. Every screen decision came from asking: does this serve this moment in their story? If it didn't, we cut it.

2. System second. We defined the blob visual language as a three-variable system before designing any screen. Color, size, opacity — locked early, applied consistently. This made every subsequent design decision faster and more intentional.

3. Prototype third. We used Figma Make to build the hug interaction, the watch nudge, and the preferences page. Where possible, we designed screens in Figma first, then handed them to Make with precise prompts. Short, targeted follow-up prompts fixed specific interactions without rebuilding entire flows.

The emotional arc of the app — solo awareness to shared presence to real connection — shaped not just the use cases but the information architecture. You start with yourself. You add people when you're ready. The app never pushes you outward before you're there.


What We Learned

Naming the sense made everything clearer. Once we stopped saying "emotional awareness" and started saying externalized interoception, every design decision sharpened. We knew what the app was doing scientifically, which made it easier to explain what it was doing emotionally.

Ending outside the app is the point. The most important moment in our prototype isn't in the app — it's Priya picking up the phone. We learned that good wellness design doesn't try to contain the user. It gives them what they need to go live their actual life.

Constraints are the design. Our blob has three variables. We kept being tempted to add a fourth — motion, texture, shape. Every time we resisted, the system got stronger. The simplest version of the visual language was always the most legible one.

Personas aren't decoration. Nadia and Priya made every hard decision easier. When we couldn't agree on whether a feature belonged, we asked: would Nadia use this at 11pm in her studio? That question resolved more debates than any design principle.


Challenges We Faced

The surveillance problem. Early in the process, our framing was about sensing others' emotional states. We quickly realized this created an app that could be used to monitor people without their knowledge. The reframe — you only ever share your own blob — solved the ethical problem and made the product more interesting. The app became about radical self-disclosure rather than emotional surveillance.

Scope. At various points this weekend, Sentira was three different apps: a self-tracking tool, a conflict-resolution tool, and a shared-joy platform. We had to keep pulling back to the single thread: Nadia, Priya, one story, one sense. Breadth is easy. A story is hard.

The watch interaction. Getting the watch nudge to feel warm rather than intrusive required more iteration than any other screen. The line between ambient presence and notification anxiety is thin. We landed on one line of copy, one soft blob, and no sound. Restraint was the answer.

Designing for a sense that doesn't have a name yet. Interoception is known. Externalized interoception — a shared, visible version of it — doesn't exist yet. Designing for something speculative meant we couldn't test assumptions against existing mental models. Every interaction had to teach the user what the sense was while also showing them how to use it. The onboarding and the blob key were our solution: teach the language before you ask anyone to speak it.


Built at FigBuild 2026 by Christine, Olivia, and Gabi. Designed in Figma. Prototyped with Figma Make.
