Inspiration

In the heart of Hack the 6ix - where city lights flicker like thoughts and energy pulses through every corner - we imagined something more than just a project. We called it 6ixenality: a fusion of the six senses and the sixteen personalities of MBTI. The name became our blueprint - a creative mashup of intuition and individuality, powered by tech. We didn’t just want to build a device that reacts. We wanted one that responds. One that doesn’t just compute data but captures presence. That can feel your pressure, frame your pose, sense your atmosphere - and echo it back in music and meaning. With Raspberry Pi as our canvas and the idea of identity as our palette, we set out to create a personality mirror that feels as alive as the people who use it.

What it does

In the heart of the “6ix,” where vibes speak louder than words and every glance is a story, we dreamed up something different - a device that doesn’t just see you but feels you. Powered by a Raspberry Pi 4B and a dash of personality magic, 6ixenality is a playful oracle that captures your essence - your pose, your pressure, your presence - and translates it into a symphony of self.

How we built it

Our foundation was the Raspberry Pi 4B, brought to life by the Rainbow HAT - complete with sensors, buttons, LEDs, and a buzzer. The Pi Camera captured users in their natural element, while the GPIO libraries handled inputs, image capture, and display. We wrote Python code that orchestrated the interaction: listening for button presses, sensing touch pressure, capturing camera frames, and gathering environmental context. All of this was packaged into a meaningful request for the Gemini API. Gemini returned personality codes with musical notes, and our system translated them into expressive outputs you could both see and hear. Every layer, from software to hardware, worked like instruments in an orchestra - playing in sync to create an emotionally intelligent experience.
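The "package sensor context into a Gemini request, then parse the reply" step can be sketched roughly as below. This is a minimal illustration, not the actual project code: the function names (`build_gemini_prompt`, `parse_personality`), the JSON schema, and the field names are all assumptions for the sake of the example.

```python
import json

def build_gemini_prompt(pressure: float, temperature: float, pose_hint: str) -> str:
    """Package sensor readings into a single text prompt.
    The schema here is a hypothetical reconstruction, not the real one."""
    context = {
        "touch_pressure": pressure,    # normalized reading from the touch pads
        "ambient_temp_c": temperature, # from the HAT's environmental sensor
        "pose": pose_hint,             # coarse label derived from the camera frame
    }
    return (
        "Given this sensor context, return a JSON object with an MBTI "
        'code and three musical note names, e.g. '
        '{"mbti": "INFP", "notes": ["C4", "E4", "G4"]}.\n'
        f"Context: {json.dumps(context)}"
    )

def parse_personality(reply: str) -> tuple[str, list[str]]:
    """Extract the MBTI code and note list from a JSON-formatted reply."""
    data = json.loads(reply)
    return data["mbti"], data["notes"]
```

In practice the prompt string would be sent to the Gemini API and the reply fed to `parse_personality`, whose output drives the display and buzzer.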

Challenges we ran into

We wrestled with timing and coordination - live camera feeds, simultaneous button inputs, sensor reads - all while keeping the experience smooth and intuitive. Gemini's API required thoughtful packaging of context to extract meaningful responses, not just generic ones. Making the buzzer sing chords with clarity and creating consistent display behavior took fine-tuning. And mapping complex psychological concepts like MBTI types into musical tones pushed us beyond simple data processing - it became about emotion, intuition, and human design. But every obstacle made the final experience more authentic.
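One way to turn an MBTI code into buzzer tones, shown here purely as a hedged sketch (the semitone offsets and the `mbti_chord` helper are invented for illustration, not our actual mapping), is to give each letter a pitch offset and build a major triad from the shifted root, using the standard equal-temperament conversion:

```python
# Hypothetical mapping: each MBTI letter contributes a semitone offset,
# so each of the sixteen types lands on a different chord.
AXIS_OFFSETS = {
    "E": 0, "I": 1,
    "S": 0, "N": 2,
    "T": 0, "F": 3,
    "J": 0, "P": 4,
}

def midi_to_freq(note: int) -> float:
    """Equal-temperament conversion: A4 = MIDI note 69 = 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

def mbti_chord(mbti: str, root: int = 60) -> list[float]:
    """Map a four-letter MBTI code to three buzzer frequencies:
    a major triad (root, major third, fifth) shifted by the letter offsets."""
    shift = sum(AXIS_OFFSETS[letter] for letter in mbti.upper())
    return [midi_to_freq(root + shift + interval) for interval in (0, 4, 7)]
```

For example, "ESTJ" carries no offsets and sounds a plain C-major triad from middle C, while "INFP" is shifted up ten semitones; the frequencies would then be played one at a time on the HAT's buzzer, which can only sound a single tone at once.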

Accomplishments that we're proud of

We made hardware feel human. We built a system where someone could interact with a few simple buttons - and walk away feeling seen. One of our proudest moments was watching someone tap the buttons, hear their chord, read their MBTI, and simply nod in recognition. That connection - from machine to music to meaning - was what we were chasing from the start. Also, the harmony between image capture, pressure sensing, and AI response worked better than we hoped - and watching it all flow in real-time was magic. (In short, we made it work.)

What we learned

We learned how subtle user input can be - and how designing for quiet data (like pressure and posture) changes how you think about interaction. We dove deeper into how APIs like Gemini can be shaped to return not just information, but insight. We explored how to tie together different forms of expression - visual, auditory, and textual - to create an experience that feels alive. Most importantly, we learned how people respond when technology reflects something personal and poetic back at them. (Basically: zero social interaction, and plenty of discovering new hardware.)

What's next for 6ixenality

We’re just beginning. Future plans include emotion recognition through facial expressions, mood-based music generation, and personalized lighting to match your type. We envision public installations - “vibe booths” - where anyone can step in and walk out with their personality symphony. We also aim to make it portable, transforming the next version into a wearable or pocket-sized oracle. Eventually, we plan to run everything offline - predicting personality on-device, ensuring both privacy and closeness.

But 6ixenality isn’t just a hack - it’s a seed for something bigger. We see potential applications in the business and HR landscape, where personality insights could help build stronger teams, foster empathy, and enhance communication. In a world where connection is more digital than ever, 6ixenality offers a way to bring back nuance - helping people not only understand themselves better but also understand each other. Whether it’s onboarding new hires with a personality chord or breaking the ice at events through a shared vibe, the possibilities stretch far beyond the 6ix.
