Healixity: Democratizing Health Understanding

Inspiration 💡

Growing up in a developing nation, I watched healthcare become a luxury rather than a necessity. "Power through it, it will be alright," my mother would say whenever I had a fever, actively discouraging doctor visits or medications. It wasn't neglect - it was survival economics. My parents only set foot in hospitals during emergencies or accidents.

As I've grown older, I've realized something heartbreaking: even when people finally have access to healthcare, the system itself has become so sophisticated that it's incomprehensible. Lab reports filled with numbers that mean nothing to patients. Medical jargon that might as well be a foreign language. People accumulate years of health data yet remain completely lost about what it means for their lives.

I decided to build something that would have helped my family - and millions like us - actually understand their health journey.

What it does

Healixity is a health dashboard that transforms confusing medical complexity into clear, understandable insights. Think of it as your personal health translator.

You can upload any medical document - lab reports, prescriptions, doctor's notes - and the AI immediately explains what it means in plain English. Track your health metrics over time and get personalized insights about your trends. Most importantly, you can chat with an AI assistant that actually knows your complete health history and gives you answers tailored to your unique situation, not generic medical advice.

The goal is simple: make health data accessible to everyone, regardless of their medical background or economic situation.

How I built it 🛠️

I built Healixity with a Next.js frontend for the dashboard and a Go backend server. The real challenge was creating an AI agent system from scratch since I needed custom functionality that existing libraries couldn't provide.

The architecture centers around a custom agentic flow that I designed to work with Sonar API, integrating my own database request functions and similarity search for RAG context. Every component was built with care - from the health data validation logic to the document processing pipeline that can understand medical terminology and explain it in everyday language.
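To make this concrete, here is a minimal sketch of what a hand-rolled tool registry for such an agent might look like in Go. All names here (`Tool`, `Registry`, `get_metrics`) are illustrative stand-ins, not the actual Healixity code:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Tool is one capability the agent can invoke, e.g. a database
// lookup or a similarity search over embedded document chunks.
type Tool struct {
	Name        string
	Description string
	Run         func(args map[string]string) (string, error)
}

// Registry maps tool names to implementations so the agent loop
// can dispatch whatever action the model asks for.
type Registry map[string]Tool

func (r Registry) Dispatch(name string, args map[string]string) (string, error) {
	t, ok := r[name]
	if !ok {
		return "", fmt.Errorf("unknown tool %q", name)
	}
	return t.Run(args)
}

func main() {
	reg := Registry{
		"get_metrics": {
			Name:        "get_metrics",
			Description: "Fetch a stored health metric by name",
			Run: func(args map[string]string) (string, error) {
				// Stand-in for a real database query.
				out, _ := json.Marshal(map[string]string{
					"metric": args["name"], "latest": "5.4 mmol/L",
				})
				return string(out), nil
			},
		},
	}
	res, err := reg.Dispatch("get_metrics", map[string]string{"name": "glucose"})
	fmt.Println(res, err)
}
```

Keeping tools behind a plain map like this means the agent loop never needs to know whether a given action hits the database or the vector search - it just dispatches by name.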

The frontend prioritizes simplicity and clarity, ensuring that complex health insights feel approachable rather than overwhelming.

Challenges I ran into 😤

The biggest technical hurdle was that Sonar API didn't support tool calling - a critical feature I needed since my tools were custom database functions and similarity search for RAG context. I had to architect a completely custom solution using multiple API calls to achieve tool-like functionality while maintaining the conversational flow.
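The pattern can be sketched roughly like this: ask the model to reply with a plain JSON object naming either a tool or a final answer, parse that object, run the tool, append the result to the transcript, and call the model again. The code below is a simplified illustration of that loop, not the production implementation - `callModel` stands in for a real chat-completion request, and the tool and field names are invented:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// Action is what we ask the model to emit as plain JSON, since the
// API has no native tool calling: either a tool request or a final answer.
type Action struct {
	Tool   string            `json:"tool,omitempty"`
	Args   map[string]string `json:"args,omitempty"`
	Answer string            `json:"answer,omitempty"`
}

// ParseAction pulls the first JSON object out of the model's reply.
// Models often wrap JSON in prose or code fences, so we scan for braces.
func ParseAction(reply string) (Action, error) {
	start := strings.Index(reply, "{")
	end := strings.LastIndex(reply, "}")
	var a Action
	if start < 0 || end < start {
		return a, fmt.Errorf("no JSON object in reply")
	}
	err := json.Unmarshal([]byte(reply[start:end+1]), &a)
	return a, err
}

// runAgent is the multi-call loop: each model turn either names a
// tool (we run it and append the result to the transcript) or answers.
func runAgent(callModel func(transcript string) string,
	tools map[string]func(map[string]string) string, query string) string {
	transcript := "User: " + query
	for i := 0; i < 5; i++ { // cap iterations to avoid infinite loops
		a, err := ParseAction(callModel(transcript))
		if err != nil {
			return "sorry, I could not parse the model's reply"
		}
		if a.Answer != "" {
			return a.Answer
		}
		if run, ok := tools[a.Tool]; ok {
			transcript += "\nTool " + a.Tool + " returned: " + run(a.Args)
		} else {
			transcript += "\nError: unknown tool " + a.Tool
		}
	}
	return "gave up after too many tool calls"
}

func main() {
	// Fake model: first turn requests a tool, second turn answers.
	turn := 0
	model := func(t string) string {
		turn++
		if turn == 1 {
			return `{"tool":"lookup","args":{"metric":"hba1c"}}`
		}
		return `{"answer":"Your HbA1c is in the normal range."}`
	}
	tools := map[string]func(map[string]string) string{
		"lookup": func(a map[string]string) string { return a["metric"] + " = 5.2%" },
	}
	fmt.Println(runAgent(model, tools, "Is my HbA1c okay?"))
}
```

The iteration cap and the lenient brace-scanning parser matter in practice: without native tool calling, the model's output format is only a convention, so the loop has to tolerate malformed replies instead of crashing mid-conversation.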

Another major challenge was the lack of embedding support in Sonar for RAG functionality. I ended up integrating OpenAI's text-embedding-3-large model specifically for embeddings while keeping Sonar for the conversational AI. Bridging these two systems while maintaining performance and coherent responses required careful orchestration.

Building reliable health data processing was also demanding - medical documents come in countless formats, and accuracy is non-negotiable when dealing with someone's health information.

Accomplishments that I'm proud of 🏆

I built a fully functional AI health assistant with true agentic capabilities entirely from scratch, in Go - a language without extensive agent frameworks or libraries. This wasn't just about connecting APIs - I created a custom agent architecture that can reason about health data, maintain context across conversations, and provide genuinely personalized insights.

The system can actually read and understand medical documents, explain complex health trends in simple terms, and maintain meaningful conversations about a user's health journey. Most importantly, I created something that would have genuinely helped my own family navigate their health challenges.

Seeing the AI correctly interpret a lab report and explain it in terms that anyone could understand felt like a small victory for healthcare accessibility.

What I learned 📚

This project taught me how to build sophisticated AI agents without relying on pre-built frameworks. I learned to work with models that don't support standard agentic SDKs or tool calling, which forced me to devise custom solutions to complex problems.

I also gained deep appreciation for the challenges in healthcare AI - the responsibility of handling someone's health data correctly, the complexity of medical terminology, and the critical importance of making technology feel approachable rather than intimidating.

Perhaps most importantly, I learned that sometimes the best solutions come from understanding real human problems first, then building the technology to solve them.

What's next for Healixity 🚀

The immediate priority is integrating wearable devices and health monitoring tools with automated hooks to update data as it's measured. This would create a complete health picture that updates in real-time rather than requiring manual data entry.

I'm also planning to expand the document processing capabilities to handle more medical document types and add predictive insights that can help users spot concerning patterns before they become serious problems.

The ultimate vision is making Healixity available globally, especially in developing regions where healthcare understanding is most needed but least accessible. Technology should bridge the healthcare gap, not widen it.
