Inspiration

The idea for DocLess was born out of something deeply personal. One of our teammates grew up watching his mom, a nurse, spend countless hours after long shifts bent over charts and documentation. She had just finished caring for patients all day, yet her evenings were filled with paperwork instead of rest.

Seeing the exhaustion in her eyes — not from caring for people, but from the burden of documentation — sparked a question: What if technology could take that weight off her shoulders?

DocLess is our answer to that question.

What it does

DocLess is a web application that seamlessly integrates AI to streamline medical documentation and patient management.

Patient Registration & Database: Doctors and nurses can register patients into a secure database, storing critical details such as allergies, medications, and medical history.

Personal Dashboard: Each patient gets a dedicated dashboard where healthcare providers can quickly view important information at a glance.

Visit Management: Users can register new visits directly in the platform. During consultations, DocLess uses AI-powered speech recognition to generate a live transcript of the conversation.

Summaries & Editing: After each session, the system automatically produces a concise, structured summary of what was discussed, along with the complete transcript. Doctors can review and edit this output before saving it.

By reducing the manual burden of documentation and offering instant, structured insights, DocLess streamlines hospital workflows and boosts productivity, letting doctors focus on patient care rather than paperwork.

How we built it

DocLess is an AI-powered platform that transforms doctor–patient and nurse–patient conversations into structured, searchable medical records — without the hassle of manual note-taking.

Using Qwen3-8B for natural-language reasoning, DocLess listens to consultations and automatically generates structured summaries: symptoms, vitals, diagnoses, and care plans. Each conversation is converted into a clean, standardized format ready for integration with existing healthcare systems.
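
A minimal sketch of that summarization step, in Python. The field names, prompt wording, and `call_model` stub are illustrative assumptions; in the real pipeline `call_model` would invoke Qwen3-8B through whatever inference server the team runs.

```python
import json

# Fields the structured summary is expected to contain (matching the
# categories named above: symptoms, vitals, diagnoses, care plans).
SUMMARY_FIELDS = ["symptoms", "vitals", "diagnoses", "care_plan"]

PROMPT_TEMPLATE = (
    "You are a medical scribe. From the consultation transcript below, "
    f"return ONLY a JSON object with the keys {SUMMARY_FIELDS}.\n\n"
    "Transcript:\n{transcript}"
)

def call_model(prompt: str) -> str:
    """Stand-in for a real Qwen3-8B call; returns a canned reply here."""
    return json.dumps({
        "symptoms": ["persistent cough", "mild fever"],
        "vitals": {"temperature_c": 38.1, "heart_rate_bpm": 88},
        "diagnoses": ["suspected bronchitis"],
        "care_plan": ["rest", "follow-up in one week"],
    })

def summarize(transcript: str) -> dict:
    """Turn a raw transcript into a validated structured summary."""
    raw = call_model(PROMPT_TEMPLATE.format(transcript=transcript))
    summary = json.loads(raw)
    missing = [k for k in SUMMARY_FIELDS if k not in summary]
    if missing:
        raise ValueError(f"model output missing fields: {missing}")
    return summary
```

Validating the required keys before saving is what makes the output "ready for integration": downstream systems can rely on every record having the same shape.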

On the backend, EmbeddingGemma enables fast, accurate retrieval across a patient’s knowledgebase. Doctors can instantly surface past interactions, track evolving conditions, or review treatment history with semantic search — cutting down time spent hunting through charts.
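
The retrieval side can be sketched as a per-patient store of (note, vector) pairs ranked by cosine similarity. The `embed` function below is a toy bag-of-words stand-in for EmbeddingGemma, and the class and method names are our own illustration, not the actual DocLess API.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words embedding; DocLess would use EmbeddingGemma vectors."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class PatientKnowledgebase:
    """Per-patient store of notes searched by semantic similarity."""

    def __init__(self):
        self.notes = []  # list of (note_text, vector) pairs

    def add(self, note: str) -> None:
        self.notes.append((note, embed(note)))

    def search(self, query: str, k: int = 3) -> list[str]:
        qv = embed(query)
        ranked = sorted(self.notes, key=lambda n: cosine(qv, n[1]), reverse=True)
        return [note for note, _ in ranked[:k]]
```

A real deployment would replace the linear scan with a vector database index, but the ranking logic, embed the query and return the nearest stored notes, is the same.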

Challenges we ran into

LLM integration hurdles: Getting Qwen3-8B to reliably parse messy, real-world conversations into structured formats required multiple prompt iterations and fine-tuning strategies.

Database linking: Ensuring that each patient conversation was correctly tied to the right patient record in the knowledgebase was more complex than expected, especially when testing multiple encounters.

Dummy data generation: Creating realistic, diverse patient conversations and medical histories for testing was difficult — data that was too generic made the model's outputs less meaningful, while overly complex data slowed down development.
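
A seeded template-based generator is one simple way to hit that balance: varied enough to exercise the model, reproducible enough to debug. The value pools and dialogue shape below are hypothetical examples, not the actual test data.

```python
import random

# Small hypothetical pools; real test data would mix many more values.
COMPLAINTS = ["a persistent cough", "lower back pain", "recurring headaches"]
DURATIONS = ["two days", "a week", "almost a month"]
HISTORIES = ["no known allergies", "an allergy to penicillin", "type 2 diabetes"]

def make_dummy_consultation(seed: int) -> str:
    """Generate one reproducible doctor-patient dialogue from a seed."""
    rng = random.Random(seed)  # seeded so each test case can be replayed
    return "\n".join([
        "Doctor: What brings you in today?",
        f"Patient: I've had {rng.choice(COMPLAINTS)} for {rng.choice(DURATIONS)}.",
        "Doctor: Any relevant medical history?",
        f"Patient: I have {rng.choice(HISTORIES)}.",
    ])
```

Seeding keeps a failing test case reproducible, which matters when the thing under test is a non-deterministic model pipeline.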

Accomplishments that we're proud of

Built a working pipeline that converts raw doctor–patient and nurse–patient conversations into structured, medical-grade summaries.

Successfully integrated Qwen3-8B to reason through unstructured dialogue and output clean JSON records.

Designed a patient-specific knowledgebase search using EmbeddingGemma and a vector database, enabling fast recall of past interactions.

Ensured the system can run on-premise, keeping sensitive patient data private while still leveraging advanced AI.

What we learned

How to balance reasoning models with embedding models to handle both structuring data and enabling retrieval.

The importance of chunking and formatting patient conversations so embeddings capture the right context for future queries.
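
One chunking strategy consistent with that lesson is grouping consecutive speaker turns with a one-turn overlap, so each embedded chunk keeps enough surrounding dialogue to stay meaningful on its own. The function below is a sketch of that idea with assumed parameter choices, not the exact scheme DocLess uses.

```python
def chunk_by_turns(transcript: str, turns_per_chunk: int = 4, overlap: int = 1) -> list[str]:
    """Split a transcript into overlapping groups of speaker turns.

    Overlap means a turn at a chunk boundary appears in both neighboring
    chunks, so a question and its answer are never embedded apart.
    """
    turns = [line for line in transcript.splitlines() if line.strip()]
    step = turns_per_chunk - overlap
    chunks = []
    for start in range(0, len(turns), step):
        chunks.append("\n".join(turns[start:start + turns_per_chunk]))
        if start + turns_per_chunk >= len(turns):
            break
    return chunks
```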

That open-weight models like Qwen and Gemma can be both lightweight and powerful, giving us flexibility compared to purely cloud-based APIs.

How critical data privacy and compliance considerations are when working with medical information.

What's next for DocLess

EHR integration: connect structured outputs directly to popular systems like Epic or Cerner.

Voice-first workflows: add a live assistant mode that helps doctors/nurses during the consultation, not just after.

Multi-modal data: incorporate lab results, imaging notes, and vitals alongside conversation text in the patient knowledgebase.

Evaluation with clinicians: test in real-world settings to refine accuracy, usability, and compliance.

Scalability: optimize for hospital-scale deployments where thousands of conversations must be processed daily.
