Inspiration
A blind patient doesn't need a text summary, they need audio. An ADHD patient doesn't need a paragraph, they need a visual map. An ESL patient doesn't need complicated medical jargon, they need understandable vocabulary. We built LinkCare to bridge the gap between a doctor's expertise and a patient's reality.
What it does
LinkCare is a doctor-facing AI tool that uses Anthropic’s Claude Sonnet 4.5 to transform complex medical notes into the exact format a patient needs:
For Visual Thinkers (ADHD): Breaks linear instructions into generated visual flowcharts (using Mermaid.js) that map out daily routines.
For Auditory Learners (Visually Impaired): Replaces visual metaphors with tactile descriptions and reads them aloud via native Text-to-Speech.
For ESL Learners: Flattens complex syntax to CEFR A2 levels while maintaining clinical accuracy.
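Under the hood, each patient profile maps to a different instruction for Claude. The sketch below shows one way this selection could work; the profile keys, prompt wording, and model id are illustrative assumptions, not the actual LinkCare prompts.

```python
# Hypothetical per-profile system prompts (illustrative, not LinkCare's
# actual prompt text).
SYSTEM_PROMPTS = {
    "visual": (
        "Rewrite the doctor's note as a Mermaid flowchart "
        "(```mermaid ... ```) that maps out the patient's routine."
    ),
    "auditory": (
        "Rewrite the doctor's note for text-to-speech: no visual "
        "metaphors, tactile descriptions, short spoken sentences."
    ),
    "esl": (
        "Rewrite the doctor's note at CEFR A2 vocabulary level while "
        "keeping every dosage and instruction clinically accurate."
    ),
}

def build_request(profile: str, note: str) -> dict:
    """Assemble a payload for a Claude Messages API call."""
    if profile not in SYSTEM_PROMPTS:
        raise ValueError(f"unknown patient profile: {profile}")
    return {
        "model": "claude-sonnet-4-5",  # assumed model id
        "max_tokens": 1024,
        "system": SYSTEM_PROMPTS[profile],
        "messages": [{"role": "user", "content": note}],
    }
```

The returned dict matches the keyword arguments of the Anthropic SDK's `client.messages.create(...)`, so the backend can pass it straight through with `**build_request(profile, note)`.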
How we built it
AI Core: Anthropic Claude Sonnet 4.5.
Backend: Python FastAPI.
Frontend: React + Vite + Tailwind CSS.
Visuals: Mermaid.js.
State: Zustand.
Challenges & Accomplishments
The biggest challenge was multimodal consistency: getting Claude to reliably generate valid diagram code and natural language in a single response. We solved this with a robust validation-and-fallback protocol that ensures the app never crashes, even if the API fails or returns malformed output.
What's next
We plan to integrate LinkCare with hospital databases so it can retrieve up-to-date patient data.
Built With
- claude
- mermaid
- python
- react
- tailwind
- typescript
- vite
- zustand