Inspiration
Having narrowed our search for ideas down to the medical field, our team identified amputees and prosthetic users as an often overlooked demographic in the digital health revolution.
The statistics paint a clear picture: 57% of prosthesis users report dissatisfaction with their devices’ comfort, while over 50% experience regular pain during use (Sabina et al., 2022). Access to quality rehab also remains limited: 20% of non-military amputees report unmet rehab needs due to financial constraints (Pasquina et al., 2015). The emotional toll is substantial as well, with 41% of amputees facing an elevated risk of anxiety, depression, and PTSD (Sabina et al., 2022).
With the traditional rehab process requiring multiple hospital visits, a substantial burden is placed on both healthcare resources and family support systems. These challenges are compounded by the increased risk of injury that prosthetic users experience. The need for continuous monitoring, combined with limited availability of specialized physical therapists nationwide, creates a significant gap in care delivery.
This is where BIONIC comes in, addressing these interconnected challenges by leveraging AI technology to provide accessible, personalized rehab support while reducing the burden on healthcare systems and caregivers.
Building the Project
BIONIC was developed using a blend of multimodal AI and modern web technologies. Python powers both the frontend and backend, with NLX Dialog Studio driving our conversational AI interface. The result is a simple website that is easy to use yet highly functional. We first built a system using OpenCV and Google MediaPipe to detect patients and recognize their individual limbs. We then used this model to isolate and label prosthetic limbs based on the distinctive colors and textures that set them apart from biological limbs.
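The limb-detection step can be sketched roughly as follows, assuming MediaPipe's Pose solution and its standard 33-landmark indexing. The limb groupings and function names are illustrative, not BIONIC's actual code.

```python
# Sketch of per-limb landmark extraction with MediaPipe Pose (assumed setup).
# Indices follow MediaPipe's PoseLandmark enum: 11-16 cover shoulders,
# elbows, and wrists; 23-28 cover hips, knees, and ankles.
LIMB_LANDMARKS = {
    "left_arm":  [11, 13, 15],
    "right_arm": [12, 14, 16],
    "left_leg":  [23, 25, 27],
    "right_leg": [24, 26, 28],
}

def limb_points(landmarks, limb):
    """Pick out one limb's landmarks from the full 33-landmark list."""
    return [landmarks[i] for i in LIMB_LANDMARKS[limb]]

def detect_limbs(bgr_frame):
    """Run MediaPipe Pose on one OpenCV frame and split landmarks by limb."""
    import cv2
    import mediapipe as mp
    with mp.solutions.pose.Pose(static_image_mode=True) as pose:
        result = pose.process(cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2RGB))
        if result.pose_landmarks is None:
            return None  # no person detected in this frame
        pts = [(lm.x, lm.y) for lm in result.pose_landmarks.landmark]
        return {limb: limb_points(pts, limb) for limb in LIMB_LANDMARKS}
```

Grouping landmarks by limb up front makes the later "which limb is prosthetic" comparison a matter of scoring four candidate regions rather than raw pixels.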
Next, we ran the OpenCV-processed video feed through a Google Gemini model custom-tuned for prosthetic rehabilitation guidance. The AI draws conclusions from the movement sample, pinpointing a patient’s weak points, suggesting the likely root cause of the issue, and offering personalized advice on how to fix it. This saves the patient the time and money of visiting a doctor for the same guidance. The patient can also speak with our NLX chatbot, which simulates a real call with a hospital: it prompts the user for information about their problem, offers possible solutions, and lets them schedule an appointment with a doctor if the bot deems the issue severe enough.
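The Gemini step might look something like the sketch below, using the `google.generativeai` Python SDK. The model name, prompt wording, and function names are assumptions for illustration, not BIONIC's actual tuning or configuration.

```python
# Hedged sketch of the rehab-feedback request sent to Gemini.

def build_prompt(limb, movement_summary):
    """Compose the guidance prompt from the detected prosthetic limb and a
    short description of the observed movement (format is illustrative)."""
    return (
        f"You are a prosthetic rehabilitation assistant. The patient's "
        f"{limb} is prosthetic. Observed movement: {movement_summary}. "
        "Identify likely weak points, a probable root cause, and suggest "
        "exercises the patient can try at home."
    )

def get_feedback(limb, movement_summary, api_key):
    """Call Gemini with the composed prompt (requires a valid API key)."""
    import google.generativeai as genai
    genai.configure(api_key=api_key)
    model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model name
    return model.generate_content(build_prompt(limb, movement_summary)).text
```

Keeping the prompt builder separate from the network call makes the guidance format easy to iterate on without re-running live video analysis.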
The UI was built with Streamlit, simplifying integration with the backend Python code. JavaScript handled client-side tasks in the browser, displaying the live video feed and handling event requests from various APIs. This comprehensive approach allows BIONIC to understand and respond to multiple types of input, making it more accessible and effective for users with different needs and capabilities. Whether through text input or movement analysis, the system provides consistent, personalized support for prosthetic users throughout their rehab journey.
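A minimal sketch of how a Streamlit page can render processed OpenCV frames is shown below. It assumes OpenCV's default BGR channel order (Streamlit's `st.image` expects RGB); the page layout and capture loop are illustrative, not BIONIC's actual app.

```python
# Sketch of a Streamlit live-video loop over an OpenCV capture.
import numpy as np

def bgr_to_rgb(frame):
    """Reverse the channel order of an H x W x 3 frame (BGR -> RGB)."""
    return frame[:, :, ::-1]

def run_app():
    import cv2
    import streamlit as st
    st.title("BIONIC movement analysis")  # illustrative layout
    placeholder = st.empty()              # one slot, updated per frame
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # ... OpenCV/MediaPipe processing would happen here ...
        placeholder.image(bgr_to_rgb(frame))
    cap.release()
```

Updating a single `st.empty()` placeholder per frame avoids appending a new image element on every rerun, which is one common workaround for Streamlit's page-rerun model.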
Challenges We Faced
One of the biggest challenges we faced was integrating every part of the program. Streamlit’s platform constraints made it hard to wire up complex components like the OpenCV-processed video feed, which meant hours of work in both JavaScript and Python to get the two sides sending information back and forth and accurately displaying the video on the webpage.
We also struggled with the NLX chatbot, as our unfamiliarity with the platform made implementing the bot extremely difficult. Multimodality was hard to implement as well, since the documentation was unclear about how to actually use the features provided.
Another challenge was identifying prosthetic limbs. Although we knew how to identify joints, figuring out which limb was the artificial one was a difficult task. We soon realized we could find the prosthetic limb by comparing the limbs’ appearance, so we used color ranges and binary texture mapping to pick out the odd one out. This let us display only the prosthetic limb, which makes it easier for our Gemini model to interpret the input video and deliver more accurate feedback.
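The odd-limb-out idea can be sketched with numpy alone: score each limb patch by how much of it falls inside a skin-tone color range plus a coarse texture measure, then flag the outlier. The thresholds and scoring below are illustrative stand-ins; BIONIC tuned its own color ranges and binary texture mapping.

```python
# Simplified "odd limb out" heuristic over per-limb image patches.
import numpy as np

def skin_fraction(patch, lo=(80, 40, 20), hi=(255, 200, 170)):
    """Fraction of RGB pixels inside a crude skin-tone box (assumed range)."""
    lo, hi = np.array(lo), np.array(hi)
    mask = np.all((patch >= lo) & (patch <= hi), axis=-1)
    return mask.mean()

def texture_score(patch):
    """Mean absolute horizontal gradient of the grayscale patch: smooth
    prosthetic surfaces tend to score lower than textured skin/clothing."""
    gray = patch.mean(axis=-1)
    return np.abs(np.diff(gray, axis=1)).mean()

def odd_limb_out(patches):
    """Return the limb whose (skin, texture) score sits farthest from the
    average of all four limbs."""
    scores = {k: np.array([skin_fraction(p), texture_score(p)])
              for k, p in patches.items()}
    mean = np.mean(list(scores.values()), axis=0)
    return max(scores, key=lambda k: np.linalg.norm(scores[k] - mean))
```

Because the outlier is chosen relative to the other limbs in the same frame, the heuristic tolerates lighting changes better than a fixed "prosthetic color" threshold would.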
What We Learned
In the process of developing BIONIC, we shed light on an existing care system that has seen little technological advancement, and learned how we can make treatment more accessible while simplifying the process for healthcare workers. Along the way, we combined many technologies to create a unique product that is simple to implement yet has a large impact. Using computer vision, we learned how to detect prosthetic limbs through color and texture identification. We also learned how to integrate multimodal AI tools into a single, cohesive application.
What's Next?
In the future, we aim to extend our platform with more advanced multimodal features that enhance user interaction and improve the healthcare experience. One addition will let users respond by voice: the program will convert spoken language into text, making communication more seamless and intuitive. We also plan to let users upload images directly through the medical-records chat. This will not only store important visual information but also allow our system to analyze the images; by leveraging this data, we can provide more accurate insights and tailored solutions, enhancing users’ overall health management.
Conclusion
BIONIC empowers prosthetics users to take control of their rehabilitation journey through a secure, intuitive health platform. Patients can connect with healthcare providers, share progress through video recordings, and get instant support via AI chat. Whether you're documenting milestones or seeking guidance, BIONIC streamlines the path to recovery while keeping your personal data protected.

