MargaDarshak: Navigational Goggles for the Visually Impaired
Inspiration
MargaDarshak was born from a deep desire to minimize the psychological burden faced by visually impaired individuals, empowering them to navigate the world independently and confidently. By transforming complex outdoor navigation into an intuitive, sound-based experience, we aim to break barriers, foster self-reliance, and restore a sense of freedom for the blind. Our mission is to make the world more accessible, one step at a time.
This inspiration drives every aspect of the project, ensuring that technology serves as a bridge to independence and dignity.
What It Does
MargaDarshak is a navigation system designed to help blind individuals find their way using OpenStreetMap data and real-time audio feedback. By combining live GPS location data with intuitive voice guidance, it provides:
- Step-by-step directions
- Obstacle alerts
- Contextual information about surroundings
This empowers visually impaired users to move confidently and independently, whether they're walking to a destination, avoiding obstacles, or exploring new areas. MargaDarshak turns navigation into an accessible, stress-free experience, making the world more inclusive for the blind.
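To illustrate how step-by-step directions become voice guidance, here is a minimal sketch that turns one routing step into a phrase for text-to-speech. The `step` dict shape (`maneuver`, `distance_m`, `road`) is a hypothetical simplification of an OpenStreetMap routing response, not our exact data model.

```python
def spoken_instruction(step):
    """Turn one routing step into a short phrase for text-to-speech.

    `step` mimics an OpenStreetMap routing step (hypothetical shape):
    {"maneuver": "turn-left", "distance_m": 40, "road": "MG Road"}
    """
    phrases = {
        "turn-left": "turn left",
        "turn-right": "turn right",
        "continue": "continue straight",
    }
    if step["maneuver"] == "arrive":
        return "You have arrived at your destination."
    action = phrases.get(step["maneuver"], "proceed")
    return f"In {step['distance_m']} meters, {action} onto {step['road']}."
```

The resulting string is then handed to the audio feedback system, which reads it aloud as the user approaches each maneuver.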
How We Built It
We built MargaDarshak using a combination of cutting-edge technologies:
- Raspberry Pi: Core processing unit for portability and efficiency.
- OpenStreetMap API: Provides detailed and accurate mapping data.
- Computer Vision with OpenCV: Enables real-time environment analysis.
- Object Detection with MobileNet V3: Trained on the COCO dataset for obstacle detection.
- GPS Integration: Tracks live location for precise navigation.
- Audio Feedback System: Delivers real-time guidance and alerts.
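The last three components meet in a small piece of glue logic: detections from the vision model are translated into directional audio alerts. The sketch below assumes the detector (e.g., MobileNet V3 via OpenCV) has already produced `(label, confidence, x_min, x_max)` tuples; that tuple shape and the one-third frame split are illustrative assumptions, not our exact implementation.

```python
def obstacle_alert(detections, frame_width=640, min_confidence=0.5):
    """Map raw detections to a spoken alert indicating obstacle direction.

    `detections` is a hypothetical list of (label, confidence, x_min, x_max)
    tuples in pixel coordinates, as an SSD/MobileNet-style detector might
    emit. Returns the alert for the most confident obstacle, or None.
    """
    best = None
    for label, conf, x_min, x_max in detections:
        if conf < min_confidence:
            continue
        if best is None or conf > best[1]:
            best = (label, conf, x_min, x_max)
    if best is None:
        return None
    label, _, x_min, x_max = best
    center = (x_min + x_max) / 2
    # Split the frame into thirds: left, ahead, right.
    if center < frame_width / 3:
        side = "on your left"
    elif center > 2 * frame_width / 3:
        side = "on your right"
    else:
        side = "ahead"
    return f"{label} {side}"
```

Keeping this mapping separate from the detector makes it easy to tune the confidence threshold and alert wording without retraining anything.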
Challenges We Ran Into
Inference Speed on Raspberry Pi 4B:
Running real-time object detection and computer vision algorithms on the Raspberry Pi 4B proved too slow for seamless performance.

Stereo Camera Setup with Webcams:
The Raspberry Pi struggled to capture two camera frames simultaneously, making depth estimation unfeasible.

API Key Access for Google Maps:
Access to the Google Maps API was restricted by cost and usage limits, so we relied on OpenStreetMap instead.

Balancing Accuracy and Performance:
Maintaining real-time performance while keeping object detection and navigation accurate was a constant trade-off.
Accomplishments We're Proud Of
Integration of OpenStreetMap API:
Successfully incorporated OpenStreetMap for detailed and accurate mapping.

Real-Time GPS Tracking:
Implemented live location tracking using mobile GPS for precise navigation.

Shortest Path Calculation:
Developed an algorithm to calculate the shortest and safest path to the destination.

Obstacle Detection:
Integrated object detection using MobileNet V3 and OpenCV to identify and alert users about obstacles.

Audio Feedback System:
Created an intuitive audio feedback system for real-time guidance.

Empowering Independence:
Built a system that empowers visually impaired individuals to navigate the world confidently and independently.
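The shortest-path step can be sketched with Dijkstra's algorithm over a weighted graph. This assumes the OpenStreetMap road network has already been converted to an adjacency dict; the node names and distances below are illustrative, and our real graph carries additional safety weights.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over a weighted adjacency dict.

    `graph` maps node -> list of (neighbor, distance_m) edges, e.g. built
    from OpenStreetMap way segments. Returns (total_distance, path), or
    (float("inf"), []) if the goal is unreachable.
    """
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (dist + weight, neighbor, path + [neighbor]))
    return float("inf"), []
```

For example, with `{"A": [("B", 100), ("C", 300)], "B": [("C", 100)], "C": []}`, the route from `A` to `C` goes through `B` at a total of 200 meters rather than taking the direct 300-meter edge.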
What We Learned
- Working in Complex Environments: Adapting to time-constrained, real-world challenges.
- Rapid Project Completion: Prioritization and efficient planning to deliver results in 2 days.
- Team Collaboration: Leveraging diverse skill sets and fostering teamwork.
- Integration of OpenStreetMap API with AI Models: Combining geospatial data with real-time object detection.
- Parallel Processing and Multithreading: Optimizing performance on the Raspberry Pi.
- Problem-Solving and Adaptability: Thinking creatively to overcome hardware and software limitations.
What's Next for MargaDarshak
Upgrade to Coral TPU:
Pair the Raspberry Pi with a Coral Edge TPU accelerator for faster and more efficient real-time inference.

Stereo Vision Setup:
Implement a stereo vision system for accurate depth perception.

Detection of Potholes and Staircases:
Expand object detection to identify potholes, staircases, and elevated obstacles.

More Robust Object Detection:
Train advanced models on larger datasets for improved accuracy.

Enhanced Audio Feedback:
Refine the audio feedback system for more detailed and context-aware guidance.

User-Centric Design:
Conduct user testing to improve usability and accessibility.

Integration with Wearable Devices:
Explore integration with smart glasses or haptic feedback belts for a more immersive experience.

Cloud-Based Updates and Support:
Develop a cloud-based backend for real-time updates and remote support.
By pursuing these advancements, MargaDarshak aims to become an even more powerful and inclusive tool, empowering visually impaired individuals to navigate the world with greater confidence, independence, and safety.