Echo Eyes - Meta Horizon Start Developer Competition 2025
🎯 Project Story
Inspiration
Every day, 2.2 billion people worldwide live with some form of vision impairment, and for many of them, much of the visual world remains out of reach.
As developers, we asked ourselves: What if cutting-edge VR technology could become the "eyes" for those who cannot see?
The inspiration for Echo Eyes came from witnessing the daily struggles of visually impaired individuals. Simple tasks like reading mail, navigating unfamiliar spaces, or identifying objects become monumental challenges without sight. While services like Be My Eyes connect users with human volunteers, they require waiting for availability and involve sharing personal spaces with strangers.
We envisioned a solution that provides instant, private, and always-available visual assistance—and Meta Quest became our perfect canvas. With its advanced passthrough cameras, AI capabilities, and hands-free interaction potential, we realized we could create something truly transformative: an AI companion that sees the world and describes it through voice.
Echo Eyes was born from one simple belief: technology should bridge the gap between ability and disability.
What It Does
Echo Eyes transforms Meta Quest into intelligent "eyes" for the blind and visually impaired.
🎙️ Voice-First Experience
Every interaction is designed for users who cannot see. No visual menus. No complex navigation. Just speak naturally, and Echo Eyes responds.
👆 Microgesture Control
With our latest update, users trigger features through simple thumb movements on their index finger—no need for controllers or large hand motions. Supports both left and right hands for maximum accessibility.
📖 Text Recognition (NEW)
Point at documents, signs, medicine bottles, or any text—Echo Eyes reads it aloud. This feature helps users read mail, prescriptions, product labels, and more.
🌍 Environment Understanding
Real-time AI-powered descriptions of surroundings help users understand their environment, identify objects, and navigate spaces safely.
⚠️ Safety Alerts
Proactive warnings about obstacles, stairs, and potential hazards keep users safe as they move.
Core Philosophy: Zero Learning Curve
Blind users shouldn't need to learn complex interfaces. Echo Eyes works the moment you put on the headset—just listen and gesture.
How We Built It
Technology Stack:
- Platform: Meta Quest 3/3S with Passthrough Camera Access
- AI Engine: Advanced LLM integration for natural language scene understanding
- Input System: OpenXR Microgestures API for hands-free control
- Audio: Spatial audio feedback for immersive, directional guidance
- Development: Unity with Meta XR SDK
Key Technical Achievements:
Camera-to-AI Pipeline
We optimized the passthrough camera feed processing to provide real-time AI analysis with minimal latency—critical for safety alerts.
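A minimal sketch of what we mean, assuming passthrough frames reach Unity as a WebCamTexture (the route Meta's Passthrough Camera API samples use) and a placeholder vision endpoint; `VisionApiUrl` and `DescribeFrame` are illustrative names, not our shipped code:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class CameraToAiPipeline : MonoBehaviour
{
    // Placeholder vision-LLM endpoint; the real service and auth are not shown.
    const string VisionApiUrl = "https://example.com/v1/describe";

    WebCamTexture passthroughFeed;

    void Start()
    {
        // On Quest, the Passthrough Camera API surfaces the RGB camera to Unity
        // through the standard WebCamTexture path (per Meta's samples).
        passthroughFeed = new WebCamTexture();
        passthroughFeed.Play();
    }

    public IEnumerator DescribeFrame(string prompt, System.Action<string> onDescription)
    {
        // Snapshot the current passthrough frame into a readable texture.
        var frame = new Texture2D(passthroughFeed.width, passthroughFeed.height);
        frame.SetPixels(passthroughFeed.GetPixels());
        frame.Apply();

        // JPEG keeps the upload small, which matters for round-trip latency.
        byte[] jpeg = frame.EncodeToJPG(75);
        Destroy(frame);

        var form = new WWWForm();
        form.AddField("prompt", prompt);
        form.AddBinaryData("image", jpeg, "frame.jpg", "image/jpeg");

        using (var request = UnityWebRequest.Post(VisionApiUrl, form))
        {
            yield return request.SendWebRequest();
            if (request.result == UnityWebRequest.Result.Success)
                onDescription(request.downloadHandler.text); // Hand the text to TTS.
        }
    }
}
```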
Microgesture Integration
We were early adopters of Meta's microgesture technology, implementing thumb-tap and thumb-swipe controls that work reliably even for users with limited dexterity.
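As a sketch, routing recognized microgestures to app actions looks roughly like this, assuming the OVRMicrogestureEventSource component and OVRHand.MicrogestureType enum that recent Meta XR Core SDK versions expose (exact names may vary by SDK version, and the gesture-to-action mapping here is illustrative):

```csharp
using UnityEngine;

public class MicrogestureRouter : MonoBehaviour
{
    // One event source per hand, so left- and right-handed users are both covered.
    [SerializeField] OVRMicrogestureEventSource leftHand;
    [SerializeField] OVRMicrogestureEventSource rightHand;

    void Start()
    {
        leftHand.GestureRecognizedEvent.AddListener(OnGesture);
        rightHand.GestureRecognizedEvent.AddListener(OnGesture);
    }

    void OnGesture(OVRHand.MicrogestureType gesture)
    {
        switch (gesture)
        {
            case OVRHand.MicrogestureType.ThumbTap:
                // Describe the current scene.
                break;
            case OVRHand.MicrogestureType.SwipeForward:
                // Read visible text aloud.
                break;
            case OVRHand.MicrogestureType.SwipeBackward:
                // Repeat the last description.
                break;
        }
    }
}
```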
OCR Pipeline
Our text recognition system handles multiple languages, various fonts, and challenging angles—from handwritten notes to small prescription text.
Voice-Centric Architecture
Every system output is designed for audio delivery. We refined our AI prompts to generate descriptions that are concise, clear, and actionable when heard (not read).
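The prompts do much of the accessibility work. Here is a hedged example of the kind of mode-specific system prompts we mean—the wording is illustrative, not our production prompts:

```csharp
using System.Collections.Generic;

// Illustrative mode-specific prompts; production wording differs.
public static class EchoPrompts
{
    public static readonly Dictionary<string, string> ByMode = new Dictionary<string, string>
    {
        ["scene"] =
            "You are the eyes of a blind user. In at most two short spoken " +
            "sentences, describe what is directly ahead, nearest objects first. " +
            "No preamble, no markdown, nothing that only makes sense on a screen.",

        ["ocr"] =
            "Read aloud all legible text in the image, in natural reading order. " +
            "If the text is a label or prescription, state the product name and " +
            "key instructions first. Say 'no readable text' if none is visible.",

        ["hazard"] =
            "Name any obstacle, step, or drop-off in the walking path in one " +
            "short sentence, starting with its direction, e.g. 'Ahead, two steps down.'"
    };
}
```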
Challenges We Ran Into
🎯 Designing Without Visual Feedback
As sighted developers, we had to completely rethink UX. We conducted extensive testing sessions with blind users who reminded us that every "helpful" visual cue we added was useless to them.
🔧 Microgesture Accuracy
Early microgesture implementations had false positives. We implemented gesture confirmation patterns and dead zones to ensure actions only trigger when intended.
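A sketch of the confirmation pattern, under the assumption that raw recognitions arrive as events: a gesture must be recognized twice in quick succession before it counts, and a refractory period (the "dead zone") suppresses echoes after an action fires. The timing constants are illustrative, not our tuned values.

```csharp
public class GestureConfirmer
{
    const float ConfirmWindow = 0.6f;  // Second recognition must land within this window.
    const float DeadZone = 1.0f;       // Ignore everything briefly after firing.

    string pendingGesture;
    float pendingTime;
    float lastFireTime = float.NegativeInfinity;

    // Returns true only when a gesture is confirmed and should trigger an action.
    public bool Confirm(string gesture, float now)
    {
        if (now - lastFireTime < DeadZone) return false;   // Still in the dead zone.

        if (gesture == pendingGesture && now - pendingTime <= ConfirmWindow)
        {
            pendingGesture = null;
            lastFireTime = now;
            return true;                                   // Confirmed: fire.
        }

        pendingGesture = gesture;                          // First sighting: wait.
        pendingTime = now;
        return false;
    }
}
```

In practice the gesture event handler calls this with Time.time, so a single stray recognition never triggers an action on its own.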
⚡ Performance vs. Quality
AI-powered scene understanding is computationally expensive. We had to balance response quality with battery life and heat management—vital for daily use.
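One concrete lever, sketched: throttle how often frames go to the AI, and stretch the interval as the battery drains. The interval values are illustrative, and the real tuning also weighs thermals.

```csharp
using UnityEngine;

public class AnalysisThrottle : MonoBehaviour
{
    const float BaseInterval = 2.0f;        // Seconds between AI requests when healthy.
    const float LowBatteryInterval = 5.0f;  // Back off when the battery runs low.

    float nextAllowedTime;

    // Gate each capture-and-describe request through this check.
    public bool TryAcquire()
    {
        if (Time.time < nextAllowedTime) return false;

        // Unity reports battery level as 0..1, or -1 when unavailable.
        float battery = SystemInfo.batteryLevel;
        float interval = (battery >= 0f && battery < 0.2f) ? LowBatteryInterval
                                                           : BaseInterval;
        nextAllowedTime = Time.time + interval;
        return true;
    }
}
```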
🗣️ Natural Voice Output
AI tends to be verbose. We spent considerable effort training our prompts to generate descriptions that blind users actually want—brief, relevant, and conversational.
Accomplishments That We're Proud Of
✅ Published on Meta Horizon Store
Echo Eyes is live and available, helping real users today.
✅ Positive Community Impact
User reviews highlight how Echo Eyes provides genuine assistance for daily tasks. One user wrote: "It's like having a reliable partner for those of us who need visual assistance."
✅ Accessible to Multiple Communities
Beyond the blind community, we've received feedback from users with dyslexia and color blindness who benefit from our text recognition and description features.
✅ Pioneering Microgesture Accessibility
We're among the first apps to use microgestures specifically for accessibility—proving that hands-free VR interaction can serve those who need it most.
✅ Truly Voice-Operated Design
Our app demonstrates that VR experiences don't require visual interfaces—opening doors for future accessibility applications.
What We Learned
📚 "Accessibility-First" Changes Everything
Designing for blind users made our entire UX cleaner and more intuitive—even sighted users benefit from voice-based interactions.
📚 Users Are the Best Teachers
Every testing session with blind users revealed assumptions we didn't know we had. Their feedback fundamentally shaped Echo Eyes.
📚 Microgestures Are the Future
Low-calorie inputs like thumb taps feel natural and reduce fatigue. This input paradigm will become standard for extended VR use.
📚 AI Needs Human Guidance
Raw AI output isn't helpful—it needs careful prompt engineering to serve specific user needs effectively.
What's Next for Echo Eyes
🚀 Expanded Language Support
More languages for text recognition and voice output to serve global communities.
🚀 Object Memory
"Remember where I put my keys" — persistent object tracking to help users find items in their spaces.
🚀 Navigation Assistance
Turn-by-turn indoor navigation using spatial mapping.
🚀 Community Features
Allow sighted family members to leave voice notes attached to locations or objects.
🚀 Ray-Ban Meta Integration
Bring Echo Eyes technology to everyday glasses form factor.
Our Vision: Echo Eyes isn't just an app—it's the beginning of a world where visual impairment doesn't mean missing out on life's details.
📋 Submission Information
Are you submitting a NEW or UPDATED project?
UPDATED PROJECT
Summary of Significant Updates (Competition Period)
Echo Eyes v2.0 — Major Accessibility Update
During this competition period, Echo Eyes received transformative updates that fundamentally enhance how blind users interact with the application:
🆕 Microgesture Control System
We completely reimagined user interaction by implementing Meta's microgesture technology. Users now trigger all functions through subtle thumb movements on their index finger—no controllers required. This update:
- Supports both left and right hand microgestures for user preference
- Reduces physical effort with "low-calorie" gestures
- Eliminates the need for users to find and hold controllers
- Enables true hands-free operation—critical for users who may be holding a cane or other items
🆕 Text Recognition Feature
Based on extensive user feedback, we added comprehensive OCR capabilities:
- Read printed text from documents, signs, and labels
- Support for multiple languages and fonts
- Handles various angles and lighting conditions
- Reads prescription bottles, mail, product packaging, and more
- Announces text naturally through voice output
🆕 Enhanced Voice Interaction
- Refined AI prompts for more concise, actionable descriptions
- Improved response timing for faster feedback
- Natural conversational tone that doesn't feel robotic
📊 Impact Metrics:
- Reduced interaction effort by ~70% (moving from full hand gestures to microgestures)
- Added entirely new use case (text recognition)
- Maintained strong user satisfaction ratings on Meta Horizon Store
📎 Resources:
- Meta Horizon Store: Echo Eyes
- Developer Update Post: Microgestures + Text Recognition Update
These updates directly address the competition's focus on Hand Interactions innovation while serving our core mission: making the world more accessible for the visually impaired.
🏆 Applicable Award Categories
- Best Lifestyle Experience — Enhancing daily life for visually impaired users
- Best Implementation of Hand Interactions — Pioneering microgesture-based accessibility
- Best Implementation of Passthrough Camera Access with AI — AI-powered visual assistance
- Judge's Choice — Innovative social impact through XR technology
Echo Eyes — See the World Through Sound