Inspiration
People with motor disabilities have a hard time using computers, which are essential these days. We are building a system that lets them select options on screen with their eyes alone, so they can use a computer just as anyone else would.
What it does
1. Vision Quiz
When a question with four options is displayed, the user looks at the correct answer for 2 seconds to select it. The user is then asked to confirm the answer: looking at "Yes" for 2 seconds submits it.
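The dwell-based selection above can be sketched as a small detector: gaze samples stream in at ~30 fps, and an option is selected once the gaze has stayed inside its bounding box for 2 seconds. This is an illustrative sketch, not the project's actual code; the `makeDwellDetector` name and the sample shape `{ x, y, t }` are assumptions.

```javascript
const DWELL_MS = 2000; // look at an option this long to select it

// options: [{ id, x, y, width, height }, ...] — on-screen bounding boxes
function makeDwellDetector(options) {
  let current = null;   // option the gaze is currently inside
  let enteredAt = null; // timestamp (ms) when the gaze entered it

  function hit(gx, gy) {
    return options.find(
      (o) => gx >= o.x && gx <= o.x + o.width &&
             gy >= o.y && gy <= o.y + o.height
    ) || null;
  }

  // Feed one gaze sample { x, y, t }; returns the selected option id
  // once the dwell threshold is reached, else null.
  return function onGaze({ x, y, t }) {
    const opt = hit(x, y);
    if (!opt) { current = null; return null; }           // gaze off all options
    if (!current || current.id !== opt.id) {             // entered a new option
      current = opt;
      enteredAt = t;
      return null;
    }
    if (t - enteredAt >= DWELL_MS) {                     // dwelled long enough
      enteredAt = t;                                     // reset for re-selection
      return opt.id;
    }
    return null;
  };
}
```

The same detector serves both the answer screen and the yes/no confirmation screen, just with different option boxes.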
2. Image Analysis for Advertisements
An advertisement image is displayed for 3 seconds. Afterwards, a heat map is shown that analyzes where the user looked the most.
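One common way to build such a heat map is to bin the 3 seconds of gaze samples into a coarse grid and color cells by sample count. The sketch below shows only that binning step; the function name, cell size, and sample shape are illustrative assumptions, not the project's actual implementation.

```javascript
// Accumulate gaze samples into a grid; the densest cells are the
// "hot" regions rendered in the heat map overlay.
function buildHeatmap(samples, width, height, cell = 40) {
  const cols = Math.ceil(width / cell);
  const rows = Math.ceil(height / cell);
  const grid = Array.from({ length: rows }, () => new Array(cols).fill(0));
  for (const { x, y } of samples) {
    if (x < 0 || y < 0 || x >= width || y >= height) continue; // off-screen
    grid[Math.floor(y / cell)][Math.floor(x / cell)] += 1;
  }
  return grid; // grid[row][col] = number of gaze samples in that cell
}
```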
How we built it
We developed a three-layered architecture to interface between the Eye Tribe device and the web application. Three modules communicate with each other to achieve proper synchronization and response time.
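In this kind of pipeline, the Eye Tribe SDK streams newline-delimited JSON frames over a local TCP socket, and a middle Node layer parses them before fanning them out to browsers over Socket.io. A subtle part is that TCP chunks can split a frame mid-line, so partial lines must be buffered. The sketch below shows only that parsing step under those assumptions; `makeFrameParser` and the frame shape are hypothetical, not the project's actual code.

```javascript
// Parse a newline-delimited JSON stream arriving in arbitrary chunks,
// buffering the trailing partial line until the rest of it arrives.
function makeFrameParser(onFrame) {
  let buffer = '';
  return function onChunk(chunk) {
    buffer += chunk;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep the incomplete trailing line for later
    for (const line of lines) {
      if (!line.trim()) continue;
      onFrame(JSON.parse(line)); // hand a complete gaze frame downstream
    }
  };
}
```

Downstream, each complete frame would be rebroadcast to all connected browsers, e.g. with something like `io.emit('gaze', frame)` in the Socket.io layer.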
Technologies and Hardware
- Eye Tracker Device (Eye-Tribe) : Provides real-time data of user eye-movement on the screen
- Socket.io : Handles real-time streaming data (30 fps) between multiple levels of clients and servers
- NodeJS : Event-driven, asynchronous JavaScript runtime for the backend
- Ngrok : Tunneling service to communicate with remote servers
- Express : Web framework providing MVC structure in NodeJS
- Mongo : NoSQL database
- Web Frontend (Bootstrap) : Responsive grid based frontend framework
- Git : Code Versioning
- Python : Sped up the development process by automating tasks
Challenges we ran into
- Real-time streaming data : dynamic, configurable
- Multiple levels of clients and servers
- Cross-platform compatibilities
- Programming language barriers
- Sockets and TCP connections : Handling these alongside Express became difficult
- Synchronizing eye blinks and other variations
- The volume of data (30 fps) : Enabling browser support for such a high-volume stream
- Sleep : Since we are all humans :p
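The blink and jitter challenge above is often handled by dropping samples the tracker marks invalid (a blink yields no usable gaze point) and smoothing the rest with a short moving average. This is a generic sketch of that approach, not the project's actual filter; the `valid` flag and window size are assumptions.

```javascript
// Drop blink samples and smooth the rest with a moving average,
// so the cursor doesn't jump on blinks or small eye tremors.
function makeSmoother(windowSize = 5) {
  const recent = []; // last few valid gaze points
  return function onSample(sample) {
    if (!sample.valid) return null; // blink or lost tracking: skip
    recent.push(sample);
    if (recent.length > windowSize) recent.shift();
    const n = recent.length;
    return {
      x: recent.reduce((s, p) => s + p.x, 0) / n,
      y: recent.reduce((s, p) => s + p.y, 0) / n,
    };
  };
}
```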
Accomplishments that we're proud of
- Developing an application to make the computer accessible to specially-abled people.
- Implementing a general architecture (three-layered server-client/server-client) to make the development of future applications on this platform easier and faster
What we learned
Asynchronous programming to handle real-time, high-volume data, and managing multiple parallel channels of development.
What's next for iGuide
- Embedding the eye tracker in the computer itself (no separate hardware)
- Creating an accessible API to be integrated with any software
- Making control more granular (support for higher-resolution screens and dynamic coordinate calculation)