Inspiration
Our group was inspired by our upcoming travels to places in Asia where there is a language barrier. Part of the goal of traveling and meeting new people is to learn about other people's cultures and experience new things. However, the difficulty of communicating in foreign countries creates a barrier to visiting such places, which undermines the purpose of these travels. Our goal is to fill this void with a solution that improves communication for travelers.
What it does
Xplore is a cross-platform communication tool built to help people connect with one another regardless of barriers such as language, sight, hearing, and even emotion.
The goal of the project is to give everyone a chance to communicate freely and with confidence. The project features a real-time language translation system and a real-time computer-vision text box showing subtitles generated from the translated speech. The goal is to provide a simple platform that helps users overcome the many language barriers imposed on us.
How We Built It
Xplore was built with Android Studio and the Google Cloud stack.
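As an illustration of the pipeline described above (speech in, translated subtitle out), here is a minimal plain-Java sketch. The `Translator` interface, the `caption` helper, and the stub phrase table are all assumptions for illustration; in the actual app the translation step would be backed by the Google Cloud stack rather than the stub shown here.

```java
import java.util.Map;

public class SubtitlePipeline {
    // Hypothetical translator interface; the real app would back this
    // with a call into the Google Cloud translation service.
    interface Translator {
        String translate(String text, String targetLang);
    }

    // Build the caption shown in the real-time text box:
    // the translated text, tagged with the target language.
    static String caption(Translator t, String speech, String targetLang) {
        return "[" + targetLang + "] " + t.translate(speech, targetLang);
    }

    public static void main(String[] args) {
        // Stub translator with a tiny phrase table for the demo;
        // unknown phrases fall through untranslated.
        Map<String, String> phrases = Map.of(
                "hello|es", "hola",
                "hello|fr", "bonjour");
        Translator stub = (text, lang) ->
                phrases.getOrDefault(text + "|" + lang, text);

        System.out.println(caption(stub, "hello", "es")); // [es] hola
        System.out.println(caption(stub, "hello", "fr")); // [fr] bonjour
    }
}
```

Keeping the translator behind an interface like this makes it easy to swap the cloud-backed implementation for a stub when testing the UI.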
Challenges we ran into
- Expo caused way too many issues, including dependency problems and initial setup
- The Android Studio emulator was slow and often had difficulty running the application
- Implementing the UI with Material UI isn't as easy as plain JS or CSS
- TextToSpeech has some latency when used for long periods of time
- We decided to start with only 3 languages; more can be added easily as we progress

Accomplishments that we're proud of
- Learned how to operate and utilize Android Studio (with a team lead)
- Bringing our idea to reality
- We would still like to implement more features, but within one day we were able to build an MVP
What we learned
- A few of us needed to learn Android Studio and React Native while working on the project
- We started with React Native and learned that its documentation isn't very convenient
- We learned how to work with APIs and how to use them based on our needs
- We learned how to create UI components
What's next for Xplore
There is still lots to do for our project Xplore. We would like to begin by running sentiment analysis on the speech and face of the person speaking. Additionally, we would like to convert the captions back into translated audio, and in the future bring this to AR glasses and earphones or bone-conduction headphones.
With more time we planned to implement text-to-speech for the visually impaired and an audio emotion detection system (tones of voice) for the hearing impaired. We also wanted to integrate our computer vision further so that the application starts and stops translating when the person being spoken with stops speaking.