Inspiration

With our ARSound app we want to help visually impaired people understand, interact with, and navigate the world around them.

What it does

Using CoreML, ARKit, and Estimote Beacons, we augment the world with sound that alerts the user to nearby objects and informs them of their distances.
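A minimal sketch of what such a pipeline might look like in Swift, assuming a Vision + CoreML classification step on each ARKit camera frame followed by a spoken announcement (the model name `ObjectClassifier`, the confidence threshold, and the spoken phrase are placeholders, not the project's actual code):

```swift
import ARKit
import Vision
import AVFoundation

// Hypothetical sketch: classify each camera frame with a CoreML model
// and announce confident detections with synthesized speech.
final class SoundAugmenter: NSObject, ARSessionDelegate {
    let session = ARSession()
    let synthesizer = AVSpeechSynthesizer()

    // "ObjectClassifier" is a placeholder CoreML model name.
    lazy var request: VNCoreMLRequest = {
        let model = try! VNCoreMLModel(for: ObjectClassifier().model)
        return VNCoreMLRequest(model: model) { [weak self] request, _ in
            guard let top = (request.results as? [VNClassificationObservation])?.first,
                  top.confidence > 0.8 else { return }
            self?.announce(top.identifier)
        }
    }()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    // Called by ARKit for every new camera frame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage)
        try? handler.perform([request])
    }

    func announce(_ label: String) {
        synthesizer.speak(AVSpeechUtterance(string: "\(label) ahead"))
    }
}
```

In a real app the announcement would be throttled and enriched with distance, e.g. from beacon ranging or ARKit depth data, rather than spoken on every frame.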

How we built it

With Swift, CoreML, ARKit, Estimote Beacons, and lots of resources from the internet.

Challenges we ran into

  • Helping visually impaired people understand and interact with the world around them
  • Providing blind people with more independence

Accomplishments that we're proud of

Putting this whole project together with no prior knowledge of CoreML and only very basic knowledge of ARKit.

What we learned

  • The difficulties visually impaired people face in their everyday lives
  • Just how useful the CoreML + ARKit combination can be
  • How difficult it is to create a very precise prediction model
  • How difficult it is to design a sound-based user interface aimed at visually impaired people

What's next for ARSound

We'll see :) We definitely want to keep developing it!

Built With

Swift, CoreML, ARKit, Estimote Beacons
