Effortless navigation through intuitive AR.
View shop information on the street using AR technology, including the type of shop, discount information, and more. We foresee this idea replacing Yelp, or being integrated into Yelp or Google Maps.
For the demo, we detect a set of readily available logos.
Technologies used: Google Vision API (logo detection), Google Places API, Apple ARKit 3, Apple SceneKit.
Approach: We adapt some sample code from Apple's official documentation (https://developer.apple.com/documentation/vision/recognizing_objects_in_live_capture). We first feed a camera frame to the Vision API and obtain the label and location of any logo it detects. We then use the Places API to fetch ratings and other useful information about the detected restaurant. Finally, we display that information in the augmented reality environment using ARKit 3; it is overlaid on (or close to) the actual location of the restaurant and remains stable over time. We currently have a working proof of concept, including detection and feature display in the AR environment, without any heavy UI/UX implementation.
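The three steps above can be sketched in Swift roughly as follows. This is a minimal sketch, not the project's actual code: the network wrappers `detectLogos` and `fetchPlaceInfo` are hypothetical stand-ins for the Google Vision (LOGO_DETECTION) and Google Places REST calls, and error handling is omitted.

```swift
import ARKit
import SceneKit

final class LogoARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    // Step 1: take the current camera frame and send it to logo detection.
    func processCurrentFrame() {
        guard let frame = sceneView.session.currentFrame else { return }
        let pixelBuffer = frame.capturedImage  // camera image for this frame
        detectLogos(in: pixelBuffer) { [weak self] name, boundingBox in
            // Step 2: look up ratings etc. for the detected restaurant.
            self?.fetchPlaceInfo(named: name) { info in
                // Step 3: anchor the info near the logo's on-screen position.
                DispatchQueue.main.async {
                    self?.placeLabel(info, near: boundingBox)
                }
            }
        }
    }

    // Hypothetical wrapper around a Google Vision LOGO_DETECTION request.
    func detectLogos(in image: CVPixelBuffer,
                     completion: @escaping (String, CGRect) -> Void) { /* ... */ }

    // Hypothetical wrapper around a Google Places lookup for the place name.
    func fetchPlaceInfo(named name: String,
                        completion: @escaping (String) -> Void) { /* ... */ }

    // Project the 2-D bounding box into the scene via a hit test and attach
    // a text node; using the hit's world transform keeps the label anchored
    // in world space, so it stays put as the camera moves.
    func placeLabel(_ text: String, near box: CGRect) {
        let center = CGPoint(x: box.midX, y: box.midY)
        guard let hit = sceneView.hitTest(center, types: .featurePoint).first
        else { return }
        let node = SCNNode(geometry: SCNText(string: text, extrusionDepth: 0.01))
        node.scale = SCNVector3(0.002, 0.002, 0.002)
        node.simdTransform = hit.worldTransform
        sceneView.scene.rootNode.addChildNode(node)
    }
}
```

The world-space anchoring in step 3 is what gives the "temporally stable" overlay: the node lives in the scene graph, not on the 2-D screen.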
- A race condition between an AVCaptureSession and SceneKit's session made the app not real-time (logo detection itself is real-time); in the end, we did not run an AVCaptureSession at all.
- currentFrame obtained through SceneKit gives a lower-quality image, while AVCapture gives good quality, but we need SceneKit for rendering.
- Improve the UI/UX beyond what the time constraints at CalHacks allowed.
- Allow the user to choose which features are displayed in the AR scene; every individual has different priorities, so this feature is high priority.
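The session conflict noted above can be sidestepped by pulling frames from the ARSession that the ARSCNView already runs, instead of running a second capture session that competes for the camera. A minimal sketch, assuming a delegate attached to the session; `uploadForLogoDetection` is a hypothetical network helper:

```swift
import ARKit

final class FrameGrabber: NSObject, ARSessionDelegate {
    private var lastSent = Date.distantPast

    // ARKit delivers every rendered frame here; throttle uploads so the
    // remote logo-detection request runs at most twice per second.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard Date().timeIntervalSince(lastSent) > 0.5 else { return }
        lastSent = Date()
        let buffer = frame.capturedImage  // camera image for this frame
        uploadForLogoDetection(buffer)
    }

    // Hypothetical wrapper around the logo-detection network request.
    func uploadForLogoDetection(_ buffer: CVPixelBuffer) { /* ... */ }
}
```

Wiring it up is one line (`sceneView.session.delegate = frameGrabber`), and SceneKit keeps exclusive use of the camera.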

