Inspiration

Last December, Gorillaz released an AR concert in Times Square. This event and our experiences as Asian Americans inspired us to enable displaced and niche communities to feel more present in their physical realities.

What it does

ARcFest makes AR experiences more immersive and globally scalable. It uses ESRI’s 3D building model database and Google’s Street View data to instantly localize the user and reconstruct the immediate environment as an occlusion mesh.
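In Unity terms, an occlusion mesh is ordinary building geometry rendered with a depth-only material (one that writes to the depth buffer but outputs no color), so virtual content positioned behind a real building is correctly hidden. A minimal sketch of that idea, assuming a hypothetical `occlusionMaterial` asset whose shader masks out color (e.g. `ColorMask 0` in ShaderLab):

```csharp
using UnityEngine;

// Turns generated building geometry into occluders by swapping every
// renderer under a root object to a depth-only material.
// "occlusionMaterial" is an assumed asset, not part of any SDK.
public class BuildingOccluder : MonoBehaviour
{
    [SerializeField] private Material occlusionMaterial;

    public void MakeOccluders(GameObject buildingsRoot)
    {
        foreach (var renderer in buildingsRoot.GetComponentsInChildren<MeshRenderer>())
        {
            // Buildings still write depth, so AR content behind them is hidden,
            // but the real-world camera feed shows through where they are.
            renderer.sharedMaterial = occlusionMaterial;
        }
    }
}
```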

Users can use ARcFest to congregate at a specific location, interact with an AR holiday experience, and meet other people with the same culture. Together, they can achieve a stronger sense of presence by celebrating values and stories personally important to them.

To demonstrate a possible application of our platform, we developed a Quest Pro Christmas festival AR experience for the MIT iHQ building. Any Americans living in countries that do not celebrate Christmas could use this app to celebrate in person and meet other Americans at the experience location.

We chose to make ARcFest for several reasons:

  • As Asian Americans, many holidays important to our cultures are not publicly acknowledged or celebrated. The same applies to anyone living in a foreign country, such as immigrant workers, college students, and refugees.
  • Beyond ethnicity, many other identities, such as gender and cultural identities, also receive less public acknowledgment and celebration than they deserve.
  • Location-based AR experiences can be even more immersive than they are now, but these technologies aren’t being utilized to their maximum potential.
  • Gathering data on the physical environment is a huge bottleneck in the location-based experience production pipeline. We wanted to challenge ourselves and make these location-based experiences into a more scalable business model.

How we built it

By sneaking into Marriott to work overnight, and a little coding and 3D modeling!

Some of the applications we used are:

  • Unity for designing and animating the location-based experience, programming the localization code, generating buildings for a specific location, and creating occlusion meshes
  • 3DS Max for creating 3D models
  • Mixamo for animating the models
  • After Effects for editing the presentation video

The SDKs we used are:

  • Google Geospatial API to localize the user at runtime in any environment with Street View data
  • ArcGIS SDK to generate 3D models of nearby buildings for both occlusion and development purposes
  • Oculus SDKs to connect Unity to Quest 2 and Quest Pro headsets for experience development
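As a rough illustration of the localization step, a Unity sketch using the ARCore Extensions Geospatial API (this assumes an `AREarthManager` is already configured in the scene; it simply logs the device's global pose once Earth tracking is available, which is where latitude, longitude, and heading come from):

```csharp
using Google.XR.ARCoreExtensions;
using UnityEngine;
using UnityEngine.XR.ARSubsystems;

// Polls the ARCore Geospatial API every frame and logs the device's
// global pose once Earth tracking is established.
public class GeospatialLocalizer : MonoBehaviour
{
    [SerializeField] private AREarthManager earthManager;

    void Update()
    {
        // Earth tracking is only available where Street View / VPS coverage exists.
        if (earthManager.EarthTrackingState != TrackingState.Tracking)
            return;

        GeospatialPose pose = earthManager.CameraGeospatialPose;
        Debug.Log($"lat {pose.Latitude}, lng {pose.Longitude}, heading {pose.Heading}");
    }
}
```

With a pose like this, content and ArcGIS-generated buildings can be anchored at real-world coordinates rather than at arbitrary scene positions.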

Hardware:

  • Google Pixel 2
  • Meta Quest 2
  • Meta Quest Pro

Challenges we ran into

  • Conveying AR concepts in words alone, since certain spatial ideas proved quite difficult to communicate clearly without visuals
  • Most of the team was more experienced in Unreal, and only one developer had Unity experience, so most of the team was using Unity for the first time
  • The Google Geospatial API sample app did not work out of the box
  • The Meta Quest Pro would not connect to Unity for an entire day despite mentor help and making sure everything was set up correctly
  • Finding a place to work overnight to meet the Pencils Up deadline
  • Google ARCore Play Services not being compatible with Meta headsets

Accomplishments that we're proud of

  • Great team communication to pivot our idea into one more focused on presence and inclusion (we originally were going to do an AR wayfinding app)
  • Getting the Geospatial API to finally detect buildings and return the correct latitude, longitude, and heading values
  • Getting the Quest Pro to work and developing the location-based experience with it
  • Utilizing ArcGIS 3D Building Model data to accurately recreate occlusion meshes for the sample experience
  • Scripting interactions on VR headsets to speed up development
  • We all had to learn programs, SDKs, and hardware we had never used before, but we resolved our issues and got the technologies working as intended

What we learned

  • Meta Quest Pro Unity documentation is basically nonexistent
  • Google sample projects never work
  • How to use Oculus camera rigs and controller inputs on Oculus headsets
  • Using the ArcGIS SDK to generate building data
  • Animating and designing content in Unity coming from an Unreal background
  • Techniques for developing location-based experiences
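The controller-input learning above can be sketched with the Oculus Integration's `OVRInput` API. This is a generic example of reading a Quest controller in a scene that contains an `OVRCameraRig`, not our exact interaction script:

```csharp
using UnityEngine;

// Reads Quest controller input through the Oculus Integration's OVRInput API.
// Attach to any GameObject in a scene with an OVRCameraRig.
public class InteractionInput : MonoBehaviour
{
    void Update()
    {
        // Right-hand index trigger, pressed down this frame.
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
        {
            Debug.Log("Trigger pressed, start interaction");
        }
    }
}
```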

What's next for ARcFest

  1. Create a backend database to host experience content and let the app download the relevant data on demand
  2. Design front-end UI to find nearby experiences
  3. Add AR wayfinding to guide users to the destination experience
  4. Interview users and research what holidays, events, and festivals would best benefit the community
  5. Incorporate greater accessibility into the app with text-to-audio and different interaction methods
