Inspiration
I am a student at the Creative Media Industries Institute, where I currently study Video Game Design and Development, so I felt inspired to attempt a VR challenge this time.
What it does
My project was not fully completed; however, I do have a 3D character in a custom Unity scene that I designed and implemented, and he can mirror whatever you say to him in real time.
How I built it
I imported the Oculus Lip Sync package into Unity so that I could feed a 3D model either prerecorded audio clips or live microphone input in real time. I also set up a multi-user online chat system using the AT&T API. My plan was to use the Alexa Skills Kit or Google Dialogflow to build a Natural Language Processing model so that one end of the chat could become a chatbot.
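A minimal sketch of the microphone-driven part of this setup is below. It assumes the Oculus Lip Sync package is already imported and that an OVRLipSyncContext (plus a morph-target component) is attached to the character; the script name and details are illustrative, not the project's actual code. It routes live mic audio into the AudioSource that the lip sync context analyzes.

```csharp
using UnityEngine;

// Hypothetical example: feed live microphone input to an AudioSource so the
// Oculus Lip Sync context on the same GameObject can drive the model's visemes.
[RequireComponent(typeof(AudioSource))]
public class MicLipSyncFeed : MonoBehaviour
{
    void Start()
    {
        AudioSource audioSource = GetComponent<AudioSource>();

        // Record from the default microphone into a looping one-second clip.
        audioSource.clip = Microphone.Start(null, true, 1, 44100);
        audioSource.loop = true;

        // Wait until the microphone has actually started capturing samples.
        while (Microphone.GetPosition(null) <= 0) { }

        // Playing the source gives the lip sync context audio to analyze.
        audioSource.Play();
    }
}
```

In the editor, this script would sit on the same GameObject as the lip sync components so that the analyzed audio maps onto the character's mouth blend shapes.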
Challenges I ran into
I spent many hair-pulling hours trying to get the Unity Lip Sync models to work for the first time.
Accomplishments that I'm proud of
I was able to build on the AT&T Marketplace API and set up a chat feature.
What I learned
I learned a bit more about how AT&T writes its connection logic to move data across many different devices.
What's next for 3DVirtualAssistant
I hope to implement the chatbot logic into my application and set up IBM Watson speech-to-text in Unity.