As a kid, there was a show I watched that I liked almost more than the much-fanfared Pokémon. Its name was Digimon: a tale of a world filled with creatures embodying the highest form of artificial intelligence, or as the show put it, digital monsters! With this year being the 20th anniversary of what struck the first chord of CS in my life, we went ahead and created an AR-based Digimon!
We used Unity for the whole project and Blender to animate the models. We used the Watson API too, but unfortunately it suffered downtime this morning, knocking that functionality out entirely. Oh, and no sleep!
Unity is a hectic environment to deal with, especially when one has experience with more powerful and flexible languages like C++ and Python. Secondly, the Watson API gave us a lot of trouble while we figured out how to call it and use its functions (even though it DIED IN THE END). Lastly, and most importantly, Google ARCore is a completely new platform for us. Figuring out the plane-detection algorithms, movement on those planes, and the varying light conditions were all things we had to deal with.
I'm proud of the fact that I managed to get the TrackedPlane finder working in ARCore, and beyond that, of successfully applying familiar concepts to a brand-new piece of software!
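For anyone curious, the heart of a plane finder like ours looks roughly like this: a minimal sketch against the GoogleARCore Unity SDK of that era, in the style of its HelloAR sample. The prefab name and the `Initialize` call are illustrative assumptions, not our exact code, and later SDK versions renamed `TrackedPlane` to `DetectedPlane`.

```csharp
using System.Collections.Generic;
using GoogleARCore;
using UnityEngine;

// Sketch of an ARCore plane finder (HelloAR-sample style).
public class PlaneFinder : MonoBehaviour
{
    // Prefab that visualizes a detected plane (assumed; assigned in the Inspector).
    public GameObject PlaneVisualizerPrefab;

    // Reused every frame to avoid per-frame allocations.
    private readonly List<TrackedPlane> m_NewPlanes = new List<TrackedPlane>();

    public void Update()
    {
        // Only query planes while ARCore is actively tracking.
        if (Session.Status != SessionStatus.Tracking)
        {
            return;
        }

        // Ask the session for planes detected since the last frame...
        Session.GetTrackables<TrackedPlane>(m_NewPlanes, TrackableQueryFilter.New);

        // ...and spawn a visualizer for each new one. The visualizer component
        // is assumed to keep following the plane's pose as ARCore refines it.
        foreach (TrackedPlane plane in m_NewPlanes)
        {
            GameObject planeObject = Instantiate(
                PlaneVisualizerPrefab, Vector3.zero, Quaternion.identity, transform);
            planeObject.SendMessage("Initialize", plane);
        }
    }
}
```

Once the planes are visualized, placing a Digimon is just a raycast from a screen tap onto one of them.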
I learned a lot more about Unity in general and how to deal with it better. I also learned a great deal about ARCore and the Watson Speech to Text API.
Digivice is something I wish to complete with all my heart. We faced a lot of setbacks that we couldn't handle in under 24 hours, so even though I wasn't able to give Digivice all the functionality I wanted it to have, in the coming weeks it surely will. I believe the next steps are things like feeding, evolution (which is partially implemented on the backend) and, as a reach goal, multiplayer using Photon Networking!

