Inspiration

Market Central's food is dumb. We wanted to make it smart.

What it does

An app that gives Market Central customers a voice: they can leave feedback on the food and see other people's feedback. Users take a picture of their food, and the app tries to identify the dish using existing Sodexo menus together with machine learning. If the app identifies the food incorrectly, the user can find the dish manually by typing its name and selecting from the options that appear. Once the correct food is selected, users can leave a review that is accessible to other Market Central customers. This app will not only help students make smarter choices about the food they eat, but will also indirectly help Sodexo improve its food selection and quality. Sodexo should value this app highly, since it already values student feedback, as evidenced by its Meet and Greets and its boxes for paper feedback slips.
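The manual fallback described above can be sketched as a simple case-insensitive search over the menu. This is an illustrative sketch only, not the app's actual code; the `MenuItem` type and `searchMenu` function are hypothetical names.

```swift
import Foundation

// Hypothetical sketch of the manual fallback: when the classifier
// misidentifies a dish, the user types part of its name and picks
// from the matching menu entries.
struct MenuItem {
    let name: String
}

/// Returns menu items whose names contain the query, case-insensitively,
/// so partial input like "piz" still surfaces "Pepperoni Pizza".
func searchMenu(_ query: String, in menu: [MenuItem]) -> [MenuItem] {
    let trimmed = query.trimmingCharacters(in: .whitespaces).lowercased()
    guard !trimmed.isEmpty else { return [] }
    return menu.filter { $0.name.lowercased().contains(trimmed) }
}

let menu = [
    MenuItem(name: "Pepperoni Pizza"),
    MenuItem(name: "Caesar Salad"),
    MenuItem(name: "Cheese Pizza"),
]
let matches = searchMenu("piz", in: menu)
```

In a real app the menu array would come from Sodexo's published menus rather than being hard-coded.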

How we built it

We used Core ML, Apple's machine learning framework, for the core image-recognition functionality. We coded the iOS app in Xcode using Swift.
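One way to glue the classifier output to the manual-fallback flow is to accept the model's top label only when its confidence clears a threshold. This is a minimal sketch under that assumption, not the team's actual code; `bestGuess` and the 0.6 threshold are illustrative.

```swift
import Foundation

// Illustrative sketch: given classifier output as (label, confidence)
// pairs (as a Core ML image classifier would produce), accept the top
// label only when its confidence clears a threshold; otherwise return
// nil so the UI can fall back to manual menu search.
func bestGuess(from results: [(label: String, confidence: Double)],
               threshold: Double = 0.6) -> String? {
    guard let top = results.max(by: { $0.confidence < $1.confidence }),
          top.confidence >= threshold else {
        return nil // low confidence: let the user search the menu manually
    }
    return top.label
}

let results: [(label: String, confidence: Double)] = [
    ("pizza", 0.82), ("lasagna", 0.11), ("flatbread", 0.07),
]
let guess = bestGuess(from: results)
```

On-device, the `(label, confidence)` pairs would come from running the Core ML model on the user's photo (e.g. via the Vision framework); the thresholding logic stays the same.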

Challenges we ran into

We were all first-time coders in Xcode and Swift.

Accomplishments that we're proud of

We built a solid proof of concept and a decently functioning app.

What we learned

The basics of Swift and Xcode, a little UI/UX design, and fundamental machine learning concepts.

What's next for MarkIT Central

Getting Sodexo's available data on Market Central food, plus cloud integration with a database for the review system. We also want to improve the machine learning algorithm with reinforcement learning and detection of the most significant object in a photo.

Built With

swift, xcode, core-ml


Updates


Post-hackathon, we are discussing a rewards system to encourage students to use the app, and we have started work on building our own dataset of Market Central food photos.
