Inspiration

We were inspired by true events, and also false events (notably the movie The Conjuring).

What it does

EMPL*YMENT simulates the interview process at its worst. The recruiter asks the user interview questions using text-to-speech, and the user can press the space bar to answer. Ultrasonic sensing is used to record the user's distance from the screen and provide immediate feedback. Eye tracking tech is incoming.
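The "immediate feedback" from the ultrasonic distance reading could work something like the sketch below. This is a hypothetical illustration, not the project's actual code; the function name and threshold values are made up for the example.

```javascript
// Hypothetical sketch: map a measured distance (cm) to posture feedback.
// The 30 cm / 80 cm thresholds are illustrative, not the project's values.
function postureFeedback(distanceCm) {
  if (distanceCm < 30) return "Too close. Lean back a little.";
  if (distanceCm > 80) return "Too far. The recruiter can barely see you.";
  return "Good distance. Hold that posture.";
}

console.log(postureFeedback(50)); // "Good distance. Hold that posture."
```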

How we built it

We set up the sensors with an Arduino. We used React and Three.js for the 3D recruiter simulation and built the front-end UI with HTML and CSS. We used Gemini AI for the text-to-speech.
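For the sensor side, the distance math is standard for an HC-SR04-style ultrasonic sensor: the Arduino times the echo pulse, and the pulse width converts to distance using the speed of sound. The sketch below shows that conversion under the assumption of an HC-SR04-style sensor; the function name is ours, not from the project's repo.

```javascript
// Hypothetical sketch of the ultrasonic distance math, assuming an
// HC-SR04-style sensor that reports the echo pulse width in microseconds.
// Sound travels ~0.0343 cm/us; the pulse covers the round trip, so halve it.
function pulseToCm(pulseUs) {
  return (pulseUs * 0.0343) / 2;
}

// A ~2900 us echo corresponds to roughly 50 cm from the screen.
console.log(pulseToCm(2900).toFixed(1)); // "49.7"
```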

Challenges we ran into

We struggled to integrate the eye-tracking technology, which was a bit of a disappointment, but we had fun working on the ultrasonic sensors instead (and we learned a lot while attempting the eye tracking). It was also tough getting the animation to fit correctly on the page, and it took a lot of tweaking.

Accomplishments that we're proud of

We successfully got Krit a job, though she was immediately fired :(

What we learned

A lot about sensors and animation!

What's next for EMPL*YMENT

Eye tracking technology! You WILL be making eye contact with Sharon White for the entirety of your interview.

Note: We also added code to this hardware repo which we kept separate from our main repo! https://github.com/Kriti-Negi/HardwarePortionsPearlHacks
