Inspiration
Recruiters typically spend only a few seconds scanning each resume before deciding whether to move forward with a candidate. Given the volume of applications they receive, they rarely have the time to thoroughly check each candidate’s GitHub and other supporting documents to verify their competency (GitHub activities can be easily manipulated). Verifying the quality of code and ensuring the authenticity of claims is crucial but often impractical. Our solution, Verify, helps recruiters streamline their verification process and make informed hiring decisions with confidence.
What it does
Verify revolutionizes the recruitment process by providing a comprehensive, evidence-based evaluation of candidates' skills and experience, and helps recruiters make informed decisions quickly by validating the authenticity of resumes and other candidate claims.
Our platform analyzes and cross-references data from multiple sources, such as GitHub repositories, to validate the authenticity of resume claims. It goes beyond surface-level information, diving deep into project complexities and coding patterns to generate a holistic view of a candidate's technical abilities. Verify also employs advanced natural language processing to match candidate profiles with job descriptions, offering recruiters a clear, data-driven basis for their hiring decisions. This approach not only saves time but also significantly reduces the risk of hiring based on inflated or inaccurate resume claims.
How we built it
We developed Verify using a robust and scalable tech stack, blending cutting-edge frontend and backend technologies to create a powerful, user-friendly platform.
Our frontend leverages Next.js, React, Tailwind CSS, and Shadcn for a sleek, responsive UI. This combination ensures a visually appealing and intuitive user experience, particularly on individual applicant pages where recruiters can quickly review data. Tailwind's utility-first approach allows for rapid styling and customization, resulting in a polished interface.
The backend is built with FastAPI, chosen for its high performance and asynchronous capabilities. We chose MongoDB as our primary database, utilizing its flexibility for storing varied data structures (Collections) and its powerful vector search capabilities (Atlas Vector Search) for semantic matching. MongoDB also plays a crucial role in our Applicant Tracking System (ATS), storing and retrieving resume text and keywords to power efficient resume analysis and matching.
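As a rough sketch of how the semantic matching on MongoDB could work, here is a hypothetical `$vectorSearch` aggregation pipeline. The index name (`resume_embeddings`) and field names are illustrative assumptions, not our exact schema:

```python
# Hypothetical Atlas Vector Search pipeline for finding resumes
# semantically similar to a job-description embedding.
def build_vector_search_pipeline(query_embedding, limit=10):
    """Build a $vectorSearch aggregation pipeline for MongoDB Atlas."""
    return [
        {
            "$vectorSearch": {
                "index": "resume_embeddings",   # assumed index name
                "path": "embedding",            # field holding the vector
                "queryVector": query_embedding,
                "numCandidates": limit * 10,    # oversample for better recall
                "limit": limit,
            }
        },
        {
            "$project": {
                "name": 1,
                "keywords": 1,
                "score": {"$meta": "vectorSearchScore"},
            }
        },
    ]

# With pymongo this would run as: db.applicants.aggregate(pipeline)
pipeline = build_vector_search_pipeline([0.1, 0.2, 0.3], limit=5)
```

Oversampling `numCandidates` relative to `limit` trades a little latency for noticeably better recall on approximate nearest-neighbor search.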
Authentication is handled through Auth0, implementing features such as Universal Login, Auth0 Actions, and the Management API to ensure a secure and flexible authentication process. Firebase supports our file storage needs, particularly for handling resume PDF uploads and retrievals.
Our analysis leverages the GitHub API to fetch both public and private repositories of applicants. We securely and privately process this data using natural language processing techniques to derive user-specific, job-specific insights, and employ SentenceTransformers for advanced semantic search and matching skills mentioned in resumes with actual work demonstrated in repositories.
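The matching step above boils down to comparing embeddings of resume skills against embeddings of repository evidence. The sketch below shows that logic with precomputed vectors so it is self-contained; in practice the vectors would come from something like `SentenceTransformer(...).encode(texts)`, and the threshold is an illustrative assumption:

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_skills(resume_skills, repo_evidence, threshold=0.7):
    """Pair each resume skill with its best-matching repository evidence.

    resume_skills / repo_evidence: lists of (label, embedding) tuples.
    Returns only the skills whose best match clears the threshold,
    i.e. skills the repositories actually appear to back up.
    """
    verified = {}
    for skill, skill_vec in resume_skills:
        best_label, best_score = None, 0.0
        for label, repo_vec in repo_evidence:
            score = cosine_similarity(skill_vec, repo_vec)
            if score > best_score:
                best_label, best_score = label, score
        if best_score >= threshold:
            verified[skill] = (best_label, best_score)
    return verified
```

Skills that fall below the threshold are simply left unverified rather than flagged as false, since absence of GitHub evidence is not proof a claim is inflated.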
A key component of our system is the integration with Google's Generative AI (Gemini) for in-depth code analysis and insight generation. We utilize the Instructor library (shoutout Jason Liu) in conjunction with Pydantic to structure the output from our language models. This approach was crucial for handling the complex, nested data often encountered in repository analyses, allowing us to maintain consistent data structures throughout our application and enhancing reliability and ease of data manipulation.
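To give a flavor of this, here is a minimal sketch of how Pydantic models can shape nested LLM output for repository analysis. The field names are illustrative, not our exact schema, and the Instructor call is left as a comment since it needs a live Gemini client:

```python
from pydantic import BaseModel

class SkillEvidence(BaseModel):
    skill: str
    file_path: str
    explanation: str

class RepoAnalysis(BaseModel):
    repo_name: str
    complexity_score: int        # e.g. a 1-10 scale (assumed)
    skills: list[SkillEvidence]  # nested structure the LLM must fill

# With Instructor, the call looks something like:
# client = instructor.from_gemini(genai.GenerativeModel("gemini-1.5-flash"))
# analysis = client.create(response_model=RepoAnalysis, messages=[...])

# Validating raw LLM output against the schema catches malformed
# responses before they reach the rest of the pipeline:
raw = {
    "repo_name": "verify-backend",
    "complexity_score": 7,
    "skills": [{"skill": "FastAPI", "file_path": "app/main.py",
                "explanation": "Async endpoints with dependency injection"}],
}
analysis = RepoAnalysis.model_validate(raw)
```

Because validation failures raise immediately, structurally broken LLM responses can be retried rather than silently propagated into the database.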
This comprehensive tech stack enables Verify to deliver fast, accurate, and insightful candidate assessments, transforming the recruitment process for technical roles by providing a data-driven, evidence-based evaluation of candidates' skills and experience.
Challenges we ran into
We faced several challenges during the development of our project, particularly with the frontend and backend integration. Initial issues included problems with connecting the frontend to backend API endpoints, difficulties in displaying the radar chart, and managing resume storage. Crafting an optimal user experience also proved challenging due to time constraints.
Initially, we planned to host our backend API on GCP Cloud Run. While this setup worked at first, it soon became problematic as our API grew more complex. One notable issue occurred when our GCP instance crashed due to a BIOS data file error, which was traced back to a version mismatch in a dependency of the Google AI library (1.65.7 instead of 1.64.0). While debugging this, we discovered a recent GitHub discussion thread where two people got into an argument over the same issue and one threatened to sue the other for defamation, lol.
As we integrated transformer-based embedding models into our API, we encountered additional complications: the build times increased significantly and the builds frequently failed on GCP. To address these inefficiencies and meet our tight deadlines, we decided to switch to hosting the API locally.
Accomplishments that we're proud of
We successfully integrated a diverse tech stack, including FastAPI, MongoDB, Next.js, React, and Google's Generative AI, into a cohesive, powerful, and scalable platform. We also implemented a pipeline that derives context-specific, structured insights from a candidate's GitHub repositories about their coding skills and project experience. Together, these streamline the recruitment process and let recruiters gain a better understanding of a candidate's skills without sacrificing time.
What we learned
Rudraksh: I learned how to model data with Pydantic and how to extract structured data from LLMs. I spent a lot of time tinkering with GCP, teaching myself FastAPI, and brainstorming how to architect our system to allow data to flow where we needed it to go.
Alec: I used Auth0 for the first time and learned to use their Universal Login, the Management API and Actions, Forms and Flows, and how to customize pages with HTML/CSS. I also learned a lot about API integration and LLM usage.
Hamza: During this hackathon, I gained hands-on experience with MongoDB, Auth0, and integrating Next.js with a FastAPI backend. Diving into so many APIs was a new challenge for me, but it significantly enhanced my backend skills. I spent considerable time understanding how to model and manage data effectively, and I now feel more confident in my ability to handle complex backend systems.
Sarah: As the UI/UX designer, I didn't learn any new tools. However, since there was limited reference material for a recruiter-facing job platform, I had to use my imagination to envision a B2B interface. I started by creating low-fidelity wireframes and user flows to communicate my ideas with the front-end developers. I experimented with clean, bento-grid designs and was pleased to receive positive feedback from individuals with recruiting experience. I learned a lot about different LLMs and about libraries used for front-end web development from this hackathon.
What's next for Verify
Intelligent Recommendations: We want to develop a recommendation engine that will suggest alternative job opportunities when a candidate doesn't quite fit the original position. This feature will help recruiters maximize the potential of their candidate pool and help candidates find roles that best match their skills.
Chatbot Assistant: We want to integrate a recruiter-facing AI chatbot that will allow recruiters to make natural language queries like "Find me a candidate who applied within the last month and has expertise in machine learning and cloud computing." This chatbot will streamline the search process, making it more intuitive and efficient.
Predictive Analytics: We think it would be amazing if we could implement predictive models that could forecast a candidate's potential performance and growth within a role, based on their past projects and learning patterns.