Inspiration

Most learning platforms treat everyone the same. We wanted an AI that reasons about where you are before telling you where to go, so we applied a multi-agent architecture to the universal challenge of skill development.

What it does

ImmerseAI builds personalized learning roadmaps based on your skill level, history, and goals. Three specialist AI agents run in parallel — one reads your learning history, one finds the best resources for your level, and one applies formal logic rules to validate prerequisite chains. The orchestrator synthesizes all three into a phased roadmap with curated YouTube resources and a clear next action.

The key differentiator: a logic engine enforces hard educational rules — don't recommend deep learning before you know linear algebra. Every recommendation has a logical justification, not just an LLM guess.
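To illustrate the kind of hard rule the logic engine enforces, here is a minimal Python sketch of prerequisite-chain validation. The topic names and rule table are hypothetical stand-ins; the real system encodes these rules in s(CASP), which also produces a formal justification.

```python
# Illustrative prerequisite table -- the real knowledge base lives in s(CASP).
PREREQUISITES = {
    "deep_learning": ["linear_algebra", "python"],
    "linear_algebra": [],
    "python": [],
}

def missing_prerequisites(topic, completed):
    """Return the prerequisites of `topic` the learner has not yet completed."""
    return [p for p in PREREQUISITES.get(topic, []) if p not in completed]

# A topic is only recommendable when nothing is missing:
# missing_prerequisites("deep_learning", {"python"}) -> ["linear_algebra"]
```

A recommendation passes validation only when the missing list is empty, which is what lets every roadmap step carry a justification rather than an LLM guess.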

How we built it

A root OrchestratorAgent coordinates a ParallelAgent (LearningCrew) that activates three specialists simultaneously:

  • ProfileAgent — reads skill level and learning history from MongoDB Atlas
  • CurriculumAgent — queries YouTube Data API v3 for content matched to user level
  • LogicAgent — runs prerequisite checking via an s(CASP) constraint logic engine

Built on Google ADK with Gemini 2.5 Flash, deployed on Cloud Run, with a React + Vite hybrid chat/roadmap frontend.
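The fan-out/synthesis pattern above can be sketched in plain Python. Here `asyncio.gather` stands in for ADK's ParallelAgent, and the three agent functions are stubs with made-up return values; the real agents call MongoDB Atlas, the YouTube Data API, and the s(CASP) engine.

```python
import asyncio

# Stub specialists -- return shapes are illustrative only.
async def profile_agent(user_id):
    return {"level": "beginner", "history": ["python_basics"]}

async def curriculum_agent(user_id):
    return {"videos": ["intro_to_linear_algebra"]}

async def logic_agent(user_id):
    return {"valid_next": ["linear_algebra"]}

async def orchestrate(user_id):
    # Run all three specialists concurrently, then synthesize one roadmap.
    profile, curriculum, logic = await asyncio.gather(
        profile_agent(user_id), curriculum_agent(user_id), logic_agent(user_id)
    )
    return {
        "level": profile["level"],
        "roadmap": logic["valid_next"],
        "resources": curriculum["videos"],
    }

result = asyncio.run(orchestrate("user-123"))
```

The synthesis step is why the combined output beats any single agent: the roadmap is constrained by the logic agent, scoped by the profile agent, and populated by the curriculum agent.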

Challenges we ran into

  • Parallel agents sharing state through ADK's InvocationContext without race conditions
  • Bridging Python and s(CASP)/Prolog as a callable FunctionTool with fallback
  • Debugging Vertex AI credential scoping on Cloud Run
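The Python-to-s(CASP) bridge with fallback can be sketched as a subprocess call with a static safety net. The `scasp` executable name, its flags, the query format, and the fallback table here are all assumptions for illustration, not the real tool wiring.

```python
import subprocess

# Static fallback used when the solver binary is unavailable (illustrative).
FALLBACK_PREREQS = {"deep_learning": ["linear_algebra"]}

def check_prerequisites(topic):
    """Ask the s(CASP) solver about `topic`'s prerequisites; fall back to a
    static table if the solver is missing or times out."""
    query = f"?- prerequisite(X, {topic})."  # assumed predicate name
    try:
        proc = subprocess.run(
            ["scasp", "/dev/stdin"],  # assumed invocation
            input=query, capture_output=True, text=True, timeout=10,
        )
        return proc.stdout
    except (FileNotFoundError, subprocess.TimeoutExpired):
        # Solver not installed or hung: degrade gracefully.
        return FALLBACK_PREREQS.get(topic, [])
```

Wrapping this function as an ADK FunctionTool lets the LogicAgent call it like any other tool while the rest of the pipeline stays oblivious to whether the answer came from the solver or the fallback.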

Accomplishments that we're proud of

A working parallel multi-agent pipeline where three specialists genuinely collaborate and produce a synthesized output better than any single agent alone. The s(CASP) integration enforces prerequisite logic with a formal proof, which makes our recommendations explainable and reliable.

What we learned

Multi-agent systems are only as good as their state management. Grounding LLM outputs in external sources (a logic engine, a live database, a real API) dramatically reduces hallucination and improves trust for high-stakes recommendations.

What's next for ImmerseAI

  • Peer-to-peer matched learning sessions (group study)
  • Adaptive roadmaps that update based on quiz and course performance
  • Expanded s(CASP) knowledge graph across more domains
  • Mobile app with spaced repetition and daily streaks
  • Improved UI for user login and authentication
  • Enhancements through premium tooling and further parallelization of the agent pipeline

Built With

  • cloud-run
  • fastapi
  • fastmcp
  • gemini-2.5-flash
  • google-adk
  • mongodb-atlas
  • python
  • react
  • scasp
  • vertex-ai
  • vite
  • youtube-data-api