Inspiration

We were inspired by generative world models and wanted to build something in that space. We explored using world models to help teachers create interactive experiences for their students.

What it does

As a teacher, you can prompt our LLM to help you create your world, then edit and refine it through further prompts. Multiple viewing settings let you explore how a student might perceive the world, and you can also load previously created worlds.

As a student, you can load worlds and explore them with an AI guide that tells you what you're looking at. You can also ask our AI questions (via voice) about what you're seeing.

How we built it

  • TypeScript, JavaScript, Node, Blender, Claude, Trae, Vite, Overshoot AI, LiveKit

Challenges we ran into

  • Streaming a non-camera source (our screen/window) to Overshoot's API
  • Getting Claude to generate Blender assets using MCP
  • Using the Blender engine as a backend
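For the first challenge, one common browser-side approach is to capture the screen with the standard `getDisplayMedia` API, which yields the same `MediaStream` type a webcam produces, so downstream consumers can't tell the difference. The sketch below is a hypothetical illustration, not the project's actual code; the Overshoot/LiveKit publishing step is only indicated in comments since we don't know their exact APIs here.

```typescript
// Minimal sketch: ask the browser for a screen/window capture with
// camera-like settings. The returned object uses the standard
// getDisplayMedia constraint shape.
export function screenAsCameraConstraints(fps = 30) {
  return {
    video: { frameRate: { ideal: fps } }, // cap capture rate like a webcam
    audio: false,                          // video-only, as a camera would be
  };
}

// Browser-only usage (requires a user gesture, e.g. a button click):
//
//   const stream = await navigator.mediaDevices.getDisplayMedia(
//     screenAsCameraConstraints()
//   );
//   // `stream` is a MediaStream — the same type getUserMedia returns —
//   // so any SDK expecting a camera track can be handed
//   // stream.getVideoTracks()[0] instead of a real webcam track.
```

This works because WebRTC-style SDKs generally accept any `MediaStreamTrack`, regardless of whether it originated from a camera or a screen capture.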

Accomplishments that we're proud of

  • Overcoming the challenges and making something usable/demo-able in 24h

What we learned

  • How Blender MCP works
  • Hacky ways to trick your computer into thinking its screen is a camera source

What's next for WorldWeaver

  • Better world and asset generation
  • NPCs and AI interactions within the world
