This project renders your provided map and collision-map images as a playable 2D scene.
- `map.html` (PNG data): main map image
- `collision.html` (PNG data): collision mask image
From this folder:

```
python3 -m http.server 8000
```

Open:
The game stays client-side, and a local Python server handles Backboard calls so your API key is never exposed in browser code.
- Install dependencies:

```
python3 -m pip install -r requirements.txt
```

- Set environment variables (do not commit your real key):

```
export BACKBOARD_API_KEY="<your_backboard_api_key>"
export BRAIN_STATE_PATH="./brain_state.json"
```

- Start the brain server:

```
uvicorn brain_server:app --host 127.0.0.1 --port 8100 --reload
```

- 1 assistant per in-game NPC personality (`obama`, `spongebob`, `musician`)
- 1 thread per NPC per save file
- Stored in `brain_state.json` as assistant/thread IDs
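The exact schema of the persisted state is not shown here, but given the description above it might look something like this (all field names are illustrative, not taken from the project; elided IDs are placeholders):

```json
{
  "assistants": {
    "obama": "asst_...",
    "spongebob": "asst_...",
    "musician": "asst_..."
  },
  "threads": {
    "save_slot_1": {
      "obama": "thread_...",
      "spongebob": "thread_...",
      "musician": "thread_..."
    }
  }
}
```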
- `POST /init` creates/loads assistants and save threads
- `POST /step` runs Observe → Decide → Act (tool calls) → Remember
- `/step` accepts structured observation payloads and returns deterministic tool actions (`move_to`, `say`, `sing`, `set_goal`, `set_mood`) plus optional narration.
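As an illustration of how a client could consume these tool actions, here is a minimal dispatcher sketch. The action dict shape (`{"tool": ..., "args": ...}`) and the NPC state fields are assumptions for illustration, not taken from the project's code:

```python
# Minimal sketch: apply the tool actions returned by POST /step to an NPC
# state dict. The action/args shape is an assumption, not the real protocol.

def dispatch_actions(npc, actions):
    """Apply a list of tool actions to an NPC state dict, in order."""
    for action in actions:
        tool = action.get("tool")
        args = action.get("args", {})
        if tool == "move_to":
            npc["target"] = (args["x"], args["y"])          # pathfind toward this tile
        elif tool in ("say", "sing"):
            npc.setdefault("speech", []).append((tool, args["text"]))
        elif tool == "set_goal":
            npc["goal"] = args["goal"]
        elif tool == "set_mood":
            npc["mood"] = args["mood"]
    return npc

# Example usage:
npc = {"id": "musician"}
dispatch_actions(npc, [
    {"tool": "move_to", "args": {"x": 4, "y": 7}},
    {"tool": "sing", "args": {"text": "la la la"}},
    {"tool": "set_mood", "args": {"mood": "cheerful"}},
])
```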
- Move: `WASD` or arrow keys
- Toggle collision debug overlay: `C`
- The collision image is rescaled to match the map image dimensions, then converted into a per-pixel collision mask.
- Non-transparent dark pixels in `collision.html` are treated as blocked.
- Includes 3 wandering NPC agents with random placeholder sprites.
- NPCs use the same collision rules as the player, so they avoid blocked tiles and other actors.
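The rescale-then-threshold step described above can be sketched in plain Python over RGBA tuples (the actual project likely works on canvas/image data; the darkness threshold here is an assumption):

```python
def is_blocked_pixel(rgba, dark_threshold=96):
    """A pixel blocks movement if it is non-transparent and dark.
    The threshold value is an illustrative assumption."""
    r, g, b, a = rgba
    return a > 0 and (r + g + b) / 3 < dark_threshold

def rescale_nearest(grid, new_w, new_h):
    """Nearest-neighbour rescale of a 2D pixel grid to the map's dimensions."""
    old_h, old_w = len(grid), len(grid[0])
    return [
        [grid[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

def build_collision_mask(collision_pixels, map_w, map_h):
    """Rescale the collision image to the map size, then mark blocked pixels."""
    scaled = rescale_nearest(collision_pixels, map_w, map_h)
    return [[is_blocked_pixel(px) for px in row] for row in scaled]

# Example: a 2x2 collision image stretched onto a 4x4 map.
black = (0, 0, 0, 255)   # opaque dark -> blocked
clear = (0, 0, 0, 0)     # transparent -> walkable
mask = build_collision_mask([[black, clear], [clear, black]], 4, 4)
```

The resulting boolean mask can then be shared by the player and the NPC agents, so both follow identical blocking rules.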
`brain_server.py` keeps your Backboard key on the server side and should be run separately from the static map server.