Real-time road quality monitoring system. A drone streams live video; YOLOv8 detects potholes frame-by-frame; results flow to MongoDB (live dashboard), Snowflake (archive), and Solana (immutable certificates). A React dashboard shows the live feed, a danger heatmap, detection logs, blockchain transactions, and a Gemini-powered command interface.
Three processes, three terminals.
```
cd server
./mediamtx mediamtx.yml
```
Download the binary from https://github.com/bluenviron/mediamtx/releases (no compilation needed).
```
cd server
pip install -r requirements.txt
cp .env.example .env   # fill in your credentials
uvicorn main:app --reload
```
Runs at http://localhost:8000. Interactive docs at /docs.
```
cd frontend
npm install --legacy-peer-deps
npm run dev
```
Runs at http://localhost:5173.
| Requirement | Notes |
|---|---|
| Python 3.11+ | Python 3.13 supported |
| Node 18+ | |
| MongoDB Atlas | Set MONGO_URI in .env; whitelist your IP in Atlas Network Access |
| YOLOv8 pothole model | Download from Roboflow Universe → place at server/models_dir/pothole.pt |
| mediamtx binary | Single binary, no install — see above |
| Snowflake account | Optional; archive writes disabled if not configured |
| Solana keypair (devnet) | Optional; certificate writes disabled if keypair not found |
| Gemini API key | Optional; powers natural language /command endpoint |
- `POST /analyze/images` (images + GPS points)
- `POST /analyze/video` (video file + GPS route)

Both follow the same pipeline: YOLOv8 per frame → quality score → MongoDB + Snowflake + Solana → JSON response.
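The per-frame detections are folded into a 0–100 road quality score by `services/scoring.py`. The actual formula isn't reproduced here; as a purely illustrative sketch, one simple approach is a confidence-weighted penalty per detection (the function name and weight below are assumptions, not the repo's code):

```python
def quality_score(detections, penalty_per_pothole=8.0):
    """Illustrative 0-100 road quality score: start from a perfect 100
    and subtract a confidence-weighted penalty for each pothole
    detection. Sketch only; the real services/scoring.py may differ."""
    score = 100.0
    for det in detections:
        score -= penalty_per_pothole * det["confidence"]
    return max(0.0, min(100.0, score))

# Three detections at varying confidence
dets = [{"confidence": 0.9}, {"confidence": 0.6}, {"confidence": 0.5}]
print(quality_score(dets))  # ≈ 84.0
```

Clamping keeps a badly damaged segment from going negative, and a frame with no detections scores a clean 100.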
```
Drone ──RTMP──► mediamtx :1935 ──RTSP──► FastAPI (background task)
                      │
                      └──► HLS :8888 ◄── browser (live video in dashboard)

Drone controller ──WebSocket /stream/{id}/gps──► GPS buffer
                      │
                      ├──► MongoDB (real-time heatmap)
                      ├──► Snowflake (batched every 60s)
                      └──► Solana (one tx at session end)
```
Streaming workflow:
- Frontend clicks START STREAM → `POST /stream/start` returns `rtmp_url`
- Drone pushes to `rtmp://localhost:1935/live/{session_id}`
- Drone controller connects a WebSocket to `/stream/{session_id}/gps` and sends `{lat, lon, ts}` fixes
- Browser pulls HLS video from `/live/{session_id}/index.m3u8` (via Vite proxy → mediamtx :8888)
- Frontend clicks STOP STREAM → `POST /stream/{id}/stop` → teardown + Solana cert
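GPS fixes arrive at WebSocket cadence while video frames arrive much faster, so `services/gps.py` has to estimate a position for each frame timestamp. A minimal sketch, assuming linear interpolation between the two nearest `{lat, lon, ts}` fixes (the function name and edge behavior are assumptions, not the actual `gps.py`):

```python
def interpolate_position(fixes, ts):
    """Linearly interpolate a (lat, lon) position at timestamp `ts`
    from a time-sorted list of {lat, lon, ts} fixes. Timestamps outside
    the fix range clamp to the first/last fix. Illustrative sketch only."""
    if ts <= fixes[0]["ts"]:
        return fixes[0]["lat"], fixes[0]["lon"]
    if ts >= fixes[-1]["ts"]:
        return fixes[-1]["lat"], fixes[-1]["lon"]
    for a, b in zip(fixes, fixes[1:]):
        if a["ts"] <= ts <= b["ts"]:
            f = (ts - a["ts"]) / (b["ts"] - a["ts"])
            return (a["lat"] + f * (b["lat"] - a["lat"]),
                    a["lon"] + f * (b["lon"] - a["lon"]))

fixes = [{"lat": 45.52, "lon": -73.58, "ts": 0.0},
         {"lat": 45.53, "lon": -73.57, "ts": 10.0}]
print(interpolate_position(fixes, 5.0))  # ≈ (45.525, -73.575), the midpoint
```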
Drop any road video at server/demo/demo.mp4. With AUTO_DEMO=true in .env, the server automatically runs the full analysis pipeline on startup — pothole detection, GPS mapping, MongoDB/Snowflake/Solana writes. A fake GPS route through Plateau-Mont-Royal is baked in.
To trigger manually anytime:
```
curl -X POST http://localhost:8000/demo/run
```
To simulate a live stream with a local video:
```
ffmpeg -re -stream_loop -1 \
  -i server/demo/demo.mp4 \
  -c:v libx264 -preset ultrafast -tune zerolatency \
  -c:a aac -f flv \
  rtmp://localhost:1935/live/{session_id}
```
The frontend auto-polls the backend and displays:
| Panel | Data source | Refresh |
|---|---|---|
| Live Feed + controls | `/stream/active`, `/stream/{id}/status` | 1–2s |
| Danger Map (OSMnx SVG) | `/map/graph` (street network), `/telemetry` (drone position), `/detections` | 1s / 2s |
| Detections Log | `/detections` (in-memory, always live) | 2s |
| Blockchain Log | `/blockchain` | 3s |
| Analysis Summary | `/dashboard` | 5s |
| Command (Gemini AI) | `POST /command` → Gemini 2.0 Flash | on submit |
| Status pills | `/health`, `/stream/active` | 5s / 2s |
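The looped ffmpeg feed above only supplies video; for the Danger Map to move during a simulated stream, the GPS WebSocket still needs `{lat, lon, ts}` fixes. A sketch that fabricates fixes along a straight line and prints the JSON messages a drone controller would send to `/stream/{session_id}/gps` (the route endpoints and the one-fix-per-second rate are arbitrary choices for illustration):

```python
import json
import time

def fake_gps_fixes(start, end, n):
    """Yield n JSON-encoded {lat, lon, ts} fixes evenly spaced along a
    straight line between two (lat, lon) points: stand-in telemetry for
    the GPS WebSocket during a simulated stream."""
    t0 = time.time()
    for i in range(n):
        f = i / max(n - 1, 1)
        yield json.dumps({
            "lat": start[0] + f * (end[0] - start[0]),
            "lon": start[1] + f * (end[1] - start[1]),
            "ts": t0 + i,  # one fix per second
        })

# Plateau-Mont-Royal-ish endpoints, purely illustrative
for msg in fake_gps_fixes((45.5200, -73.5800), (45.5260, -73.5730), 5):
    print(msg)
```

Each yielded string is one WebSocket text message; a real controller would send them at the same 1 Hz pace it generates them.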
```
CrashingIntoWalls/
├── server/
│   ├── main.py              FastAPI app + Gemini /command endpoint
│   ├── config.py            Settings (pydantic-settings)
│   ├── models.py            Request/response Pydantic models
│   ├── mediamtx.yml         mediamtx config (RTMP :1935 / RTSP :8554 / HLS :8888)
│   ├── demo/                → place demo.mp4 here
│   ├── routes/
│   │   ├── analyze.py       File upload endpoints
│   │   ├── data.py          Dashboard / heatmap / sessions
│   │   ├── demo.py          Demo video analysis (AUTO_DEMO)
│   │   ├── sim.py           Simulator: /telemetry, /detections, /map/graph
│   │   └── stream.py        Live streaming + GPS WebSocket
│   ├── services/
│   │   ├── vision.py        YOLOv8 wrapper + analyze_single_frame
│   │   ├── gps.py           GPS interpolation
│   │   ├── scoring.py       Road quality scoring (0–100)
│   │   ├── gemini.py        Gemini 2.0 Flash: command interpretation + scene analysis
│   │   ├── simulator.py     DroneRoadSimulator (OSMnx Montreal graph)
│   │   ├── live_detections.py  In-memory detection event store
│   │   └── stream_manager.py   StreamManager singleton + RTSP reader task
│   ├── storage/
│   │   ├── mongo.py         MongoDB Motor (async): live dashboard queries
│   │   ├── snowflake.py     Snowflake threaded batch queue (60s flush)
│   │   └── solana.py        Solana devnet memo transactions
│   ├── models_dir/          → place pothole.pt here
│   ├── output/              Annotated frames saved here
│   ├── setup_snowflake.sql
│   ├── requirements.txt
│   └── .env.example
└── frontend/
    ├── src/
    │   ├── App.jsx          Data fetching + state
    │   └── components/
    │       ├── panels/
    │       │   ├── LiveFeedPanel.jsx    HLS video + stream controls
    │       │   ├── HeatmapPanel.jsx     OSMnx SVG danger map
    │       │   ├── DetectionsPanel.jsx  Detection log
    │       │   ├── BlockchainPanel.jsx  Solana tx log
    │       │   ├── SummaryPanel.jsx     Latest session summary
    │       │   └── CommandPanel.jsx     Gemini AI command input
    │       └── Header.jsx   Status pills + pothole counter
    ├── vite.config.js       Proxy: all routes → :8000; /live → :8888
    └── package.json
```
Key variables in server/.env:
| Variable | Default | Notes |
|---|---|---|
| `MONGO_URI` | `mongodb://localhost:27017` | Atlas URI recommended |
| `MONGO_DB` | `road_quality` | |
| `YOLO_MODEL_PATH` | `models_dir/pothole.pt` | |
| `YOLO_CONFIDENCE` | `0.45` | Detection threshold |
| `AUTO_DEMO` | `false` | Set `true` to run demo video on startup |
| `GEMINI_API_KEY` | (empty) | Enables AI command interpretation |
| `SNOWFLAKE_ACCOUNT` | (empty) | Optional archive backend |
| `SOLANA_KEYPAIR_PATH` | `~/.config/solana/id.json` | Optional certificate backend |
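`server/config.py` loads these with pydantic-settings. As a dependency-free illustration of how the defaults above behave (a stand-in, not the actual `config.py`):

```python
import os
from dataclasses import dataclass, field

@dataclass
class Settings:
    """Plain-stdlib stand-in for the pydantic-settings config in
    server/config.py, with defaults mirroring the table above.
    Each field reads its environment variable at instantiation time."""
    mongo_uri: str = field(
        default_factory=lambda: os.getenv("MONGO_URI", "mongodb://localhost:27017"))
    mongo_db: str = field(
        default_factory=lambda: os.getenv("MONGO_DB", "road_quality"))
    yolo_model_path: str = field(
        default_factory=lambda: os.getenv("YOLO_MODEL_PATH", "models_dir/pothole.pt"))
    yolo_confidence: float = field(
        default_factory=lambda: float(os.getenv("YOLO_CONFIDENCE", "0.45")))
    auto_demo: bool = field(
        default_factory=lambda: os.getenv("AUTO_DEMO", "false").lower() == "true")

print(Settings())
```

The real config also covers the Gemini, Snowflake, and Solana variables; the pattern is the same.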
```
sh -c "$(curl -sSfL https://release.solana.com/stable/install)"
solana-keygen new --outfile ~/.config/solana/id.json
solana airdrop 2 --url devnet
```
View a transaction: https://explorer.solana.com/tx/<SIGNATURE>?cluster=devnet