Every menu, every settings page, every app you have to learn — that's friction between what you want and what you get. DAUB is a rendering layer built for a world where interfaces are generated, not installed.
Not a comparison with other frameworks. Just the design decisions that led somewhere no one else went.
Two files, CDN link, done. No bundler, no framework, no config. Classless CSS means even raw HTML looks considered.
Tactile surfaces, letterpress typography, real textures. 20 theme families, each with character. "Considered" is the design constraint.
JSON-Render spec, MCP server, complexity-routed pipeline, 230+ block RAG library, and llms.txt — designed together. Built for AI to use.
JSON specs that AI can iterate on, validate, visually diff, and render — without a compile step. The spec is the UI.
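The "spec is the UI" idea can be sketched in a few lines. This is a hypothetical spec shape and renderer, not DAUB's actual schema: node types, the `TAGS` mapping, and field names are illustrative assumptions.

```typescript
// Hypothetical shape for a JSON-Render spec; the real DAUB schema may differ.
type SpecNode = {
  type: string;          // e.g. "card", "heading", "button"
  text?: string;         // leaf content
  children?: SpecNode[]; // nested nodes
};

// Map spec node types to HTML tags (illustrative subset).
const TAGS: Record<string, string> = {
  card: "section",
  heading: "h2",
  button: "button",
  text: "p",
};

// Render a spec tree straight to an HTML string: no compile step,
// the spec itself is the interface description.
function render(node: SpecNode): string {
  const tag = TAGS[node.type] ?? "div";
  const inner = node.text ?? (node.children ?? []).map(render).join("");
  return `<${tag}>${inner}</${tag}>`;
}

// An AI can iterate on this object directly: edit, validate, re-render.
const spec: SpecNode = {
  type: "card",
  children: [
    { type: "heading", text: "Expenses" },
    { type: "button", text: "Add expense" },
  ],
};

console.log(render(spec));
// <section><h2>Expenses</h2><button>Add expense</button></section>
```

Because the spec is plain data, an agent can diff two versions of the object as easily as two strings.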
Plug it into any stack — vanilla HTML, React, an MCP client, an AI agent. Infrastructure for intent-to-interface.
Each release builds on the last. The bottom is infrastructure; the top is where it gets interesting.
Classless CSS layer — plain HTML looks good with zero classes. ROADMAP published. Expanded AI docs with llms-compact.txt for token-efficient contexts.
AI outputs structured JSON, DAUB renders live UI. The Playground — describe what you need, see it appear. Multi-LLM pipeline with complexity routing across 6 dimensions.
Remote MCP server: generate_ui, render_spec, validate_spec, get_component_catalog. Any MCP client — Claude, Cursor, custom agents — can generate DAUB interfaces.
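MCP sits on JSON-RPC 2.0, so any client invokes a server tool with a `tools/call` message. The tool names below come from the list above; the argument shapes (`prompt`, `spec`) are assumptions for illustration, not DAUB's documented schemas.

```typescript
// Minimal JSON-RPC 2.0 envelope for an MCP tool invocation.
type ToolCall = {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
};

let nextId = 1;
function toolCall(name: string, args: Record<string, unknown>): ToolCall {
  return {
    jsonrpc: "2.0",
    id: nextId++,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Ask the server to generate a UI, then validate a spec.
// Argument names here are assumed, not taken from DAUB docs.
const generate = toolCall("generate_ui", {
  prompt: "a dashboard for monthly expenses",
});
const validate = toolCall("validate_spec", {
  spec: { type: "card", children: [] },
});

console.log(JSON.stringify(generate, null, 2));
```

A client like Claude or Cursor would send these envelopes over its MCP transport and receive tool results in the matching JSON-RPC responses.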
Prompts scored across 6 dimensions, routed to tiered models with exponential backoff. Figma OAuth import — screenshot capture and layout analysis feed into generation.
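The routing step can be sketched as: score a prompt across six dimensions, average into a tier, and wrap the model call in exponential backoff. The dimension names, tier cutoffs, and tier labels below are all assumptions; only "six dimensions, tiered models, exponential backoff" comes from the text.

```typescript
// Six scoring dimensions, each normalized to 0..1 (names are assumed).
type Scores = {
  layout: number; interactivity: number; dataShape: number;
  ambiguity: number; length: number; novelty: number;
};

// Average the dimensions and pick a model tier (cutoffs are assumed).
function route(s: Scores): "fast" | "balanced" | "frontier" {
  const avg =
    (s.layout + s.interactivity + s.dataShape +
     s.ambiguity + s.length + s.novelty) / 6;
  if (avg < 0.33) return "fast";      // cheap model, simple prompts
  if (avg < 0.66) return "balanced";
  return "frontier";                  // strongest model, hardest prompts
}

// Exponential backoff: the delay doubles after each failed attempt.
async function withBackoff<T>(
  fn: () => Promise<T>, attempts = 4, baseMs = 250,
): Promise<T> {
  for (let i = 0; ; i++) {
    try {
      return await fn();
    } catch (err) {
      if (i + 1 >= attempts) throw err;
      await new Promise((r) => setTimeout(r, baseMs * 2 ** i));
    }
  }
}
```

Under these defaults a failing call retries after 250 ms, 500 ms, then 1000 ms before giving up.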
230+ pre-made blocks across 34 categories with multimodal RAG retrieval. Full QA audit — every block validated and screenshots regenerated.
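At its core, retrieval over a block library is a nearest-neighbour search. This toy sketch ranks blocks by cosine similarity of embeddings; the real multimodal pipeline (text plus screenshot embeddings) is more involved, and every name and vector here is illustrative.

```typescript
// A library entry: id, category, and a precomputed embedding (toy 3-d vectors).
type Block = { id: string; category: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]; na += a[i] ** 2; nb += b[i] ** 2;
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k blocks most similar to the query embedding.
function retrieve(query: number[], library: Block[], k = 3): Block[] {
  return [...library]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}

const library: Block[] = [
  { id: "pricing-table", category: "pricing", embedding: [1, 0, 0] },
  { id: "hero-banner",   category: "hero",    embedding: [0, 1, 0] },
  { id: "stats-grid",    category: "stats",   embedding: [0.9, 0.1, 0] },
];

// A query near the "pricing" direction retrieves pricing-like blocks first.
console.log(retrieve([1, 0.05, 0], library, 2).map((b) => b.id));
```

The retrieved blocks then ground generation, so the model assembles from validated parts instead of inventing markup from scratch.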
Dashrock case study — first production deployment documented. Content integrity guards (mergeSpecFixes, measureTextContent). Theme switcher nudge for new visitors.
Buttons that POST to real endpoints. Forms with submit handlers. Shareable URLs, version history, standalone HTML export. Generated UI becomes functional and outlives the session.
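One way a declarative action on a spec node can become a real network request: translate the descriptor into the `Request` a click handler would dispatch with `fetch()`. The action shape (`method`, `endpoint`, `body`) and the endpoint URL are assumptions; DAUB's actual action schema may differ.

```typescript
// Hypothetical action descriptor attached to a button node.
type Action = { method: "POST" | "GET"; endpoint: string; body?: unknown };

// Build the Request a click handler would pass to fetch().
// Returning it (rather than sending it) keeps the sketch side-effect free.
function toRequest(action: Action): Request {
  return new Request(action.endpoint, {
    method: action.method,
    headers: { "content-type": "application/json" },
    body: action.body ? JSON.stringify(action.body) : undefined,
  });
}

const submitExpense: Action = {
  method: "POST",
  endpoint: "https://example.com/api/expenses", // hypothetical endpoint
  body: { amount: 12.5, category: "lunch" },
};

const req = toRequest(submitExpense);
console.log(req.method, req.url);
```

Because the action lives in the spec as data, it survives export to standalone HTML: the same descriptor can be re-wired to a handler wherever the spec is rendered.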
"I need to track my expenses" becomes a spending dashboard. Context-aware generation — location, time, history inform the UI. Adaptive complexity that reveals depth as you engage.
Multi-agent orchestration: one fetches data, another decides layout, a third handles actions. The device becomes the place where thought meets interface — ambient, anticipatory, always ready.
Describe what you need. The right interface appears — considered, contextual, ready to use.
Cards of nearby places with ratings, ETA, prices, and an order button. Location-aware, time-aware.
Categorized expense dashboard with charts, input form, and monthly trends. Connected to your bank.
Itinerary builder with maps, hotels, flights, day-by-day schedule. Drag to reorder, tap to book.
Editor with AI suggestions, tone controls, company research sidebar, and export to PDF.
Recipes based on what's in your fridge, dietary preferences, and prep time. Tap to start step-by-step mode.
Plant timeline, watering schedule, sunlight tracker, photo journal. Seasonal advice and harvest predictions.
Not an app. A surface where thinking and making converge. Your device becomes quiet infrastructure for intent — always ready, never in the way.
Voice, text, gesture — whatever's natural. No app to find, no menu to learn.
Intent becomes structure. Data arrives, layout resolves. Fitts's law, Hick's law, Miller's law — baked in, not bolted on.
A considered interface materializes. Warm surfaces, honest shadows, proper hierarchy. It looks like someone cared — because the system did.
Act on what you see. The interface served its purpose, then it's gone. Next thought, next surface.
No apps to install. No interfaces to learn. No friction between thought and outcome. This is where DAUB is headed — a rendering layer for software that honours the person using it.