A high-performance 3D rendering engine built with WebGPU and TypeScript, featuring advanced graphics techniques and innovative architectural patterns for efficient GPU synchronization.
- WebGPU-capable browser (Chrome 113+, Edge 113+, or Safari 18+)
- Node.js 16+ for building
```typescript
// Install dependencies
// npm install

// Development server with hot reload
// npm run dev

// Build for production
// npm run build

// Run tests
// npm run test
```

The application will launch in your browser at http://localhost:5173 (Vite default).
```typescript
// Initialize renderer
const adapter = await navigator.gpu.requestAdapter();
if (!adapter) throw new Error("WebGPU is not supported in this browser");
const device = await adapter.requestDevice();
const canvas = document.createElement("canvas");

const renderer = new WebGPURenderer(device, canvas, {
  shadowAtlasSizeX: 2048,
  shadowAtlasSizeY: 2048,
  shadowAtlasSplitX: 2,
  shadowAtlasSplitY: 2,
});

// Create scene
const scene = new Scene();
const camera = new PerspectiveCamera(75, canvas.width / canvas.height);
scene.add(camera);

// Add mesh
const material = await Material.fromTexture("texture.avif");
const geometry = new Geometry(vertices, indices);
const mesh = new Mesh(geometry, material);
scene.add(mesh);

// Render loop
const draw = () => {
  renderer.render(scene, camera);
  requestAnimationFrame(draw);
};
draw();
```

This renderer prioritizes minimal GPU overhead by distinguishing between logical changes and GPU updates:
- Signal-based reactivity detects which properties actually changed
- Computed signals derive dependent values (matrices, buffers, etc.)
- Pull-based evaluation means GPU buffers update only on actual changes
- Custom DI container eliminates verbose device threading through the codebase
```
src/
├── engine/
│   ├── renderer/
│   │   ├── WebGPURenderer.ts   # Main renderer pipeline
│   │   ├── GPUCamera.ts        # Camera GPU representation
│   │   ├── GPULights.ts        # Light management & shadows
│   │   ├── GPUMesh.ts          # Mesh GPU binding
│   │   ├── GPUMaterial.ts      # Material & texture binding
│   │   ├── GPUGeometry.ts      # Geometry GPU buffers
│   │   └── misc/               # Extensions (Skybox, Sun, Debug)
│   ├── camera/                 # Camera implementations
│   ├── light/                  # Light types (Point, Spot, Sun)
│   ├── material/               # PBR material system
│   ├── geometry/               # Geometry primitives
│   ├── controls/               # Camera controls (WASD, Orbit)
│   ├── loaders/                # GLTF loader
│   ├── extensions/             # Sky, Skybox components
│   ├── math/                   # Transform, Vector math
│   ├── Object3D.ts             # Scene graph node
│   └── Scene.ts                # Scene container
├── lib/
│   ├── Signal.ts               # Pull-based reactivity system
│   ├── DI.ts                   # Dependency injection container
│   └── Controls.ts             # GUI controls integration
└── game/                       # Game-specific code
    └── world/                  # Procedural terrain (in dev)
```
The Problem: In traditional renderers, changing a property triggers immediate GPU updates. With hundreds of meshes and lights, this creates excessive GPU submissions.
The Solution: A custom pull-based signal system where:
- Properties are wrapped in `Signal<T>` objects with version tracking
- Computed signals (decorated with `@computed`) automatically recompute only when dependencies change
- GPU updates happen exactly once per frame if the underlying data changed
- Dependencies are tracked at the property level, not the object level
Example:
```typescript
// Light applies rotation → direction changes → GPU buffer updates automatically
export default class SpotLight extends ShadowLight {
  @signal(1) declare decay: number;
  @signal(10) declare innerCone: number;

  @computed(function (this: SpotLight) {
    // Only recomputes when transform.direction or other deps change
    return new Float32Array([
      ...this.baseBuffer,
      ...this.transform.direction, // ← Reactive dependency
      this.decay,
      this.innerCone,
    ]);
  }, ['transform', 'intensity', 'decay', 'innerCone'])
  declare readonly asBuffer: Float32Array;
}
```

GPU Sync Flow:
```
Frame 1: light.position changes
  ↓
Signal version increments
  ↓
Computed `asBuffer` detects dependency change
  ↓
Buffer recomputed on-demand when `.value` accessed
  ↓
GPU buffer only updated if asBuffer value differs
  ↓
Frame 2: light.position unchanged
  ↓
No GPU update
```
Benefits:
- ✅ Automatic dependency tracking
- ✅ Minimal redundant GPU uploads
- ✅ Works across property hierarchies (transform.position → world matrices → camera VP)
- ✅ Type-safe with TypeScript decorators
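The mechanism can be sketched in a few lines. This is a simplified model with hypothetical names (`Sig`, `ComputedSig`), not the engine's actual `Signal.ts`, but it shows the version-tracking and pull-based caching described above:

```typescript
// Minimal pull-based signal sketch (hypothetical, simplified).
class Sig<T> {
  version = 0;
  constructor(private v: T) {}
  get value(): T { return this.v; }
  set value(next: T) {
    // Only a real change bumps the version.
    if (next !== this.v) { this.v = next; this.version++; }
  }
}

// Caches its result; recomputes only when a dependency's version
// changed since the last read (pull-based evaluation).
class ComputedSig<T> {
  private cached!: T;
  private seen: number[] = [];
  recomputeCount = 0;
  constructor(private deps: { version: number }[], private fn: () => T) {}
  get value(): T {
    const versions = this.deps.map((d) => d.version);
    if (versions.some((v, i) => v !== this.seen[i])) {
      this.cached = this.fn();
      this.seen = versions;
      this.recomputeCount++;
    }
    return this.cached;
  }
}

const intensity = new Sig(1.5);
const decay = new Sig(2.5);
const buffer = new ComputedSig([intensity, decay], () =>
  new Float32Array([intensity.value, decay.value]),
);

buffer.value;          // first read computes
buffer.value;          // nothing changed → cached
intensity.value = 2.0; // version bump
buffer.value;          // recomputes exactly once
console.log(buffer.recomputeCount); // 2
```

A GPU-side reader would then compare the computed buffer's version against the last uploaded one and skip the `writeBuffer` call when they match.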
The Problem: WebGPU requires a GPUDevice to create any GPU resource. Passing it through constructor chains is verbose:
```typescript
// Traditional approach – tedious device threading
const renderer = new WebGPURenderer(device, canvas);
const gpuMaterial = new GPUMaterial(material, device);
const gpuMesh = new GPUMesh(mesh, device);
const gpuCamera = new GPUCamera(camera, device);
```

The Solution: A lightweight inversion-of-control container that:
- Registers singleton instances (device, canvas, context)
- Injects dependencies via `@inject()` decorators on lazy-loaded properties
- Resolves circular dependencies with lazy initialization
- Integrates with TypeScript reflection
Example:
```typescript
export default class WebGPURenderer extends DIContainer {
  constructor(device: GPUDevice, canvas: HTMLCanvasElement, options: WebGPURendererOptions) {
    super();
    // Register singletons in container
    this.provide(device, GPUDevice);
    this.provide(canvas, HTMLCanvasElement);
    this.provide(this, WebGPURenderer);
  }
}

// In any GPU class:
export default class GPUMaterial {
  @inject(() => GPUDevice) declare private device: GPUDevice;
  @inject(() => HTMLCanvasElement) declare private canvas: HTMLCanvasElement;

  init() {
    // device and canvas are automatically resolved
    const texture = this.device.createTexture({...});
  }
}
```

Benefits:
- ✅ No constructor parameter drilling
- ✅ Lazy resolution (only when accessed)
- ✅ Works with circular dependencies
- ✅ Testable (swap implementations)
- ✅ Minimal boilerplate compared to full DI frameworks
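A minimal sketch of the underlying idea, assuming a constructor-keyed singleton map (the `Container` class and its method names here are hypothetical; the engine's `DI.ts` adds the decorator layer on top):

```typescript
// Hypothetical, simplified DI container: singletons registered under
// their constructor, resolved lazily on first access.
type Ctor<T> = abstract new (...args: never[]) => T;

class Container {
  private instances = new Map<Function, unknown>();

  provide<T>(instance: T, key: Ctor<T>): void {
    this.instances.set(key, instance);
  }

  // The token is a thunk, so circular module references are tolerated:
  // the class is only dereferenced at lookup time, not at decoration time.
  resolve<T>(key: () => Ctor<T>): T {
    const instance = this.instances.get(key());
    if (instance === undefined) throw new Error("not registered");
    return instance as T;
  }
}

class Device { label = "gpu-device"; }

const container = new Container();
container.provide(new Device(), Device);

// A consumer pulls the device from the container instead of receiving it
// through every constructor in the chain.
const device = container.resolve(() => Device);
console.log(device.label); // "gpu-device"
```

An `@inject()` decorator is then just sugar that replaces the property with a getter calling `resolve` on first access.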
- PBR Pipeline – Metallic/roughness workflow
- Texture Loading – Support for AVIF, PNG, JPEG via `createImageBitmap`
- Material Cloning – Easy material reuse with modifications
- Alpha Blending – Separate pipeline for transparent objects

```typescript
const material = await Material.fromTexture("sand.avif");
const cloned = material.clone();
cloned.roughnessFactor = 0.8;
```

Global illumination multiplier:

```typescript
renderer.ambientLight.set(0.2, 0.2, 0.2);
```

Omnidirectional illumination with exponential distance decay:
```typescript
const light = new PointLight();
light.intensity = 1.5;
light.decay = 2.5;
light.decayStrength = 2.5;
light.maxDistance = 100;
scene.add(light);
```

Directional cone illumination with smooth falloff (inner/outer cone):
```typescript
const light = new SpotLight();
light.intensity = 15;
light.innerCone = 10; // degrees
light.outerCone = 15;
light.spotIntensity = 2;
light.castsShadow = true;
```

Orthographic projection for distant light sources, ideal for sun/moon:
```typescript
const sun = new SunLight();
sun.intensity = 0.8;
sun.castsShadow = true;
sun.shadowOrthoSize = 50;
sun.maxShadowDistance = 150;
```

Features:
- Shadow Atlas – Multiple shadow maps packed into one texture (configurable grid)
- PCF Smoothing – Percentage-closer filtering for soft shadows
- Distance Culling – Shadows disabled when light goes below horizon
- Orthographic Projection – Sun lights use ortho for efficient coverage
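The atlas split amounts to a grid of per-light tile viewports. A sketch of the index-to-tile math for the configuration used in this README (illustrative only; the engine's actual tile assignment may differ):

```typescript
// Map a shadow-casting light's index to its tile in the shadow atlas.
// Illustrative math — not the engine's actual assignment logic.
interface AtlasConfig {
  sizeX: number; sizeY: number;   // atlas resolution in pixels
  splitX: number; splitY: number; // grid of tiles
}

function atlasTile(cfg: AtlasConfig, lightIndex: number) {
  const capacity = cfg.splitX * cfg.splitY;
  if (lightIndex >= capacity) throw new Error("shadow atlas full");
  const tileW = cfg.sizeX / cfg.splitX;
  const tileH = cfg.sizeY / cfg.splitY;
  const col = lightIndex % cfg.splitX;
  const row = Math.floor(lightIndex / cfg.splitX);
  // Pixel-space viewport for this light's depth pre-pass.
  return { x: col * tileW, y: row * tileH, w: tileW, h: tileH };
}

const cfg = { sizeX: 2048, sizeY: 2048, splitX: 2, splitY: 2 };
console.log(atlasTile(cfg, 3)); // { x: 1024, y: 1024, w: 1024, h: 1024 }
```

The same grid math produces the UV offset/scale the fragment shader needs when sampling a light's tile out of the shared texture.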
Configuration:
```typescript
const renderer = new WebGPURenderer(device, canvas, {
  shadowAtlasSizeX: 2048, // Atlas resolution
  shadowAtlasSizeY: 2048,
  shadowAtlasSplitX: 2,   // 2x2 grid = 4 lights can cast shadows
  shadowAtlasSplitY: 2,
});
```

Per-Light Tuning:
```typescript
light.castsShadow = true;
light.shadowSmoothing = 3; // PCF sample count
light.minShadowDistance = 0.1;
light.maxShadowDistance = 150;
```

Cube-mapped background with color adjustment for day/night cycles:
```typescript
const skybox = await Skybox.load("skybox.png"); // Cross-layout: 4x3 tiles
const skyboxExt = renderer.create(SkyboxExtension, skybox, camera);
renderer.addPreRenderExtension(skyboxExt);

// Runtime adjustment
skybox.colorAdjustment.set(0.8, 0.8, 0.8); // Darken for night
```

Procedural Sun Positioning:
- Spherical coordinates (azimuth, polar angle)
- Smooth color transitions from red/orange (horizon) to white (zenith)
- Synchronized lighting: sun visual + directional light intensity
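The spherical positioning above boils down to a spherical-to-Cartesian conversion. The exact convention `Sun` uses is an assumption here (polar measured up from the horizon, azimuth around world Y):

```typescript
// Convert spherical sun coordinates to a unit direction vector.
// The axis convention is an assumption for illustration only.
function sunDirection(polarRad: number, azimuthRad: number) {
  const y = Math.sin(polarRad);     // height above the horizon
  const horiz = Math.cos(polarRad); // length of the ground projection
  return {
    x: horiz * Math.cos(azimuthRad),
    y,
    z: horiz * Math.sin(azimuthRad),
  };
}

// At polar = π/2 the sun sits at the zenith: direction is straight up.
const zenith = sunDirection(Math.PI / 2, 0);
// zenith.y ≈ 1, zenith.x ≈ 0
```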
```typescript
const sun = new Sun();
sun.polarRad = 0.5; // radians from horizon
sun.azimuthRad = 1.2;
sun.color.set(1, 0.95, 0.85);
sun.intensity = 1.0;
sun.glowDeg = 5; // glow radius

const sunLight = new SunLight();
sunLight.color.copy(sun.color); // Keep in sync
sunLight.intensity = maxLight * heightShape;
```

Full Day/Night Cycle Example (test.ts):
```typescript
const daySeconds = 60; // 60-second day

const draw = (time: number) => {
  const phase = (time * 0.001 / daySeconds) % 1.0;
  const azimuth = phase * Math.PI * 2;

  // Compute sun direction rotating around world X
  const dirY = Math.sin(azimuth); // ← Goes below horizon
  const height01 = Math.max(0, dirY);

  // Smooth falloff near horizon
  const shaped = Math.pow(height01, 1.6);

  // Lighting intensity ramps down
  sunLight.intensity = 0.8 * shaped;

  // Color shift: sunset orange → noon white
  sun.color.set(
    1.0,                                // R constant
    0.42 * (1 - shaped) + 1.0 * shaped, // G: orange → white
    0.15 * (1 - shaped) + 1.0 * shaped, // B
  );

  renderer.render(scene, camera);
};
```

Depth-based fog in material shader (configurable in Material.wgsl)
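As a rough model of what depth-based fog does, here is a CPU-side sketch of an exponential fog blend. The formula and parameter names are assumptions; the actual implementation lives in Material.wgsl:

```typescript
// CPU-side sketch of exponential depth fog (assumed formula; the real
// one is in the WGSL material shader).
function applyFog(
  color: number[],    // shaded surface color (RGB)
  fogColor: number[], // atmosphere color (RGB)
  depth: number,      // view-space depth of the fragment
  density: number,    // fog density
): number[] {
  // fogFactor → 1 near the camera, → 0 far away.
  const fogFactor = Math.exp(-density * depth);
  return color.map((c, i) => c * fogFactor + fogColor[i] * (1 - fogFactor));
}

const shaded = [1, 0, 0];    // red surface
const fog = [0.6, 0.7, 0.8]; // bluish haze
applyFog(shaded, fog, 0, 0.02); // depth 0 → unchanged: [1, 0, 0]
```

Distant fragments converge on the fog color, which is why depth-aware fog pairs naturally with the day/night sky adjustments above.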
- Metallic Factor – 0 = dielectric, 1 = metal
- Roughness Factor – 0 = mirror, 1 = diffuse
- Base Color Texture – Albedo mapping
- Normal Mapping – (Ready for extension)
```typescript
const material = await Material.fromTexture("metal.avif");
material.metalicFactor = 0.8;
material.roughnessFactor = 0.2;
```

Terrain generation using Perlin noise-based heightmaps with island detection:
```typescript
const size = 15; // ±15 chunks
const seed = 11;

for (let x = -size; x <= size; x++) {
  for (let z = -size; z <= size; z++) {
    // Locals renamed so they don't shadow the isIsland()/chunk() helpers
    const island = isIsland(x, z, seed);
    const material = island ? sandMaterial : waterMaterial;
    const terrainChunk = chunk(x, z, seed, material);
    scene.add(terrainChunk);
  }
}
```

```typescript
const wasd = renderer.create(WASDControls, camera, canvas);

// Update each frame
wasd.update();

// Properties
wasd.moveSpeed = 0.5;
wasd.rotateSpeed = 0.5;
```

Input:
- W/A/S/D – Move forward/left/back/right
- Left Mouse Drag – Pitch/yaw rotation
- Roll Prevention – Always aligns "up" with world Y
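The roll-prevention bullet above can be sketched as rebuilding the camera basis from the forward vector and world Y each frame. This is an illustrative standalone version; the engine's vector types and control internals differ:

```typescript
// Sketch of roll prevention: the right axis is forced perpendicular to
// world Y, so accumulated mouse input can never tilt the horizon.
type Vec3 = [number, number, number];

const cross = (a: Vec3, b: Vec3): Vec3 => [
  a[1] * b[2] - a[2] * b[1],
  a[2] * b[0] - a[0] * b[2],
  a[0] * b[1] - a[1] * b[0],
];
const normalize = (a: Vec3): Vec3 => {
  const len = Math.hypot(a[0], a[1], a[2]);
  return [a[0] / len, a[1] / len, a[2] / len];
};

function uprightBasis(forward: Vec3) {
  const worldUp: Vec3 = [0, 1, 0];
  const right = normalize(cross(forward, worldUp));
  const up = cross(right, forward); // re-derived, so no roll accumulates
  return { right, up };
}

uprightBasis([0, 0, -1]).right; // [1, 0, 0] for a camera looking down -Z
```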
```typescript
const orbit = renderer.create(OrbitControls, camera, canvas);

// Properties
orbit.target = [0, 0, 0];
orbit.minDistance = 1;
orbit.maxDistance = 100;
orbit.zoomSpeed = 2.0;
orbit.rotateSpeed = 1.0;
```

Input:
- Left Mouse Drag – Rotate around target
- Right Mouse Drag – Pan target
- Mouse Wheel – Zoom in/out
- W Key – Walk forward (in direction of target)
- Persistent State – Saved to localStorage
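One way the `minDistance`/`maxDistance`/`zoomSpeed` properties above can interact is exponential wheel zoom with clamping. This is a hypothetical helper, not `OrbitControls`' actual code:

```typescript
// Hypothetical wheel-zoom step with distance clamping.
function zoom(
  distance: number,   // current distance to orbit target
  wheelDelta: number, // raw wheel delta (±100 per notch assumed)
  speed: number,      // zoomSpeed
  min: number,        // minDistance
  max: number,        // maxDistance
): number {
  // Multiplicative zoom feels uniform at any distance.
  const next = distance * Math.pow(1.1, (wheelDelta / 100) * speed);
  return Math.min(Math.max(next, min), max);
}

zoom(10, 100, 2.0, 1, 100); // one notch out at speed 2 → ≈ 12.1
```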
Vertex Shader:
- Transform model-space position to view-space
- Pass position, normal, UV, view position to fragment
Fragment Shader:
- Sample base color texture
- Evaluate PBR material properties
- For each light:
- Compute light contribution (Lambertian + specular)
- Sample shadow map for occlusion
- Apply PCF smoothing
- Blend ambient + all light contributions
- Apply fog if enabled
Binding Groups:
- Group 0 – Camera (view, projection matrices)
- Group 1 – Mesh (transform matrices)
- Group 2 – Material (texture, sampler, properties)
- Group 3 – Lights (array of lights, shadow maps, info)
- Pre-render pass per light:
  - Use light's projection matrix
  - Render only opaque meshes (alpha-tested separately)
  - Store depth to shadow atlas tile
- Main pass:
  - For each light, sample shadow atlas
  - PCF filter around shadow coordinate
  - Apply smoothstep falloff near shadow edges
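The PCF step in the main pass can be sketched on the CPU as averaging binary depth comparisons over a small kernel. The real filter runs in WGSL; this standalone version just shows the math:

```typescript
// CPU-side PCF sketch: average the binary shadow test over a square
// neighborhood of the shadow-map texel (illustrative, not engine code).
function pcfShadow(
  depthMap: number[][],  // stored light-space depths
  u: number, v: number,  // texel coords of the fragment
  fragmentDepth: number, // fragment depth in light space
  radius: number,        // kernel radius in texels (shadowSmoothing)
): number {
  let lit = 0, samples = 0;
  for (let dy = -radius; dy <= radius; dy++) {
    for (let dx = -radius; dx <= radius; dx++) {
      // Clamp to the tile edge instead of reading out of bounds.
      const row = depthMap[Math.min(Math.max(v + dy, 0), depthMap.length - 1)];
      const stored = row[Math.min(Math.max(u + dx, 0), row.length - 1)];
      if (fragmentDepth <= stored) lit++; // no closer occluder at this sample
      samples++;
    }
  }
  return lit / samples; // 0 = fully shadowed, 1 = fully lit
}

// Fragment on a shadow edge: part of the kernel sees a closer occluder,
// so the result lands between 0 and 1 — a soft penumbra.
const map = [
  [0.2, 0.2, 0.9],
  [0.2, 0.2, 0.9],
  [0.2, 0.2, 0.9],
];
pcfShadow(map, 1, 1, 0.5, 1);
```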
1. Pre-render extensions (Skybox, Sun)
2. Opaque meshes (basic pipeline)
3. Transparent meshes (alpha pipeline, depth read-only)
4. Post-render extensions (Debug visualization)
Extensible rendering pipeline via the `RenderExtension` interface:

```typescript
export interface RenderExtension {
  init(): void;   // Called once after renderer init
  update(): void; // Called each frame before render
  render(pass: GPURenderPassEncoder): void; // Draw to render pass
}
```

Renders background cubemap with inverse-projected view/projection
Renders procedural sun disc and glow with HDR falloff
Visualizes debug geometry (light cones, shadow frustums, camera projection)
Example Usage:
```typescript
const debug = renderer.create(GPUDebug);
debug.enabled = true;
renderer.addPostRenderExtension(debug);
```

Base reactive value wrapper:
```typescript
@signal(0.5)
declare roughness: number;

// Under the hood creates:
// roughness$ = new Signal(0.5)
// Getter: () => roughness$.value
// Setter: (v) => roughness$.value = v
```

Derived value that auto-updates when dependencies change:
```typescript
@computed((position, rotation) => {
  return position.add(rotation.getForward());
}, ['position', 'rotation'])
declare readonly worldPosition: Vector;
```

Optimization: the value is cached and recomputed only if dependency versions changed.
One-time read that detects changes:

```typescript
const reader = SignalReader.of(light, 'intensity');

// Each frame
const newIntensity = reader.read(); // Returns a value only if it changed
if (newIntensity !== undefined) {
  updateGPUBuffer(newIntensity);
}
```

Multi-level dependency chain:
```
Camera.fov (signal) → Signal change
  ↓
Camera.projMatrix (computed)
  ↓
Camera.viewProjectionMatrix (computed)
  ↓
Camera.inverseViewProjectionMatrix (computed)
  ↓
Skybox needs reproject
```
With signals: Only update GPU buffer once if fov changed, cascading through the dependency tree automatically.
- Batching – Multiple small matrices packed into single buffer
- Pull-based evaluation – Unused computed values don't trigger GPU uploads
- Dirty flag tracking – Version numbers prevent redundant writes
- Atlas packing – Multiple shadow maps in one texture
- Reuse materials across meshes to reduce binding changes
- Enable shadows selectively – Each shadow-casting light has cost
- Use appropriate LOD for procedural chunks
- Cull disabled objects – Inspector sets `object.enabled = false`
Enable GPU debug visualization:
```typescript
const debug = renderer.create(GPUDebug);
debug.enabled = true;
renderer.addPostRenderExtension(debug);
// Render light frustums, shadow map tiles, etc.
```

- Extend `ShadowLight` for shadow support (or `Light` for unshadowed)
- Implement `@computed` for shader buffer layout
- Update shader light enumeration
- Register in `GPULights.update()`
- Create a `MaterialProperty` subclass
- Update fragment shader group sampling
- Add to the `Material` class
Reference shader code locations:
- Vertex: `src/engine/renderer/VertexShader.wgsl`
- Fragment: `src/engine/material/Material.wgsl`
- Sky: `src/engine/extensions/SkyShader.wgsl`
All shaders are imported as strings and compiled at runtime.
| Browser | Version | Status |
|---|---|---|
| Chrome | 113+ | ✅ Full support |
| Edge | 113+ | ✅ Full support |
| Safari | 18+ | ✅ Full support |
| Firefox | TBD | ⏳ In development |
Enable WebGPU flags if not available:
- Chrome: `chrome://flags` → search "WebGPU"
- Safari: Develop → Experimental Features → WebGPU
- Normal/parallax mapping – Per-fragment surface detail
- Instanced rendering – Draw call optimization
- Compute shaders – Particle simulation, terrain LOD
- Deferred rendering – G-buffer for many-light scenarios
- Temporal AA – Anti-aliasing via frame history
- Volumetric fog – Depth-aware atmospheric effects
- Terrain occlusion – Frustum/occluder culling
Contributions welcome! Areas of interest:
- Performance optimizations
- Additional shader effects
- Test coverage
- Documentation improvements
MIT
- WebGPU Spec: https://gpuweb.github.io/gpuweb/
- WGSL Spec: https://www.w3.org/TR/WGSL/
- wgpu-matrix: https://github.com/greggman/wgpu-matrix (math library)
- Khronos glTF: https://www.khronos.org/gltf/ (3D model format)