aboyobam/WebGPU-Adventure

WebGPU Renderer

A high-performance 3D rendering engine built with WebGPU and TypeScript, featuring advanced graphics techniques and innovative architectural patterns for efficient GPU synchronization.


Getting Started

Prerequisites

  • WebGPU-capable browser (Chrome 113+, Edge 113+, or Safari 18+)
  • Node.js 16+ for building
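Before initializing, it is worth feature-detecting WebGPU so unsupported browsers fail gracefully. A minimal sketch (the `host` parameter exists only so the check is testable outside a browser; in practice you would pass `navigator`):

```typescript
// Returns true when the environment exposes the WebGPU entry point.
// Pass `navigator` in the browser; the parameter makes the check
// testable in Node as well.
function supportsWebGPU(host: { gpu?: unknown } | null | undefined): boolean {
  return host?.gpu != null;
}

// Browser usage (hypothetical fallback handling):
// if (!supportsWebGPU(navigator)) {
//   console.error("This demo requires a WebGPU-capable browser.");
// }
```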

Installation & Running

# Install dependencies
npm install

# Development server with hot reload
npm run dev

# Build for production
npm run build

# Run tests
npm run test

The application will launch in your browser at http://localhost:5173 (Vite default).

Basic Usage

// Initialize renderer (requestAdapter can resolve to null on unsupported hardware)
const adapter = await navigator.gpu.requestAdapter();
if (!adapter) throw new Error("WebGPU is not supported in this browser");
const device = await adapter.requestDevice();
const canvas = document.createElement("canvas");
const renderer = new WebGPURenderer(device, canvas, {
    shadowAtlasSizeX: 2048,
    shadowAtlasSizeY: 2048,
    shadowAtlasSplitX: 2,
    shadowAtlasSplitY: 2,
});

// Create scene
const scene = new Scene();
const camera = new PerspectiveCamera(75, canvas.width / canvas.height);
scene.add(camera);

// Add mesh
const material = await Material.fromTexture("texture.avif");
const geometry = new Geometry(vertices, indices);
const mesh = new Mesh(geometry, material);
scene.add(mesh);

// Render loop
const draw = () => {
    renderer.render(scene, camera);
    requestAnimationFrame(draw);
};
draw();

Architecture

Core Design Philosophy

This renderer prioritizes minimal GPU overhead by distinguishing between logical changes and GPU updates:

  1. Signal-based reactivity detects which properties actually changed
  2. Computed signals derive dependent values (matrices, buffers, etc.)
  3. Pull-based evaluation means GPU buffers update only on actual changes
  4. Custom DI container eliminates verbose device threading through the codebase
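As a rough sketch of points 1–3 (an illustration of the principle, not the engine's actual `Signal.ts`): a signal bumps a version counter on a logical change, and consumers poll versions instead of being pushed notifications.

```typescript
// Minimal pull-based signal: writes bump a version only when the value
// actually changed; readers compare versions to decide whether any
// work is needed. Illustrative only.
class Signal<T> {
  #version = 0;
  #value: T;
  constructor(initial: T) { this.#value = initial; }
  get value(): T { return this.#value; }
  set value(v: T) {
    if (v !== this.#value) {  // logical change detection
      this.#value = v;
      this.#version++;
    }
  }
  get version(): number { return this.#version; }
}

const intensity = new Signal(1.0);
intensity.value = 1.0;  // no-op: same value, version unchanged
intensity.value = 2.5;  // version bumps; a frame-end sweep will see it
```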

Directory Structure

src/
├── engine/
│   ├── renderer/
│   │   ├── WebGPURenderer.ts       # Main renderer pipeline
│   │   ├── GPUCamera.ts            # Camera GPU representation
│   │   ├── GPULights.ts            # Light management & shadows
│   │   ├── GPUMesh.ts              # Mesh GPU binding
│   │   ├── GPUMaterial.ts          # Material & texture binding
│   │   ├── GPUGeometry.ts          # Geometry GPU buffers
│   │   └── misc/                   # Extensions (Skybox, Sun, Debug)
│   ├── camera/                      # Camera implementations
│   ├── light/                       # Light types (Point, Spot, Sun)
│   ├── material/                    # PBR material system
│   ├── geometry/                    # Geometry primitives
│   ├── controls/                    # Camera controls (WASD, Orbit)
│   ├── loaders/                     # GLTF loader
│   ├── extensions/                  # Sky, Skybox components
│   ├── math/                        # Transform, Vector math
│   ├── Object3D.ts                  # Scene graph node
│   └── Scene.ts                     # Scene container
├── lib/
│   ├── Signal.ts                   # Pull-based reactivity system
│   ├── DI.ts                       # Dependency injection container
│   └── Controls.ts                 # GUI controls integration
└── game/                           # Game-specific code
    └── world/                      # Procedural terrain (in dev)

Innovative Features

1. Pull-Based Signal Reactivity

The Problem: In traditional renderers, changing a property triggers immediate GPU updates. With hundreds of meshes and lights, this creates excessive GPU submissions.

The Solution: A custom pull-based signal system where:

  • Properties are wrapped in Signal<T> objects with version tracking
  • Computed signals (decorated with @computed) automatically recompute only when dependencies change
  • GPU updates happen exactly once per frame if the underlying data changed
  • Dependencies are tracked at the property level, not the object level

Example:

// Light applies rotation → direction changes → GPU buffer updates automatically
export default class SpotLight extends ShadowLight {
    @signal(1) declare decay: number;
    @signal(10) declare innerCone: number;

    @computed(function(this: SpotLight) {
        // Only recomputes when transform.direction or other deps change
        return new Float32Array([
            ...this.baseBuffer,
            ...this.transform.direction,  // ← Reactive dependency
            this.decay,
            this.innerCone,
        ]);
    }, ['transform', 'intensity', 'decay', 'innerCone'])
    declare readonly asBuffer: Float32Array;
}

GPU Sync Flow:

Frame 1: light.position changes
  ↓
Signal version increments
  ↓
Computed `asBuffer` detects dependency change
  ↓
Buffer recomputed on-demand when `.value` accessed
  ↓
GPU buffer only updated if asBuffer value differs
  ↓
Frame 2: light.position unchanged
  ↓
No GPU update

Benefits:

  • ✅ Automatic dependency tracking
  • ✅ Minimal redundant GPU uploads
  • ✅ Works across property hierarchies (transform.position → world matrices → camera VP)
  • ✅ Type-safe with TypeScript decorators
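The sync flow above boils down to a version guard in front of the buffer write. A sketch, assuming a `writeBuffer`-like callback rather than the engine's real internals:

```typescript
// Uploads to the GPU only when the source's version moved since the
// last upload. `upload` stands in for device.queue.writeBuffer.
class GuardedUpload {
  #lastSeen = -1;
  #upload: (data: Float32Array) => void;
  constructor(upload: (data: Float32Array) => void) {
    this.#upload = upload;
  }
  sync(source: { version: number; value: Float32Array }): boolean {
    if (source.version === this.#lastSeen) return false; // no GPU work
    this.#upload(source.value);
    this.#lastSeen = source.version;
    return true;
  }
}

let uploads = 0;
const guard = new GuardedUpload(() => uploads++);
const src = { version: 0, value: new Float32Array([1, 0, 0]) };
guard.sync(src);  // frame 1: position changed -> upload
guard.sync(src);  // frame 2: unchanged -> skipped
src.version++;    // a dependency changed
guard.sync(src);  // upload again
```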

2. Custom Dependency Injection for Device Management

The Problem: WebGPU requires a GPUDevice to create any GPU resource. Passing it through constructor chains is verbose:

// Traditional approach - tedious
const renderer = new WebGPURenderer(device, canvas);
const material = new GPUMaterial(material, device);
const mesh = new GPUMesh(mesh, device);
const camera = new GPUCamera(camera, device);

The Solution: A lightweight inversion-of-control container that:

  • Registers singleton instances (device, canvas, context)
  • Injects dependencies via @inject() decorators on lazy-loaded properties
  • Resolves circular dependencies with lazy initialization
  • Integrates with TypeScript reflection

Example:

export default class WebGPURenderer extends DIContainer {
    constructor(device: GPUDevice, canvas: HTMLCanvasElement, options: WebGPURendererOptions) {
        super();
        // Register singletons in container
        this.provide(device, GPUDevice);
        this.provide(canvas, HTMLCanvasElement);
        this.provide(this, WebGPURenderer);
    }
}

// In any GPU class:
export default class GPUMaterial {
    @inject(() => GPUDevice) declare private device: GPUDevice;
    @inject(() => HTMLCanvasElement) declare private canvas: HTMLCanvasElement;

    init() {
        // device and canvas are automatically resolved
        const texture = this.device.createTexture({...});
    }
}

Benefits:

  • ✅ No constructor parameter drilling
  • ✅ Lazy resolution (only when accessed)
  • ✅ Works with circular dependencies
  • ✅ Testable (swap implementations)
  • ✅ Minimal boilerplate compared to full DI frameworks
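A minimal container in the same spirit might look like this (a sketch, not the project's `DI.ts`; it shows the token-keyed registry idea without decorator metadata):

```typescript
// Token-keyed singleton registry with lazy resolution. Circular
// dependencies work because lookup happens on first access, after
// both sides have been provided. Illustrative only.
type Token<T> = new (...args: any[]) => T;

class Container {
  #registry = new Map<Token<unknown>, unknown>();

  provide<T>(instance: T, token: Token<T>): void {
    this.#registry.set(token, instance);
  }

  resolve<T>(token: Token<T>): T {
    const hit = this.#registry.get(token);
    if (hit === undefined) throw new Error(`No provider for ${token.name}`);
    return hit as T;
  }
}
```

An `@inject()` decorator would then be a thin lazy getter over `resolve()`.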

Rendering Features

Materials & Textures

  • PBR Pipeline – Metallic/roughness workflow
  • Texture Loading – Support for AVIF, PNG, JPEG via createImageBitmap
  • Material Cloning – Easy material reuse with modifications
  • Alpha Blending – Separate pipeline for transparent objects

const material = await Material.fromTexture("sand.avif");
const cloned = material.clone();
cloned.roughnessFactor = 0.8;

Lighting System

Ambient Light

Global illumination multiplier

renderer.ambientLight.set(0.2, 0.2, 0.2);

Point Lights

Omnidirectional illumination with exponential distance decay

const light = new PointLight();
light.intensity = 1.5;
light.decay = 2.5;
light.decayStrength = 2.5;
light.maxDistance = 100;
scene.add(light);

Spot Lights

Directional cone illumination with smooth falloff (inner/outer cone)

const light = new SpotLight();
light.intensity = 15;
light.innerCone = 10;  // degrees
light.outerCone = 15;
light.spotIntensity = 2;
light.castsShadow = true;

Sun (Directional) Light

Orthographic projection for distant light sources, ideal for sun/moon

const sun = new SunLight();
sun.intensity = 0.8;
sun.castsShadow = true;
sun.shadowOrthoSize = 50;
sun.maxShadowDistance = 150;

Shadow Mapping

Features:

  • Shadow Atlas – Multiple shadow maps packed into one texture (configurable grid)
  • PCF Smoothing – Percentage-closer filtering for soft shadows
  • Distance Culling – Shadows disabled when light goes below horizon
  • Orthographic Projection – Sun lights use ortho for efficient coverage

Configuration:

const renderer = new WebGPURenderer(device, canvas, {
    shadowAtlasSizeX: 2048,   // Atlas resolution
    shadowAtlasSizeY: 2048,
    shadowAtlasSplitX: 2,     // 2x2 grid = 4 lights can cast shadows
    shadowAtlasSplitY: 2,
});

Per-Light Tuning:

light.castsShadow = true;
light.shadowSmoothing = 3;         // PCF sample count
light.minShadowDistance = 0.1;
light.maxShadowDistance = 150;
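With the 2×2 split above, each shadow-casting light renders into one tile of the atlas. The viewport math for light i can be sketched like this (names are illustrative, not the engine's API):

```typescript
// Maps a shadow-caster index to its tile's pixel viewport inside the
// shadow atlas. With a 2048x2048 atlas split 2x2, light 3 renders into
// the 1024x1024 tile at (1024, 1024).
interface AtlasConfig {
  sizeX: number; sizeY: number;    // atlas resolution in pixels
  splitX: number; splitY: number;  // grid of tiles
}

function atlasTileViewport(i: number, cfg: AtlasConfig) {
  const tileW = cfg.sizeX / cfg.splitX;
  const tileH = cfg.sizeY / cfg.splitY;
  const col = i % cfg.splitX;
  const row = Math.floor(i / cfg.splitX);
  if (row >= cfg.splitY) throw new Error("more shadow casters than atlas tiles");
  return { x: col * tileW, y: row * tileH, width: tileW, height: tileH };
}
```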

Skybox

Cube-mapped background with color adjustment for day/night cycles

const skybox = await Skybox.load("skybox.png");  // Cross-layout: 4x3 tiles
const skyboxExt = renderer.create(SkyboxExtension, skybox, camera);
renderer.addPreRenderExtension(skyboxExt);

// Runtime adjustment
skybox.colorAdjustment.set(0.8, 0.8, 0.8);  // Darken for night

Sun Cycles & Dynamic Lighting

Procedural Sun Positioning:

  • Spherical coordinates (azimuth, polar angle)
  • Smooth color transitions from red/orange (horizon) to white (zenith)
  • Synchronized lighting: sun visual + directional light intensity

const sun = new Sun();
sun.polarRad = 0.5;     // radians from horizon
sun.azimuthRad = 1.2;
sun.color.set(1, 0.95, 0.85);
sun.intensity = 1.0;
sun.glowDeg = 5;        // glow radius

const sunLight = new SunLight();
sunLight.color.copy(sun.color);  // Keep in sync
sunLight.intensity = maxLight * heightShape;  // values from your day/night cycle logic

Full Day/Night Cycle Example (test.ts):

const daySeconds = 60;  // 60 second day

const draw = (time: number) => {
    const phase = (time * 0.001 / daySeconds) % 1.0;
    const azimuth = phase * Math.PI * 2;

    // Compute sun direction rotating around world X
    const dirY = Math.sin(azimuth);  // ← Goes below horizon
    const height01 = Math.max(0, dirY);

    // Smooth falloff near horizon
    const shaped = Math.pow(height01, 1.6);

    // Lighting intensity ramps down
    sunLight.intensity = 0.8 * shaped;

    // Color shift: sunset orange → noon white
    sun.color.set(
        1.0,                                    // R constant
        0.42 * (1 - shaped) + 1.0 * shaped,    // G: orange → white
        0.15 * (1 - shaped) + 1.0 * shaped,    // B
    );

    renderer.render(scene, camera);
    requestAnimationFrame(draw);
};
requestAnimationFrame(draw);

Fog

Depth-based fog in material shader (configurable in Material.wgsl)

PBR (Physically-Based Rendering)

  • Metallic Factor – 0 = dielectric, 1 = metal
  • Roughness Factor – 0 = mirror, 1 = diffuse
  • Base Color Texture – Albedo mapping
  • Normal Mapping – (Ready for extension)

const material = await Material.fromTexture("metal.avif");
material.metalicFactor = 0.8;
material.roughnessFactor = 0.2;

Procedural World Generation (In Development)

Terrain generation using perlin noise-based heightmaps with island detection:

const size = 15;  // ±15 chunks
const seed = 11;

for (let x = -size; x <= size; x++) {
    for (let z = -size; z <= size; z++) {
        const island = isIsland(x, z, seed);
        const material = island ? sandMaterial : waterMaterial;
        const terrain = chunk(x, z, seed, material);
        scene.add(terrain);
    }
}
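The `isIsland` helper belongs to the in-development world code; a deterministic stand-in based on hashed noise might look like this (purely illustrative — the hash and threshold are assumptions, not the project's perlin-based algorithm):

```typescript
// Deterministic pseudo-noise in [0, 1) from chunk coordinates and seed.
// A stand-in for the project's heightmap noise, not its actual code.
function hashNoise(x: number, z: number, seed: number): number {
  let h = (x * 374761393 + z * 668265263 + seed * 2654435761) | 0;
  h = Math.imul(h ^ (h >>> 13), 1274126177);
  return ((h ^ (h >>> 16)) >>> 0) / 4294967296;
}

// A chunk counts as "island" when its noise value clears a threshold
// (threshold value is a hypothetical default).
function isIslandStandIn(x: number, z: number, seed: number, threshold = 0.6): boolean {
  return hashNoise(x, z, seed) >= threshold;
}
```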

Control Systems

WASD Controls (First-Person)

const wasd = renderer.create(WASDControls, camera, canvas);

// Update each frame
wasd.update();

// Properties
wasd.moveSpeed = 0.5;
wasd.rotateSpeed = 0.5;

Input:

  • W/A/S/D – Move forward/left/back/right
  • Left Mouse Drag – Pitch/yaw rotation
  • Roll Prevention – Always aligns "up" with world Y

Orbit Controls (Third-Person)

const orbit = renderer.create(OrbitControls, camera, canvas);

// Properties
orbit.target = [0, 0, 0];
orbit.minDistance = 1;
orbit.maxDistance = 100;
orbit.zoomSpeed = 2.0;
orbit.rotateSpeed = 1.0;

Input:

  • Left Mouse Drag – Rotate around target
  • Right Mouse Drag – Pan target
  • Mouse Wheel – Zoom in/out
  • W Key – Walk forward (in direction of target)
  • Persistent State – Saved to localStorage
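The persistent-state behavior can be sketched as below (assuming a `Storage`-like interface so it also runs outside the browser; the key name is hypothetical — in the browser you would pass `localStorage` as the store):

```typescript
// Save/restore orbit camera state through any localStorage-shaped store.
interface OrbitState { target: [number, number, number]; distance: number; }
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const ORBIT_KEY = "orbit-controls-state"; // hypothetical key name

function saveOrbitState(store: KVStore, state: OrbitState): void {
  store.setItem(ORBIT_KEY, JSON.stringify(state));
}

function loadOrbitState(store: KVStore): OrbitState | null {
  const raw = store.getItem(ORBIT_KEY);
  return raw ? (JSON.parse(raw) as OrbitState) : null;
}
```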

GPU Rendering Pipeline

Vertex → Fragment Flow

Vertex Shader:

  1. Transform model-space position to view-space
  2. Pass position, normal, UV, view position to fragment

Fragment Shader:

  1. Sample base color texture
  2. Evaluate PBR material properties
  3. For each light:
    • Compute light contribution (Lambertian + specular)
    • Sample shadow map for occlusion
    • Apply PCF smoothing
  4. Blend ambient + all light contributions
  5. Apply fog if enabled

Binding Groups:

  • Group 0 – Camera (view, projection matrices)
  • Group 1 – Mesh (transform matrices)
  • Group 2 – Material (texture, sampler, properties)
  • Group 3 – Lights (array of lights, shadow maps, info)

Shadow Rendering

  1. Pre-render pass per light:

    • Use light's projection matrix
    • Render only opaque meshes (alpha-tested separately)
    • Store depth to shadow atlas tile
  2. Main pass:

    • For each light, sample shadow atlas
    • PCF filter around shadow coordinate
    • Apply smoothstep falloff near shadow edges

Render Pass Ordering

1. Pre-render extensions (Skybox, Sun)
2. Opaque meshes (basic pipeline)
3. Transparent meshes (alpha pipeline, depth read-only)
4. Post-render extensions (Debug visualization)

Extensions System

Extensible rendering pipeline via RenderExtension interface:

export interface RenderExtension {
    init(): void;           // Called once after renderer init
    update(): void;         // Called each frame before render
    render(pass: GPURenderPassEncoder): void;  // Draw to render pass
}

Built-in Extensions

Skybox Extension

Renders background cubemap with inverse-projected view/projection

Sun Extension

Renders procedural sun disc and glow with HDR falloff

GPUDebug Extension

Visualizes debug geometry (light cones, shadow frustums, camera projection)

Example Usage:

const debug = renderer.create(GPUDebug);
debug.enabled = true;
renderer.addPostRenderExtension(debug);

Signal System Deep Dive

Core Concepts

Signal

Base reactive value wrapper

@signal(0.5)
declare roughness: number;

// Under the hood creates:
// roughness$ = new Signal(0.5)
// Getter: () => roughness$.value
// Setter: (v) => roughness$.value = v

ComputedSignal

Derived value that auto-updates when dependencies change

@computed(function(this: Object3D) {
    return this.position.add(this.rotation.getForward());
}, ['position', 'rotation'])
declare readonly worldPosition: Vector;

Optimization: Value is cached; recomputed only if dependency versions changed
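That caching strategy can be sketched as: remember each dependency's version at the last compute, and recompute lazily on read only when some version has moved (illustrative, not the engine's `Signal.ts`):

```typescript
// Computed value cached against its dependencies' versions.
interface Versioned<T> { value: T; version: number; }

class Computed<T> {
  #cache!: T;
  #seen: number[] = [];
  #has = false;
  #deps: Versioned<unknown>[];
  #compute: () => T;
  constructor(deps: Versioned<unknown>[], compute: () => T) {
    this.#deps = deps;
    this.#compute = compute;
  }
  get value(): T {
    const dirty = !this.#has ||
      this.#deps.some((d, i) => d.version !== this.#seen[i]);
    if (dirty) {
      this.#cache = this.#compute();
      this.#seen = this.#deps.map((d) => d.version);
      this.#has = true;
    }
    return this.#cache;
  }
}

let computes = 0;
const fov = { value: 75, version: 0 };
const proj = new Computed([fov], () => { computes++; return fov.value * 2; });
proj.value;                     // first read: computes
proj.value;                     // cached: no recompute
fov.value = 90; fov.version++;  // dependency changed
proj.value;                     // recomputes once
```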

SignalReader

One-time read that detects changes

const reader = SignalReader.of(light, 'intensity');

// Each frame
const newIntensity = reader.read();  // Returns only if changed
if (newIntensity !== undefined) {
    updateGPUBuffer(newIntensity);
}

Reactivity in Practice

Multi-level dependency chain:

Camera.fov (signal) → Signal change
    ↓
Camera.projMatrix (computed)
    ↓
Camera.viewProjectionMatrix (computed)
    ↓
Camera.inverseViewProjectionMatrix (computed)
    ↓
Skybox needs reproject

With signals: Only update GPU buffer once if fov changed, cascading through the dependency tree automatically.


Performance Considerations

GPU Upload Optimization

  1. Batching – Multiple small matrices packed into single buffer
  2. Pull-based evaluation – Unused computed values don't trigger GPU uploads
  3. Dirty flag tracking – Version numbers prevent redundant writes
  4. Atlas packing – Multiple shadow maps in one texture
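Point 1 (batching) amounts to packing every mesh's 4×4 matrix into one contiguous array and issuing a single upload. A sketch of the packing step (the `writeBuffer` call is browser-only and shown as a comment):

```typescript
// Packs N 4x4 matrices into one Float32Array so they can be uploaded
// with a single writeBuffer call instead of N small ones.
function packMatrices(matrices: Float32Array[]): Float32Array {
  const packed = new Float32Array(matrices.length * 16);
  matrices.forEach((m, i) => {
    if (m.length !== 16) throw new Error("expected a 4x4 matrix (16 floats)");
    packed.set(m, i * 16);
  });
  return packed;
}

// Single upload (sketched; meshBuffer / worldMatrices are hypothetical):
// device.queue.writeBuffer(meshBuffer, 0, packMatrices(worldMatrices));
```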

Scene Optimization Tips

  • Reuse materials across meshes to reduce binding changes
  • Enable shadows selectively – Each shadow-casting light has cost
  • Use appropriate LOD for procedural chunks
  • Cull disabled objects – Inspector sets object.enabled = false

Profiling

Enable GPU debug visualization:

const debug = renderer.create(GPUDebug);
debug.enabled = true;
renderer.addPostRenderExtension(debug);

// Render light frustums, shadow map tiles, etc.

Development Notes

Adding New Light Types

  1. Extend ShadowLight for shadow support (or Light for unshadowed)
  2. Implement @computed for shader buffer layout
  3. Update shader light enumeration
  4. Register in GPULights.update()

Adding New Materials

  1. Create MaterialProperty subclass
  2. Update fragment shader group sampling
  3. Add to Material class

Custom Shaders

Reference shader code locations:

  • Vertex: src/engine/renderer/VertexShader.wgsl
  • Fragment: src/engine/material/Material.wgsl
  • Sky: src/engine/extensions/SkyShader.wgsl

All shaders are imported as strings and compiled at runtime.


Browser Support

Browser   Version   Status
Chrome    113+      ✅ Full support
Edge      113+      ✅ Full support
Safari    18+       ✅ Full support
Firefox   TBD       ⏳ In development

Enable WebGPU flags if not available:

  • Chrome: chrome://flags → search "WebGPU"
  • Safari: Develop → Experimental Features → WebGPU

Future Roadmap

  • Normal/parallax mapping – Per-fragment surface detail
  • Instanced rendering – Draw call optimization
  • Compute shaders – Particle simulation, terrain LOD
  • Deferred rendering – G-buffer for many-light scenarios
  • Temporal AA – Anti-aliasing via frame history
  • Volumetric fog – Depth-aware atmospheric effects
  • Terrain occlusion – Frustum/occluder culling

Contributing

Contributions welcome! Areas of interest:

  • Performance optimizations
  • Additional shader effects
  • Test coverage
  • Documentation improvements

License

MIT

