BlendLink XR

The spatial bridge between Blender and Meta Quest


💡 The Spark

Blender is an industry titan for 3D creation. But here's the thing — interacting with 3D spaces through a 2D monitor and a mouse is like sculpting clay while wearing oven mitts. There's always a layer of abstraction between the artist and the art.

VR promised to fix this. It didn't. Existing solutions rely on clunky controller emulation that lacks the precision professional artists demand.

Then Logitech shipped the MX Ink — a high-precision spatial stylus — and we saw the missing piece click into place.

What if we combined Logitech's most precise hardware with the world's most powerful 3D software to create an uncompromising spatial workflow?


🎯 What BlendLink XR Does

BlendLink XR is a native Meta Quest application paired with a companion Blender plugin that fundamentally changes how artists touch their 3D work.

It routes exact 6-DoF tip-origin data and analog pressure directly into Blender's core operators. No mouse emulation. No screen-space projection. Real 3D input for real 3D work.

Artists can:

Workflow            What It Feels Like
🖊️ Grease Pencil    Draw pressure-sensitive strokes floating in mid-air
🏔️ Sculpting        Grab, Clay Strips, Draw Sharp — with your actual hand
🎨 Painting         Vertex weights, vertex colors, textures — right on the mesh
📐 Modeling         Reach out and grab vertices, edges, faces
🦴 Rigging          Physically drag armature joints into pose

The killer feature?

Zero-friction hybrid workflow. Put the headset on → sculpt in 3D space. Take it off → tweak the UI on your desktop. Same live Blender session. No mode switching. No restarts. No export/import dance.


✨ Why VR Changes Everything for These Workflows

Every one of these Blender tools was designed to manipulate 3D space — but artists have always been forced to do it through a 2D screen with a mouse. That's like painting through a window. BlendLink XR removes the window.

🖊ïļ Grease Pencil — Draw in the air, not on a plane

On desktop, Grease Pencil strokes are projected onto a flat surface or locked to a camera view. Artists constantly rotate the viewport to draw from different angles. In VR with the MX Ink, you just... draw. The stroke exists exactly where your hand puts it — floating in true 3D space with full pressure sensitivity. No projection math. No camera wrestling. Concept art and storyboarding become as natural as sketching on paper, except the paper is infinite and three-dimensional.

  Desktop Grease Pencil:              VR Grease Pencil:

  Camera View → Flat Plane              Your Hand → 3D Space
       ↓                                     ↓
  Rotate viewport to                   Stroke lives exactly
  draw from new angle                  where you drew it
       ↓                                     ↓
  Repeat endlessly                     Done. Move on.

🏔ïļ Sculpting — Touch the clay

Desktop sculpting means dragging a 2D cursor across a screen while Blender raycasts to figure out where on the mesh you meant to touch. You're constantly orbiting the model to reach the back, the underside, the crevices. In VR, you reach around the model. You push into a surface and feel the depth through pressure response. Brushes like Grab, Clay Strips, and Draw Sharp become intuitive because your hand is doing what it would do with real clay — just without the mess.

🎨 Vertex Painting & Weight Painting — See what you're painting

Vertex color painting on desktop is a guessing game of angles. You paint one side, rotate, paint another, miss a spot behind an ear, rotate again. Weight painting is worse — you're trying to visualize influence gradients on a 3D mesh through a 2D viewport. In VR, you walk around the model. You lean in close to paint fine details. You see the weight gradient from every angle simultaneously because you're inside the scene. No blind spots.

📐 Modeling — Grab geometry with your hands

Edit mode on desktop: click a vertex, hit G, drag along an axis, hope you moved it the right amount in the right direction. In VR, you reach out and grab the vertex. You pull it where it needs to go. Edges and faces respond to spatial gestures that feel like bending wire or stretching fabric. The mental translation from "I want this point there" to actually getting it there drops from multiple steps to one motion.

🦴 Posing — Move joints like a puppet

Posing an armature on desktop means selecting bones, rotating in screen space, switching axes, undoing, trying again. It's technical and slow. In VR, you grab a character's wrist and move it. The IK chain follows. You tilt the head by tilting your hand. Posing becomes physical and immediate — like posing an action figure instead of filling out a spreadsheet of rotation values.


🏗ïļ How We're Building It

Two components designed to work in tandem over a low-latency network bridge:

┌─────────────────────────────┐          ┌──────────────────────────────┐
│      META QUEST APP         │          │     BLENDER PLUGIN           │
│  (OpenXR Runtime / C++)     │◄────────▹│  (Plugin Backend)            │
│                             │  Low-    │                              │
│  ┌───────────────────────┐  │ Latency  │  ┌────────────────────────┐  │
│  │  Stylus Input Layer   │  │  Bridge  │  │  Custom Event Handlers │  │
│  │  ─ 6-DoF tip origin   │  │          │  │  ─ Inject 3D coords    │  │
│  │  ─ Rotation           │  │          │  │  ─ Bypass 2D mouse     │  │
│  │  ─ Analog pressure    │  │          │  │  ─ Direct operator call│  │
│  └───────────────────────┘  │          │  └────────────────────────┘  │
│                             │          │                              │
│  Lightweight VR Viewport    │          │  Live Scene Sync             │
└─────────────────────────────┘          └──────────────────────────────┘

The Quest App

A native Quest application built on the OpenXR runtime — chosen for its low-latency access to tracking, rendering, and input subsystems without going through higher-level abstraction layers. OpenXR gives us direct, frame-accurate control over pose prediction, display timing, and the Logitech MX Ink interaction profile, which is critical for sub-millimeter stylus tracking. The app will calculate exact tip coordinates, rotation, and pressure — then render a lightweight real-time viewport for spatial context. The runtime layer is abstracted in our architecture, so if a faster or more specialized runtime emerges, we can swap it in without rearchitecting.
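
To make the pose-prediction idea concrete, here is a toy constant-velocity extrapolation in Python. It is a sketch only: the function and variable names are invented for this example, and a real runtime predicts full 6-DoF state (position and rotation) from OpenXR velocity data, not position alone.

```python
# Toy constant-velocity pose prediction. Illustrative only: a real runtime
# predicts full 6-DoF state from OpenXR velocity data; this handles
# position alone.

def predict_position(position, velocity, sample_time, display_time):
    """Extrapolate a tracked position to the predicted display time."""
    dt = display_time - sample_time  # seconds until the frame hits the eyes
    return tuple(p + v * dt for p, v in zip(position, velocity))

# Stylus tip moving 0.5 m/s along +X, frame displayed 10 ms after sampling:
predicted = predict_position(
    position=(0.10, 0.20, 0.30),
    velocity=(0.5, 0.0, 0.0),
    sample_time=0.000,
    display_time=0.010,
)
print(predicted)  # approximately (0.105, 0.20, 0.30)
```

Even this naive version shows the payoff: the pose that reaches Blender corresponds to where the tip will be when the frame is displayed, not where it was when sampled.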

The Blender Plugin

A companion plugin that opens a low-latency network bridge to the Quest app. Instead of routing input through the OS mouse layer, we're building custom event handlers that inject the OpenXR-sourced tracking data directly into Blender's core operators.

This is the critical difference: Blender will understand the input is happening in true 3D space, not projected onto a 2D screen coordinate. Because OpenXR provides timestamped pose data with built-in prediction, we can compensate for network jitter on the Blender side before the frame is even rendered. The plugin's transport layer is protocol-agnostic — we can swap the underlying networking strategy for better performance without changing the Blender integration.
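
The "protocol-agnostic transport" idea reduces to a small interface that the rest of the plugin codes against. A minimal Python sketch (class and method names here are hypothetical, not the actual plugin API):

```python
# Minimal sketch of a protocol-agnostic transport. The Blender integration
# codes against the abstract interface, so the wire protocol can be swapped
# without touching operator code. Names are hypothetical.

from abc import ABC, abstractmethod

class Transport(ABC):
    @abstractmethod
    def send(self, packet: bytes) -> None: ...

    @abstractmethod
    def receive(self) -> "bytes | None": ...

class LoopbackTransport(Transport):
    """In-memory stand-in; a UDP or QUIC transport would implement the
    same two methods and drop in unchanged."""

    def __init__(self):
        self._queue = []

    def send(self, packet: bytes) -> None:
        self._queue.append(packet)

    def receive(self) -> "bytes | None":
        return self._queue.pop(0) if self._queue else None

transport: Transport = LoopbackTransport()
transport.send(b"pose-update")
print(transport.receive())  # b'pose-update'
```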

Data Flow — Frame by Frame

  MX Ink Stylus              Quest App                Network              Blender Plugin            Blender Core
       │                        │                       │                       │                       │
       │  6-DoF + Pressure      │                       │                       │                       │
       ├───────────────────────▹│                       │                       │                       │
       │                        │  Serialize Pose       │                       │                       │
       │                        ├──────────────────────▹│                       │                       │
       │                        │  Compact Packet       │                       │                       │
       │                        │  (~200 bytes)         │                       │                       │
       │                        │                       ├──────────────────────▹│                       │
       │                        │                       │                       │  Inject as 3D Event   │
       │                        │                       │                       ├──────────────────────▹│
       │                        │                       │                       │                       │
       │                        │                       │                       │  Operator Response    │
       │                        │                       │◄─────────────────────┤◄──────────────────────┤
       │                        │  Scene Delta          │                       │                       │
       │                        │◄──────────────────────┤                       │                       │
       │                        │                       │                       │                       │
       │                   Render Updated Viewport      │                       │                       │

Each pose update is roughly 200 bytes. At 120 Hz tracking, that's about 24 KB/s of control data — practically invisible on any local network. OpenXR's predicted display timestamps travel with each packet, letting the Blender side know exactly when this pose will hit the user's eyes — enabling precise latency compensation. The transport protocol is a pluggable layer, so we can move from our current implementation to a faster alternative without rearchitecting.
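
For readers who want to check the packet math, here is a hypothetical wire layout in Python's struct notation. The field set mirrors the diagram above; the real protocol adds framing, sequence numbers, and headers, which is what brings each packet toward the ~200-byte figure.

```python
# Back-of-the-envelope check on the packet math. This wire layout is
# hypothetical: little-endian doubles for timestamp, position, and
# orientation, plus a float for pressure.

import struct

# display timestamp, position xyz, orientation quaternion xyzw, pressure
POSE_FORMAT = "<d3d4df"

def pack_pose(timestamp, position, quaternion, pressure):
    return struct.pack(POSE_FORMAT, timestamp, *position, *quaternion, pressure)

packet = pack_pose(0.016, (0.1, 0.2, 0.3), (0.0, 0.0, 0.0, 1.0), 0.75)
print(len(packet))        # 68 bytes of pose payload per update
print(len(packet) * 120)  # 8160 bytes/s of payload at 120 Hz
```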


🧗 Challenges We're Tackling

1. Bypassing 2D Screen Space

Blender's UI and many of its operators (especially in sculpt and edit modes) are built around the assumption of 2D screen coordinates (X/Y) plus a depth raycast. Translating pure 3D spatial coordinates into input Blender can accept requires reverse-engineering several of Blender's interaction paradigms.

Traditional VR Input:                    BlendLink XR:

  Controller → Screen Project → Raycast    MX Ink → 3D Tip Origin → Direct Inject
       ↓              ↓           ↓              ↓                      ↓
  Imprecise      Lossy         Slow         Sub-mm precision      Zero conversion loss
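
To make the contrast concrete, this is the kind of lossy conversion step traditional VR input performs: squeezing a 3D tip position through a screen-space projection before the application ever sees it. A toy pinhole projection with made-up camera intrinsics, not any engine's actual projection code:

```python
# The lossy step traditional VR input performs: a 3D tip position is
# flattened to screen space before the application sees it. Toy pinhole
# camera with invented intrinsics (800 px focal length, 1920x1080
# principal point).

def project_to_screen(point, focal=800.0, cx=960.0, cy=540.0):
    """Project a camera-space 3D point to pixel coordinates plus depth."""
    x, y, z = point
    return (cx + focal * x / z, cy - focal * y / z, z)

tip = (0.05, 0.02, 0.5)        # stylus tip half a meter in front of the camera
print(project_to_screen(tip))  # (1040.0, 508.0, 0.5)
```

Direct injection skips this step entirely: the 3D coordinate goes straight into the operator, with nothing flattened along the way.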

2. Latency & Synchronization

For the "zero-friction" hybrid workflow to feel real, the Quest viewport and the desktop Blender session need to stay perfectly synced — no export, no import, no save. Our network pipeline will transmit only scene deltas rather than full state, keeping round-trip latency under perceptible thresholds.

3. Tip-Origin Precision

Traditional VR controllers track the center of the hand. A stylus interaction happens at the tip. Calibrating offsets so the physical MX Ink tip aligns perfectly with the digital interaction point will require a custom calibration routine and extensive iteration.

  Standard Controller Tracking:        MX Ink Tip-Origin Tracking:

       ┌──────┐                              â•ē
       │ ●    │  ← Tracked point              â•ē
       │      │     (palm center)              â•ē
       └──────┘                                 ● ← Tracked point
                                               ╱     (actual tip)
                                              ╱
                                             ╱

  Offset error: ~8-12cm                Offset error: <1mm
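
Applying the correction is the easy half: rotate a fixed stylus-local offset vector by the tracked orientation, then add it to the tracked position. A sketch with a made-up 12.5 cm offset (per the challenge above, the hard part is finding the offset, not applying it):

```python
# Tip-offset correction sketch. The 12.5 cm offset is a made-up
# calibration value, not the MX Ink's real geometry.

def rotate_by_quaternion(v, q):
    """Rotate vector v by unit quaternion q = (x, y, z, w)."""
    x, y, z, w = q
    vx, vy, vz = v
    # v' = v + w*t + q_vec x t, where t = 2 * (q_vec x v)
    tx = 2.0 * (y * vz - z * vy)
    ty = 2.0 * (z * vx - x * vz)
    tz = 2.0 * (x * vy - y * vx)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

def tip_position(tracked_pos, tracked_quat, tip_offset=(0.0, 0.0, -0.125)):
    """Tracked pose in, physical tip position out."""
    ox, oy, oz = rotate_by_quaternion(tip_offset, tracked_quat)
    return (tracked_pos[0] + ox, tracked_pos[1] + oy, tracked_pos[2] + oz)

# Identity orientation: the tip sits 12.5 cm ahead of the tracked origin.
print(tip_position((0.5, 1.0, 0.25), (0.0, 0.0, 0.0, 1.0)))  # (0.5, 1.0, 0.125)
```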

🏆 What Excites Us Most

Concurrent desktop + spatial workflow — We're designing a system where artists never get locked into "VR only" mode. The ability to seamlessly transition between desktop and spatial computing without hitting Save or Export is a genuine workflow breakthrough waiting to happen.

Grease Pencil in spatial 3D — Imagine a pressure-sensitive Grease Pencil stroke floating perfectly in mid-air, drawn with the natural ergonomics of a pen. This is the moment we believe will make artists never want to go back.

Native MX Ink integration via OpenXR — Capturing highly accurate analog pressure curves from the Logitech MX Ink stylus through OpenXR's interaction profile system, not just binary on/off. OpenXR's standardized input model means our input abstraction layer is designed to support additional spatial input devices as they emerge — any device with an OpenXR profile slots right in.

Modular architecture — Every layer of the stack is swappable. XR runtime, transport protocol, video codec, input profiles — all independently upgradeable. We're building for the long game, not just today's hardware.


📚 What We've Identified So Far

  • Spatial ergonomics matter as much as code. How artists hold a stylus vs. a controller will fundamentally shape our approach to tool offsets and interaction zones.
  • Blender's architecture runs deep. Our research into Blender's scene graph, modal operators, and event-handling systems confirms this integration is viable — and powerful.
  • Networked XR is its own discipline. Efficiently streaming dense coordinate and mesh data in real-time is a hard problem. We've designed the transport layer to be swappable so we can chase better performance as protocols evolve.

🚀 Roadmap

Milestone                              Status
Core architecture & networking layer   🔧 In development
MX Ink 6-DoF input pipeline            🔧 In development
Blender plugin — operator injection    📋 Planned
Grease Pencil spatial drawing          📋 Planned
Sculpt & paint mode integration        📋 Planned
Meta Quest Store release               🎯 Target milestone
Floating spatial UI panels             🔮 Future — Blender toolbars as interactable VR panels
Multiplayer collaborative sessions     🔮 Future — two headsets, one viewport, real-time co-sculpting
Extended tool support                  🔮 Future — Geometry Nodes routing, physical lighting placement

🔧 Tech Stack

Component        Technology
Quest App        Native XR runtime (currently OpenXR) / C++
Rendering        GPU-accelerated (runtime-agnostic)
Blender Plugin   Python + native extension backend
Networking       Low-latency transport (protocol-swappable)
Stylus Input     Logitech MX Ink (abstracted input layer)
Video Encoding   Hardware-accelerated codec (configurable)
Discovery        Zero-config network discovery
Security         Encrypted channels (standard DTLS)

Architecture is designed with swappable layers — XR runtime, transport protocol, video codec, and input profiles can all be upgraded independently without rearchitecting the system.


BlendLink XR — Stop looking at your 3D work. Start touching it.
