> **Warning**
> warpsurf is an open-source research project under active development. Browser automation carries inherent risks — monitor agents while they work, use capped API keys, and read the full disclaimer before use.
- Python 3.11+
- Node.js 18+ (Puppeteer is used to launch Chrome and load the extension)
1. Get the warpsurf extension (pick one):

Option A — build from source:

```shell
git clone https://github.com/warpsurf/warpsurf.git
cd warpsurf
pnpm install
pnpm build:api
```

Use `pnpm build:api` (not `pnpm build`) — the `:api` variant includes the local API surface that the SDK communicates with. The build output is in `dist/`.

Option B — download `warpsurf-api-v*.zip` from the releases page and unzip it.
2. Install the SDK:

```shell
git clone https://github.com/warpsurf/warpsurf-python-sdk.git
cd warpsurf-python-sdk
pip install -e .
npm install
```

3. Run:

```shell
python example.py /path/to/warpsurf/dist $GOOGLE_API_KEY
```

Or use the SDK directly:
```python
import asyncio, os

from warpsurf import WarpSurf, Config

async def main():
    async with WarpSurf("/path/to/warpsurf/dist") as ws:
        result = await ws.run(
            task="What are the top 3 trending repositories on GitHub right now? List their names and star counts.",
            config=Config(
                provider="google",
                model="gemini-3.1-flash-lite-preview",
                api_key=os.environ["GOOGLE_API_KEY"],
            ),
            workflow="agent",
        )
        print(result.output)
        if result.usage:
            print(f"${result.usage.cost:.4f} | {result.usage.latency_ms}ms")

asyncio.run(main())
```

The main client. Launches Chrome via Puppeteer, loads the warpsurf extension, and communicates with its service worker.
```python
ws = WarpSurf(
    extension_path="./dist",  # or set WARPSURF_EXTENSION_PATH
    chrome_path=None,         # auto-detected, or set CHROME_PATH
    headless=False,
    user_data_dir=None,       # temp dir created if omitted
)
```

| Parameter | Type | Default | Description |
|---|---|---|---|
| `extension_path` | `str \| None` | env var | Path to unpacked extension |
| `chrome_path` | `str \| None` | auto-detect | Custom Chrome binary |
| `headless` | `bool` | `False` | Run Chrome headless |
| `user_data_dir` | `str \| None` | temp dir | Persistent profile directory |
Supports `async with` for automatic connect/close.
LLM provider configuration passed to every `run()` call.

```python
config = Config(
    provider="google",
    model="gemini-3.1-flash-lite-preview",
    api_key=os.environ["GOOGLE_API_KEY"],
    base_url=None,
    temperature=None,
    max_output_tokens=None,
    thinking_level=None,
)
```

| Parameter | Type | Required | Description |
|---|---|---|---|
| `provider` | `str` | yes | Provider identifier (see table below) |
| `model` | `str` | yes | Model name, e.g. `"gemini-3.1-flash-lite-preview"` |
| `api_key` | `str` | yes | API key for the provider |
| `base_url` | `str \| None` | no | Custom API endpoint (required for `"custom"` provider) |
| `temperature` | `float \| None` | no | Sampling temperature |
| `max_output_tokens` | `int \| None` | no | Maximum output tokens |
| `thinking_level` | `str \| None` | no | `"high"`, `"medium"`, `"low"`, `"off"`, or `"default"` |
Supported providers:

| `provider` value | Service |
|---|---|
| `"google"` or `"gemini"` | Google Gemini |
| `"openai"` | OpenAI |
| `"anthropic"` | Anthropic |
| `"grok"` | xAI Grok |
| `"openrouter"` | OpenRouter |
| `"custom_openai"` or `"custom"` | Any OpenAI-compatible endpoint (requires `base_url`) |
Optional per-run behaviour overrides.

```python
from warpsurf import Overrides, GeneralOverrides, FirewallOverrides

overrides = Overrides(
    general=GeneralOverrides(
        max_steps=50,
        use_vision=True,
        max_worker_agents=3,
    ),
    firewall=FirewallOverrides(
        enabled=True,
        allow_list=["amazon.com", "bestbuy.com"],
    ),
)
```

All `GeneralOverrides` fields:
| Field | Type |
|---|---|
| `max_steps` | `int` |
| `max_actions_per_step` | `int` |
| `max_failures` | `int` |
| `max_validator_failures` | `int` |
| `retry_delay` | `int` |
| `max_input_tokens` | `int` |
| `use_vision` | `bool` |
| `planning_interval` | `int` |
| `min_wait_page_load` | `int` |
| `max_worker_agents` | `int` |
| `enable_planner` | `bool` |
| `enable_validator` | `bool` |
| `enable_workflow_estimation` | `bool` |
| `show_tab_previews` | `bool` |
| `response_timeout_seconds` | `int` |
All fields are optional — only set fields are sent to the extension.
Returned by `run()`, `get_status()`, and `wait_for_completion()`.

```python
result.ok        # bool — True if status == 'completed'
result.status    # 'completed' | 'error' | 'cancelled' | 'running' | 'pending'
result.output    # str | None — the agent's response
result.error     # str | None — error message if failed
result.task_id   # str
```

Usage statistics (available on completion):

```python
result.usage.cost           # float — total USD cost
result.usage.latency_ms     # int — end-to-end milliseconds
result.usage.input_tokens   # int
result.usage.output_tokens  # int
result.usage.api_calls      # int — number of LLM calls
result.usage.provider       # str
result.usage.model          # str
```

Execution trace (agent workflows):

```python
result.trace             # tuple[TraceEntry, ...] | None
result.trace[0].action   # str — e.g. "click", "navigate"
result.trace[0].status   # 'start' | 'ok' | 'fail'
result.trace[0].details  # str
```

Execute a task and block until completion.
```python
result = await ws.run(
    "Find laptops under $1000",
    config,
    workflow="agent",       # 'auto' | 'chat' | 'search' | 'agent' | 'multiagent'
    overrides=overrides,    # optional
    task_id="my-task-001",  # optional — auto-generated if omitted
    timeout=600.0,          # seconds
)
```

Poll the current status of a running task.
Cancel a running task. Raises on failure.
Retrieve the extension's current settings (general, firewall, providers, agent models).
Poll `get_status()` until the task reaches a terminal state.
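The polling loop can be sketched as follows. This is a hypothetical re-implementation for illustration; the SDK's own `wait_for_completion()` presumably does something similar with real `TaskResult` objects rather than the plain dicts used here:

```python
import asyncio

TERMINAL = {"completed", "error", "cancelled"}  # terminal TaskResult statuses

async def wait_for_completion_sketch(get_status, task_id,
                                     interval: float = 1.0,
                                     timeout: float = 600.0):
    # Poll get_status until the task reaches a terminal state or times out.
    loop = asyncio.get_running_loop()
    deadline = loop.time() + timeout
    while True:
        result = await get_status(task_id)
        if result["status"] in TERMINAL:
            return result
        if loop.time() > deadline:
            raise TimeoutError(f"task {task_id} still {result['status']}")
        await asyncio.sleep(interval)

async def demo():
    # fake get_status that completes on the third poll
    statuses = iter(["pending", "running", "completed"])
    async def fake_status(task_id):
        return {"status": next(statuses)}
    result = await wait_for_completion_sketch(fake_status, "t1", interval=0.01)
    print(result)  # {'status': 'completed'}

asyncio.run(demo())
```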
```text
warpsurf-python-sdk/
├── src/
│   ├── __init__.py     Public exports
│   ├── client.py       WarpSurf class — spawns bridge, communicates via JSON
│   └── types.py        Frozen dataclasses with beartype runtime validation
├── launch.mjs          Puppeteer bridge — launches Chrome, loads extension
├── package.json        Node.js dependencies (puppeteer, stealth plugin)
└── pyproject.toml      Python packaging
```
How it works:

- `WarpSurf.connect()` spawns `node launch.mjs <extension_path>` as a subprocess
- The bridge uses Puppeteer to launch Chrome with `enableExtensions`, which loads the extension via CDP's `Extensions.loadUnpacked`
- The bridge discovers the extension's service worker and opens a CDP session
- Python sends JSON commands over stdin; the bridge evaluates them on the service worker via `Runtime.evaluate` and sends JSON responses on stdout
- `WarpSurf.close()` sends a close command and terminates the subprocess
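The stdin/stdout exchange in the steps above follows a newline-delimited JSON request/response pattern. The sketch below illustrates only that pattern: the child process here is a stand-in echo script, not the real `launch.mjs` bridge, and the message fields (`id`, `cmd`, `ok`) are invented for the example:

```python
import json
import subprocess
import sys

# Stand-in child: reads one JSON object per line from stdin and replies
# with one JSON object per line on stdout, like the Puppeteer bridge does.
CHILD = (
    "import sys, json\n"
    "for line in sys.stdin:\n"
    "    msg = json.loads(line)\n"
    "    print(json.dumps({'id': msg['id'], 'ok': True}), flush=True)\n"
)

child = subprocess.Popen(
    [sys.executable, "-c", CHILD],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)
child.stdin.write(json.dumps({"id": 1, "cmd": "status"}) + "\n")
child.stdin.flush()
reply = json.loads(child.stdout.readline())
child.stdin.close()
child.wait()
print(reply)  # {'id': 1, 'ok': True}
```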
Dependencies: Python ≥ 3.11, beartype, Node.js ≥ 18, puppeteer.
