API Usage Guide

Base URL

http://YOUR_SERVER/api/v1/

For local installations:

http://localhost/api/v1/

The complete API contract is available via Swagger at the API Reference.

Authentication

All requests require two headers:

X-API-Key: YOUR_API_KEY
X-Install-ID: YOUR_INSTALL_ID

On a new installation, open DBConvert Streams once and connect your account before using API-only workflows.

Both values are available on the Account page in the app.
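
To avoid repeating credentials in every request, you can export them once and reference them in each curl call. This is a common shell convention, not an API requirement; the variable names DBS_API_KEY and DBS_INSTALL_ID are illustrative:

```shell
# Export once per session; the values come from the Account page.
export DBS_API_KEY="YOUR_API_KEY"
export DBS_INSTALL_ID="YOUR_INSTALL_ID"

# Each request then becomes:
#   curl "http://localhost/api/v1/..." \
#     -H "X-API-Key: $DBS_API_KEY" \
#     -H "X-Install-ID: $DBS_INSTALL_ID"
```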

Create connections

MySQL

curl -X POST "http://localhost/api/v1/connections" \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "X-Install-ID: YOUR_INSTALL_ID" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "mysql-source",
    "type": "mysql",
    "host": "localhost",
    "port": 3306,
    "username": "root",
    "password": "password",
    "defaultDatabase": "source_db"
  }'

defaultDatabase is optional. If omitted, the connection can access all databases on the server. Set it to scope the connection to a specific database by default.

PostgreSQL

curl -X POST "http://localhost/api/v1/connections" \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "X-Install-ID: YOUR_INSTALL_ID" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "postgres-target",
    "type": "postgresql",
    "host": "localhost",
    "port": 5432,
    "username": "postgres",
    "password": "postgres",
    "defaultDatabase": "target_db"
  }'

File connection

Use type: "files" for local file-based connections. The connection defines a shared base path. Output format is selected in the stream config target spec. File-source selection is defined in source.connections[].files.

For Docker deployments, use the shared container paths:

  • /data/imports for local file sources
  • /data/exports for local file targets

curl -X POST "http://localhost/api/v1/connections" \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "X-Install-ID: YOUR_INSTALL_ID" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "file-connection",
    "type": "files",
    "spec": {
      "files": {
        "basePath": "/data/exports"
      }
    }
  }'

S3 storage

curl -X POST "http://localhost/api/v1/connections" \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "X-Install-ID: YOUR_INSTALL_ID" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "s3-csv-export",
    "type": "files",
    "file_format": "csv",
    "storage_config": {
      "provider": "s3",
      "uri": "s3://my-bucket/exports/csv/",
      "region": "us-east-1",
      "credentials_ref": "secret/data/aws/my-bucket"
    }
  }'

Response includes a connection ID:

{
  "id": "conn_2t3EmXcvcg2J6I9mJ02UaiMVxfb",
  "name": "mysql-source",
  "type": "mysql",
  "created": 1739572516
}
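
When scripting, capture the id from the response so later calls can reference it. A sketch using python3 for JSON parsing (`jq -r .id` works equally well); here RESPONSE is the sample payload above rather than a live call:

```shell
# In a real script: RESPONSE=$(curl -s ... ) from the create-connection call.
RESPONSE='{"id":"conn_2t3EmXcvcg2J6I9mJ02UaiMVxfb","name":"mysql-source","type":"mysql","created":1739572516}'

# Extract the connection ID for use in stream configs.
CONN_ID=$(printf '%s' "$RESPONSE" | python3 -c 'import sys, json; print(json.load(sys.stdin)["id"])')
echo "$CONN_ID"
```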

Create a stream configuration

The stream config defines source, target, mode, and scope.

For single-source streams, omit the alias field on the connection. Aliases are only needed in multi-source streams, where each connection gets a unique alias (e.g., db1, pg1) used as a prefix in table references.
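
For reference, a multi-source source block might look like the following sketch. Connection IDs, database names, and the alias values are placeholders; field placement follows the single-source examples in this guide:

```json
{
  "source": {
    "connections": [
      {
        "connectionId": "conn_MYSQL_ID",
        "alias": "db1",
        "database": "shop"
      },
      {
        "connectionId": "conn_PG_ID",
        "alias": "pg1",
        "database": "analytics"
      }
    ]
  }
}
```

Tables from each source would then be referenced with the alias prefix, e.g. db1.orders.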

Minimal example (full database migration)

curl -X POST "http://localhost/api/v1/stream-configs" \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "X-Install-ID: YOUR_INSTALL_ID" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "full_migration",
    "mode": "convert",
    "source": {
      "connections": [
        {
          "connectionId": "conn_SOURCE_ID",
          "database": "source_db"
        }
      ]
    },
    "target": {
      "id": "conn_TARGET_ID",
      "spec": {
        "db": {
          "database": "target_db"
        }
      }
    }
  }'

Omitting tables causes auto-discovery of all user tables.

Selective table migration

curl -X POST "http://localhost/api/v1/stream-configs" \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "X-Install-ID: YOUR_INSTALL_ID" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "selective_migration",
    "mode": "convert",
    "source": {
      "connections": [
        {
          "connectionId": "conn_SOURCE_ID",
          "database": "sakila",
          "tables": [
            {"name": "actor"},
            {"name": "film"},
            {"name": "customer"}
          ]
        }
      ],
      "options": {
        "dataBundleSize": 500
      }
    },
    "target": {
      "id": "conn_TARGET_ID",
      "spec": {
        "db": {
          "database": "target_db",
          "schema": "public",
          "structureOptions": {
            "tables": true,
            "indexes": true,
            "foreignKeys": true
          }
        }
      }
    },
    "reportingInterval": 3
  }'

Export to Parquet files

curl -X POST "http://localhost/api/v1/stream-configs" \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "X-Install-ID: YOUR_INSTALL_ID" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "parquet_export",
    "mode": "convert",
    "source": {
      "connections": [
        {
          "connectionId": "conn_SOURCE_ID",
          "database": "sakila",
          "tables": [
            {"name": "film"},
            {"name": "actor"}
          ]
        }
      ]
    },
    "target": {
      "id": "conn_FILES_TARGET_ID",
      "spec": {
        "files": {
          "fileFormat": "parquet",
          "format": {
            "compression": "zstd"
          }
        }
      }
    }
  }'

Stream operations

Start a stream

curl -X POST "http://localhost/api/v1/stream-configs/CONFIG_ID/start" \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "X-Install-ID: YOUR_INSTALL_ID"

Response:

{
  "id": "stream_2t3HSWnLvjeOCuuVPsuJVUhawj2"
}

Pause / Resume / Stop

# Pause
curl -X POST "http://localhost/api/v1/streams/STREAM_ID/pause" \
  -H "X-API-Key: YOUR_API_KEY" -H "X-Install-ID: YOUR_INSTALL_ID"

# Resume
curl -X POST "http://localhost/api/v1/streams/STREAM_ID/resume" \
  -H "X-API-Key: YOUR_API_KEY" -H "X-Install-ID: YOUR_INSTALL_ID"

# Stop
curl -X POST "http://localhost/api/v1/streams/STREAM_ID/stop" \
  -H "X-API-Key: YOUR_API_KEY" -H "X-Install-ID: YOUR_INSTALL_ID"

Check stream status

curl "http://localhost/api/v1/streams/STREAM_ID/stats" \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "X-Install-ID: YOUR_INSTALL_ID"

Stream states

State                 Description
READY                 Created, not yet started
RUNNING               Active data transfer
PAUSED                Temporarily suspended, can be resumed
STOPPED               Terminated, cannot be resumed
FINISHED              Completed successfully
FAILED                Error state
TIME_LIMIT_REACHED    Stopped after configured time limit
EVENT_LIMIT_REACHED   Stopped after configured event limit
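
The terminal states above can drive a simple polling loop. A sketch: the endpoint and state names are from this guide, but the JSON field holding the state is assumed to be `state`; adjust to the actual stats payload.

```shell
# Return success (0) when a stream state can no longer change.
is_terminal() {
  case "$1" in
    FINISHED|FAILED|STOPPED|TIME_LIMIT_REACHED|EVENT_LIMIT_REACHED) return 0 ;;
    *) return 1 ;;
  esac
}

# Polling loop sketch:
#   while STATE=$(curl -s "http://localhost/api/v1/streams/$STREAM_ID/stats" \
#       -H "X-API-Key: $DBS_API_KEY" -H "X-Install-ID: $DBS_INSTALL_ID" \
#       | python3 -c 'import sys, json; print(json.load(sys.stdin)["state"])') \
#     && ! is_terminal "$STATE"; do
#     sleep 5
#   done
```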

Error codes

Code   Meaning
400    Invalid request payload
401    Missing or invalid authentication
404    Resource not found
409    Conflict (e.g. starting a stream while another is paused)
500    Server error
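
In scripts, you can surface these codes by asking curl for the HTTP status (`-w '%{http_code}'` is a standard curl write-out variable). A small helper mapping the codes from the table to messages; the 2xx and fallback branches are additions for completeness:

```shell
# Map an HTTP status code to a short diagnosis (codes from the table above).
explain_status() {
  case "$1" in
    2*)  echo "OK" ;;
    400) echo "Invalid request payload" ;;
    401) echo "Missing or invalid authentication" ;;
    404) echo "Resource not found" ;;
    409) echo "Conflict (e.g. starting a stream while another is paused)" ;;
    5*)  echo "Server error" ;;
    *)   echo "Unexpected status: $1" ;;
  esac
}

# Usage sketch:
#   CODE=$(curl -s -o /dev/null -w '%{http_code}' "http://localhost/api/v1/..." \
#     -H "X-API-Key: YOUR_API_KEY" -H "X-Install-ID: YOUR_INSTALL_ID")
#   echo "$CODE: $(explain_status "$CODE")"
```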

Next steps