Add Deck.co integration (data collection API for JS-rendered pages) #59

@ameet

Description

Platform

Deck.co (https://deck.co)

What it does

Deck provides a data collection API for extracting structured data from JavaScript-rendered web pages. It handles headless browser automation, anti-bot bypasses, and returns clean structured data from pages that traditional scraping tools can't access.

Why it's useful for AI agents

AI agents frequently need to collect data from web pages that require JavaScript rendering — company profiles, product pages, pricing tables, job boards, review sites. Traditional HTTP fetches return empty shells for these pages. Deck handles the browser automation layer so agents can focus on the data.

API documentation

  • API docs: https://console.deck.co (requires account)
  • Auth: OAuth2 client_credentials flow (client_id + client_secret → bearer token)
  • Key endpoints:
    • Token exchange: POST to auth endpoint with client credentials
    • Data collection: Submit collection jobs with target URLs and extraction schemas
    • Results retrieval: Poll or webhook for completed extractions
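The client_credentials exchange described above can be sketched as follows. The token URL is an assumption for illustration; the real endpoint is behind the Deck console account.

```python
# Sketch of the OAuth2 client_credentials token exchange.
# TOKEN_URL is hypothetical -- check the Deck console for the real endpoint.
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://auth.deck.co/oauth/token"  # assumed, not documented

def build_token_request(client_id: str, client_secret: str) -> urllib.request.Request:
    """Build the POST request for the client_credentials grant."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    return urllib.request.Request(
        TOKEN_URL,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

def fetch_token(client_id: str, client_secret: str) -> str:
    """Exchange client credentials for a bearer token."""
    with urllib.request.urlopen(build_token_request(client_id, client_secret)) as resp:
        return json.load(resp)["access_token"]
```

Separating request construction from the network call keeps the credential-handling logic testable without hitting the auth server.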

Suggested actions

  1. authenticate — Exchange client credentials for bearer token
  2. createCollection — Submit a data collection job with target URL(s) and extraction schema
  3. getCollection — Retrieve results of a collection job
  4. listCollections — List recent collection jobs and their statuses
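A thin client covering actions 2-4 might look like the sketch below. All endpoint paths, payload fields, and response shapes here are assumptions for illustration; the real routes are documented behind https://console.deck.co.

```python
# Hypothetical client for the createCollection / getCollection /
# listCollections actions. Paths and field names are assumptions.
import json
import urllib.request

API_BASE = "https://api.deck.co/v1"  # assumed base URL

def collection_payload(urls, schema):
    """Build the createCollection body: target URLs plus an extraction
    schema mapping output field names to descriptions or selectors."""
    return {"urls": list(urls), "schema": schema}

class DeckClient:
    def __init__(self, token: str):
        self.token = token

    def _call(self, method: str, path: str, payload=None):
        data = json.dumps(payload).encode() if payload is not None else None
        req = urllib.request.Request(
            API_BASE + path,
            data=data,
            method=method,
            headers={
                "Authorization": f"Bearer {self.token}",
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    def create_collection(self, urls, schema):
        return self._call("POST", "/collections", collection_payload(urls, schema))

    def get_collection(self, job_id):
        return self._call("GET", f"/collections/{job_id}")

    def list_collections(self):
        return self._call("GET", "/collections")
```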

Notes

  • Deck jobs can take 30-60+ seconds for complex pages. The integration should support async polling (submit job → poll for results) rather than synchronous execution.
  • This is particularly valuable for market intelligence workflows where agents need to collect competitor data, pricing, or product information from JS-heavy sites.
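The submit-then-poll pattern from the first note above could be implemented roughly like this. The `get_status` callable and the `"completed"`/`"failed"` status values are assumptions; injecting the getter keeps the loop independent of any real endpoint.

```python
# Generic polling loop for long-running Deck jobs (30-60+ seconds).
# Status names ("completed", "failed") are assumed for illustration.
import time

def poll_collection(get_status, job_id, timeout=120.0, interval=5.0):
    """Poll until the job reports a terminal status or the timeout expires.

    get_status: callable taking a job id and returning a dict with a
    "status" key (e.g. a DeckClient.get_collection bound method).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = get_status(job_id)
        if result.get("status") in ("completed", "failed"):
            return result
        time.sleep(interval)
    raise TimeoutError(f"job {job_id} still running after {timeout}s")
```

A webhook callback avoids the polling entirely, but the loop above is the simpler fallback for agent runtimes that can only make outbound requests.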

Metadata

Labels

platform-request: Request for a new platform integration on One.
