Get your pages indexed in 48 hours.
Submit URLs straight to Google's Indexing API. Track coverage across every Search Console property in one place, keep your history past 16 months, and automate the whole thing from your deploy pipeline.
Indexing shouldn't be luck.
Three levers most SEO tools leave on the table: direct API submission, unified multi-site visibility, and retention that outlasts Google's 16-month window.
Submit directly. Skip the queue.
Google's Indexing API notifies the crawler immediately. New pages and updated content get picked up in hours, not the weeks you'd spend hoping Googlebot stumbles on your sitemap.
Every site, one tab
Verified Search Console properties, indexing rate, and impressions in a single dashboard. No more switching accounts to compare.
History beyond 16 months
Google wipes your Search Console data every 16 months. We keep it as long as you need, searchable and exportable.
Open, portable, yours
GPL-3.0 on GitHub. Self-host on Cloudflare with your own Google OAuth app if you'd rather own the full stack.
Built for terminals too
Programmatic API for CI/CD pipelines. Submit URLs from your deploy script, not your browser.
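A deploy-step submission can be a few lines of Python. The sketch below builds an authenticated publish request per URL, assuming a bearer token in an `INDEXING_TOKEN` environment variable (that variable name and the page list are illustrative; the endpoint is Google's documented publish endpoint):

```python
"""Sketch of a deploy-step submission script (token source is an assumption)."""
import json
import os
import urllib.request

ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url: str, update_type: str = "URL_UPDATED") -> dict:
    """Build the JSON body the Indexing API expects for one URL."""
    return {"url": url, "type": update_type}

def build_request(url: str, token: str) -> urllib.request.Request:
    """Prepare (not send) an authenticated publish request for one URL."""
    body = json.dumps(build_notification(url)).encode()
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

if __name__ == "__main__" and "INDEXING_TOKEN" in os.environ:
    token = os.environ["INDEXING_TOKEN"]
    for page in ["https://example.com/new-page"]:  # your freshly deployed URLs
        with urllib.request.urlopen(build_request(page, token)) as resp:
            print(page, resp.status)
```

Dropping this at the end of a deploy job means every release notifies Google the moment it ships.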
From Search Console to indexed in four steps
Sign in with your Google account
One-click OAuth. We read your Search Console properties — never write, never share.
- Read-only access to Search Console
- Revoke any time from your Google account
- Tokens encrypted at rest
See every site in one place
All your verified properties, indexing rate, and Search Console metrics — no tab-switching.
- Aggregate across every property you own
- Filter by domain, path, or status
- Historical data preserved past Google's 16 months
Push URLs to the Indexing API
Paste a list or connect a sitemap. We batch the requests and respect Google's 200/day publish quota.
- Bulk submission up to the daily quota
- Automatic retry on transient failures
- Programmatic access via API
# Submit a URL to the Indexing API
curl -X POST \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $TOKEN" \
  --data '{
    "url": "https://example.com/new-page",
    "type": "URL_UPDATED"
  }' \
  https://indexing.googleapis.com/v3/urlNotifications:publish
Watch the status flip to indexed
We poll the getMetadata endpoint and update your dashboard. No more refreshing Search Console.
- Live status per URL
- Alerts when pages drop out of the index
- Export the full history as CSV
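The polling described above boils down to a backoff loop. A minimal sketch, with the metadata fetch injected so it can run without credentials (in production it would GET the `urlNotifications/metadata` endpoint with a bearer token; the delay values are illustrative, not Google's):

```python
"""Poll for a URL's notification metadata until it appears."""
import time
from typing import Callable, Optional

def wait_for_notification(
    url: str,
    fetch_metadata: Callable[[str], Optional[dict]],  # returns None until known
    attempts: int = 5,
    base_delay: float = 1.0,
) -> Optional[dict]:
    """Poll with exponential backoff; return metadata once Google reports it."""
    for attempt in range(attempts):
        meta = fetch_metadata(url)
        if meta is not None:
            return meta
        time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...
    return None  # still unknown after all attempts
```

The dashboard runs the equivalent of this loop for you, so a URL's row flips to indexed without anyone refreshing Search Console.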
Understand the API, then automate it.
The complete Google Indexing API guide
How the API works, authentication, quotas, error codes, plus working code samples in curl, TypeScript, and Python. One read, production-ready.
Step-by-step tutorial
Set up a GCP project, create a service account, connect to Search Console, and make your first API call.
Node.js implementation
Error handling, batch requests, retry logic, and production-ready patterns you can drop into your deploy pipeline.
Bulk URL submission
Submit hundreds of URLs a day within Google's quota. Batching, throttling, and failure recovery patterns.
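The batching-with-recovery pattern that guide covers looks roughly like this sketch: cap submissions at the 200/day publish quota, retry transient HTTP errors with backoff, and defer the overflow. The `send` callable is injected (it would POST one URL to the publish endpoint); the retry counts and status set are illustrative assumptions:

```python
"""Bulk-submission sketch: quota cap plus retry on transient failures."""
import time
from typing import Callable, Iterable

DAILY_QUOTA = 200          # Google's default publish quota per day
TRANSIENT = {429, 500, 503}  # worth retrying; anything else fails fast

def submit_batch(
    urls: Iterable[str],
    send: Callable[[str], int],  # returns an HTTP status code
    max_retries: int = 3,
    base_delay: float = 1.0,
) -> dict:
    """Submit up to DAILY_QUOTA URLs, retrying transient errors with backoff."""
    results = {"ok": [], "failed": [], "deferred": []}
    for i, url in enumerate(urls):
        if i >= DAILY_QUOTA:                 # over quota: carry over to tomorrow
            results["deferred"].append(url)
            continue
        for attempt in range(max_retries + 1):
            status = send(url)
            if status == 200:
                results["ok"].append(url)
                break
            if status not in TRANSIENT or attempt == max_retries:
                results["failed"].append(url)
                break
            time.sleep(base_delay * 2 ** attempt)  # back off, then retry
    return results
```

Deferred URLs go to the front of the next day's batch, so a large sitemap drains within a few days without ever tripping the quota.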
Quota & rate limits
The 200 requests/day cap, the 180 req/min getMetadata limit, and how to stay under both without throttling errors.
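Staying under a per-minute cap is simplest with even client-side pacing: at the 180 req/min figure quoted above, one request every 60/180 ≈ 0.33 s never bursts past the limit. A minimal pacer sketch (the class and its API are illustrative, not part of any Google SDK):

```python
"""Client-side pacing sketch for a per-minute rate limit."""
METADATA_LIMIT_PER_MIN = 180  # the getMetadata cap quoted above

def min_interval(limit_per_min: int = METADATA_LIMIT_PER_MIN) -> float:
    """Seconds to wait between requests to stay under a per-minute cap."""
    return 60.0 / limit_per_min

class Pacer:
    """Tracks the next allowed send time for an evenly paced request stream."""

    def __init__(self, limit_per_min: int = METADATA_LIMIT_PER_MIN):
        self.interval = min_interval(limit_per_min)
        self.next_allowed = 0.0

    def delay_for(self, now: float) -> float:
        """Return how long to sleep before sending a request at time `now`."""
        wait = max(0.0, self.next_allowed - now)
        self.next_allowed = max(now, self.next_allowed) + self.interval
        return wait
```

Call `time.sleep(pacer.delay_for(time.monotonic()))` before each metadata request and the stream stays evenly spaced under the cap.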
Google is deleting your data.
After 16 months, your Search Console history is gone. Seasonal patterns, year-over-year comparisons, the whole picture — wiped.
Search Console keeps data for the last 16 months. As a result, SEO reports in Analytics also include a maximum of 16 months of data.
Google Analytics Documentation →
No paywall, no surveillance, no lock-in.
Released on GitHub. Built with Nuxt by a Nuxt core team member.
We collect what's needed to make the product work. Delete it any time.
Encrypted at rest and in transit for every sensitive field, including OAuth tokens.