Save the content that matters most.

ArchiveBox Plugin Gallery

Browse the plugins that ArchiveBox & abx-dl use to archive content from websites.

61 Plugins


Quick Start: abx-dl
pip install abx-dl
abx-dl install
mkdir archive
cd archive
abx-dl 'https://example.com'
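Plugin selection and per-plugin configuration compose on the same command line. A minimal sketch (URL and values are placeholders; each option is documented in its plugin's section below):

```shell
# Enable only the screenshot plugin and override the shared RESOLUTION
# fallback that SCREENSHOT_RESOLUTION inherits when it is unset.
RESOLUTION=1920,1080 abx-dl --plugins=screenshot 'https://example.com'
```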

yt-dlp (plugin ID: ytdlp)

Download video and audio media with metadata, subtitles, thumbnails, and description sidecars.

#02: 1 hook, 9 config options

Dependencies & Outputs

Required Binaries
yt-dlp (providers: env, pip, brew, apt)
node (providers: env, apt, brew)
ffmpeg (providers: env, apt, brew)
Output Mimetypes
🎡 audio 🎬 video πŸ–Ό image πŸ’¬ application/x-subrip πŸ’¬ text/vtt πŸ“¦ application/json πŸ“„ text/plain

Run It

abx-dl --plugins=ytdlp 'https://example.com'
YTDLP_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.


Env Var Config Options

YTDLP_ENABLED: Enable video/audio downloading with yt-dlp. Default: true (boolean). Aliases: MEDIA_ENABLED, SAVE_MEDIA, USE_MEDIA, USE_YTDLP, FETCH_MEDIA, SAVE_YTDLP
YTDLP_BINARY: Path to yt-dlp binary. Default: "yt-dlp" (string). Aliases: YOUTUBEDL_BINARY, YOUTUBE_DL_BINARY
NODE_BINARY: Path to Node.js binary for yt-dlp JS runtime. Default: "node" (string)
YTDLP_TIMEOUT: Timeout for yt-dlp downloads in seconds. Default: 3600 (integer, min 30). Alias: MEDIA_TIMEOUT. Fallback: TIMEOUT
YTDLP_COOKIES_FILE: Path to cookies file. Default: "" (string). Fallback: COOKIES_FILE
YTDLP_MAX_SIZE: Maximum file size for yt-dlp downloads. Default: "750m" (string, pattern ^\d+[kmgKMG]?$). Alias: MEDIA_MAX_SIZE
YTDLP_CHECK_SSL_VALIDITY: Whether to verify SSL certificates. Default: true (boolean). Fallback: CHECK_SSL_VALIDITY
YTDLP_ARGS: Default yt-dlp arguments (array). Alias: YTDLP_DEFAULT_ARGS. Default:
    --restrict-filenames --trim-filenames=128 --write-description --write-info-json
    --write-thumbnail --write-sub --write-auto-subs --convert-subs=srt --yes-playlist
    --continue --no-abort-on-error --ignore-errors --geo-bypass --add-metadata
    --no-progress --remote-components=ejs:github -o "%(title)s.%(ext)s"
YTDLP_ARGS_EXTRA: Extra arguments to append to yt-dlp command. Default: [] (array). Alias: YTDLP_EXTRA_ARGS
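For example, to cap media downloads and shorten the timeout for a single run (values are illustrative; the variables are the ones documented above):

```shell
# 100 MB size cap, 10-minute timeout, yt-dlp plugin only.
YTDLP_MAX_SIZE=100m YTDLP_TIMEOUT=600 abx-dl --plugins=ytdlp 'https://example.com'
```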

gallery-dl (plugin ID: gallerydl)

Download image and media galleries along with metadata sidecars from supported sites.

#03: 1 hook, 7 config options

Dependencies & Outputs

Required Binaries
gallery-dl providers=env,pip,brew,apt
Output Mimetypes
πŸ–Ό image 🎬 video πŸ“¦ application/json πŸ“„ text/plain πŸ—œ application/zip

Run It

abx-dl --plugins=gallerydl 'https://example.com'
GALLERYDL_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

GALLERYDL_ENABLED: Enable gallery downloading with gallery-dl. Default: true (boolean). Aliases: SAVE_GALLERYDL, USE_GALLERYDL
GALLERYDL_BINARY: Path to gallery-dl binary. Default: "gallery-dl" (string)
GALLERYDL_TIMEOUT: Timeout for gallery downloads in seconds. Default: 3600 (integer, min 30). Fallback: TIMEOUT
GALLERYDL_COOKIES_FILE: Path to cookies file. Default: "" (string). Fallback: COOKIES_FILE
GALLERYDL_CHECK_SSL_VALIDITY: Whether to verify SSL certificates. Default: true (boolean). Fallback: CHECK_SSL_VALIDITY
GALLERYDL_ARGS: Default gallery-dl arguments (array). Alias: GALLERYDL_DEFAULT_ARGS. Default: --write-metadata --write-info-json
GALLERYDL_ARGS_EXTRA: Extra arguments to append to gallery-dl command. Default: [] (array). Alias: GALLERYDL_EXTRA_ARGS

forum-dl (plugin ID: forumdl)

Download forum threads and exports in JSONL, WARC, and mailbox-style archive formats.

#04: 1 hook, 6 config options

Dependencies & Outputs

Required Binaries
forum-dl providers=env,pip
Output Mimetypes
πŸ“¦ application/x-ndjson πŸ—„ application/warc βœ‰οΈ message/rfc822

Run It

abx-dl --plugins=forumdl 'https://example.com'
FORUMDL_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

FORUMDL_ENABLED: Enable forum downloading with forum-dl. Default: true (boolean). Aliases: SAVE_FORUMDL, USE_FORUMDL
FORUMDL_BINARY: Path to forum-dl binary. Default: "forum-dl" (string)
FORUMDL_TIMEOUT: Timeout for forum downloads in seconds. Default: 3600 (integer, min 30). Fallback: TIMEOUT
FORUMDL_OUTPUT_FORMAT: Output format for forum downloads. Default: "jsonl" (string; one of jsonl | warc | mbox | maildir | mh | mmdf | babyl)
FORUMDL_ARGS: Default forum-dl arguments. Default: [] (array). Alias: FORUMDL_DEFAULT_ARGS
FORUMDL_ARGS_EXTRA: Extra arguments to append to forum-dl command. Default: [] (array). Alias: FORUMDL_EXTRA_ARGS
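For instance, to export a thread as a WARC instead of the default JSONL (warc is one of the documented FORUMDL_OUTPUT_FORMAT choices):

```shell
FORUMDL_OUTPUT_FORMAT=warc abx-dl --plugins=forumdl 'https://example.com'
```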

Git (plugin ID: git)

Clone git repositories from supported repository URLs into the snapshot output directory.

#05: 1 hook, 6 config options

Dependencies & Outputs

Required Binaries
git providers=env,apt,brew
Output Mimetypes
πŸ“„ text πŸ’Ύ application πŸ–Ό image 🎡 audio 🎬 video πŸ”€ font

Run It

abx-dl --plugins=git 'https://example.com'
GIT_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

GIT_ENABLED: Enable git repository cloning. Default: true (boolean). Aliases: SAVE_GIT, USE_GIT
GIT_BINARY: Path to git binary. Default: "git" (string)
GIT_TIMEOUT: Timeout for git operations in seconds. Default: 120 (integer, min 10). Fallback: TIMEOUT
GIT_DOMAINS: Comma-separated list of domains to treat as git repositories. Default: "github.com,gitlab.com,bitbucket.org,gist.github.com,codeberg.org,gitea.com,git.sr.ht" (string)
GIT_ARGS: Default git arguments (array). Alias: GIT_DEFAULT_ARGS. Default: clone --depth=1 --recursive
GIT_ARGS_EXTRA: Extra arguments to append to git command. Default: [] (array). Alias: GIT_EXTRA_ARGS
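A sketch of pointing the plugin at a self-hosted forge (git.example.org is a placeholder; this assumes GIT_DOMAINS replaces the default list rather than extending it, so defaults you still want must be repeated):

```shell
GIT_DOMAINS="github.com,gitlab.com,git.example.org" \
  abx-dl --plugins=git 'https://git.example.org/user/repo'
```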

wget (plugin ID: wget)

Archive pages and their requisites with wget, optionally writing WARC captures.

#06: 1 hook, 9 config options

Dependencies & Outputs

Required Binaries
wget providers=env,apt,brew
Output Mimetypes
🌐 text/html πŸ—„ application/warc πŸ—œ application/gzip πŸ–Ό image 🎨 text/css ⚑ application/javascript πŸ”€ font 🎡 audio 🎬 video

Run It

abx-dl --plugins=wget 'https://example.com'
WGET_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

WGET_ENABLED: Enable wget archiving. Default: true (boolean). Aliases: SAVE_WGET, USE_WGET
WGET_WARC_ENABLED: Save WARC archive file. Default: true (boolean). Aliases: SAVE_WARC, WGET_SAVE_WARC
WGET_BINARY: Path to wget binary. Default: "wget" (string)
WGET_TIMEOUT: Timeout for wget in seconds. Default: 60 (integer, min 5). Fallback: TIMEOUT
WGET_USER_AGENT: User agent string for wget. Default: "" (string). Fallback: USER_AGENT
WGET_COOKIES_FILE: Path to cookies file. Default: "" (string). Fallback: COOKIES_FILE
WGET_CHECK_SSL_VALIDITY: Whether to verify SSL certificates. Default: true (boolean). Fallback: CHECK_SSL_VALIDITY
WGET_ARGS: Default wget arguments (array). Alias: WGET_DEFAULT_ARGS. Default:
    --no-verbose --adjust-extension --convert-links --force-directories
    --backup-converted --span-hosts --no-parent --page-requisites
    --restrict-file-names=windows --tries=2 -e robots=off
WGET_ARGS_EXTRA: Extra arguments to append to wget command. Default: [] (array). Alias: WGET_EXTRA_ARGS
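For example, to skip the WARC sidecar and present a specific user agent for one run (user-agent value is illustrative):

```shell
WGET_WARC_ENABLED=false WGET_USER_AGENT='Mozilla/5.0 (X11; Linux x86_64)' \
  abx-dl --plugins=wget 'https://example.com'
```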

Archive.org (plugin ID: archivedotorg)

Submit URLs to the Internet Archive Wayback Machine and save the resulting archive link.

#08: 1 hook, 3 config options

Dependencies & Outputs

Output Mimetypes
πŸ“„ text/plain

Run It

abx-dl --plugins=archivedotorg 'https://example.com'
ARCHIVEDOTORG_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

ARCHIVEDOTORG_ENABLED: Submit URLs to the archive.org Wayback Machine. Default: true (boolean). Aliases: SAVE_ARCHIVEDOTORG, USE_ARCHIVEDOTORG, SUBMIT_ARCHIVEDOTORG
ARCHIVEDOTORG_TIMEOUT: Timeout for archive.org submission in seconds. Default: 60 (integer, min 10). Fallback: TIMEOUT
ARCHIVEDOTORG_USER_AGENT: User agent string. Default: "" (string). Fallback: USER_AGENT

Favicon (plugin ID: favicon)

Fetch and save the site favicon or touch icon.

#11: 1 hook, 4 config options

Dependencies & Outputs

Output Mimetypes
πŸ–Ό image

Run It

abx-dl --plugins=favicon 'https://example.com'
FAVICON_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

FAVICON_ENABLED: Enable favicon downloading. Default: true (boolean). Aliases: SAVE_FAVICON, USE_FAVICON
FAVICON_TIMEOUT: Timeout for favicon fetch in seconds. Default: 30 (integer, min 5). Fallback: TIMEOUT
FAVICON_USER_AGENT: User agent string. Default: "" (string). Fallback: USER_AGENT
FAVICON_PROVIDER: Fallback favicon provider URL template; supports either {} or {domain} placeholders. Default: "https://www.google.com/s2/favicons?domain={}&format=ico" (string)
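A minimal sketch of how the {} placeholder in FAVICON_PROVIDER is presumably filled (assuming plain substitution of the page's domain; the sed pipeline is illustrative, not the plugin's actual implementation):

```shell
domain="example.com"
template='https://www.google.com/s2/favicons?domain={}&format=ico'
# Substitute the literal "{}" with the domain to get the fetch URL.
url=$(printf '%s' "$template" | sed "s/{}/$domain/")
echo "$url"
# → https://www.google.com/s2/favicons?domain=example.com&format=ico
```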

uBlock Origin (plugin ID: ublock)

Install the uBlock Origin extension to block ads, trackers, and other page clutter during archiving.

#12: 1 hook, 2 config options

Dependencies & Outputs

Required Binaries
chromium (providers: env, puppeteer)
ublock (providers: chromewebstore)

Run It

abx-dl --plugins=ublock 'https://example.com'
UBLOCK_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
UBLOCK_ENABLED: Enable the uBlock Origin browser extension for ad blocking. Default: true (boolean). Alias: USE_UBLOCK

I Still Don't Care About Cookies (plugin ID: istilldontcareaboutcookies)

Install the I Still Don't Care About Cookies extension to dismiss cookie banners during archiving.

#13: 1 hook, 2 config options

Dependencies & Outputs

Required Binaries
chromium (providers: env, puppeteer)
istilldontcareaboutcookies (providers: chromewebstore)

Run It

abx-dl --plugins=istilldontcareaboutcookies 'https://example.com'
ISTILLDONTCAREABOUTCOOKIES_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
ISTILLDONTCAREABOUTCOOKIES_ENABLED: Enable the I Still Don't Care About Cookies browser extension. Default: true (boolean). Alias: USE_ISTILLDONTCAREABOUTCOOKIES

2Captcha (plugin ID: twocaptcha)

Install and configure the 2Captcha extension to solve CAPTCHAs during browser-based archiving.

#14: 2 hooks, 7 config options

Dependencies & Outputs

Required Binaries
chromium (providers: env, puppeteer)
twocaptcha (providers: chromewebstore)

Run It

abx-dl --plugins=twocaptcha 'https://example.com'
TWOCAPTCHA_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
TWOCAPTCHA_ENABLED: Enable the 2Captcha browser extension for automatic CAPTCHA solving. Default: true (boolean). Aliases: CAPTCHA2_ENABLED, USE_CAPTCHA2, USE_TWOCAPTCHA
TWOCAPTCHA_API_KEY: 2Captcha API key for the CAPTCHA solving service (get one from https://2captcha.com). Default: "" (string). Aliases: API_KEY_2CAPTCHA, CAPTCHA2_API_KEY
TWOCAPTCHA_RETRY_COUNT: Number of times to retry CAPTCHA solving on error. Default: 3 (integer, min 0). Alias: CAPTCHA2_RETRY_COUNT
TWOCAPTCHA_RETRY_DELAY: Delay in seconds between CAPTCHA solving retries. Default: 5 (integer, min 0). Alias: CAPTCHA2_RETRY_DELAY
TWOCAPTCHA_TIMEOUT: Timeout for CAPTCHA solving in seconds. Default: 60 (integer, min 5). Alias: CAPTCHA2_TIMEOUT. Fallback: TIMEOUT
TWOCAPTCHA_AUTO_SUBMIT: Automatically submit forms after the CAPTCHA is solved. Default: false (boolean)
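A typical invocation supplies the API key and bumps the retry count (key value is a placeholder):

```shell
TWOCAPTCHA_API_KEY=YOUR_API_KEY TWOCAPTCHA_RETRY_COUNT=5 \
  abx-dl --plugins=twocaptcha 'https://example.com'
```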

Modal Closer (plugin ID: modalcloser)

Automatically dismiss dialogs, cookie banners, and framework modals while the page is being archived.

#15: 1 hook, 4 config options

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer

Run It

abx-dl --plugins=modalcloser 'https://example.com'
MODALCLOSER_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
MODALCLOSER_ENABLED: Enable automatic modal and dialog closing. Default: true (boolean). Aliases: CLOSE_MODALS, AUTO_CLOSE_MODALS
MODALCLOSER_TIMEOUT: Delay before auto-closing dialogs, in milliseconds. Default: 1250 (integer, min 100)
MODALCLOSER_POLL_INTERVAL: How often to check for CSS modals, in milliseconds. Default: 500 (integer, min 100)

Console Log (plugin ID: consolelog)

Capture browser console messages emitted while the page loads.

#21: 1 hook, 3 config options

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
πŸ“¦ application/x-ndjson

Run It

abx-dl --plugins=consolelog 'https://example.com'
CONSOLELOG_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
CONSOLELOG_ENABLED: Enable console log capture. Default: true (boolean). Aliases: SAVE_CONSOLELOG, USE_CONSOLELOG
CONSOLELOG_TIMEOUT: Timeout for console log capture in seconds. Default: 30 (integer, min 5). Fallback: TIMEOUT

DNS (plugin ID: dns)

Record DNS activity observed while loading the page in Chrome.

#22: 1 hook, 3 config options

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
πŸ“¦ application/x-ndjson

Run It

abx-dl --plugins=dns 'https://example.com'
DNS_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
DNS_ENABLED: Enable DNS traffic recording during page load. Default: true (boolean). Aliases: SAVE_DNS, USE_DNS
DNS_TIMEOUT: Timeout for DNS recording in seconds. Default: 30 (integer, min 5). Fallback: TIMEOUT

SSL Certificates (plugin ID: sslcerts)

Capture TLS certificate and connection metadata for the loaded page.

#23: 1 hook, 3 config options

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
πŸ“¦ application/x-ndjson

Run It

abx-dl --plugins=sslcerts 'https://example.com'
SSLCERTS_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
SSLCERTS_ENABLED: Enable SSL certificate capture. Default: true (boolean). Aliases: SAVE_SSLCERTS, USE_SSLCERTS
SSLCERTS_TIMEOUT: Timeout for SSL capture in seconds. Default: 30 (integer, min 5). Fallback: TIMEOUT

Responses (plugin ID: responses)

Capture HTTP response metadata for requests made during page load.

#24: 1 hook, 3 config options

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
πŸ“¦ application/x-ndjson πŸ“„ text πŸ–Ό image 🎡 audio 🎬 video πŸ’Ύ application πŸ”€ font

Run It

abx-dl --plugins=responses 'https://example.com'
RESPONSES_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
RESPONSES_ENABLED: Enable HTTP response capture. Default: true (boolean). Aliases: SAVE_RESPONSES, USE_RESPONSES
RESPONSES_TIMEOUT: Timeout for response capture in seconds. Default: 30 (integer, min 5). Fallback: TIMEOUT

Redirects (plugin ID: redirects)

Capture the redirect chain encountered while loading the page.

#25: 1 hook, 3 config options

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
πŸ“¦ application/x-ndjson

Run It

abx-dl --plugins=redirects 'https://example.com'
REDIRECTS_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
REDIRECTS_ENABLED: Enable redirect chain capture. Default: true (boolean). Aliases: SAVE_REDIRECTS, USE_REDIRECTS
REDIRECTS_TIMEOUT: Timeout for redirect capture in seconds. Default: 30 (integer, min 5). Fallback: TIMEOUT

Static File (plugin ID: staticfile)

Detect and download static-file responses directly when a URL resolves to a non-HTML asset.

#26: 1 hook, 3 config options

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
πŸ“• application/pdf πŸ“š application/epub+zip πŸ–Ό image 🎡 audio 🎬 video πŸ“¦ application/json πŸ“‹ application/xml πŸ“Š text/csv πŸ“‹ text/xml πŸ—œ application/zip πŸ’Ύ application/octet-stream πŸ’Ύ application/x-

Run It

abx-dl --plugins=staticfile 'https://example.com'
STATICFILE_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
STATICFILE_ENABLED: Enable static file detection. Default: true (boolean). Aliases: SAVE_STATICFILE, USE_STATICFILE
STATICFILE_TIMEOUT: Timeout for static file detection in seconds. Default: 30 (integer, min 5). Fallback: TIMEOUT

Headers (plugin ID: headers)

Capture HTTP headers for the main document response.

#27: 1 hook, 3 config options

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
πŸ“¦ application/json

Run It

abx-dl --plugins=headers 'https://example.com'
HEADERS_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
HEADERS_ENABLED: Enable HTTP headers capture. Default: true (boolean). Aliases: SAVE_HEADERS, USE_HEADERS
HEADERS_TIMEOUT: Timeout for headers capture in seconds. Default: 30 (integer, min 5). Fallback: TIMEOUT

Chrome (plugin ID: chrome)

Launch and manage a shared Chromium session for browser-driven plugins.

#30: 7 hooks, 22 config options

Dependencies & Outputs

Required Binaries
node (providers: env, apt, brew)
chromium (providers: env, puppeteer)
Output Mimetypes
πŸ“„ text/plain πŸ“¦ application/json

Run It

abx-dl --plugins=chrome 'https://example.com'
CHROME_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_ENABLED: Enable Chromium browser integration for archiving. Default: true (boolean). Alias: USE_CHROME
CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
NODE_BINARY: Path to Node.js binary. Default: "node" (string)
CHROME_TIMEOUT: Timeout for Chrome operations in seconds. Default: 60 (integer, min 5). Fallback: TIMEOUT
CHROME_LAUNCH_ATTEMPTS: Maximum Chromium launch attempts before failing on transient startup errors. Default: 3 (integer, min 1)
CHROME_HEADLESS: Run Chrome in headless mode. Default: true (boolean)
PERSONAS_DIR: Shared Chrome/browser personas root. Default: "" (string)
ACTIVE_PERSONA: Active browser persona name. Default: "Default" (string)
CHROME_SANDBOX: Enable the Chrome sandbox (disable in Docker with --no-sandbox). Default: true (boolean)
CHROME_RESOLUTION: Browser viewport resolution (width,height). Default: "1440,2000" (string, pattern ^\d+,\d+$). Fallback: RESOLUTION
CHROME_USER_DATA_DIR: Path to Chrome user data directory for persistent sessions (derived from ACTIVE_PERSONA if not set). Default: "" (string)
CHROME_USER_AGENT: User agent string for Chrome. Default: "" (string). Fallback: USER_AGENT
CHROME_CDP_URL: Connect to an already-running browser over CDP instead of launching a new local Chromium process. Default: "" (string)
CHROME_IS_LOCAL: Whether the managed browser process is local and should have a live chrome.pid marker. Default: true (boolean)
CHROME_KEEPALIVE: Keep the browser alive after the owning crawl/snapshot hook exits instead of closing it during cleanup. Default: false (boolean)
CHROME_ISOLATION: Whether Chrome runs as one shared browser per crawl or a separate browser per snapshot. Default: "crawl" (string; one of crawl | snapshot)
CHROME_ARGS: Default Chrome command-line arguments (static flags only; dynamic args like --user-data-dir are added at runtime). Type: array. Alias: CHROME_DEFAULT_ARGS. Default:
    --no-first-run --no-default-browser-check --disable-default-apps --disable-sync
    --disable-infobars --disable-blink-features=AutomationControlled
    --disable-component-update --disable-domain-reliability --disable-breakpad
    --disable-client-side-phishing-detection --disable-hang-monitor
    --disable-speech-synthesis-api --disable-speech-api --disable-print-preview
    --disable-notifications --disable-desktop-notifications --disable-popup-blocking
    --disable-prompt-on-repost --disable-external-intent-requests
    --disable-session-crashed-bubble --disable-search-engine-choice-screen
    --disable-datasaver-prompt --ash-no-nudges --hide-crash-restore-bubble
    --suppress-message-center-popups --noerrdialogs --no-pings
    --silent-debugger-extension-api --deny-permission-prompts
    --safebrowsing-disable-auto-update --metrics-recording-only
    --password-store=basic --use-mock-keychain --disable-cookie-encryption
    --font-render-hinting=none --force-color-profile=srgb --disable-partial-raster
    --disable-skia-runtime-opts --disable-2d-canvas-clip-aa --enable-webgl
    --hide-scrollbars --export-tagged-pdf --generate-pdf-document-outline
    --disable-lazy-loading --disable-renderer-backgrounding
    --disable-background-networking --disable-background-timer-throttling
    --disable-backgrounding-occluded-windows --disable-ipc-flooding-protection
    --disable-extensions-http-throttling --disable-field-trial-config
    --disable-back-forward-cache --autoplay-policy=no-user-gesture-required
    --disable-gesture-requirement-for-media-playback --lang=en-US,en;q=0.9
    --log-level=2 --enable-logging=stderr
CHROME_ARGS_EXTRA: Extra arguments to append to Chrome command (for user customization). Default: [] (array). Alias: CHROME_EXTRA_ARGS
CHROME_PAGELOAD_TIMEOUT: Timeout for page navigation/load in seconds. Default: 60 (integer, min 5). Fallback: CHROME_TIMEOUT
CHROME_WAIT_FOR: Page load completion condition. Default: "networkidle2" (string; one of domcontentloaded | load | networkidle0 | networkidle2)
CHROME_DELAY_AFTER_LOAD: Extra delay in seconds after page load completes before archiving (useful for JS-heavy SPAs). Default: 0 (number, min 0)
CHROME_CHECK_SSL_VALIDITY: Whether to verify SSL certificates (disable for self-signed certs). Default: true (boolean). Fallback: CHECK_SSL_VALIDITY
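For example, to point the browser-driven plugins at an already-running Chromium over CDP instead of launching a local process (the debugging URL is a placeholder, and whether other CHROME_* options also need adjusting in this mode is not stated here):

```shell
CHROME_CDP_URL='http://localhost:9222' CHROME_KEEPALIVE=true \
  abx-dl 'https://example.com'
```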

SEO (plugin ID: seo)

Capture SEO-related metadata such as meta tags and Open Graph fields.

#38: 1 hook, 3 config options

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
πŸ“¦ application/json

Run It

abx-dl --plugins=seo 'https://example.com'
SEO_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
SEO_ENABLED: Enable SEO metadata capture. Default: true (boolean). Aliases: SAVE_SEO, USE_SEO
SEO_TIMEOUT: Timeout for SEO capture in seconds. Default: 30 (integer, min 5). Fallback: TIMEOUT

Accessibility (plugin ID: accessibility)

Capture the browser accessibility tree for the archived page.

#39: 1 hook, 3 config options

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
πŸ“¦ application/json

Run It

abx-dl --plugins=accessibility 'https://example.com'
ACCESSIBILITY_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
ACCESSIBILITY_ENABLED: Enable accessibility tree capture. Default: true (boolean). Aliases: SAVE_ACCESSIBILITY, USE_ACCESSIBILITY
ACCESSIBILITY_TIMEOUT: Timeout for accessibility capture in seconds. Default: 30 (integer, min 5). Fallback: TIMEOUT

Infinite Scroll (plugin ID: infiniscroll)

Expand infinite-scroll pages and load additional content before downstream capture plugins run.

#45: 1 hook, 8 config options

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer

Run It

abx-dl --plugins=infiniscroll 'https://example.com'
INFINISCROLL_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
INFINISCROLL_ENABLED: Enable infinite scroll page expansion. Default: true (boolean). Aliases: SAVE_INFINISCROLL, USE_INFINISCROLL
INFINISCROLL_TIMEOUT: Maximum timeout for scrolling in seconds. Default: 120 (integer, min 10)
INFINISCROLL_SCROLL_DELAY: Delay between scrolls in milliseconds. Default: 2000 (integer, min 500)
INFINISCROLL_SCROLL_DISTANCE: Distance to scroll per step in pixels. Default: 1600 (integer, min 100)
INFINISCROLL_SCROLL_LIMIT: Maximum number of scroll steps. Default: 10 (integer, min 1)
INFINISCROLL_MIN_HEIGHT: Minimum page height to scroll to in pixels. Default: 16000 (integer, min 1000)
INFINISCROLL_EXPAND_DETAILS: Expand <details> elements and click "load more" buttons for comments. Default: true (boolean)
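For a very long feed, you might scroll further and more slowly (illustrative values, both within the documented minimums):

```shell
INFINISCROLL_SCROLL_LIMIT=30 INFINISCROLL_SCROLL_DELAY=3000 \
  abx-dl --plugins=infiniscroll 'https://example.com'
```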

Claude Chrome (plugin ID: claudechrome)

Use Claude computer-use to interact with pages in Chrome via CDP screenshots and the Anthropic API.

#47: 2 hooks, 10 config options

Dependencies & Outputs

Required Binaries
chromium (providers: env, puppeteer)
node (providers: env, apt, brew)
claudechrome (providers: chromewebstore)
Output Mimetypes
πŸ“¦ application/json πŸ–Ό image/png

Run It

abx-dl --plugins=claudechrome 'https://example.com'
CLAUDECHROME_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
NODE_BINARY: Path to Node.js binary. Default: "node" (string)
PERSONAS_DIR: Shared Chrome/browser personas root. Default: "" (string)
ACTIVE_PERSONA: Active browser persona name. Default: "Default" (string)
CLAUDECHROME_ENABLED: Enable the Claude for Chrome browser extension for AI-driven page interaction. Default: false (boolean). Alias: USE_CLAUDECHROME
CLAUDECHROME_PROMPT: Prompt for Claude to execute on the page; Claude can click buttons, fill forms, download files, and interact with any page element. Default: "Look at the current page. If there are any 'expand', 'show more', 'load more', or similar buttons/links, click them all to reveal hidden content. Report what you did." (string)
CLAUDECHROME_TIMEOUT: Timeout for Claude for Chrome operations in seconds. Default: 120 (integer, min 10). Fallback: TIMEOUT
CLAUDECHROME_MODEL: Claude model to use (e.g. claude-sonnet-4-6, claude-opus-4-6, claude-haiku-4-5-20251001); availability depends on your plan. Default: "claude-sonnet-4-6" (string)
CLAUDECHROME_MAX_ACTIONS: Maximum number of agentic loop iterations (screenshots + actions) per page. Default: 15 (integer, min 1)
ANTHROPIC_API_KEY: Anthropic API key for Claude for Chrome authentication. Default: "" (string)
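Because this plugin is off by default, it must be explicitly enabled and given a key; a sketch (key and prompt are placeholders):

```shell
ANTHROPIC_API_KEY=YOUR_ANTHROPIC_KEY CLAUDECHROME_ENABLED=true \
  CLAUDECHROME_PROMPT='Dismiss any paywall or signup overlay, then stop.' \
  abx-dl --plugins=claudechrome 'https://example.com'
```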

SingleFile (plugin ID: singlefile)

Save a complete page as a single self-contained HTML file using the SingleFile extension or CLI.

#50: 1 hook, 13 config options

Dependencies & Outputs

Required Binaries
chromium (providers: env, puppeteer)
single-file (providers: env, npm)
singlefile (providers: chromewebstore)
Output Mimetypes
🌐 text/html

Run It

abx-dl --plugins=singlefile 'https://example.com'
SINGLEFILE_ENABLED=true archivebox add 'https://example.com'


Env Var Config Options

CHROME_BINARY: Path to Chromium binary. Default: "chromium" (string). Aliases: CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
PERSONAS_DIR: Shared Chrome/browser personas root. Default: "" (string)
ACTIVE_PERSONA: Active browser persona name. Default: "Default" (string)
SINGLEFILE_ENABLED: Enable SingleFile archiving. Default: true (boolean). Aliases: SAVE_SINGLEFILE, USE_SINGLEFILE
SINGLEFILE_BINARY: Path to single-file binary. Default: "single-file" (string). Alias: SINGLE_FILE_BINARY
NODE_BINARY: Path to Node.js binary. Default: "node" (string)
SINGLEFILE_TIMEOUT: Timeout for SingleFile in seconds. Default: 60 (integer, min 10). Fallback: TIMEOUT
SINGLEFILE_USER_AGENT: User agent string. Default: "" (string). Fallback: USER_AGENT
SINGLEFILE_COOKIES_FILE: Path to cookies file. Default: "" (string). Fallback: COOKIES_FILE
SINGLEFILE_CHECK_SSL_VALIDITY: Whether to verify SSL certificates. Default: true (boolean). Fallback: CHECK_SSL_VALIDITY
SINGLEFILE_CHROME_ARGS: Chrome command-line arguments for SingleFile. Default: [] (array). Fallback: CHROME_ARGS
SINGLEFILE_ARGS: Default single-file arguments (array). Alias: SINGLEFILE_DEFAULT_ARGS. Default: --browser-headless
SINGLEFILE_ARGS_EXTRA: Extra arguments to append to single-file command. Default: [] (array). Alias: SINGLEFILE_EXTRA_ARGS
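To capture a logged-in view, pass a cookies file exported from a browser session (the path is a placeholder; the expected file format is not stated here, though tools like this commonly accept the Netscape cookies.txt format):

```shell
SINGLEFILE_COOKIES_FILE=./cookies.txt \
  abx-dl --plugins=singlefile 'https://example.com'
```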

Screenshot

screenshot

Capture a PNG screenshot of the rendered page.

#51 Snapshot Embed Fullscreen 1 hook(s) 4 configs
chromium (env,puppeteer)
🖼 image/png

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
🖼 image/png

Run It

abx-dl --plugins=screenshot 'https://example.com'
SCREENSHOT_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
CHROME_BINARY: Path to Chromium binary "chromium" string CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
SCREENSHOT_ENABLED: Enable screenshot capture true boolean SAVE_SCREENSHOT, USE_SCREENSHOT
SCREENSHOT_TIMEOUT: Timeout for screenshot capture in seconds 60 integer (min 5) fallback: TIMEOUT
SCREENSHOT_RESOLUTION: Screenshot resolution (width,height) "1440,2000" string (pattern ^\d+,\d+$) fallback: RESOLUTION
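The resolution value must match the pattern ^\d+,\d+$. A minimal sketch of validating and splitting it; parse_resolution is a hypothetical helper, not part of the plugin:

```python
import re

# SCREENSHOT_RESOLUTION must match ^\d+,\d+$ (width,height in pixels)
RESOLUTION_RE = re.compile(r"^\d+,\d+$")

def parse_resolution(value: str) -> tuple[int, int]:
    """Validate and split a "width,height" resolution string."""
    if not RESOLUTION_RE.match(value):
        raise ValueError(f"invalid resolution: {value!r}")
    width, height = value.split(",")
    return int(width), int(height)

print(parse_resolution("1440,2000"))  # the plugin default
```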

PDF

pdf

Render the current page to PDF using the shared Chrome session.

#52 Snapshot Embed Fullscreen 1 hook(s) 4 configs
chromium (env,puppeteer)
📕 application/pdf

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
📕 application/pdf

Run It

abx-dl --plugins=pdf 'https://example.com'
PDF_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
CHROME_BINARY: Path to Chromium binary "chromium" string CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
PDF_ENABLED: Enable PDF generation true boolean SAVE_PDF, USE_PDF
PDF_TIMEOUT: Timeout for PDF generation in seconds 60 integer (min 5) fallback: TIMEOUT
PDF_RESOLUTION: PDF page resolution (width,height) "1440,2000" string (pattern ^\d+,\d+$) fallback: RESOLUTION

DOM

dom

Save the fully rendered DOM HTML from the live page.

#53 Snapshot Embed 1 hook(s) 3 configs
chromium (env,puppeteer)
🌐 text/html

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
🌐 text/html

Run It

abx-dl --plugins=dom 'https://example.com'
DOM_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
CHROME_BINARY: Path to Chromium binary "chromium" string CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
DOM_ENABLED: Enable DOM capture true boolean SAVE_DOM, USE_DOM
DOM_TIMEOUT: Timeout for DOM capture in seconds 60 integer (min 5) fallback: TIMEOUT

Title

title

Capture the final document title from the rendered page.

#54 Snapshot 1 hook(s) 3 configs
chromium (env,puppeteer)
📄 text/plain

Dependencies & Outputs

Required Binaries
chromium providers=env,puppeteer
Output Mimetypes
📄 text/plain

Run It

abx-dl --plugins=title 'https://example.com'
TITLE_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
CHROME_BINARY: Path to Chromium binary "chromium" string CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
TITLE_ENABLED: Enable title extraction true boolean SAVE_TITLE, USE_TITLE
TITLE_TIMEOUT: Timeout for title extraction in seconds 30 integer (min 5) fallback: TIMEOUT

Readability

readability

Extract article HTML, text, and metadata using Mozilla Readability.

#56 Snapshot Embed Fullscreen 1 hook(s) 5 configs
readability-extractor (env,npm)
🌐 text/html 📄 text/plain 📦 application/json

Dependencies & Outputs

Required Binaries
readability-extractor providers=env,npm
Output Mimetypes
🌐 text/html 📄 text/plain 📦 application/json

Run It

abx-dl --plugins=readability 'https://example.com'
READABILITY_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
READABILITY_ENABLED: Enable Readability text extraction true boolean SAVE_READABILITY, USE_READABILITY
READABILITY_BINARY: Path to readability-extractor binary "readability-extractor" string —
READABILITY_TIMEOUT: Timeout for Readability in seconds 30 integer (min 5) fallback: TIMEOUT
READABILITY_ARGS: Default Readability arguments [] array READABILITY_DEFAULT_ARGS
READABILITY_ARGS_EXTRA: Extra arguments to append to Readability command [] array READABILITY_EXTRA_ARGS

Defuddle

defuddle

Extract cleaned article HTML, text, and metadata from archived HTML using Defuddle.

#57 Snapshot 1 hook(s) 5 configs
defuddle (env,npm)
🌐 text/html 📄 text/plain 📦 application/json

Dependencies & Outputs

Required Binaries
defuddle providers=env,npm
Output Mimetypes
🌐 text/html 📄 text/plain 📦 application/json

Run It

abx-dl --plugins=defuddle 'https://example.com'
DEFUDDLE_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
DEFUDDLE_ENABLED: Enable Defuddle text extraction true boolean SAVE_DEFUDDLE, USE_DEFUDDLE
DEFUDDLE_BINARY: Path to defuddle binary "defuddle" string —
DEFUDDLE_TIMEOUT: Timeout for Defuddle in seconds 30 integer (min 5) fallback: TIMEOUT
DEFUDDLE_ARGS: Default Defuddle arguments [] array DEFUDDLE_DEFAULT_ARGS
DEFUDDLE_ARGS_EXTRA: Extra arguments to append to Defuddle command [] array DEFUDDLE_EXTRA_ARGS

Mercury

mercury

Extract article HTML, text, and metadata using the Postlight Mercury parser.

#57 Snapshot Embed 1 hook(s) 5 configs
postlight-parser (npm,env)
🌐 text/html 📄 text/plain 📦 application/json

Dependencies & Outputs

Required Binaries
postlight-parser providers=npm,env
Output Mimetypes
🌐 text/html 📄 text/plain 📦 application/json

Run It

abx-dl --plugins=mercury 'https://example.com'
MERCURY_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
MERCURY_ENABLED: Enable Mercury text extraction true boolean SAVE_MERCURY, USE_MERCURY
MERCURY_BINARY: Path to Mercury/Postlight parser binary "postlight-parser" string —
MERCURY_TIMEOUT: Timeout for Mercury in seconds 30 integer (min 5) fallback: TIMEOUT
MERCURY_ARGS: Default Mercury parser arguments [] array MERCURY_DEFAULT_ARGS
MERCURY_ARGS_EXTRA: Extra arguments to append to Mercury parser command [] array MERCURY_EXTRA_ARGS

Claude Code Extract

claudecodeextract

Use Claude Code to generate clean Markdown from snapshot extractor outputs.

#58 Snapshot Embed Fullscreen 1 hook(s) 6 configs
claude (env,npm)
πŸ“ text/markdown

Dependencies & Outputs

Required Binaries
claude providers=env,npm
Output Mimetypes
πŸ“ text/markdown

Run It

abx-dl --plugins=claudecodeextract 'https://example.com'
CLAUDECODEEXTRACT_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
CLAUDECODE_BINARY: Path to Claude Code CLI binary "claude" string —
CLAUDECODEEXTRACT_ENABLED: Enable Claude Code AI extraction false boolean USE_CLAUDECODEEXTRACT
CLAUDECODEEXTRACT_TIMEOUT: Timeout for Claude Code extraction in seconds 120 integer (min 10) fallback: CLAUDECODE_TIMEOUT
CLAUDECODEEXTRACT_PROMPT: Custom prompt for Claude Code extraction. Use this to define what Claude should extract or generate from the snapshot. "Read all the previously extracted outputs in this snapshot directory (readability/, mercury/, defuddle/, htmltotext/, dom/, singlefile/, etc.). Using the best available source, generate a clean, well-formatted Markdown representation of the page content. Save the output as content.md in your output directory." string —
CLAUDECODEEXTRACT_MODEL: Claude model to use for extraction (e.g. claude-sonnet-4-6, claude-opus-4-6, claude-haiku-4-5-20251001) "claude-sonnet-4-6" string fallback: CLAUDECODE_MODEL
CLAUDECODEEXTRACT_MAX_TURNS: Maximum number of agentic turns for extraction 50 integer (min 1) fallback: CLAUDECODE_MAX_TURNS

HTML to Text

htmltotext

Convert archived HTML from other extractors into plain text for indexing and analysis.

#58 Snapshot 1 hook(s) 2 configs
📄 text/plain

Dependencies & Outputs

Output Mimetypes
📄 text/plain

Run It

abx-dl --plugins=htmltotext 'https://example.com'
HTMLTOTEXT_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
HTMLTOTEXT_ENABLED: Enable HTML to text conversion true boolean SAVE_HTMLTOTEXT, USE_HTMLTOTEXT
HTMLTOTEXT_TIMEOUT: Timeout for HTML to text conversion in seconds 30 integer (min 5) fallback: TIMEOUT

Trafilatura

trafilatura

Extract article content from archived HTML into text, markdown, HTML, CSV, JSON, and XML formats.

#59 Snapshot 1 hook(s) 4 configs
trafilatura (env,pip)
📄 text/plain 📝 text/markdown 🌐 text/html 📊 text/csv +3 more

Dependencies & Outputs

Required Binaries
trafilatura providers=env,pip
Output Mimetypes
📄 text/plain 📝 text/markdown 🌐 text/html 📊 text/csv 📦 application/json 📋 application/xml 📋 application/tei+xml

Run It

abx-dl --plugins=trafilatura 'https://example.com'
TRAFILATURA_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
TRAFILATURA_ENABLED: Enable Trafilatura extraction true boolean SAVE_TRAFILATURA, USE_TRAFILATURA
TRAFILATURA_BINARY: Path to trafilatura binary "trafilatura" string —
TRAFILATURA_TIMEOUT: Timeout for Trafilatura in seconds 30 integer (min 5) fallback: TIMEOUT
TRAFILATURA_OUTPUT_FORMATS: Comma-separated trafilatura output formats to write (txt, markdown, html, csv, json, xml, xmltei) "txt,markdown,html" string —
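Since TRAFILATURA_OUTPUT_FORMATS is a plain comma-separated list, it can be validated against the formats named in its description. A small sketch; parse_output_formats is a hypothetical helper, not the plugin's own code:

```python
# Formats named in the TRAFILATURA_OUTPUT_FORMATS description
VALID_FORMATS = {"txt", "markdown", "html", "csv", "json", "xml", "xmltei"}

def parse_output_formats(value: str) -> list[str]:
    """Split a comma-separated format list and reject unknown entries."""
    formats = [f.strip() for f in value.split(",") if f.strip()]
    unknown = set(formats) - VALID_FORMATS
    if unknown:
        raise ValueError(f"unknown trafilatura formats: {sorted(unknown)}")
    return formats

print(parse_output_formats("txt,markdown,html"))  # the plugin default
```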

OpenDataLoader

opendataloader

Extract structured text, tables, and metadata from PDFs using opendataloader-pdf. Supports OCR for scanned PDFs via hybrid backend.

#60 Snapshot 1 hook(s) 8 configs
opendataloader-pdf (env,pip) java>=11.0.0 (env,apt,brew)
📄 text/plain 📝 text/markdown 📦 application/json

Dependencies & Outputs

Required Binaries
opendataloader-pdf providers=env,pip java providers=env,apt,brew min_version=11.0.0
Output Mimetypes
📄 text/plain 📝 text/markdown 📦 application/json

Run It

abx-dl --plugins=opendataloader 'https://example.com'
OPENDATALOADER_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
OPENDATALOADER_ENABLED: Enable PDF text extraction with opendataloader-pdf true boolean SAVE_OPENDATALOADER, USE_OPENDATALOADER
OPENDATALOADER_BINARY: Path to opendataloader-pdf binary "opendataloader-pdf" string —
OPENDATALOADER_JAVA_BINARY: Path to the Java runtime used by opendataloader-pdf "java" string fallback: JAVA_BINARY
OPENDATALOADER_TIMEOUT: Timeout for PDF extraction in seconds 120 integer (min 10) fallback: TIMEOUT
OPENDATALOADER_FORCE_OCR: Use hybrid OCR backend (--hybrid docling-fast) for scanned/image-based PDFs. Requires opendataloader-pdf-hybrid server running. false boolean —
OPENDATALOADER_HYBRID_URL: URL of the opendataloader-pdf-hybrid server (e.g. http://localhost:5002). If empty, uses the default built-in URL. "" string —
OPENDATALOADER_ARGS: Default opendataloader-pdf arguments [] array OPENDATALOADER_DEFAULT_ARGS
OPENDATALOADER_ARGS_EXTRA: Extra arguments to append to opendataloader-pdf command [] array OPENDATALOADER_EXTRA_ARGS

LiteParse

liteparse

Extract text and metadata from PDFs and documents using LiteParse (by LlamaIndex). Supports OCR via Tesseract.js.

#61 Snapshot 1 hook(s) 5 configs
lit (env,npm)
📄 text/plain 📦 application/json

Dependencies & Outputs

Required Binaries
lit providers=env,npm
Output Mimetypes
📄 text/plain 📦 application/json

Run It

abx-dl --plugins=liteparse 'https://example.com'
LITEPARSE_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
LITEPARSE_ENABLED: Enable LiteParse document extraction true boolean SAVE_LITEPARSE, USE_LITEPARSE
LITEPARSE_BINARY: Path to lit binary "lit" string —
LITEPARSE_TIMEOUT: Timeout for LiteParse extraction in seconds 120 integer (min 10) fallback: TIMEOUT
LITEPARSE_ARGS: Default LiteParse arguments [] array LITEPARSE_DEFAULT_ARGS
LITEPARSE_ARGS_EXTRA: Extra arguments to append to LiteParse command [] array LITEPARSE_EXTRA_ARGS

papers-dl

papersdl

Fetch downloadable academic papers from paper URLs and DOI targets.

#66 Snapshot Embed Fullscreen 1 hook(s) 5 configs
papers-dl (env,pip)
📕 application/pdf

Dependencies & Outputs

Required Binaries
papers-dl providers=env,pip
Output Mimetypes
📕 application/pdf

Run It

abx-dl --plugins=papersdl 'https://example.com'
PAPERSDL_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
PAPERSDL_ENABLED: Enable paper downloading with papers-dl true boolean SAVE_PAPERSDL, USE_PAPERSDL
PAPERSDL_BINARY: Path to papers-dl binary "papers-dl" string —
PAPERSDL_TIMEOUT: Timeout for paper downloads in seconds 300 integer (min 30) fallback: TIMEOUT
PAPERSDL_ARGS: Default papers-dl arguments ["fetch"] array PAPERSDL_DEFAULT_ARGS
PAPERSDL_ARGS_EXTRA: Extra arguments to append to papers-dl command [] array PAPERSDL_EXTRA_ARGS

Parse HTML URLs

parse_html_urls

Parse HTML documents and emit discovered links as JSONL snapshot records.

#70 Snapshot 1 hook(s) 1 configs
📦 application/x-ndjson

Dependencies & Outputs

Output Mimetypes
📦 application/x-ndjson

Run It

abx-dl --plugins=parse_html_urls 'https://example.com'
PARSE_HTML_URLS_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
PARSE_HTML_URLS_ENABLED: Enable HTML URL parsing true boolean USE_PARSE_HTML_URLS
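The "emit discovered links as JSONL records" step that this family of parse plugins describes can be sketched in Python. The record schema used here ({"type": "Snapshot", "url": ...}) is a guess for illustration, not ArchiveBox's actual record format:

```python
import json
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collect href targets from <a> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.urls.append(value)

def emit_jsonl(html: str) -> list[str]:
    """Return one JSON line per discovered URL (schema is illustrative)."""
    parser = LinkParser()
    parser.feed(html)
    return [json.dumps({"type": "Snapshot", "url": u}) for u in parser.urls]

for line in emit_jsonl('<a href="https://example.com/a">a</a>'):
    print(line)
```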

Parse Text URLs

parse_txt_urls

Parse plain text documents and emit discovered URLs as JSONL snapshot records.

#71 Snapshot 1 hook(s) 1 configs
📦 application/x-ndjson

Dependencies & Outputs

Output Mimetypes
📦 application/x-ndjson

Run It

abx-dl --plugins=parse_txt_urls 'https://example.com'
PARSE_TXT_URLS_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
PARSE_TXT_URLS_ENABLED: Enable plain text URL parsing true boolean USE_PARSE_TXT_URLS

Parse RSS URLs

parse_rss_urls

Parse RSS and Atom feeds and emit discovered entry URLs as JSONL snapshot records.

#72 Snapshot 1 hook(s) 1 configs
📦 application/x-ndjson

Dependencies & Outputs

Output Mimetypes
📦 application/x-ndjson

Run It

abx-dl --plugins=parse_rss_urls 'https://example.com'
PARSE_RSS_URLS_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
PARSE_RSS_URLS_ENABLED: Enable RSS/Atom feed URL parsing true boolean USE_PARSE_RSS_URLS

Parse Netscape URLs

parse_netscape_urls

Parse Netscape bookmark HTML exports and emit discovered URLs as JSONL snapshot records.

#73 Snapshot 1 hook(s) 1 configs
📦 application/x-ndjson

Dependencies & Outputs

Output Mimetypes
📦 application/x-ndjson

Run It

abx-dl --plugins=parse_netscape_urls 'https://example.com'
PARSE_NETSCAPE_URLS_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
PARSE_NETSCAPE_URLS_ENABLED: Enable Netscape bookmarks HTML URL parsing true boolean USE_PARSE_NETSCAPE_URLS

Parse JSONL URLs

parse_jsonl_urls

Parse JSONL bookmark exports and emit discovered URLs as JSONL snapshot records.

#74 Snapshot 1 hook(s) 1 configs
📦 application/x-ndjson

Dependencies & Outputs

Output Mimetypes
📦 application/x-ndjson

Run It

abx-dl --plugins=parse_jsonl_urls 'https://example.com'
PARSE_JSONL_URLS_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
PARSE_JSONL_URLS_ENABLED: Enable JSON Lines URL parsing true boolean USE_PARSE_JSONL_URLS

SQLite Search

search_backend_sqlite

Index archived snapshot content into a SQLite FTS database for local search.

#90 Snapshot 1 hook(s) 5 configs
🗃 application/vnd.sqlite3

Dependencies & Outputs

Output Mimetypes
🗃 application/vnd.sqlite3

Run It

abx-dl --plugins=search_backend_sqlite 'https://example.com'
archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
SEARCH_BACKEND_ENGINE: Selected search backend implementation "sqlite" string —
USE_INDEXING_BACKEND: Enable search indexing for archived snapshots true boolean —
SEARCH_BACKEND_SQLITE_DB: SQLite FTS database filename "search.sqlite3" string SQLITEFTS_DB
SEARCH_BACKEND_SQLITE_SEPARATE_DATABASE: Use separate database file for FTS index true boolean FTS_SEPARATE_DATABASE, SQLITEFTS_SEPARATE_DATABASE
SEARCH_BACKEND_SQLITE_TOKENIZERS: FTS5 tokenizer configuration "porter unicode61 remove_diacritics 2" string FTS_TOKENIZERS, SQLITEFTS_TOKENIZERS
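The default tokenizer string above is a standard SQLite FTS5 tokenizer spec: Porter stemming layered over the unicode61 tokenizer with diacritics removed. A minimal sketch of what that configuration does; the table and column names are illustrative, not ArchiveBox's actual schema:

```python
import sqlite3

# In-memory FTS5 index using the plugin's default tokenizer string.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE VIRTUAL TABLE snapshot_fts USING fts5("
    "content, tokenize='porter unicode61 remove_diacritics 2')"
)
con.execute("INSERT INTO snapshot_fts VALUES ('searching archived pages locally')")

# Porter stemming lets the query 'searched' match the indexed word 'searching'
rows = con.execute(
    "SELECT content FROM snapshot_fts WHERE snapshot_fts MATCH 'searched'"
).fetchall()
print(rows)
```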

Sonic Search

search_backend_sonic

Index archived snapshot content into a Sonic search backend.

#91 Snapshot 2 hook(s) 9 configs
sonic (env,apt,brew,cargo)

Dependencies & Outputs

Required Binaries
sonic providers=env,apt,brew,cargo

Run It

abx-dl --plugins=search_backend_sonic 'https://example.com'
archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
SEARCH_BACKEND_ENGINE: Selected search backend implementation "sonic" string —
USE_INDEXING_BACKEND: Enable search indexing for archived snapshots true boolean —
SONIC_BINARY: Path to Sonic server binary "sonic" string —
SONIC_DIR: Directory used to store the Sonic config, logs, and index data "" string —
SEARCH_BACKEND_SONIC_HOST_NAME: Sonic server hostname "127.0.0.1" string SEARCH_BACKEND_HOST_NAME, SONIC_HOST
SEARCH_BACKEND_SONIC_PORT: Sonic server port 1491 integer (min 1) SEARCH_BACKEND_PORT, SONIC_PORT
SEARCH_BACKEND_SONIC_PASSWORD: Sonic server password "SecretPassword" string SEARCH_BACKEND_PASSWORD, SONIC_PASSWORD
SEARCH_BACKEND_SONIC_COLLECTION: Sonic collection name "archivebox" string SONIC_COLLECTION
SEARCH_BACKEND_SONIC_BUCKET: Sonic bucket name "snapshots" string SONIC_BUCKET

Claude Code Cleanup

claudecodecleanup

Use Claude Code to deduplicate and clean up redundant snapshot extractor outputs.

#92 Snapshot Embed Fullscreen 1 hook(s) 6 configs
claude (env,npm)
📄 text/plain

Dependencies & Outputs

Required Binaries
claude providers=env,npm
Output Mimetypes
📄 text/plain

Run It

abx-dl --plugins=claudecodecleanup 'https://example.com'
CLAUDECODECLEANUP_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
CLAUDECODE_BINARY: Path to Claude Code CLI binary "claude" string —
CLAUDECODECLEANUP_ENABLED: Enable Claude Code AI cleanup of snapshot files false boolean USE_CLAUDECODECLEANUP
CLAUDECODECLEANUP_TIMEOUT: Timeout for Claude Code cleanup in seconds 180 integer (min 10) fallback: CLAUDECODE_TIMEOUT
CLAUDECODECLEANUP_PROMPT: Custom prompt for Claude Code cleanup. Defines what Claude should clean up and how to determine which duplicates to keep. "Analyze all the extractor output directories in this snapshot. Look for duplicate or redundant outputs across plugins (e.g. multiple HTML extractions, multiple text extractions, multiple URL extraction outputs, etc.). For each group of similar outputs, inspect the content and determine which version is the best quality. Delete the inferior/redundant versions, keeping only the best one. Also remove any unnecessary temporary files, empty directories, or incomplete outputs. Write a summary of what you cleaned up to cleanup_report.txt in your output directory." string —
CLAUDECODECLEANUP_MODEL: Claude model to use for cleanup (e.g. claude-sonnet-4-6, claude-opus-4-6, claude-haiku-4-5-20251001) "claude-sonnet-4-6" string fallback: CLAUDECODE_MODEL
CLAUDECODECLEANUP_MAX_TURNS: Maximum number of agentic turns for cleanup 50 integer (min 1) fallback: CLAUDECODE_MAX_TURNS

Hashes

hashes

Generate a hash manifest for files produced in the snapshot directory.

#93 Snapshot 1 hook(s) 2 configs
📦 application/json

Dependencies & Outputs

Output Mimetypes
📦 application/json

Run It

abx-dl --plugins=hashes 'https://example.com'
HASHES_ENABLED=true archivebox add 'https://example.com'

Runtime plugins execute while archiving a URL.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
HASHES_ENABLED: Enable Merkle tree hash generation true boolean SAVE_HASHES, USE_HASHES
HASHES_TIMEOUT: Timeout for Merkle tree generation in seconds 30 integer (min 5) fallback: TIMEOUT
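The core of a hash manifest, per-file digests over a snapshot directory, can be sketched in a few lines. The JSON layout and the sha256 prefix are illustrative assumptions; the plugin's actual manifest format (including its Merkle tree structure) is not shown here:

```python
import hashlib
import json
from pathlib import Path

def hash_manifest(directory: str) -> str:
    """Return a JSON manifest mapping relative paths to sha256 digests.
    Layout is illustrative, not the plugin's actual output format."""
    manifest = {}
    for path in sorted(Path(directory).rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest[str(path.relative_to(directory))] = f"sha256:{digest}"
    return json.dumps(manifest, indent=2)
```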

APT

apt

Install binaries through the Debian and Ubuntu APT package manager.

1 hook(s) 0 configs

Run It

abx-dl plugins --install apt
archivebox init --setup

Setup plugins install dependencies or prepare shared runtime state.

Hook Scripts

Env Var Config Options

This plugin does not define a config.json schema.

Base

base

Provide shared utilities, helpers, and test support used by other plugins.

0 hook(s) 19 configs

Run It

abx-dl plugins base
archivebox add 'https://example.com'

Utility plugins are typically consumed indirectly, so the example shows the closest inspection workflow.

Hook Scripts

No hook scripts are defined in this plugin directory.

Env Var Config Options

Key Default Type Aliases / Fallback
DATA_DIR: Base data directory for the current runtime "" string —
ABX_RUNTIME: Current host runtime name, e.g. abx-dl or archivebox "abx-dl" string —
SNAP_DIR: Base snapshot directory for per-snapshot hook output "" string —
CRAWL_DIR: Base crawl directory for per-crawl hook output "" string —
LIB_DIR: Shared tools and binary installation root "" string —
PERSONAS_DIR: Shared Chrome/browser personas root "" string —
ACTIVE_PERSONA: Active browser persona name "Default" string —
EXTRA_CONTEXT: JSON object merged into emitted JSONL event records "" string —
TIMEOUT: Default timeout in seconds for hooks that support a TIMEOUT fallback 60 integer (min 0) —
USER_AGENT: Default user agent string for HTTP requests and browser automation "Mozilla/5.0 (compatible; ArchiveBox/1.0)" string —
PATH: Executable search path "" string —
NODE_MODULES_DIR: Shared Node.js module resolution root "" string —
NODE_MODULE_DIR: Legacy alias for NODE_MODULES_DIR "" string —
NODE_PATH: Node.js module lookup path "" string —
NODE_V8_COVERAGE: Optional V8 coverage output directory for Node.js hooks "" string —
CHROME_BINARY: Resolved Chromium/Chrome binary path shared across plugins "" string —
CHROME_USER_DATA_DIR: Chrome user data directory for persistent browser state "" string —
CHROME_DOWNLOADS_DIR: Chrome downloads directory shared by browser plugins "" string —
CHROME_EXTENSIONS_DIR: Chrome extensions directory shared by browser plugins "" string —

Bash

bash

Install binaries using arbitrary bash shell commands.

1 hook(s) 0 configs

Run It

abx-dl plugins --install bash
archivebox init --setup

Setup plugins install dependencies or prepare shared runtime state.

Hook Scripts

Env Var Config Options

This plugin does not define a config.json schema.

Homebrew

brew

Install binaries through the Homebrew package manager.

1 hook(s) 0 configs

Run It

abx-dl plugins --install brew
archivebox init --setup

Setup plugins install dependencies or prepare shared runtime state.

Hook Scripts

Env Var Config Options

This plugin does not define a config.json schema.

Cargo

cargo

Install binaries through Rust's Cargo package manager.

1 hook(s) 0 configs

Run It

abx-dl plugins --install cargo
archivebox init --setup

Setup plugins install dependencies or prepare shared runtime state.

Hook Scripts

Env Var Config Options

This plugin does not define a config.json schema.

Chrome Web Store Provider

chromewebstore

Resolve Chrome Web Store extensions as installable binary-like artifacts.

1 hook(s) 4 configs
node (env,apt,brew)

Dependencies & Outputs

Required Binaries
node providers=env,apt,brew

Run It

abx-dl plugins --install chromewebstore
archivebox init --setup

Setup plugins install dependencies or prepare shared runtime state.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
NODE_BINARY: Path to Node.js binary "node" string —
PERSONAS_DIR: Shared Chrome/browser personas root "" string —
ACTIVE_PERSONA: Active browser persona name "Default" string —
CHROME_EXTENSIONS_DIR: Path to installed Chrome extensions directory "" string —

Claude Code

claudecode

Run Claude Code AI agent on snapshots to extract, analyze, or transform archived content.

Embed Fullscreen 0 hook(s) 7 configs
node (env,apt,brew) claude (env,npm)
📦 application/json

Dependencies & Outputs

Required Binaries
node providers=env,apt,brew claude providers=env,npm
Output Mimetypes
📦 application/json

Run It

abx-dl plugins claudecode
archivebox add 'https://example.com'

Utility plugins are typically consumed indirectly, so the example shows the closest inspection workflow.

Hook Scripts

No hook scripts are defined in this plugin directory.

Env Var Config Options

Key Default Type Aliases / Fallback
NODE_BINARY: Path to Node.js binary "node" string —
CLAUDECODE_ENABLED: Enable Claude Code AI agent integration. Controls whether the Claude CLI dependency is resolved for this plugin; child plugins still need the claudecode plugin enabled and a working Claude binary. false boolean USE_CLAUDECODE
CLAUDECODE_BINARY: Path to Claude Code CLI binary "claude" string —
CLAUDECODE_TIMEOUT: Timeout for Claude Code operations in seconds 120 integer (min 10) fallback: TIMEOUT
ANTHROPIC_API_KEY: Anthropic API key for Claude Code authentication "" string —
CLAUDECODE_MODEL: Claude model to use (e.g. claude-sonnet-4-6, claude-opus-4-6, claude-haiku-4-5-20251001) "claude-sonnet-4-6" string —
CLAUDECODE_MAX_TURNS: Maximum number of agentic turns per invocation 50 integer (min 1) —

Environment

env

Discover binaries that are already available on the system PATH.

1 hook(s) 0 configs

Run It

abx-dl plugins --install env
archivebox init --setup

Setup plugins install dependencies or prepare shared runtime state.

Hook Scripts

Env Var Config Options

This plugin does not define a config.json schema.

Media

media

Provide a shared namespace for media-related plugin outputs and helpers.

0 hook(s) 0 configs

Run It

abx-dl plugins media
archivebox add 'https://example.com'

Utility plugins are typically consumed indirectly, so the example shows the closest inspection workflow.

Hook Scripts

No hook scripts are defined in this plugin directory.

Env Var Config Options

This plugin does not define a config.json schema.

npm

npm

Install binaries from npm packages and expose Node module paths.

1 hook(s) 2 configs
node (env,apt,brew)

Dependencies & Outputs

Required Binaries
node providers=env,apt,brew

Run It

abx-dl plugins --install npm
archivebox init --setup

Setup plugins install dependencies or prepare shared runtime state.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
NODE_BINARY: Path to Node.js binary "node" string —
NPM_BINARY: Path to npm binary "npm" string —

pip

pip

Install Python-based binaries into a managed virtual environment.

1 hook(s) 1 configs
python (env)

Dependencies & Outputs

Required Binaries
python providers=env

Run It

abx-dl plugins --install pip
archivebox init --setup

Setup plugins install dependencies or prepare shared runtime state.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
PIP_VENV_PYTHON: Preferred Python interpreter for creating the shared pip virtualenv "" string —

Puppeteer

puppeteer

Install and manage Chromium through the Puppeteer toolchain.

1 hook(s) 3 configs
puppeteer (npm)

Dependencies & Outputs

Required Binaries
puppeteer providers=npm

Run It

abx-dl plugins --install puppeteer
PUPPETEER_ENABLED=true archivebox init --setup

Setup plugins install dependencies or prepare shared runtime state.

Hook Scripts

Env Var Config Options

Key Default Type Aliases / Fallback
CHROME_BINARY: Path to Chromium binary "chromium" string CHROMIUM_BINARY, GOOGLE_CHROME_BINARY
PUPPETEER_ENABLED: Enable Puppeteer dependency installation during crawl setup true boolean —
PUPPETEER_CACHE_DIR: Override the Puppeteer browser cache directory "" string —

ripgrep Search

search_backend_ripgrep

Search archived snapshot files directly with ripgrep instead of maintaining an index.

0 hook(s) 5 configs
rg (env,apt,brew)

Dependencies & Outputs

Required Binaries
rg providers=env,apt,brew

Run It

abx-dl plugins search_backend_ripgrep
archivebox add 'https://example.com'

Utility plugins are typically consumed indirectly, so the example shows the closest inspection workflow.

Hook Scripts

No hook scripts are defined in this plugin directory.

Env Var Config Options

Key Default Type Aliases / Fallback
SEARCH_BACKEND_ENGINE: Selected search backend implementation "ripgrep" string —
RIPGREP_BINARY: Path to ripgrep binary "rg" string —
RIPGREP_TIMEOUT: Search timeout in seconds 90 integer (min 5) SEARCH_BACKEND_TIMEOUT, fallback: TIMEOUT
RIPGREP_ARGS: Default ripgrep arguments ["--files-with-matches", "--no-messages", "--ignore-case"] array RIPGREP_DEFAULT_ARGS
RIPGREP_ARGS_EXTRA: Extra arguments to append to ripgrep command [] array RIPGREP_EXTRA_ARGS
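The ARGS / ARGS_EXTRA pattern used by this plugin (and most others above) composes a command line from defaults plus appended extras. A sketch of that composition; the exact ordering of query and paths is an assumption based on the config descriptions, not verified against the plugin source:

```python
# Plugin defaults from the table above, plus a hypothetical extra flag
RIPGREP_BINARY = "rg"
RIPGREP_ARGS = ["--files-with-matches", "--no-messages", "--ignore-case"]
RIPGREP_ARGS_EXTRA = ["--max-count", "1"]

def build_search_cmd(query: str, archive_dir: str) -> list[str]:
    """Compose the ripgrep argv: binary, default args, extras, then query and path."""
    return [RIPGREP_BINARY, *RIPGREP_ARGS, *RIPGREP_ARGS_EXTRA, query, archive_dir]

print(build_search_cmd("example", "./archive"))
```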

SSL

ssl

Utility plugin namespace reserved for SSL-related integration points and metadata.

0 hook(s) 0 configs

Run It

abx-dl plugins ssl
archivebox add 'https://example.com'

Utility plugins are typically consumed indirectly, so the example shows the closest inspection workflow.

Hook Scripts

No hook scripts are defined in this plugin directory.

Env Var Config Options

This plugin does not define a config.json schema.