dmarcreport is a cron-friendly reporting tool for people who want to keep an eye on DMARC and TLS delivery without building a mail-security platform from scratch.
It fetches DMARC aggregate reports, DMARC forensic reports, and TLS-RPT reports from IMAP mailboxes or local directories, stores them in SQLite or MySQL/MariaDB, builds a static dashboard, and can send scheduled reports and alerts.
The model is simple: run it, let it do the work, let it exit.
The default storage is SQLite. MySQL/MariaDB storage is available for larger datasets or when better performance is needed.
If you are new to the project, start with these:
- INSTALL.md for installation and first-run setup
- DATABASE.md for SQLite vs MySQL/MariaDB, migration, and switch-over
- DMARC-GUIDE.md for DMARC guidance
- TLSRPT-GUIDE.md for TLS-RPT and MTA-STS guidance
- CHANGELOG.md for public release history
```shell
# Configure
cp dmarcreport.yaml.example dmarcreport.yaml
# Edit dmarcreport.yaml with your settings

# Import reports from a local directory
dmarcreport fetch --source local

# Or from IMAP
dmarcreport fetch --source imap

# Generate the dashboard
dmarcreport generate

# Open public/index.html in your browser
```

For most installations, the workflow is:

```shell
dmarcreport fetch --source imap   # 1. Ingest new reports
dmarcreport generate --deploy     # 2. Rebuild and deploy site
dmarcreport notify                # 3. Send alerts/reports if due
```

That is also the intended cron or scheduler model.
If you want to check the DNS side before waiting for reports:
```shell
dmarcreport check dns example.com
```

That checks DMARC, TLS-RPT, and MTA-STS records, as well as the MTA-STS policy file.
It also points out a few practical issues, such as monitoring-only DMARC rollout, missing external report authorization records, and MTA-STS policies that look inconsistent.
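For reference, the records it checks are ordinary DNS TXT records. Generic examples in standard RFC 7489 / RFC 8460 / RFC 8461 syntax (domains and mailto targets are placeholders):

```
_dmarc.example.com.     IN TXT "v=DMARC1; p=none; rua=mailto:dmarc-reports@example.com"
_smtp._tls.example.com. IN TXT "v=TLSRPTv1; rua=mailto:tls-reports@example.com"
_mta-sts.example.com.   IN TXT "v=STSv1; id=20260101000000"
```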
| Command | Description | Key Options |
|---|---|---|
| `fetch` | Import reports and store in database | `--source local\|imap`, `--family dmarc\|tlsrpt\|all`, `--chunk N`, `--skip-prune`, `--dry-run` |
| `generate` | Build static dashboard from database | `--clean`, `--deploy` |
| `demo` | Build a synthetic local dashboard using a temporary SQLite database | `--output DIR`, `--reports N`, `--tlsrpt-reports N`, `--forensics N`, `--seed N` |
| `deploy` | Deploy the generated site | |
| `push` | Push application code to live server or a local staging path | `--dry-run`, `--local DIR` |
| `notify` | Evaluate alerts and send scheduled reports | `--force daily\|weekly` |
| `forensics` | List forensic (RUF) reports from the database | `--json`, `--limit N` |
| Command | Description | Key Options |
|---|---|---|
| `check config` | Validate configuration file | |
| `check dns` | Check DMARC, TLS-RPT, and MTA-STS for one or more domains | `DOMAIN ...` |
| `check deploy` | Test site deployment configuration | |
| `check push` | Test code push configuration | |
| `check imap` | Test IMAP connection and list folders | |
| `check smtp` | Send a test email | |
| `check notify` | Send a test notification email | `--type report\|alert`, `--to EMAIL` |
| `check db` | Show database driver, connectivity, retention, permissions, and statistics | |
| `db migrate` | Copy all data between supported database backends | `--to BACKEND`, `--target-config FILE` |
| `db optimize` | Run backend-specific maintenance on the active database | |
| `export` | Export database contents to JSON | `--format json`, `--output FILE` |
| `reset` | Wipe all data from the database | `--confirm` |
| `prune` | Delete reports older than N days | `--older-than N` |
These work directly on .eml files without using the database. They are useful for one-off analysis, debugging, and working with sample files.
| Command | Description | Key Options |
|---|---|---|
| `parse` | Parse individual .eml files | `--all`, `--json`, `--failures` |
| `summary` | Aggregate summary by domain | `--json`, `--report`, `--email` |
| `failures` | List all DMARC failures | `--json`, `--report`, `--email` |
| `passes` | List all DMARC passes | `--json` |
| `sources` | Aggregate by mail server | `--json`, `--anomalies`, `--report`, `--email` |
| `trends` | Daily pass/fail trends | `--json`, `--report`, `--email` |
| `anomalies` | Detect anomalous patterns | `--json`, `--report`, `--email` |
| `report` | Combined HTML report | `--report`, `--email` |
dmarcreport supports two source models:
- pull reports from IMAP
- ingest report files from a local directory
For DMARC and TLS-RPT, the cleanest setup is usually separate folders or separate mailboxes.
Place DMARC .eml files in the source directory (default: ./source/), then:
```shell
dmarcreport fetch --source local
```

Imported files are moved to `./source/Imported/`. Files that fail to parse are moved to `./source/Failed/`. Duplicate reports are detected and skipped.
TLS-RPT can use its own local source path and accepts .json, .json.gz, or .eml messages that carry TLS-RPT attachments:
```shell
dmarcreport fetch --source local --family tlsrpt
```

By default these files are read from `./source/tlsrpt/` and moved to `./source/tlsrpt/Imported/` or `./source/tlsrpt/Failed/`.
Configure IMAP in dmarcreport.yaml, then:
```shell
dmarcreport fetch --source imap
```

Messages are processed and then moved to Imported or Failed subfolders under the configured IMAP folder. If `imap.folder` is `INBOX/DMARC`, mail ends up in `INBOX/DMARC/Imported` or `INBOX/DMARC/Failed`. The subfolders are created automatically if needed.
TLS-RPT can use the same mailbox, a different folder, or completely separate credentials:
```shell
dmarcreport fetch --source imap --family tlsrpt
```

That uses `tlsrpt.imap.*` from the configuration file, so you can keep TLS-RPT in a separate mailbox if needed.
Phase-1 note: if you fetch both families with --family all, keep them in separate local directories or separate IMAP folders. Using the exact same directory or mailbox folder for both families is blocked to avoid misclassifying messages.
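One workable layout is to give each family its own folder under the same IMAP account. The sketch below follows the key patterns mentioned above (`imap.folder`, `tlsrpt.imap.*`); the exact connection key names are assumptions, so check dmarcreport.yaml.example for the authoritative layout:

```yaml
imap:
  host: imap.example.com
  username: reports@example.com
  password: secret
  folder: INBOX/DMARC      # DMARC aggregate and forensic mail

tlsrpt:
  imap:
    host: imap.example.com
    username: reports@example.com
    password: secret
    folder: INBOX/TLSRPT   # TLS-RPT mail kept separate
```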
If you want to see what would happen before anything is moved or imported:
```shell
dmarcreport fetch --source local --dry-run
dmarcreport fetch --source imap --dry-run
dmarcreport fetch --source imap --family all --dry-run
```

For very large first-run imports, use `--chunk N` and work through the source in smaller runs:

```shell
dmarcreport fetch --source imap --chunk 500
```

That is especially useful for large IMAP folders, where long sessions can look dead even when they are still working.
If you set database.retention_days, each real fetch run will also prune old data afterward. If you want to skip that for a particular run, use:
```shell
dmarcreport fetch --source imap --skip-prune
```

The generate command builds a static HTML site from the data you have already imported:

```shell
dmarcreport generate
```

The generated site includes:
- Dashboard with pass rate, message volume charts, domain summary, recent failures, and "last updated" timestamp
- Year and month pages with daily breakdowns and trend charts
- Per-domain pages with provider summary, source analysis, and failure history
- Failures page with collapsible month sections (most recent expanded)
- Forensics page with individual failure details from RUF reports
- TLS-RPT page with policy-domain summaries, failure result categories, and recent TLS delivery failures
- About page with project attribution and license
- Provider identification: source IPs resolved to provider names (Google, Microsoft, Yahoo, etc.) via reverse DNS suffix matching
- Domain names displayed in monospace font; domains sorted by failure count (most troubled first)
- Auto/light/dark mode with theme toggle
- Works from `file://` if you just want to open it locally
(Screenshots: Domain detail page and Failures page.)
To regenerate from scratch:
```shell
dmarcreport generate --clean
```

If you want to look at the UI without touching live data, use the demo command:

```shell
dmarcreport demo
```

This command always uses a temporary SQLite database, even if your real installation uses MariaDB or MySQL. It generates synthetic DMARC, forensic, and TLS-RPT data, builds a local static site, and exits without touching your live database.
By default the output goes to a sibling directory such as ./public-demo/. Use --output if you want a different local target:
```shell
dmarcreport demo --output ./public-demo
dmarcreport demo --reports 300 --tlsrpt-reports 220 --forensics 40
```

After generation, deploy the site to wherever it should live:

```shell
dmarcreport deploy
```

Or generate and deploy in one step:

```shell
dmarcreport generate --deploy
```

Supported deploy methods (set `site.deploy` in config):
| Method | Description |
|---|---|
| `local` | No-op (default): site stays in `./public/` |
| `rsync` | Sync to remote server via rsync |
| `git` | Push to a git branch (e.g., gh-pages) |
| `ftp` | Upload via FTP or FTPS (plain/TLS/SSL) |
If you develop locally and run a separate live installation, push syncs the application code without touching its configuration, database, or generated content.
```shell
# Preview what would be pushed
dmarcreport push --dry-run

# Push code to the live server
dmarcreport push

# Or copy to a local staging directory instead of a configured remote target
dmarcreport push --local ./staging
```

Configure the target in dmarcreport.yaml if you want remote pushes:

```yaml
push:
  method: rsync
  target: user@server:/opt/dmarcreport/
```

Or via FTP:
```yaml
push:
  method: ftp
  ftp_host: ftp.example.com
  ftp_username: user
  ftp_password: pass
  ftp_path: /opt/dmarcreport/
```

If you do not configure `push.method`, `dmarcreport push` will tell you to either configure a remote method or use `--local`. It no longer silently assumes rsync.
What gets pushed: dmarcreport/ package, pyproject.toml, README.md, CHANGELOG.md, DATABASE.md, INSTALL.md, DMARC-GUIDE.md, TLSRPT-GUIDE.md, TRANSLATE.md, LICENSE, dmarcreport.yaml.example
Python cache directories and compiled files such as __pycache__ and .pyc are filtered out automatically.
What stays untouched on LIVE: dmarcreport.yaml, data/, source/, public/, .venv/, and all other environment-specific files
push copies application files. It does not install the package on the target host, and it does not build or refresh the target virtual environment for you.
On a Linux host, a simple layout is:
```
/opt/dmarcreport/
  .venv/
  dmarcreport/
  pyproject.toml
  dmarcreport.yaml
  ...
```
The .venv can live in the same directory as the project, but it should be created on the Linux host itself. Do not copy a virtual environment from Windows to Linux.
```shell
cd /opt/dmarcreport
python3 -m venv .venv
. .venv/bin/activate
python -m pip install .
```

If that host uses MySQL or MariaDB support, install the extra there as well:

```shell
python -m pip install ".[mysql]"
```

If you push to a local staging directory and then copy that tree to the live Linux host, the usual follow-up is:
```shell
cd /opt/dmarcreport
. .venv/bin/activate
python -m pip install .
dmarcreport --about
```

That last command is just a quick sanity check that the installed launcher and the copied code agree.
If you prefer, you can also verify the source tree directly before reinstalling:
```shell
.venv/bin/python -m dmarcreport.cli --about
```

If that works but `dmarcreport --about` does not, the code is in the right place and the install step is what is missing.
After a push, the live server may need to reinstall the package, especially if dependencies changed:
```shell
# On the live server
cd /opt/dmarcreport && . .venv/bin/activate && python -m pip install .
```

The notify command evaluates alert rules and sends scheduled reports via email:

```shell
dmarcreport notify
```

Configure in dmarcreport.yaml:
```yaml
reports:
  daily:
    enabled: true
    time: "08:00"
    recipients: [[email protected]]
  weekly:
    enabled: true
    day: monday
    time: "08:00"
    recipients: [[email protected]]

alerts:
  - name: high-failure-rate
    condition: "failure_rate > 5%"
    period: 24h
    recipients: [[email protected]]
  - name: spike-detection
    condition: "failures > 50"
    period: 24h
    recipients: [[email protected]]
  - name: tlsrpt-failure-rate
    condition: "tlsrpt_failure_rate > 5%"
    period: 24h
    recipients: [[email protected]]
```

Force send a report regardless of schedule:

```shell
dmarcreport notify --force daily
```

Supported alert conditions:
- `failure_rate > N%` – triggers when failure rate exceeds threshold
- `failures > N` – triggers when failure count exceeds threshold
- `tlsrpt_failure_rate > N%` – triggers when the TLS-RPT failed-session rate exceeds threshold
- `tlsrpt_failed_sessions > N` – triggers when TLS-RPT failed-session count exceeds threshold
Periods: `24h`, `7d`, `1w`, `1m`
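The longer periods combine with any condition. For example, a lower-urgency weekly check on TLS-RPT session failures uses the same schema as the alerts above (the name and threshold here are illustrative):

```yaml
alerts:
  - name: weekly-tls-failures
    condition: "tlsrpt_failed_sessions > 100"
    period: 7d
    recipients: [postmaster@example.com]
```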
Test your SMTP configuration and preview notification emails:
```shell
# Send a test report email (uses live data or sample data if DB is empty)
dmarcreport check notify

# Send a test alert email
dmarcreport check notify --type alert

# Send to a specific address instead of the configured notification recipients
dmarcreport check notify --to [email protected]
```

By default, check notify sends to the configured recipients for the notification type being tested:

- `--type report` uses the union of configured report recipients
- `--type alert` uses the union of configured alert recipients
- If none are configured yet, it falls back to `smtp.from_address`
Test emails are clearly marked as [TEST] in the subject line and are not logged to the notification history.
Create dmarcreport.yaml from the example:
```shell
cp dmarcreport.yaml.example dmarcreport.yaml
```

All settings support environment variable overrides. See dmarcreport.yaml.example for the full reference with all options documented.
Priority order: Environment variables > `./dmarcreport.yaml` > `~/dmarcreport.yaml` > defaults
| Option | Description |
|---|---|
| `--config` / `-c` | Path to configuration file |
| `--about` | Show version and credits |
Use --config / -c to specify a different configuration file:
```shell
dmarcreport --config production.yaml fetch --source imap
dmarcreport -c staging.yaml check config
```

This lets you maintain separate configurations (e.g., DEV, staging, production) in the same directory and invoke them independently.
Key environment variables:
| Variable | Description |
|---|---|
| `DB_DRIVER` | Database backend: `sqlite`, `mysql`, `mariadb` |
| `DB_PATH` | Database file path |
| `DB_RETENTION_DAYS` | Automatic retention period in days |
| `DB_HOST`, `DB_PORT`, `DB_USER`, `DB_PASSWORD`, `DB_NAME` | MySQL/MariaDB connection |
| `DB_CHARSET`, `DB_COLLATION` | MySQL/MariaDB charset and collation |
| `SOURCE_PATH` | Local .eml source directory |
| `IMAP_HOST`, `IMAP_USER`, `IMAP_PASSWORD` | IMAP connection |
| `TLSRPT_SOURCE_PATH` | Local TLS-RPT source directory |
| `TLSRPT_IMAP_HOST`, `TLSRPT_IMAP_USER`, `TLSRPT_IMAP_PASSWORD` | TLS-RPT IMAP connection |
| `SMTP_HOST`, `SMTP_FROM` | SMTP for notifications |
| `SITE_OUTPUT` | Static site output directory |
| `SITE_DEPLOY` | Deploy method |
| `FTP_SECURITY` | FTP security mode: `plain`, `tls`, `ssl` |
| `LOG_PATH` | Activity log file path |
| `NUMBERS_LOCALE` | Number format: `en`, `sv`, `de`, `none` |
| `LANGUAGE` | Dashboard language: `en`, `sv` |
The generated static dashboard can be displayed in different languages:
```yaml
language: sv  # Swedish
```

Available languages: `en` (English, default), `sv` (Swedish). See TRANSLATE.md for instructions on adding more translations.
Organise domains on the dashboard into named groups:
```yaml
site:
  domain_groups:
    Main Domains:
      - example.com
      - example.org
    Client Domains:
      - client1.com
      - client2.com
```

Domains not listed in any group appear under "Other Domains". When no groups are configured, all domains are shown in a single table sorted by failure count.
Control thousand separators and decimal points in statistics across the dashboard, email reports, and CLI output:
```yaml
# en (default): 1,234.5 | sv: 1 234,5 | de: 1.234,5 | none: 1234.5
numbers_locale: en
```

Validate your configuration:

```shell
dmarcreport check config
```

For smaller installs, keep the default SQLite configuration:
```yaml
database:
  driver: sqlite
  path: ./data/dmarcreport.db
  retention_days: 365
```

For larger installs, use MySQL or MariaDB:
```yaml
database:
  driver: mariadb
  host: 127.0.0.1
  port: 3306
  username: dmarcreport
  password: secret
  name: dmarcreport
  charset: utf8mb4
  collation: utf8mb4_swedish_ci
```

MySQL/MariaDB support requires the optional extra:

```shell
pip install ".[mysql]"
```

See INSTALL.md for example SQL to create the database, application user, and grants for MySQL/MariaDB.
If you prefer scheduled cleanup instead of manual pruning, add a retention policy:
```yaml
database:
  driver: sqlite
  path: ./data/dmarcreport.db
  retention_days: 365
```

That tells fetch to prune data older than one year after each normal run. Use `--skip-prune` if you need to bypass it once.
To migrate between SQLite and MySQL/MariaDB, keep one config file for the source database and one for the target database:
```shell
# SQLite -> MariaDB
dmarcreport -c sqlite.yaml db migrate --to mariadb --target-config mariadb.yaml

# MariaDB -> SQLite
dmarcreport -c mariadb.yaml db migrate --to sqlite --target-config sqlite.yaml
```

Rules:
- The target database must already exist and be empty.
- Same-backend migrations are blocked.
- SQLite <-> MySQL/MariaDB is supported.
- MySQL <-> MariaDB direct migration is not supported yet.
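In practice, the two config files can be identical apart from their database sections. A sketch (credentials are placeholders, and each file also carries the rest of your settings):

```yaml
# sqlite.yaml
database:
  driver: sqlite
  path: ./data/dmarcreport.db
```

```yaml
# mariadb.yaml
database:
  driver: mariadb
  host: 127.0.0.1
  port: 3306
  username: dmarcreport
  password: secret
  name: dmarcreport
```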
Use the full path to the virtualenv entry point – no `cd` or `source activate` needed:
```
# Fetch new reports every 30 minutes
*/30 * * * * /opt/dmarcreport/.venv/bin/dmarcreport --config /opt/dmarcreport/dmarcreport.yaml fetch --source imap > /dev/null

# Regenerate and deploy site 5 minutes after each fetch
5,35 * * * * /opt/dmarcreport/.venv/bin/dmarcreport --config /opt/dmarcreport/dmarcreport.yaml generate --deploy > /dev/null

# Evaluate alerts and send scheduled reports once a day
0 7 * * * /opt/dmarcreport/.venv/bin/dmarcreport --config /opt/dmarcreport/dmarcreport.yaml notify > /dev/null
```

Path resolution: when `--config` is given an absolute path, all relative paths in the YAML file (e.g. `./data/dmarcreport.db`, `./public`) are automatically resolved relative to the directory containing that config file, not the working directory. This means the cron entries above work correctly with relative paths in `dmarcreport.yaml`.

If you run without `--config` (config found by searching the working directory) or set paths via environment variables, relative paths resolve against the working directory, which is unpredictable from cron. In that case use absolute paths, or run `dmarcreport check config` to get a warning if any paths are still relative.

Redirect stdout to `/dev/null` for silent operation; cron will still mail you on errors (stderr).
All operations (fetch, generate, deploy, push, notify) are logged. Two targets are supported:
File (default) – append-only log file, rotation left to logrotate or similar:

```yaml
log:
  target: file
  path: /opt/dmarcreport/data/activity.log
```

Syslog – writes to the system logger, useful for log-tailers, SIEMs, and journalctl:
```yaml
log:
  target: syslog
  syslog_facility: local0   # local0–local7, daemon, user
  syslog_ident: dmarcreport # program name shown in syslog
```

On Linux, syslog connects to `/dev/log`; on macOS to `/var/run/syslog`; otherwise it falls back to UDP localhost:514. Entries look like:

```
Mar  4 12:00:00 hostname dmarcreport: [FETCH] Imported 3 reports, 0 duplicates
```
Each config file can have its own log target, so separate environments write to separate destinations.
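For the file target, rotation is left to external tooling. A minimal logrotate policy might look like this (assumes the default log path shown above; adjust the schedule and retention to taste):

```
/opt/dmarcreport/data/activity.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
}
```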
FTP connections support three security modes (set security in config):
| Mode | Description | Default Port |
|---|---|---|
| `plain` | Unencrypted FTP (default) | 21 |
| `tls` | Explicit FTPS – upgrades via AUTH TLS | 21 |
| `ssl` | Implicit FTPS – SSL from connection start | 990 |
This applies to both site deployment (site.ftp.security) and code push (push.ftp_security).
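As a sketch, the two settings sit in different parts of the config (the other required FTP keys are omitted here):

```yaml
site:
  deploy: ftp
  ftp:
    security: tls    # explicit FTPS via AUTH TLS

push:
  method: ftp
  ftp_security: ssl  # implicit FTPS on port 990
```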
In addition to aggregate reports (RUA), dmarcreport can parse forensic failure reports (RUF/AFRF, RFC 6591). These contain per-message failure details including original headers and authentication results.
Note: Most large providers (Google, Microsoft, Yahoo) do not send RUF reports due to privacy concerns. Some smaller providers and corporate mail servers do. Configure ruf in your DMARC record to receive them when available.
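To actually receive RUF reports, the DMARC record needs a `ruf=` tag, optionally with `fo=1` to request a report on any failure. A generic illustration in standard RFC 7489 syntax (see DMARC-GUIDE.md for the project's recommended record; the domain and addresses are placeholders):

```
_dmarc.example.com. IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc@example.com; ruf=mailto:dmarc@example.com; fo=1"
```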
```shell
# List forensic reports from the database
dmarcreport forensics

# JSON output with custom limit
dmarcreport forensics --json --limit 100
```

The static dashboard includes a Forensics page that displays all imported forensic reports. See DMARC-GUIDE.md for recommended DMARC record configuration to enable RUF reporting.
```shell
# Show database stats
dmarcreport check db

# Run backend-specific maintenance
dmarcreport db optimize

# Preview what prune would remove
dmarcreport prune --older-than 365 --dry-run

# Delete reports older than 1 year
dmarcreport prune --older-than 365

# Wipe everything and start fresh
dmarcreport reset --confirm
```

`check db` now also shows the active retention policy and, for SQLite, the connection settings being used for journal mode, synchronous mode, temp storage, and cache size.
`prune --dry-run` is there when you want to see the impact before deleting anything.
Written by Joaquim Homrighausen, sponsored by WebbPlatsen i Sverige AB, Sweden – bringing you GDPR-safe Internet Solutions since 1998.
Dashboard built with Bulma and Chart.js, both MIT licensed.
For commercial support and SaaS hosting of dmarcreport, please contact [email protected].
Copyright 2026 Joaquim Homrighausen. All rights reserved.
AGPLv3 – see LICENSE.
All runtime dependencies (Typer, Pydantic, Rich, PyYAML, optional PyMySQL) are MIT/BSD licensed. Bundled frontend assets (Bulma CSS, Chart.js) are MIT licensed.


