dmarcreport

dmarcreport is a cron-friendly reporting tool for people who want to keep an eye on DMARC and TLS delivery without building a mail-security platform from scratch.

It fetches DMARC aggregate reports, DMARC forensic reports, and TLS-RPT reports from IMAP mailboxes or local directories, stores them in SQLite or MySQL/MariaDB, builds a static dashboard, and can send scheduled reports and alerts.

The model is simple: run it, let it do the work, let it exit.

The default storage is SQLite. MySQL/MariaDB storage is available for larger datasets or when better performance is needed.

[Screenshot: Dashboard]

Quick start

If you are new to the project, start with these:

# Configure
cp dmarcreport.yaml.example dmarcreport.yaml
# Edit dmarcreport.yaml with your settings

# Import reports from a local directory
dmarcreport fetch --source local

# Or from IMAP
dmarcreport fetch --source imap

# Generate the dashboard
dmarcreport generate

# Open public/index.html in your browser

Normal workflow

For most installations, the workflow is:

dmarcreport fetch --source imap    # 1. Ingest new reports
dmarcreport generate --deploy      # 2. Rebuild and deploy site
dmarcreport notify                 # 3. Send alerts/reports if due

That is also the intended cron or scheduler model.

If you want to check the DNS side before waiting for reports:

dmarcreport check dns example.com

That checks DMARC, TLS-RPT, and MTA-STS records, as well as the MTA-STS policy file.

It also points out a few practical issues, such as a monitoring-only DMARC rollout, missing external report authorization records, and MTA-STS policies that look inconsistent.

Commands

Pipeline commands

  • fetch: Import reports and store them in the database. Options: --source local|imap, --family dmarc|tlsrpt|all, --chunk N, --skip-prune, --dry-run
  • generate: Build the static dashboard from the database. Options: --clean, --deploy
  • demo: Build a synthetic local dashboard using a temporary SQLite database. Options: --output DIR, --reports N, --tlsrpt-reports N, --forensics N, --seed N
  • deploy: Deploy the generated site.
  • push: Push application code to the live server or a local staging path. Options: --dry-run, --local DIR
  • notify: Evaluate alerts and send scheduled reports. Options: --force daily|weekly
  • forensics: List forensic (RUF) reports from the database. Options: --json, --limit N

Maintenance commands

  • check config: Validate the configuration file.
  • check dns: Check DMARC, TLS-RPT, and MTA-STS for one or more domains. Arguments: DOMAIN ...
  • check deploy: Test the site deployment configuration.
  • check push: Test the code push configuration.
  • check imap: Test the IMAP connection and list folders.
  • check smtp: Send a test email.
  • check notify: Send a test notification email. Options: --type report|alert, --to EMAIL
  • check db: Show database driver, connectivity, retention, permissions, and statistics.
  • db migrate: Copy all data between supported database backends. Options: --to BACKEND, --target-config FILE
  • db optimize: Run backend-specific maintenance on the active database.
  • export: Export database contents to JSON. Options: --format json, --output FILE
  • reset: Wipe all data from the database. Options: --confirm
  • prune: Delete reports older than N days. Options: --older-than N

File-based analysis commands

These work directly on .eml files without using the database. They are useful for one-off analysis, debugging, and working with sample files.

  • parse: Parse individual .eml files. Options: --all, --json, --failures
  • summary: Aggregate summary by domain. Options: --json, --report, --email
  • failures: List all DMARC failures. Options: --json, --report, --email
  • passes: List all DMARC passes. Options: --json
  • sources: Aggregate by mail server. Options: --json, --anomalies, --report, --email
  • trends: Daily pass/fail trends. Options: --json, --report, --email
  • anomalies: Detect anomalous patterns. Options: --json, --report, --email
  • report: Combined HTML report. Options: --report, --email

Fetching reports

dmarcreport supports two source models:

  • pull reports from IMAP
  • ingest report files from a local directory

For DMARC and TLS-RPT, the cleanest setup is usually separate folders or separate mailboxes.

From a local directory

Place DMARC .eml files in the source directory (default: ./source/), then:

dmarcreport fetch --source local

Imported files are moved to ./source/Imported/. Files that fail to parse are moved to ./source/Failed/. Duplicate reports are detected and skipped.

TLS-RPT can use its own local source path and accepts .json, .json.gz, or .eml messages that carry TLS-RPT attachments:

dmarcreport fetch --source local --family tlsrpt

By default these files are read from ./source/tlsrpt/ and moved to ./source/tlsrpt/Imported/ or ./source/tlsrpt/Failed/.

From IMAP

Configure IMAP in dmarcreport.yaml, then:

dmarcreport fetch --source imap

Messages are processed and then moved to Imported or Failed subfolders under the configured IMAP folder. If imap.folder is INBOX/DMARC, mail ends up in INBOX/DMARC/Imported or INBOX/DMARC/Failed. The subfolders are created automatically if needed.

TLS-RPT can use the same mailbox, a different folder, or completely separate credentials:

dmarcreport fetch --source imap --family tlsrpt

That uses tlsrpt.imap.* from the configuration file, so you can keep TLS-RPT in a separate mailbox if needed.
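A sketch of what that separation might look like in dmarcreport.yaml. The exact key names under imap and tlsrpt.imap are assumptions based on the documented imap.folder and TLSRPT_IMAP_* settings; check dmarcreport.yaml.example for the authoritative names:

```yaml
imap:
  host: imap.example.com
  folder: INBOX/DMARC         # DMARC aggregate and forensic mail

tlsrpt:
  imap:
    host: imap.example.com    # can also be a completely different mailbox
    folder: INBOX/TLSRPT      # keep TLS-RPT separate from DMARC
```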

Phase-1 note: if you fetch both families with --family all, keep them in separate local directories or separate IMAP folders. Using the exact same directory or mailbox folder for both families is blocked to avoid misclassifying messages.

Dry run

If you want to see what would happen before anything is moved or imported:

dmarcreport fetch --source local --dry-run
dmarcreport fetch --source imap --dry-run
dmarcreport fetch --source imap --family all --dry-run
dmarcreport fetch --source imap --chunk 500

For very large first-run imports, use --chunk N and work through the source in smaller runs. This is especially useful for large IMAP folders, where long sessions can appear stalled even while they are still making progress.

If you set database.retention_days, each real fetch run will also prune old data afterward. If you want to skip that for a particular run, use:

dmarcreport fetch --source imap --skip-prune

Static dashboard

The generate command builds a static HTML site from the data you have already imported:

dmarcreport generate

The generated site includes:

  • Dashboard with pass rate, message volume charts, domain summary, recent failures, and "last updated" timestamp
  • Year and month pages with daily breakdowns and trend charts
  • Per-domain pages with provider summary, source analysis, and failure history
  • Failures page with collapsible month sections (most recent expanded)
  • Forensics page with individual failure details from RUF reports
  • TLS-RPT page with policy-domain summaries, failure result categories, and recent TLS delivery failures
  • About page with project attribution and license
  • Provider identification: source IPs resolved to provider names (Google, Microsoft, Yahoo, etc.) via reverse DNS suffix matching
  • Domain names displayed in monospace font; domains sorted by failure count (most troubled first)
  • Auto/light/dark mode with theme toggle
  • Works from file:// if you just want to open it locally
[Screenshots: Domain detail and Failures pages]
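The reverse DNS suffix matching used for provider identification can be sketched roughly as follows. This is an illustrative sketch, not dmarcreport's actual code: the suffix table is invented for the example, and a real run would first resolve each source IP to a hostname (e.g. with socket.gethostbyaddr).

```python
# Illustrative sketch only: map a reverse-DNS hostname to a provider
# name by matching known domain suffixes, longest suffix first.
# The table contents are examples, not dmarcreport's real data.
PROVIDER_SUFFIXES = {
    "google.com": "Google",
    "outlook.com": "Microsoft",
    "protection.outlook.com": "Microsoft",
    "yahoodns.net": "Yahoo",
}

def provider_for_hostname(hostname: str) -> str:
    """Return a provider name for a reverse-DNS hostname, or 'Unknown'."""
    labels = hostname.lower().rstrip(".").split(".")
    # Walk from the full hostname down to shorter suffixes, so
    # "protection.outlook.com" wins over "outlook.com".
    for i in range(len(labels)):
        suffix = ".".join(labels[i:])
        if suffix in PROVIDER_SUFFIXES:
            return PROVIDER_SUFFIXES[suffix]
    return "Unknown"

print(provider_for_hostname("mail-ed1-f41.google.com"))  # Google
print(provider_for_hostname("mta7.am0.yahoodns.net"))    # Yahoo
print(provider_for_hostname("unknown.example.net"))      # Unknown
```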

To regenerate from scratch:

dmarcreport generate --clean

Demo dashboard

If you want to look at the UI without touching live data, use the demo command:

dmarcreport demo

This command always uses a temporary SQLite database, even if your real installation uses MariaDB or MySQL. It generates synthetic DMARC, forensic, and TLS-RPT data, builds a local static site, and exits without touching your live database.

By default the output goes to a sibling directory such as ./public-demo/. Use --output if you want a different local target:

dmarcreport demo --output ./public-demo
dmarcreport demo --reports 300 --tlsrpt-reports 220 --forensics 40

Deployment

After generation, deploy the site to wherever it should live:

dmarcreport deploy

Or generate and deploy in one step:

dmarcreport generate --deploy

Supported deploy methods (set site.deploy in config):

  • local: No-op (default); the site stays in ./public/
  • rsync: Sync to a remote server via rsync
  • git: Push to a git branch (e.g., gh-pages)
  • ftp: Upload via FTP or FTPS (plain/TLS/SSL)

Pushing code

If you develop locally and run a separate live installation, push syncs the application code without touching its configuration, database, or generated content.

# Preview what would be pushed
dmarcreport push --dry-run

# Push code to the live server
dmarcreport push

# Or copy to a local staging directory instead of a configured remote target
dmarcreport push --local ./staging

Configure the target in dmarcreport.yaml if you want remote pushes:

push:
  method: rsync
  target: user@server:/opt/dmarcreport/

Or via FTP:

push:
  method: ftp
  ftp_host: ftp.example.com
  ftp_username: user
  ftp_password: pass
  ftp_path: /opt/dmarcreport/

If you do not configure push.method, dmarcreport push will tell you to either configure a remote method or use --local. It no longer silently assumes rsync.

What gets pushed: dmarcreport/ package, pyproject.toml, README.md, CHANGELOG.md, DATABASE.md, INSTALL.md, DMARC-GUIDE.md, TLSRPT-GUIDE.md, TRANSLATE.md, LICENSE, dmarcreport.yaml.example

Python cache directories and compiled files such as __pycache__ and .pyc are filtered out automatically.

What stays untouched on LIVE: dmarcreport.yaml, data/, source/, public/, .venv/, and all other environment-specific files

push copies application files. It does not install the package on the target host, and it does not build or refresh the target virtual environment for you.

Recommended live-host layout

On a Linux host, a simple layout is:

/opt/dmarcreport/
  .venv/
  dmarcreport/
  pyproject.toml
  dmarcreport.yaml
  ...

The .venv can live in the same directory as the project, but it should be created on the Linux host itself. Do not copy a virtual environment from Windows to Linux.

First-time setup on the live host

cd /opt/dmarcreport
python3 -m venv .venv
. .venv/bin/activate
python -m pip install .

If that host uses MySQL or MariaDB, install the optional extra there as well:

python -m pip install ".[mysql]"

Normal update routine after a code push

If you push to a local staging directory and then copy that tree to the live Linux host, the usual follow-up is:

cd /opt/dmarcreport
. .venv/bin/activate
python -m pip install .
dmarcreport --about

That last command is just a quick sanity check that the installed launcher and the copied code agree.

If you prefer, you can also verify the source tree directly before reinstalling:

.venv/bin/python -m dmarcreport.cli --about

If that works but dmarcreport --about does not, the code is in the right place and the install step is what is missing.

After a push, the live server may need to reinstall the package, especially if dependencies changed:

# On the live server
cd /opt/dmarcreport && . .venv/bin/activate && python -m pip install .

Alerts and reports

The notify command evaluates alert rules and sends scheduled reports via email:

dmarcreport notify

Configure in dmarcreport.yaml:

reports:
  daily:
    enabled: true
    time: "08:00"
    recipients: [[email protected]]
  weekly:
    enabled: true
    day: monday
    time: "08:00"
    recipients: [[email protected]]

alerts:
  - name: high-failure-rate
    condition: "failure_rate > 5%"
    period: 24h
    recipients: [[email protected]]
  - name: spike-detection
    condition: "failures > 50"
    period: 24h
    recipients: [[email protected]]
  - name: tlsrpt-failure-rate
    condition: "tlsrpt_failure_rate > 5%"
    period: 24h
    recipients: [[email protected]]

Force send a report regardless of schedule:

dmarcreport notify --force daily

Supported alert conditions:

  • failure_rate > N%: triggers when the failure rate exceeds the threshold
  • failures > N: triggers when the failure count exceeds the threshold
  • tlsrpt_failure_rate > N%: triggers when the TLS-RPT failed-session rate exceeds the threshold
  • tlsrpt_failed_sessions > N: triggers when the TLS-RPT failed-session count exceeds the threshold

Periods: 24h, 7d, 1w, 1m
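As a rough illustration, the condition and period syntax above can be parsed along these lines. This is a sketch of the documented syntax, not dmarcreport's internal evaluator, and treating 1m as 30 days is an assumption made for the example.

```python
import re
from datetime import timedelta

# Sketch: parse the documented condition and period syntax.
# Mirrors the syntax listed above, not dmarcreport's internals.
_CONDITION = re.compile(
    r"^\s*(?P<metric>\w+)\s*>\s*(?P<value>\d+(?:\.\d+)?)\s*(?P<pct>%?)\s*$"
)
_PERIODS = {"h": "hours", "d": "days", "w": "weeks"}

def parse_condition(text: str):
    """Return (metric, threshold, is_percentage) for e.g. 'failures > 50'."""
    m = _CONDITION.match(text)
    if not m:
        raise ValueError(f"unsupported condition: {text!r}")
    return m["metric"], float(m["value"]), m["pct"] == "%"

def parse_period(text: str) -> timedelta:
    """Convert '24h', '7d', '1w', or '1m' to a timedelta."""
    unit = text[-1]
    count = int(text[:-1])
    if unit == "m":          # month approximated as 30 days (assumption)
        return timedelta(days=30 * count)
    return timedelta(**{_PERIODS[unit]: count})

print(parse_condition("failure_rate > 5%"))  # ('failure_rate', 5.0, True)
print(parse_period("24h"))                   # 1 day, 0:00:00
```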

Testing notifications

Test your SMTP configuration and preview notification emails:

# Send a test report email (uses live data or sample data if DB is empty)
dmarcreport check notify

# Send a test alert email
dmarcreport check notify --type alert

# Send to a specific address instead of the configured notification recipients
dmarcreport check notify --to [email protected]

By default, check notify sends to the configured recipients for the notification type being tested:

  • --type report uses the union of configured report recipients
  • --type alert uses the union of configured alert recipients
  • If none are configured yet, it falls back to smtp.from_address

Test emails are clearly marked as [TEST] in the subject line and are not logged to the notification history.

Configuration

Create dmarcreport.yaml from the example:

cp dmarcreport.yaml.example dmarcreport.yaml

All settings support environment variable overrides. See dmarcreport.yaml.example for the full reference with all options documented.

Priority order: Environment variables > ./dmarcreport.yaml > ~/dmarcreport.yaml > defaults
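The precedence can be sketched as a simple layered lookup. This is illustrative only; the key and variable names come from the tables in this README, not from the implementation.

```python
import os

# Illustrative sketch of the precedence: environment variables beat
# ./dmarcreport.yaml, which beats ~/dmarcreport.yaml, which beats defaults.
# The dicts stand in for the two parsed YAML files.
def lookup(key, env_var, local_cfg, home_cfg, default):
    if env_var in os.environ:
        return os.environ[env_var]
    if key in local_cfg:
        return local_cfg[key]
    if key in home_cfg:
        return home_cfg[key]
    return default

local_cfg = {"driver": "sqlite"}                         # ./dmarcreport.yaml
home_cfg = {"driver": "mariadb", "retention_days": 365}  # ~/dmarcreport.yaml

# With DB_DRIVER unset, ./dmarcreport.yaml wins:
print(lookup("driver", "DB_DRIVER", local_cfg, home_cfg, "sqlite"))
# The home config fills in keys the local config lacks:
print(lookup("retention_days", "DB_RETENTION_DAYS", local_cfg, home_cfg, None))
```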

Global options

  • --config / -c: Path to the configuration file
  • --about: Show version and credits

Use --config / -c to specify a different configuration file:

dmarcreport --config production.yaml fetch --source imap
dmarcreport -c staging.yaml check config

This lets you maintain separate configurations (e.g., DEV, staging, production) in the same directory and invoke them independently.

Key environment variables:

  • DB_DRIVER: Database backend (sqlite, mysql, mariadb)
  • DB_PATH: Database file path
  • DB_RETENTION_DAYS: Automatic retention period in days
  • DB_HOST, DB_PORT, DB_USER, DB_PASSWORD, DB_NAME: MySQL/MariaDB connection
  • DB_CHARSET, DB_COLLATION: MySQL/MariaDB charset and collation
  • SOURCE_PATH: Local .eml source directory
  • IMAP_HOST, IMAP_USER, IMAP_PASSWORD: IMAP connection
  • TLSRPT_SOURCE_PATH: Local TLS-RPT source directory
  • TLSRPT_IMAP_HOST, TLSRPT_IMAP_USER, TLSRPT_IMAP_PASSWORD: TLS-RPT IMAP connection
  • SMTP_HOST, SMTP_FROM: SMTP for notifications
  • SITE_OUTPUT: Static site output directory
  • SITE_DEPLOY: Deploy method
  • FTP_SECURITY: FTP security mode (plain, tls, ssl)
  • LOG_PATH: Activity log file path
  • NUMBERS_LOCALE: Number format (en, sv, de, none)
  • LANGUAGE: Dashboard language (en, sv)

Language

The generated static dashboard can be displayed in different languages:

language: sv  # Swedish

Available languages: en (English, default), sv (Swedish). See TRANSLATE.md for instructions on adding more translations.

Domain grouping

Organise domains on the dashboard into named groups:

site:
  domain_groups:
    Main Domains:
      - example.com
      - example.org
    Client Domains:
      - client1.com
      - client2.com

Domains not listed in any group appear under "Other Domains". When no groups are configured, all domains are shown in a single table sorted by failure count.

Number formatting

Control thousand separators and decimal points in statistics across the dashboard, email reports, and CLI output:

# en (default): 1,234.5  |  sv: 1 234,5  |  de: 1.234,5  |  none: 1234.5
numbers_locale: en

Validate your configuration:

dmarcreport check config

Database backends

For smaller installs, keep the default SQLite configuration:

database:
  driver: sqlite
  path: ./data/dmarcreport.db
  retention_days: 365

For larger installs, use MySQL or MariaDB:

database:
  driver: mariadb
  host: 127.0.0.1
  port: 3306
  username: dmarcreport
  password: secret
  name: dmarcreport
  charset: utf8mb4
  collation: utf8mb4_swedish_ci

MySQL/MariaDB support requires the optional extra:

pip install ".[mysql]"

See INSTALL.md for example SQL to create the database, application user, and grants for MySQL/MariaDB.

If you prefer scheduled cleanup instead of manual pruning, add a retention policy:

database:
  driver: sqlite
  path: ./data/dmarcreport.db
  retention_days: 365

That tells fetch to prune data older than one year after each normal run. Use --skip-prune if you need to bypass it once.

Database migration

To migrate between SQLite and MySQL/MariaDB, keep one config file for the source database and one for the target database:

# SQLite -> MariaDB
dmarcreport -c sqlite.yaml db migrate --to mariadb --target-config mariadb.yaml

# MariaDB -> SQLite
dmarcreport -c mariadb.yaml db migrate --to sqlite --target-config sqlite.yaml

Rules:

  • The target database must already exist and be empty.
  • Same-backend migrations are blocked.
  • SQLite <-> MySQL/MariaDB is supported.
  • MySQL <-> MariaDB direct migration is not supported yet.

Cron setup

Use the full path to the virtualenv entry point; no cd or source activate is needed:

# Fetch new reports every 30 minutes
*/30 * * * * /opt/dmarcreport/.venv/bin/dmarcreport --config /opt/dmarcreport/dmarcreport.yaml fetch --source imap > /dev/null

# Regenerate and deploy site 5 minutes after each fetch
5,35 * * * * /opt/dmarcreport/.venv/bin/dmarcreport --config /opt/dmarcreport/dmarcreport.yaml generate --deploy > /dev/null

# Evaluate alerts and send scheduled reports once a day
0 7 * * * /opt/dmarcreport/.venv/bin/dmarcreport --config /opt/dmarcreport/dmarcreport.yaml notify > /dev/null

Path resolution: when --config is given an absolute path, all relative paths in the YAML file (e.g. ./data/dmarcreport.db, ./public) are resolved relative to the directory containing that config file, not the working directory. This means the cron entries above work correctly with relative paths in dmarcreport.yaml.

If you run without --config (the config is found by searching the current working directory) or set paths via environment variables, relative paths resolve against the working directory, which is unpredictable under cron. In that case, use absolute paths, or run dmarcreport check config to get a warning if any paths are still relative.
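The resolution rule can be sketched like this; it is an illustration of the behavior described above, not the project's implementation.

```python
from pathlib import PurePosixPath

# Relative paths from the YAML file are anchored at the config file's
# directory; absolute paths are used as-is.
def resolve_config_path(config_file: str, value: str) -> PurePosixPath:
    p = PurePosixPath(value)
    if p.is_absolute():
        return p
    return PurePosixPath(config_file).parent / p

print(resolve_config_path("/opt/dmarcreport/dmarcreport.yaml",
                          "./data/dmarcreport.db"))
# /opt/dmarcreport/data/dmarcreport.db
```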

Redirect stdout to /dev/null for silent operation; cron will still mail you on errors (stderr).

Activity Log

All operations (fetch, generate, deploy, push, notify) are logged. Two targets are supported:

File (default): an append-only log file; rotation is left to logrotate or similar:

log:
  target: file
  path: /opt/dmarcreport/data/activity.log
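With the file target, a logrotate policy along these lines keeps the log from growing without bound. This is an illustrative /etc/logrotate.d/dmarcreport entry, not something shipped with the project; adjust the path and retention to taste.

```
/opt/dmarcreport/data/activity.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
}
```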

Syslog: writes to the system logger, useful for log tailers, SIEMs, and journalctl:

log:
  target: syslog
  syslog_facility: local0    # local0-local7, daemon, user
  syslog_ident: dmarcreport  # program name shown in syslog

On Linux, syslog connects to /dev/log; on macOS, to /var/run/syslog; otherwise it falls back to UDP on localhost:514. Entries look like:

Mar  4 12:00:00 hostname dmarcreport: [FETCH] Imported 3 reports, 0 duplicates

Each config file can have its own log target, so separate environments write to separate destinations.

FTP security modes

FTP connections support three security modes (set security in config):

  • plain: Unencrypted FTP (default); port 21
  • tls: Explicit FTPS, upgraded via AUTH TLS; port 21
  • ssl: Implicit FTPS, SSL from the start of the connection; port 990

This applies to both site deployment (site.ftp.security) and code push (push.ftp_security).
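For example, explicit FTPS for site deployment might be configured like this. This sketch is built on the documented site.deploy and site.ftp.security keys; verify the surrounding keys against dmarcreport.yaml.example:

```yaml
site:
  deploy: ftp
  ftp:
    security: tls    # plain | tls | ssl
```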

Forensic reports (RUF)

In addition to aggregate reports (RUA), dmarcreport can parse forensic failure reports (RUF/AFRF, RFC 6591). These contain per-message failure details including original headers and authentication results.

Note: Most large providers (Google, Microsoft, Yahoo) do not send RUF reports due to privacy concerns. Some smaller providers and corporate mail servers do. Configure ruf in your DMARC record to receive them when available.

# List forensic reports from the database
dmarcreport forensics

# JSON output with custom limit
dmarcreport forensics --json --limit 100

The static dashboard includes a Forensics page that displays all imported forensic reports. See DMARC-GUIDE.md for recommended DMARC record configuration to enable RUF reporting.

Database maintenance

# Show database stats
dmarcreport check db

# Run backend-specific maintenance
dmarcreport db optimize

# Preview what prune would remove
dmarcreport prune --older-than 365 --dry-run

# Delete reports older than 1 year
dmarcreport prune --older-than 365

# Wipe everything and start fresh
dmarcreport reset --confirm

check db also shows the active retention policy and, for SQLite, the connection settings in use for journal mode, synchronous mode, temp storage, and cache size.

prune --dry-run is there when you want to see the impact before deleting anything.

Credits

Written by Joaquim Homrighausen, sponsored by WebbPlatsen i Sverige AB, Sweden, bringing you GDPR-safe Internet Solutions since 1998.

Dashboard built with Bulma and Chart.js, both MIT licensed.

For commercial support and SaaS hosting of dmarcreport, please contact [email protected].

License

Copyright 2026 Joaquim Homrighausen. All rights reserved.

AGPLv3; see LICENSE.

All runtime dependencies (Typer, Pydantic, Rich, PyYAML, optional PyMySQL) are MIT/BSD licensed. Bundled frontend assets (Bulma CSS, Chart.js) are MIT licensed.