Quarterly Report

The State of DMARC:
First Quarter, 2026

What the DMARCeye monitoring dataset reveals about organizations already engaged with DMARC: adoption, compliance, and the senders moving their mail, as of Q1 2026.

May 2026 · DMARCeye Research · dmarceye.com
Key Findings
  1. Over one-third of DMARC-engaged domains haven't reached enforcement.

    About 37% are still at p=none (monitor-only); only 26% have reached p=reject. The path from monitoring to enforcement is where the industry is stuck, and that's among domains already engaged with DMARC. The picture across the broader internet is worse, as our scanner data shows below.

  2. Compliance climbs sharply with email volume.

    Domains sending under 100 emails a month average 62% DMARC compliance. Domains sending over 10 million a month average 99.8%. Higher-volume senders are essentially forced into clean configurations; smaller senders aren't.

  3. Staged rollout is barely used.

    Among domains at p=reject, 94% enforce at 100% from day one, skipping DMARC's built-in pct= mechanism for gradual rollout. The DMARCbis revision currently in IETF review removes pct= entirely in favor of a simpler binary toggle, on the grounds that the granular percentage control was rarely used in practice. The "all-or-nothing" reality of DMARC enforcement isn't going to change under the new spec.

About

About DMARCeye

DMARCeye is a DMARC monitoring platform that turns aggregate report data into a clear picture of who is sending as your domain and what to do about it. It was built by the engineering team behind Ecomail, a European email marketing platform sending over a billion emails a month, and tested at scale in a high-volume sending environment before being offered as a standalone product.

This report is one of the ways we are sharing what we are seeing in that data. A more detailed company description appears at the end.

About This Report

This report is built primarily on the DMARCeye monitoring platform's reporting data as of the end of Q1 2026: DMARC aggregate reports collected from mailbox providers worldwide for several thousand domains under active monitoring, spanning multiple industries, regions, and sender-volume tiers. One view supplements that with a sample of several thousand public-facing domains scanned blindly for DMARC, SPF, and DKIM records.

All figures represent a snapshot at the end of Q1 2026 (March 2026), reflecting the cumulative state of DMARC posture and authentication activity as of that date.

A full methodology section, including details on how DMARC compliance is computed, how sender attribution works, and which records are excluded, appears at the end of the report.


DMARC Policy Distribution

This chart shows where DMARC-engaged domains sit on the policy spectrum, and how reliably the typical domain in each tier actually authenticates. Every domain is counted equally, regardless of how much mail it sends. p=none is monitor-only; p=quarantine sends suspicious mail to spam; p=reject is full enforcement. About a quarter of monitored domains have reached full enforcement, but moving to p=reject doesn't automatically fix authentication. Even at full enforcement, one in ten domains still struggles to clear DMARC.

Policy Distribution and Per-Policy Compliance

Bars show the share of active domains at each policy level; the pill above each bar is the average domain compliance in that tier.

Active domains with a valid DMARC policy in Q1 2026
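The policy tiers in this chart come straight from each domain's published DMARC TXT record. As an illustration (a minimal sketch, not DMARCeye's actual pipeline), classifying a domain's posture from the record published at `_dmarc.<domain>` might look like this; the example record is invented:

```python
# Minimal sketch (not DMARCeye's pipeline): parse a DMARC TXT record
# into its tag/value pairs and read the policy tier used in this report.
def parse_dmarc(record: str) -> dict:
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, value = part.split("=", 1)
            tags[key.strip().lower()] = value.strip()
    return tags

def policy_tier(record: str) -> str:
    tags = parse_dmarc(record)
    if tags.get("v", "").upper() != "DMARC1":
        return "no valid DMARC record"
    # a record missing p= is technically invalid; treated as monitor-only here
    return tags.get("p", "none")

print(policy_tier("v=DMARC1; p=quarantine; rua=mailto:reports@example.com"))
# → quarantine
```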

DMARC Adoption: Monitored Domains vs. the Public Internet

DMARCeye runs a public scanner that has collected a wide, unfiltered sample of thousands of domains. Comparing that sample to the DMARC-engaged population (organizations actively monitoring their own DMARC posture) shows what changes once teams take the problem seriously. 28% of open-internet domains have no DMARC record at all, a threshold the engaged group has by definition cleared. Strict enforcement at the right end of the spectrum (p=reject) is also more common among the engaged: 27% vs. 18%.

DMARC Posture: Engaged Organizations vs Public Internet

What fraction of each population sits at each DMARC posture.

"DMARC-engaged" = organizations actively monitoring their domains' DMARC posture · "Open internet" = a broader sample of public-facing domains scanned by DMARCeye's tools

DMARC Compliance Climbs With Email Volume

This chart shows the compliance rate broken out by how much email a domain sends. Every domain is counted equally within its tier, regardless of how much mail it sends. The lowest-volume domains struggle the most; many haven't fully configured SPF and DKIM. As volume scales up, authentication tightens dramatically. The biggest senders run effectively perfect compliance.

Average Compliance Rate by Domain Email Volume

Each bar is the average compliance rate of all domains in that volume bucket.

Each domain counted equally within its volume bucket: small domains aren't drowned out by big ones
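The equal weighting described above reduces to a per-bucket mean in which every domain contributes exactly one data point. A sketch of that computation, with invented volumes, compliance rates, and simplified bucket edges (the report's actual tiers are more granular):

```python
# Illustrative sketch: average compliance per volume bucket, each domain
# counted once regardless of how much mail it sends. All numbers and the
# three bucket edges below are invented for illustration.
from collections import defaultdict

def bucket_for(monthly_volume: int) -> str:
    if monthly_volume < 100:
        return "<100"
    if monthly_volume < 10_000_000:
        return "100-10M"
    return "10M+"

def average_compliance(domains):
    # domains: list of (monthly_volume, compliance_rate) pairs
    buckets = defaultdict(list)
    for volume, compliance in domains:
        buckets[bucket_for(volume)].append(compliance)
    return {b: sum(rates) / len(rates) for b, rates in buckets.items()}

sample = [(50, 0.40), (80, 0.84), (5_000_000, 0.97), (20_000_000, 0.998)]
print(average_compliance(sample))
```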

The Compliance Tail

Tier averages hide a wide spread; this view shows what falls inside each one. Each domain counts as one, regardless of how much mail it sends. The headline finding: even the biggest senders have a small share of domains authenticating poorly. Roughly 1 in 100 domains in the >1M tier still authenticates below 50%. In the <100 tier, that ratio is closer to 1 in 3. The 10M+ tier is uniformly healthy in this snapshot, with no domain falling below 90% compliance.

Compliance Band Distribution by Volume Tier

Each row is one volume tier. The green portion is the share of domains running ≥99% compliance; the dusty-red portion is the share running below 50%.

More Email Means a Bigger Sending Footprint

As a domain's email volume scales, so does the size of its authenticated email infrastructure (the cumulative pool of distinct IP addresses sending on its behalf). The biggest senders surface thousands of distinct sources in their DMARC reports. This isn't a count of separate vendors or services. A single Google Workspace tenant can surface dozens of distinct sending IPs across regions, and a single ESP customer can surface IPs from multiple network ranges. These bars represent the total authenticated infrastructure footprint, not the count of business relationships.

Average Distinct Sending IPs per Domain, by Volume Tier

Each bar is the average count of distinct sending IPs across domains in that volume bucket.

Top Vendors Behind the Mail

Two vendors (Klaviyo and Amazon SES) together carry roughly 58% of total mail volume in the dataset. The rest of the named list is dominated by familiar names: SendGrid, Google, Mailchimp, Microsoft, Mailgun. The remaining quarter sits in the "other / unattributed" bucket: a long tail of customer-specific signing domains, niche regional ESPs, and in-house mail systems. This view groups senders by the brand on the envelope, identified by which service DKIM-signed the message. It's the closest answer to "who actually sent this," and a different question from "whose network physically delivered it." Klaviyo, for example, runs on top of Amazon's infrastructure but signs its own mail, so it appears here as a distinct vendor, but not at all in view 07, which focuses on networks rather than brands.

Top Vendors by Share of Volume

Each bar is the share of total mail volume signed by that vendor.

Vendors identified by the DKIM-signing domain on outgoing mail (e.g. klaviyomail.com → Klaviyo, mcsv.net → Mailchimp). Bars sum to 100%. The "Other / unattributed" bar covers customer-specific signing domains, niche regional ESPs, and in-house mail systems not rolled up into a named vendor family.
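The roll-up described in the footnote amounts to suffix-matching the DKIM signing domain against a map of known vendor families. A hedged sketch of that attribution: the mapping below is ours and illustrative (DMARCeye's actual roll-up list isn't public), though the signing domains shown are real ones used by these vendors.

```python
# Sketch: attribute mail to a vendor family by its DKIM signing domain.
# The mapping is an illustrative subset, not DMARCeye's full list.
VENDOR_SUFFIXES = {
    "klaviyomail.com": "Klaviyo",
    "amazonses.com": "Amazon SES",
    "sendgrid.net": "SendGrid",
    "mcsv.net": "Mailchimp",
}

def vendor_for(signing_domain: str) -> str:
    domain = signing_domain.lower().rstrip(".")
    for suffix, vendor in VENDOR_SUFFIXES.items():
        # match the suffix itself or any subdomain of it
        if domain == suffix or domain.endswith("." + suffix):
            return vendor
    return "Other / unattributed"

print(vendor_for("mail.klaviyomail.com"))  # → Klaviyo
print(vendor_for("corp.example.com"))      # → Other / unattributed
```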

Top Email Service Providers

This chart ranks the ten biggest email infrastructure providers in the dataset (attributed by ASN, the network owner of each sender IP) by how reliably their mail clears DMARC. Note that some of the biggest sender brands from the previous view don't appear here because they don't own a network. Klaviyo is the clearest example: it ranks #1 by DKIM-signing domain in the chart above, but its mail rides on Amazon and SendGrid infrastructure, so its volume rolls up into those providers' rows here.

The top of the table is essentially perfect on overall compliance, but the SPF and DKIM columns reveal something more subtle. Several providers (Mailchimp, Sendinblue, Mailjet) have very high SPF fail rates yet still post roughly 99% compliance, because DKIM picks up the slack. That's the normal pattern for mail sent on behalf of customer domains: SPF fails because the customer hasn't included the provider's IPs in their SPF record, but the provider DKIM-signs and DMARC passes through alignment. Mailgun and Proofpoint are different. Their DKIM fail rates sit above 22%, meaning more than one in five of their messages arrives without a valid DKIM signature. With both SPF and DKIM unreliable, DMARC has nothing to pass on. That's almost always a configuration problem on the sender side, not a flaw in the provider itself: customers haven't added the right DKIM keys to their DNS.
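The asymmetry in this table falls directly out of DMARC's pass rule: a message passes if at least one of SPF or DKIM both passes its own check and aligns with the From: domain, which is why a sky-high SPF fail rate can coexist with near-perfect compliance. A minimal sketch of that evaluation (domain names are invented; the alignment test approximates relaxed organizational-domain alignment without a public-suffix list):

```python
# Sketch of the DMARC pass rule (RFC 7489): a message passes if at least
# one of SPF or DKIM both passes its own check AND aligns with the From:
# domain. The alignment check below approximates relaxed (organizational-
# domain) alignment without a public-suffix list; treat it as illustrative.
def aligned(identifier: str, from_domain: str) -> bool:
    identifier, from_domain = identifier.lower(), from_domain.lower()
    return (identifier == from_domain
            or identifier.endswith("." + from_domain)
            or from_domain.endswith("." + identifier))

def dmarc_pass(from_domain: str,
               spf_domain: str, spf_ok: bool,
               dkim_domain: str, dkim_ok: bool) -> bool:
    spf_aligned = spf_ok and aligned(spf_domain, from_domain)
    dkim_aligned = dkim_ok and aligned(dkim_domain, from_domain)
    return spf_aligned or dkim_aligned

# Typical ESP mail: SPF checks the ESP's own bounce domain (fails alignment),
# but the provider DKIM-signs with the customer's domain, so DMARC passes.
print(dmarc_pass("shop.example", "bounce.espmail.net", True,
                 "shop.example", True))  # → True
```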

Top 10 ESPs, Ranked by Compliance

Per-provider authentication outcomes across all attributed mail.

 #   Email Service Provider       Compliance   SPF Fail Rate   DKIM Fail Rate
 1   Bird.com Inc.                99.9%        12.2%           0.4%
 2   Amazon.com, Inc.             99.8%        3.5%            0.5%
 3   Mailjet SAS                  99.8%        39.1%           0.2%
 4   SendGrid, Inc.               99.7%        0.6%            0.5%
 5   Sendinblue SAS               99.5%        93.1%           0.5%
 6   MailChimp                    98.9%        89.2%           1.3%
 7   Google LLC                   95.8%        26.5%           11.1%
 8   Microsoft Corporation        90.0%        26.6%           19.6%
 9   Proofpoint, Inc.             78.1%        38.9%           26.3%
10   Mailgun Technologies Inc.    77.2%        24.0%           22.9%
ASN = Autonomous System Number, the network owner of an IP. Mail is attributed by IP rather than DKIM signing domain, so brands using a provider's pipes (e.g. Klaviyo on Amazon SES) roll up to the underlying provider here.

Same ESP, Different Compliance by Country

Even the same global provider performs differently across regions. Google's compliance varies just over a point between the US and India; Microsoft's spread is roughly two points across Ireland, France, and the UK; Amazon's spread is just over three points between Germany and the US. Each provider's regional IP ranges carry a slightly different mix of senders: well-configured tenants, misconfigured tenants, and a portion of spoofed mail attempting to impersonate that provider's infrastructure. The chart can't separate these on its own, but the takeaway holds: a provider's brand alone isn't a guarantee of compliance, and the compliance number you see is a regional figure, not a global one.

Compliance by Provider and Country

Compliance rate of the same ESP varies by where its IPs are located.

Top 3 multi-country ESPs · countries shown have ≥1M email volume

Top DNS Providers, by Market Share

This chart shows where monitored domains manage their DNS, and how well their email authenticates. Cloudflare alone handles a quarter of identified domains. Authentication compliance is strong across major providers (96–99.5%), with one notable exception: Azure DNS at 87.66%, likely reflecting enterprise configurations where mail infrastructure is more complex than the DNS setup suggests.

Top 10 DNS Providers: Domain Share & Compliance

Domain Share: this provider's share of all domains with an identified DNS provider · Compliance: authentication health, weighted by email volume.

Provider compliance legend: Healthy ≥97% · Watch 93–97% · Issue <93%

DMARC Policy by Top-Level Domain

Different top-level domains have markedly different DMARC adoption patterns. The generic .com namespace anchors the chart at the top; its mix is roughly 43% none, 34% quarantine, 22% reject, a useful baseline for comparing against country-specific TLDs. South African domains (.za) lead on enforcement, with over half at p=reject. Czech (.cz) domains run the most permissive, with two-thirds still at p=none.

Policy Mix per TLD

Each row is one TLD, showing what share of its domains sit at each policy level.

Each TLD's percentages sum to 100: domains without a published p= tag (likely onboarding or data-quality cases) are excluded

Subdomain Policy Inheritance

When a domain sets a DMARC policy, it can also explicitly declare a separate policy for its subdomains via the sp= tag. Most domains don't bother; their subdomains inherit the primary automatically. Each row below is one primary policy level, showing what was explicitly declared. The pattern: most domains skip sp= entirely (the subdomain inherits the parent), but among p=reject domains, more than a quarter explicitly declare sp=reject too, reinforcing the strictest setting all the way down.

Subdomain Policy Declared, by Primary Policy

Each row is one primary policy level, showing the share of those domains that explicitly declared each subdomain policy via sp=.

"Not declared" means the domain didn't set a separate sp= tag, so its subdomains inherit the primary p= automatically.

Staged Rollout Usage

When a domain moves to p=quarantine or p=reject, DMARC allows applying the policy to only a percentage of mail at first (via the pct= tag), which is a safer path than flipping to full enforcement overnight. Most domains skip this. 94% of quarantined and 93.5% of rejecting domains apply the policy at 100% from day one. Only about 6% use the staged rollout safety mechanism. The forthcoming DMARCbis revision of the standard reflects this reality: pct= is being deprecated in favor of a simpler binary t=y flag for testing mode, on the grounds that the granular percentage control was rarely used in practice.

Staged Rollout Usage, by Primary Policy

Each row is one primary policy level, showing what share of those domains apply the policy at full enforcement vs at a reduced percentage.

A missing pct= tag defaults to 100 per the DMARC spec, so it's grouped with full enforcement here.
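For reference, the mechanism being retired works by sampling: each message that fails DMARC is independently subject to the published policy with probability pct%, and unsampled messages are handled one policy step down (reject falls back to quarantine, quarantine to none), per RFC 7489. A hedged sketch of that receiver-side behavior:

```python
import random

# Sketch of pct= sampling (RFC 7489): the published policy applies to
# roughly pct% of failing messages; the rest are handled one policy step
# down (reject → quarantine, quarantine → none). Illustrative only.
FALLBACK = {"reject": "quarantine", "quarantine": "none", "none": "none"}

def applied_policy(policy: str, pct: int = 100) -> str:
    # a missing pct= tag defaults to 100, i.e. full enforcement
    if random.randrange(100) < pct:
        return policy
    return FALLBACK[policy]

print(applied_policy("reject", pct=100))  # always "reject" at pct=100
```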

How this report was built

Datasets

This report draws on two distinct datasets:

  • The DMARCeye monitoring dataset. A representative sample of domains actively monitored by the DMARCeye platform, comprising several thousand domains across multiple industries, regions, and sender-volume tiers. Mail attributed to these domains is reflected in their DMARC aggregate (RUA) reports submitted by receiving mail providers worldwide.
  • The public scanner sample. A representative sample of public-facing internet domains scanned blindly for DMARC, SPF, and DKIM record presence and configuration, comprising several thousand domains. This sample is used only in the comparison shown in view 02.

Time period

All figures in this report represent a snapshot taken at the end of Q1 2026 (March 2026), reflecting the cumulative state of DMARC posture and authentication activity across the sampled domains as of that date.

What we measured

For each domain in the monitoring dataset, we computed DMARC compliance as the share of mail that passed DMARC alignment among all mail with a pass-or-fail result over the measurement window. SPF and DKIM fail rates are reported separately against each mechanism's own totals. Sender attribution is computed at two layers: by DKIM signing domain (the "vendor brand" view) and by ASN owner of the sending IP (the "infrastructure" view). The two layers can attribute the same mail differently because layered services like Klaviyo sign their own DKIM but ride on Amazon and SendGrid infrastructure.

What this report doesn't cover

This report focuses on email authentication (DMARC, SPF, and DKIM), not raw deliverability or inbox placement. A domain with high DMARC compliance can still see its mail filtered to spam for reputation reasons; a domain with low DMARC compliance can still reach the inbox if the receiving provider chooses to deliver despite alignment failures. The two are correlated but distinct.

Caveats and exclusions

  • Customer-specific signing domains (signing domains belonging to single end-user organizations rather than vendors that other businesses send through) are excluded from the vendor share view in chart 06. They are bucketed into "Other / unattributed."
  • Mail attributed by ASN can roll up multiple tenant services into a single provider row. Klaviyo's mail volume, for example, counts toward Amazon's and SendGrid's rows in chart 07 because Klaviyo runs on those networks, even though Klaviyo signs its own DKIM and surfaces as a distinct vendor in chart 06.
  • A small number of malformed ASN entries in the underlying network data were filtered out before aggregation.

More About DMARCeye


The findings in this report are shaping how we build DMARCeye. If the data shows that organizations are getting stuck between monitoring and enforcement, and that authentication failures happen when SPF and DKIM are not both properly configured, the answer is not more dashboards, but clearer visibility into what is failing and why. DMARCeye already provides domain-specific recommendations, and lets users ask questions about their own DMARC data in plain language through AI assistants.

DMARCeye is headquartered in Prague, Czech Republic. It is the fourth standalone product built by Ecomail's engineering team, alongside Topol.io (an embeddable email editor), Lettr (a transactional email platform), and Ecomail itself.

DMARCeye offers free DNS, SPF, DKIM, and BIMI checkers, a 14-day free trial on paid plans, and a free tier for single domains. Paid plans start at $4 per domain per month.

dmarceye.com  |  Media contact: jack@dmarceye.com  |  Prague, Czech Republic