Email Deliverability Data
We just published our 2004 year-end email deliverability report. Feel free to download the PDF, but I'll summarize here. First, this report is very different from the reports published by Email Service Providers like Digital Impact and DoubleClick, because (a) it measures deliverability across a broad cross-section of mailers, not just a single ESP's clients, and (b) it is a true measure of deliverability (what actually made it to the inbox), as opposed to the way some ESPs measure and report deliverability, which is usually just the percentage of email that didn't bounce or get outright blocked as spam.
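To make the distinction concrete, here's a small sketch with invented numbers (not from the report) showing how the two ways of counting diverge. One plausible convention, assumed here, is to compute the false positive rate over accepted (non-bounced) mail:

```python
# Hypothetical campaign numbers, invented purely for illustration.
sent = 100_000
bounced_or_blocked = 4_000   # rejected or blocked outright
junk_foldered = 21_000       # accepted by the ISP, but routed to the junk folder
inboxed = sent - bounced_or_blocked - junk_foldered

# The ESP-style metric: anything that didn't bounce counts as "delivered".
esp_style_rate = (sent - bounced_or_blocked) / sent

# The true measure: what actually made it to the inbox.
true_inbox_rate = inboxed / sent

# False positives as a share of accepted mail (assumed convention).
false_positive_rate = junk_foldered / (sent - bounced_or_blocked)

print(f"ESP-style deliverability: {esp_style_rate:.0%}")    # 96%
print(f"True inbox rate:          {true_inbox_rate:.0%}")   # 75%
print(f"False positive rate:      {false_positive_rate:.0%}")
```

With these made-up numbers, an ESP could report 96% deliverability while only 75% of the mail actually reached the inbox.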
Headline number one: the "false positive" problem (non-spam ending up in the junk mailbox) is getting worse, not better. Here’s the trend:
Full year 2004: 22%
Second half 2003: 18.7%
First half 2003: 17%
Second half 2002: 15%
Headline number two: mailers who work on the problem can have a huge impact on their deliverability. Obviously, I'm biased toward Return Path's own solution for mailers, but I think you can extrapolate our data to the broader universe: companies that work on understanding, measuring, and solving the root causes of weak deliverability can raise their inbox rate dramatically in a short time. In our study, the average improvement was a drop in false positives from 22% to about 9% over the first three months, and we have a number of mailers who are now closer to the 2% false positive level on a regular basis.