
Bots Now Generate 53% of All Web Traffic. Your Analytics Are Lying to You.

If 53% of your traffic is automated, then your bounce rate, session duration and pageviews per session are measuring something other than customer behaviour.

Filip Ivanković · 2 min read

More than half the traffic hitting your website is not human. Imperva's 2026 Bad Bot Report confirms automated traffic now accounts for 53% of all web activity, up from 51% last year. Human activity has dropped to 47% and continues falling. Malicious bots alone represent 37% of total traffic.

This is the sixth consecutive year bad bot activity has risen. It is no longer a trend. It is the new baseline.

For marketers, the implications are not about security. They are about measurement. Every metric you report, every conversion rate you calculate, every traffic source you attribute is potentially contaminated by activity that has nothing to do with a human making a buying decision.

Why it matters

The rise is driven by AI and large language models making bot creation cheaper and more sophisticated. Scrapers, crawlers, credential stuffers and AI training bots now operate at a scale that makes them indistinguishable from real traffic in aggregate reporting. Financial services, healthcare and ecommerce are the most targeted sectors.

53% of all web traffic is now automated, with human activity at just 47%.

For a marketing team reporting to a board, this corrupts everything. Your traffic is inflated. Your conversion rates are deflated (real conversions divided by bot-inflated sessions). Your cost-per-acquisition looks worse than it is because a share of the clicks you are paying for come from bots that will never convert. Your retargeting audiences are polluted with bots. Your A/B test results are skewed by non-human interactions.
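To make the distortion concrete, here is a rough back-of-envelope sketch in Python. Every input is an assumption chosen for illustration (a 30% bot share, 20% invalid paid clicks), not a figure from the Imperva report; substitute your own numbers.

```python
# Illustrative only: all inputs below are assumed, not taken from any report.
reported_sessions = 100_000        # sessions as GA4 reports them
bot_share = 0.30                   # assumed share of sessions that are automated
conversions = 1_500                # real, verified conversions
paid_spend = 50_000                # total paid media spend ($)
invalid_click_share = 0.20         # assumed share of paid clicks that came from bots

human_sessions = reported_sessions * (1 - bot_share)

reported_cvr = conversions / reported_sessions   # 1.50%
true_cvr = conversions / human_sessions          # ~2.14%

# CPA as reported, versus what it would look like if the spend wasted on
# bot clicks (which can never convert) were eliminated.
reported_cpa = paid_spend / conversions                                # $33.33
effective_cpa = paid_spend * (1 - invalid_click_share) / conversions   # $26.67

print(f"Reported CVR: {reported_cvr:.2%}   True CVR: {true_cvr:.2%}")
print(f"Reported CPA: ${reported_cpa:.2f}  Effective CPA: ${effective_cpa:.2f}")
```

Same spend, same conversions: only the bot assumption changes, and the story you tell the board changes with it.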

The marketers still reporting raw GA4 session counts as a KPI are making decisions on contaminated data. The ones filtering aggressively, cross-referencing with server logs and segmenting by known-human signals are getting closer to reality.

What to do about it

Four moves to start with:

1. Assume your traffic numbers are overstated by 15 to 30% even after GA4's built-in filtering. GA4 excludes known bots but misses sophisticated ones.
2. Segment your analytics by engagement quality signals (scroll depth, time on page, multiple pageviews in sequence) to isolate human-like behaviour; a minimal sketch follows this list.
3. Check your paid media campaigns for click fraud, particularly on display and programmatic channels where bot rates are highest.
4. If you run ecommerce, audit your cart abandonment rate against bot filtering. A significant portion of abandoned carts may never have been real.
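As a starting point for step 2, here is a minimal sketch that segments session-level data by engagement signals. It assumes you have flattened an export (for example from the GA4 BigQuery export) into a CSV with the hypothetical columns shown below; the column names and thresholds are assumptions to tune against your own data, not a standard schema.

```python
import pandas as pd

# Hypothetical session-level export with columns:
# session_id, scroll_depth, engaged_time_sec, pageviews, converted
sessions = pd.read_csv("sessions_export.csv")

# Engagement-quality signals: thresholds are illustrative, not a benchmark.
human_like = (
    (sessions["scroll_depth"] >= 0.25)       # scrolled at least a quarter of the page
    & (sessions["engaged_time_sec"] >= 10)   # spent meaningful time engaged
    & (sessions["pageviews"] >= 2)           # viewed more than one page in sequence
)

filtered = sessions[human_like]

print(f"All sessions:        {len(sessions):,}")
print(f"Human-like sessions: {len(filtered):,} ({len(filtered) / len(sessions):.0%})")
print(f"Reported conversion rate: {sessions['converted'].mean():.2%}")
print(f"Filtered conversion rate: {filtered['converted'].mean():.2%}")
```

The gap between the reported and filtered figures is a rough proxy for how much automated traffic is distorting your headline metrics; the same filter can be reused to sanity-check cart abandonment in step 4.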

The web is no longer a primarily human environment. Your analytics practice needs to account for that reality, or every decision you make downstream is built on a lie.



Filip Ivanković, Founder, New Rebellion

10+ years leading performance marketing across agencies and in-house teams in Australia. Writes about the gap between marketing activity and commercial outcomes, and what it takes to close it.

