A randomized field study just measured what AI Overviews actually do to your traffic. The result lands harder than any anecdote.
Researchers from the Indian School of Business and Carnegie Mellon recruited 1,065 U.S. desktop Chrome users. Three groups, two weeks each, across January and February. The control group saw normal Google. One group had AI Overviews stripped out. One group got AI Mode for everything.
The pre-registered finding: queries that triggered AI Overviews lost 38% of organic clicks. AI Overviews triggered on 42% of queries and sat at the top of the page 85% of the time. Zero-click searches climbed from 54% to 72% when overviews appeared. Outbound clicks per search dropped from 0.61 to 0.38.
That's the cleanest causal data we have on what's happening to publisher traffic right now. Search Engine Journal flagged the working paper this week, posted to SSRN by Saharsh Agarwal and Ananya Sen.
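The study's figures are internally consistent, which is worth checking before building a forecast on them. A quick sanity check on the per-search numbers reported above (light rounding assumed):

```python
# Sanity check: the reported per-search click figures imply the headline reduction.
clicks_without_aio = 0.61  # outbound clicks per search, no AI Overview shown
clicks_with_aio = 0.38     # outbound clicks per search, AI Overview shown

reduction = (clicks_without_aio - clicks_with_aio) / clicks_without_aio
print(f"Click reduction: {reduction:.0%}")  # → 38%
```

The 0.61-to-0.38 drop works out to the same 38% reduction the paper headlines, so the two statistics are the same finding expressed two ways.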
What the Methodology Actually Proves
This isn't another correlation report or a vendor-funded survey. The setup was a Chrome extension that randomly assigned users into groups. The assignment was hidden, and 95% of the "hide AI Overviews" group didn't notice anything was missing.
The pre-registration on the AEA RCT Registry means the researchers committed to what they would measure before they measured it. That detail matters. It rules out the kind of after-the-fact framing that makes most search studies useless.
Pre-registered, randomized, hidden assignment. That's about as clean as web experiments get without owning the search engine itself.
Operational reality: when AI Overviews show up, almost two out of every five clicks that would have gone to publishers don't happen. The clicks didn't go to a different site. They didn't happen at all.
The Satisfaction Number Is the Story
Most of the defense of AI Overviews has rested on user preference. Google's argument has been that users like them. Some publishers have conceded the trade-off: traffic for utility.
The study measured satisfaction on a five-point Likert scale across the three groups. It also measured perceived information quality and ease of finding information.
No difference. None.
The "users prefer AI answers" claim doesn't survive contact with this data. The hide-AIO group reported the same satisfaction as the AI Mode group. They didn't know what they were missing. They didn't miss it either.
That changes the framing entirely. AI Overviews aren't a trade between publisher traffic and user value. They're a transfer. Google captures more time on platform. Publishers lose the click. The user gets the same experience either way.
If you've been telling your CMO that AI Overviews are good for users so the click loss is "worth it", that argument is now empirically false.
What This Means for Your Marketing Plan
A 38% click reduction on queries where AI Overviews appear, applied to the 42% of queries that trigger them, adds up to something material at the channel level. Categories with high AIO trigger rates lose more. Some commercial-intent categories I've seen in client data are closer to 60% click loss.
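The channel-level math is simple multiplication. A back-of-envelope estimate, assuming clicks are spread roughly evenly across queries (a simplification; AIO-triggering queries may carry more or less click volume than average):

```python
# Back-of-envelope: channel-level organic click loss from AI Overviews.
# Assumes clicks are spread roughly evenly across queries (simplification).
aio_trigger_rate = 0.42   # share of queries that trigger an AI Overview
click_loss_on_aio = 0.38  # click reduction on those queries

channel_loss = aio_trigger_rate * click_loss_on_aio
print(f"Estimated channel-level click loss: {channel_loss:.0%}")  # → 16%
```

Roughly a sixth of total organic clicks, before accounting for category mix. If your categories trigger AIOs above the 42% average, the number is worse.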
The shift isn't "AI is going to change search soon". Search already changed. The real question is whether your reporting changed with it.
A performance dashboard built around impressions, clicks, and rank positions is now measuring maybe 60% of the visibility surface. The other 40% lives inside answers, citations, and AI references. If your team is celebrating a stable ranking on a query that's quietly losing clicks to an AI summary, you're celebrating an artifact.
What I tell our agency clients at difrnt., and what GEOflux.ai is built to track: rebuild the visibility report so it shows AI presence alongside organic position. Track citations across ChatGPT, Gemini, Perplexity. Track which sources AI Overviews pull from, which they don't, and what your share of voice looks like across both surfaces.
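To make "share of voice across both surfaces" concrete, here's a minimal sketch of the arithmetic a citation-share report needs. Every brand name and count below is invented for illustration; this is not GEOflux.ai's implementation, just the shape of the metric:

```python
# Hypothetical citation counts per AI surface over one reporting window.
# Every brand name and number below is invented for illustration.
citations = {
    "ChatGPT":    {"our_brand": 14, "competitor_a": 22, "competitor_b": 9},
    "Gemini":     {"our_brand": 8,  "competitor_a": 11, "competitor_b": 6},
    "Perplexity": {"our_brand": 19, "competitor_a": 12, "competitor_b": 4},
}

def share_of_voice(surface_counts: dict, brand: str) -> float:
    """One brand's citations as a fraction of all tracked citations on a surface."""
    total = sum(surface_counts.values())
    return surface_counts[brand] / total if total else 0.0

for surface, counts in citations.items():
    print(f"{surface}: {share_of_voice(counts, 'our_brand'):.0%}")

# Blended share across surfaces (unweighted; a real report might weight by query volume).
overall = sum(c["our_brand"] for c in citations.values()) / sum(
    sum(c.values()) for c in citations.values()
)
print(f"Overall citation share of voice: {overall:.0%}")
```

The point of the sketch: citation share is a per-surface ratio, and it can diverge sharply from your organic rank position on the same queries, which is exactly why it belongs on the dashboard next to rank, not instead of it.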
This isn't an optimization. It's a measurement reset.
The brands that move first don't win because they're better at SEO. They win because they're already optimizing for the metric that matters in 2026. Citation share, not rank share.
The traffic isn't coming back. Build for what's actually there.
FAQ
How big is the AI Overviews click reduction?
A randomized field experiment with 1,065 U.S. desktop Chrome users found a 38% reduction in organic clicks on queries where AI Overviews appeared. Zero-click searches rose from 54% to 72% on those queries. Outbound clicks per search dropped from 0.61 to 0.38.
Do users actually prefer AI Overviews?
The same study found no measurable satisfaction difference between users who saw AI Overviews and users who didn't. Information quality, satisfaction, and ease of finding information were statistically equivalent across all three test groups. The pre-registered methodology rules out cherry-picked results.
What should brands do about declining AI search traffic?
Rebuild measurement to track citation presence in AI answers, not just blue-link rankings. Tools like GEOflux.ai measure brand mentions across ChatGPT, Gemini, Perplexity, and other LLM surfaces, which is the new visibility layer. Optimizing for citation share is now a separate discipline from traditional SEO.
