ChatGPT has 17.6% of Google’s query volume. It sends 96% less traffic to websites. Most brands look at those two numbers and move on. They shouldn’t — because both statistics are measuring the wrong thing.

The unit of AI search isn’t the query. It’s the decision.

Marketers have spent two decades optimizing for Google clicks. Impressions, CTR, sessions, bounce rate — an entire discipline built around the assumption that the click is the moment that matters. That assumption is quietly breaking. A single user on ChatGPT is doing something a Google click almost never does: forming a complete, synthesized opinion about your brand before they ever visit your site. Possibly before they ever search for you at all.

The numbers below make that concrete.


The Data, Side by Side

Google Search
A distributed journey

6m 12s: average session duration (First Page Sage, Q2 2026)
2–3: average queries per session (ALM Corp, 2026)
60–65%: searches that end with zero clicks to any website (Superprompt, Q1 2026)
360: clicks to the open web per 1,000 searches (Neotype, 2025)
−61%: organic CTR drop when an AI Overview appears (Superprompt, 2025)
~5 words: average query length (WebFX, 2025)

ChatGPT Session
One complete research journey

13m 9s: average session duration (First Page Sage, Q2 2026)
348 words: average per conversation, roughly 70× a typical Google query (WebFX, 13,252 real conversations)
1,686 chars: average response length, with 22 sentences and 10+ sources cited (WebFX, 2025)
93.7%: share of ChatGPT queries that are informational, the opinion-formation stage (NP Digital via Rosemont Media)
80%: share of B2B buyers who now begin their research in AI tools, not Google (ABM Agency, 2026)
2.5B: prompts processed by ChatGPT per day (OpenAI via Panto AI, July 2025)

A Note on One Widely Cited Number

Methodology context

Semrush’s 17-month clickstream study puts ChatGPT’s average queries per session at 1.75. That number requires context. Semrush measures clickstream data: page loads and URL visits tracked at the browser level. A ChatGPT session is a single URL. Everything that happens inside the conversation — every follow-up, every refinement, every comparison — is invisible to their methodology. They count how many times someone opened a new chat, not how many exchanges happened inside it.

WebFX’s analysis of 13,252 actual ChatGPT conversations found 1.7 messages per session — which looks similar to Semrush’s figure. But those messages average 348 words each, versus a typical Google query of ~5 words. The surface number is comparable. What it represents is not. Semrush’s core business is traditional SEO tooling. Their methodology cannot see inside a ChatGPT conversation. That is a description of what clickstream data measures — not an accusation.
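The gap between the two methodologies can be shown with a toy calculation. Everything below is invented for illustration; it is not Semrush's or WebFX's actual pipeline, just a sketch of what browser-level clickstream data can and cannot see.

```python
# Toy illustration: the same research session, measured two ways.
# All conversation data below is invented (and truncated) for illustration.

# One ChatGPT conversation. A browser-level tracker sees a single URL load;
# message-level analysis sees every exchange inside it.
conversation = [
    {"role": "user", "text": "I'm evaluating CRM tools for a 40-person sales team ..."},
    {"role": "assistant", "text": "Here are the main options ..."},
    {"role": "user", "text": "How does onboarding compare between the top two?"},
    {"role": "assistant", "text": "Onboarding differs in ..."},
]

# Clickstream view: one URL visit counts as one "query".
clickstream_queries = 1

# Message-level view: count the user's actual exchanges and their depth.
user_messages = [m for m in conversation if m["role"] == "user"]
message_count = len(user_messages)
total_words = sum(len(m["text"].split()) for m in user_messages)

print(f"clickstream sees: {clickstream_queries} query")
print(f"message-level sees: {message_count} exchanges, {total_words} words of context")
```

The follow-up question in the sketch is exactly the kind of exchange that never appears in clickstream data: no new URL loads, so from the browser's point of view nothing happened.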


What a Single ChatGPT User Is Actually Worth

This is the reframe marketers need.

For two decades, a Google click meant someone saw a headline and tapped it. Maybe they bounced in 8 seconds. Maybe they skimmed a page and left. The average Google session spans 6 minutes across 2–3 queries — and 60–65% of searches don’t even produce a click. Marketers have been optimizing a metric where the majority outcome is nothing.

A single ChatGPT user in a research session is something categorically different. They are 13 minutes in. They have typed 348 words of context. They are reading a synthesized answer that cites 10+ sources. They are following up. They are refining. And 93.7% of the time, they are in pure opinion-formation mode.

That user will not arrive at your site with an open mind. They will arrive already holding a narrative about you — accurate or not, favorable or not — assembled by an AI that may have pulled from a press release you published two years ago, a Reddit thread you never saw, or a competitor’s positioning that quietly frames you as the budget option.

Google gave brands a distributed journey. Multiple sessions, multiple days, multiple recovery chances. The B2B buyer now uses an average of 10 interaction channels before deciding (McKinsey via Ahrefs), up from 5 in 2016. You had surface area across all of them.

ChatGPT removes most of that surface area and concentrates it into one session. 80% of B2B buyers now begin their research in AI tools (ABM Agency, 2026). Gartner projects that traditional search volume will decline 25% by 2026 — and the queries migrating first are not the transactional ones. Those still go to Google. The ones migrating are the research queries: “what’s the best tool for X,” “how does this compare,” “what are companies like mine actually using.” The ones that form the shortlist.

Those queries now get answered once. In 13 minutes. Before your brand ever gets a chance to speak.


Your Mention Count Misses All of This

Most brand monitoring tools report mentions. They count how often your brand name appears in AI outputs across a set of tracked prompts. The number looks clean. Management appreciates it.

It measures almost nothing that matters.

ChatGPT processes 2.5 billion prompts per day. Your tracking tool is watching a few dozen. Of the answers it does catch, it tells you you were mentioned — but not what was said. Not whether the description was accurate. Not whether a competitor was named alongside you as the preferred option. Not whether the AI characterized you as the budget alternative, the complicated one, the one that’s hard to onboard.

Those characterizations land inside a 13-minute session on a buyer who has no particular reason to verify them. They become the starting point for every click, every demo request, every deal that follows — or doesn’t.
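The difference between counting a mention and reading the characterization around it can be sketched in a few lines. The answer text, brand name, and keyword list below are all invented, and the keyword check is a stand-in for real sentiment or framing analysis:

```python
# Toy sketch: a mention counter vs. the framing a buyer actually reads.
# The answer text, brand, and frame list are invented for illustration.

answer = (
    "For most teams, AcmeCRM is the stronger pick. BudgetCRM is a workable "
    "budget alternative, though users report it is hard to onboard."
)

brand = "BudgetCRM"

# What a mention counter reports: the brand appeared. The metric looks clean.
mentioned = brand in answer

# What the buyer actually reads: how the brand is framed in the same answer.
negative_frames = ["budget alternative", "hard to onboard", "complicated"]
frames = [f for f in negative_frames if f in answer]

print(f"mentioned: {mentioned}")   # the number management sees
print(f"framing flags: {frames}")  # the story the mention count misses
```

In this toy example the mention counter reports a clean hit while the answer itself positions the brand as the cheap, hard-to-onboard option: the exact gap the paragraph above describes.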

The mention count is the easy part. The story behind it is what matters, and if you don't have it, you find out the hard way.


What Showing Up Actually Requires

Session depth changes what brand visibility means. It is not enough to appear. You have to appear accurately, consistently, and in the right context — because for a growing share of buyers, the AI answer is the research journey, not the start of it.

The brands that hold up are the ones whose narrative is consistent everywhere AI can read: their site, their press coverage, their review profiles, their category positioning. When a buyer asks a follow-up question and the AI pulls from a different source, the story needs to match.

Run the scan. See what ChatGPT actually says about your brand across the prompts your buyers are running. See if the story holds.

Sources: First Page Sage, Q2 2026 · ALM Corp, 2026 · WebFX: 13,252 ChatGPT Conversations · Semrush 17-Month Clickstream Study · Superprompt Zero-Click Report, Q1 2026 · Neotype Zero-Click Study, 2025 · NP Digital via Rosemont Media · ABM Agency B2B AI Buyer Journey, 2026 · Ahrefs / McKinsey B2B SEO Statistics · OpenAI via Panto AI, 2026 · Gartner via Psyke