Claude Sent Us 63 Readers Last Month: The First Measurable AI-Referral Channel for Publishers

Short version: In the last 29 days, Claude, ChatGPT, Perplexity, Microsoft Copilot, Gemini, NotebookLM, and Kagi collectively sent at least 94 new readers to tygartmedia.com — a site whose #1 content vertical is explaining Claude. Claude alone is now our #4 traffic source, ahead of Facebook, ahead of LinkedIn, ahead of every search engine except Google and Bing. The product is citing the publication that covers the product. That’s the loop. Here is what it looks like when you can actually measure it.

The finding that made me stop scrolling

I built a Claude-powered browser agent to poke around our GA4 account and surface “interesting stuff” a human analyst would miss. One of the first things it flagged was our Source/Medium report. Here is the top of the list, unedited:

| Rank | Source / Medium | New Users (29 days) | Notes |
|---|---|---|---|
| 1 | (direct) / (none) | 738 | Mystery bucket |
| 2 | google / organic | 289 | Standard Google SEO |
| 3 | bing / organic | 70 | 1m 20s average session — high intent |
| 4 | claude.ai / referral | 63 | Claude itself |
| 5 | m.facebook.com | 43 | Mostly 4-second bounces |
| 6 | duckduckgo / organic | 41 | 1m 02s average |
| 13 | chatgpt.com / referral | 9 | ChatGPT |
| 15 | perplexity.ai / referral | 5 | Perplexity |
| 21 | copilot.com | 3 | Microsoft Copilot |
| 24 | gemini.google.com | 2 | Google Gemini |
| 28 | notebooklm.google.com | 1 | Google NotebookLM |
| 35 | kagi.com | 1 | Kagi AI results |

Add up everything with an AI-assistant referrer (the 84 attributed users in the table above, plus ten more sessions that GA4 logged as unassigned but that belong to ChatGPT and Perplexity) and the combined count is at least 94 new users in 29 days — roughly 6.7% of all new users on the site. Claude alone, at 63 referred users, is our #4 traffic source. It is ahead of Facebook. It is ahead of LinkedIn. It is ahead of every search engine except Google and Bing. And we have been cited, at least once, by every major AI surface in the English-speaking internet: Claude, ChatGPT, Perplexity, Microsoft Copilot, Gemini, NotebookLM, and Kagi.
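The arithmetic behind those two headline numbers is simple enough to show. This is a sketch, not a GA4 export: the attributed counts come from the table above, the unassigned counts from the attribution breakdown later in this post, and the 1,403 total new users is my own back-derivation from the stated 6.7% share, not a figure GA4 reported.

```python
# Attributed AI-assistant referrals from the GA4 Source/Medium table above.
attributed = {
    "claude.ai": 63,
    "chatgpt.com": 9,
    "perplexity.ai": 5,
    "copilot.com": 3,
    "gemini.google.com": 2,
    "notebooklm.google.com": 1,
    "kagi.com": 1,
}
# Sessions GA4 left unassigned but that belong to AI assistants.
unassigned = {"chatgpt.com": 5, "perplexity.ai": 5}

total_ai = sum(attributed.values()) + sum(unassigned.values())
print(total_ai)  # 94

# ASSUMPTION: ~1,403 total new users, back-derived from the 6.7% figure.
TOTAL_NEW_USERS = 1403
share = total_ai / TOTAL_NEW_USERS
print(f"{share:.1%}")  # 6.7%
```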

Why this is different from “we show up in Google”

Generative Engine Optimization (GEO) is the practice of structuring content so that large language models cite it as a source inside their answers. It is the younger, messier cousin of SEO. Most publishers cannot yet prove it is working. The feedback loop is long, the data is hidden inside a chat window, and the traffic that does leak through often lands in a “(direct)” bucket with no attribution at all.

We can see ours. GA4, for reasons that are probably accidental, already records claude.ai, chatgpt.com, perplexity.ai, copilot.com, gemini.google.com, notebooklm.google.com, and kagi.com as discrete referral sources when a user clicks a citation link. That means AI-assistant traffic is measurable as a first-class channel right now, today, with the free version of Google Analytics, on any site that happens to get cited.

The poetic layer of what we are looking at: Claude is the top AI referrer to a website whose #1 content vertical is explaining Claude. The product is sending readers to the publication that covers the product. If that is not a GEO moat, I do not know what one looks like.

These are not bounced visitors. They are readers.

The single biggest worry with any new traffic source is that it might be garbage — bots, previews, accidental clicks. The engagement data says the opposite. Users arriving from claude.ai spend 23 seconds on average and produce 0.56 engaged sessions per user. ChatGPT referrals average 21 seconds and 0.44 engaged sessions per user. For context, the site-wide average engagement time is dragged down hard by in-app social browsers; the Facebook mobile webview, for example, sits at about 14 seconds with 4-second bounces.

People arriving from an AI assistant are not scrolling past. They clicked the citation because the AI told them this was the primary source, and when they got here they read. That is a qualitatively different kind of traffic than Facebook or a random Google search. These are the highest-intent non-search users we have.

The secondary finding: Seattle is reading for three minutes

The same GA4 pass surfaced a city-level pattern we were not expecting. Seattle readers — 61 of them in 29 days — spent an average of 3 minutes and 6 seconds on site at a 61.3% engagement rate. The site-wide average session is roughly 40 seconds. Seattle readers are spending about 4–5x longer on the page than the typical visitor, at nearly twice the engagement rate.

| City | Active Users | Engagement Rate | Average Time |
|---|---|---|---|
| Seattle | 61 | 61.3% | 3m 06s |
| The Dalles, OR | 31 | 0% | 1s |
| Shelton, WA | 26 | 27.6% | 15s |
| Des Moines | 24 | 37.5% | 10s |
| Beijing | 31 | 6.5% | 0s |
| Singapore | 28 | 21.4% | 5s |

A few things jump out. The Dalles, Oregon at 31 users / 0% engagement / 1 second is almost certainly Google’s data center there returning preview requests — ignore it. Shelton, Washington is a real Mason County hyperlocal beachhead; 26 actual humans in our home county in 29 days is a legitimate foothold for the local desk. Beijing at 31 users / 0 seconds has the classic signature of cloud-hosted scrapers. And Seattle at 3 minutes is the single most valuable city in our data and it is not close.
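The bot-versus-human call above can be reduced to a simple heuristic. This is a sketch using the city rows from the table; the thresholds (`rate_floor`, `seconds_floor`) are my own assumptions, not a GA4 feature, and real bot filtering deserves more signals than two.

```python
# City rows from the GA4 exploration: (city, users, engagement_rate, avg_seconds)
cities = [
    ("Seattle", 61, 0.613, 186),
    ("The Dalles, OR", 31, 0.0, 1),
    ("Shelton, WA", 26, 0.276, 15),
    ("Des Moines", 24, 0.375, 10),
    ("Beijing", 31, 0.065, 0),
    ("Singapore", 28, 0.214, 5),
]

def looks_automated(engagement_rate, avg_seconds,
                    rate_floor=0.10, seconds_floor=2):
    """Near-zero engagement AND near-zero dwell time together are the
    signature of data-center preview requests and scrapers."""
    return engagement_rate < rate_floor and avg_seconds < seconds_floor

flagged = [city for city, _, rate, secs in cities
           if looks_automated(rate, secs)]
print(flagged)  # ['The Dalles, OR', 'Beijing']
```

Both conditions have to fire at once: Beijing's 6.5% engagement alone would not flag it, and a slow-loading human session alone would not either.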

The browser split confirms an unusually technical audience

| Browser | Users | Engagement Rate |
|---|---|---|
| Chrome | 850 (60%) | 31.3% |
| Safari | 232 (16%) | 32.7% |
| Edge | 99 (7%) | 62.3% |
| Firefox | 33 (2.3%) | 60.5% |

Edge at 62.3% engagement and Firefox at 60.5% engagement are not normal consumer numbers. A typical general-interest site sees those two browsers hovering in the 5–15% range. Microsoft Edge is the default on corporate-managed Windows machines. Firefox is the dev-preferred privacy browser. The combination of high Edge engagement, high Firefox engagement, and a Claude-heavy referral list all point at the same audience: developers and technical professionals at real companies, reading on managed workstations.

How to measure AI-assistant referrals in your own GA4

If you publish anything technical and want to see your own version of this number, the fastest path is a custom GA4 exploration with one segment. Open GA4 → Explore → Free Form. Add a segment with this condition:

Session source contains one of:
  claude.ai
  chatgpt.com
  perplexity.ai
  perplexity
  copilot.com
  gemini.google.com
  notebooklm.google.com
  kagi.com
  you.com
  phind.com

Break it down by landing page, engagement rate, and average engagement time. That is your AI-Referral dashboard. Watch it weekly. A non-trivial number of sites will discover they already have measurable AI traffic and never bothered to look.
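If you export GA4 rows and want to bucket sessions yourself rather than click through the Explore UI, the segment above can be wrapped in a small classifier. This is a sketch: the domain list is simply the one from the segment, and the suffix match is my choice for catching subdomains (it stands in for the bare `perplexity` contains-match in the segment).

```python
# AI-assistant referrer domains, mirroring the GA4 segment above.
AI_SOURCES = {
    "claude.ai", "chatgpt.com", "perplexity.ai", "copilot.com",
    "gemini.google.com", "notebooklm.google.com", "kagi.com",
    "you.com", "phind.com",
}

def is_ai_referral(session_source: str) -> bool:
    """True if a GA4 session source is an AI assistant, including
    subdomains such as www.perplexity.ai."""
    host = session_source.lower().strip()
    return any(host == d or host.endswith("." + d) for d in AI_SOURCES)

print(is_ai_referral("claude.ai"))          # True
print(is_ai_referral("www.perplexity.ai"))  # True
print(is_ai_referral("m.facebook.com"))     # False
```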

Frequently asked questions

What is a GEO referral?

A GEO referral, or AI-assistant referral, is a visit to your site from a user who clicked a citation link inside an answer generated by a large language model such as Claude, ChatGPT, Perplexity, Microsoft Copilot, Gemini, NotebookLM, or Kagi. In Google Analytics 4 these visits appear as referral traffic from the assistant’s domain — for example claude.ai / referral or chatgpt.com / referral.

How many AI-referred users did tygartmedia.com receive in 29 days?

At least 94 new users across seven distinct AI assistants: 63 from Claude, 14 from ChatGPT (9 attributed + 5 unassigned), 10 from Perplexity (5 attributed + 5 unassigned), 3 from Microsoft Copilot, 2 from Gemini, 1 from NotebookLM, and 1 from Kagi. That is roughly 6.7% of all new users on the site for the period.

Are AI-assistant referrals real readers or bots?

Real readers. Average engagement time from claude.ai is 23 seconds and from chatgpt.com is 21 seconds, with engagement rates of 0.56 and 0.44 engaged sessions per user respectively. Those numbers are qualitatively higher than in-app social browser traffic (Facebook mobile webview averages about 14 seconds) and indicate a deliberate click-through from an AI citation, not a scraper.

Can any publisher measure AI-assistant referrals in GA4?

Yes. GA4 records visits from claude.ai, chatgpt.com, perplexity.ai, copilot.com, gemini.google.com, notebooklm.google.com, and kagi.com as discrete referral sources by default. Build a Free Form exploration with a segment that filters Session source on those domains and you will see the channel immediately if it exists for your site.

What is GEO in marketing?

GEO stands for Generative Engine Optimization. It is the practice of structuring web content, schema markup, and publishing signals so that large language models cite the content as a source inside AI-generated answers. GEO is to AI assistants what SEO is to search engines — the discipline of being the answer the machine hands to the reader.

The loop, and why it matters

The most interesting thing about this data is not the traffic. It is the feedback structure. Tygart Media publishes explainers about Claude. Claude crawls and cites those explainers. Readers click through from Claude’s answer back to tygartmedia.com. We publish more. Claude cites more. The site becomes, in effect, training data and a recommended source for the next iteration of the product it covers. That is the recursive loop that makes AI-native publishing a different business than search-era publishing.

I do not think every site can build this loop. It requires a narrow, technically-defensible topic — something an AI assistant would rather cite than paraphrase — and the patience to publish at a cadence LLMs reward. What I do think is that any publisher can check, today, whether the loop has quietly started forming underneath them. Most have not bothered. This post is partly a flex and partly an invitation: go look.

What happens next at Tygart Media

Three things. We are standing up a permanent AI-Referral channel in our GA4 so the number can be watched weekly instead of rediscovered quarterly. We are writing the playbook — the one this post hints at — for publishers who want to do the same. And we are building the browser agent that found this in the first place into a repeatable audit any publisher can run against their own GA4 in an afternoon. If that last one sounds useful, the newsletter is the place to follow along.

Claude sent us 63 readers last month. It will send more next month. We will be counting.
