- Superlines analyzed 34,234 AI responses across 10 platforms over 30 days and found citation volumes for the same brand differ by 615x between Grok (highest) and Claude (lowest).
- Grok cites sources at a 27.01% rate compared to ChatGPT's 0.59%, a 46x gap even between popular platforms, while Perplexity leads in driving actual clicks at 13.8%.
- “Ghost citations” are a newly identified phenomenon in which an AI platform links to your website without ever mentioning your brand name. Gemini cited superlines.io 182 times in 30 days without naming the brand once.
- Only 30% of brands maintain visibility between consecutive AI answers, and just 20% remain present across five consecutive runs of the same query.
What Happened
Superlines published research in March 2026 based on 34,234 AI responses collected across 10 platforms over a 30-day period from January 14 to February 13, 2026. The central finding: citation volumes for the same brand, the same content, during the same time period differ by a factor of 615 depending on which AI platform a user queries.
Grok produced the highest citation rate at 27.01% with 8.47% brand visibility. At the other end, Claude produced the lowest citation volume for the same brand. Between them sit Perplexity at 13.05% citation rate with 0.64% brand visibility, and Google AI Mode at 9.09% citation rate with 2.14% brand visibility. ChatGPT, the most widely used AI assistant, cited sources at just 0.59%.
Why It Matters
Most brands that track AI visibility at all monitor a single platform, typically ChatGPT or Perplexity. The Superlines data reveals that single-platform tracking creates a fundamentally incomplete picture. A brand can appear dominant on Grok and functionally invisible on Claude, with no way to detect the gap without multi-platform monitoring.
The gap between Grok’s 27.01% citation rate and ChatGPT’s 0.59% represents a 46x difference — and that is between two of the most widely used platforms. The full 615x spread between the highest and lowest platforms means that any brand visibility strategy built on data from one AI tool is likely misleading.
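The multiples quoted here follow directly from the per-platform rates. A quick sanity check of the Grok-to-ChatGPT gap, using the citation rates reported above:

```python
# Citation rates reported by Superlines (percent of responses containing citations).
grok_rate = 27.01
chatgpt_rate = 0.59

# Grok cites sources roughly 46x as often as ChatGPT.
gap = grok_rate / chatgpt_rate
print(round(gap))  # 46
```

The 615x figure compares the highest and lowest platforms (Grok and Claude); Claude's rate is not quoted in the article, so only the 46x ratio can be verified from the numbers given.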
“A brand can be thriving on one platform and invisible on another, and without multi-platform tracking, you would never know,” the Superlines research states. This echoes separate findings from RankScience that AI citation patterns vary so widely across platforms that optimizing for one can leave a brand entirely absent from another.
Technical Details
The research identified a phenomenon Superlines calls “ghost citations.” These occur when an AI platform links to a website in its response but never mentions the brand name associated with that website. During the 30-day study period, Gemini cited superlines.io 182 times but mentioned the brand name “Superlines” zero times. The AI was actively sending users to the site while keeping the brand itself anonymous.
Ghost citations create a measurement blind spot. Traditional brand monitoring tools that scan for brand name mentions would report zero visibility on Gemini, even though the platform was directing users to the site 182 times per month. Only tools that track URL-level citations alongside brand mentions can detect this pattern.
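Detecting the pattern programmatically means comparing URL-level citations against brand-name mentions for each response. A minimal sketch, assuming each response is represented as a dict with hypothetical `text` and `urls` fields (the Superlines data schema is not public):

```python
from urllib.parse import urlparse

def find_ghost_citations(responses, brand_name, brand_domain):
    """Return responses that link to brand_domain without naming the brand."""
    ghosts = []
    for resp in responses:
        links_to_site = any(
            urlparse(url).netloc.endswith(brand_domain) for url in resp["urls"]
        )
        names_brand = brand_name.lower() in resp["text"].lower()
        if links_to_site and not names_brand:
            ghosts.append(resp)
    return ghosts

# Example: the first response is a ghost citation (link present, name absent).
sample = [
    {"text": "See this optimization guide.", "urls": ["https://superlines.io/blog"]},
    {"text": "Superlines published research.", "urls": ["https://superlines.io"]},
]
print(len(find_ghost_citations(sample, "Superlines", "superlines.io")))  # 1
```

A tool built this way reports both counts separately, so a platform like Gemini shows 182 URL citations even when brand mentions sit at zero.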
The citation volatility data is equally striking. Only 30% of brands maintain visibility between two consecutive AI answers to the same query. Across five consecutive runs, just 20% of brands remain present. This means that even brands with strong AI visibility cannot count on consistent citation — the same query run five minutes apart may produce entirely different source selections.
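Volatility like this can be quantified by re-running the same query several times and checking brand persistence across the answers. A sketch, assuming each run is reduced to the set of brands it cited (a hypothetical representation, not the study's methodology):

```python
def consecutive_persistence(runs, brand):
    """Fraction of consecutive-run pairs in which the brand appears in both."""
    kept = sum(1 for a, b in zip(runs, runs[1:]) if brand in a and brand in b)
    return kept / (len(runs) - 1)

def present_in_all(runs, brand):
    """True only if the brand is cited in every run (the stricter 5-run test)."""
    return all(brand in run for run in runs)

# Five runs of the same query; "acme" drops out of runs 3 and 4.
runs = [{"acme", "globex"}, {"acme"}, {"globex"}, {"initech"}, {"acme"}]
print(consecutive_persistence(runs, "acme"))  # 0.25
print(present_in_all(runs, "acme"))           # False
```

The two metrics map to the study's two findings: roughly 30% of brands survive a single consecutive pair, and only 20% pass the all-five test.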
Citation behavior also varies by query type. Superlines data shows that Perplexity and Google AI Mode drive the most actual website clicks at 13.8% and 9.5% respectively, while platforms with higher raw citation counts like Grok produce less click-through traffic. High citation volume does not automatically translate to high referral traffic.
Who’s Affected
Marketing teams, brand managers, and SEO professionals face the most direct impact. Any team reporting AI visibility metrics based on a single platform is presenting an incomplete and potentially misleading picture to leadership. A brand that appears well-cited on Perplexity may be invisible on Claude, Gemini, and ChatGPT simultaneously.
The ghost citation phenomenon particularly affects companies that rely on brand recognition as a competitive advantage. If Gemini is linking to your product pages 182 times per month but never associating those links with your brand name, users may visit your site without any brand awareness — effectively turning your branded content into commodity referral traffic.
Agencies selling AI search optimization services are also affected. The 615x cross-platform variance means that client reporting must cover multiple AI platforms to be credible. An agency that optimizes and reports on ChatGPT visibility alone misses Grok entirely, where the same brand might actually have its strongest presence.
What’s Next
The Superlines data points toward a future where AI visibility monitoring requires the same multi-platform rigor that social media monitoring developed over the past decade. Just as brands learned they could not track social presence by monitoring Twitter alone, AI visibility demands coverage across ChatGPT, Perplexity, Gemini, Claude, Grok, Copilot, and emerging platforms.
The main limitation is tooling maturity. Superlines, Airefs, and a handful of other platforms offer multi-platform AI citation tracking, but the category is young and standardized metrics have not yet emerged. There is no equivalent of Google Search Console for AI citations — no single dashboard that provides authoritative, platform-endorsed visibility data.
For brands evaluating their AI citation strategy, the immediate step is to audit visibility across at least five AI platforms for their top 20 branded queries, with specific attention to ghost citations where URLs appear without brand mentions. Without that baseline, any AI optimization effort is operating on incomplete data.
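That baseline audit can be organized as a simple per-platform tally that counts citations, brand mentions, and ghost citations separately. A sketch, assuming audit rows are recorded as (platform, cited_url, mentioned_brand) tuples — a made-up schema for manually collected observations, not any vendor's export format:

```python
PLATFORMS = ["ChatGPT", "Perplexity", "Gemini", "Claude", "Grok"]

def build_baseline(observations):
    """Tally citations, brand mentions, and ghost citations per platform."""
    baseline = {p: {"citations": 0, "mentions": 0, "ghosts": 0} for p in PLATFORMS}
    for platform, cited, mentioned in observations:
        row = baseline[platform]
        if cited:
            row["citations"] += 1
        if mentioned:
            row["mentions"] += 1
        if cited and not mentioned:
            row["ghosts"] += 1
    return baseline

# Two Gemini responses link to the site without naming the brand.
obs = [("Gemini", True, False), ("Gemini", True, False), ("ChatGPT", False, True)]
print(build_baseline(obs)["Gemini"])  # {'citations': 2, 'mentions': 0, 'ghosts': 2}
```

Keeping the three counts separate is what makes ghost citations visible: a mentions-only tally would report Gemini as zero.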
