Core Web Vitals & AEO: Does Page Performance Affect AI Citations?
Do slow pages get cited less by AI? The relationship between Core Web Vitals, technical performance, and AI citation probability is more complex than most assume. Here's what the evidence shows.
Does page performance directly affect AI citation rates?
The direct answer: Core Web Vitals as measured by Google (LCP, INP, CLS) are not currently documented ranking or citation factors for AI answer engines. ChatGPT, Perplexity, and Claude don't receive user experience signals the way Google does — they process raw content, not user engagement data.
The nuanced answer: performance matters significantly for AI citations through indirect mechanisms. Content that can't be extracted because it loads via JavaScript, pages that time out during bot crawls, and sites that block AI crawlers due to server load issues all suffer citation losses that look like performance problems even if the root cause is technical accessibility, not speed.
The distinction that matters
Crawl-time vs. render-time performance
AI bots fetch pages in two modes: some (like Perplexity) render JavaScript; many others (including some versions of GPTBot) fetch only the initial HTML response. Pages where critical content is loaded via JavaScript after initial HTML delivery are partially or fully invisible to crawlers that don't execute JavaScript.
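The consequence of fetch-only crawling is easy to demonstrate with a toy comparison. This is a minimal sketch using made-up HTML strings and a hypothetical helper, not any bot's actual fetch logic:

```python
def content_visible_without_js(html: str, phrase: str) -> bool:
    """True if `phrase` appears in the raw HTML a non-rendering bot receives."""
    return phrase in html

# Server-rendered: the answer is in the initial HTML response.
server_rendered = (
    "<html><body><h1>Pricing</h1>"
    "<p>Plans start at $10/month.</p></body></html>"
)

# Client-rendered: the HTML ships an empty shell; the content arrives only
# after JavaScript runs, which a fetch-only bot never executes.
client_rendered = (
    "<html><body><div id='app'></div>"
    "<script>fetch('/api/pricing').then(r => r.json())"
    ".then(d => render(d))</script></body></html>"
)

phrase = "Plans start at $10/month."
print(content_visible_without_js(server_rendered, phrase))  # True
print(content_visible_without_js(client_rendered, phrase))  # False
```

The same pricing sentence exists on both "pages" once a browser renders them, but only the server-rendered version exposes it to a crawler that reads the raw response.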
Indirect effects of poor performance on citation probability
While Core Web Vitals don't directly influence AI citation algorithms, poor performance creates several conditions that reduce citation probability indirectly.
Crawl budget exhaustion
A high TTFB forces AI crawlers to spend more of each session waiting on your pages, reducing how many pages get crawled per visit. Low-priority pages miss their crawl windows.
Bot timeouts
Many AI bots have strict timeout thresholds (5–10 seconds). Pages exceeding this threshold may be partially indexed or skipped entirely.
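The effect of a timeout threshold can be simulated locally. In this sketch, the 2-second delay and the two thresholds are stand-ins; real crawler timeouts vary by bot and are not publicly documented:

```python
import http.server
import socket
import threading
import time
import urllib.error
import urllib.request

class SlowHandler(http.server.BaseHTTPRequestHandler):
    """Stand-in for a slow origin: waits 2 seconds before responding."""
    def do_GET(self):
        time.sleep(2)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"late content")
    def log_message(self, *args):
        pass  # keep the demo quiet

class QuietServer(http.server.HTTPServer):
    def handle_error(self, request, client_address):
        pass  # ignore broken pipes from clients that gave up

server = QuietServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

def fetch_within(url: str, timeout_s: float):
    """Fetch like a bot with a hard timeout; None means the crawl failed."""
    try:
        with urllib.request.urlopen(url, timeout=timeout_s) as resp:
            return resp.read()
    except (TimeoutError, socket.timeout, urllib.error.URLError):
        return None

impatient = fetch_within(url, timeout_s=0.5)  # gives up before the page responds
patient = fetch_within(url, timeout_s=5.0)    # same page, longer threshold
print(impatient, patient)  # None b'late content'
server.shutdown()
```

The page content is identical in both cases; only the origin's response time decides whether the crawler ever sees it.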
Google ranking correlation
Google's search rankings (which do use Core Web Vitals) influence which pages AI bots treat as authoritative starting points for crawl discovery.
Crawl rate limits
Servers that respond slowly under crawl pressure may return 429 or 503 errors to AI bots, causing content to be deprioritized in their indexes.
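You can check what status code your server actually returns to a crawler's user agent. A sketch, using a local stand-in server that always throttles; the "GPTBot/1.0" token is an abbreviated illustration, not OpenAI's full user-agent string:

```python
import http.server
import threading
import urllib.error
import urllib.request

def status_for_bot(url: str, user_agent: str) -> int:
    """Return the HTTP status your server gives a particular crawler UA."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx responses land here

# Local stand-in for an overloaded origin that throttles crawlers.
class ThrottlingHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(429)  # Too Many Requests
        self.end_headers()
    def log_message(self, *args):
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), ThrottlingHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

code = status_for_bot(f"http://127.0.0.1:{server.server_port}/", "GPTBot/1.0")
print(code)  # 429 -- the crawler is being throttled on this page
server.shutdown()
```

Run the same check against your own URLs: repeated 429 or 503 responses to bot user agents are a sign your content is being deprioritized at the source.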
JavaScript rendering and content visibility for AI
The most impactful technical AEO issue is not speed — it's JavaScript-dependent content. If your key informational content, schema markup, or FAQs only appear after JavaScript executes, a significant portion of AI bots will never see them.
Test your pages by viewing the raw source (View Source shows the server response; the DOM inspector shows the rendered result) or by disabling JavaScript and reloading. Every piece of content that is missing from the raw response is a citation risk. Content that matters for AI visibility should be in the server-rendered HTML, not appended by client-side JavaScript.
Schema injected via client-side JavaScript is unreliable: crawlers that read only the initial HTML response never see JSON-LD that scripts add after load, so structured data should be emitted server-side.
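A parser-side sketch makes the point concrete. The regex-based extractor below is illustrative (a real consumer would use a proper HTML parser), and the sample pages are made up:

```python
import json
import re

def extract_jsonld(html: str) -> list:
    """Pull JSON-LD blocks out of raw HTML (regex sketch, not a full parser)."""
    pattern = r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>'
    blocks = []
    for raw in re.findall(pattern, html, re.DOTALL | re.IGNORECASE):
        try:
            blocks.append(json.loads(raw))
        except json.JSONDecodeError:
            pass  # malformed schema is equally invisible to parsers
    return blocks

# Schema shipped in the server response: visible to any fetch-only bot.
server_rendered = """<head><script type="application/ld+json">
{"@type": "FAQPage", "name": "Shipping FAQ"}
</script></head>"""

# Schema created by a script after load: absent from the raw response.
js_injected = """<head><script>
var s = document.createElement('script');
s.type = 'application/ld+json';
</script></head>"""

print(extract_jsonld(server_rendered))  # [{'@type': 'FAQPage', 'name': 'Shipping FAQ'}]
print(extract_jsonld(js_injected))      # []
```

To a bot that never executes the script, the second page simply has no structured data.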
TTFB and crawl reliability
Time to First Byte (TTFB) is the one performance metric with a direct correlation to AI crawl quality. A high TTFB means AI bots spend more clock time waiting for content to begin arriving. Combined with timeout thresholds, this can cause complete crawl failures on slow-responding pages.
Target TTFB under 200ms for pages you want consistently crawled and cited. Use CDN edge caching, static generation, or edge rendering to achieve this, especially for high-priority pages.
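A rough TTFB probe can be scripted in a few lines. This sketch times the gap between issuing a request and the first response byte; dedicated tooling (e.g. curl's timing variables or field data from real users) is more precise, and the local demo server exists only so the example runs anywhere:

```python
import http.server
import threading
import time
import urllib.request

def measure_ttfb(url: str, timeout_s: float = 10.0) -> float:
    """Seconds from issuing the request to receiving the first response byte."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout_s) as resp:
        resp.read(1)  # first body byte has arrived
    return time.perf_counter() - start

# Local demo server; in practice, point measure_ttfb at your priority pages.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

ttfb = measure_ttfb(f"http://127.0.0.1:{server.server_port}/")
print(f"TTFB: {ttfb * 1000:.1f} ms")  # aim for < 200 ms on priority pages
server.shutdown()
```

Running a probe like this from several regions approximates what geographically distributed crawlers experience, which is exactly what CDN edge caching improves.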