Technical AEO

Core Web Vitals & AEO: Does Page Performance Affect AI Citations?

Feb 14, 2025 · 7 min read

Do slow pages get cited less by AI? The relationship between Core Web Vitals, technical performance, and AI citation probability is more complex than most assume. Here's what the evidence shows.

Does page performance directly affect AI citation rates?

The direct answer: Core Web Vitals as measured by Google (LCP, INP, CLS) are not currently documented ranking or citation factors for AI answer engines. ChatGPT, Perplexity, and Claude don't receive user experience signals the way Google does — they process raw content, not user engagement data.

The nuanced answer: performance matters significantly for AI citations through indirect mechanisms. Content that can't be extracted because it loads via JavaScript, pages that time out during bot crawls, and sites that block AI crawlers due to server load issues all suffer citation losses that look like performance problems even if the root cause is technical accessibility, not speed.

The distinction that matters

User-facing performance (how fast your page loads for a visitor) is mostly irrelevant to AI citation. Bot-facing extractability (whether AI crawlers can access and parse your content) is critical. These are different problems with different solutions.

Crawl-time vs. render-time performance

AI bots fetch pages in two modes: some (like Perplexity) render JavaScript; many others (including some versions of GPTBot) fetch only the initial HTML response. Pages where critical content is loaded via JavaScript after initial HTML delivery are partially or fully invisible to crawlers that don't execute JavaScript.

| Content delivery method | Bot visibility | Citation risk |
| --- | --- | --- |
| Static HTML (SSG) | Full visibility to all bots | None |
| Server-side rendered (SSR) | Full visibility to all bots | None |
| Client-side rendered (CSR) | Invisible to non-JS bots | High — content may not be indexed |
| Lazy-loaded content | Partial — above-fold only for some bots | Medium — depends on content position |
| API-fetched content | Invisible without JS execution | Very high — content missed entirely |
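One way to see which row of this table a page falls into is to fetch it the way a non-JS bot would and measure how much visible text the initial HTML actually contains. A minimal sketch, assuming Node 18+ with the global fetch API; the URL is a hypothetical placeholder:

```ts
// Fetch the initial HTML response, as a non-JS crawler would,
// and estimate how much human-readable text it contains.
const url = "https://example.com/article"; // hypothetical

const html = await (await fetch(url)).text();

// Strip scripts, styles, and tags to approximate bot-visible text.
const visibleText = html
  .replace(/<script[\s\S]*?<\/script>/gi, "")
  .replace(/<style[\s\S]*?<\/style>/gi, "")
  .replace(/<[^>]+>/g, " ")
  .replace(/\s+/g, " ")
  .trim();

console.log(`Bot-visible characters: ${visibleText.length}`);
// A near-empty result suggests CSR: the content only exists after JS runs.
```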

Indirect effects of poor performance on citation probability

While Core Web Vitals don't directly influence AI citation algorithms, poor performance creates several conditions that reduce citation probability indirectly.

Crawl budget exhaustion

Slow TTFB causes AI crawlers to spend more time on your pages, reducing how many pages get crawled per session. Low-priority pages miss crawl windows.

Bot timeouts

Many AI bots have strict timeout thresholds (5–10 seconds). Pages exceeding this threshold may be partially indexed or skipped entirely.
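You can mimic a bot-style timeout with an aborted fetch to see whether a page would survive it. A minimal sketch, assuming Node 18+; the 8-second window and the URL are illustrative assumptions, not documented bot behavior:

```ts
// Fetch a page with a hard deadline, the way a strict crawler might.
const BOT_TIMEOUT_MS = 8_000; // within the 5–10 s range cited above

async function crawlWithTimeout(url: string): Promise<string | null> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), BOT_TIMEOUT_MS);
  try {
    const res = await fetch(url, { signal: controller.signal });
    return await res.text(); // the body must also finish within the window
  } catch {
    return null; // timed out or failed: a bot would skip or partially index
  } finally {
    clearTimeout(timer);
  }
}

const body = await crawlWithTimeout("https://example.com/slow-page"); // hypothetical
console.log(body === null ? "Would be skipped by a strict bot" : "Fetched within the timeout");
```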

Google ranking correlation

Google's search rankings (which do use Core Web Vitals) influence which pages AI bots treat as authoritative starting points for crawl discovery.

Crawl rate limits

Servers that respond slowly under crawl pressure may return 429 or 503 errors to AI bots, causing content to be deprioritized in their indexes.

JavaScript rendering and content visibility for AI

The most impactful technical AEO issue is not speed — it's JavaScript-dependent content. If your key informational content, schema markup, or FAQs only appear after JavaScript executes, a significant portion of AI bots will never see them.

Test your pages by disabling JavaScript in your browser and reloading, or by inspecting the raw HTML source directly. Every piece of content that disappears is a citation risk. Content that matters for AI visibility should be in the server-rendered HTML, not appended by client-side JavaScript.
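The same test can be scripted. A minimal sketch, assuming Node 18+ with the global fetch API; the URL and the phrases to check are hypothetical placeholders:

```ts
// Fetch the raw HTML (no JS execution) and confirm each key phrase
// is already present before any client-side code runs.
const pageUrl = "https://example.com/pricing"; // hypothetical
const mustAppear = [
  "Frequently asked questions", // FAQ block
  "application/ld+json",        // schema markup
  "$49/month",                  // a key fact you want cited
];

const rawHtml = await (await fetch(pageUrl)).text();

for (const phrase of mustAppear) {
  const found = rawHtml.includes(phrase);
  console.log(`${found ? "OK     " : "MISSING"} ${phrase}`);
  // Anything MISSING here only exists after client-side JS runs,
  // so non-JS AI crawlers will never see it.
}
```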

Schema injected via client-side JavaScript is unreliable

JSON-LD schema added to a page via JavaScript (appending a script tag via React, Vue, etc.) may not be parsed by AI crawlers that don't execute JS. Always serve JSON-LD schema in the initial HTML response, either via SSR or static generation.
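What "in the initial HTML response" looks like in practice: a minimal sketch using Node's built-in http module, with a placeholder FAQPage schema. Any SSR framework achieves the same result; the point is that the JSON-LD ships in the first byte stream rather than being injected later:

```ts
import { createServer } from "node:http";

// Placeholder schema object; real pages would build this from page data.
const faqSchema = {
  "@context": "https://schema.org",
  "@type": "FAQPage",
  mainEntity: [{
    "@type": "Question",
    name: "Does page speed affect AI citations?",
    acceptedAnswer: { "@type": "Answer", text: "Only indirectly; extractability matters more." },
  }],
};

createServer((_req, res) => {
  res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
  // The JSON-LD is part of the initial HTML document, so even crawlers
  // that never execute JavaScript can parse it.
  res.end(`<!doctype html>
<html><head>
  <script type="application/ld+json">${JSON.stringify(faqSchema)}</script>
</head><body>
  <h1>Does page speed affect AI citations?</h1>
</body></html>`);
}).listen(3000);
```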

TTFB and schema parsing reliability

Time to First Byte (TTFB) is the one performance metric with a direct correlation to AI crawl quality. A high TTFB means AI bots spend more clock time waiting for content to begin arriving. Combined with timeout thresholds, this can cause complete crawl failures on slow-responding pages.

Target TTFB under 200ms for pages you want consistently crawled and cited. Use CDN edge caching, static generation, or edge rendering to achieve this, especially for high-priority pages.
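TTFB can be approximated from a crawler's point of view by timing the gap between starting a request and receiving the first response bytes. A minimal sketch, assuming Node 18+; the URL is a hypothetical placeholder:

```ts
const target = "https://example.com/"; // hypothetical

const start = performance.now();
// fetch resolves once response headers arrive; reading the first body
// chunk gets us to the first content byte.
const res = await fetch(target);
const reader = res.body!.getReader();
await reader.read();
const ttfb = performance.now() - start;
void reader.cancel(); // we only needed the first chunk

console.log(
  `TTFB: ${ttfb.toFixed(0)} ms ${ttfb < 200 ? "(within target)" : "(above the 200 ms target)"}`,
);
```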

Technical priorities for AEO (ranked)

1. Ensure all key content is in server-rendered HTML — not dependent on JavaScript execution
2. Serve JSON-LD schema in the initial HTML document, not appended client-side
3. Keep TTFB under 200ms on key pages using CDN caching or static generation
4. Verify AI crawlers are not blocked in robots.txt (GPTBot, PerplexityBot, ClaudeBot); a quick check is sketched after this list
5. Fix 404s and redirect chains that waste crawl budget on high-priority pages
6. Submit XML sitemaps that include all pages you want crawled and cited
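For item 4, a coarse robots.txt check can be scripted. A minimal sketch, assuming Node 18+; it is a simple string scan against a hypothetical domain, not a full robots.txt parser:

```ts
const AI_BOTS = ["GPTBot", "PerplexityBot", "ClaudeBot"];

const robots = await (await fetch("https://example.com/robots.txt")).text(); // hypothetical

for (const bot of AI_BOTS) {
  // Grab the section for this user-agent, up to the next User-agent line.
  const section = robots.match(
    new RegExp(`User-agent:\\s*${bot}[\\s\\S]*?(?=User-agent:|$)`, "i"),
  );
  // Flag a blanket "Disallow: /" inside that section.
  const blocked = section ? /Disallow:\s*\/\s*$/im.test(section[0]) : false;
  console.log(`${bot}: ${blocked ? "BLOCKED by robots.txt" : "not explicitly blocked"}`);
}
```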