title:: JavaScript SEO: Rendering, Hydration, and Crawlability for React, Next.js, and Vue
description:: A developer's guide to JavaScript SEO covering server-side rendering, hydration pitfalls, crawlability for SPAs, and framework-specific solutions.
focus_keyword:: JavaScript SEO guide
category:: developers
author:: Victor Valentine Romo
date:: 2026.03.20

JavaScript SEO: Rendering, Hydration, and Crawlability for React, Next.js, and Vue

Quick Summary

- What this covers: How search engines render JavaScript, server-side rendering in Next.js, Nuxt, and Vue, hydration pitfalls, and migration paths for client-side rendered apps

- Who it's for: Developers and technical SEOs working on React, Next.js, Vue, or Angular applications

- Key takeaway: Server-render content that needs to rank; treat client-side rendering as an indexation risk, and verify what Googlebot actually receives.

JavaScript SEO is the practice of ensuring that search engine crawlers can discover, render, and index content generated by JavaScript frameworks. Googlebot uses a headless Chromium instance to render JavaScript, but the process introduces delays, resource constraints, and failure modes that don't exist with server-rendered HTML. If your site relies on client-side rendering for content that needs to rank, you have an SEO engineering problem.

The core challenge is render timing. Googlebot processes pages in two phases — crawl (fetches HTML) and render (executes JavaScript). The gap between these phases can range from seconds to days, and content that only exists after JavaScript execution is invisible during the crawl phase.

How Googlebot Processes JavaScript

The Two-Phase Indexing Pipeline

Phase one: Googlebot fetches the URL and receives the initial HTML response. This HTML is parsed immediately. Links discovered in this HTML are added to the crawl queue. Content present in this HTML is available for indexing.

Phase two: The fetched page enters the render queue. Googlebot's Web Rendering Service (WRS) executes JavaScript using a headless Chromium instance, producing the final DOM. Content that only appears after JavaScript execution becomes available for indexing at this point.

The problem: phase two is resource-constrained. Google maintains a rendering budget, and pages compete for rendering resources globally. High-profile sites get rendered quickly. Smaller sites may wait hours or days. During that gap, JavaScript-dependent content is invisible to search.

What Googlebot Can and Cannot Render

Googlebot renders most modern JavaScript, including React, Vue, Angular, and vanilla JS applications. It supports ES6+, async/await, Promises, Fetch API, and modern CSS. It does not support: WebSocket connections, service workers, IndexedDB in rendering context, or user interaction-dependent content (content that requires clicks, scrolls, or hovers to appear).

Content behind authentication walls, lazy-loaded content that requires scroll events, infinite scroll implementations without pagination fallbacks, and content rendered only after user interaction will not be indexed.

Rendering Budget and Crawl Efficiency

Every page Googlebot renders consumes computational resources. Sites with thousands of pages face a practical constraint: not every page will be rendered promptly. Crawl budget — the number of URLs Googlebot will crawl per session — is separate from rendering budget, but both limit how much of your site gets processed.

Reducing JavaScript execution time, minimizing render-blocking resources, and serving complete HTML without JavaScript dependency directly improve how efficiently Googlebot processes your site.

Server-Side Rendering: The SEO Default

Why SSR Solves Most JavaScript SEO Problems

Server-side rendering generates complete HTML on the server before sending it to the client. When Googlebot fetches the page, the HTML already contains all content, links, and metadata. No rendering queue. No JavaScript execution delay. No content visibility gap.

Next.js (React), Nuxt.js (Vue), and Angular Universal all provide SSR capabilities. For SEO-critical pages — landing pages, blog posts, product pages, category pages — SSR eliminates the primary technical risk.

SSR Implementation in Next.js

Next.js provides three rendering strategies relevant to SEO:

- getServerSideProps — renders on every request. Use for pages with frequently changing content where freshness matters for search.

- getStaticProps — renders at build time, optionally with Incremental Static Regeneration (ISR). Use for content that changes infrequently — blog posts, documentation, product descriptions. ISR lets you set a revalidation interval without rebuilding the entire site.

- generateStaticParams (App Router) — the App Router equivalent for generating static pages from dynamic routes.

For SEO purposes, static generation with ISR is the optimal strategy for most content: fast server response times, complete HTML on first request, and automated content freshness.
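As a sketch, a statically generated page with ISR might look like the following (the fetchProduct helper, the slug, and the one-hour revalidation interval are illustrative assumptions, not prescriptions from this guide):

```javascript
// pages/products/[slug].js — minimal ISR sketch (Pages Router).
// fetchProduct is a hypothetical stand-in for a CMS or database call.
async function fetchProduct(slug) {
  return { slug, name: `Product ${slug}` };
}

export async function getStaticProps({ params }) {
  const product = await fetchProduct(params.slug);
  return {
    props: { product }, // passed to the page component
    revalidate: 3600,   // re-render in the background at most once per hour
  };
}

export async function getStaticPaths() {
  // Pre-render the highest-traffic slugs at build time;
  // 'blocking' renders unknown slugs server-side on first request.
  return { paths: [{ params: { slug: 'widget' } }], fallback: 'blocking' };
}
```

With fallback: 'blocking', even pages not built ahead of time are served as complete HTML on first request, so crawlers never see an empty shell.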

SSR Implementation in Nuxt.js

Nuxt 3 defaults to universal rendering — SSR on first request with client-side navigation afterward. For SEO-critical pages, ensure the ssr: true configuration is active (it's the default). Use useAsyncData or useFetch composables to load data server-side.

Static site generation via nuxt generate produces fully pre-rendered HTML files. This approach works for content sites but requires rebuilds when content changes.
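A minimal configuration sketch covering both points above (the pre-rendered route list is illustrative; adapt it to your site):

```javascript
// nuxt.config.ts — SSR is already the Nuxt 3 default; stating it
// explicitly documents intent for the team.
export default defineNuxtConfig({
  ssr: true,
  nitro: {
    // Pre-render known static routes at build time (illustrative list).
    prerender: { routes: ['/', '/blog'] },
  },
})
```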

SSR Implementation in Vue with Vite

For Vue applications not using Nuxt, Vite's SSR support requires manual configuration. Create an entry point for server rendering that produces HTML strings, and a separate client entry for hydration. The complexity is significant compared to Nuxt's built-in SSR — use Nuxt unless you have specific reasons not to.

Hydration Pitfalls That Break SEO

What Hydration Does

Hydration attaches JavaScript event handlers to server-rendered HTML, making static markup interactive. The server sends complete HTML. The browser renders it immediately (fast first paint). Then JavaScript loads and "hydrates" the existing DOM rather than rebuilding it.

Hydration Mismatch Errors

When the server-rendered HTML doesn't match what the client-side JavaScript produces, React and Vue throw hydration mismatch errors. The framework may discard the server-rendered DOM and re-render from scratch, producing a flash of incorrect content or layout shift.

For SEO, hydration mismatches create two problems: Googlebot may index the server-rendered content that differs from the hydrated content, and layout shift during hydration degrades Core Web Vitals scores.

Common causes: date/time formatting that differs between server and client timezones, browser-specific APIs called during server render, conditional rendering based on window or document objects, and random ID generation that produces different values on server and client.

Fixing Hydration Issues

Wrap browser-only code in client checks. In React: use useEffect for browser-only logic. In Vue: use onMounted, or the ClientOnly component in Nuxt. In Next.js: use dynamic imports with ssr: false for components that cannot render on the server.
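Timezone-dependent date formatting, one of the mismatch causes listed above, has a deterministic fix: format from a fixed reference so server and client produce identical strings. A minimal sketch:

```javascript
// Formatting in UTC yields the same string on server and client,
// regardless of where either process runs.
function formatDateUTC(isoString) {
  return new Date(isoString).toISOString().slice(0, 10); // YYYY-MM-DD
}

// By contrast, toLocaleDateString() depends on the runtime's timezone
// and locale, so server-rendered and hydrated output can differ.
```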

Test hydration by comparing the server-rendered HTML (view-source:) against the rendered DOM (DevTools Elements panel). Any differences indicate hydration mismatches that need resolution.
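That manual comparison can be roughed out in code. A hedged sketch (whitespace-insensitive, but sensitive to real content differences):

```javascript
// Collapse insignificant whitespace so formatting differences don't
// mask real hydration mismatches.
function normalizeHtml(html) {
  return html.replace(/>\s+</g, '><').replace(/\s+/g, ' ').trim();
}

function hydrationMatches(serverHtml, renderedHtml) {
  return normalizeHtml(serverHtml) === normalizeHtml(renderedHtml);
}
```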

Client-Side Rendering: When Rankings Suffer

The CSR Problem for SEO

Single-page applications that render entirely on the client send a minimal HTML shell (typically an empty root element such as <div id="root"></div>) plus a JavaScript bundle that builds the DOM after loading. Googlebot must render this JavaScript to see any content.

For small sites with high authority, Googlebot renders CSR pages quickly enough that the delay is manageable. For large sites, new sites, or sites with complex JavaScript, CSR creates indexation gaps that range from annoying to catastrophic.

Diagnosing CSR Indexation Problems

In Google Search Console, use the URL Inspection tool. Compare "Rendered HTML" against "Crawled HTML." If the crawled HTML contains no meaningful content but the rendered HTML does, your content depends entirely on JavaScript rendering.

Test with JavaScript disabled in your browser. If the page displays nothing — no headlines, no body text, no navigation — search engines see the same nothing during their initial crawl pass.
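A crude version of this check can be scripted: strip scripts, styles, and tags from the raw HTML response and measure what text remains. This helper is a hypothetical illustration, not a substitute for the URL Inspection tool:

```javascript
// Returns the visible text length of a raw HTML string. A CSR shell
// typically scores near zero; a server-rendered page scores high.
function visibleTextLength(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, '')
    .replace(/<style[\s\S]*?<\/style>/gi, '')
    .replace(/<[^>]*>/g, ' ')
    .replace(/\s+/g, ' ')
    .trim()
    .length;
}
```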

Migrating from CSR to SSR

The migration path depends on your framework:

React SPA to Next.js: Extract route-level data fetching into getServerSideProps or getStaticProps. Replace client-side routing with Next.js file-based routing. The migration is incremental — start with your highest-traffic pages. Vue SPA to Nuxt: Move route components into the pages/ directory. Replace Vuex/Pinia client-side data loading with useAsyncData. Nuxt's migration guide covers the most common patterns. Angular SPA to Angular Universal: Add @nguniversal/express-engine. Configure server-side module alongside client module. The architectural changes are more invasive than React or Vue equivalents.

Dynamic Rendering: The Compromise

What Dynamic Rendering Is

Dynamic rendering serves pre-rendered HTML to search engine crawlers while serving the standard JavaScript application to users. A server-side component detects the user agent, and if it matches known crawler patterns (Googlebot, Bingbot, etc.), returns a pre-rendered HTML version.
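The user-agent check at the heart of this setup can be sketched as follows. The pattern list is illustrative only; production setups match a longer, maintained list and verify Googlebot via reverse DNS rather than trusting the header:

```javascript
// Illustrative crawler patterns — extend and maintain this list
// for production use.
const CRAWLER_UA = /googlebot|bingbot|yandex|baiduspider|duckduckbot/i;

function shouldServePrerendered(userAgent) {
  return CRAWLER_UA.test(userAgent || '');
}
```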

Google has acknowledged dynamic rendering as a valid approach, though they recommend SSR as the preferred long-term solution. Dynamic rendering adds infrastructure complexity (a rendering service like Rendertron or Puppeteer) and introduces a maintenance burden: two versions of every page must produce identical content.

When Dynamic Rendering Makes Sense

Large SPAs where SSR migration is prohibitively expensive. Applications with complex client-side state that's difficult to reproduce server-side. Hybrid applications where some routes need SSR and others don't justify the investment.

Implementation with Puppeteer or Rendertron

Deploy Rendertron (Google's open-source rendering service, now archived and no longer actively maintained) or a custom Puppeteer instance. Configure your web server or CDN to route crawler requests to the rendering service. Cache rendered HTML to avoid re-rendering on every crawl request.

The caching strategy matters: stale cache serves outdated content to crawlers, which creates indexation drift. Set cache TTLs that match your content freshness requirements — 24 hours for dynamic content, 7 days for evergreen content.
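A minimal TTL cache for prerendered HTML might look like this (the clock is injectable so expiry is testable; TTL values follow the guidance above):

```javascript
// Minimal TTL cache for prerendered HTML keyed by URL.
// `now` defaults to Date.now but can be injected for testing.
function createRenderCache(ttlMs, now = Date.now) {
  const store = new Map();
  return {
    get(url) {
      const entry = store.get(url);
      if (!entry || now() - entry.at >= ttlMs) return null; // missing or expired
      return entry.html;
    },
    set(url, html) {
      store.set(url, { html, at: now() });
    },
  };
}
```

In practice you would also cap the cache size and evict entries on content updates, so crawlers never receive HTML older than the page they would render themselves.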

Structured Data in JavaScript Applications

Injecting JSON-LD Dynamically

Structured data should appear in the initial HTML response, not be injected by client-side JavaScript. When using SSR, include the JSON-LD in a script tag of type application/ld+json within the server-rendered markup, so crawlers receive it on the first fetch.
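As a sketch, the JSON-LD payload can be built server-side and embedded during rendering. The field names follow schema.org's Article type; the helper and its inputs are illustrative:

```javascript
// Builds an Article JSON-LD string for embedding in server-rendered
// HTML inside a <script type="application/ld+json"> tag.
function articleJsonLd({ headline, authorName, datePublished }) {
  return JSON.stringify({
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline,
    author: { '@type': 'Person', name: authorName },
    datePublished,
  });
}
```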