April 2026

Why Google Can't See Your SPA (Even Though It Works in Your Browser)

Your app works. Google can't see it. That's not a tooling issue — it's how SPAs deliver content.

Most SPAs ship an empty HTML shell and rely on JavaScript to build the page. Browsers execute that JavaScript. Google often doesn't — at least not when it matters.

So Google indexes what it gets first: almost nothing.

What's Actually Happening

Your server responds with a minimal HTML file — a <div id="root">, a JS bundle link, and no real content.

What your server actually sends:

<!DOCTYPE html>
<html>
  <head>
    <title></title>
  </head>
  <body>
    <div id="root"></div>
    <script src="/assets/index.js"></script>
  </body>
</html>

Here's the divergence:

The browser

Executes JS → fetches data → builds the DOM → renders content. Everything works.

Googlebot

Fetches the same HTML → queues rendering for later (maybe) → indexes before or without full execution.

The SPA Rendering Gap

Server sends <div id="root"></div>

Browser path: executes JavaScript → fetches data → builds the DOM → full page rendered.

Googlebot path: queues rendering for later → maybe executes JS, eventually → empty or partial index.

"Google indexes the initial response far more reliably than the rendered result. If your content isn't in that first response, you're gambling."

What Most Guides Get Wrong

You'll hear advice like:

  • "Google can render JavaScript"
  • "Just optimize performance"
  • "Use dynamic rendering if needed"

Here's what actually happens:

  • Rendering is delayed — sometimes indefinitely
  • Failures are silent — no errors, just missing content
  • Heavy apps get partially rendered or skipped entirely

The dangerous assumption

"If it works in Chrome, Google sees it." That assumption is responsible for most SPA SEO failures we encounter.

What We See in Production

This isn't edge-case behavior. We see these patterns daily across hundreds of JavaScript apps:

1. Empty HTML at Crawl Time

Raw response: no text, no links, no structure. Result: pages indexed as empty — or dropped entirely.

2. Rendering Breaks on Real Data

This breaks in production when APIs are slow or return errors, auth/state blocks data fetching, or JS throws during hydration.

Result: missing sections, incomplete pages, inconsistent indexing across crawls.

3. Every Route Looks the Same

SPAs return the same HTML for /pricing, /features, and /docs. Content depends on JS routing. Google sees identical shells — result: duplicate or ignored pages.
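This failure mode can be sketched in a few lines: a hypothetical catch-all handler that answers every route with the same shell (the handler and shell string below are illustrative, not a real server):

```javascript
// Hypothetical SPA catch-all handler: every route gets the same shell.
const shell = `<!DOCTYPE html>
<html>
  <head><title></title></head>
  <body>
    <div id="root"></div>
    <script src="/assets/index.js"></script>
  </body>
</html>`;

// No matter which path is requested, the server's answer is identical.
function handle(path) {
  return shell;
}

console.log(handle("/pricing") === handle("/docs")); // true: Google sees duplicates
```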

4. Rendering Happens Too Late

Even when Google renders, it's queued behind other work, not guaranteed, and often too late for initial indexing. New pages don't rank. Updates take too long to appear.

Want to see this for yourself? Run your site through the Bot Test tool — it shows you exactly what bots receive vs what users see.

Solutions Compared: Prerender vs SSR vs Edge

There are three real options. Each has trade-offs.

Prerendering

Build step runs a headless browser → static HTML generated → same file for everyone.

Server-Side Rendering

Request hits the server → server renders on the fly → fresh HTML per request.

Edge Rendering

Proxy detects visitor type → bot gets a snapshot, user gets the SPA → best of both worlds.

1. Build-Time Prerendering

Generate HTML ahead of time. Run your SPA in a headless browser at build, capture the output, deploy as static files.

Works when

  • Content is static
  • Routes are limited

Breaks when

  • Content changes frequently
  • Routes are dynamic or large
  • You end up rebuilding constantly

We wrote a deep dive: Why Script-Based Prerendering Struggles with Modern Web Apps
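The mechanics reduce to "one route, one static file". A minimal sketch of the output mapping (the route list and dist layout are assumptions, not a real build config):

```javascript
// Hypothetical route list; a real build would read this from the router config.
const routes = ["/", "/pricing", "/features"];

// Each route becomes a static HTML file a headless browser fills in at build time.
function outputFile(route) {
  return route === "/" ? "dist/index.html" : `dist${route}/index.html`;
}

for (const route of routes) {
  console.log(`${route} -> ${outputFile(route)}`);
}
```

The "breaks when" column follows directly: any new or changed route means another build, and dynamic routes like /products/:id have no finite list to enumerate.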

2. Server-Side Rendering (SSR)

Render HTML on every request. The bot gets real content because the server executes the app before responding.

Works when

  • You control the full stack
  • You can absorb latency and complexity

Costs

  • More infrastructure
  • Slower responses under load
  • Tight coupling to the framework
  • Often means rewriting the app in Next.js

Most teams underestimate the operational cost. See: Dynamic Rendering vs Prerendering
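In essence, SSR moves the DOM-building step onto the server: each request runs the app and responds with finished HTML. A minimal sketch (renderPage and the data shape are illustrative, not a framework API):

```javascript
// Illustrative per-request render: the server builds real HTML before responding.
function renderPage({ title, body }) {
  return `<!DOCTYPE html>
<html>
  <head><title>${title}</title></head>
  <body>
    <div id="root"><h1>${title}</h1><p>${body}</p></div>
    <script src="/assets/index.js"></script>
  </body>
</html>`;
}

// The bot's first response now contains the content, no JS execution required.
const html = renderPage({ title: "Pricing", body: "Plans start at $9/mo." });
console.log(html.includes("Plans start at $9/mo.")); // true
```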

3. Edge Rendering (Snapshot + Proxy)

Serve pre-rendered HTML to bots at the edge. Users still get the SPA. No frontend rewrite required.

What happens

  • Bots get full HTML snapshots
  • Users get the normal SPA
  • AI crawlers get clean Markdown
  • Works with React, Vite, Lovable
  • Just a DNS change to set up

Trade-offs

  • Requires a proxy layer
  • Snapshot freshness needs management

We see this outperform SSR and prerender in real deployments because it removes the failure points instead of trying to manage them. More on how it works: DataJelly Edge
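The core of the proxy is a user-agent split: known crawlers get the stored snapshot, everyone else gets the SPA. A simplified sketch (the bot pattern and responses are illustrative; production detection also checks IP ranges and many more agents):

```javascript
// Illustrative bot detection; real proxies verify IPs and match more patterns.
const BOT_PATTERN = /googlebot|bingbot|gptbot|duckduckbot/i;

function respond(userAgent, snapshotHtml, spaShellHtml) {
  return BOT_PATTERN.test(userAgent) ? snapshotHtml : spaShellHtml;
}

const snapshot = "<h1>Pricing</h1><p>Full pre-rendered content</p>";
const spaShell = '<div id="root"></div>';

console.log(respond("Googlebot/2.1", snapshot, spaShell));          // snapshot
console.log(respond("Mozilla/5.0 (Macintosh)", snapshot, spaShell)); // SPA shell
```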

Practical Checklist

If you're unsure whether this is your problem, check these five things:

1. View Raw HTML

Right-click → View Source. If you don't see real content, Google doesn't either.

2. Hit a Deep Route Directly

Request /pricing or /features without JS. Does it return full content? If not, that route isn't indexable.

3. Test as Googlebot

Fetch with a bot user agent. Look for actual text, internal links, structured content. If it's missing, indexing will be incomplete.

4. Break Your API

Simulate slow responses or failed calls. Does the page still render? If not, Google will index broken states.

5. Compare Indexed Pages

Check search results: missing content? Duplicate titles? Empty snippets? That's your rendering problem showing up publicly.
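Check 4 has a concrete fix: render defensively, so a failed fetch still yields indexable HTML instead of a broken page. A hedged sketch (the function name and markup are illustrative):

```javascript
// Illustrative defensive render: a failed fetch still yields indexable HTML.
function renderSection(result) {
  if (!result.ok) {
    // Fallback keeps the page structurally complete instead of throwing.
    return "<section><p>Pricing is temporarily unavailable.</p></section>";
  }
  return `<section><h2>${result.data.heading}</h2><p>${result.data.text}</p></section>`;
}

console.log(renderSection({ ok: false }));
console.log(renderSection({ ok: true, data: { heading: "Pricing", text: "Plans from $9/mo." } }));
```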

Want a quick answer? Run the free visibility test — it shows exactly what bots see on your site in under 10 seconds.

Final Takeaway

If your server returns empty HTML, your SEO is broken. Full stop.

JavaScript rendering is not a reliable fallback. It's a best-effort system with no guarantees.

The teams that win here stop relying on Google to render their app and start giving Google exactly what it needs up front.

Quick Test: What Do Bots Actually See?

~30 seconds

Most people guess. Don't.

Run this test and look at the actual response your site returns to bots.

1. Fetch your page as Googlebot

Use your terminal:

curl -A "Googlebot" https://yourdomain.com

Look for:

  • Real visible text (not just <div id="root">)
  • Meaningful content in the HTML
  • Page size (should not be tiny)

2. Compare bot vs browser

Now test what a real browser gets:

curl -A "Mozilla/5.0" https://yourdomain.com

If these responses are different, Google is indexing a different page than your users see.

Stop guessing — measure it.
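One quick way to measure it is to strip tags from each response and count the words that remain. A rough sketch (regex-based text extraction is a simplification; real tooling parses the DOM):

```javascript
// Rough visible-word counter: drops scripts, styles, and tags, counts the rest.
function visibleWordCount(html) {
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, " ")
    .replace(/<style[\s\S]*?<\/style>/gi, " ")
    .replace(/<[^>]+>/g, " ");
  return text.split(/\s+/).filter(Boolean).length;
}

const botResponse = '<html><body><div id="root"></div><script src="/assets/index.js"></script></body></html>';
const browserDom = "<html><body><h1>Pricing</h1><p>Plans start at nine dollars a month.</p></body></html>";

console.log(visibleWordCount(botResponse)); // 0
console.log(visibleWordCount(browserDom));  // 8
```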

Real example: 253 words vs 13,547

We see this constantly. Here's a real example from production: Googlebot saw 253 words and 2 KB of HTML. A browser saw 13,547 words and 77.5 KB. Same URL — completely different content.

Figure: bot vs browser comparison showing 253 words for Googlebot vs 13,547 words for a rendered browser on the same URL.

If your HTML doesn't contain the content, Google doesn't either.

Compare Googlebot vs browser on your site → HTTP Debug Tool

3. Check for common failure signals

We see this all the time in production:

  • HTML under ~1KB → usually empty shell
  • Visible text under ~200 characters → thin or missing content
  • Missing <title> or <h1> → weak or broken page
  • Large difference between bot vs browser HTML → rendering issue
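Those thresholds are easy to script against a saved response. A sketch of the heuristics (the cutoffs mirror the rough numbers above and are not exact):

```javascript
// Rough failure-signal checks on a raw HTML response (thresholds are heuristics).
function failureSignals(html) {
  const text = html.replace(/<[^>]+>/g, " ").replace(/\s+/g, " ").trim();
  return {
    tinyHtml: html.length < 1024,  // under ~1 KB: usually an empty shell
    thinText: text.length < 200,   // under ~200 chars of visible text
    missingTitle: !/<title>[^<]+<\/title>/i.test(html),
    missingH1: !/<h1[\s>]/i.test(html),
  };
}

const emptyShell = '<html><head><title></title></head><body><div id="root"></div></body></html>';
console.log(failureSignals(emptyShell));
// { tinyHtml: true, thinText: true, missingTitle: true, missingH1: true }
```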

Use the DataJelly Visibility Test (Recommended)

You can run this without touching curl. It shows you:

  • Raw HTML returned to bots (Googlebot, Bing, GPTBot, etc.)
  • Fully rendered browser version
  • Side-by-side differences in word count, HTML size, links, and content
Run Visibility Test — Free

What this test tells you (no guessing)

After running this, you'll know:

  • Whether your HTML is actually indexable
  • Whether bots are seeing partial content
  • Whether rendering is breaking in production

This is the difference between "I think SEO is set up" and "I know what Google is indexing."

If you don't understand why this happens, read: Why Google Can't See Your SPA

If this test fails

You have three real options:

  • SSR: works if you can keep it stable in production
  • Prerendering: breaks with dynamic content and scale
  • Edge Rendering: reflects real production output without app changes

If you do nothing, you will not rank consistently. Learn how Edge Rendering works →

This issue doesn't show up in Lighthouse. It shows up in rankings.

Run the Test · Ask a Question


Related Reading

SPA SEO Checklist: 10 Things to Fix

The actionable checklist to make your SPA visible to bots

SPA SEO: The Complete Guide

Comprehensive guide to SPA visibility for search and AI

React SEO Is Broken by Default

Why React ships HTML that search engines can't use

Sitemap Exists But Google Ignores Pages

Why discovery ≠ indexing — and the rendering fix

Why Script-Based Prerendering Struggles

Deep dive into build-time prerendering limitations

JavaScript SEO Guide

Technical foundations of JS SEO

Bot Test Tool

See what specific crawlers receive from your pages

DataJelly Edge

Edge rendering for bot visibility — no code changes

Prerender vs SSR vs Edge Rendering

Side-by-side comparison of what actually works for SEO in production

See what bots actually see on your site

Run the free visibility test to compare your browser view vs what search engines and AI crawlers receive. Takes 10 seconds.

Test Your Visibility · Ask a Question

Or start a 7-day free trial (no credit card required).
