Most SPAs ship an empty HTML shell and rely on JavaScript to build the page. Browsers execute that JavaScript. Google often doesn't — at least not when it matters.
So Google indexes what it gets first: almost nothing.
What's Actually Happening
Your server responds with a minimal HTML file — a <div id="root">, a JS bundle link, and no real content.
What your server actually sends:
```html
<!DOCTYPE html>
<html>
  <head>
    <title></title>
  </head>
  <body>
    <div id="root"></div>
    <script src="/assets/index.js"></script>
  </body>
</html>
```
Here's the divergence:
The browser
Executes JS → fetches data → builds the DOM → renders content. Everything works.
Googlebot
Fetches the same HTML → queues rendering for later (maybe) → indexes before or without full execution.
The SPA Rendering Gap
"Google indexes the initial response far more reliably than the rendered result. If your content isn't in that first response, you're gambling."
What Most Guides Get Wrong
You'll hear advice like:
- "Google can render JavaScript"
- "Just optimize performance"
- "Use dynamic rendering if needed"
Here's what actually happens:
- Rendering is delayed — sometimes indefinitely
- Failures are silent — no errors, just missing content
- Heavy apps get partially rendered or skipped entirely
The dangerous assumption
"If it works in Chrome, Google sees it." That assumption is responsible for most SPA SEO failures we encounter.
What We See in Production
This isn't edge-case behavior. We see these patterns daily across hundreds of JavaScript apps:
1. Empty HTML at Crawl Time
Raw response: no text, no links, no structure. Result: pages indexed as empty — or dropped entirely.
2. Rendering Breaks on Real Data
Rendering breaks in production when APIs are slow or return errors, when auth or app state blocks data fetching, or when JS throws during hydration.
Result: missing sections, incomplete pages, inconsistent indexing across crawls.
3. Every Route Looks the Same
SPAs return the same HTML for /pricing, /features, and /docs. Content depends on JS routing. Google sees identical shells — result: duplicate or ignored pages.
4. Rendering Happens Too Late
Even when Google renders, it's queued behind other work, not guaranteed, and often too late for initial indexing. New pages don't rank. Updates take too long to appear.
Want to see this for yourself? Run your site through the Bot Test tool — it shows you exactly what bots receive vs what users see.
Solutions Compared: Prerender vs SSR vs Edge
There are three real options. Each has trade-offs.
1. Build-Time Prerendering
Generate HTML ahead of time: run your SPA in a headless browser at build time, capture the output, and deploy it as static files. A minimal sketch follows the lists below.
Works when
- Content is static
- Routes are limited
Breaks when
- Content changes frequently
- Routes are dynamic or large
- You end up rebuilding constantly
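To make the mechanics concrete, here's a minimal sketch of the build step using Puppeteer (the preview port, route list, and output layout are assumptions; your build tooling will differ):

```ts
// prerender.ts: minimal build-time prerender sketch.
// Assumes a local preview server on port 4173; the route list is illustrative.
import puppeteer from "puppeteer";
import { mkdir, writeFile } from "node:fs/promises";
import path from "node:path";

const ROUTES = ["/", "/pricing", "/features"]; // every route must be enumerated up front

const browser = await puppeteer.launch();
const page = await browser.newPage();

for (const route of ROUTES) {
  // Load the SPA and wait for network activity to settle
  await page.goto(`http://localhost:4173${route}`, { waitUntil: "networkidle0" });
  const html = await page.content(); // the fully rendered DOM, serialized as HTML
  const outDir = path.join("dist", route);
  await mkdir(outDir, { recursive: true });
  await writeFile(path.join(outDir, "index.html"), html);
}

await browser.close();
```

Notice the constraint baked into ROUTES: the route list is fixed at build time, which is exactly why this approach struggles with dynamic or large route sets.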
We wrote a deep dive: Why Script-Based Prerendering Struggles with Modern Web Apps
2. Server-Side Rendering (SSR)
Render HTML on every request. The bot gets real content because the server executes the app before responding. A minimal sketch follows the lists below.
Works when
- You control the full stack
- You can absorb latency and complexity
Costs
- More infrastructure
- Slower responses under load
- Tight coupling to your framework
- Often means rewriting the app in Next.js
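For scale, here's roughly what the simplest SSR setup looks like with React 18 and Express (App, the bundle path, and the port are assumptions, not a drop-in config):

```tsx
// server.tsx: minimal SSR sketch. Assumes React 18 and Express;
// App is your root component, and the asset path is illustrative.
import express from "express";
import React from "react";
import { renderToString } from "react-dom/server";
import App from "./App";

const server = express();

server.get("*", (req, res) => {
  // Execute the app on the server so the response carries real content
  const markup = renderToString(<App url={req.url} />);
  res.status(200).send(`<!DOCTYPE html>
<html>
  <head><title>Rendered on the server</title></head>
  <body>
    <div id="root">${markup}</div>
    <script src="/assets/index.js"></script>
  </body>
</html>`);
});

server.listen(3000);
```

The file itself is simple; the cost is everything around it: fetching data before renderToString runs, hydrating without mismatches, and keeping response times acceptable under load.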
Most teams underestimate the operational cost. See: Dynamic Rendering vs Prerendering
3. Edge Rendering (Snapshot + Proxy)
Serve pre-rendered HTML to bots at the edge. Users still get the SPA. No frontend rewrite required. A generic sketch of the pattern follows the lists below.
What happens
- Bots get full HTML snapshots
- Users get the normal SPA
- AI crawlers get clean Markdown
- Works with React, Vite, Lovable
- Just a DNS change to set up
Trade-offs
- Requires a proxy layer
- Snapshot freshness needs management
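To show the shape of the pattern (this is a generic illustration, not DataJelly's actual implementation), here's a bot-aware proxy in the Cloudflare Workers style; the bot regex and snapshot origin are assumptions:

```ts
// worker.ts: generic bot-aware edge proxy sketch (Cloudflare Workers style).
// Illustrates the pattern only; BOT_UA and SNAPSHOT_ORIGIN are assumptions.
const BOT_UA = /googlebot|bingbot|duckduckbot|gptbot|claudebot/i;
const SNAPSHOT_ORIGIN = "https://snapshots.example.com"; // hypothetical snapshot store

export default {
  async fetch(request: Request): Promise<Response> {
    const ua = request.headers.get("user-agent") ?? "";
    if (BOT_UA.test(ua)) {
      // Crawlers get the pre-rendered HTML snapshot for this path
      const { pathname } = new URL(request.url);
      return fetch(`${SNAPSHOT_ORIGIN}${pathname}`);
    }
    // Everyone else gets the normal SPA, untouched
    return fetch(request);
  },
};
```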
We see this outperform SSR and prerender in real deployments because it removes the failure points instead of trying to manage them. More on how it works: DataJelly Edge
Practical Checklist
If you're unsure whether this is your problem, check these five things:
View Raw HTML
Right-click → View Source. If you don't see real content, Google doesn't either.
Hit a Deep Route Directly
Request /pricing or /features without JS. Does it return full content? If not, that route isn't indexable.
Test as Googlebot
Fetch with a bot user agent. Look for actual text, internal links, structured content. If it's missing, indexing will be incomplete.
Break Your API
Simulate slow responses or failed calls. Does the page still render? If not, Google will index broken states.
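A quick way to simulate this is to monkey-patch fetch in the browser console before the page loads its data (the /api/ prefix is an assumption; match it to your own endpoints):

```ts
// Fail every API call so you can watch what the page renders without data.
// The "/api/" prefix is an assumption; adjust it to your endpoints.
const realFetch = window.fetch.bind(window);
window.fetch = (input: RequestInfo | URL, init?: RequestInit) => {
  const url =
    typeof input === "string" ? input : input instanceof URL ? input.href : input.url;
  if (url.includes("/api/")) {
    // Reject the way a real network failure would
    return Promise.reject(new TypeError("Simulated network failure"));
  }
  return realFetch(input, init);
};
```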
Compare Indexed Pages
Check search results: missing content? Duplicate titles? Empty snippets? That's your rendering problem showing up publicly.
Want a quick answer? Run the free visibility test — it shows exactly what bots see on your site in under 10 seconds.
Final Takeaway
If your server returns empty HTML, your SEO is broken. Full stop.
JavaScript rendering is not a reliable fallback. It's a best-effort system with no guarantees.
The teams that win here stop relying on Google to render their app and start giving Google exactly what it needs up front.
Quick Test: What Do Bots Actually See?
Most people guess. Don't.
Run this test and look at the actual response your site returns to bots.
Fetch your page as Googlebot
Use your terminal:
curl -A "Googlebot" https://yourdomain.comLook for:
- Real visible text (not just
<div id="root">) - Meaningful content in the HTML
- Page size (should not be tiny)
Compare bot vs browser
Now test what a real browser gets:
curl -A "Mozilla/5.0" https://yourdomain.comIf these responses are different, Google is indexing a different page than your users see.
Stop guessing — measure it.
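If you'd rather script the comparison, here's a rough sketch for Node 18+ (it compares raw responses under two user agents; neither request executes JavaScript, so a true rendered count still needs a headless browser or the tools below):

```ts
// compare.ts: rough bot-vs-browser response comparison (Node 18+, built-in fetch).
const TARGET = "https://yourdomain.com"; // replace with your URL

async function measure(userAgent: string) {
  const res = await fetch(TARGET, { headers: { "user-agent": userAgent } });
  const html = await res.text();
  const visibleText = html
    .replace(/<(script|style)[\s\S]*?<\/\1>/gi, " ") // drop scripts and styles
    .replace(/<[^>]+>/g, " ");                       // drop remaining tags
  const words = visibleText.split(/\s+/).filter(Boolean).length;
  return { bytes: html.length, words };
}

console.log("Googlebot UA:", await measure("Googlebot"));
console.log("Browser UA:  ", await measure("Mozilla/5.0"));
```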
Real example: 253 words vs 13,547
We see this constantly. Here's a real example from production: Googlebot saw 253 words and 2 KB of HTML. A browser saw 13,547 words and 77.5 KB. Same URL — completely different content.

If the content isn't in your HTML, Google never sees it.
Compare Googlebot vs browser on your site → HTTP Debug Tool
Check for common failure signals
We see this all the time in production:
- HTML under ~1KB → usually empty shell
- Visible text under ~200 characters → thin or missing content
- Missing <title> or <h1> → weak or broken page
- Large difference between bot vs browser HTML → rendering issue
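If you want to automate those checks, a rough pass looks like this (the thresholds are the rules of thumb above, not hard limits):

```ts
// Rough heuristics matching the failure signals above; tune thresholds to taste.
function failureSignals(html: string) {
  const visibleText = html
    .replace(/<(script|style)[\s\S]*?<\/\1>/gi, " ")
    .replace(/<[^>]+>/g, " ")
    .trim();
  return {
    emptyShell: html.length < 1024,                     // HTML under ~1 KB
    thinContent: visibleText.length < 200,              // under ~200 visible characters
    missingTitle: !/<title>[^<]+<\/title>/i.test(html), // empty or absent <title>
    missingH1: !/<h1[\s>]/i.test(html),                 // no <h1> at all
  };
}
```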
Use the DataJelly Visibility Test (Recommended)
You can run this without touching curl. It shows you:
- Raw HTML returned to bots (Googlebot, Bing, GPTBot, etc.)
- Fully rendered browser version
- Side-by-side differences in word count, HTML size, links, and content
What this test tells you (no guessing)
After running this, you'll know:
- Whether your HTML is actually indexable
- Whether bots are seeing partial content
- Whether rendering is breaking in production
This is the difference between "I think SEO is set up" and "I know what Google is indexing."
If you don't understand why this happens, read: Why Google Can't See Your SPA
If this test fails
You have three real options:
SSR
Works if you can keep it stable in production
Prerendering
Breaks with dynamic content and scale
Edge Rendering
Reflects real production output without app changes
If you do nothing, you will not rank consistently. Learn how Edge Rendering works →
This issue doesn't show up in Lighthouse. It shows up in rankings.
See what bots actually see on your site
Run the free visibility test to compare your browser view vs what search engines and AI crawlers receive. Takes 10 seconds.
Or start a 7-day free trial - no credit card required.