We see this constantly across sites built with React, Vite, Lovable, and similar tools:
- The page loads perfectly in the browser
- Googlebot gets back almost nothing
- AI crawlers see an empty HTML shell
If you're building a JavaScript app, this is not an edge case. This is the default behavior.
The Core Problem: Bots Don't See What Users See
When a real user loads your site, everything works: JavaScript runs, data loads, the page hydrates, content renders. It looks correct.
But most bots don't operate like a browser. They:
- Fetch the initial HTML response
- May partially execute JavaScript (or skip it entirely)
- Have strict timeouts and resource limits
So instead of your full page, they often see something like this:
What bots actually receive:
```html
<!DOCTYPE html>
<html>
  <head>
    <title></title>
  </head>
  <body>
    <div id="root"></div>
    <script src="/assets/index.js"></script>
  </body>
</html>
```

No content. No structure. No signals. Just an empty container and a script tag.
This is what every SPA looks like to a bot that doesn't render JavaScript. And that includes most AI crawlers — ChatGPT, Perplexity, Claude, and others.
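You can approximate this check from the command line. Here's a minimal sketch, a rough heuristic (the helper name and the idea of counting stripped text are ours, not an official Googlebot rule): remove every tag and see how much visible text remains.

```shell
# Heuristic: strip all tags, drop whitespace, count the remaining characters.
# An SPA shell leaves essentially nothing; server-rendered HTML leaves real text.
visible_chars() {
  printf '%s' "$1" | sed -e 's/<[^>]*>//g' | tr -d '[:space:]' | wc -c
}

shell_html='<!DOCTYPE html><html><head><title></title></head><body><div id="root"></div><script src="/assets/index.js"></script></body></html>'
real_html='<!DOCTYPE html><html><head><title>Pricing</title></head><body><h1>Pricing</h1><p>Three plans, billed monthly.</p></body></html>'

visible_chars "$shell_html"   # the empty shell yields zero visible characters
visible_chars "$real_html"    # rendered HTML yields real text
```

Pipe any fetched response through the same filter to see what a non-rendering bot has to work with.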
Why This Breaks SEO (and AI Visibility)
Search engines and AI systems rely on three things in your HTML:
- HTML structure: headings, sections, semantic elements
- Text content: the actual words on the page
- Metadata: title, canonical, and Open Graph tags
If your app depends on JavaScript to generate those — and every SPA does — bots may never see them. Or see them inconsistently.
This leads to:
- Pages not getting indexed
- Rankings dropping or never appearing
- AI tools ignoring your content entirely
- Social previews showing blank cards
What Most Guides Get Wrong
Most SEO advice for SPAs sounds like this:
- "Just add a sitemap.xml"
- "Submit URLs in Google Search Console"
- "Make sure your meta tags exist"
None of that fixes the core issue.
A sitemap helps with discovery — not rendering. If your HTML is empty or incomplete when fetched, Google still has nothing to work with. Submitting URLs to Search Console just tells Google where to look. It doesn't change what Google finds when it gets there.
The real fix is making sure bots receive usable HTML — with real content, real metadata, real structure — on the first request.
What We See in Production
These aren't theoretical problems. We see these patterns daily across hundreds of JavaScript apps:
1. Empty HTML Responses
Bots receive <div id="root"></div> — no text, no headings, no ranking signals. The entire page content exists only in JavaScript that never runs.
2. Missing or Incorrect Metadata
Titles and descriptions injected client-side (via react-helmet or similar) often don't appear in the initial HTML response. Bots see the fallback title from index.html — or nothing at all.
3. Broken Deep Links
Routes like /pricing or /features work perfectly when navigated to inside the app — but return incomplete or generic HTML when fetched directly by a bot.
4. Inconsistent Bot Behavior
Some bots partially render pages. Others don't even try. Googlebot has a render queue with delays. AI crawlers skip JS entirely. The inconsistency makes debugging a nightmare.
Want to see this for yourself? Run your site through the Bot Test tool — it shows you exactly what bots receive vs what users see.
The Three Approaches (and Their Tradeoffs)
There are three common ways to fix SPA SEO. Each has real tradeoffs.
1. Build-Time Prerendering
Generate static HTML during the build step. Run your SPA in a headless browser, capture the output, deploy it as static files.
Pros
- Simple to set up
- Fast CDN delivery
- No server required
Cons
- Breaks with dynamic content
- Requires full rebuilds for updates
- Hydration mismatch issues
- Doesn't scale with app complexity
We wrote a deep dive on this: Why Script-Based Prerendering Struggles with Modern Web Apps
2. Server-Side Rendering (SSR)
Render pages on the server for every request. The bot gets fully formed HTML because the server executes the app before responding.
Pros
- Accurate, up-to-date HTML
- Good SEO out of the box
- Dynamic content works
Cons
- Complex server infrastructure
- Performance overhead per request
- Hard to retrofit into existing SPAs
- Often means rewriting on Next.js/Nuxt
SSR is a solid approach if you're starting fresh. But if you already have a working SPA, migrating to SSR is often a full rewrite. See our comparison: Dynamic Rendering vs Prerendering
3. Edge Rendering (The DataJelly Approach)
Serve pre-rendered HTML snapshots to bots at the edge. Users still get the normal SPA. AI crawlers get structured Markdown.
Pros
- No app rewrite required
- Works with any SPA framework
- Consistent output for all bots
- AI-optimized Markdown delivery
- Just a DNS change to set up
Cons
- Requires a proxy layer
- Snapshot freshness needs to be managed
This is what we built DataJelly to do. More on how it works: DataJelly Edge
Why Edge Rendering Works Better for Modern Apps
The key insight is simple:
"Bots don't need your app logic — they need the output."
Instead of forcing bots to execute your JavaScript, parse your API calls, and render your React components — you just give them the final rendered result.
This removes:
- JavaScript execution timing issues
- Hydration mismatches
- Inconsistent rendering across bot types
- Missing metadata in the initial response
And critically — your frontend architecture stays unchanged. No framework migration, no build pipeline changes, no server to maintain.
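To make the routing concrete, here's a minimal sketch of the per-request decision an edge proxy makes. The function name and the user-agent list are illustrative only — a small sample, not DataJelly's actual detection logic:

```shell
# Route a request by User-Agent: crawlers get the pre-rendered snapshot,
# everyone else gets the untouched SPA. Illustrative bot list, not exhaustive.
route_for_user_agent() {
  case "$1" in
    *Googlebot*|*Bingbot*|*GPTBot*|*PerplexityBot*|*ClaudeBot*)
      echo snapshot ;;   # bots receive rendered HTML (or Markdown for AI crawlers)
    *)
      echo spa ;;        # real browsers receive the normal JavaScript app
  esac
}

route_for_user_agent "Mozilla/5.0 (Macintosh) AppleWebKit/537.36"  # -> spa
route_for_user_agent "Mozilla/5.0 (compatible; Googlebot/2.1)"     # -> snapshot
```

In production this decision runs at the edge before your origin is ever hit, which is why the frontend itself never has to change.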
Practical Checklist for SPA SEO
If you're running a JavaScript-heavy site, check these five things right now:
- Does your raw HTML contain real content (not just a script tag)?
- Are title and meta tags present in the HTML without JavaScript?
- Can a direct HTTP request to any route return usable HTML?
- Are bots seeing the same structure consistently across pages?
- Do AI crawlers get readable content (not just scripts and empty divs)?
If any of these fail, your visibility is at risk. Run the free visibility test to see exactly what bots see on your site.
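The first two checks can be scripted. A rough sketch (the helper name and grep patterns are our own heuristics, not an official tool — a passing result is necessary, not sufficient):

```shell
# Run basic checks against raw HTML, e.g. the output of: curl -A "Googlebot" URL
check_html() {
  html="$1"
  # A non-empty <title> must have at least one character before the closing tag.
  printf '%s' "$html" | grep -qi '<title>[^<]' && echo 'title: ok' || echo 'title: MISSING'
  printf '%s' "$html" | grep -qi '<h1[^>]*>'   && echo 'h1: ok'    || echo 'h1: MISSING'
  printf '%s' "$html" | grep -qi '<meta name="description"' \
    && echo 'description: ok' || echo 'description: MISSING'
}

# An empty SPA shell fails every check:
check_html '<html><head><title></title></head><body><div id="root"></div></body></html>'
```

Run the same function against each route's response to spot pages that only get their metadata client-side.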
Quick Test: What Do Bots Actually See?
Most people guess. Don't.
Run this test and look at the actual response your site returns to bots.
Fetch your page as Googlebot
Use your terminal:
```shell
curl -A "Googlebot" https://yourdomain.com
```

Look for:
- Real visible text (not just <div id="root">)
- Meaningful content in the HTML
- Page size (should not be tiny)

Compare bot vs browser
Now test what a real browser gets:

```shell
curl -A "Mozilla/5.0" https://yourdomain.com
```

If these responses are different, Google is indexing a different page than your users see.
Stop guessing — measure it.
Real example: 253 words vs 13,547
We see this constantly. Here's a real example from production: Googlebot saw 253 words and 2 KB of HTML. A browser saw 13,547 words and 77.5 KB. Same URL — completely different content.

If your HTML doesn't contain the content, Google doesn't either.
Compare Googlebot vs browser on your site → HTTP Debug Tool

Check for common failure signals
We see this all the time in production:
- HTML under ~1KB → usually empty shell
- Visible text under ~200 characters → thin or missing content
- Missing <title> or <h1> → weak or broken page
- Large difference between bot and browser HTML → rendering issue
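These signals can be turned into a small audit script. A sketch, assuming you've saved a bot-fetched response to a file first (the helper name, file path, and exact patterns are ours):

```shell
# Fetch first:  curl -A "Googlebot" https://yourdomain.com > /tmp/bot.html
# Then apply the failure-signal thresholds from above to the saved response.
audit_file() {
  bytes=$(wc -c < "$1")
  text_chars=$(sed -e 's/<[^>]*>//g' "$1" | tr -d '[:space:]' | wc -c)
  [ "$bytes" -lt 1024 ]     && echo "WARN: HTML under 1 KB (likely empty shell)"
  [ "$text_chars" -lt 200 ] && echo "WARN: visible text under 200 chars (thin content)"
  grep -qi '<title>[^<]' "$1" || echo "WARN: missing <title>"
  grep -qi '<h1[^>]*>' "$1"   || echo "WARN: missing <h1>"
  echo "bytes=$bytes text_chars=$text_chars"
}

# Demo: an empty SPA shell trips every warning
printf '<html><head><title></title></head><body><div id="root"></div></body></html>' > /tmp/bot.html
audit_file /tmp/bot.html
```

Run it once against the Googlebot fetch and once against the browser fetch; a large gap in the reported numbers is the rendering issue in miniature.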
Use the DataJelly Visibility Test (Recommended)
You can run this without touching curl. It shows you:
- Raw HTML returned to bots (Googlebot, Bing, GPTBot, etc.)
- Fully rendered browser version
- Side-by-side differences in word count, HTML size, links, and content
What this test tells you (no guessing)
After running this, you'll know:
- Whether your HTML is actually indexable
- Whether bots are seeing partial content
- Whether rendering is breaking in production
This is the difference between "I think SEO is set up" and "I know what Google is indexing."
If you don't understand why this happens, read: Why Google Can't See Your SPA
If this test fails
You have three real options:
- SSR: works if you can keep it stable in production
- Prerendering: breaks with dynamic content and scale
- Edge Rendering: reflects real production output without app changes
If you do nothing, you will not rank consistently. Learn how Edge Rendering works →
This issue doesn't show up in Lighthouse. It shows up in rankings.
See what bots actually see on your site
Run the free visibility test to compare your browser view vs what search engines and AI crawlers receive. Takes 10 seconds.
Or start a 7-day free trial - no credit card required.