Vite SEO
April 2026

Why Vite Apps Have SEO Problems (And How to Fix Them)

Your Vite app ships fine and indexes poorly. We see this all the time: browsers get ~120KB of content, bots get 3–7KB of empty HTML. Same URL. Different output. The bot indexes the 3–7KB version.


The Real Problem

  • Browser view: ~120KB (full DOM, all content)
  • Bot fetch: 3–7KB (empty body, script tags)

Same URL. Different output. The bot indexes the 3–7KB version.

If your bot HTML is under ~10KB or visible text is under ~200 characters, the page is effectively empty. It returns a 200 status — everything looks fine in monitoring — but the content isn't there.
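Those thresholds are easy to check mechanically. Here is a minimal sketch in plain Node.js; the regex-based text extraction is a rough heuristic for illustration, not a real HTML parser:

```javascript
// Rough heuristic: decide whether raw bot HTML is an empty "script shell".
// Thresholds mirror the rule above: <10KB of HTML or <200 chars of text.
function analyzeBotHtml(html) {
  const visibleText = html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ') // drop JS bundles
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')   // drop inline CSS
    .replace(/<[^>]+>/g, ' ')                    // strip remaining tags
    .replace(/\s+/g, ' ')
    .trim();
  const htmlBytes = Buffer.byteLength(html, 'utf8');
  return {
    htmlBytes,
    textChars: visibleText.length,
    isScriptShell: htmlBytes < 10_000 || visibleText.length < 200,
  };
}

// A typical Vite shell flags immediately:
const shell =
  '<!DOCTYPE html><html><head><title>My App</title></head>' +
  '<body><div id="root"></div>' +
  '<script type="module" src="/assets/index-abc123.js"></script></body></html>';
console.log(analyzeBotHtml(shell).isScriptShell); // true
```

Run it against the output of a bot-user-agent fetch of your own pages, not against what you see in DevTools.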

This is the single most common SEO failure we diagnose on Vite apps.

What's Actually Happening

Vite produces minimal initial HTML. This is by design — it's optimized for client-side speed, not bot readability:

<!DOCTYPE html>
<html>
  <head><title>My App</title></head>
  <body>
    <div id="root"></div>
    <script type="module" src="/assets/index-abc123.js"></script>
  </body>
</html>

All content is created after JavaScript runs. The initial HTML contains zero meaningful text.

Bots often:

  • Don't execute JavaScript at all
  • Time out before hydration completes
  • Snapshot the page early, before content loads

Result: HTML size 3–7KB, visible text near zero. That pattern is a "script shell." It passes status checks (200 OK) but fails content checks (no text).

What this looks like in DataJelly Guard: HTML under 5KB, visible text under 200 characters, DOM dominated by <script> tags. Guard flags this as a blank page automatically.

What Most Guides Get Wrong

"Google can render JavaScript" is not a strategy.

In production, Google's rendering budget is limited. Pages are queued and dropped. Failures don't surface in any dashboard. You get mixed outcomes:

  • Some pages fully render
  • Some partially render
  • Some never render at all

You don't control which pages fail. And when they fail, you don't get notified. Rankings just quietly drop.

AI crawlers (ChatGPT, Claude, Perplexity) make this worse — they don't even attempt to render JavaScript. They fetch HTML, extract text, and move on. If your Vite app ships a script shell, you're invisible to every AI engine. We wrote about this in detail →

What Breaks in Production

These are the five failure patterns we see constantly on Vite apps. Every one of them passes basic health checks.

1. Script shell pages

Most common failure. The HTML is technically valid — 38KB of JavaScript bundles and meta tags — but the visible text is 140 characters. That's it.

HTML: 38KB | Visible text: 140 chars | DOM: mostly <script> tags

Outcome: Indexed as thin content. Rankings suppressed. You don't get an error — you just don't rank.

2. Deep routes return empty HTML

This breaks in production when routing is client-only. /pricing returns the same minimal HTML from the origin. The browser hydrates and fixes it. The bot indexes the minimal version.

200 status | <10KB HTML | No headings | No body text

Every route on your Vite app is a different URL returning the same empty shell. Bots see duplicates. Why Google Can't See Your SPA →
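This is a property of the serving setup, not a bug in your components. A typical SPA host falls back to index.html for every path, so from the origin's point of view each route is identical (a simplified sketch, not any specific host's logic):

```javascript
// Typical SPA fallback: the origin serves the same index.html for every
// path, because routing only happens client-side after JS loads.
const indexHtml =
  '<html><body><div id="root"></div>' +
  '<script type="module" src="/assets/index-abc123.js"></script></body></html>';

function originResponse(path) {
  // No server-side route handling: /, /pricing, /blog/post all get the shell.
  return { status: 200, body: indexHtml };
}

console.log(originResponse('/pricing').body === originResponse('/blog/post').body); // true
```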

3. Hydration failures

This breaks in production when an API call fails, a bundle fails to load, or a runtime error occurs. The HTML remains a shell.

API timeout → no data rendered | Console errors | HTML remains shell

Guard surfaces this as: low text, script-heavy DOM, console errors. Even if you have SSR, a single API timeout can drop your HTML from ~100KB to ~8KB.

4. Partial rendering under load

SSR isn't involved, but hydration still fails under load. The header renders, the content section stays empty.

HTML: ~12KB | Visible text: ~300 chars | Looks "valid" but performs like thin content

This is the hardest failure to detect because the page looks partially valid. But ~300 characters is not enough to rank.

5. Bot vs browser mismatch

You must assume bots see less. The typical diff on a Vite app:

Browser: 150KB DOM, thousands of words

Bot: <10KB HTML, almost no text

If you haven't compared these directly, you're guessing. Compare them now with HTTP Debug →

Solutions Comparison

There are three real approaches. Each has trade-offs. We've seen all of them in production. Full comparison →

Prerendering

What works

Static marketing pages with infrequent changes

What breaks

Dynamic routes, frequent deploys, personalized content

Example failure: /pricing cached once, content outdated after deploy. Same hash across releases.

Verdict: Fixes empty HTML. Introduces staleness.
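Mechanically, prerendering is a build step: render each known route to a static HTML string and write it out. A minimal sketch (renderRoute stands in for your framework's render-to-string call; it is not a real API):

```javascript
// Build-time prerendering sketch: one frozen HTML string per known route.
// renderRoute is a hypothetical stand-in for a real render-to-string step.
function renderRoute(path) {
  return `<html><body><h1>${path}</h1><p>Content for ${path}</p></body></html>`;
}

function prerender(routes) {
  const pages = new Map();
  for (const path of routes) {
    pages.set(path, renderRoute(path)); // a real script writes these to dist/
  }
  return pages;
}

const pages = prerender(['/', '/pricing', '/blog']);
// The staleness trade-off in the verdict above: these strings are frozen
// at build time, so any content change is invisible until a rebuild.
```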

SSR (Server-Side Rendering)

What works

Full HTML when APIs are fast and everything loads

What breaks

Slow APIs, complex pages, any backend instability

Example failure: API delay → response returns early → HTML drops from ~100KB to ~8KB. Missing sections, inconsistent output per request.

Verdict: Correct model. Unreliable without strict performance control.
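The failure mode above can be sketched with a timeout race: when the data fetch is slow, the server falls back to a shell, and that shell is exactly what the bot indexes (fetchData and the timeout value are illustrative, not a real SSR framework's API):

```javascript
// Sketch of a fragile SSR handler: render full HTML only if the data
// arrives before the deadline; otherwise fall back to a shell.
async function renderPage(fetchData, timeoutMs) {
  const timeout = new Promise((resolve) => setTimeout(() => resolve(null), timeoutMs));
  const data = await Promise.race([fetchData(), timeout]);
  if (data === null) {
    // The ~100KB page collapses to a shell when the API is slow.
    return '<html><body><div id="root"></div></body></html>';
  }
  return `<html><body><h1>Pricing</h1><ul>${data
    .map((plan) => `<li>${plan}</li>`)
    .join('')}</ul></body></html>`;
}

// Fast API: full HTML. Slow API: shell. Same route, same deploy.
const slow = () => new Promise((r) => setTimeout(() => r(['Basic', 'Pro']), 500));
renderPage(slow, 50).then((html) => console.log(html.includes('<h1>'))); // false
```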

Edge Rendering (Proxy-Based)

What works

Vite, React, Lovable SPAs, dynamic routes — all of them

Behavior

Bots get full HTML snapshots. AI crawlers get structured Markdown. Humans get the normal app.

Example: /pricing always returns complete HTML to bots. No dependency on hydration. Done at the edge proxy, not in your app. How Edge works →

Verdict: Most reliable for Vite apps because it removes the JS dependency from bot responses entirely.
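The mechanic reduces to user-agent routing at the proxy. A simplified sketch (the bot list and in-memory snapshot store are illustrative, not DataJelly's actual implementation):

```javascript
// Edge-style routing sketch: bots get a complete pre-rendered snapshot,
// humans get the normal SPA shell. Hydration is never on the bot's path.
const BOT_UA = /googlebot|bingbot|gptbot|claudebot|perplexitybot/i;

const snapshots = new Map([
  ['/pricing', '<html><body><h1>Pricing</h1><p>Complete pre-rendered content.</p></body></html>'],
]);

const spaShell =
  '<html><body><div id="root"></div>' +
  '<script type="module" src="/assets/index.js"></script></body></html>';

function handleRequest(path, userAgent) {
  if (BOT_UA.test(userAgent) && snapshots.has(path)) {
    return snapshots.get(path); // full HTML, no JS required
  }
  return spaShell; // browsers hydrate as usual
}

console.log(handleRequest('/pricing', 'Googlebot/2.1').includes('<h1>Pricing</h1>')); // true
```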

Practical Checklist

Run these checks. Don't rely on assumptions.

1. Fetch as bot

Check raw HTML size. Expected: >50KB. If <10KB → broken.

curl -A "Googlebot" https://yourdomain.com | wc -c

2. Measure visible text

Extract text content. Expected: >1,000 words. If <200 chars → script shell.

3. Test deep routes

Hit non-root URLs (/pricing, /blog/post). Expected: full HTML with route-specific content. If empty → routing issue.

4. Disable JavaScript

Load your page with JS off in DevTools. What you see is what bots see.

5. Break dependencies

Simulate an API failure or JS error. Check the bot response. If content disappears → fragile system.

6. Compare responses

Diff bot vs browser HTML. Large difference = indexing risk. Use the HTTP Debug tool →

Quick Test: What Do Bots Actually See?

~30 seconds

Most people guess. Don't.

Run this test and look at the actual response your site returns to bots.

1. Fetch your page as Googlebot

Use your terminal:

curl -A "Googlebot" https://yourdomain.com

Look for:

  • Real visible text (not just <div id="root">)
  • Meaningful content in the HTML
  • Page size (should not be tiny)

2. Compare bot vs browser

Now test what a real browser gets:

curl -A "Mozilla/5.0" https://yourdomain.com

If these responses are different, Google is indexing a different page than your users see.

Stop guessing — measure it.

Real example: 253 words vs 13,547

We see this constantly. Here's a real example from production: Googlebot saw 253 words and 2 KB of HTML. A browser saw 13,547 words and 77.5 KB. Same URL — completely different content.

[Image: bot vs browser comparison showing 253 words for Googlebot vs 13,547 words for a rendered browser on the same URL]

If your HTML doesn't contain the content, Google doesn't either.

Compare Googlebot vs browser on your site → HTTP Debug Tool

3. Check for common failure signals

We see this all the time in production:

  • HTML under ~1KB → usually empty shell
  • Visible text under ~200 characters → thin or missing content
  • Missing <title> or <h1> → weak or broken page
  • Large difference between bot vs browser HTML → rendering issue

Use the DataJelly Visibility Test (Recommended)

You can run this without touching curl. It shows you:

  • Raw HTML returned to bots (Googlebot, Bing, GPTBot, etc.)
  • Fully rendered browser version
  • Side-by-side differences in word count, HTML size, links, and content
Run Visibility Test — Free

What this test tells you (no guessing)

After running this, you'll know:

  • Whether your HTML is actually indexable
  • Whether bots are seeing partial content
  • Whether rendering is breaking in production

This is the difference between "I think SEO is set up" and "I know what Google is indexing."

If you don't understand why this happens, read: Why Google Can't See Your SPA

If this test fails

You have three real options:

SSR

Works if you can keep it stable in production

Prerendering

Breaks with dynamic content and scale

Edge Rendering

Reflects real production output without app changes

If you do nothing, you will not rank consistently. Learn how Edge Rendering works →

This issue doesn't show up in Lighthouse. It shows up in rankings.

Run the Test · Ask a Question

The Bottom Line

Vite is optimized for client rendering. Crawlers are not. That mismatch causes empty HTML, partial content, and inconsistent indexing.

You have three real options:

  1. Accept thin pages — and accept that you won't rank consistently
  2. Rebuild with SSR — works if you can keep it stable under real load
  3. Fix the response layer — serve complete HTML to bots at the edge, no app changes

If your bot HTML is under ~10KB, or contains under ~200 characters of visible text, you are not being indexed correctly.

Run Visibility Test — Free · Ask a Question · Start 7-Day Free Trial


Related Reading

How to Check What Googlebot Actually Sees

Step-by-step guide to verifying what Googlebot receives vs what Chrome renders.

React SEO Is Broken by Default

Same root cause — React ships empty HTML. Vite makes it worse by optimizing for client-side speed.

Why Your Content Doesn't Show Up in ChatGPT

AI crawlers hit the same empty shell problem. If bots can't see it, neither can ChatGPT.

Page Crawled But Not Indexed

Google crawls your Vite app but refuses to index it — thin HTML shells are the #1 cause.

Prerender vs SSR vs Edge Rendering

Side-by-side comparison of all three rendering strategies with real production trade-offs.

SPA SEO: The Complete Guide

Comprehensive guide to making JavaScript SPAs visible to search engines and AI.

HTTP Debug Tool

Compare Googlebot vs browser responses on any URL — see the gap yourself.

Bot Visibility Test

See exactly what crawlers receive when they visit your Vite app.

How AI Crawlers Read Your Website

Deep-dive into how GPTBot, ClaudeBot, and PerplexityBot fetch and process pages.

