April 2026

Your HTML Is Only 4KB (And Why That's a Problem)

A deploy goes out. Everything is green. 200 OK responses. No backend errors. Lighthouse looks fine. But traffic drops within 48 hours. Pages stop ranking. Social previews show blank cards. You check the page source: 4.1KB. ~30 words. No H1, no content, just a root div and scripts. We see this all the time.


What's Actually Happening

Your server is returning a shell, not a page.

A typical SPA HTML response in production looks like this:

  • 3KB–6KB total size
  • 1 root div
  • 2–5 script tags
  • No real body content
<!DOCTYPE html>
<html>
<head>
  <title>My App</title>
</head>
<body>
  <div id="root"></div>
  <script src="/assets/index-abc123.js"></script>
</body>
</html>

Everything else depends on JavaScript: fetch data, render components, inject metadata. Here's the failure mode:

1. Bot requests page
2. Server returns 4KB HTML
3. Bot does not execute JS (or times out)
4. Bot indexes the empty page

Quick verification (30 seconds):

curl -s https://yourdomain.com | wc -c
# If under ~10,000 bytes, your page is likely empty

curl -s https://yourdomain.com | grep -oE '[[:alnum:]]+' | wc -l
# If under ~100 words, your content doesn't exist to crawlers

Want the automated version? Run the DataJelly Visibility Test — it does this comparison for you.

What Most Guides Get Wrong

The industry advice is flat wrong here. You'll hear:

"Google renders JavaScript"

Sometimes. For some pages. With an unpredictable delay. Deep routes often never get rendered.

"Just use React, it's fine"

React ships a root div. Without SSR or prerendering, that root div is your entire HTML payload.

What actually happens in production with Googlebot's rendering queue:

Route            Rendered?   Indexed?
/                Sometimes   Usually
/pricing         Partially   Inconsistent
/blog/post-123   Rarely      No

And other bots don't even try:

  • AI crawlers (GPTBot, ClaudeBot, PerplexityBot): read raw HTML only
  • Social bots (Slack, Twitter, LinkedIn): no JS execution
  • SEO tools (Ahrefs, Screaming Frog defaults): no JS execution

If your content is not in the initial HTML, it is invisible. This is the same problem we cover in depth in Why Google Can't See Your SPA and Why ChatGPT Can't See Your Content.
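You can compare what these crawlers receive directly from a terminal. A minimal sketch, assuming a POSIX shell and curl; the URL is a placeholder for your own domain:

```shell
#!/bin/sh
# Compare raw HTML size across crawler user agents.
# Replace the URL with your own domain.
URL="https://yourdomain.com"

for UA in "Googlebot" "GPTBot" "Twitterbot" "Mozilla/5.0"; do
  BYTES=$(curl -s -A "$UA" "$URL" | wc -c)
  printf '%-12s %8s bytes\n' "$UA" "$BYTES"
done
# If every crawler row is far smaller than the Mozilla/5.0 row,
# the content only exists after JavaScript runs.
```

Raw curl never executes JavaScript, so any size difference here comes from the server treating user agents differently.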

What We See in Production

These are not edge cases. This is normal. We see these patterns across hundreds of sites every week.

1. Empty HTML, indexed anyway

HTML: 4.3KB | Words: 25 | Title: correct
Page content: missing
Result: page ranks for nothing

Google indexed the title tag but found no body content. The page exists in the index but has zero ranking signals.

2. Hydration blocked by API failure

JS loads → API call returns 500 → page never renders
HTML still returns 200 with empty body
Users see spinner, bots see nothing

This is a silent failure. Your monitoring says "200 OK" but the page has no content. We wrote about this pattern in Your Site Returns 200 OK — But Is Completely Broken.

3. Deep routes completely empty

/         → works (sometimes rendered)
/features → works (partially rendered)
/blog/post → 4KB shell (never rendered)

Bots hit deep URLs directly. They never navigate from your homepage. If the route returns a shell, they see nothing. Check yours with the Page Validator.
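The deep-route pattern is easy to verify yourself. A sketch, assuming a POSIX shell; the domain and route list are placeholders to replace with your own:

```shell
#!/bin/sh
# Flag routes that return an empty shell to Googlebot.
# Replace the domain and routes with your own.
DOMAIN="https://yourdomain.com"

for ROUTE in / /features /blog/post-123; do
  BYTES=$(curl -s -A "Googlebot" "$DOMAIN$ROUTE" | wc -c)
  if [ "$BYTES" -lt 10000 ]; then
    echo "$ROUTE -> $BYTES bytes (likely empty shell)"
  else
    echo "$ROUTE -> $BYTES bytes (ok)"
  fi
done
```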

4. Metadata injected too late

OG tags added via JavaScript → HTML has none
→ Slack preview: blank
→ Twitter card: missing image
→ LinkedIn: wrong title

Social bots never execute JS. If your OG tags aren't in the raw HTML response, every share link is broken. See Fixing Broken Social Previews in SPAs.
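You can confirm this in one command: fetch the page as a social bot and search the raw response for OG tags. A sketch; the URL is a placeholder:

```shell
# List any OG tags present in the raw HTML a social bot receives.
# Replace the URL with your own domain.
curl -s -A "Twitterbot" https://yourdomain.com \
  | grep -io '<meta[^>]*property="og:[^"]*"[^>]*>' \
  || echo "No OG tags in raw HTML - social previews will be blank"
```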

Quick Test: Is Your Site Affected?

Don't guess. Measure it. Use this step-by-step diagnostic to check what bots actually see on your site right now.


Run this test and look at the actual response your site returns to bots.

1. Fetch your page as Googlebot

Use your terminal:

curl -A "Googlebot" https://yourdomain.com

Look for:

  • Real visible text (not just <div id="root">)
  • Meaningful content in the HTML
  • Page size (should not be tiny)
2. Compare bot vs browser

Now test what a real browser gets:

curl -A "Mozilla/5.0" https://yourdomain.com

If these responses are different, Google is indexing a different page than your users see.
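The two responses can be reduced to numbers for a quick comparison. A minimal sketch, assuming a POSIX shell with curl and grep -E; the URL is a placeholder:

```shell
#!/bin/sh
# Word counts for the same URL as seen by a bot vs a browser UA.
# Replace the URL with your own domain.
URL="https://yourdomain.com"

BOT=$(curl -s -A "Googlebot" "$URL" | grep -oE '[[:alnum:]]+' | wc -l)
BROWSER=$(curl -s -A "Mozilla/5.0" "$URL" | grep -oE '[[:alnum:]]+' | wc -l)

echo "Googlebot sees: $BOT words"
echo "Browser sees:   $BROWSER words"
# A large gap means bots index a different page than users see.
```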

Stop guessing — measure it.

Real example: 253 words vs 13,547

We see this constantly. Here's a real example from production: Googlebot saw 253 words and 2 KB of HTML. A browser saw 13,547 words and 77.5 KB. Same URL — completely different content.

[Image: Bot vs browser comparison showing 253 words for Googlebot vs 13,547 words for a rendered browser on the same URL]

If your HTML doesn't contain the content, Google doesn't either.

Compare Googlebot vs browser on your site → HTTP Debug Tool
3. Check for common failure signals

We see this all the time in production:

  • HTML under ~10KB → usually an empty shell
  • Visible text under ~200 characters → thin or missing content
  • Missing <title> or <h1> → weak or broken page
  • Large difference between bot vs browser HTML → rendering issue
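These signals can be checked in one pass. A sketch assuming a POSIX shell; the URL is a placeholder and the thresholds are the rules of thumb above, not hard limits:

```shell
#!/bin/sh
# Scan one URL for the failure signals listed above.
# Replace the URL with your own domain.
URL="https://yourdomain.com"
HTML=$(curl -s -A "Googlebot" "$URL")

BYTES=$(printf '%s' "$HTML" | wc -c)
# Strip tags, then whitespace, to approximate visible text length.
TEXT=$(printf '%s' "$HTML" | sed 's/<[^>]*>//g' | tr -d '[:space:]')

[ "$BYTES" -lt 10000 ] && echo "WARN: HTML is only $BYTES bytes"
[ "${#TEXT}" -lt 200 ] && echo "WARN: under 200 chars of visible text"
printf '%s' "$HTML" | grep -qi '<title>' || echo "WARN: missing <title>"
printf '%s' "$HTML" | grep -qi '<h1'     || echo "WARN: missing <h1>"
```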

Use the DataJelly Visibility Test (Recommended)

You can run this without touching curl. It shows you:

  • Raw HTML returned to bots (Googlebot, Bing, GPTBot, etc.)
  • Fully rendered browser version
  • Side-by-side differences in word count, HTML size, links, and content
Run Visibility Test — Free

What this test tells you (no guessing)

After running this, you'll know:

  • Whether your HTML is actually indexable
  • Whether bots are seeing partial content
  • Whether rendering is breaking in production

This is the difference between "I think SEO is set up" and "I know what Google is indexing."

If you don't understand why this happens, read: Why Google Can't See Your SPA

If this test fails

You have three real options:

  • SSR: works if you can keep it stable in production
  • Prerendering: breaks with dynamic content and at scale
  • Edge Rendering: reflects real production output without app changes

If you do nothing, you will not rank consistently. Learn how Edge Rendering works →

This issue doesn't show up in Lighthouse. It shows up in rankings.

Run the Test · Ask a Question

Solutions Compared

There are three real approaches to fixing this. Here's what actually works and what breaks.

Approach         How it works                                        Breaks when
Prerendering     Static HTML snapshots at build time                 Dynamic routes, frequent content changes, scale >100 pages
SSR              HTML rendered on every request                      Slow TTFB, complex infra, hard to scale under load
Edge Snapshots   Pre-rendered HTML served only to bots at the edge   Doesn't break — bots always get full HTML, users get normal SPA

Why edge snapshots work

  • Bot → gets full HTML snapshot (30KB–80KB)
  • User → gets normal SPA experience
  • No app rewrite required
  • No runtime rendering cost

This is the only approach we see consistently work without breaking the app. Learn how Edge works →

Practical Checklist

Run these checks right now. If any fail, your page is not indexable.

1. curl your page

HTML <10KB? Problem.

2. View page source

Just a root div and scripts? No real paragraphs? Problem.

3. Count words in raw HTML

<200 words = effectively empty.

4. Disable JavaScript and reload

Blank page = broken for bots.

5. Hit deep URLs directly (not just homepage)

Individual routes return empty shells? Problem.

6. Check OG tags in raw HTML

Missing = social previews will fail.

Fix criteria — all must pass:

  • HTML contains real content on first response
  • H1 + body text present in raw HTML
  • Internal links exist without JS
  • Metadata (title, OG tags) present without JS
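These criteria can be turned into pass/fail checks. A sketch assuming a POSIX shell; the URL is a placeholder, and the greps are rough heuristics, not an HTML parser:

```shell
#!/bin/sh
# Pass/fail the fix criteria against the raw HTML (no JavaScript).
# Replace the URL with your own domain.
URL="https://yourdomain.com"
HTML=$(curl -s "$URL")

check() {  # check <pattern> <label>
  printf '%s' "$HTML" | grep -qi "$1" \
    && echo "PASS: $2" || echo "FAIL: $2"
}

check '<h1'           "H1 present in raw HTML"
check '<a [^>]*href=' "internal links without JS"
check '<title>'       "title present without JS"
check 'property="og:' "OG tags present without JS"
```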

If your HTML is 4KB, your site is not SEO-compatible.

Not "suboptimal." Not "needs improvement." Broken. You are shipping empty documents and hoping a crawler reconstructs them. Sometimes it works. Most of the time it doesn't.

What DataJelly Does About This

Instead of sending a 4KB shell, DataJelly Edge serves full HTML snapshots (30KB–100KB) to bots. Content is present immediately. No JS dependency. For AI crawlers, clean Markdown is served — no parsing or rendering required.

Works with React, Vite, Lovable SPAs. No framework rewrite required. The fix is at the edge, not in your app.

Run Visibility Test — Free · Start 7-Day Free Trial · Ask a Question

Related Diagnostic Tools

Visibility Test

Compare bot vs browser HTML side-by-side

Page Validator

Check bot-readiness of any URL

HTTP Debug Tool

Compare Googlebot vs browser responses

Social Card Preview

Test OG tags and social sharing


Related Reading

Why Google Can't See Your SPA

The full breakdown of how SPAs fail for search engines.

AI SEO vs Traditional SEO: What Actually Changes

AI crawlers don't render JS. Here's what that means for your content.

React SEO Is Broken by Default

Why React ships empty HTML and how to fix it.

Vite SEO Problems (And How to Fix Them)

Vite apps ship 3–7KB to bots while browsers see 120KB.

React Blank Page in Production

Build passes, health checks green, page renders nothing.

Edge Rendering — How It Works

Full HTML snapshots served at the edge. No app rewrite required.


© 2026 DataJelly. All rights reserved. Built with love for the modern web.