Page Speed Analyzer
Compare raw HTTP response speed against full JavaScript render time.
This test shows whether a page is actually fast or only looks fast before JavaScript executes. It reports TTFB, full render time, JS overhead, content gap, rendered word count, DOM element count, render-blocking resources, status code, and differences across mobile vs desktop and Googlebot vs human browser output.
Use Human Browser, Googlebot, or Google vs Human mode without changing the analyzer workflow.
Live HTTP test — We fetch your page right now and measure real response times, content delivery, and JS rendering as it happens.
How is this different from Lighthouse?
- Lighthouse — simulated environment, throttled CPU/network, lab-based scores (Performance, Accessibility, SEO)
- This tool — real HTTP requests to your server, actual TTFB, real JS execution time, real content delivery
Lighthouse scores are useful benchmarks. This shows what bots and users actually experience right now.
Enter URL to Test
We'll run raw HTTP fetch + full JS render for both mobile and desktop simultaneously
Understand raw vs rendered performance
This analyzer runs two layers of testing on the same URL and compares the results side by side. Layer 1: Raw HTTP fetch. Layer 2: Full browser render. Both layers matter because a page can have fast TTFB and still take 8–10 seconds to render useful content if JavaScript does most of the work.
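The two-layer comparison can be sketched as a small calculation. This is an illustrative model, not the analyzer's actual code: the `LayerTimings` class, the 200 ms "fast TTFB" cut-off (from the metrics table below), and the 8-second "slow render" threshold (from the 8–10 s example above) are assumptions chosen for the sketch.

```python
from dataclasses import dataclass

@dataclass
class LayerTimings:
    """Timings from the two test layers, in milliseconds.

    ttfb_ms comes from the raw HTTP fetch (layer 1);
    render_ms from the full browser render (layer 2).
    """
    ttfb_ms: float
    render_ms: float

    @property
    def js_overhead_ms(self) -> float:
        # Client-side cost: everything after the server's first byte.
        return self.render_ms - self.ttfb_ms

def looks_fast_but_is_slow(t: LayerTimings,
                           fast_ttfb_ms: float = 200,    # assumed threshold
                           slow_render_ms: float = 8000  # assumed threshold
                           ) -> bool:
    """Flag pages with a quick server response but a slow full render."""
    return t.ttfb_ms <= fast_ttfb_ms and t.render_ms >= slow_render_ms

page = LayerTimings(ttfb_ms=150, render_ms=9200)
print(page.js_overhead_ms)           # 9050.0
print(looks_fast_but_is_slow(page))  # True
```

A page like this would score well on a raw-fetch test alone, which is exactly why both layers are measured.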
What this test measures
- TTFB (server response start time)
- Full render time after JavaScript execution
- JS overhead (render time minus TTFB)
- Content gap (rendered vs raw word count)
- DOM growth from hydration/client rendering
- Render-blocking resources
- Total scripts and their impact
- SPA dependency score (0-100)
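One of the metrics above, render-blocking resources, can be approximated from the raw HTML alone. The sketch below is a simplification of what a real analyzer measures: it only counts `<head>` scripts without `async`/`defer` and stylesheet links, ignoring other blocking cases.

```python
from html.parser import HTMLParser

class RenderBlockingScanner(HTMLParser):
    """Rough proxy for render-blocking resources: <script src> tags in
    <head> lacking async/defer, plus stylesheet <link> tags."""
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.blocking = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # boolean attributes (async, defer) map to None
        if tag == "head":
            self.in_head = True
        elif self.in_head and tag == "script" and "src" in a:
            if "async" not in a and "defer" not in a:
                self.blocking.append(a["src"])
        elif self.in_head and tag == "link" and a.get("rel") == "stylesheet":
            self.blocking.append(a.get("href", ""))

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False

html = """<html><head>
<script src="/app.js"></script>
<script src="/analytics.js" defer></script>
<link rel="stylesheet" href="/main.css">
</head><body></body></html>"""
scanner = RenderBlockingScanner()
scanner.feed(html)
print(scanner.blocking)  # ['/app.js', '/main.css']
```

The deferred analytics script is correctly excluded, since `defer` moves its execution after parsing.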
How to use during a GSC cleanup
- Test pages marked "Crawled – currently not indexed"
- Compare Googlebot output against human output
- Find pages where raw HTML is thin but rendered content is large
- Identify pages with excessive JavaScript overhead
- Check whether important content appears only after JS execution
- Validate performance after Next.js, SSR, or hosting migration
- Prioritize pages that need SSR, prerendering, or code splitting
Why JS render time matters
Google can execute JavaScript, but delayed rendering, script-heavy pages, empty raw HTML, and large content gaps can slow discovery and make pages look thin or unstable.
When raw HTML contains very little visible text but the rendered page contains much more, the content gap reveals a mismatch between what crawlers and humans see, a pattern that frequently surfaces in crawled-not-indexed investigations.
Common problems this catches
- Fast server response but slow JavaScript render
- Raw HTML has almost no meaningful content
- Googlebot receives different content than humans
- Heavy JS overhead after a deploy
- Render-blocking scripts delaying page content
- Large DOM growth after hydration
- Missing metadata in raw response
- Slow mobile render compared to desktop
- Pages that look fine in the browser but appear thin to crawlers
Recommended workflow
1. Test the URL as Human Browser
2. Run Googlebot or Google vs Human mode
3. Compare TTFB against full render time
4. Compare raw word count against rendered word count
5. Check content gap and DOM growth
6. Review render-blocking resources and scripts
7. Re-test after code splitting, SSR, prerendering, or metadata fixes
8. Use the Visibility Test to confirm crawler-visible content
Key metrics explained
- TTFB: Server response start time. ≤200ms is fast, >800ms is slow.
- Full render time: Total time for the browser to finish rendering after JS execution.
- JS overhead: Full render time minus TTFB. Shows client-side cost.
- Content gap: Rendered words minus raw HTML words. Large gaps indicate thin crawler-visible content.
- SPA score: 0-100 JS dependency score based on content gap, DOM growth, and JS overhead.
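The SPA score combines the other metrics into one number. The tool's actual weighting is not documented here, so the sketch below is purely illustrative: it averages three normalized signals, and the 5000 ms cap used to normalize JS overhead is an assumption.

```python
def spa_score(content_gap_ratio: float,
              dom_growth_ratio: float,
              js_overhead_ms: float) -> int:
    """Illustrative 0-100 JS-dependency score (not the tool's formula).

    content_gap_ratio: share of words that appear only after JS (0-1)
    dom_growth_ratio:  share of DOM nodes added by hydration (0-1)
    js_overhead_ms:    capped at 5000 ms for normalization (assumed cap)
    """
    clamp = lambda x: max(0.0, min(1.0, x))
    signals = [
        clamp(content_gap_ratio),
        clamp(dom_growth_ratio),
        clamp(js_overhead_ms / 5000),
    ]
    return round(100 * sum(signals) / len(signals))

# A heavily client-rendered page scores high:
print(spa_score(content_gap_ratio=0.9,
                dom_growth_ratio=0.8,
                js_overhead_ms=4000))  # 83
```

A high score suggests the page depends on JavaScript for most of its content, making it a candidate for SSR or prerendering.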