CrUX vs Lighthouse — field data vs lab data
Web performance data comes from two measurement systems, and they don't always agree. Understanding the difference is the first step in diagnosing anything.
Lighthouse (lab data)
Lighthouse runs in Chrome DevTools, in PageSpeed Insights, or via the CLI. It simulates a page load on a fixed virtual device and network profile (the default is a throttled "mobile 4G" profile) and reports what it measures.
- Pros: fast, consistent, catches issues before shipping
- Cons: doesn't reflect real users — one synthetic machine on one synthetic network
When you run Lighthouse five times on the same URL you'll get five slightly different scores, because simulation has variance. But it's repeatable enough for debugging.
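One common way to tame that variance (a sketch, not part of any official tooling) is to run Lighthouse several times and take the median performance score rather than trusting a single run:

```javascript
// Median of several Lighthouse performance scores (0–1 scale).
// In practice the scores would come from repeated runs of
// `lighthouse <url> --output=json`; the numbers below are hypothetical.
function medianScore(scores) {
  const sorted = [...scores].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2
    ? sorted[mid]
    : (sorted[mid - 1] + sorted[mid]) / 2;
}

// Five hypothetical runs of the same URL:
console.log(medianScore([0.92, 0.95, 0.93, 0.96, 0.94])); // → 0.94
```

The median is preferable to the mean here because a single outlier run (a GC pause, a busy CI machine) shouldn't move the result.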
CrUX (field data)
The Chrome User Experience Report (CrUX) aggregates real user measurements from Chrome browsers globally. Every Chrome user who has opted in to usage-statistics reporting contributes data about the pages they visit. CrUX publishes the aggregated data (never individual sessions) on a 28-day rolling window.
- Pros: ground truth — it's literally what your users experienced
- Cons: only available for origins with enough traffic to anonymize; reflects the trailing 28 days, so changes show up slowly
CrUX is what Google uses for Search ranking signals. It's also what Web Vitals Explorer renders.
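CrUX data is available programmatically through the CrUX API, which returns per-metric histograms and percentiles. A minimal sketch of pulling a metric's p75 out of a `queryRecord`-style response; the nesting follows the public API, but the sample object and its numbers are made up for illustration:

```javascript
// Extract a metric's p75 from a CrUX API queryRecord-style response.
// The shape (record.metrics.<metric>.percentiles.p75) follows the public
// CrUX API; the values in sampleResponse are hypothetical, not real data.
function p75(record, metric) {
  return record.metrics?.[metric]?.percentiles?.p75 ?? null;
}

const sampleResponse = {
  record: {
    key: { origin: "https://example.com" },
    metrics: {
      largest_contentful_paint: {
        histogram: [
          { start: 0, end: 2500, density: 0.55 },
          { start: 2500, end: 4000, density: 0.25 },
          { start: 4000, density: 0.2 },
        ],
        percentiles: { p75: 4500 }, // milliseconds
      },
    },
  },
};

console.log(p75(sampleResponse.record, "largest_contentful_paint")); // → 4500
```

The histogram densities tell you the share of page loads in each bucket; the p75 is the single number Search (and most dashboards) key off.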
When to use which
Use Lighthouse when:
- Debugging a performance regression
- Testing a fix before deploy
- Iterating on the same page quickly
- Profiling specific interactions in DevTools
Use CrUX (field data) when:
- Answering "are we fast for real users?"
- Checking SEO-relevant performance
- Comparing with competitors
- Validating that a shipped fix actually moved the p75
The gap
A common pattern: your Lighthouse score is 95+, but your CrUX data shows LCP 4.5s at p75. What happened?
Lighthouse ran on a simulated 1.6 Mbps / 150 ms RTT connection with throttled CPU. Your real users might be on 3G with 500 ms RTT, or on low-end Android devices whose CPUs are far slower than your throttled dev machine. Lighthouse sees "fast server + small bundle = fast LCP." Real users see "slow CPU spending 3 seconds parsing your JavaScript before LCP paints."
The fix is almost always to reduce the amount of JavaScript the browser has to parse and execute before first paint. Even throttled, Lighthouse's simulated CPU is faster than many real users' devices.
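The Core Web Vitals thresholds for LCP are fixed (good ≤ 2.5 s, needs improvement ≤ 4 s, poor above that), so classifying a field p75 is mechanical. A small sketch:

```javascript
// Classify a p75 LCP value (in ms) against the Core Web Vitals thresholds:
// good ≤ 2500 ms, needs improvement ≤ 4000 ms, poor beyond that.
function rateLcp(p75Ms) {
  if (p75Ms <= 2500) return "good";
  if (p75Ms <= 4000) return "needs improvement";
  return "poor";
}

console.log(rateLcp(1800)); // → "good"
console.log(rateLcp(4500)); // → "poor"
```

By this rating, the 4.5 s field LCP in the scenario above is "poor" even though the lab score looks excellent.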
How to read both together
- Look at CrUX first (p75 field data)
- If CrUX is slow, run Lighthouse to reproduce and debug
- Fix, deploy, wait for the 28-day window to roll over, check CrUX again
- Iterate
Lab data without field data leads you to optimize for the wrong thing. Field data without lab data leaves you without a way to test fixes. You need both.
By Paulo de Vries