Performance Improvements Are Inconsistent
Performance improvements can look inconsistent even when Speed Layer is helping, so it is important to evaluate results in context.
Common reasons improvements look inconsistent
Lab test variance
Synthetic tools such as PageSpeed Insights and Lighthouse can fluctuate noticeably between runs because of network and test-device variability, so a single score is not a reliable signal.
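One common way to reduce lab-test noise is to repeat the test and summarize with the median rather than trusting any single run. The scores below are hypothetical, a minimal sketch of the idea:

```python
from statistics import median

# Hypothetical performance scores from five repeated lab runs of the
# same page. Individual runs commonly vary by several points.
runs = [78, 84, 81, 73, 82]

print("single run:", runs[0])           # any one run can mislead
print("median of runs:", median(runs))  # a more stable summary
```

The median is preferred over the mean here because it is less sensitive to one unusually slow or fast run.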
Traffic mix changes
A shift in device mix, network quality, or the mix of page types being visited can move aggregate metrics even when individual pages have not changed.
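This effect can be counterintuitive: per-segment numbers can stay flat while the blended number moves. The values below are hypothetical, a sketch of how a traffic-mix shift alone changes an overall metric:

```python
# Hypothetical p75 LCP (seconds) per segment. Neither segment changes,
# but a shift toward mobile traffic moves the blended site-wide number.
lcp = {"mobile": 3.2, "desktop": 1.8}

def blended(mobile_share):
    """Traffic-weighted overall LCP for a given mobile share."""
    return lcp["mobile"] * mobile_share + lcp["desktop"] * (1 - mobile_share)

before = blended(0.50)  # 50% mobile traffic
after = blended(0.70)   # 70% mobile traffic
print(f"before: {before:.2f}s, after: {after:.2f}s")
```

The overall metric worsens from 2.50s to 2.78s with no real regression on either segment, which is why segmenting by device matters.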
Third-party tag changes
Adding or changing vendor scripts (analytics, chat widgets, ad tags) can introduce regressions that offset or mask gains from Speed Layer.
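When gains disappear after a release, it can help to diff the set of third-party hosts loaded before and after. The hostnames below are made up for illustration:

```python
# Hypothetical sets of third-party script hosts captured on a page
# before and after a tag change. Newly added hosts are the first
# candidates to investigate for a regression.
before = {"cdn.analytics.example", "tags.vendor-a.example"}
after = {"cdn.analytics.example", "tags.vendor-a.example",
         "widget.vendor-b.example"}

added = sorted(after - before)
removed = sorted(before - after)
print("added:", added)
print("removed:", removed)
```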
Platform changes
Template updates or platform releases can affect performance independently of Speed Layer.
How to evaluate more reliably
- Focus on trends over time, not one-off scores.
- Segment by mobile versus desktop.
- Track key page types separately.
- Compare against a baseline time window.
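The baseline-comparison step above can be sketched as a simple window-over-window check. The daily values are hypothetical, assuming p75 LCP is the metric being tracked:

```python
from statistics import median

# Hypothetical daily p75 LCP (seconds) for a baseline window and a
# comparison window after the change.
baseline = [2.9, 3.1, 3.0, 2.8, 3.2, 3.0, 2.9]
current = [2.5, 2.7, 2.6, 2.8, 2.4, 2.6, 2.5]

change = median(current) - median(baseline)
print(f"baseline median: {median(baseline):.2f}s")
print(f"current median:  {median(current):.2f}s")
print(f"change: {change:+.2f}s")
```

Comparing medians of whole windows, rather than two single days, keeps one noisy day from dominating the before/after comparison.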
Related pages
- Reading Trends vs One-Time Scores
- Setting Baselines and Comparing Before/After
- Common Misreads of Performance Data (and How to Avoid Them)