TL;DR: Google’s removal of a search parameter used by data crawlers has artificially inflated “Average Position” metrics and potentially deflated total impression counts. Our new analysis of performance data across a dozen industries reveals position inflation ranges from a modest 9% in E-commerce to a massive 47% in B2B. This is not a real performance gain; it’s a data artifact that requires you to adjust your reporting to get an accurate picture of SEO success.
A few weeks ago, Google quietly disabled the ?num=100 search parameter, which allowed users to request 100 search results on a single page. While many SEO professionals noted the inconvenience, they missed the more significant story: the artificial inflation of a key performance metric and the knock-on effect on reported search visibility.
Our initial analysis suggested an across-the-board improvement in average rankings. However, once we compared performance data from the two weeks before and after the change, a more nuanced and dramatic picture emerged. The impact varies wildly by industry, creating a serious challenge for accurate performance analysis.
The Mathematical Ghost in the Machine
According to Google’s official documentation, Average Position is the average ranking position at which your site’s URLs appeared for a given query, computed across every recorded impression. The now-defunct ?num=100 parameter was a favorite of SEO crawlers and data-harvesting bots: a single automated request could load 100 results at once, so these tools systematically logged impressions at deep, low-value positions (e.g., positions 80-100), pushing the average toward numerically higher, worse-looking figures while inflating total impression counts with non-human views.
With that parameter gone, those low-ranking data points dropped out of the equation, and the calculated average position improved on paper even though no actual rankings moved. At the same time, the elimination of bot-generated views means your total reported organic impressions may decrease. Our new data shows just how significant the position shift was for different sectors.
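A minimal sketch of the arithmetic, using hypothetical impression counts rather than figures from our dataset:

```python
def average_position(impressions):
    """Mean ranking position across all recorded (position, count) pairs."""
    total = sum(count for _, count in impressions)
    return sum(pos * count for pos, count in impressions) / total

# Hypothetical impressions for one query: (position, impression count).
human_views = [(3, 900), (7, 400)]   # real searchers, top of page one
bot_views = [(85, 300), (92, 250)]   # crawlers paging deep via ?num=100

with_bots = average_position(human_views + bot_views)
without_bots = average_position(human_views)

print(f"Average position with crawler noise:      {with_bots:.1f}")     # ~29.2
print(f"Average position after parameter removal: {without_bots:.1f}")  # ~4.2
print("Reported impressions: 1,850 -> 1,300")
```

Nothing about the site’s actual rankings changed; the “improvement” comes entirely from which impressions get counted.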
Evidence From the Field: A Widely Varied Impact
Our analysis reveals a dramatic variance in the artificial “improvement” of average position, directly correlated with the historical intensity of crawler activity in each sector.
- The B2B sector saw the most significant inflation, with average positions appearing to “improve” by a staggering 47%.
- Industries like Legal (39%), AI (38%), and Education (33%) also saw massive shifts, indicating their keywords are heavily monitored.
- On the other end of the spectrum, E-commerce (9%) and Automotive (23%) saw more moderate, but still significant, inflation.
The disparity at a glance:

| Industry | Apparent “improvement” in Average Position |
| --- | --- |
| B2B | 47% |
| Legal | 39% |
| AI | 38% |
| Education | 33% |
| Automotive | 23% |
| E-commerce | 9% |
Why it Matters: This variance is critical. An executive in the B2B space might be led to believe their SEO strategy suddenly became 47% more effective overnight, while a leader in E-commerce might see a more believable 9% gain. Neither reflects a real change in market position. This distortion, combined with a potential drop in total impressions as bot traffic disappears, can lead to misjudged agency and team performance, flawed budget allocation, and a fundamental misunderstanding of your true search visibility.
Key Takeaways for Accurate Performance Measurement
To maintain analytical integrity, leadership teams must adjust how they interpret Search Console data.
- Challenge All “Average Position” Gains: For the next two quarters, treat any claims of significant Average Position improvement with skepticism. Ask your team or agency to isolate this reporting artifact from genuine ranking improvements, using this industry data as a benchmark.
- Correlate Position with Impression Volume: Monitor Average Position alongside total organic impressions. An “improved” average position accompanied by a drop in impressions is a strong signal of this data artifact, not a performance win; true improvement means rankings climb while clicks and high-quality impressions also increase. The sketch after this list shows a simple check.
- Prioritize Impression-Weighted Metrics: Shift your focus from raw Average Position to an impression-weighted average. This metric naturally gives more significance to the top positions that drive clicks and business value, minimizing the noise from low-ranking pages.
- Establish a New Baseline: Mark the date of this change in your reporting. When comparing month-over-month or year-over-year data, account for this artificial lift so the comparison is like-for-like.
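To put the last three takeaways into practice, here is a minimal Python sketch. It assumes you have exported query-level rows from Search Console for windows before and after the change date; the field names and the 20% impression-drop threshold are illustrative assumptions, not anything Google specifies.

```python
from dataclasses import dataclass

@dataclass
class QueryStats:
    query: str
    impressions: int
    avg_position: float  # as reported by Search Console for this query

def impression_weighted_position(rows):
    """Weight each query's position by its impressions, so high-volume
    queries dominate and thin, low-ranking queries add little noise."""
    total = sum(r.impressions for r in rows)
    return sum(r.avg_position * r.impressions for r in rows) / total

def looks_like_artifact(pos_before, pos_after, imp_before, imp_after,
                        imp_drop_threshold=0.20):
    """Heuristic flag: position 'improved' (the number fell) while
    impressions dropped sharply -- likely the reporting artifact."""
    position_improved = pos_after < pos_before
    impressions_dropped = imp_after < imp_before * (1 - imp_drop_threshold)
    return position_improved and impressions_dropped

# Hypothetical two-week windows on either side of the change date.
before = [QueryStats("b2b crm software", 5000, 42.0),
          QueryStats("crm pricing", 2000, 8.0)]
after = [QueryStats("b2b crm software", 3200, 12.0),
         QueryStats("crm pricing", 1900, 7.5)]

pos_b, pos_a = impression_weighted_position(before), impression_weighted_position(after)
imp_b, imp_a = sum(r.impressions for r in before), sum(r.impressions for r in after)

print(f"Impression-weighted position: {pos_b:.1f} -> {pos_a:.1f}")
print(f"Total impressions: {imp_b} -> {imp_a}")
print("Likely artifact" if looks_like_artifact(pos_b, pos_a, imp_b, imp_a)
      else "Possibly a real gain")
```

Reporting the weighted position alongside raw impression counts, as here, makes the new baseline self-documenting in future month-over-month comparisons.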
Conclusion
Google’s technical cleanup to curb crawler activity has inadvertently demonstrated how seemingly straightforward metrics can be dangerously misleading. Without understanding the mechanics behind this shift, executives risk making strategic decisions based on flawed data. True performance analysis requires looking beyond the surface-level numbers to understand the “why” behind the data and its direct impact on business goals.