Why AI FOMO Is Distracting Enterprise SEO Teams From What Actually Drives Performance

Tyson Stockton
31 Mar, 2026 · 6 min read

In this week’s episode of Voices of Search, we spoke with Kaspar Szymanski, Founder and Senior Director of SearchBrothers and former member of Google’s Search Quality team, about why enterprise SEO teams are being distracted by AI-driven FOMO—and what actually drives performance at scale.

Kaspar spent years at Google ensuring quality results for some of the largest search queries in the world, then turned his attention to helping enterprise organizations optimize their search strategies sustainably. Over the past several months, he’s noticed a recurring pattern: teams chasing generative AI, automation, and experimental tools often overlook the fundamentals that still make the biggest impact.

On this week’s episode, he broke down why technical performance, crawl efficiency, and site hygiene are still the levers that move the needle for enterprise SEO, which common technical issues are quietly reducing visibility, and how organizations can prioritize improvements that produce sustainable growth instead of chasing hype cycles.

Key Takeaways From This Episode

  • AI-driven FOMO is pushing enterprise SEO teams to prioritize experimentation over fundamentals
  • Technical performance and crawl efficiency still drive major enterprise SEO gains
  • Server logs remain one of the most underutilized data sources in enterprise SEO
  • Soft 404 pages and crawl inefficiencies quietly reduce visibility and rankings
  • Many ranking issues are caused by technical hygiene—not algorithm updates
  • Enterprise teams often overinvest in backlinks instead of fixing technical issues
  • Internal stakeholder alignment remains one of the biggest enterprise SEO challenges

AI FOMO Is Driving Enterprise SEO Decision-Making

AI dominates the SEO conversation, but Kaspar notes that this type of industry-wide excitement isn’t new. Over the years, SEO has experienced multiple hype cycles—from mobile-first indexing to voice search and structured data.

“SEO has been declared dead so many times,” Kaspar explains. “Every few years, there’s something new that’s supposed to change everything.”

For enterprise organizations, these cycles often create pressure from leadership to quickly adopt new strategies. Teams are expected to implement AI-driven initiatives, explore automation, and experiment with generative content—sometimes before understanding how those changes will impact performance.

This pressure can shift resources away from foundational improvements, leaving in place the very problems that may already be limiting growth. Kaspar emphasizes that AI has value, but organizations should avoid replacing fundamentals with experimentation.

“Before you chase the next big thing, make sure the fundamentals are in place,” he says. “Otherwise, you’re building on top of existing problems.”

Kaspar also notes that this isn’t just a technical issue—it’s cultural. Teams under pressure to show innovation often prioritize flashy AI experiments because leadership equates “AI” with “progress.” The result: enterprise organizations may end up chasing the latest trend while ignoring issues that quietly suppress visibility, engagement, and conversion at scale.

Enterprise SEO Fundamentals Still Deliver the Biggest Gains

Despite rapid changes in search, many of the most impactful SEO improvements remain consistent—especially for large websites. For enterprise organizations, technical performance and crawl efficiency often determine how well content performs. Improvements in these areas can produce meaningful gains across thousands—or even millions—of pages.

Site Speed Still Matters

Performance remains one of the most reliable areas for improvement. Faster websites improve user experience, increase engagement, and help search engines crawl content more efficiently.

Kaspar highlights that when content quality is similar, performance can become the deciding factor.

“Google’s always going to pick and choose the platform that is faster,” he explains. “If two sites offer similar value, performance matters.”

For enterprise sites, improving template-level performance can create measurable gains at scale. Optimizing images, reducing render-blocking scripts, and leveraging caching across category and product templates often result in immediate improvements without adding new content or links.

Kaspar points out that enterprise teams sometimes focus so heavily on content creation and link building that they overlook page speed at the template level. “Fixing one slow template that is applied across thousands of pages can move the needle much more than adding ten new backlinks,” he says.

Crawl Efficiency Often Limits Growth

Another key issue for large sites is crawl efficiency. Enterprise websites often contain vast volumes of pages, making it difficult for search engines to prioritize the right content. When crawl budgets are wasted on low-value pages, important content may receive less attention.

This can limit indexing and visibility—even when content quality is strong. Poorly configured faceted navigation, duplicate content, and dynamically generated URLs are common culprits. Kaspar explains that these issues quietly suppress rankings because search engines spend resources crawling pages that add little SEO value.

“Crawl efficiency is especially critical for large e-commerce sites,” Kaspar notes. “If Googlebot spends all its time crawling filtered or duplicate pages, your main product pages might never get the attention they deserve.”
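The faceted-navigation problem Kaspar describes is often mitigated at the robots.txt level. The sketch below is illustrative only — the parameter names and paths are assumptions, not recommendations for any specific site:

```
# Illustrative robots.txt sketch — parameter and path names are
# assumptions for the example, not a template to copy verbatim.
User-agent: *
# Keep crawlers out of filter and sort permutations that duplicate category pages
Disallow: /*?color=
Disallow: /*?sort=
Disallow: /*&page=
# Internal search results rarely earn their crawl budget
Disallow: /search
```

One caveat worth noting: robots.txt controls crawling, not indexing — a disallowed URL can still appear in the index if other pages link to it, so canonical tags and internal linking still matter alongside any crawl rules.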

Server Logs Reveal What Search Engines Are Actually Doing

One of Kaspar’s strongest recommendations is analyzing server logs. While many enterprise organizations collect this data, few actively use it for SEO. Server logs provide visibility into how search engines crawl a website and expose bottlenecks that traditional analytics can’t detect.

“Server logs show you what Google is actually doing,” Kaspar explains. “And that’s often very different from what teams assume.”

This data can reveal:

  • Pages receiving excessive crawl attention
  • Important pages being ignored
  • Redirect chains and crawl inefficiencies
  • Low-value pages consuming crawl budget
  • Technical bottlenecks affecting indexing

Kaspar emphasizes that server logs are particularly useful for uncovering orphaned pages, duplicate content, or patterns where bots repeatedly hit 404s or soft 404s. By analyzing this data, enterprise teams can adjust robots.txt, update internal linking, and prioritize pages that actually drive visibility.
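As a rough illustration of the kind of analysis Kaspar recommends, the sketch below parses standard "combined" format access-log lines, filters for Googlebot, and tallies which paths and status codes consume crawl activity. The log lines, field layout, and thresholds here are illustrative assumptions, not real data:

```python
import re
from collections import Counter

# Minimal sketch: parse Apache/Nginx "combined" access-log lines and
# summarize what Googlebot is actually crawling.
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def summarize_googlebot(lines):
    """Return (hits per path, hits per status code) for Googlebot requests."""
    paths, statuses = Counter(), Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue  # skip unparseable lines and non-Googlebot traffic
        paths[m.group("path")] += 1
        statuses[m.group("status")] += 1
    return paths, statuses

# Illustrative sample lines (not real log data)
sample = [
    '66.249.66.1 - - [01/Mar/2026:10:00:00 +0000] "GET /products?color=red HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [01/Mar/2026:10:00:01 +0000] "GET /products/widget HTTP/1.1" 404 128 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.5 - - [01/Mar/2026:10:00:02 +0000] "GET / HTTP/1.1" 200 2048 "-" "Mozilla/5.0"',
]

paths, statuses = summarize_googlebot(sample)
print(paths.most_common(5))  # which URLs consume the most crawl budget
print(statuses)              # how often bots hit 404s, redirects, etc.
```

In practice, a report like this makes it easy to spot filtered or duplicate URLs soaking up crawl budget while key pages go untouched — and note that production logs should be verified against Google's published crawler IP ranges, since the user-agent string alone can be spoofed.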

Soft 404 Pages Quietly Reduce Site Performance

Soft 404 pages return a 200 status code but provide little or no meaningful content. From a technical perspective, these pages appear valid, but from a user perspective, they often deliver poor experiences.

“Technically, everything looks fine,” Kaspar explains. “But the user experience is not there.”

Soft 404s commonly appear in:

  • Out-of-stock product pages
  • Thin category pages
  • Filtered search results
  • Pagination issues

Over time, these pages can dilute overall site quality, reduce crawl efficiency, and harm rankings. For large enterprise websites, soft 404s can create a cascading effect where crawlers spend time on low-value pages instead of discovering new content, effectively limiting growth potential across thousands of pages.
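One way to hunt for the soft 404s described above is to flag pages that return 200 but carry almost no body text, or render an "empty" template. This is a minimal sketch over already-fetched pages; the word-count threshold and phrase list are illustrative assumptions to tune against your own templates:

```python
import re

# Phrases that often signal an "empty" template; illustrative, not exhaustive.
EMPTY_PHRASES = ("no results found", "0 products", "currently unavailable", "page not found")

def is_soft_404(status, html, min_words=80):
    """Flag a 200 page with almost no body text, or an 'empty' template."""
    if status != 200:
        return False                      # real error codes are handled elsewhere
    text = re.sub(r"<[^>]+>", " ", html)  # crude tag stripping for the sketch
    if len(text.split()) < min_words:
        return True
    return any(p in text.lower() for p in EMPTY_PHRASES)

# Illustrative (url, status, html) records, e.g. from a crawl export
pages = [
    ("/category/widgets", 200, "<html><body>" + "word " * 300 + "</body></html>"),
    ("/category/empty", 200, "<html><body><h1>No results found</h1></body></html>"),
    ("/old-page", 404, "<html><body>Gone</body></html>"),
]

flagged = [url for url, status, html in pages if is_soft_404(status, html)]
print(flagged)  # → ['/category/empty']
```

Flagged URLs can then be reviewed individually: a genuinely empty page might warrant a real 404 or 410, while an out-of-stock product page might instead need restocking messaging or a redirect to a relevant category.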

Technical Hygiene Issues Often Get Blamed on Algorithm Updates

Many enterprise SEO teams attribute performance drops to algorithm updates. But Kaspar notes that technical issues are often responsible. 

Common problems include:

  • Canonical conflicts
  • Noindex errors
  • Orphaned pages
  • Slow time-to-first-byte
  • Redirect chains

“These are often basic issues,” Kaspar explains. “But at scale, they can have a significant impact.”

Because these issues develop gradually, they may go unnoticed until rankings decline. Enterprise teams often spend time analyzing competitors or chasing algorithm changes, when a careful audit of internal technical issues would resolve more visibility loss in less time.
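The kind of internal audit suggested above can be partly automated. The sketch below checks fetched HTML for two of the listed problems — stray noindex tags and canonicals pointing at a different URL. The page data is a made-up illustration; in practice the input would come from a site crawl:

```python
import re

def audit_page(url, html):
    """Return a list of hygiene issues found in one page's HTML (sketch only)."""
    issues = []
    # A noindex directive on a page you want ranked is almost always a bug.
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I):
        issues.append("noindex")
    # A canonical pointing elsewhere hands this page's signals to another URL.
    m = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I)
    if m and m.group(1).rstrip("/") != url.rstrip("/"):
        issues.append(f"canonical points to {m.group(1)}")
    return issues

# Illustrative pages (URLs and markup are assumptions for the example)
pages = {
    "https://example.com/a": '<link rel="canonical" href="https://example.com/b">',
    "https://example.com/b": '<meta name="robots" content="noindex,follow">',
    "https://example.com/c": '<link rel="canonical" href="https://example.com/c/">',
}

for url, html in pages.items():
    print(url, audit_page(url, html))
```

Run over a full crawl, checks like these surface the gradual, self-inflicted problems Kaspar describes before they get misdiagnosed as an algorithm update. (A production audit would use a real HTML parser rather than regexes, and would also cover redirect chains and orphaned pages.)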

The Backlink Myth Still Influences Enterprise SEO

Kaspar challenges a common assumption: that backlinks are always the top priority. While links remain important, many enterprise sites already have strong authority. In these cases, technical improvements often deliver greater returns.

“The vast majority of websites do not need more backlinks,” he says. “They need technical improvements.”

Enterprise teams frequently overinvest in link acquisition campaigns while low-hanging opportunities in technical performance and crawl optimization remain unaddressed. Shifting focus to fundamentals can provide measurable growth without the cost and unpredictability of outreach campaigns.

Enterprise SEO Success Requires Internal Alignment

Technical improvements often require collaboration across multiple teams. 

Enterprise SEO initiatives may involve:

  • Engineering teams
  • Product teams
  • Content teams
  • Leadership stakeholders

Without alignment, recommendations may not be implemented—even when they offer significant impact. Kaspar notes that communication and stakeholder management are critical components of enterprise SEO.

“Enterprise SEO is not just technical,” he explains. “It’s also about getting things done within organizations.”

Kaspar advises defining clear ownership, measurable KPIs, and alignment across departments to ensure technical recommendations are executed efficiently. Without this alignment, even high-impact improvements can languish indefinitely.

The Fundamentals Still Drive Enterprise SEO Success

AI, automation, and emerging technologies are reshaping the SEO industry. But as Kaspar explains, many of the biggest gains still come from foundational improvements. Technical performance, crawl optimization, and site architecture remain essential—especially for large enterprise websites.

Organizations that prioritize fundamentals while evaluating new technologies thoughtfully are more likely to achieve sustainable growth. Enterprise teams that balance experimentation with methodical technical audits avoid wasted effort and create scalable, long-term results.

As SEO continues to evolve, one principle remains consistent: strong foundations create the best opportunities for long-term success.

Voices of Search is an SEO and content marketing podcast hosted by Jordan Koene and Tyson Stockton. The show delivers actionable strategies and data-driven insights to help marketers navigate the ever-evolving world of search engine optimization and content marketing. New episodes air weekly, covering everything from technical SEO to AI discovery, featuring industry leaders and practitioners sharing real-world frameworks and proven tactics.

Subscribe to Voices of Search on Apple Podcasts, Spotify, or your favorite podcast platform. Follow Previsible on LinkedIn for updates and subscribe to the VOS YouTube channel for video episodes and clips. You can also visit the official VOS site to explore the full episode archive and submit your SEO questions for future episodes.

Tyson Stockton is an SEO educator and strategist bridging the gap between technical SEO teams and organizational leadership. As co-founder and COO of Previsible.io, Tyson empowers Fortune 500 companies through strategic consulting, team development, and recruitment, while sharing industry insights as host of the Voices of Search podcast to help SEO professionals advance their careers.

