Introduction

LLMs are just beginning to reshape the search landscape. A recent study found that 37% of US respondents already consider AI-enhanced search results more satisfying than traditional search results. 

Google remains the dominant search product globally, both in total visits to the search engine and in searches per visit.


However, a cohort of new search entrants (e.g., Microsoft Copilot and Perplexity) incorporating LLMs into their offerings, and, conversely, LLM chatbots (e.g., ChatGPT and Claude) that deliver credible answers to search queries, has rather quickly resurrected a competitive arena just as Google’s users have begun to express discontent at the erosion of the tech giant’s flagship product.

Google is not sitting still either, launching its own LLM competitor, Gemini, and infusing SERPs with AI Overviews, which provide generated answers to queries that appear alongside the traditional list of links.

Although the roll-out of LLM-powered search is still limited, its innovative features and hot competition indicate a potential near-future of AI-search ubiquity that content creators would be wise to anticipate. Unsurprisingly, the practice of SEO will need to be rethought to adapt to this evolving search paradigm, as LLM-powered search differs markedly in both its user experience and its favored content. In this post, we explore what optimizing for search in the age of LLMs looks like.

Evolution of Search Engines with LLMs

Transition from Keyword-Based Search to NLP

Historically, search engines relied heavily on keyword-based algorithms to rank and present search results. These algorithms were designed to match search queries with the most relevant content based on the presence of specific keywords. It’s hard to overstate the success of this approach: it propelled Google to the top tier of market capitalization, became a critical feature of the most popular personal computing products, and uplifted the fortunes of countless forward-thinking publishers, content creators, and niche retailers. But the feedback loop between keyword-based searches and user behavior has formed a pattern of search practice that often obscures the full intent behind search queries.

The introduction of LLMs is changing this paradigm. LLMs, powered by advancements in natural language processing, can better understand the context, intent, and semantics of queries. They can process complex language structures, recognize synonyms, and grasp more nuances of human language, allowing them to deliver potentially more accurate and contextually relevant results.
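To see why pure keyword matching can miss intent, consider two queries that mean roughly the same thing but share no vocabulary. The toy sketch below (illustrative only, not any engine’s actual algorithm) uses Jaccard word overlap as a crude stand-in for classic lexical matching:

```python
def keyword_overlap(query: str, document: str) -> float:
    """Jaccard similarity over lowercase word sets - a crude
    stand-in for classic keyword-based matching."""
    q, d = set(query.lower().split()), set(document.lower().split())
    return len(q & d) / len(q | d) if q | d else 0.0

# Two phrasings of the same intent share zero words, so a purely
# lexical matcher scores them as completely unrelated.
print(keyword_overlap("best running shoes", "top sneakers for jogging"))  # 0.0
```

An LLM-based retriever instead compares meaning (typically via embedding vectors), so synonymous phrasings like these land close together even with zero lexical overlap.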

Key Milestones in the Adoption of LLMs by Major Search Engines

LLM integration into search engines has progressed rapidly, and several key milestones have been noted. Google’s implementation of models like BERT (Bidirectional Encoder Representations from Transformers) and, later, MUM (Multitask Unified Model) were among the first steps toward incorporating NLP capabilities into search. These models improved the search engine’s ability to understand the context of words within a query rather than just matching keywords.

More recently, Google’s Gemini, Microsoft’s Copilot (which integrates OpenAI’s GPT models into Bing), Perplexity, and niche-purpose engines like Picnic for healthcare and Swiggy for food delivery represent a further leap in this direction. These LLMs are designed to generate rich, comprehensive, direct answers to user queries—rather than the resource lists of traditional SERPs—by processing vast amounts of content, identifying relevant material within the indexed pages, and coherently summarizing it.

Current State of LLM-Powered Search and Predictions for the Future

Today, LLM-powered search engines are profoundly impacting search, enhancing the quality of results and changing the way users interact with search engines. Users of LLM search are increasingly asking more complex, conversational questions, expecting search engines to understand them and respond with precise answers.


Looking forward, we can expect LLMs, especially with the upcoming rollout of SearchGPT, to become even more sophisticated in their understanding of human language. This evolution will likely lead to further changes in search behavior, with users relying more on voice search and conversational queries. For SEO professionals, this means continually adapting strategies to align with these developments.

The Changing Landscape of SEO

Content Creation in the Era of LLMs

As LLMs become more integral to search engines, how content is created and optimized for SEO must evolve. In the past, SEO content strategies often focused on keyword matching, within both the content and its backlinks, as a primary vector of visibility. LLMs evaluate contextual relevance and informativeness differently, favoring content that provides topical clarity and reads naturally over content that places keywords in prominence. These priorities are not necessarily mutually exclusive, but keyword-optimized content has often received good search visibility despite—or maybe even because of—its unnatural overuse of keywords, while LLMs would likely find such content harder to comprehend.

Additionally, as LLMs excel at processing and understanding longer, more detailed pieces of content and extracting relevant information from within the text, creating comprehensive content that answers a wide range of related queries—until very recently regarded as insufficiently focused, counter to best practices—is becoming a viable tactic for SEO success.

The Diminishing Focus on Keywords

While keywords are still important, their role in SEO is changing. LLMs rely less on keywords to determine relevance. Instead, they consider the overall theme and context of the content. This shift means that SEO strategies should focus on creating content thematically relevant to the target audience’s needs and interests instead of obsessing over keyword density.

For example, rather than targeting a single keyword like “best running shoes,” content that covers a group of related topics—such as shoe materials, race and route distances, terrain, running techniques, and foot health, and how their interaction informs the choice of running shoe—will likely align better with the way LLMs process information, enabling them to better comprehend natural relationships between topics and match content with a broader range of search queries.

Structured Data and Semantic Markup

Structured data and semantic markup help search engines understand and categorize content more deeply. Text content often contains data that is easily digestible for readers but requires machines to invest significant additional processing to identify and interpret. Semantic markup exposes such data in structured form to spiders and automated agents without impacting readability. By providing search engines with detailed metadata about the content on a page, structured data can improve how that content is indexed and ranked.

For instance, using semantic markup to identify products, reviews, recipes, or events can lower the processing threshold for LLMs to extract relevant information and present it in search results. The semantic markup enhances visibility and improves the likelihood of appearing in rich snippets, knowledge panels, and other prominent positions in search results.
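As a concrete illustration, here is a schema.org Product snippet in JSON-LD (the product name and values are hypothetical placeholders) that makes price, availability, and rating machine-readable without altering the visible page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trail Runner X",
  "description": "Lightweight trail-running shoe with a grippy outsole.",
  "offers": {
    "@type": "Offer",
    "price": "129.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "212"
  }
}
</script>
```

A crawler or LLM pipeline can lift the price and rating directly from this block rather than inferring them from the surrounding prose.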

Assessing the adoption of semantic markup takes time and effort. Several studies from the mid-2010s put the share of domains publishing semantic markup at somewhere between roughly 1.5% and 35%, but there doesn’t appear to be any research on the topic fresher than 2017. The adoption of semantic markup does, however, offer tangible search rewards. Rich results (i.e., semantically enhanced SERP links) boast a higher average CTR than unenhanced results (58% vs. 41%), and, in a Google/Rakuten case study, semantic markup boosted search traffic 2.7x and session duration 1.5x. As reported previously, CTR, session duration, and other user behavior metrics play a much more significant role in Google search performance than company representatives have suggested.

Technical SEO Adaptations

Optimizing for Core Web Vitals

Core Web Vitals (CWVs) are a set of performance metrics that Google uses to assess the user experience of a webpage. These metrics include Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS), which measure loading speed, responsiveness to interaction, and visual stability, respectively. (In March 2024, Google began replacing FID with Interaction to Next Paint, or INP, a broader responsiveness metric.) Google’s “good” thresholds are an LCP within 2.5 seconds, an FID under 100 milliseconds, and a CLS below 0.1.

A site that scores well across these metrics will not only perform better in search but also provide a better user experience. Ensuring that your website meets or exceeds these benchmarks can improve organic search performance, but good user experience is a blanket recommendation that, at first glance, seems as applicable to standard search as to LLM-enhanced search.

For this reason, I asked Gemini for clarification.

My query:

I'm asking about content visibility in Gemini, for example. Would the CWV of a third-party publisher's site matter more to its content's visibility in Gemini than it would in Google Search?

Gemini’s response:

Yes, CWV (Core Web Vitals) can significantly impact content visibility in Gemini compared to Google Search.

Here's why:

  1. Gemini's Focus on User Experience: Gemini is designed to prioritize user experience. This means that factors like page load speed, interactivity, and visual stability, which CWV measures, are more likely to influence how Gemini ranks and displays content.
  2. LLM's Reliance on User Feedback: Gemini's LLM relies heavily on user feedback and interactions to improve its performance. A website with high CWV scores is more likely to provide a positive user experience, leading to more engagement and better data for the LLM to learn from.
  3. Gemini's Emphasis on Relevance and Quality: While relevance and quality are still important factors in Gemini, a poor user experience can significantly impact how content is perceived and ranked. High CWV scores help ensure that content is accessible, engaging, and valuable to users.

In contrast, while CWV is still considered in Google Search, it may not be as heavily weighted as relevance, authority, and other ranking factors. This is because Google Search focuses on providing comprehensive and informative results, regardless of the user experience.

Therefore, for content to be easily discoverable and visible in Gemini, it's essential to prioritize high CWV scores. This will improve user experience and help Gemini's LLM understand and rank content more effectively.

The key takeaway is that Gemini and Google Search assign different values to user experience and relevance while still trying to accomplish similar tasks. As good optimization maximizes ROI, i.e., not necessarily executing every possible optimization but allocating labor and resources efficiently to deliver the most potential positive impact, SEOs should aim to achieve the highest-quality content with the strongest user experience possible. Still, the LLM future increases the cardinal utility of UX improvement vis-à-vis content quality, though perhaps not the ordinal.

Mobile-First Indexing and Its Continued Relevance

In 2016, Google first announced it had begun implementing mobile-first indexing, primarily using a site’s mobile version for indexing and ranking. As mobile search traffic has steadily climbed over the years, with mobile devices now accounting for around 60% of all search traffic, that practice has expanded dramatically. Google announced earlier this summer that it would no longer index sites that are inaccessible on mobile devices.

This means optimizing content layout, ensuring fast load times on mobile, and using responsive design principles. With LLMs favoring good user experience, a site that performs well on mobile devices will likely fare better in search rankings.

Voice Search Optimization

Despite somewhat limited fanfare, voice search is quite popular, with 41% of US adults reporting using voice search daily and 90% of voice search users reporting that it's easier and faster than standard search! LLMs are tailor-made for voice search, offering substantially better language interpretation and enabling more conversational, spoken queries. As everyone with a smartphone carries at least one voice assistant, and those assistants can leverage LLM-powered search to generate and speak responses, both the interface and the underlying technology make search viable in fully hands-free environments, like driving, cycling, or washing dishes.

To optimize for voice search, consider creating FAQ pages, using natural language in content, and focusing on common questions sourced from forums and social media posts rather than keyword research tools. This approach aligns with how LLMs process voice queries and can improve your chances of surfacing in voice search results.
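One concrete tactic combines the two threads above: mark up an FAQ page with schema.org’s FAQPage type so each question–answer pair is explicit to crawlers and voice assistants. The question and answer below are hypothetical placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "How often should I replace my running shoes?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Most manufacturers recommend replacing running shoes every 300 to 500 miles, depending on terrain and gait."
    }
  }]
}
</script>
```

Because the answer is already isolated as a short, self-contained string, a voice assistant can read it aloud verbatim rather than having to extract it from surrounding copy.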

Adapting SEO Strategy for LLM-Driven Search Results

Importance of Topical Authority and E-E-A-T (Expertise, Experience, Authoritativeness, Trustworthiness)

In the era of LLM-powered search, building topical authority and demonstrating E-E-A-T (Expertise, Experience, Authoritativeness, and Trustworthiness) is more important than ever. LLMs prioritize content from sources recognized as authorities in their respective fields.

To build topical authority, focus on creating comprehensive content that covers a wide range of subtopics related to your main subject. This can involve developing content clusters that provide in-depth coverage of a specific topic. Additionally, ensuring that your content is authored by experts and backed by credible sources can enhance your site’s E-E-A-T, making it more likely to rank well in LLM-powered searches.

Content Clustering and Internal Linking

Content clustering is an effective strategy for improving topical relevance and authority. You can create a comprehensive resource on a specific topic by grouping related content together and linking it with a strong internal linking structure.

This approach helps users navigate your site more efficiently and signals to LLMs that your site is a valuable resource for that particular topic. When implementing content clusters, ensure that each piece of content is optimized for related keywords and that internal links connect them logically.
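The pillar-and-cluster linking discipline described above can be sanity-checked programmatically. Below is a minimal sketch (the page paths and link map are hypothetical) that verifies every cluster page links back to its pillar:

```python
# Hypothetical internal-link map: page path -> set of pages it links to.
links = {
    "/running-shoes-guide": {"/shoe-materials", "/terrain-types", "/foot-health"},
    "/shoe-materials": {"/running-shoes-guide"},
    "/terrain-types": {"/running-shoes-guide", "/shoe-materials"},
    "/foot-health": {"/running-shoes-guide"},
}

def cluster_is_connected(pillar: str, links: dict[str, set[str]]) -> bool:
    """True if every cluster page (i.e., every page the pillar links to)
    links back to the pillar, closing the cluster loop."""
    cluster = links[pillar]
    return all(pillar in links.get(page, set()) for page in cluster)

print(cluster_is_connected("/running-shoes-guide", links))  # True
```

An audit script like this, run over a real crawl of your site, flags orphaned cluster pages before they dilute the topical signal.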

Leveraging AI Tools for SEO

As LLMs become more sophisticated, using AI tools to analyze search trends, optimize content, and refine SEO strategies is becoming increasingly important. AI-powered tools can help you understand how LLMs interpret content, identify keyword opportunities, and predict changes in search engine algorithms.

For example, AI tools can analyze large datasets to uncover patterns in search behavior, allowing you to adapt your content strategy accordingly. They can also assist in creating content, suggesting topics, headlines, and keywords that will likely perform well in LLM-powered searches.

Continuous Monitoring and Adaptability

The SEO landscape constantly evolves, especially with the integration of LLMs into search engines. To stay competitive, it is essential to continuously monitor your SEO performance and adapt your strategies as needed. Regularly updating content, conducting SEO audits, and staying informed about changes in search algorithms are all crucial for maintaining a strong online presence.

By remaining agile and responsive to changes in the search environment, you can ensure that your SEO efforts remain effective despite ongoing advancements in LLM technology.

The Future of SEO in the Age of LLMs

Predicted Trends in LLM-Powered Search

As LLMs continue to evolve, we can expect several trends to shape the future of SEO:

Increased Focus on User Intent

LLMs will become even better at understanding user intent, making it essential for content creators to focus on addressing specific needs and questions rather than just optimizing for keywords.

Growth of Conversational Search

As voice search becomes more prevalent, SEO strategies must adapt to the conversational nature of these queries, emphasizing natural language and direct answers.

Greater Importance of Video and Visual Content

LLMs are also becoming more adept at processing video and visual content, meaning optimizing multimedia content for search will become increasingly important.

Preparing for Ongoing Changes

To future-proof your SEO efforts, it’s important to:

Stay Informed

Follow industry blogs, attend conferences, and participate in SEO communities to keep up with the latest developments in LLM technology and search engine algorithms.

Be Agile

As the search landscape continues to evolve, remain flexible and willing to experiment with new strategies. Regularly test and refine your SEO tactics to ensure they remain effective.

Invest in AI Tools

Leverage AI-powered tools to gain insights into search trends, optimize content, and anticipate changes in search engine behavior.

Conclusion

Integrating LLMs into search engines marks a tectonic shift under the feet of content creators and publishers. As these models evolve, SEO practices must adapt by focusing on high-quality content, topical authority, natural language, user experience, and technical SEO. By staying informed, embracing change, and leveraging AI tools, you can develop new SEO strategies that remain effective despite the changes that LLM-powered search ushers in.

As competition and innovation are set to reshape search more rapidly than in the last decade, SEO, perennially pronounced dead, may ascend to peak importance in the publisher toolbox. Those willing to adapt will find themselves well-positioned to effectively serve users and expand visibility through LLMs.