February 4th, 2025
AI Search Optimization is the Next Generation of AI SEO
AI Search Optimization (AISO) & AI Agent Optimization (AIAO) are expanding the scope of search engine optimization for business websites. In SEO, we have spent 20+ years fretting about search algorithm rankings resulting from the various factors gleaned by the Google search bot. Well, there is a new bot in town. And its name is…ChatGPT Operator, released on January 24, 2025. It is the first example of a tool businesses can use to create automated processes. Well, supervised autonomy, since the user must still complete verifications and transactions.
The new web, AI Web, if you will, will not only require your website to engage with users and make it easier for the Google search bot to index you; your AI Web website will also optimize the user experience for AI agents. Your business’s website conversions can only increase from facilitating AI agents (bots), such as ChatGPT Operator, in finding information and conducting transactions. Traditional SEO, targeting human users and search engine crawlers, is expanding to accommodate and influence AI search engines (like ChatGPT Search) and to enable AI agents (like ChatGPT Operator) to interact with your website. Does it sound like SEO is getting a lot more complex and difficult? The answer is yes.
Yes, traditional content-focused SEO is getting more competitive: new AI models now mediate search engine results, large language models read webpages themselves instead of linking to the site, and search results pages carry fewer links to the actual website (and more ads and AI summaries).
Yes, facilitating AI agents once they arrive at your website, so that they convert more easily, will be a new task for SEO. The technical details of agent-to-agent website interactions, and how websites will facilitate them, are still to be discovered.
But no, SEO is very much the same process as before for businesses. Users and AI agents alike still use search engines whose results can be influenced via SEO. Bing powers search functions in ChatGPT: ChatGPT Search uses “Microsoft Bing’s index to provide real-time answers to user queries,” according to Yoast. Thus, Bing SEO tactics will apply to some degree. However, with AI-driven search engines we can only influence the results, and as the saying goes, results may vary.
The good thing for marketers is that these artificial intelligence applications are influenced by tasks already conducted for SEO. There are some additional technical tasks focused on facilitating the bots’ or agents’ journey on your website. The big difference artificial intelligence makes to website design and content development is schema markup: make it easy for search bots and agents to find information and conduct transactions.
Let’s break down these new forms of SEO for AI-powered search and AI bots. Given that the SEO ranking factors for generative AI models have not been exposed and there are no industry experts to ask, the team at TESSA has deduced the processes, tasks, and factors that affect AI search and AI bots. In the grand scheme of things, we can call it all SEO since we are conducting the same task: making your website and online presence produce more prospects, customers, and sales.
What is AI Search Optimization?
The practice of tailoring site content and technical elements (like schema markup) so AI-driven search engines—such as ChatGPT Search—can better understand, summarize, and rank it. This focuses on conversational content, structured data, and real-time updates that align with user intent in an AI-augmented search environment.
What is AI Agent Optimization?
Preparing a website to facilitate the AI-powered agents to perform tasks—such as checking inventory, initiating purchases, or retrieving specific data—without direct human input. This involves clear APIs, machine-readable markup, and workflow design that allows bots to interact with site functionality as seamlessly as human users.
The Foundation: Technical SEO for AI Search Engines and AI Agents
Whether you are targeting AI-based search engines or AI agents, your website must make it easy for bots and agents to identify information and initiate actions. The technical infrastructure must be machine-readable, structured, and efficient.
Structured Data & Schema Markup
- For AI Search: Use recognized schema markup such as FAQPage, HowTo, Product, Review, and Offer to give ChatGPT Search the ability to understand and quote your content.
- For AI Agents: Extend schema with official “Actions” like BuyAction or BookAction if applicable, so AI agents can conduct basic interactions (e.g., reservations, purchases).
Example: Product pages with price, availability, and aggregateRating schemas improve visibility in ChatGPT Search and let AI agents check stock or initiate purchases through recognized Offer or BuyAction details.
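As a sketch, the product markup described above might be emitted as JSON-LD like this; the names, prices, and ratings are placeholder values, not real data:

```python
import json

# Minimal Product schema with a nested Offer and aggregateRating.
# All values here are hypothetical placeholders.
product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Wireless Headphones",
    "offers": {
        "@type": "Offer",
        "price": "49.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "213",
    },
}

# Embed the result in the page inside <script type="application/ld+json">…</script>
json_ld = json.dumps(product_schema, indent=2)
print(json_ld)
```

The nested Offer block is what lets a bot read price and stock status without scraping the visible page.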
APIs & Real-Time Data Access
- For AI Search: Provide real-time data (e.g., inventory, pricing) via APIs to ensure ChatGPT Search delivers accurate, up-to-date answers.
- For AI Agents: Expose transactional APIs (e.g., checkout, booking) using OpenAPI specs so AI agents can programmatically complete tasks.
It should be noted that the new Computer-Using Agent (CUA) in ChatGPT Operator may not need APIs to conduct transactions, since it operates a browser much like a human user. At its initial release, however, the tool did not have fully autonomous task execution.
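A minimal OpenAPI description of a transactional endpoint could look like the following sketch; the /checkout path and its fields are hypothetical illustrations, not a real API:

```python
import json

# Sketch of an OpenAPI 3.0 document for a hypothetical checkout endpoint.
# Path, fields, and titles are illustrative assumptions only.
openapi_spec = {
    "openapi": "3.0.3",
    "info": {"title": "Store Checkout API (example)", "version": "1.0.0"},
    "paths": {
        "/checkout": {
            "post": {
                "summary": "Create an order",
                "requestBody": {
                    "required": True,
                    "content": {
                        "application/json": {
                            "schema": {
                                "type": "object",
                                "properties": {
                                    "sku": {"type": "string"},
                                    "quantity": {"type": "integer", "minimum": 1},
                                },
                                "required": ["sku", "quantity"],
                            }
                        }
                    },
                },
                "responses": {"201": {"description": "Order created"}},
            }
        }
    },
}

print(json.dumps(openapi_spec, indent=2)[:60])
```

Publishing a machine-readable spec like this is what would let an agent discover the required fields (sku, quantity) instead of guessing at a form.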
AI-Crawler-Friendly Architecture
- Simplify navigation with clean URL structures and XML sitemaps.
- Use dedicated pathways (e.g., /ai-help) with stripped-down, text-heavy content for AI agents.
- Consider standard HTTP headers or recognized schema properties (e.g., availabilityStarts, availabilityEnds) to indicate time-sensitive content to both search engines and agents.
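The sitemap point above can be sketched in a few lines; the URLs and dates are placeholders:

```python
import xml.etree.ElementTree as ET

# Sketch: generate a minimal XML sitemap with <lastmod> hints so crawlers
# can see which pages changed recently. URLs and dates are hypothetical.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
pages = [
    ("https://example.com/", "2025-02-01"),
    ("https://example.com/ai-help", "2025-02-03"),
]

urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```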
Content Creation Guidelines for AI Powered Search Engines
Content creation must be both human readable and understood by AI systems. By adopting a conversational approach, aligning with user intent, and keeping information fresh, your site accommodates the new AI use cases—whether that’s direct Q&A, dynamic pricing, or real-time updates—while still engaging genuine human visitors.
Conversational Content Creation
- For AI Search: Ensure content is written in natural language, answering questions directly (e.g., “How do I reset my password?”).
- For AI Agents: Optimize content with bullet points, tables, and FAQs to help agents parse and act on information.
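For the Q&A-style content above, FAQPage markup pairs each visible question with its answer so an AI system can quote it directly; a minimal sketch, with illustrative question text:

```python
import json

# Sketch of FAQPage markup. The Q&A text is illustrative; each mainEntity
# entry should mirror a question actually visible on the page.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "How do I reset my password?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Click 'Forgot password' on the login page and follow the emailed link.",
            },
        }
    ],
}

print(json.dumps(faq_schema, indent=2))
```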
Intent-Driven SEO
- Target long-tail keywords and semantic phrases that match user intent and are commonly used in search phrases (e.g., “budget wireless headphones under $50”).
- Use unambiguous CTAs (for example, “Buy Now,” “Schedule Appointment”) so AI agents can identify transactional opportunities in the right context.
Dynamic & Fresh Content
- As a routine task, update time-sensitive pages (e.g., company promotions, event listings).
- Use webhooks or real-time notifications to ensure any integrated AI agent receives up-to-date info (e.g., price drops, new inventory).
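A webhook notification like the price-drop example above might be sketched as follows; the endpoint URL and payload fields are assumptions rather than any standard contract:

```python
import json
import urllib.request

# Sketch: build a price-drop event payload and (optionally) POST it to a
# hypothetical subscriber endpoint. Field names are illustrative; a real
# integration would define its own contract with the agent platform.
def build_price_drop_event(sku: str, old_price: float, new_price: float) -> dict:
    """Assemble a machine-readable event an AI agent could act on."""
    return {
        "event": "price_drop",
        "sku": sku,
        "old_price": old_price,
        "new_price": new_price,
    }

def notify(subscriber_url: str, event: dict) -> None:
    """POST the event as JSON (not executed in this sketch)."""
    req = urllib.request.Request(
        subscriber_url,
        data=json.dumps(event).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # add retries/error handling in production

event = build_price_drop_event("HDPH-50", 59.99, 49.99)
print(event["event"])
```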
User Experience: Designed for Bots and Humans
UX now extends beyond human website visitors to include AI agents. By refining site navigation, ensuring robust transactional flows, and emphasizing trust signals, your business can increase website engagement and conversions not only for human users but also for AI bots.
AI-Optimized Navigation
- Implement breadcrumbs with schema markup (BreadcrumbList) to clarify page hierarchies for AI crawlers or other AI tools.
- Use descriptive anchor text (e.g., “View return policy”) to guide AI agents through workflows, making collaboration between your website and visiting agents seamless.
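The breadcrumb markup mentioned above could be generated as in this sketch; the page names and URLs are illustrative:

```python
import json

# Sketch: BreadcrumbList markup mirroring a visible breadcrumb trail.
# Names and URL slugs are hypothetical.
crumbs = ["Home", "Audio", "Wireless Headphones"]
breadcrumb_schema = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {
            "@type": "ListItem",
            "position": i + 1,
            "name": name,
            "item": f"https://example.com/{name.lower().replace(' ', '-')}",
        }
        for i, name in enumerate(crumbs)
    ],
}

print(json.dumps(breadcrumb_schema, indent=2))
```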
Transactional System Support
- Optimize forms with clear labels, ARIA roles (e.g., role="navigation"), and machine-readable error codes (e.g., payment_declined).
- Support multi-step workflows (e.g., checkout flows) with session tokens that maintain context for AI agents across steps.
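A machine-readable error response of the kind described above might look like this sketch; the error codes and fields are illustrative, not a standard:

```python
import json

# Sketch: a checkout error response pairing a human-readable message with a
# stable machine-readable code, so an agent can branch on the code instead of
# parsing prose. Codes and messages are illustrative.
def payment_error_response(code: str) -> dict:
    messages = {
        "payment_declined": "The card was declined. Try another payment method.",
        "address_invalid": "The shipping address could not be verified.",
    }
    return {
        "error": {
            "code": code,
            "message": messages.get(code, "Unknown error."),
            "retryable": code == "payment_declined",
        }
    }

print(json.dumps(payment_error_response("payment_declined")))
```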
Trust & Credibility
- Strengthen E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) with author bios, case studies, and reviews.
- Add trust signals (SSL badges, testimonials) to reassure users and AI systems.
AI Likes to E-E-A-T: Experience, Expertise, Authoritativeness, Trustworthiness
In its Search Quality Evaluator Guidelines, Google gave us E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) as an umbrella term for overall-credibility signals like “human authority” and “authorship.” AI Search also uses E-E-A-T, albeit with the additional ability to evaluate the content’s intent to better match the user’s search intent. As AI tools like ChatGPT Search, Perplexity, You.com, Brave Search, and DeepSeek evolve, their ability to assess E-E-A-T should be superior to the Google of old, noting that the new Google search has also integrated AI Overviews into its search results.
How Do AI Search Engines Assess E-E-A-T?
AI Search not only assesses the content creator’s credibility; it also assesses the validity of the statements in the content itself. Thus, it evaluates:
- Experience: AI can analyze when the content uses firsthand accounts and personal narratives to gauge the author’s direct experience with the subject matter. For instance, a blog post about the author’s experience doing research or conducting field work could signal genuine “Experience.”
- Expertise: By evaluating the depth and accuracy of information, AI can determine the level of expertise demonstrated in the content. Compare this to Google’s use of latent semantic indexing (LSI) that looks for commonly used phrases in the most highly ranked articles, which doesn’t “understand” whether it demonstrates expertise or not.
- Authoritativeness: AI search, like traditional search engines, can assess the reputation of both the content creator and the website by analyzing backlinks, citations, and social mentions in reputable sources.
- Trustworthiness: AI, like traditional search engines, can evaluate trust signals such as secure website protocols (HTTPS), transparent author information, clear privacy policies, and positive user reviews. Beyond traditional search, it can also detect misinformation or biased content by cross-referencing multiple reliable sources.
AI Search Engines vs. Google Search E-E-A-T
AI-powered search engines have advantages over traditional search engines, such as Google, with regard to assessing E-E-A-T.
Evolving Knowledge and Real-Time Ranking
AI search engines continuously learn and adapt based on new data, enabling them to assess E-E-A-T factors at the time the user searches:
- Continuous Learning: AI models update their understanding of what constitutes expertise and authority as new information becomes available, allowing for more accurate assessments over time.
- Real-Time Ranking: AI can evaluate the freshness and relevance of a website’s content at the time of the user’s query, rather than relying on a periodically updated index like Google Search.
Semantic Understanding
AI Search uses semantic understanding which relies on Natural Language Processing (NLP) and machine learning algorithms to interpret the meaning behind queries and content. This serves as a reality check against the content’s claims.
- Contextual Relevance: AI can discern the context in which information is presented, ensuring that content aligns not just with the keywords but also with the broader topic and the user’s intent.
- Content Depth and Breadth: AI evaluates whether the content comprehensively covers the topic, providing detailed explanations, supporting evidence, and addressing the question that the user asked.
- Content Summaries: AI can generate summaries and synthesize information from multiple sources, merging authoritative content into a single answer.
Personalized and Context-Aware Search Results
- User Intent Prediction: By analyzing conversational queries, AI can better predict and understand user intent. This enables search results to align more precisely with specific needs, emphasizing content with high E-E-A-T in relevant contexts.
- Follow-up Questions: AI can ask additional questions to better understand the user’s intent.
- Context Retention: AI maintains context across multiple queries within a single session, allowing for a more accurate assessment of E-E-A-T as it builds a comprehensive understanding of the user’s information needs. This ensures subsequent search results are even more tailored and relevant.
- Personalization: AI search engines learn from individual user behaviors, preferences, and past interactions. This functionality is currently limited. ChatGPT paid plans now allow users to opt in to storing personalized data, but these are more preferences and notes (for example, “Runs a small business technology consulting company and is developing an internal software product”). This differs from referencing and building upon past conversations in a new conversation.
Multi-Source Verification
- Cross-Referencing and Fact-Checking: AI can corroborate across multiple reputable sources, plus its existing knowledge, to verify the content’s accuracy and trustworthiness.
Authorship and Source Credibility
- Author Profiling: Similar to Google, AI can analyze an author’s background, qualifications, and previous work to assess their expertise and authoritativeness more thoroughly than traditional search engines.
- Source Credibility Assessment: AI can evaluate the credibility of entire websites and platforms, not just individual pages, providing a broader assessment of the content’s credibility versus traditional search’s reliance on backlink metrics.
Enhanced Trust Signals Recognition
- Visual and Structural Trust Signals: AI recognizes and interprets visual trust signals such as badges, certificates, and secure payment indicators, as well as structural signals like clear navigation and transparent contact information.
- Behavioral Trust Indicators: AI assesses behavioral indicators such as user engagement metrics, time spent on page, and interaction rates to infer trustworthiness and authoritativeness. High engagement levels often correlate with content that effectively meets user needs.
Cost Limitations to AI Search Credibility Checks:
- Despite these advanced capabilities, there are economic and logistical constraints. Extensive E-E-A-T and credibility evaluations by AI search incur processing costs and require additional time. Consequently, while AI has the capability to thoroughly check content for credibility, it is unlikely to do so extensively given the added time and cost. Many of the methods discussed here are not yet common functionality in AI Search; others, such as real-time search updates and personalization, remain limited in current implementations.
Comparative Advantages Over Traditional Search Engines
While traditional search engines like Google have integrated E-E-A-T into their ranking algorithms, AI search offers several more extensive means of assessing E-E-A-T.
- Superior Contextual Understanding: Unlike traditional search engines that primarily rely on keyword matching and backlink profiles, AI search engines utilize deep contextual and semantic analysis to understand the meanings behind queries and content.
- Interactive Refinement of Results: The conversational nature of AI search allows users to engage in dialogues with the search engine, prompting it to provide more precise and authoritative information based on follow-up questions. This enhances the E-E-A-T alignment of the results.
- Enhanced Personalization: AI search engines leverage extensive personalization based on individual user data, behaviors, and preferences. This ensures that content with high E-E-A-T is tailored to meet specific user needs and contexts, achieving a degree of personalization that traditional search engines reach only to a lesser extent.
- Automated and Comprehensive Verification: AI’s ability to cross-reference and fact-check content across multiple sources ensures higher accuracy and trustworthiness in search results. Traditional search engines may not perform this level of automated verification, potentially allowing less credible content to rank higher based solely on link metrics.
- Deeper Author and Source Analysis: AI search engines conduct more in-depth analyses of authors’ credentials and the overall authority of content sources. This comprehensive evaluation surpasses traditional search engines, which may rely more on external signals like backlinks and domain authority without as granular an analysis of content creators.
- Cost Limitations to AI Search Credibility Checks: Despite these advanced capabilities, there are economic and logistical constraints. Extensive background checks and credibility evaluations by AI search incur processing costs and require additional time. Consequently, while AI has the capability to thoroughly check content for credibility, it is unlikely to be done extensively for financial reasons. AI platforms may instead build user databases to customize results, similar to practices already employed by Google.
E-E-A-T in AI-Driven SEO
Although AI Search is capable of contextual understanding and credibility checking, the tactics for optimizing for AI Search are still believed to be largely the same as traditional SEO. For businesses and content creators, this means:
- Prioritizing Quality Over Quantity: Focus on producing high-quality, authoritative, and trustworthy content that genuinely meets user needs and demonstrates Experience and Expertise.
- Building Strong Authoritativeness: Establish a robust online presence through reputable backlinks, authoritative content, and transparent author information (via markup) to enhance E-E-A-T signals.
- Enhancing User Trust: Build confidence among your website visitors that your site is reliable, credible, and safe to interact with for both human users and AI bots.
Monitoring & Adaptation
Monitoring and adaptation are critical in a constantly shifting AI-driven landscape. By tracking how AI systems interact with your site, you can pinpoint optimization opportunities and address potential issues. Whether you’re analyzing crawler activity, testing AI user workflows, or applying privacy safeguards, continual monitoring and evolving tactics are key to the new SEO.
Track AI Systems’ Interactions
- Use traditional SEO tools such as Bing Webmaster Tools (since ChatGPT Search draws on Bing’s index) and server log analysis to monitor AI crawler activity.
- Test queries in ChatGPT Search to audit how your content is cited.
Test AI User Workflows
- Simulate AI-powered interactions with common search queries (e.g., “Find a hotel in Miami under $300/night”) to identify bottlenecks.
- Validate APIs with tools like Postman to ensure seamless AI agent access.
Ethical & Compliance Safeguards
- Extend robots.txt rules to manage AI crawlers (e.g., user agents from ChatGPT or Bing Chat).
- Ensure GDPR/CCPA compliance for APIs handling user data.
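An extended robots.txt might look like the fragment below. The user-agent tokens shown (OAI-SearchBot for ChatGPT Search crawling, GPTBot for model training) are documented by OpenAI, but verify current tokens before relying on them; the allow/block policy itself is just an example:

```
# Example policy: allow the search crawler, disallow the training crawler
User-agent: OAI-SearchBot
Allow: /

User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
```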
AI Agent Security & Anti-Spam
How do we facilitate generative AI utilizing our websites without massively increasing website spam? Imagine a web where generative AI is used to spin up countless agents that probe websites and generate bogus content. The AI Web would lose its value. As it stands now, in real-world applications, there is one fundamental problem that AI bots/agents will encounter: being blocked by websites! To counter the onslaught of spam, AI bots will force a paradigm shift in website security. No longer will website security focus solely on blocking bots. At least until there is an international standard for user-agent verification, the website will need to be its own gatekeeper, allowing the good AI agents and blocking the bad.
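A first-pass gatekeeper might classify requests by User-Agent, as in the sketch below. Note that User-Agent strings are trivially spoofed, so this illustrates the idea rather than real security; the bot names are examples, and robust verification would need IP-range checks or a future signing standard:

```python
# Sketch: sort incoming requests by User-Agent string into allow/block/challenge.
# Bot names are examples; User-Agent headers can be faked, so treat this as a
# first filter in front of rate limiting or CAPTCHA, not as verification.
ALLOWED_AGENTS = ("OAI-SearchBot", "bingbot")   # example "good" bots
BLOCKED_AGENTS = ("BadScraperBot",)             # hypothetical "bad" bot

def classify_agent(user_agent: str) -> str:
    """Return 'allow', 'block', or 'challenge' for a given User-Agent."""
    if any(bot in user_agent for bot in BLOCKED_AGENTS):
        return "block"
    if any(bot in user_agent for bot in ALLOWED_AGENTS):
        return "allow"
    return "challenge"  # unknown automation: rate-limit or present a CAPTCHA

print(classify_agent("Mozilla/5.0 (compatible; OAI-SearchBot/1.0)"))
```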
The new web, AI Web, if you will, will not only require your website to engage with users and make it easier for the Google search bot to index you, but your AI Web website will also optimize the user experience for AI Agents. Your business’s website conversions can only increase from facilitating AI agents (bots) in finding information and conducting transactions, such as ChatGPT Operator.
Traditional SEO, targeting human users and search engine crawlers, is expanding to accommodate and influence AI search engines (like ChatGPT Search) and enabling AI agents (like ChatGPT Operators) to interact with your website. Sounds like SEO is getting a lot more complex and difficult? The answer is, yes.
Yes, traditional SEO with optimizing content is getting more competitive with new AI models to get search engine results, the large language models reading webpages themselves in lieu of linking to the site, and search engines having less links to the actual website (and more ads and AI summaries).
Yes, facilitating the AI agents once they arrive to your website to make them convert more easily will be new task for SEO. The technical details of agent-to-agent website interactions and how they will be facilitated by the website are to be discovered.
But No, SEO is very much the same process as before for businesses. Users or AI agents are still using search engines that may be manipulated to provide results via SEO. Bing is being utilized for search functions in ChatGPT. ChatGPT Search uses “Microsoft Bing’s index to provide real-time answers to user queries,” according to Yoast. Thus, Bing SEO tactics will apply to some degree. However, with AI driven search engines we can only influence the results and as the saying goes, results may vary.
The good thing for marketers is that these artificial intelligence applications are influenced by tasks already conducted for SEO. There are some more technical additions focused around facilitating the bots or agents journey on your website. The big difference artificial intelligence makes on website design and content development is schema markup. Make it easy for the search bots and agents to find information and conduct transactions.
Let’s break down these new forms of SEO for AI powered search and AI bots. Given that the SEO ranking factors for generative AI models have not been exposed and there are no industry experts to ask, the team at TESSA has deduced the processes, tasks and factors that affect AI search and AI bots. In the big scheme of things, we can call it all SEO since we are conducting the same task: making your website and online presence have more prospects, customers and sales.
What is AI Search Optimization?
The practice of tailoring site content and technical elements (like schema markup) so AI-driven search engines—such as ChatGPT Search—can better understand, summarize, and rank it. This focuses on conversational content, structured data, and real-time updates that align with user intent in an AI-augmented search environment.
What is AI Agent Optimization?
Preparing a website to facilitate the AI-powered agents to perform tasks—such as checking inventory, initiating purchases, or retrieving specific data—without direct human input. This involves clear APIs, machine-readable markup, and workflow design that allows bots to interact with site functionality as seamlessly as human users.
The Foundation: Technical SEO for AI Search Engines and AI Agents
Targeting any AI tools, whether AI based search engines or AI agents, your website must make it easy for bots/agents to identify information and initiate actions. The technical infrastructure must be machine-readable, structured, and efficient.
Structured Data & Schema Markup
- For AI Search: Use recognized schema markup such as FAQPage, HowTo, Product, Review, and Offer to give ChatGPT Search the ability to understand and quote your content.
- For AI Agents: Extend schema with official “Actions” like BuyAction or BookAction if applicable, so AI agents have the ability to conduct basic interactions (e.g., reservations, purchases).
Example: Product pages with price, availability, and aggregateRating schemas improve visibility in ChatGPT Search and let AI agents check stock or initiate purchases through recognized Offer or BuyAction details.
APIs & Real-Time Data Access
- For AI Search: Provide real-time data (e.g., inventory, pricing) via APIs to ensure ChatGPT Search delivers accurate, up-to-date answers.
- For AI Agents: Expose transactional APIs (e.g., checkout, booking) using OpenAPI specs so AI agents can programmatically complete tasks.
It should be noted that the new Computer-Using Agent (CUA) in ChatGPT Operator may not need APIs to conduct transactions. Its current version at its initial release, the AI tool did not have autonomous task execution.
AI-Crawler-Friendly Architecture
- Simplify navigation with clean URL structures and XML sitemaps.
- Use dedicated pathways (e.g., /ai-help) with stripped-down, text-heavy content for AI agents.
- Consider standard HTTP headers or recognized schema properties (e.g., availabilityStarts, availabilityEnds) to indicate time-sensitive content to both search engines and agents.
Content Creation Guidelines for AI Powered Search Engines
Content creation must be both human readable and understood by AI systems. By adopting a conversational approach, aligning with user intent, and keeping information fresh, your site accommodates the new AI use cases—whether that’s direct Q&A, dynamic pricing, or real-time updates—while still engaging genuine human visitors.
Conversational Content Creation
- For AI Search: Ensure content is written in natural language, answering questions directly (e.g., “How do I reset my password?”).
- For AI Agents: Optimize content with bullet points, tables, and FAQs to help agents parse and act on information.
Intent-Driven SEO
- Target long-tail keywords and semantic phrases that match user intent and are commonly used in search phrases (e.g., “budget wireless headphones under $50”).
- Use unambiguous CTAs (for example, “Buy Now,” “Schedule Appointment”) so AI agents can identify transactional opportunities in the right context.
Dynamic & Fresh Content
- As a routine task, update time-sensitive pages (e.g., company promotions, event listings).
- Use webhooks or real-time notifications to ensure any integrated AI agent receives up-to-date info (e.g., price drops, new inventory).
User Experience: Designed for Bots and Humans
UX now extends beyond human website visitors to include AI agents. By refining site navigation, ensuring robust transactional flows, and emphasizing trust signals, your business can increase website engagement and conversions not only for human users but also the AI bots.
AI-Optimized Navigation
- Implement breadcrumbs with schema markup (BreadcrumbList) to clarify page hierarchies for AI crawlers or other AI tools.
- Use descriptive anchor text (e.g., “View return policy”) to guide AI agents through workflows to make a seamless collaboration between website and other agents.
Transactional System Support
- Optimize forms with clear labels, ARIA roles (e.g., role=”navigation”), and machine-readable error codes (e.g., payment_declined).
- Support multi-step workflows (e.g., checkout flows) for user interactions with session tokens to maintain context for AI agents.
Trust & Credibility
- Strengthen E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) with author bios, case studies, and reviews.
- Add trust signals (SSL badges, testimonials) to reassure users and AI systems.
AI Likes to E-E-A-T: Experience, Expertise, Authoritativeness, Trustworthiness
Google has bestowed upon us in their Search Quality Evaluator Guidelines the term to un-summarize terms for overall credibility like “human authority,” “authorship,” as E-E-A-T: Expertise, Authoritativeness, and Trustworthiness. AI Search also uses EEAT, albeit with the additional ability to evaluate the content’s intent to better match to the user’s search intent. As AI tools like ChatGPT Search, Perplexity, You.com, Brave Search, and DeepSeek evolve, their ability to assess E-E-A-T should be superior to the Google of old, noting that the new Google search has also integrated AI Overviews into its search results.
How Do AI Search Engines Assess E-E-A-T?
AI Search not only does it assess the content creator’s credibility, it assesses the validity of the statements in the content itself. Thus, it assesses
- Experience: AI can analyze when the content uses firsthand accounts and personal narratives to gauge the author’s direct experience with the subject matter. For instance, a blog post about the author’s experience doing research or conducting field work could signal genuine “Experience.”
- Expertise: By evaluating the depth and accuracy of information, AI can determine the level of expertise demonstrated in the content. Compare this to Google’s use of latent semantic indexing (LSI) that looks for commonly used phrases in the most highly ranked articles, which doesn’t “understand” whether it demonstrates expertise or not.
- Authoritativeness: AI search, like traditional search engines, can assess the reputation of both the content creator and the website by analyzing backlinks, citations, social mentions in reputable sources.
- Trustworthiness: AI, like traditional search engines, can evaluate trust signals such as secure website protocols (HTTPS), transparent author information, clear privacy policies, and positive user reviews. Beyond traditional search, it can also detect misinformation or biased content by cross-referencing multiple reliable sources.
AI Search Engines vs. Google Search E-E-A-T
AI-powered search engines have advantages over traditional search engines, a la Google, in regards to assessing E-E-A-T.
Evolving Knowledge and Real-Time Ranking
AI search engines continuously learn and adapt based on new data, enabling them to assess E-E-A-T factors at the time the user searches:
- Continuous Learning: AI models update their understanding of what constitutes expertise and authority as new information becomes available, allowing for more accurate assessments over time.
- Real-Time Ranking: AI can evaluate the freshness and relevance of a website’s content upon searching the content after the user query, versus being indexed periodically like Google Search.
Semantic Understanding
AI Search uses semantic understanding which relies on Natural Language Processing (NLP) and machine learning algorithms to interpret the meaning behind queries and content. This serves as a reality check against the content’s claims.
- Contextual Relevance: AI can discern the context in which information is presented, ensuring that content aligns not just with the keywords but also with the broader topic and the user’s intent.
- Content Depth and Breadth: AI evaluates whether the content comprehensively covers the topic, providing detailed explanations, supporting evidence, and addressing the question that the user asked.
- Content Summaries: AI can generate summaries and synthesize information from multiple sources, merging authoritative from multiple sources.
Personalized and Context-Aware Search Results
- User Intent Prediction: By analyzing conversational queries, AI can better predict and understand user intent. This enables search results to align more precisely with specific needs, emphasizing content with high E-E-A-T in relevant contexts.
- Follow-up Questions: AI can ask additional questions to better understand the user’s intent.
- Context Retention: AI maintains context across multiple queries within a single session, allowing for a more accurate assessment of E-E-A-T as it builds a comprehensive understanding of the user’s information needs. This ensures subsequent search results are even more tailored and relevant.
- Personalization: AI search engines learn from individual user behaviors, preferences, and past interactions. This functionality is currently limited. ChatGPT paid plans now allow users to opt in to storing personalized data, but these are more preferences and notes (for example, “Runs a small business technology consulting company and is developing an internal software product”). This differs from referencing and building upon past conversations in a new conversation.
Multi-Source Verification
- Cross-Referencing and Fact-Checking: AI can corroborate across multiple reputable sources, plus its existing knowledge, to verify the content’s accuracy and trustworthiness.
Authorship and Source Credibility
- Author Profiling: Similar to Google, AI can analyze an author’s background, qualifications, and previous work to assess their expertise and authoritativeness more thoroughly than traditional search engines.
- Source Credibility Assessment: AI can evaluate the credibility of entire websites and platforms, not just individual pages, providing a broader assessment of content credibility than traditional search’s reliance on backlink metrics.
Enhanced Trust Signals Recognition
- Visual and Structural Trust Signals: AI recognizes and interprets visual trust signals such as badges, certificates, and secure payment indicators, as well as structural cues like clear navigation and transparent contact information.
- Behavioral Trust Indicators: AI assesses behavioral indicators such as user engagement metrics, time spent on page, and interaction rates to infer trustworthiness and authoritativeness. High engagement levels often correlate with content that effectively meets user needs.
Cost Limitations to AI Search Credibility Checks
- Despite these advanced capabilities, there are economic and logistical constraints. Extensive E-E-A-T checks and credibility evaluations by AI search incur processing costs and require additional time. Consequently, while AI has the capability to thoroughly check content for credibility, it is unlikely to do so extensively due to the added time and cost. Many of the methods discussed here are not yet common functionality in AI Search; others, such as real-time search updates and personalization, exist only in limited forms.
Comparative Advantages Over Traditional Search Engines
While traditional search engines like Google have integrated E-E-A-T into their ranking algorithms, AI search engines offer several more extensive means of assessing E-E-A-T.
- Superior Contextual Understanding: Unlike traditional search engines that primarily rely on keyword matching and backlink profiles, AI search engines utilize deep contextual and semantic analysis to understand the meanings behind queries and content.
- Interactive Refinement of Results: The conversational nature of AI search allows users to engage in dialogues with the search engine, prompting it to provide more precise and authoritative information based on follow-up questions. This enhances the E-E-A-T alignment of the results.
- Enhanced Personalization: AI search engines leverage extensive personalization based on individual user data, behaviors, and preferences. This ensures that content with high E-E-A-T is tailored to specific user needs and contexts, a level of personalization that traditional search engines only partially attain.
- Automated and Comprehensive Verification: AI’s ability to cross-reference and fact-check content across multiple sources ensures higher accuracy and trustworthiness in search results. Traditional search engines may not perform this level of automated verification, potentially allowing less credible content to rank higher based solely on link metrics.
- Deeper Author and Source Analysis: AI search engines conduct more in-depth analyses of authors’ credentials and the overall authority of content sources. This comprehensive evaluation surpasses traditional search engines, which may rely more on external signals like backlinks and domain authority without as granular an analysis of content creators.
- Cost Limitations to AI Search Credibility Checks: Despite these advanced capabilities, there are economic and logistical constraints. Extensive background checks and credibility evaluations by AI search incur processing costs and require additional time. Consequently, while AI has the capability to thoroughly check content for credibility, it is unlikely to be done extensively for financial reasons. AI platforms may instead build user databases to customize results, similar to practices already employed by Google.
E-E-A-T in AI-Driven SEO
Although AI Search is capable of contextual understanding and credibility checking, the tactics for optimizing for AI Search are still believed to be largely the same as traditional SEO. For businesses and content creators, this means:
- Prioritizing Quality Over Quantity: Focus on producing high-quality, authoritative, and trustworthy content that genuinely meets user needs and demonstrates Experience and Expertise.
- Building Strong Authoritativeness: Establish a robust online presence through reputable backlinks, authoritative content, and transparent author information (e.g., author markup) to enhance E-E-A-T signals.
- Enhancing User Trust: Build confidence among your website visitors that your site is reliable, credible, and safe to interact with for both human users and AI bots.
Monitoring & Adaptation
Monitoring and adaptation are critical in a constantly shifting AI-driven landscape. By tracking how AI systems interact with your site, you can pinpoint optimization opportunities and address potential issues. Whether you’re analyzing crawler activity, testing AI user workflows, or applying privacy safeguards, continual monitoring and evolving tactics are key to the new SEO.
Track AI Systems’ Interactions
- Use traditional SEO tools such as Google Search Console and Bing Webmaster Tools (relevant because ChatGPT Search draws on Bing’s index), plus server log analysis, to monitor AI crawler activity.
- Test queries in ChatGPT Search to audit how your content is cited.
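As a starting point for the server log analysis mentioned above, the sketch below tallies requests from known AI crawlers in standard combined-format access logs. The user-agent tokens listed are examples published by their vendors; verify current tokens in each vendor’s documentation before relying on them.

```python
import re
from collections import Counter

# Example AI crawler user-agent tokens (not exhaustive; check vendor docs).
AI_BOT_TOKENS = ["GPTBot", "OAI-SearchBot", "ChatGPT-User", "PerplexityBot", "bingbot"]

def count_ai_crawlers(log_lines):
    """Tally requests per AI crawler from combined-format access log lines."""
    counts = Counter()
    for line in log_lines:
        # The user agent is the last quoted field in the combined log format.
        quoted = re.findall(r'"([^"]*)"', line)
        ua = quoted[-1] if quoted else ""
        for token in AI_BOT_TOKENS:
            if token.lower() in ua.lower():
                counts[token] += 1
    return counts

sample = [
    '1.2.3.4 - - [04/Feb/2025:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; GPTBot/1.0; +https://openai.com/gptbot)"',
    '5.6.7.8 - - [04/Feb/2025:10:00:01 +0000] "GET /pricing HTTP/1.1" 200 1024 "-" '
    '"Mozilla/5.0 (compatible; bingbot/2.0)"',
]
print(dict(count_ai_crawlers(sample)))  # {'GPTBot': 1, 'bingbot': 1}
```

Running this over a day of logs gives a quick picture of which AI systems are already reading your site, and how often.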
Test AI User Workflows
- Simulate AI-powered interactions with common search queries (e.g., “Find a hotel in Miami under $300/night”) to identify bottlenecks.
- Validate APIs with tools like Postman to ensure seamless AI agent access.
Ethical & Compliance Safeguards
- Extend robots.txt rules to manage AI crawlers (e.g., user agents from ChatGPT or Bing Chat).
- Ensure GDPR/CCPA compliance for APIs handling user data.
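One way to extend robots.txt for AI crawlers is to generate the rules programmatically, allowing search/citation bots while opting out of others. This is a sketch under assumptions: the bot names shown are examples of published crawler tokens, and which bots you allow or block is a policy choice, not a recommendation.

```python
# Sketch: generate robots.txt rules that treat AI crawlers differently.
# Bot tokens are examples; verify current names in each vendor's docs.
ALLOWED_AI_BOTS = ["OAI-SearchBot", "bingbot"]  # e.g., search/citation crawlers
BLOCKED_AI_BOTS = ["GPTBot"]                    # e.g., opting out of training crawls

def build_robots_txt(allowed, blocked, disallow_path="/private/"):
    """Allowed bots get normal access (minus a private path); blocked bots
    are disallowed sitewide; everyone else gets the default rule."""
    lines = []
    for bot in allowed:
        lines += [f"User-agent: {bot}", f"Disallow: {disallow_path}", ""]
    for bot in blocked:
        lines += [f"User-agent: {bot}", "Disallow: /", ""]
    lines += ["User-agent: *", f"Disallow: {disallow_path}"]
    return "\n".join(lines)

print(build_robots_txt(ALLOWED_AI_BOTS, BLOCKED_AI_BOTS))
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an enforcement mechanism, which is why the security measures below still matter.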
AI Agent Security & Anti-Spam
How do we facilitate generative AI utilizing our websites without massively increasing website spam? Imagine a web where generative AI spins up countless agents to probe websites and generate bogus content; the AI Web would lose its value. As it stands, AI bots and agents face one fundamental problem in real-world applications: being blocked by websites. To counter the onslaught of spam, AI bots will force a paradigm shift in website security. No longer will website security focus solely on blocking bots. At least until there is an international standard for user agent verification, the website will need to be its own gatekeeper, allowing the good AI agents and blocking the bad bots. Below is a high-level approach to facilitating AI bots in a beneficial manner without inadvertently encouraging data scraping or spam:
- Partial “Soft Gate” Techniques
- Minimal Friction CAPTCHA: Only trigger a CAPTCHA after repeated requests from the same IP/user agent so normal visitors see no disruption.
- View-Limited Pages: Show public prices to casual browsers, but throttle suspicious user agents if they load too many pages too quickly.
- Monitoring & Rate-Limiting
- Rate Limits by IP/User Agent: Throttle or block abnormal traffic spikes—no forced login needed.
- Suspicious Pattern Detection: Log requests and look for scraping patterns; if triggered, serve a short challenge (like a CAPTCHA). This is a good application for AI-driven insights into abnormal traffic patterns.
- Careful Structuring of Public Data
- Summaries vs. Exhaustive Price Tables: Let humans see essential info while limiting massive data exports.
- Expandable Sections: Detailed info under “Show more” toggles; simple bots might skip or fail to parse them fully.
- Identify Legitimate Generative AI Bots
- Recognized User Agents: Differentiate official crawlers (e.g., Bing, Google) from unknown scrapers.
- Serve Structured Data Wisely: Provide Product or Offer markup, but gate deeper details behind minimal friction if needed.
- User-Friendly Transparency
- Explain Why: If a user is temporarily blocked, clarify it’s due to unusual activity.
- Preserve Customer Experience: Typical shoppers and casual users should never encounter undue friction.
By balancing open data (for legitimate AI-driven help) with controls to detect and deter high-volume scraping, you maintain a friendly user experience while limiting spam.
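The rate-limiting and soft-gate ideas above can be sketched as a minimal sliding-window limiter keyed by IP and user agent. The thresholds and the three-tier allow/challenge/block response are illustrative assumptions; a production system would also persist state and integrate with a real CAPTCHA service.

```python
import time
from collections import defaultdict, deque

# Illustrative thresholds: soft limit escalates to a CAPTCHA challenge,
# hard limit rejects the request outright.
WINDOW_SECONDS = 60
SOFT_LIMIT = 30
HARD_LIMIT = 120

_hits = defaultdict(deque)  # (ip, user_agent) -> timestamps of recent requests

def check_request(ip, user_agent, now=None):
    """Return 'allow', 'challenge', or 'block' for one incoming request,
    using a sliding window of recent requests per (IP, user agent)."""
    now = time.time() if now is None else now
    q = _hits[(ip, user_agent)]
    q.append(now)
    # Drop timestamps that have aged out of the window.
    while q and q[0] < now - WINDOW_SECONDS:
        q.popleft()
    if len(q) > HARD_LIMIT:
        return "block"
    if len(q) > SOFT_LIMIT:
        return "challenge"  # e.g., serve a minimal-friction CAPTCHA
    return "allow"
```

Because the first response to abuse is a challenge rather than a block, a legitimate AI agent (or human behind one) can still get through, which matches the “soft gate” philosophy above.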
AI SEO Search Ranking Factors
In 2025, AI-powered search engines like Perplexity, ChatGPT Search, You.com, Brave Search, DeepSeek, and others use new ranking factors that prioritize contextual understanding, user intent, and content quality over the keyword-centric approach of traditional Google organic search rankings. In theory, AI search results should be better at understanding the user’s intent than traditional search engines, which focus on indexing content related to keywords. To give this analysis some Local SEO context, we will analyze key AI search ranking factors for the search phrase, “Junk Removal in NYC”.
1. Contextual Relevance, Semantic Understanding and Search Intent
AI search engines use natural language processing (NLP) and transformer models (e.g., GPT-4, Claude) to interpret a query’s context and the searcher’s intent. For example:
– Perplexity AI first analyzes the intent behind a search, then synthesizes answers from multiple websites based on relevance rather than keyword density.
– ChatGPT Search keeps the conversation going by refining its results via follow-up questions.
– DeepSeek shows the user its understanding of the search’s context and the reasoning it used to reply.
For a local query like “Junk Removal in NYC,” AI search engines would prioritize businesses that have website content that references what they searched (e.g., “same-day pickup,” “eco-friendly disposal”) and matches user intent (e.g., pricing, service areas).
2. Content Quality and Depth
AI search engines favor comprehensive, authoritative content that answers questions thoroughly:
– E-A-T (Expertise, Authoritativeness, Trustworthiness) is critical. For example, Google’s AI algorithms prioritize content from credible sources with clear expertise in their field. Author schema or other markup makes the source verifiable.
– Perplexity and You.com highlight answers with citations, linking to reputable sources like industry blogs, government sites, or verified reviews.
– For local services, including FAQs (e.g., “What items can’t be removed?”), service details, and customer testimonials improves relevance.
3. Real-Time Data and Content Freshness
AI search engines prioritize up-to-date information:
– Perplexity and Bing Copilot integrate real-time data retrieval, ensuring results reflect current trends or events.
– For “Junk Removal in NYC,” businesses with updated service hours, seasonal promotions, or recent customer reviews would rank higher.
4. Local SEO and Hyperlocal Signals
Localized optimization is crucial for geographically specific queries:
– Google’s SGE and Bard emphasize proximity, using signals like Google Business Profile (GBP) listings, localized keywords (e.g., “Brooklyn junk removal”), and embedded maps.
– Brave Search uses independent indexes, so ensuring NAP (Name, Address, Phone) consistency across directories (e.g., Yelp, Yellow Pages) is essential.
– Structured data markup (Schema.org) for services, service areas, and reviews helps AI engines parse local relevance.
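For the structured data point above, a LocalBusiness JSON-LD block is the typical starting point. The sketch below builds one in Python and serializes it for embedding in a `<script type="application/ld+json">` tag; the business details are invented placeholders, not a real listing.

```python
import json

# Placeholder LocalBusiness data for a hypothetical junk removal company.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Junk Removal NYC",
    "telephone": "+1-212-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "New York",
        "addressRegion": "NY",
        "postalCode": "10001",
    },
    "areaServed": ["Manhattan", "Brooklyn", "Queens"],
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "132",
    },
}

# Serialize for embedding in the page's <head>.
print(json.dumps(local_business, indent=2))
```

Keeping the NAP fields here identical to the directory listings mentioned above is what makes the consistency signal legible to AI engines.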
5. User Experience (UX) and Technical SEO
AI algorithms evaluate page performance and usability:
– Mobile optimization and fast load times are prioritized by engines like Google SGE and You.com.
– Brave Search rewards sites with minimal ads and trackers, aligning with its privacy-first approach.
– Clear navigation and intuitive design reduce bounce rates, signaling quality to AI systems.
6. Backlinks and Domain Authority
While traditional SEO factors persist, AI engines emphasize quality over quantity:
– Baidu’s ranking study found a strong correlation between high-quality backlinks (e.g., from .gov or industry hubs) and rankings.
– For local services, links from local news outlets, community boards, or environmental organizations (e.g., NYC recycling initiatives) boost authority.
7. Personalization and User Behavior
AI engines like You.com and Microsoft Copilot tailor results based on user history and preferences:
– Users searching for “Junk Removal in NYC” might see results influenced by past interactions (e.g., preference for eco-friendly services).
– Engagement metrics (time on page, click-through rates) signal content relevance to AI systems.
Optimization Strategies for “Junk Removal in NYC”
1. Localized Content: Include neighborhood-specific pages (e.g., “Junk Removal in Manhattan”) and optimize Google Business Profile with photos, reviews, and service tags.
2. Structured Data: Use schema markup for services, pricing, and service areas.
3. Voice Search: Optimize for conversational queries (e.g., “Who offers cheap junk removal near me?”).
4. Citations: Ensure consistent NAP details on local directories and industry platforms.
5. Authority Building: Publish case studies or partner with local environmental groups to enhance E-A-T.
Key Challenges
– Bias and Filter Bubbles: AI search results may prioritize popular brands, requiring smaller businesses to emphasize niche expertise.
– Misinformation Risks: Ensure content accuracy, as engines like Perplexity penalize unverified claims.
Case Study: AI SEO for E-Commerce
Scenario
An online store sells collectibles and memorabilia. For the past three years, TESSA has conducted a limited but nationwide search optimization campaign focused on organic Google search results. However, with the rise of ChatGPT Search and AI-driven bots like ChatGPT Operator, there is an opportunity to go beyond using AI for content creation:
- Increase traffic from AI-powered search.
- Increase conversions via AI product recommendations, plus AI tools for automating follow-up.
- Enable AI bots to automatically check stock, initiate cart additions, notify customers of new items that may be of interest, and even process transactions on behalf of the user—without direct human input at every step.
Given the rarity of some items, customers wait to buy a rare item as soon as it is released for sale. An AI shopping bot could schedule purchases that meet a certain threshold of requirements on condition, price, and so on.
Key Implementation Steps
- Structured Data & Schema Markup
- Expanded Product Schema: Deployed Product, Offer, and AggregateRating to feed AI searches accurate pricing, stock, and review data.
- Action-Oriented Markup: Added BuyAction so AI agents can initiate purchases.
- APIs & Real-Time Data Access
- Live Inventory: Offered an OpenAPI endpoint returning stock status and pricing, helping ChatGPT provide real-time answers.
- Checkout/Booking: Developed a secure API for cart additions or payment processing, ready for future autonomous AI.
- AI-Crawler-Friendly Architecture
- Clean URLs & XML Sitemaps: Improved discoverability for both humans and AI crawlers.
- Time-Sensitive Markup: Used schema properties like availabilityStarts/availabilityEnds to highlight deals and limited stock.
- Enhanced Content Creation
- Conversational Descriptions: Product pages answer typical Q&As to appear in AI responses.
- Frequent Updates: Seasonal promotions or new inventory trigger webhooks, keeping AI-fed data fresh.
- User Experience & Transactional Support
- Clear Forms: Marked checkout flows with ARIA roles and labeled error states (payment_declined) for AI-driven parsing.
- Trust Signals: Displayed SSL badges, user reviews, and author bios to bolster E-E-A-T for both humans and AI systems.
- Security & Anti-Spam Measures
- Partial Soft Gate: A lightweight CAPTCHA only after suspiciously high requests.
- Recognized AI Bots: Whitelisted Bing, Google, and “ChatGPT-Operator” while limiting unknown bots.
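The schema steps above can be sketched as a single Product JSON-LD block combining an Offer with a time-limited availability window and a BuyAction. The product, SKU, and URL are invented placeholders illustrating the shape of the markup, not the store’s actual data.

```python
import json

# Placeholder Product/Offer JSON-LD with a time-limited offer and a
# purchase action, mirroring the case study's implementation steps.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Signed 1969 Baseball (Example)",
    "sku": "MEM-1969-001",
    "offers": {
        "@type": "Offer",
        "price": "499.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
        "availabilityStarts": "2025-02-01T00:00:00Z",
        "availabilityEnds": "2025-02-28T23:59:59Z",
    },
    "potentialAction": {
        "@type": "BuyAction",
        "target": "https://example.com/checkout?sku=MEM-1969-001",
    },
}

print(json.dumps(product, indent=2))
```

The availabilityStarts/availabilityEnds pair is what lets an AI agent reason about limited-time stock, while the BuyAction target gives it a concrete endpoint to initiate a purchase.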
Outcome & Results
- Stronger AI Search Presence: Products often appear in ChatGPT’s “top picks under $X” queries.
- Increased Conversions: Real-time data and frictionless APIs improved both human and AI-initiated cart completions.
- Reduced Scraping: Soft gating and user-agent policies limited unwanted data harvesting.
- Higher Trust: Q&A content and reliable reviews boosted user (and AI) confidence, driving repeat visits and better brand perception.
The Future of SEO: A Unified Approach
The line between AI search engines and AI agents is blurring. ChatGPT Search already acts as an agent, pulling data from Bing and performing more tasks on behalf of users beyond just reviewing search results. To stay competitive:
- Merge Technical Strategies: Schema markup, APIs, and crawlable architecture serve both AI search and agents.
- Create Dual-Purpose Content: Write for human engagement while structuring content so machines gain a clear contextual understanding.
- Build for Interaction: Assume every visitor could be a human or an AI bot acting on their behalf.
Checklist for Converged SEO
In summary, businesses can implement the following tasks to get ahead of the curve on the generative AI web:
- Implement recognized schema markup to facilitate search queries and actions (Product, Offer, FAQPage, BuyAction, etc.).
- Develop APIs for real-time data and transactions.
- Audit content for conversational clarity and intent alignment.
- Test workflows with AI agents and ChatGPT Search.
Conclusion
Optimizing a company’s website for AI search and for AI agents are two sides of the same coin: SEO. By merging these strategies, you create a website that’s not only visible in AI-driven search results but also capable of interacting with AI agents to drive engagement, sales, and loyalty. The future belongs to brands that treat AI as both a search engine and a customer.
More Research Needed
This research is speculative, given that these AI systems are new and there is little historical usage data or firsthand experience from conducting SEO for AI-driven website interactions.
Many questions remain about these AI systems:
How do ChatGPT, Microsoft Copilot, and Bing all treat natural language queries differently and provide different search rankings?
What generative AI tools may be used to facilitate AI search bots and AI agents?
How much does SEO carry over to AI search results?
How does simple keyword matching using natural language processing compare to AI search results that try to understand the intent of the user?
What metrics do companies need to track for AI SEO?
All topics for future posts.