On February 10, 2026, Microsoft introduced AI Performance inside Bing Webmaster Tools as a public preview feature.
The new report is designed to help publishers and website owners understand how their content appears in AI-driven experiences. It covers Microsoft Copilot, AI-generated summaries within Bing, and selected partner integrations that use Microsoft’s AI systems.
For the first time, site owners can measure how often their content is cited as a source inside AI-generated answers and track which specific URLs are being referenced.
Traditionally, Bing Webmaster Tools has focused on search visibility metrics such as indexing status, crawl diagnostics, keyword impressions, and clicks. AI Performance expands this framework beyond conventional search results.
As AI-generated answers increasingly appear at the top of search experiences, visibility now includes whether your content is cited within those answers. This shift reflects a broader change in how people discover information online.
Why the AI Performance Report Is Important
Search behavior is evolving rapidly. Instead of scanning multiple blue links and clicking through to individual websites, many users now receive synthesized answers generated by AI systems. These answers often summarize information from multiple sources and provide citations that indicate where the information originated. As a result, publishers need insight into whether their content is being included and referenced in those summaries.
The AI Performance report addresses this need by providing data that was previously unavailable. It allows website owners to see how frequently their pages are cited, which topics are connected to their content through AI retrieval systems, and how citation activity changes over time. This introduces a new layer of transparency between AI systems and the open web.
Microsoft describes this direction as an early step toward Generative Engine Optimization, sometimes abbreviated as GEO. While traditional search engine optimization focuses on improving rankings and driving clicks, GEO emphasizes optimizing content so that it can be accurately retrieved, understood, and cited by generative AI systems.
Core Metrics in the AI Performance Dashboard
The AI Performance dashboard includes several distinct metrics, each designed to clarify how content participates in AI-generated answers. These metrics focus specifically on citation activity rather than ranking position or click volume.
Total Citations
The Total Citations metric shows the number of times content from your site was displayed as a source in AI-generated answers during a selected time period. This number represents how often your site was referenced across supported AI surfaces. It does not indicate where the citation appeared within the answer, how prominently it was displayed, or whether it influenced user engagement. Instead, it provides a clear count of citation occurrences.
Understanding total citations helps publishers measure overall participation in AI responses. If citation numbers increase over time, it may suggest improved visibility within AI-driven discovery experiences.
Average Cited Pages
Average Cited Pages measures the average number of unique URLs from your site that were cited per day within the selected date range. This metric highlights how broadly your content is being referenced across AI experiences. A higher average may indicate that multiple pages on your site contribute to AI answers, rather than citations concentrating on a single URL.
Because the data is aggregated across multiple AI surfaces, this metric reflects overall citation patterns rather than authority or ranking. It does not imply that one cited page is more influential than another. Instead, it shows how widely your content footprint extends within AI retrieval systems.
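To make these two aggregate metrics concrete, here is a minimal Python sketch that computes them from hypothetical per-day citation records. The data layout and field names are assumptions for illustration only, not the export format of Bing Webmaster Tools.

```python
from collections import defaultdict

# Hypothetical citation log: one record per (day, url, citation_count).
# This layout is an assumption for illustration, not the BWT export format.
records = [
    ("2026-02-10", "/guides/seo-basics", 4),
    ("2026-02-10", "/blog/ai-search", 2),
    ("2026-02-11", "/guides/seo-basics", 3),
    ("2026-02-12", "/blog/ai-search", 1),
    ("2026-02-12", "/faq", 2),
]

# Total Citations: sum of all citation occurrences in the date range.
total_citations = sum(count for _, _, count in records)

# Average Cited Pages: mean number of unique URLs cited per day.
pages_per_day = defaultdict(set)
for day, url, _ in records:
    pages_per_day[day].add(url)
avg_cited_pages = sum(len(urls) for urls in pages_per_day.values()) / len(pages_per_day)

print(total_citations)   # 12
print(round(avg_cited_pages, 2))  # (2 + 1 + 2) unique pages over 3 days ≈ 1.67
```

Note that the two metrics answer different questions: the first counts every citation event, while the second ignores repeat citations of the same URL within a day and measures breadth instead.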
Grounding Queries
Grounding Queries are key phrases used by AI systems when retrieving content that is later cited in generated answers. This data provides insight into the language patterns and topic signals that connect user prompts to your content. The queries displayed represent a sample of citation activity and may evolve as additional data becomes available.
Grounding queries are particularly valuable because they reveal how AI systems interpret and categorize your content. If specific phrases frequently appear in this section, it may indicate strong topical alignment between those queries and your pages. This information can guide future content development and topic expansion strategies.
Page-Level Citation Activity
Page-Level Citation Activity shows how many times individual URLs from your site were cited during the selected period. This granular view allows publishers to identify which pages are most frequently referenced in AI-generated answers.
It is important to understand that citation frequency does not equate to ranking, authority, or placement within an answer. A page cited multiple times may appear alongside other sources in different contexts. Nevertheless, this metric helps identify high-performing pages in terms of AI visibility and highlights opportunities for improving underperforming content.
Visibility Trends Over Time
The timeline feature in the dashboard displays how citation activity changes across supported AI experiences over time. This longitudinal perspective allows publishers to detect patterns, such as gradual increases in citation frequency, sudden drops, or seasonal fluctuations.
Monitoring trends can help determine whether content updates, structural improvements, or newly published material correlate with changes in AI citation activity. Over time, this data can inform broader content and optimization strategies.
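One simple way to turn noisy daily citation counts into a readable trend line is a trailing rolling average. The Python sketch below uses made-up daily totals and a 3-day window; neither the numbers nor the window size come from the report itself.

```python
def rolling_average(values, window):
    """Trailing rolling average; early points use whatever history exists."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# Hypothetical daily citation totals for one site over a week.
daily_citations = [3, 5, 4, 8, 7, 6, 10]
trend = rolling_average(daily_citations, window=3)
print([round(v, 2) for v in trend])  # [3.0, 4.0, 4.0, 5.67, 6.33, 7.0, 7.67]
```

A rising smoothed curve after a content update is weak but useful evidence of improved AI visibility; a sudden sustained drop is a prompt to check whether recently changed pages lost citations.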
How Publishers Can Use AI Performance Insights
AI Performance data is most useful when applied strategically. Reviewing cited pages and grounding queries can help clarify which content assets are already participating in AI-generated answers and which areas may need improvement.
One practical use is validating inclusion. Publishers can confirm whether newly published or recently updated pages are being cited. If certain pages consistently appear in AI answers, it suggests that those pages are well-aligned with AI retrieval systems and user queries.
Another application is identifying high-impact content. Pages that receive frequent citations often share common traits, such as clear topical focus, comprehensive coverage, structured formatting, and well-supported claims. Analyzing these pages can reveal patterns worth replicating across other sections of the site.
The report can also highlight opportunities for enhancement. Pages that are indexed but rarely cited may benefit from clearer headings, improved structure, expanded explanations, or additional supporting evidence. Improving clarity and completeness can make content more accessible to AI systems that rely on structured extraction and semantic understanding.
Best Practices for Improving Inclusion in AI Answers
Microsoft outlines several principles that can improve the likelihood of inclusion and citation in AI-generated answers. These principles focus on clarity, authority, and freshness.
Strengthening depth and expertise is an important step. Pages that demonstrate comprehensive coverage of a topic and clear subject-matter expertise are more likely to align with grounding queries used by AI systems. Expanding related subtopics and addressing common user questions can reinforce topical authority.
Improving structure and clarity also plays a significant role. Clear headings, well-organized sections, tables, and FAQ formats make it easier for AI systems to extract accurate information. Structured formatting reduces ambiguity and increases the likelihood that key points are cited correctly.
Supporting claims with evidence enhances credibility. Including data points, examples, references, and contextual explanations builds trust signals that may influence AI systems when selecting reliable sources.
Keeping content fresh and accurate is equally important. Using IndexNow allows publishers to notify participating search engines when content is added, updated, or removed. Faster indexing ensures that AI systems reference the most current version of a page, reducing the risk of outdated information being cited.
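An IndexNow notification is a single JSON POST listing the URLs that were added, updated, or removed. The sketch below builds such a request with Python's standard library; the host, key, and URLs are placeholder values, and the request is constructed but deliberately not sent.

```python
import json
import urllib.request

def build_indexnow_request(host, key, urls):
    """Build (but do not send) an IndexNow submission request.

    Per the IndexNow protocol, the body is a JSON object naming the host,
    the site's verification key, and the list of changed URLs. The host,
    key, and URLs used here are placeholders for illustration.
    """
    payload = {
        "host": host,
        "key": key,
        "keyLocation": f"https://{host}/{key}.txt",
        "urlList": urls,
    }
    return urllib.request.Request(
        "https://api.indexnow.org/indexnow",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )

req = build_indexnow_request(
    "www.example.com",
    "aaaa1111bbbb2222",  # placeholder key; real keys are generated by the site owner
    ["https://www.example.com/updated-guide"],
)
# Actually submitting would be: urllib.request.urlopen(req)
print(req.full_url)
```

The key file referenced by `keyLocation` must be hosted on the site so participating engines can verify that the submitter controls the domain.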
Consistency across formats is another factor to consider. When text, images, and other media consistently represent the same entities, products, or concepts, it reduces confusion and improves clarity for AI retrieval systems.
AI Visibility for Local Businesses
For local businesses, accurate and up-to-date information is especially critical. AI systems frequently answer location-based queries, such as requests for business hours, addresses, or contact details. Inaccurate information can reduce eligibility for citation in AI-generated responses.
In addition to using Bing Webmaster Tools, local businesses can register with Bing Places for Business to ensure that essential details such as address, phone number, and operating hours remain current. Maintaining accurate listings increases the likelihood that AI systems surface correct and reliable information.
Respecting Publisher Controls
Microsoft emphasizes that AI Performance respects content owner preferences expressed through robots.txt and other supported control mechanisms. This means publishers retain control over how their content is accessed and referenced. Sites that choose to restrict crawling or indexing can manage their participation according to established web standards.
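As a reminder of what those controls look like in practice, here is a minimal robots.txt sketch. The paths are placeholders; Bingbot is Bing's long-documented crawler token, but site owners should consult Microsoft's current documentation for the full list of supported tokens and directives before restricting access.

```
# robots.txt — placeholder paths for illustration
User-agent: Bingbot
Disallow: /private/
Allow: /

# Blanket rule for all other crawlers
User-agent: *
Disallow: /private/
```

Because AI Performance honors these standard controls, disallowing a section here affects both conventional crawling and its eligibility for AI citation.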
The Broader Significance of AI Performance
The introduction of AI Performance signals an important development in search analytics. For many years, success was measured primarily through impressions, rankings, and click-through rates. While those metrics remain relevant, the rise of AI-generated answers introduces a new visibility dimension centered on citation and attribution.
AI Performance provides publishers with tools to measure that dimension directly. By offering insight into citation counts, grounding queries, page-level activity, and trends over time, the report helps bridge the gap between AI systems and content creators.
As AI-driven discovery continues to expand, understanding how and why content is cited will become increasingly important. The AI Performance report represents one of the first structured efforts to provide publishers with that visibility, helping them adapt to the next phase of search and information retrieval.