WebMCP and NLWeb: Two New Standards Are Turning Websites Into AI Interfaces – Here's Why Schema.org Is the Common Foundation
Google has introduced WebMCP. Microsoft has launched NLWeb. Two different approaches, two different companies — but the same underlying message: websites need to become machine-readable to survive in the age of AI agents. And both standards rely on the same foundation: structured data.
What Just Happened
Within the span of a few months, both major browser and platform vendors have released standards that redefine how AI agents interact with websites.
On February 10, 2026, André Cipriani Bandarra announced the early preview of WebMCP on the official Chrome Developers Blog [1].
WebMCP is a proposed web standard developed jointly by engineers at Google and Microsoft, incubated through the W3C Web Machine Learning Community Group. The specification editors include Brandon Walderman (Microsoft), Khushal Sagar (Google), and Dominic Farolino (Google) [2].
In May 2025, Microsoft had already introduced NLWeb at Build 2025 — an open protocol that turns websites into conversational AI interfaces [8].
NLWeb was conceived and developed by R.V. Guha, who recently joined Microsoft as CVP and Technical Fellow. This is a significant detail: Guha is the creator of RSS, RDF, and Schema.org [9] — the very standards that NLWeb now builds on. The inventor of Schema.org is now building the protocol that turns Schema.org data into conversational AI interfaces.
Together, WebMCP and NLWeb represent two complementary paths toward the same destination: an agentic web where AI agents don't just read websites, but understand and interact with them.
WebMCP: From Guessing to Contracts
Today, AI agents operating in the browser have to interact with websites the same way a human would: identifying form fields, interpreting button labels, guessing how a calendar picker works. This is slow, error-prone, and computationally expensive. Early benchmarks reported by VentureBeat suggest WebMCP reduces computational overhead by approximately 67% compared to traditional visual agent-browser interactions [3].
WebMCP replaces this guessing with an explicit contract. A website declares its capabilities as structured tools — with a name, a description, and a schema for expected parameters. The agent calls the function directly and receives structured results back.
This happens through two complementary APIs. The Declarative API transforms existing HTML forms into agent-callable tools by adding a few new attributes. The Imperative API, accessed through the new navigator.modelContext browser interface, enables more complex JavaScript-based interactions [2]. In both cases, the website reuses its existing frontend logic — no separate backend for agents required.
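As a rough illustration of the Imperative API, the sketch below registers a site capability as an agent-callable tool. The `registerTool` call, its option names, and the `addToCart` handler are illustrative assumptions based on the early explainer, not a verbatim API reference — the specification is still an early preview and may change:

```javascript
// Hypothetical sketch of the WebMCP Imperative API. Method names and option
// shapes are assumptions based on the early explainer; treat as illustrative.

// Plain function reusing the site's existing frontend logic (assumed helper).
async function addToCart({ productId, quantity }) {
  // A real site would invoke the same code path as its "Add to cart" button.
  return { status: "ok", productId, quantity };
}

// navigator.modelContext only exists in browsers with the preview flag enabled.
if (typeof navigator !== "undefined" && navigator.modelContext) {
  navigator.modelContext.registerTool({
    name: "add-to-cart",
    description: "Add a product to the shopping cart",
    inputSchema: {
      type: "object",
      properties: {
        productId: { type: "string" },
        quantity: { type: "number" },
      },
      required: ["productId"],
    },
    // The agent calls this directly and receives structured results back.
    async execute({ productId, quantity = 1 }) {
      return addToCart({ productId, quantity });
    },
  });
}
```

The key point of the contract model survives even if the surface syntax shifts: the site names the capability, describes it, and declares a parameter schema, so the agent no longer has to guess at form fields.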
Khushal Sagar, Staff Software Engineer at Chrome, has described the scope as deliberately broad.
WebMCP is model-agnostic. The demo extension uses Gemini 2.5 Flash via API, not Google's on-device Gemini Nano [4]. Whether the agent runs on Claude, ChatGPT, Gemini, or an open-source model — any browser-based agent can use WebMCP tools.
Dan Petrovic, founder of AI SEO agency DEJAN, sees in it the beginning of a paradigm shift comparable to the emergence of SEO [4].
NLWeb: Schema.org as the Conversational Layer
Where WebMCP makes websites actionable through tool contracts, NLWeb takes a different approach: it makes websites conversational by turning their existing structured data into natural language interfaces.
The core idea is straightforward. NLWeb takes the Schema.org markup and RSS feeds that a website already publishes, indexes them in a vector database, and combines them with an LLM to create a conversational endpoint. A user — or an AI agent — can then query the website in natural language and receive structured JSON responses based on the site's actual data [9] [10].
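A minimal client-side sketch of that flow might look as follows. The `/ask` path, the query parameter, and the response shape (results carrying Schema.org objects) are assumptions drawn from public descriptions of the protocol, not a verbatim API reference:

```javascript
// Hypothetical client for an NLWeb-style endpoint. The "/ask" path and the
// response shape are assumptions based on public descriptions of the protocol.

// Extract compact summaries from an NLWeb-style JSON response, reading the
// Schema.org type from each result's embedded structured object.
function summarizeResults(response) {
  return (response.results || []).map((r) => ({
    name: r.name,
    type: r.schema_object && r.schema_object["@type"],
    url: r.url,
  }));
}

// Example usage against a site assumed to run NLWeb (URL is illustrative):
async function askSite(baseUrl, query) {
  const res = await fetch(`${baseUrl}/ask?query=${encodeURIComponent(query)}`);
  return summarizeResults(await res.json());
}
```

Whatever the final wire format, the pattern is the important part: natural language in, structured JSON grounded in the site's own Schema.org data out.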
This architecture, analyzed in depth by Glama.ai [10], has several important implications.
Schema.org is not optional for NLWeb — it's the primary data source. NLWeb specifically leverages the semi-structured formats that websites already publish. The GitHub repository states this directly: "Schema.org and related semi-structured formats like RSS — used by over 100 million websites — have become not just de facto syndication mechanisms, but also a semantic layer for the web. NLWeb leverages these to enable natural language interfaces more easily" [9]. A website without Schema.org markup simply has less to work with in an NLWeb context.
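For concreteness, this is the kind of markup NLWeb ingests — a minimal JSON-LD Product snippet with placeholder values:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trail Running Shoe",
  "description": "Lightweight trail running shoe with a grippy outsole.",
  "offers": {
    "@type": "Offer",
    "price": "129.00",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "212"
  }
}
```

A site publishing snippets like this across its catalog hands NLWeb typed, disambiguated facts to index; a site without them leaves the protocol to infer structure from raw HTML.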
Every NLWeb instance is also an MCP server. This means that a website running NLWeb automatically becomes discoverable and queryable by any agent in the MCP ecosystem — including Claude, ChatGPT, and any MCP-compatible tool [8] [9]. This directly addresses the tool discovery problem that WebMCP hasn't solved yet.
NLWeb is already in production. Shopify, Tripadvisor, Snowflake, Eventbrite, O'Reilly Media, and Chicago Public Media are among the early adopters [11]. This isn't a theoretical spec behind a feature flag — it's running on real websites with real traffic.
NLWeb is model-agnostic and platform-agnostic. Microsoft has tested it with OpenAI, DeepSeek, Gemini, Anthropic's Claude, and others. It runs on major cloud platforms as well as laptops [9].
The analyst perspective from Dion Hinchcliffe of The Futurum Group underscores the strategic dimension:
How WebMCP, NLWeb, and Schema.org Fit Together
WebMCP and NLWeb are not competing standards. They operate on different layers and serve different interaction patterns:
WebMCP is a browser-native API. It enables agents to perform specific actions on a website — submitting forms, configuring products, completing checkouts. It works client-side, within the user's browser session, and shares the user's authentication context. WebMCP does not technically require Schema.org, but agents that need to decide which website to act on benefit enormously from structured data.
NLWeb is a server-side protocol. It enables agents (and humans) to query a website's content through natural language and receive structured responses. It explicitly uses Schema.org as its primary data source, turning existing structured markup into a conversational interface. Every NLWeb instance is also an MCP server, making the website's content discoverable to the broader agent ecosystem.
Schema.org is the shared foundation. For WebMCP, it provides the context that helps agents choose where to act. For NLWeb, it provides the data that powers the conversational interface itself. Without Schema.org, WebMCP agents act without understanding. Without Schema.org, NLWeb has nothing to index.
A website with all three layers — Schema.org for understanding, NLWeb for conversation, WebMCP for action — becomes a fully capable participant in the agentic web. It can be found, queried, understood, and transacted with.
The fact that R.V. Guha, the creator of Schema.org, is now building NLWeb on top of it is not a coincidence. It's the logical next step in a trajectory he began in 2011: first, make web content machine-readable. Now, make it machine-conversable and machine-actionable.
What This Means for Structured Data Strategy
Neither WebMCP nor NLWeb replaces Schema.org. Both raise the stakes.
Until now, the case for comprehensive schema markup was primarily about search visibility: better rankings, richer snippets, higher click-through rates, and AI citation. Those arguments remain valid. GEO research published at KDD 2024 by Aggarwal et al. demonstrates measurable visibility improvements of up to 40% for structured, authoritative content [5]. A Data.world study (2023) showed that Enterprise Knowledge Graphs built on structured data increased GPT-4's response accuracy from 16% to 54% [6].
But WebMCP and NLWeb add new dimensions. The question is no longer just "Can AI find and understand your content?" It's now also "Can AI interact with your website on behalf of the user?" and "Can AI agents query your website in natural language and get accurate answers?" The answer to both questions depends on the quality of the structured data underneath.
The more websites become actionable through WebMCP and conversational through NLWeb, the more critical the question becomes: which websites do agents choose? An agent that can interact with hundreds of stores will recommend the ones it understands best. A conversational query that can hit dozens of NLWeb-enabled sites will surface results from the ones with the richest Schema.org data.
enhancely.ai automates exactly this foundation. Over 30 Schema.org types — Organization, Product, Offer, AggregateRating, FAQPage, BreadcrumbList, and many more — are automatically generated and maintained across your entire website. No manual effort. No frontend changes. Compatible with any CMS or shop system. This structured data layer is what makes a website discoverable, understandable, and trustworthy to AI systems — the prerequisite for being chosen when agents start acting and conversing.
WebMCP builds the action interface. NLWeb builds the conversational interface. Schema.org provides the data that makes both valuable. Together, they form what it means to treat your website as an API for AI.
The Window Is Open Now
WebMCP is available as an early preview in Chrome 146 Canary behind a feature flag [7]. NLWeb is open source and already in production at major publishers [9]. Both are early-stage in the sense that the ecosystem is still forming. But the direction is unambiguous: the web is being rebuilt for AI agents, and both Google and Microsoft are investing heavily.
Dan Petrovic draws the parallel to the emergence of SEO: search engines needed structured signals to understand websites, and an entire discipline emerged. WebMCP and NLWeb mark the beginning of the same paradigm shift, this time for AI agents instead of crawlers [4]. The tool discovery problem that WebMCP hasn't solved yet — agents can only find tools when they've already navigated to a page — is one that NLWeb addresses directly by making every instance an MCP server [9].
For website operators, the takeaway is straightforward: investing in machine-readable content today builds the foundation on which both protocols operate. Schema.org is the starting point. WebMCP and NLWeb are the next chapters. All three together are what makes a website AI-ready.
FAQ
Does WebMCP or NLWeb replace Schema.org?
No. Both build on Schema.org rather than replacing it. WebMCP uses structured tool contracts that benefit from Schema.org context for agent decision-making. NLWeb directly ingests Schema.org data as its primary source for building conversational interfaces. Without comprehensive Schema.org markup, both protocols deliver diminished results.
Does NLWeb work without Schema.org?
NLWeb can ingest various data formats, but Schema.org and RSS are its primary data sources. The protocol was explicitly designed to leverage structured data that websites already publish [9]. A site with rich Schema.org markup will generate significantly better NLWeb responses than one without.
Is WebMCP tied to a specific AI model?
No. WebMCP is model-agnostic [4]. So is NLWeb, which has been tested with OpenAI, DeepSeek, Gemini, Anthropic's Claude, and others [9].
What is the difference between WebMCP and NLWeb?
WebMCP is a browser-native API that lets agents perform actions on websites — submitting forms, completing checkouts, running searches. NLWeb is a server-side protocol that lets agents (and humans) query website content in natural language and receive structured responses. WebMCP handles action. NLWeb handles conversation. Both benefit from Schema.org as the underlying data layer.
Which browsers support WebMCP?
As of February 2026, WebMCP is available as an early preview in Chrome 146 Canary behind the "Experimental Web Platform Features" flag [7]. Firefox, Safari, and Edge are participating in the W3C working group but haven't shipped implementations yet. The cross-vendor authorship (Microsoft and Google) suggests broader support is likely.
What should I do now to prepare for WebMCP and NLWeb?
Start with the foundation: ensure your website has comprehensive, accurate Schema.org markup. This is the layer that powers NLWeb's conversational interface and helps WebMCP agents evaluate your site. enhancely.ai automates this process across your entire site with a simple integration.
Does WebMCP work without Schema.org?
Technically, yes. An agent can invoke a WebMCP tool without any Schema.org markup on the page. But without structured data, the agent can't evaluate whether the product, service, or content is relevant to the user's query. WebMCP without Schema.org is functional, but the agent can't make an informed recommendation.
Who is using NLWeb today?
Early adopters include Shopify, Tripadvisor, Snowflake, Eventbrite, O'Reilly Media, and Chicago Public Media [11]. The protocol is open source and can be deployed on any major cloud platform or run locally.
How does enhancely.ai relate to WebMCP and NLWeb?
enhancely.ai automates the structured data layer that both protocols build on. WebMCP enables the action layer — allowing agents to interact with your website. NLWeb enables the conversational layer — allowing agents to query your content in natural language. enhancely ensures the knowledge layer is in place: the Schema.org markup that feeds NLWeb's index, helps WebMCP agents evaluate your site, and ultimately determines whether agents choose you over competitors.
Footnotes and Sources
[1] André Cipriani Bandarra: "WebMCP is available for early preview", Chrome Developers Blog, February 10, 2026.
[2] W3C Web Machine Learning Community Group: WebMCP Specification. Editors: Brandon Walderman (Microsoft), Khushal Sagar (Google), Dominic Farolino (Google). See also: WebMCP Explainer on GitHub.
[3] VentureBeat: "Google Chrome ships WebMCP in early preview, turning every website into a structured tool for AI agents", February 12, 2026.
[4] Dan Petrovic: "WebMCP", DEJAN Blog, February 10, 2026.
[5] Aggarwal et al.: "GEO: Generative Engine Optimization", published at KDD 2024. Experiments on Perplexity.ai demonstrated real-world visibility improvements of up to 37–40% for structured, authoritative content.
[6] Data.world (2023): Enterprise Knowledge Graph study demonstrating GPT-4 response accuracy improvements from 16% to 54% when structured data was available.
[7] Bug0: "WebMCP just landed in Chrome 146. Here's what you need to know", February 10, 2026. Additional reporting: Barry Schwartz, "Google previews WebMCP, a new protocol for AI agent interactions", Search Engine Land, February 11, 2026.
[8] Microsoft Official Blog: "Microsoft Build 2025: The age of AI agents and building the open agentic web", May 19, 2025.
[9] Microsoft: NLWeb GitHub Repository. See also: Microsoft Source, "Introducing NLWeb: Bringing conversational interfaces directly to the web", July 1, 2025.
[10] Glama.ai: "NLWeb: Microsoft's Protocol for AI-Powered Website Search", June 1, 2025.
[11] The Letter Two: "Microsoft NLWeb Brings Conversational AI to Websites", May 19, 2025.
[12] Computerworld: "Why is Microsoft offering to turn websites into AI apps with NLWeb?", May 20, 2025.
This article draws on the official Chrome Developers Blog announcement, the W3C WebMCP specification, the Microsoft NLWeb GitHub repository, Microsoft's Build 2025 keynote, industry analysis from DEJAN, VentureBeat, Search Engine Land, Bug0, Glama.ai, The Letter Two, and Computerworld, as well as GEO research published at KDD 2024 and enterprise knowledge graph studies by Data.world.