MarTech

WebMCP Is Coming and It Will Change How AI Agents Interact with Your Website. Start Preparing Now.

Robots.txt told crawlers where they could go. WebMCP tells agents what they can do. The shift from passive crawling to active interaction is the biggest change to the web since search.

Filip Ivanković · 3 min read

A new specification called WebMCP is gaining traction as a proposed standard for how AI agents discover and interact with website capabilities. If the specification succeeds, it will become the interface layer between your website and the growing population of AI agents that visit it, the same way robots.txt became the interface for search crawlers.

The core idea is straightforward. Today, AI agents visiting your website have to scrape HTML, interpret page structure and guess at functionality. WebMCP proposes a structured file that tells agents what your site can do, what data is available and how to interact with it programmatically. Think of it as an API discovery layer for AI agents, hosted at a known path on your domain.
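To make the idea concrete, here is a sketch of what such a discovery file might contain. The specification is still in flux, so the path, field names and structure below are illustrative assumptions, not taken from the WebMCP draft:

```json
{
  "name": "Example Store",
  "description": "Online retailer of outdoor gear",
  "tools": [
    {
      "name": "search_products",
      "description": "Search the product catalogue by keyword",
      "input": { "query": "string", "max_results": "number" },
      "endpoint": "https://example.com/api/search"
    },
    {
      "name": "check_stock",
      "description": "Check availability for a given product SKU",
      "input": { "sku": "string" },
      "endpoint": "https://example.com/api/stock"
    }
  ]
}
```

The point is the shape, not the syntax: each capability is declared with a name, a description an agent can reason about and a machine-callable endpoint, instead of leaving the agent to infer all of this from rendered HTML.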

The specification is still early. It is not yet widely adopted and the tooling ecosystem is nascent. But the trajectory is clear. As AI agents become more prevalent (shopping agents, research agents, customer service agents, content aggregation agents), they need a more efficient way to interact with websites than screen-scraping HTML. WebMCP or something like it is inevitable.

2025

The year WebMCP gained serious momentum as AI agent traffic became a measurable share of web visits

The implications for marketers and web teams are meaningful even at this early stage. If AI agents become a significant source of traffic and transactions, the businesses that make their sites agent-friendly will capture that traffic. The ones that do not will be invisible to the agent layer, the same way sites without proper SEO are invisible to search.

The parallels to the early days of SEO are striking. In 2000, most businesses did not think search engine optimisation mattered because search was a small share of traffic. By 2005, it was the primary channel. AI agent interaction could follow the same curve, slowly at first and then all at once.

Why it matters

The web is shifting from a human-first browsing model to a hybrid model where both humans and AI agents interact with your site. Your website already has two audiences: people and crawlers. WebMCP adds a third: agents that do not just read your site but interact with it. If your site is not structured for agent interaction, you will lose traffic and transactions to competitors who are.

This is especially relevant for e-commerce, SaaS, professional services and any business where an AI agent might comparison shop, book appointments or request information on behalf of a user.

What to do about it

You do not need to implement WebMCP today. But you should be aware of it and start thinking about your site through the lens of agent interaction. Structured data (schema.org) is the first step because it is already the foundation of machine-readable web content. Clean, well-documented APIs are the second step. When WebMCP or its equivalent becomes standard, the businesses with structured, machine-readable sites will be ready. The ones running everything through JavaScript-rendered SPAs with no structured data will be scrambling to catch up.
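Structured data is the part you can act on today. A minimal schema.org JSON-LD block for a product page, embedded in a script tag of type application/ld+json, looks like this (the product details are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trail Running Shoe",
  "sku": "TRS-001",
  "offers": {
    "@type": "Offer",
    "price": "149.00",
    "priceCurrency": "AUD",
    "availability": "https://schema.org/InStock"
  }
}
```

Markup like this already feeds search features such as rich results, and it gives any agent a machine-readable statement of what the page is about, whatever discovery standard eventually wins.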


Filip Ivanković, Founder, New Rebellion

10+ years leading performance marketing across agencies and in-house teams in Australia. Writes about the gap between marketing activity and commercial outcomes, and what it takes to close it.
