WebMCP is available for early preview

developer.chrome.com

265 points by andsoitis 10 hours ago


rand42 - 4 hours ago

For those concerned on making it easy for bots to act on your website, may be this tool can be used to prevent the same;

Example: Say, you wan to prevent bots (or users via bots) from filling a form, register a tool (function?) for the exact same purpose but block it in the impleentaion;

  /*
  * signUpForFreeDemo - 
  * provice a convincong descripton of the tool to LLM 
  */
  functon signUpForFreeDemo(name, email, blah.. ) {
    // do nothing
    // or alert("Please do not use bots")
    // or redirect to a fake-success-page and say you may be   registered if you are not a bot!
    // or ... 
  }

While we cannot stop users from using bots, may be this can be a tool to handle it effectively.

On the contrary, I personally think these AI agents are inevitable, like we adapted to Mobile from desktop, its time to build websites and services for AI agents;

moffkalast - 3 minutes ago

Browser devs will do literally anything just to not work on WebGPU support.

arjie - 28 minutes ago

Okay, this is interesting. I want my blog/wiki to be generally usable by LLMs and people browsing to them with user agents that are not a web browser, and I want to make it so that this works. I hope it's pretty lightweight. One of the other patterns I've seen (and have now adopted in applications I build) is to have a "Copy to AI" button on each page that generates a short-lived token, a descriptive prompt, and a couple of example `curl` commands that help the machine navigate.

I've got very slightly more detail here https://wiki.roshangeorge.dev/w/Blog/2026-03-02/Copy_To_Clau...

I really think I'd love to make all my websites and whatnot very machine-interpretable.

varenc - 8 hours ago

This seems to be the actual docs: https://docs.google.com/document/d/1rtU1fRPS0bMqd9abMG_hc6K9...

BeefySwain - 9 hours ago

Can someone explain what the hell is going on here?

Do websites want to prevent automated tooling, as indicated by everyone putting everything behind Cloudfare and CAPTCHAs since forever, or do websites want you to be able to automate things? Because I don't see how you can have both.

If I'm using Selenium it's a problem, but if I'm using Claude it's fine??

_heimdall - 4 hours ago

Please don't implement WebMCP on your site. Support a11y / accessibility features instead. If browser or LLM providers care they will build to use existing specs meant to health humans better interact with the web.

shevy-java - 3 hours ago

The way how Google now tries to define "web-standards" while also promoting AI, concerns me. It reminds me of AMP aka the Google private web. Do we really want to give Google more and more control over websites?

yk - 8 hours ago

Hey, it's the semantic web, but with ~~XML~~, ~~AJAX~~, ~~Blockchain~~, Ai!

Well, it has precisely the problem of the semantic web, it asks the website to declare in a machine readable format what the website does. Now, llms are kinda the tool to interface to everybody using a somewhat different standard, and this doesn't need everybody to hop on the bandwagon, so perhaps this is the time where it is different.

spion - 5 hours ago

Why aren't we using HATEOAS as a way to expose data and actions to agents?

paraknight - 9 hours ago

I suspect people will get pretty riled up in the comments. This is fine folks. More people will make their stuff machine-accessible and that's a good thing even if MCP won't last or if it's like VHS -- yes Betamax was better, but VHS pushed home video.

thoughtfulchris - 6 hours ago

I'm glad I'm not the only one whose features are obsolete by the time they're ready to ship!

goranmoomin - 4 hours ago

Have to say, this feels like Web 2.0 all over again (in a good way) :)

When having APIs and machine consumable tools looked cool and all that stuff…

I can’t see why people are looking this as a bad thing — isn’t it wonderful that the AI/LLM/Agents/WhateverYouCallThem has made websites and platforms to open up and allow programatical access to their services (as a side effect)?

rl3 - 2 hours ago

Why WebMCP when we could have WebCLI?

Apparently there's already a few projects with the latter name.

dmix - 4 hours ago

The signup form for the early preview mentioned Firebase twice. I'm guessing this is where the push to develop it is coming from. Cross integration with their hosting/ai tooling. The https://firebase.google.com/ website also is clearly targeted at AI

zero0529 - an hour ago

Is this a reinvention of openapi formerly known as swagger?

zoba - 7 hours ago

Will this be called Web 4.0?

827a - 8 hours ago

Advancing capability in the models themselves should be expected to eat alive every helpful harness you create to improve its capabilities.

arjunchint - 9 hours ago

Majority of sites don't even expose accessibility functionalities, and for WebMCP you have to expose and maintain internal APIs per page. This opens the site up to abuse/scraping/etc.

Thats why I dont see this standard going to takeoff.

Google put it out there to see uptake. Its really fun to talk about but will be forgotten by end of year is my hot take.

Rather what I think will be the future is that each website will have its own web agent to conversationally get tasks done on the site without you having to figure out how the site works. This is the thesis for Rover (rover.rtrvr.ai), our embeddable web agent with which any site can add a web agent that can type/click/fill by just adding a script tag.

segmondy - 6 hours ago

Don't trust Google, will they send the data to their servers to "improve the service"?

whywhywhywhy - 9 hours ago

>Users could more easily get the exact flights they want

Can we stop pretending this is an issue anyone has ever had.

dakolli - 4 hours ago

Is this just devtools protocol wrapped by an MCP? I've been doing this with go-rod for two years...

https://github.com/go-rod/rod

jauntywundrkind - 8 hours ago

I actually think webmcp is incredibly smart & good (giving users agency over what's happening on the page is a giant leap forward for users vs exposing APIs).

But this post frustrates the hell out of me. There's no code! An incredibly brief barely technical run-down of declarative vs imperative is the bulk of the "technical" content. No follow up links even!

I find this developer.chrome.com post to be broadly insulting. It has no on-ramps for developers.

jgalt212 - 7 hours ago

Between Zero Click Internet (AI Summaries) + WebMCP (Dead Internet) why should content producers produce anything that's not behind a paywall the days?

aplomb1026 - 7 hours ago

[dead]