bad_haircut72 2 hours ago

"The rise of MCP gives hope that the popularity of AI amongst coders might pry open all these other platforms to make them programmable for any purpose, not just so that LLMs can control them."

I think the opposite, MCP is destined to fail for the exact same reason the semantic web failed, nobody makes money when things aren't locked down.

It makes me wonder how much functionality of things like AI searching the web for us (sorry, doing "deep-research") might have been solved in better ways. We could have had restaurants publish their menus in a metadata format and anyone could write a python script to, say, find the cheapest tacos in Texas, but no, the left hand locks down data behind artificial barriers and then the right hand builds AI (datacenters and all) to get around it. On a macro level it's just plain stupid.
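
To make that concrete, here's a rough sketch of what such a script could look like. The menu format and URLs are made up (loosely modeled on schema.org's Menu/MenuItem/Offer types); the point is that if this kind of metadata were published, the whole "deep research" task is a dozen lines of Python:

    import json
    from urllib.request import urlopen

    # Hypothetical published menu files, loosely schema.org-shaped.
    MENU_URLS = [
        "https://tacos-austin.example/menu.json",
        "https://tacos-elpaso.example/menu.json",
    ]

    cheapest = None
    for url in MENU_URLS:
        menu = json.load(urlopen(url))  # e.g. {"provider": "...", "hasMenuItem": [...]}
        for item in menu["hasMenuItem"]:
            if "taco" not in item["name"].lower():
                continue
            price = float(item["offers"]["price"])
            if cheapest is None or price < cheapest[0]:
                cheapest = (price, item["name"], menu["provider"])

    if cheapest:
        price, name, restaurant = cheapest
        print(f"Cheapest taco: {name} at {restaurant} for ${price:.2f}")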

  • fidotron 2 hours ago

    > I think the opposite, MCP is destined to fail for the exact same reason the semantic web failed, nobody makes money when things aren't locked down.

    I think this is right. MCP resembles robots.txt evolved into some higher lifeform, but it's still very much "describe your resources for us to exploit them".

    The reason the previous agent wave died (it was a Java thing in the 90s) was that eventually everyone realized they couldn't trust their code once it was running on a machine it was supposed to be negotiating with. Fundamentally there is an information asymmetry problem between interacting agents, entirely by design. Take that away and huge swathes of society will stop functioning.

    • arbuge an hour ago

      "describe your resources for us to exploit them"

      What you want to do is offer resources that make you money when they're "exploited".

      • fidotron a minute ago

        I would agree with that if there were no distinction between clients and servers, i.e. agents and LLMs are resources that should be discovered and exploited in exactly the same way as anything else, and switchable in the same ways.

  • doug_durham 2 hours ago

    Plain human readable text is not an "artificial barrier". It's the nature of our world. Requiring that a restaurant publish menus in a metadata format is an artificial barrier. That's the beauty of these new NLP tools. I don't need to have a restaurant owner learn JSON, or buy a software package that generates JSON. We can use data as it is. The cost of building useful tools goes to near zero. It will be imprecise, but that's what human language is.

    • lucideer an hour ago

      > It's the nature of our world.

      It's the nature of capitalism.

      Some forms of capitalism may have roots in the natural world - natural selection as both a destructive & wasteful competitive process certainly has a lot of parallels in idealised markets - but there's nothing inherent about your menu example when it comes to the modern human world, beyond restrictions placed upon us by capitalism.

      > Requiring that a restaurant publish menus in a metadata format is an artificial barrier

      This is oddly phrased, as no one would need to require anyone to do anything - it's obviously beneficial to a restaurant to publish their menus in formats that are as broadly usable as possible. The only barrier to them doing that is access to tools.

      The various hurdles you're describing ("buying" software, the "cost" of building tools) are not natural phenomena.

    • Y_Y 2 hours ago

      Plain text menus would have been fine

  • jjfoooo4 an hour ago

    MCP is described as a means to make the web open, but it's actually a means to make demos of neat things you could do if the web were actually open.

  • jsnell an hour ago

    It's not just that nobody makes money providing a free and open API. It's that to operate such an API you'll basically need unlimited resources. No matter how many resources you throw at the problem, somebody will still figure out a way of exhausting those resources for marginal gains. MCP will just make the problem worse as AI agents descend on any open MCP servers like locusts.

    The only stable option, I think, is going to be pay-per-call RPC pricing. It's at least more viable to do than it was for Web 2.0 APIs, since at least the entity operating the model / agent will act as a clearinghouse for all the payments. (And I guess their most likely billing model is to fold these costs into their subscription plans? That seems like the best way to align incentives.)

  • ljm 2 hours ago

    HATEOAS was the dream in the early 2010s and that basically went nowhere beyond generating Swagger YAML, despite the fact that it was intended to make API consumption trivial.

    Whoever coined it as HATEOAS basically set it up to fail though.

    • johnmaguire 2 hours ago

      > Whoever coined it as HATEOAS basically set it up to fail though.

      I could never understand making the term "hate" so prominent.

    • dragonwriter 2 hours ago

      > HATEOAS was the dream in the early 2010s and that basically went nowhere

      I dunno, HTTP/1.1, the motivating use case for REST and HATEOAS, seems to have been moderately successful.

    • badgersnake 2 hours ago

      MCP is just that again, but less well thought out. Everything new is old.

  • alberth an hour ago

    Sure - not many companies made money on "HTTP", but lots of people/companies made gobs of money by adopting it.

  • philosophty 2 hours ago

    I haven't paid close attention. Why can't people make money with MCP-based APIs? Why can't providers require API keys / payment to call their functions?

    • olalonde 34 minutes ago

      Sure they can - they're just another API interface tailored for LLMs. I think parent and OP are in fact ranting about that (many APIs being locked behind signups or paywalls). Not sure I agree with the criticism though. In my view, web 2.0 was a huge success: we went from a world with almost no APIs to one where nearly every major website or app offers one. That's real progress, even if we didn't turn every business into an open data non-profit.

  • drusepth an hour ago

    > I think the opposite, MCP is destined to fail for the exact same reason the semantic web failed, nobody makes money when things aren't locked down.

    Is there a way to handle "locking down" things with MCP? It seems like a potential opportunity for freemium services if they have a mechanism for authentication and telling the MCP caller "this user has access to these tools, but not _these_ yet".

    • seanhunter 38 minutes ago

      Yes. MCP allows (and uses) exactly the same authentication mechanisms that any other REST or similar API allows. So if you have a service you want to expose (or not) via MCP, you can do that in exactly the same way as you currently could for a REST API.

      The difference for the user is that instead of having to make (or use) a special-purpose client to call your REST API, the LLM (or LLM-powered application) can just call the API for them, meaning your REST service can be integrated into other LLM-powered workflows.
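
      To make that concrete, here's a minimal sketch of an MCP tool that simply wraps an existing REST endpoint and reuses whatever credential that API already requires. The server name, endpoint, and environment variable are made up, and it assumes the Python MCP SDK's FastMCP helper plus the requests library:

          import os

          import requests
          from mcp.server.fastmcp import FastMCP

          mcp = FastMCP("report-demo")  # hypothetical server name

          @mcp.tool()
          def get_report(report_id: str) -> dict:
              """Fetch a report from a (hypothetical) existing REST API."""
              resp = requests.get(
                  f"https://api.example.com/v1/reports/{report_id}",
                  # Same bearer token / API key the REST API already expects.
                  headers={"Authorization": f"Bearer {os.environ['REPORT_API_KEY']}"},
                  timeout=10,
              )
              resp.raise_for_status()  # an invalid key fails here, same as for any other client
              return resp.json()

          if __name__ == "__main__":
              mcp.run()

      Nothing about the auth story changes: the server holds the credential, and an unauthenticated or unauthorized call fails the same way it would for any other REST client.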

  • Joker_vD 2 hours ago

    The reason the semantic web failed is not only because "nobody makes money when things aren't locked down". It's also because nobody ain't got no time for generating infinite amount of metadata when full-text search and indexing, with a judicious pinch of fuzzy matching, is both faster and more reliable. And LLMs, as much as I dislike the technological/societal consequences of their existence, are effectively further development of the latter, so they won't go away.

    Manual or even semi-automated cataloguing (and further curating) of websites wasn't the answer to "how do I find stuff on the web" — Google was. Having a standardized metadata format for menus is undoubtedly nice — but good luck making people use it. You just can't. It really is both cheaper and easier for everyone involved to have a website with an arbitrary information layout scraped and fed into an LLM to extract relevant data: because what is "relevant" is different for everyone. You can't pre-ordain the full list of possible relevant metadata, and, again, good luck forcing everyone to fill out those 500-item-long forms. Ain't nobody got time for that.

    • jauntywundrkind 44 minutes ago

      I tend to agree one of the top semantic web problems was:

      > It's also because nobody ain't got no time for generating infinite amount of metadata

      There are also a lot of tooling problems, in that the semantic web doesn't integrate gracefully with the POJOs of the programming world.

      The tooling distance between users/devs and the semantic web remains. But all that metadata? An interesting, rich world of information, associated & well described & meticulous? We actually seem to have just invented a really powerful tool for doing all this immense catalogization (LLMs).

  • throwaway7783 2 hours ago

    MCP is basically APIs V2 as far as I can see. It probably will evolve in its concrete specs, but it's useful and not niche, especially when servers can be composed fairly trivially.

    In that sense, it is probably the building block for the next user interface, which is conversational.

    Maybe the mention of Web 2.0 is triggering all the negative responses here, but on its own it is useful and could disrupt (not MCP itself, but the overall field) how we interact with software and systems.

  • isodev 2 hours ago

    I think MCP’s popularity is a side effect of the hype bubble driving AI atm - one of the fancy things one can do with AI.

    If there was any “easy” value in making one’s data available in a standard form, we would’ve seen a lot more adoption of interoperable endpoints (e.g. using schema.org or generally common ontologies as opposed to custom formats that always need a special magic SDK).

    • doug_durham 2 hours ago

      There is an easy way to make your data available. It's existed for several hundred years: it's called plain text. We now have tools that allow computers to work with plain text. Outside of specific niches, ontologies are vanity projects.

  • gz5 2 hours ago

    >nobody makes money when things aren't locked down

    i would rephrase as "incumbents don't usually make more money if things are opened up".

    if consumers get materially better value, then a challenger ecosystem around MCP will evolve. it will be open at first - great for startups and challengers, an innovator's dilemma for market leaders.

    and then it will close as the new wave establishes its moats. but, similar to the web, even though the current web leaders are now much more closed than we would like, the overall ecosystem is more open than it was.

  • testplzignore 38 minutes ago

    > On a macro level it's just plain stupid.

    You've described most white-collar jobs :)

  • AlienRobot 13 minutes ago

    I still don't know who uses this semantic web. Like, you have all these semantics marked up... for whom? What actual applications are using this?

    Google supports a small subset of schema.org, but rather than "semantic web" it feels more like "here's my API." Its own schema tester often complains about things that should be valid schemas, simply because they don't conform to its API. How would any web developer mark up (and test the validity of said markup) for applications that don't even exist?

  • throwaway13337 2 hours ago

    It's reasonable to be cynical, but the future hasn't been written yet. If we choose only to see a negative future, we will ensure that it's the only one that can exist.

    In the negative vein, I see a lot of VCs and business leaders talking about making AI for companies that directly interface with customers.

    Those agents will be used to manipulate and to make existing services painful, exactly like today: enshittified transactional websites engineered for maximum pain.

    A different direction can happen if we choose instead to use our AI agents to interact with business services. This is actually what's currently happening.

    I use gemini/chatgpt to find things on the web for me without being manipulated and marketed at. Maybe one day soon, I can buy airline tickets without the minefield of dark patterns employed to maximize A/B tested revenue.

    The only thing that needs to happen to keep us on this path is to bite the heels of the major companies with great agent systems that put the user at the center and not a brand. That means selling AI agents as a SaaS or open source projects - not ad-supported models.

    This community is probably the group that has, collectively, the most say in where this future goes. Let's choose optimism.

    • jahewson 2 hours ago

      The thing is, if AI agents become a significant part of web traffic then the content of the web will simply shift to manipulate the agent instead of the human.

      And don’t forget that when you use an AI agent today to buy something, it’s using “marketing” information to make its decisions. It’s influenced by SEO in its search results; indeed, there’s no shortage of marketers busy working out how to do this.

      I do agree there’s much to be optimistic about but the fundamental dynamics of the consumer market won’t change.

      • throwaway13337 an hour ago

        It's absolutely true that in that future vision, the agents will then be marketed at.

        And that's great.

        In that world, those agents will sift through the noise. And the one that does that the best will win.

        The end user experience then becomes uniform and pleasant.

        It's the difference between talking to a personal secretary and a customer service representative. No one should have to endure the latter.

        • kibwen an hour ago

          > In that world, those agents will sift through the noise. And the one that does that the best will win.

          The existence of agents capable of learning to cut through the enshittification also implies the existence of agents capable of learning to enshittify all the more effectively. It's an arms race, and there's no reason to suspect that the pro-consumer side will win over the pro-exploitation side.

  • pphysch 2 hours ago

    xAI is a concrete example of this. During the initial LLM explosion, X locked down its previously public APIs and data sources. Simultaneously, xAI is investing massively in building its private data hoard and compute infrastructure. Probably a similar case with Meta.

    "Data for me but not for thee"

    MCP is only seeing the light of day, arguably, because of LLM "one trick ponies" like OpenAI and Anthropic, who do benefit from MCP amplifying their value proposition. As that business model continues to fizzle out and lose to (or become subordinate to) the AI integrators (Google, Microsoft, xAI?), MCP will probably fizzle out as well.

jacob019 15 minutes ago

Fun writing, and something to think about. To me, Web 2.0 is kind of a joke; jQuery, REST, AJAX, CSS2, RSS, single page apps were going to change everything overnight, it was THE buzzword, and then... incremental improvements. In retrospect, everything did change, but that loose collection of technologies was just links in the chain of incremental progress. So yeah, Web 2.0 2.0 makes sense.

I've seen a lot of talk around here, and everywhere, about MCP. A lot of enthusiasm and a lot of criticism. I've written a few MCP servers and plan to write some more. It isn't how I would design a protocol, but it works, and everyone is using it, so hooray for interoperability.

I think the hype represents the wider enthusiasm that people have about this moment in time, and the transformative power of these new tools. It's easy to look at MCP and say there it is, now it's easy to connect these tools to the things that I care about, it feels accessible, and there's community.

MCP is NOT the new TCP for AI. It is, essentially, an RPC layer for chat. And while it could be extended to a wider variety of use cases, I expect that it will remain primarily a method for wiring up tool calls for user-facing use cases. We recognize the power of these tools and anticipate deep changes to workflows and systems, but we don't know how that will shake out. If I am writing a classifier for a backend system, I am not going to use MCP, even if I could. Because it's inefficient. Every additional tool we offer the model consumes tokens and increases latency. I expect that the primary user of LLMs is going to be business automation of all kinds, and I don't expect them to reach for MCP to wire things up. Yeah, it's really cool to hook tools up to the chat, for that to feel accessible, to know how to do things in an idiomatic and standards-compliant way, that feels good! And yeah, the hype is overblown.

tagfowufe 2 hours ago

While I understand where the author is coming from, and I get his sentiment(s), I don't think what he proposes is actually possible: his vision relies on faux open tools and protocols and having access to walled gardens. The means of computation for these kinds of things are owned by a tiny minority. Nearly everything is a SaaS or is based, one way or the other, on rent extraction. We're essentially subject to the whims of someone who is letting us do something for as long as we play nice.

>There is a chance, though, that younger developers, and those who weren't around to build back during that last era a generation ago, are going to get inspired by MCP to push for the web to go back towards its natural architecture. It was never meant to be proprietary.

Alas, the reason APIs started closing and being metered is that, after all, someone owns and pays for the hardware against which you are making calls and requests.

As long as there's no way to agree upon how to have a bunch of servers providing computation for anyone while at the same time ensuring their upkeep without the need for a central authority, I don't think such a vision is sustainable long term. The current state of the Internet is proof of it.

zoogeny an hour ago

> Compared to the olden days, when specs were written by pedantic old Unix dudes

I think that is one of the reasons (among many others) that the semantic web failed (which doesn't contradict the author, whose point is literally the worse-is-better mantra).

People really leaned into the eXtensible part of XML and I think a certain amount of fatigue set in. XSL, XHTML, XSD, WSDL, XSLT, RDF, RSS, et al. just became a bit too much. It was architecture astronautics for data formats when what the world at the time needed was simple interchange formats (and JSON fit the bill).

But I actually believe XML's time has come. I've noticed that XML appears a lot in leaked system prompts from places like Anthropic. LLMs appear to work very well with structured text formats (Markdown and XML specifically).

I believe that MCP is the wrong model, though. I believe we should be "pushing" context to the models rather than giving them directions on how to "pull" the context themselves.

nimish 37 minutes ago

Rent seeking is the name of the game for much of b2b SaaS.

MCP is an attempt to make that easy, but the issue here is that a lot of the companies offering integration could be disintermediated entirely by LLMs. Hard to say what that means.

daemonk an hour ago

At a higher level, MCP seems to want to enforce a standard where no standard exists. I get that the low-level technical implementation allows AI to utilize these tools.

But there doesn't seem to be any standardization or method in how to describe the tool to the AI so that it can utilize it well. And I guess part of the power of AI is that you shouldn't need to standardize that? But shouldn't there at least be some way to describe the tool's functionality in natural language or give some context to the tool?

lxgr 3 hours ago

Turns out the “Semantic Web” was a syntactic web all along, and maybe this is the real deal?

1oooqooq 13 minutes ago

I pity the fools thinking they will have access to anything because there's an MCP.

Those things will be hidden behind a dozen layers of payment validation and authentication. And whitelisted IPs (v4, of course).

ERR 402 is all that will be visible to y'all.

vivzkestrel an hour ago

Everything seems to be susceptible to enshittification and so far I see no evidence that MCP is any exception. First the value will go to users, then users will be cut short to drive value to shareholders, and then it will turn into an absolute pile of garbage as businesses make every attempt to somehow cash in on this.

RansomStark an hour ago

MCP could have cracked the web open. The terrible standard was all about clients and local servers, all on the same host.

Imagine it: everything is open, servers are as simple as a pip install ... You have full control over which servers you install, which functions you turn on, what access you allow.

Now everyone and their blog is sticking MCPs on their servers and locking them down behind subscriptions and paywalls.

What a wasted opportunity.

  • freeone3000 14 minutes ago

    And what pays for the resources used to serve your (hundreds of) requests against a “local” server? For computer control, sure, but actual remote services have actual remote costs.

quantadev an hour ago

We can now build the Semantic Web. All we have to do is create a tiny protocol (as an optional extension to MCP) for how organizations can share their SQL CREATE TABLE DDL as a static file that MCP apps can read to understand the data. Then, using the already-existing tools for AI/LLM function calling against SQL, that would become a Semantic Web.

That would fill in the missing link that always held back the Semantic Web: the lack of any incentive for companies to bother using a standard "data type" rather than all-proprietary data types. Once we have an MCPQ (MCP with Queries), suddenly there's an incentive for organizations to collaborate at the data-structure layer.
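
As a rough sketch of what that could look like (MCPQ is hypothetical; this assumes the Python MCP SDK's FastMCP helper and a local SQLite database): publish the DDL as a resource and pair it with a read-only query tool.

    import sqlite3

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("mcpq-demo")  # hypothetical server name
    DB_PATH = "catalog.db"      # hypothetical database

    @mcp.resource("schema://tables")
    def schema_ddl() -> str:
        """Publish the CREATE TABLE DDL so clients can understand the data."""
        with sqlite3.connect(DB_PATH) as conn:
            rows = conn.execute(
                "SELECT sql FROM sqlite_master WHERE type = 'table'"
            ).fetchall()
        return "\n\n".join(r[0] for r in rows if r[0])

    @mcp.tool()
    def query(sql: str) -> list[list]:
        """Run a read-only SELECT against the published schema."""
        if not sql.lstrip().lower().startswith("select"):
            raise ValueError("only SELECT statements are allowed")
        with sqlite3.connect(DB_PATH) as conn:
            return [list(row) for row in conn.execute(sql).fetchall()]

    if __name__ == "__main__":
        mcp.run()

Any client that can read the "schema://tables" resource knows the shape of the data without a custom SDK, which is exactly the incentive I'm describing.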

jaredcwhite 3 hours ago

MCP is not an open standard.

People routinely mistake "protocol specification uploaded to GitHub, PRs welcome" for open standards. They are not. Calling them "open protocols" because they are open source, when they are not open standards (no standards body was involved in the making of this protocol!), is essentially a form of openwashing.

This has been happening way too frequently lately (see also: ATProto), and it really needs to be called out.

  • underbluewaters 2 hours ago

    Successful standards usually start out scrappy, are embraced by a community, and then are blessed by standards bodies. What comes out of working groups of standards bodies rarely gains traction. See XHTML vs. "HTML5".

    • croes 2 hours ago

      Does MCP still have the security issues?

      A big mistake in the first place to start it without proper security.

      That's not Web 2.0 2.0, that's Web 1.0.

      • doug_durham 2 hours ago

        It's a mistake to not start at all because of an issue that may not matter for many important use cases.

        • croes an hour ago

          It's a mistake to start and to think security is an unimportant issue.

          Anything that’s connected to the web has to consider security at step one.

          How often are we repeating the same mistakes over and over again?

      • jedisct1 an hour ago

        MCP Servers are usually installed locally and can do whatever they want on the local machine.

        But this is solved by sandboxes such as mcp.run.

        • croes an hour ago

          Sandboxes aren't a solution, just a workaround for a bigger problem.

  • 98codes 2 hours ago

    Open standard or not, how is it a standard at all? I spent a while trying to find what the current version number of the spec is, and I could not find one. There were SDKs, and the SDKs have versions, but the protocol itself seems to be on v0.0.0.

  • nicoburns 2 hours ago

    > no standards body was involved in the making of this protocol!

    Is a standards body being involved relevant? Many standards ratified by standards bodies are "pay to access" and seem much less open than many de facto standards where no standards body was involved.

    • isodev 2 hours ago

      > Is a standards body being involved relevant?

      I believe it is. Taking the example of ATproto, it boils down to a managed platform as a service but they seem to do a #BuildInPublic thing where we can experience early developer previews. That’s not really open and it isn’t a standard.

      • philosophty 2 hours ago

        How are these not "open" if they're GitHub projects and entirely open source and open to outside contribution?

        The organizations that create "standards" are much less open than this, requiring in-person lobbying, schmoozing, travel, and company affiliations.

  • anildash 2 hours ago

    I was using the phrase in the vernacular sense; I’ve worked on genuine open standards and know the difference. I don’t think it matters that much to use the phrase in this way in an obviously casual piece on my personal blog, as opposed to a more formal assessment of a technology.

  • dangoor 2 hours ago

    It has become a de facto standard. There are many implementations of MCP. We'll have to see if it fizzles out or if some reasonable stewardship/governance comes about, but it is very much possible for something to be a standard even without a standards body.

  • sali0 2 hours ago

    Partially agree. But I can't help thinking this is the natural lifecycle of protocols. They first start as open projects, proliferate as such, and evolve into standards with governance once they catch on.

    What would you call these projects? Open protocols?

  • Toritori12 2 hours ago

    I guess it is a first step... a lot of current "open" protocols started as proprietary.

    • croes 2 hours ago

      The first step should include more security.

CSMastermind 2 hours ago

I really wish we'd learn from Web 2.0.

We're back to repeating all the mistakes of the "hey, everything has an API now" era that we supposedly learned from.

I feel like that meme from Watchmen with the blue guy sitting on Mars.

  • olalonde an hour ago

    What mistakes are we repeating?

    • thuanao 31 minutes ago

      Not understanding that businesses don't want open data or open protocols. Internet businesses want to monopolize information and charge rent for it. Capitalists don't want to commodify software, as that means lower profits and competition. They want to monopolize it. That's the whole game, except for companies selling physical goods over the Internet.

      • olalonde 3 minutes ago

        So, is the idea that MCP might benefit from more built-in support for things like paid APIs?

  • anildash 2 hours ago

    Definitely having that feeling a lot these days watching the cycles repeat. I am tired of earth. These people. Their APIs.

hansmayer 2 hours ago

Wasn't this guy involved in some NFT scheme a while ago? Hard pass on anything such types have to say about anything, I am afraid.

  • leptons an hour ago

    Yeah, did they just forget "Web 3.0" died in a garbage fire? Now the latest object of hype is "Web 2.0 2.0"? I want to get off this ride.

quotemstr an hour ago

People said the same thing about "APIs" 10-15 years ago when they were a craze. Everything had to be an API! Doesn't matter whether it made sense or not. It's going to change the world! We're going to have San Francisco events with microbrews for APIs! Everyone's going to publish API frameworks! Let me make api-blog-blog.blogger.blog!

Blah. Bay Area Tech regularly goes through these bursts of baseless enthusiasm for rehashes of existing technology. Is MCP useful? Yeah, probably. Is the current froth based on anything but SF-scene self-licking-ice-cream-cone social cred? No.

  • almog an hour ago

    Every bubble and its WSDL moment...

munificent 2 hours ago

[flagged]

  • anildash 2 hours ago

    I’m furious about it, I just made a joke in passing because this piece isn’t about that. As it turns out, when I write about the threat to democracy, HN doesn’t actually let the link survive. https://news.ycombinator.com/item?id=42607135

    • pvg 42 minutes ago

      HN can't fit everyone's writing about 'the threat to democracy', besides the fact that it's not really HN's remit. DOGE was still the most discussed topic on HN in the weeks after its establishment. The idea that HN didn't 'let you' is pretty spurious and (however understandably) self-absorbed.

    • munificent 2 hours ago

      > I just made a joke in passing because this piece isn’t about that.

      I get that and maybe it's just me but it really fell flat. The world is on fire and it's reasonable to take a break from writing about that sad reality but in that case, I'd rather the writing not mention it at all instead of making me think about the horrorshow but then making light of it.

      Your article about Procurement Capture is spot on.

      • anildash 2 hours ago

        I get it, we’re all sort of running into the brutality of the descent into authoritarianism at different times and in different ways all the time, we’ll be out of sync with our respective senses of whether we can process it as grief or absurdity or anger or fatigue or whatever at different times. I don’t fault anybody for feeling my tone is off if they’re not in the same mindset at the point when they’re reading.

  • Y_Y 2 hours ago

    Making jokes isn't an endorsement! It's often a coping mechanism.

  • geodel 2 hours ago

    The only thing infuriating here is your tone-policing of a good point.

    There is already enough policing by people in power. You not doing it would be a good thing.

cranberryturkey 3 hours ago

[flagged]

  • madsmith 3 hours ago

    Model Control Protocol is not Model Context Protocol. We might be suffering from too many acronyms. (Just to be clearer, I think you're referencing the wrong MCP.)

    • 98codes 2 hours ago

      We ran out of acronyms some time in the late 2000s I think; ever since then, name collisions everywhere.

    • 725686 3 hours ago

      That would be a TMA

    • cranberryturkey 3 hours ago

      Hmmm... what's the difference here, so I can understand better? I developed this for use with Roo Code, which supports MCP servers.

      • 85392_school 2 hours ago

        MCP is Model Context Protocol, a standardized API for declaring remote tools for AI to call. Roo Code supports Model Context Protocol, not your creation.

  • rafram 3 hours ago

    Why does an MCP server need my OpenAI API key? Isn't that kind of backwards? And doesn't it defeat the point of MCP, which is that one AI provider (OpenAI, whatever) can connect to many MCP servers?

  • jansan 3 hours ago

    What is that? What is the difference between Model Context Protocol and Model Control Protocol, and why are both AI related?

somat an hour ago

[flagged]