Bigger models are not driving the next wave of AI innovation. The real disruption is quieter: standardization.
Launched by Anthropic in November 2024, the Model Context Protocol (MCP) standardizes how AI applications interact with the world beyond their training data. Much as HTTP and REST standardized how web applications connect to services, MCP standardizes how AI models connect to tools.
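The comparison to HTTP is concrete: MCP messages are JSON-RPC 2.0, and a client speaks to every server through the same small set of methods. As a minimal sketch, here is what listing a server's tools and invoking one looks like on the wire; the tool name and arguments are hypothetical, for illustration only.

```python
import json

# An MCP client first asks a server which tools it exposes...
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# ...then invokes one by name. The shape is identical no matter which
# model or vendor sits on the client side -- that is the whole point.
# "get_open_tickets" and its arguments are made up for this example.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "get_open_tickets",
        "arguments": {"project": "CLOUD"},
    },
}

wire = json.dumps(call_request)
print(wire)
```

Because the envelope is standard JSON-RPC, swapping the model behind the client changes nothing about these messages.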
You have probably read a dozen articles explaining what MCP is. But most of them miss the boring, and powerful, part: MCP is a standard. Standards do not just organize technology; they create growth flywheels. Adopt them early and you ride the wave. Ignore them and you fall behind. This article explains why MCP matters now, what challenges it introduces and how it is already reshaping the ecosystem.
How MCP moves us from chaos to context
Meet Lily, a product manager at a cloud infrastructure company. She juggles projects across half a dozen tools like Jira, Figma, GitHub, Slack, Gmail and Confluence. Like many, she drowns in updates.
By 2024, Lily had seen how good large language models (LLMs) had become at synthesizing information. She spotted an opportunity: if she could feed all of her team's tools into a model, she could automate updates, draft communications and answer questions on demand. But every model had its own custom way of connecting to services. Each integration pulled her deeper into a single vendor's platform. When she needed to pull transcripts from Gong, it meant building yet another bespoke connection, making it even harder to move to a better LLM later.
Then Anthropic launched MCP: an open protocol to standardize how context flows to LLMs. MCP quickly picked up support from OpenAI, AWS, Azure, Microsoft Copilot Studio and, soon, Google. Official SDKs are available for Python, TypeScript, Java, C#, Rust, Kotlin and Swift. Community SDKs for Go and other languages followed. Adoption was rapid.
Today, Lily runs everything through Claude, connected to her work applications via a local MCP server. Status reports draft themselves. Leadership updates are one prompt away. As new models emerge, she can swap them in without losing any of her integrations. When she writes code on the side, she uses Cursor with an OpenAI model and the same MCP server she uses in Claude. Her IDE already understands the product she is building. MCP made it easy.
The power and implications of a standard
Lily's story shows a simple truth: nobody likes using fragmented tools. No user likes being locked into a vendor. And no company wants to rewrite integrations every time it changes models. You want the freedom to use the best tools. MCP delivers that.
Now, with standards come implications.
First, SaaS providers without strong public APIs are vulnerable to obsolescence. MCP tools depend on those APIs, and customers will demand support for their AI applications. With a de facto standard emerging, there are no excuses.
Second, AI application development cycles are about to accelerate dramatically. Developers no longer have to write custom code to test simple AI applications. Instead, they can integrate MCP servers with readily available MCP clients, such as Claude Desktop, Cursor and Windsurf.
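In practice, wiring a server into one of these clients is a matter of configuration rather than code. As a hedged sketch: Claude Desktop reads an `mcpServers` map from its `claude_desktop_config.json`, where each entry names a command that launches a local server; the server package below is hypothetical, for illustration only.

```json
{
  "mcpServers": {
    "jira": {
      "command": "npx",
      "args": ["-y", "@example/jira-mcp-server"]
    }
  }
}
```

Once such an entry exists, the client starts the server itself and the model can discover and call its tools without any application code.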
Third, switching costs collapse. Because integrations are decoupled from specific models, organizations can migrate from Claude to OpenAI to Gemini, or blend models, without rebuilding infrastructure. Future LLM providers will benefit from an existing ecosystem around MCP, allowing them to focus on better price-performance.
Navigating challenges with MCP
Every standard introduces new friction points or leaves existing ones unresolved. MCP is no exception.
Trust is critical: Dozens of MCP registries have appeared, offering thousands of community-maintained servers. But if you do not control the server, or trust the party that does, you risk leaking secrets to an unknown third party. If you are a SaaS company, provide official servers. If you are a developer, seek out official servers.
Quality is variable: APIs evolve, and poorly maintained MCP servers can easily fall out of sync. LLMs rely on high-quality metadata to decide which tools to use. No authoritative MCP registry exists yet, reinforcing the need for official servers from trusted parties. If you are a SaaS company, maintain your servers as your APIs evolve. If you are a developer, seek out official servers.
Big MCP servers increase costs and lower utility: Bundling too many tools into a single server increases costs through token consumption and overwhelms models with too many choices. LLMs are easily confused when they have access to too many tools. It is the worst of both worlds. Smaller, task-focused servers will be important. Keep this in mind as you build and distribute servers.
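The token-cost point can be made concrete. Every tool a server exposes ships a name, description and input schema to the model, so the metadata the model must read grows linearly with tool count. The sketch below uses made-up tool definitions and measures serialized size in characters as a rough proxy for tokens; the numbers are illustrative, not a benchmark.

```python
import json

def tool_metadata(i):
    # A hypothetical tool definition in the shape MCP servers advertise:
    # name, description and a JSON schema for the inputs.
    return {
        "name": f"tool_{i}",
        "description": "Does one narrowly scoped thing for this demo.",
        "inputSchema": {
            "type": "object",
            "properties": {"arg": {"type": "string"}},
        },
    }

def metadata_chars(n_tools):
    # Size of the tool listing a client would hand to the model.
    return len(json.dumps([tool_metadata(i) for i in range(n_tools)]))

small = metadata_chars(5)    # a focused, task-scoped server
large = metadata_chars(50)   # a kitchen-sink server
print(small, large)
```

The kitchen-sink server pays roughly ten times the metadata cost on every request, before the model has done any useful work, which is why splitting tools across small, purpose-built servers tends to be both cheaper and more reliable.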
Authorization and identity challenges persist: These problems existed before MCP, and they still exist with MCP. Imagine Lily gave Claude the ability to send emails, then gave a well-intentioned instruction like: "Quickly send Chris a status update." Instead of emailing her boss, Chris, the LLM emails everyone named Chris in her contact list to make sure Chris gets the message. Humans will have to stay in the loop for high-judgment actions.
Looking ahead
MCP is not hype. It is a fundamental shift in the infrastructure for AI applications.
And, like every well-adopted standard before it, MCP creates a self-reinforcing flywheel: every new server, every new integration, every new application compounds the momentum.
New tools, platforms and registries are already emerging to simplify building, testing, deploying and discovering MCP servers. As the ecosystem evolves, AI applications will offer simple interfaces to plug into new capabilities. Teams that embrace the protocol will ship products faster with better integration stories. Companies offering official public APIs and official MCP servers can be part of that integration story. Late adopters will have to fight for relevance.
Noah Schwartz is head of product at Postman.