
MCP Passed 97 Million Installs. That's a Free Lunch for NanoClaw.

NanoClaws.io


@nanoclaws

March 10, 2026

7 min read


On March 10, 2026, Model Context Protocol (MCP) passed 97 million npm installs. The number itself isn't exceptional — plenty of popular npm packages hit hundreds of millions. What's interesting is the growth curve: MCP had just a few hundred thousand installs at its November 2025 launch, and four months later it's approaching a hundred million. That's not linear growth. It's the early stage of an exponential curve.

MCP is an open protocol Anthropic introduced that defines a standardized communication interface between AI models and external tools. Simply put, MCP lets AI agents call various tools — database queries, file operations, API calls, web browsing — in a uniform way, without writing custom integration code for each tool.

For NanoClaw, MCP's explosive growth means something very specific: the tools available to users are growing faster and faster, and NanoClaw doesn't have to write a single line of code to keep up.

What MCP Actually Solves

Before MCP, having AI agents call external tools required custom integration. Want the agent to query a database? Write a tool function for that specific database. Want it to send emails? Write another. Want it to drive your project management tool? Another one. Each integration was independently built, maintained, and tested.

The problem with this approach isn't technical feasibility — it's that it doesn't scale. There are tens of thousands of tools and services in the world, and no single team can write integrations for all of them.

MCP defines a standard interface: the tool side describes what it can do (schema), and the agent side sends requests and receives responses through a standard protocol. Once a tool implements an MCP server, any MCP-capable agent can use it. Implement once, available everywhere.
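Concretely, MCP messages are JSON-RPC 2.0. As an illustrative sketch (the field names follow the public MCP spec; the `query_db` tool itself is made up), a server advertises a tool's schema and an agent invokes it like this:

```python
# What an MCP server returns for a "tools/list" request: each tool
# declares a name, a description, and a JSON Schema for its inputs.
tool_listing = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [{
            "name": "query_db",  # hypothetical tool
            "description": "Run a read-only SQL query",
            "inputSchema": {
                "type": "object",
                "properties": {"sql": {"type": "string"}},
                "required": ["sql"],
            },
        }]
    },
}

# What an agent sends to invoke that tool: a "tools/call" request
# whose arguments must satisfy the advertised schema.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "query_db",
        "arguments": {"sql": "SELECT count(*) FROM users"},
    },
}
```

The agent never needs database-specific glue; it only needs to read `inputSchema` and construct a matching `tools/call` request.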

This is the same logic as USB. Before USB, every peripheral needed its own interface and driver. Once USB standardized the physical interface and communication protocol, new peripherals didn't need new interface designs — plug in, it works. MCP is doing the same thing for the AI tooling ecosystem.

What's Behind 97 Million

97 million installs doesn't mean 97 million users — npm install counts include CI/CD pipelines, nested dependencies, and automated downloads. But it reflects the real depth and breadth of the MCP ecosystem.

By March 2026, the MCP ecosystem has more than 2,000 public tool servers covering major databases, cloud services, dev tools, communication platforms, project management tools, and enterprise apps. That number grows every week, because implementing an MCP server is cheap — the standard is already defined, you just wrap your tool in an MCP interface.
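To see why wrapping a tool is cheap, here's a stripped-down, stdlib-only sketch of the pattern: a registry of local functions plus one JSON-RPC dispatcher. (Real servers use an official MCP SDK and speak over stdio or HTTP; the `word_count` tool and this minimal dispatcher are illustrative, not any particular server's code.)

```python
# A trivial local "tool" we want to expose (hypothetical example).
def word_count(args: dict) -> int:
    return len(args["text"].split())

# The MCP-style wrapper: a registry mapping tool names to handlers
# and their input schemas.
TOOLS = {
    "word_count": {
        "handler": word_count,
        "schema": {
            "type": "object",
            "properties": {"text": {"type": "string"}},
            "required": ["text"],
        },
    }
}

def handle(request: dict) -> dict:
    """Dispatch one JSON-RPC request to the registered tools."""
    if request["method"] == "tools/list":
        result = {"tools": [
            {"name": name, "inputSchema": spec["schema"]}
            for name, spec in TOOLS.items()
        ]}
    elif request["method"] == "tools/call":
        params = request["params"]
        tool = TOOLS[params["name"]]["handler"]
        result = {"content": [{"type": "text",
                               "text": str(tool(params["arguments"]))}]}
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": "method not found"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

response = handle({
    "jsonrpc": "2.0", "id": 1, "method": "tools/call",
    "params": {"name": "word_count",
               "arguments": {"text": "plug in and it works"}},
})
print(response["result"]["content"][0]["text"])  # → 5
```

Everything tool-specific lives in the registry entry; the dispatcher is the same for every tool, which is exactly the part the SDKs give you for free.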

For NanoClaw, this means users can tap into a rapidly growing tool ecosystem through MCP without NanoClaw building any integrations itself. User wants the agent to query PostgreSQL? There's an MCP server. Drive Notion? There's an MCP server. Handle Slack messages? There's an MCP server. NanoClaw doesn't need to know these tools exist — it just needs to support the MCP protocol, and tool-specific details are handled by the MCP ecosystem.

How NanoClaw Uses MCP

NanoClaw supports MCP natively through the Claude Agent SDK. This isn't a custom MCP client — the Claude Agent SDK itself integrates MCP support. NanoClaw just makes sure MCP servers can run properly in the container environment.

In practice, this means users can specify the MCP servers they need in the container configuration, and NanoClaw will launch those servers along with the container at startup. The agent communicates with these servers through the MCP protocol inside the container, transparently to the user.
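The configuration shape might resemble the `mcpServers` convention used by other MCP hosts — the keys and server packages below are illustrative, not NanoClaw's actual schema:

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres",
               "postgresql://localhost/mydb"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"]
    }
  }
}
```

Each entry is just a command to launch inside the container; the agent discovers what the server can do at runtime via `tools/list`.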

The key advantage of this design is security. MCP servers run inside the container, and their access to the host system is bounded by container isolation. An MCP server behaving badly — whether due to bugs or malice — is constrained by the container boundary. That's a qualitative difference from running MCP servers directly on the host.

Ecosystem Dividends vs Platform Lock-In

MCP's success validates an ecosystem strategy: don't build your own tool ecosystem, plug into an existing, fast-growing one.

Many AI agent frameworks chose to build their own plugin or skill marketplaces: OpenClaw has a skill store, LangChain has a tool marketplace, AutoGPT has a plugin system. These proprietary ecosystems require heavy investment to attract developers, review quality, and maintain compatibility. And they're mutually incompatible — a skill written for OpenClaw doesn't work in LangChain.

MCP is an open standard, usable by any agent. NanoClaw didn't build its own tool ecosystem — it just uses MCP. That means NanoClaw's tool library is the entire MCP ecosystem. When a new MCP server is published, NanoClaw users can use it immediately, no waiting for the NanoClaw team to review, adapt, or release an update.

This is another expression of the thin-architecture philosophy: don't reinvent the wheel, plug into existing infrastructure. None of NanoClaw's 500 lines is tool-integration code — because MCP made tool integration a standard agent-runtime capability rather than an application-layer feature that needs custom implementation.

97 million installs is a milestone for the MCP ecosystem. But for NanoClaw users, it just means their available tool library got a little bigger. That's what standing on ecosystem giants' shoulders feels like.
