MCP Server Registry #159
Replies: 7 comments 6 replies
-
Thanks for writing this up! I'm increasingly feeling like this is the right path forward too. Before, I had been reluctant for us to undertake this because it felt like building npm/pypi from scratch—but actually, if we assume the continued use of package managers' own registries, we can make this strictly a metadata service about servers. That dials the complexity and security risk way down. We could also integrate something like socket.dev to provide some basic level of security assessment about the underlying packages. Does it sound right that we should require packages to be published to some underlying registry first (npm, pypi, Docker), and then they should be explicitly submitted/registered with the metadata registry afterward?
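To illustrate what "strictly a metadata service" could mean in practice, here's a rough sketch of a registry entry that only points at an artifact hosted in an underlying registry. All field names here are illustrative assumptions, not a proposed schema:

```typescript
// Hypothetical shape of a metadata-only registry entry. The registry stores
// pointers to artifacts hosted elsewhere rather than hosting artifacts itself.
interface ServerRegistryEntry {
  name: string;                                // globally unique server name
  description: string;
  version: string;                             // version registered with the metadata service
  packageRegistry: "npm" | "pypi" | "docker";  // where the artifact actually lives
  packageName: string;                         // identifier within that underlying registry
  repositoryUrl?: string;                      // source location, if open source
  categories: string[];                        // used later for curation/segmentation
}

// Example: a server published to npm first, then registered here afterward.
const example: ServerRegistryEntry = {
  name: "example/weather-server",
  description: "MCP server exposing a weather API",
  version: "1.2.0",
  packageRegistry: "npm",
  packageName: "@example/mcp-weather-server",
  repositoryUrl: "https://github.com/example/mcp-weather-server",
  categories: ["weather", "data"],
};
```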
-
I agree with the opportunity here, namely to create a "single source of truth" of all the MCP servers out there and deduplicate all the effort being made to identify and collate what's been built. Definitely agree on having a global, public API for providing access to this bedrock data. Also agree that reimplementing npm/pypi is out of scope; leave the source code hosting to those solutions. I also like the idea of a base "Server Browser" implementation, while allowing the market to potentially improve on the "server discovery UX" by implementing their own take on top of the global, public API.

I'm not as sold on "curation and segmentation", "security", or "unified runtime" as things that should definitely be solved within the registry. I think these could potentially be separated out and tackled by third-party "Server Browsers", of which the native official "Server Browser" is just a simple implementation that does not offer much in the way of opinionated tags or security guarantees. But maybe we could take these on a case-by-case basis after an initial official registry exists.
I think this should be true for open source packages that are meant to run on a local machine. But I think this "centralized registry" solution should also accept closed-source, SSE server submissions.
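For illustration, a closed-source hosted server might be registered with just connection metadata and no package reference at all. The shape below is purely an assumption, sketched for contrast with the package-backed entry above:

```typescript
// Hypothetical registry entry for a hosted (closed-source) SSE server:
// no package reference, only connection metadata.
interface RemoteServerEntry {
  name: string;
  description: string;
  transport: "sse";          // remote transport instead of a locally run package
  endpointUrl: string;       // URL that clients connect to
  authRequired: boolean;     // whether the endpoint expects credentials
  categories: string[];
}

const hostedExample: RemoteServerEntry = {
  name: "acme/crm-server",
  description: "Hosted MCP server for a (hypothetical) Acme CRM",
  transport: "sse",
  endpointUrl: "https://mcp.acme.example/sse",
  authRequired: true,
  categories: ["crm", "sales"],
};
```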
-
My recommendation would be that there be a standard JSON/YAML specification and/or configuration file that is implementation agnostic (think Terraform). That standardized spec/config would be converted by MCP server implementations into a functional service. Industry-originating specs/configs could then be published for deployment on any MCP server service package. It's really inhibitive for each server developer to maintain multiple versions that are not readily extensible, and for developers/service deployers to not have this sort of modular approach. This approach calls for the following:
In addition to the value-added circumstances that everyone mentions above, this approach could streamline adoption and get us out of a proprietary/snowflake solution trajectory (which is where it seems this could go).
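As a rough illustration of the kind of implementation-agnostic definition being described (the actual artifact would be a JSON or YAML file; the TypeScript type is only used here to describe assumed fields, none of which come from an existing spec):

```typescript
// Sketch of an implementation-agnostic server definition that any MCP server
// implementation could consume and turn into a running service, analogous to
// Terraform providers consuming a shared configuration format.
interface ServerDefinition {
  name: string;
  version: string;
  tools: Array<{
    name: string;                          // tool identifier exposed to clients
    description: string;
    inputSchema: Record<string, unknown>;  // JSON Schema for the tool's arguments
  }>;
  runtime: {
    kind: "node" | "python" | "docker";    // target implementation/package manager
    entrypoint: string;                    // command or image used to start the server
  };
}

const definition: ServerDefinition = {
  name: "example/github-server",
  version: "0.1.0",
  tools: [
    {
      name: "search_issues",
      description: "Search issues in a repository",
      inputSchema: { type: "object", properties: { query: { type: "string" } } },
    },
  ],
  runtime: { kind: "node", entrypoint: "npx @example/mcp-github-server" },
};
```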
-
+1 to have an official place where people can submit their servers, clients, etc.
-
All good responses here. Thanks for the thoughtful commentary and ideas @tadasant! I'm pretty aligned and look forward to figuring out a more specific plan soon. I'll be on vacation a few days beginning tomorrow, but back full time from next Wednesday.
-
Maintainer of Smithery AI here! I would definitely want to be kept up to date about MCP's official plans, as Smithery currently offers both hosting and registry of MCPs. Open to discussing any potential integrations/collaborations that might deduplicate work. Our registry API: https://smithery.ai/docs/registry
-
I believe the issue of registry fragmentation, having different registries with varying levels of support and awareness, is real. I'd also suggest that this might be duplicative of the discoverability topic.

Though to offer a differing opinion to consider here... Registries are expensive to build and maintain with very little hope for revenue generation (I know this from experience and from discussions with NPM founders), which is what makes them hard to continue maintaining at scale. Propagation of MCP servers is not remotely to the point of "scaled" where this will be a challenge. So perhaps the "registry fragmentation" problem is one that will naturally go away unless someone can figure out a path to monetize the ability to support these at scale. To put it directly, I fully expect these registries to fall away as expenses build (though I do wish for everyone's efforts and ventures to be successful).

So my suggestion would be not to create a cloud service for this discoverability, but to lean into systems like GitHub, NPM, etc. to provide a way to openly capture, list, and publish updates to this list, and take that as far as possible. Those that need it are technical teams and can easily sync from there. That should provide a long-term viable way to support these systems even as the lists start to hit a level of actual scale.

I also believe there are differing needs for MCP servers, and I do not believe that, architecturally, MCP clients should have the breadth of all MCP servers available to use at the ready on disk. While a registry that can be stable and cost-effective for clients to sync to is a good initial solution, we will need to consider how to determine the context directories that always need to be available and what should be sourced on demand. For example, if I create an MCP server for a local service in town, I don't expect anyone to need constant knowledge of its existence; I expect that leaning into the web as the on-demand system for that would be ideal. What happens when there are 100k MCP servers, 10 million, 100 million, more? Then the multiple of times that content needs to sync? Not suggesting we build for that scale now, but preparing for the ideal outcome that this pattern takes hold means expecting these numbers.
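To make the "lean into GitHub/NPM and let technical teams sync" suggestion concrete, here's a minimal sketch, assuming the list is simply a JSON file published at a stable URL. The URL, file shape, and field names below are hypothetical:

```typescript
// Hypothetical: the canonical server list is a JSON file kept in a GitHub
// repository, and clients sync it on their own schedule instead of calling a
// hosted registry service.
const SERVER_LIST_URL =
  "https://raw.githubusercontent.com/example-org/mcp-servers/main/servers.json";

interface ServerListEntry {
  name: string;
  packageName: string;
  packageRegistry: "npm" | "pypi" | "docker";
}

async function syncServerList(): Promise<ServerListEntry[]> {
  // A real client would cache this locally (e.g. keyed by ETag or commit hash)
  // and only refresh periodically, keeping hosting costs near zero.
  const response = await fetch(SERVER_LIST_URL);
  if (!response.ok) {
    throw new Error(`Failed to sync server list: ${response.status}`);
  }
  return (await response.json()) as ServerListEntry[];
}
```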
-
Discussion Topic
The purpose of this is to sketch a baseline set of required functionality for a public industry standard registry of MCP Servers. A number of sites have popped up in recent months. For example:
https://mcpserver.cloud/
https://mcp.run/
https://smithery.ai/
https://block.github.io/goose/v1/extensions/
While there is value in different server browsers and client integrations for MCP, there will be additional value in a “single-source-of-truth” registry containing the metadata about MCP servers themselves. Right now each of these sites has its own copy of data, relying on additions by maintainers or contributors. They each present a subset of all available MCP servers globally, and duplicate much of the storage and search logic. Ultimately this presents a fragmented view of what is available to end-users.
In contrast, a single widely adopted registry will be a bedrock resource that higher level tools interfacing with MCP servers can leverage.
Feature Requirements
Global Public API
We need a robust API serving metadata about every server, as well as artifact download URIs, search functionality (via utility and categories), new server publishing, storage, tagging, versioning, etc.
This will allow multiple server browsers / client install flows to emerge, while maintaining and deriving the benefits of a single source of all metadata.
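To make the shape of such an API concrete, here is a rough sketch of the kinds of read and publish operations implied above, written against a hypothetical base URL and endpoint layout. None of this is a committed design:

```typescript
// Hypothetical base URL and endpoints; purely illustrative.
const REGISTRY_API = "https://registry.example.org/v1";

// Fetch metadata (including artifact download URIs and versions) for one server.
async function getServer(name: string) {
  const res = await fetch(`${REGISTRY_API}/servers/${encodeURIComponent(name)}`);
  if (!res.ok) throw new Error(`Lookup failed: ${res.status}`);
  return res.json();
}

// Full-text search across server metadata.
async function searchServers(query: string) {
  const res = await fetch(`${REGISTRY_API}/servers?q=${encodeURIComponent(query)}`);
  if (!res.ok) throw new Error(`Search failed: ${res.status}`);
  return res.json();
}

// Publish (register) a new server version; the artifact itself stays in
// npm/PyPI/Docker Hub, and only metadata is submitted here.
async function publishServer(metadata: unknown, token: string) {
  const res = await fetch(`${REGISTRY_API}/servers`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${token}` },
    body: JSON.stringify(metadata),
  });
  if (!res.ok) throw new Error(`Publish failed: ${res.status}`);
  return res.json();
}
```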
Server Browser
Similar to https://www.npmjs.com/ we should have a standard server browser that implements and exposes a UX for these feature requirements. This is not to say we will discourage other browsers, but that to pair with the global public API there should be at least one officially maintained server browser.
Curation and Segmentation
There should be support in the API and UX for browsing MCP servers of notable utility (popular, most installed, new this week) as well as specific categories for the services they connect to (finance tools, fitness, wellness, etc.).
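For example, these views could surface as simple query parameters on the search endpoint of the hypothetical API sketched above; the parameter names are assumptions, not a design:

```typescript
// Hypothetical curation/segmentation queries against the illustrative API above.
const popularThisWeek =
  "https://registry.example.org/v1/servers?sort=installs&period=7d";
const financeTools =
  "https://registry.example.org/v1/servers?category=finance&sort=popular";
```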
Security
Security should be taken as a first-class consideration in the registry. We should implement automated code scanning looking for traditional CVEs (common vulnerabilities and exposures), as well as analysis specific to MCP servers (adherence to the authorization spec, scanning for prompt injection, etc.) that will become clearer over time. The global public API should also be protected against publishing abuse and DDoS attacks.
Further Exploration
Unified Runtime
We could explore a unified runtime for MCP servers (a la npx) that would work for MCP servers written in any language. This would simplify the installation and usage flow for client integrators.
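One way this could work (purely a sketch, reusing the hypothetical metadata fields from the examples above) is a small launcher that reads a server's registry entry and delegates to an existing language-specific runner:

```typescript
import { spawn } from "node:child_process";

// Hypothetical minimal entry: which underlying registry the package lives in
// and its identifier there.
interface RuntimeEntry {
  packageRegistry: "npm" | "pypi" | "docker";
  packageName: string;
}

// A unified "run" command would map the registry type to an existing runner,
// so client integrators never need to know which language a server is written in.
function runServer(entry: RuntimeEntry) {
  const commands: Record<RuntimeEntry["packageRegistry"], [string, string[]]> = {
    npm: ["npx", ["-y", entry.packageName]],
    pypi: ["uvx", [entry.packageName]],           // assumes a uv-style Python runner is installed
    docker: ["docker", ["run", "-i", "--rm", entry.packageName]],
  };
  const [cmd, args] = commands[entry.packageRegistry];
  return spawn(cmd, args, { stdio: "inherit" });
}

// Example: launch an npm-hosted server through the same interface.
runServer({ packageRegistry: "npm", packageName: "@example/mcp-weather-server" });
```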