Understanding MCPs: Bridging the Gap Between AI Models and External Data
The Model Context Protocol (MCP) is gaining traction in the AI world as a standardized interface between models and the outside world. Servers built on the protocol, commonly just called MCPs, act as crucial links connecting external data sources and applications to large language models (LLMs) such as ChatGPT or Claude. By allowing AI models to access real-time data, manage calendars, and even manipulate files on a computer, MCPs significantly extend the capabilities of LLMs.
While existing AI tools like Claude, Cursor, and OpenAI's ChatGPT each offer their own custom integration features, MCP introduces a universal, standardized format for all of these interactions, broadening how AI models can interact with the external world.
An MCP setup primarily consists of two components: a client and a server. The client is the chat interface or editor the user interacts with, such as Claude Desktop or Cursor, while the server exposes a data source or service, such as a flight scheduling website. Together, they let AI models access real-time information, execute online actions, and operate as proactive agents rather than static chatbots.
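To make the client/server split concrete, here is a minimal sketch of the server side in Python, using the MCP Python SDK's FastMCP helper (module paths and decorator names follow the current SDK and may differ between versions). The flight lookup is a hypothetical stand-in for a real data source.

```python
# Minimal MCP server sketch: exposes one tool that a client (Claude Desktop,
# Cursor, etc.) can discover and call. The flight data is hard-coded purely
# for illustration; a real server would query an airline or aggregator API.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("flight-schedule")  # server name shown to the client


@mcp.tool()
def search_flights(origin: str, destination: str, date: str) -> list[dict]:
    """Return available flights between two airports on a given date."""
    return [
        {"flight": "XY123", "origin": origin, "destination": destination,
         "date": date, "departs": "09:40", "price_usd": 129.0},
    ]


if __name__ == "__main__":
    mcp.run()  # serve over stdio so a local client can connect
```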
Currently, two main types of MCPs are emerging. The first caters to developers, running locally on a laptop to manage files and execute scripts; tools like Cursor or Claude Code exemplify this category. The second focuses on real-world tasks, such as searching for products, registering domains, booking events, or sending emails, and integrates directly with various online services.
To explore the practical side, two distinct MCPs were developed. The first, named GPT Learner, is a developer server designed to help Cursor remember errors and avoid repeating them. If Cursor or Claude incorrectly overwrites code, GPT Learner lets the user record the mistake and the correct approach, storing it for future reference. It is a tool to enhance developer productivity when working with AI models.
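A rough sketch of how such a server could be structured is shown below. The tool names and the JSON-file storage are illustrative assumptions, not the actual GPT Learner implementation.

```python
# Sketch of a "lessons learned" MCP server: records coding mistakes and the
# correct approach, then lets the model recall them before it edits again.
import json
from pathlib import Path

from mcp.server.fastmcp import FastMCP

STORE = Path("lessons.json")  # simple local storage; a real server might use a database
mcp = FastMCP("gpt-learner-sketch")


def _load() -> list[dict]:
    return json.loads(STORE.read_text()) if STORE.exists() else []


@mcp.tool()
def record_mistake(mistake: str, correct_approach: str, context: str = "") -> str:
    """Store a mistake the assistant made and the approach it should use instead."""
    lessons = _load()
    lessons.append({"mistake": mistake, "correct_approach": correct_approach,
                    "context": context})
    STORE.write_text(json.dumps(lessons, indent=2))
    return f"Recorded lesson #{len(lessons)}"


@mcp.tool()
def recall_lessons(query: str) -> list[dict]:
    """Return previously recorded lessons whose text mentions the query."""
    q = query.lower()
    return [l for l in _load()
            if q in (l["mistake"] + l["correct_approach"] + l["context"]).lower()]


if __name__ == "__main__":
    mcp.run()
```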
The second project is a prediction market MCP that connects large language models to a website, betsee.xyz, which aggregates real-time prediction markets. When a user asks Claude a question like, ‘What are the secondary effects of Trump pausing tariffs, and what are people betting on?’ the MCP returns relevant markets and real-time odds from Polymarket or Kalshi. This connection allows AI models to incorporate real-time financial predictions into their responses, enhancing their analysis.
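A simplified version of the server side might look like the sketch below. The betsee.xyz endpoint path and response fields are hypothetical; they only illustrate the shape of such an integration.

```python
# Sketch of a prediction-market MCP tool that queries an aggregator and returns
# markets with current odds and links. The endpoint and field names are assumptions.
import requests

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("prediction-markets-sketch")


@mcp.tool()
def search_markets(query: str, limit: int = 5) -> list[dict]:
    """Return prediction markets matching a query, with odds, source, and link."""
    resp = requests.get(
        "https://betsee.xyz/api/markets",  # hypothetical aggregator endpoint
        params={"q": query, "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    # Keep only the fields the model needs so the client shows clean output.
    return [
        {"title": m["title"], "yes_odds": m["yes_odds"],
         "source": m["source"], "url": m["url"]}
        for m in resp.json().get("markets", [])
    ]


if __name__ == "__main__":
    mcp.run()
```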
Why MCPs Aren’t Quite Ready for Primetime
The development of these two MCPs revealed several key insights. Primarily, MCPs are not yet ready for widespread adoption due to user experience limitations and security concerns.
The current user experience with MCPs is far from ideal. Many chatbots, including ChatGPT, do not yet support MCP servers. Among those that do, installation often requires manually editing a JSON configuration file, a process that is far from user-friendly. Clients like Cursor and Claude also tend to prompt the user to approve every request and frequently return incomplete information or raw JSON output, making the experience clunky and unsatisfying.
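For a sense of what that manual step involves, the sketch below adds a server entry in the general shape used by Claude Desktop's mcpServers configuration. The file location (macOS shown) and exact schema vary by client and version, so treat the details as assumptions.

```python
# Sketch: registering a local MCP server with a desktop client by editing its
# JSON configuration. Path and schema are assumptions; each client documents its own.
import json
from pathlib import Path

# macOS location for Claude Desktop's config; other platforms and clients differ.
config_path = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"

config = json.loads(config_path.read_text()) if config_path.exists() else {}
config.setdefault("mcpServers", {})["prediction-markets"] = {
    "command": "python",                        # how the client should launch the server
    "args": ["/path/to/prediction_server.py"],  # hypothetical path to the server script
}
config_path.write_text(json.dumps(config, indent=2))
```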
Using Claude’s desktop version to query the prediction market MCP often resulted in failures to provide links or prices unless explicitly requested. Sometimes, the server was not called at all. The constant pop-up prompts from Claude when using MCPs further diminished user interest. While seamless processing and meaningful responses from MCPs are expected in the future, the technology is not yet mature.
Security is another significant concern. MCPs can perform external operations and touch live systems, which exposes them to numerous security risks. Prompt injection, malicious tool installation, unauthorized access, and Trojan horse attacks are real threats. Currently, there is a lack of sandboxing, verification layers, and a mature ecosystem to handle these edge cases.
These issues confirm that MCP is still in the experimental stage. Addressing these limitations is crucial for enabling wider adoption.
The Client’s Decisive Role
An important lesson learned while building these servers is that the client, not the server, ultimately decides the future of MCPs. Clients are the interfaces users interact with, and they control how MCPs are accessed and used.
The entities that control the interaction with large models also control which tools users see, which are triggered, and which responses are displayed. It’s possible to create the most useful MCP server in the world, but the client may not call it, may only show part of its output, or may not even allow its installation. This highlights the importance of client-side development and integration.
MCPs and the Emergence of Gatekeepers
The critical power of the client means that MCPs will be governed like search engines and app stores. Leading large model application providers, such as OpenAI and Anthropic, will become the new ‘gatekeepers,’ deciding which MCPs can be listed and curating their discoverability through recommendation algorithms.
Since its inception in the late 1990s, Google has controlled what content is presented to users, which has helped it build an extremely profitable business. Chatbots are now gaining this ability, replacing the traditional search engine’s ‘10 blue links’ with direct answers. They can decide which content to show, which to exclude, and how to format it. This level of control is significant.
The MCP installation process will likely resemble the app store model. Just as Apple and Google have shaped the mobile ecosystem by deciding which apps are recommended, pre-installed, or approved, large model clients will determine which MCP servers are showcased, promoted, and even allowed on the platform. This dynamic is likely to lead to competition among companies, potentially involving payments to model providers for recommendations and exposure in the new ecosystem, fostering the creation of high-profit MCP distribution platforms.
Users will install MCPs or ‘AI chat applications’ from carefully curated ‘MCP stores.’ Tools like Gmail, HubSpot, Uber, and Kayak will add MCP endpoints, integrating directly into chat-based workflows. While users could theoretically choose to install any MCP they want, most will likely rely on client-provided recommendations, such as those from ChatGPT. These recommendations won’t be arbitrary but will stem from lucrative partnerships, with large companies paying to become the default option in shopping, travel, domain searching, or service searching categories. This level of visibility would translate to millions of users, offering immense exposure, data, and commercial value.
Some client-side MCP app stores (MAS) will offer a more lenient and open selection of MCPs, allowing for a broader range of experimentation and community-developed MCPs. Others will have strict approval processes, prioritizing quality, security, and monetization. In either case, the client sets the conditions for participation—and the rules for success. This centralized control will shape the MCP landscape.
MCP clients such as ChatGPT and Claude will become the new iOS and Android platforms, with MCP servers playing the role of apps. Instead of tapping icons, users will invoke these applications through natural-language commands and receive rich, structured, interactive responses. The user experience will be driven by conversational interfaces.
In time, we might see specialized clients emerge, tailored to specific industries or domains. Imagine an AI chat assistant focused on travel planning, seamlessly integrating services from airlines, hotel chains, and travel agencies to offer users a comprehensive travel planning experience. Or an MCP client focused on human resources, providing unified access to legal data, employee records, and organizational tools, transforming how businesses are managed. Specialization allows for more tailored experiences.
While most users will stick with mainstream clients, some open-source AI chatbots will emerge. These chatbots will appeal to professionals who want complete control over the MCPs they install, free from the limitations imposed by gatekeepers. However, like Linux desktop systems, these open-source products will likely remain niche markets. Customization will be a key differentiator for open-source solutions.
New Opportunities in the Emerging Ecosystem
Several types of businesses and tools are expected to emerge to serve the evolving MCP landscape. This presents numerous opportunities for innovation and growth.
MCP Wrappers and Server Packs: These will bundle multiple related MCPs into a single installation package, streamlining setup. Imagine a single package providing calendar, email, customer relationship management, and file storage MCPs, ready to use without any configuration. Such packages will simplify onboarding and be particularly useful in vertical markets. They may also bundle preset workflows (‘set up a calendar event and send the email’). Bundling simplifies the user experience and caters to specific needs.
MCP Shopping Engines: Some MCP servers will act as AI-powered comparison engines, offering real-time prices and product listings from various vendors. They will monetize through affiliate links, earning referral fees. This approach mirrors early search engine optimization and affiliate marketing. Shopping engines provide value to consumers by aggregating options.
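A toy sketch of such a comparison tool is shown below. The vendor endpoints, response fields, and affiliate parameter are all hypothetical.

```python
# Sketch of an MCP shopping tool: fans out to vendor APIs, then returns normalized
# listings with affiliate-tagged links. Endpoints and field names are placeholders.
import requests

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("shopping-engine-sketch")

AFFILIATE_ID = "mcp-demo-123"  # placeholder referral identifier
VENDORS = {
    "vendor_a": "https://api.vendor-a.example/search",
    "vendor_b": "https://api.vendor-b.example/search",
}


@mcp.tool()
def compare_prices(product: str, limit: int = 3) -> list[dict]:
    """Return price listings for a product across vendors, cheapest first."""
    listings = []
    for vendor, url in VENDORS.items():
        resp = requests.get(url, params={"q": product, "limit": limit}, timeout=10)
        resp.raise_for_status()
        for item in resp.json().get("results", []):
            listings.append({
                "vendor": vendor,
                "title": item["title"],
                "price_usd": item["price_usd"],
                # The referral fee is earned by decorating the outbound link.
                "url": f'{item["url"]}?ref={AFFILIATE_ID}',
            })
    return sorted(listings, key=lambda x: x["price_usd"])


if __name__ == "__main__":
    mcp.run()
```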
MCP-First Content Apps: These services will optimize content delivery for large language models via MCP servers, rather than designing websites for human viewers. Imagine rich, structured data and semantic tags returned through MCP calls. Revenue will come from subscriptions or embedded sponsorships and product placements, rather than page views. Content optimized for AI consumption will be a critical component.
API-to-MCP Providers: Many existing API providers wish to participate in this new ecosystem but lack the resources to do so. This will drive the emergence of middleware tools that automatically convert traditional REST APIs into compliant and discoverable MCP servers, making it easy for SaaS platforms to join. API conversion bridges the gap between existing systems and the MCP ecosystem.
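A minimal sketch of such a shim is shown below. The endpoint table is hand-written and hypothetical; a production converter would more likely generate tools from an OpenAPI specification.

```python
# Sketch of an API-to-MCP shim: each entry in ENDPOINTS becomes an MCP tool that
# forwards its arguments to an existing REST API and returns the raw JSON.
import requests

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("rest-to-mcp-sketch")

ENDPOINTS = {  # hypothetical SaaS endpoints
    "list_invoices": "https://api.example-saas.com/v1/invoices",
    "list_customers": "https://api.example-saas.com/v1/customers",
}


def _make_tool(name: str, url: str):
    def tool(query: str = "") -> dict:
        resp = requests.get(url, params={"q": query}, timeout=10)
        resp.raise_for_status()
        return resp.json()

    tool.__name__ = name
    tool.__doc__ = f"Proxy for GET {url}; returns the raw JSON response."
    return tool


# Register one MCP tool per REST endpoint.
for name, url in ENDPOINTS.items():
    mcp.tool()(_make_tool(name, url))


if __name__ == "__main__":
    mcp.run()
```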
Cloudflare for MCPs: Security is a major concern. These tools will sit between the client and the server, sanitizing inputs, logging requests, blocking attacks, and monitoring anomalies. Just as Cloudflare has made the modern web safer, this type of service will play a similar role in the MCP ecosystem. Security solutions are essential for building trust and enabling adoption.
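As a naive sketch of the idea, the guard below logs each tool call, rejects oversized or injection-like arguments, and only then forwards the call. The patterns and limits are purely illustrative, not a real product’s rules.

```python
# Sketch of a guard layer in front of MCP tool calls: log the request, validate
# the arguments, and only then invoke the underlying tool function.
import logging
import re
from typing import Any, Callable

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("mcp-guard")

# Crude examples of patterns a real gateway would match far more carefully.
SUSPICIOUS = re.compile(
    r"ignore (all|previous) instructions|rm -rf|BEGIN PRIVATE KEY", re.IGNORECASE
)
MAX_ARG_LENGTH = 10_000  # arbitrary cap on argument size


def guarded_call(tool: Callable[..., Any], **kwargs: Any) -> Any:
    """Log, validate, and forward a tool call; raise on suspicious input."""
    log.info("tool=%s args=%s", tool.__name__, kwargs)
    for key, value in kwargs.items():
        text = str(value)
        if len(text) > MAX_ARG_LENGTH:
            raise ValueError(f"argument '{key}' exceeds the size limit")
        if SUSPICIOUS.search(text):
            raise ValueError(f"argument '{key}' looks like a prompt-injection attempt")
    return tool(**kwargs)


# Example: guarded_call(search_flights, origin="SFO", destination="JFK", date="2025-06-01")
```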
Enterprise “Private” MCP Solutions: Large companies will start connecting their internal services to private MCP servers and using open-source AI products. These internal setups will become part of AI workflows behind the firewall, giving companies control. Private MCP solutions address the unique needs of enterprise environments.
Vertically Focused MCP Clients: While many chatbots can meet general user needs, certain scenarios, such as industrial procurement and compliance work, require specific user interfaces and business logic. Vertically focused MCP clients will emerge, with customized operations, language, and layouts to meet these unique needs. Vertical specialization allows for deeper integration and value in specific industries.
The future of MCPs hinges on addressing user experience and security concerns. As the ecosystem matures, new opportunities will emerge, shaping how AI models interact with the world and creating a new era of intelligent applications.