Baidu Cloud: Pioneering Enterprise MCP Services

The Rise of MCP as an Industry Standard

The Model Context Protocol (MCP), introduced by Anthropic in November 2024, is rapidly becoming a crucial standard for AI model interaction. Its primary aim is to create secure, bidirectional connections between Large Language Models (LLMs) and diverse data sources. This addresses inconsistencies in tool implementation, facilitates cross-model sharing, and paves the way for more unified and efficient AI development.
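
To make the idea concrete, the sketch below shows roughly what exposing a capability over MCP looks like using the open-source MCP Python SDK; the tool name and its logic are hypothetical placeholders, not part of any vendor's offering.

```python
# Minimal MCP server sketch using the open-source MCP Python SDK (pip install mcp).
# The tool below is a hypothetical example for illustration only.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")

@mcp.tool()
def lookup_order(order_id: str) -> str:
    """Return the status of an order (placeholder logic)."""
    # A real server would query an enterprise system here.
    return f"Order {order_id}: shipped"

if __name__ == "__main__":
    # Serves the tool over stdio so any MCP-capable LLM host can discover and call it.
    mcp.run()
```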

In a short time, MCP has garnered significant attention in the AI community. At the Create2025 Baidu AI Developer Conference on April 25th, Baidu founder Robin Li unveiled two new models: Wenxin Large Model 4.5 Turbo and the deep-thinking model X1 Turbo. Alongside these models, various AI applications were presented, showcasing Baidu’s dedication to empowering developers to fully utilize MCP.

The support for MCP extends beyond Baidu to encompass other significant players like OpenAI, Google, Microsoft, Amazon, Anthropic, Alibaba, and Tencent. This widespread adoption indicates that MCP is evolving into the ‘HTTP of the AI world,’ establishing a universal standard for how models and data sources communicate.

During the conference, Baidu Intelligent Cloud officially launched the first enterprise-grade MCP service in China. This service provides enterprises and developers with access to over 1,000 MCP Servers. Furthermore, the platform empowers developers to create their own MCP Servers on Qianfan, Baidu’s AI development platform, and publish them to the MCP Square. Baidu offers free hosting and indexing through Baidu Search, enhancing the accessibility and utility of these servers.

Baidu Cloud’s Enterprise-Focused Strategy

While many vendors are embracing MCP, their approaches differ. Baidu Intelligent Cloud is concentrating on the enterprise market, striving to engage as many developers as possible from the outset. This strategy includes enriching the MCP Square and utilizing Baidu Search to drive traffic, thereby cultivating a thriving MCP ecosystem.

Baidu’s approach to MCP is centered on understanding and addressing the needs of enterprise customers, and the company is well positioned to leverage its existing relationships with those customers to introduce them to MCP.

The Necessity of MCP in the AI Landscape

The emergence of MCP addresses key challenges in deploying LLMs, especially within enterprise environments. Previously, the application of LLMs was largely confined to chatbot-like scenarios. More complex enterprise applications required extensive customization, making the development process both intricate and resource-intensive, even with the toolchains offered by vendors like Baidu Intelligent Cloud.

With 2025 being recognized as the year of the AI Agent, LLMs are anticipated to evolve beyond simple information processing to planning and executing tasks autonomously. In this new paradigm, the LLM acts as the ‘brain,’ requiring ‘limbs’ and ‘senses’ to accomplish specific tasks.

The traditional approach of customizing each AI application requires ‘M×N’ integrations: each of M AI applications must interface separately with each of N tools. MCP standardizes the interaction between LLMs and tools, so each application and each tool only needs to implement the protocol once, reducing the work to ‘M+N.’ This standardization is essential for scaling AI applications across a range of enterprise functions.
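
A back-of-the-envelope comparison makes the scaling argument concrete (the counts below are illustrative, not figures from Baidu):

```python
# Illustrative integration counts for M applications and N tools.
M, N = 5, 20                 # hypothetical example sizes

point_to_point = M * N       # each app integrates each tool directly -> 100
with_mcp = M + N             # each app and each tool implements MCP once -> 25

print(f"Without MCP: {point_to_point} custom integrations")
print(f"With MCP:    {with_mcp} protocol implementations")
```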

Streamlining Enterprise-Level AI Applications

Shen Dou, Baidu Group Executive Vice President and President of the Baidu Intelligent Cloud Business Group, emphasized that applying LLMs involves more than simple model calls. ‘It requires connecting various components and tools and performing intricate orchestration. Often, further refinement and customization of models are needed to enhance performance,’ he noted.

Shen Dou further elaborated that constructing enterprise-grade applications requires careful consideration of computing performance, stability, scalability, and security. He views deploying an application as building a comprehensive ‘system.’

Enterprise applications demand higher standards and lower error tolerance compared to consumer-grade applications. According to an industry expert, application development consumes 90% of project time because while models are standardized, applications are highly variable.

These efforts typically involve four crucial tasks: supplementing professional knowledge, orchestrating business processes, expanding intelligent tools, and integrating enterprise systems. When these tasks are encapsulated into a platform offering out-of-the-box functionality, enterprises can use RAG (Retrieval-Augmented Generation) to incorporate expert knowledge, workflows to orchestrate business processes, and intelligent agents combined with MCP to leverage existing systems and assets.
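
As a rough, purely hypothetical sketch of how these pieces could fit together in an agent, consider the following; none of the function names (retrieve_knowledge, call_mcp_tool, answer_with_agent) correspond to real Qianfan or MCP SDK APIs.

```python
# Hypothetical sketch: combining RAG, workflow orchestration, and MCP tools
# in a single enterprise agent. All names are illustrative placeholders.

def retrieve_knowledge(query: str) -> list[str]:
    """RAG step: pull expert knowledge from an enterprise knowledge base."""
    return ["relevant document snippet 1", "relevant document snippet 2"]

def call_mcp_tool(tool_name: str, **kwargs) -> str:
    """MCP step: invoke an existing enterprise system exposed as an MCP Server."""
    return f"result of {tool_name}({kwargs})"

def answer_with_agent(task: str) -> str:
    # 1. Supplement professional knowledge via RAG.
    context = retrieve_knowledge(task)
    # 2. Orchestrate the business process as a workflow of tool calls.
    crm_data = call_mcp_tool("crm.lookup_customer", name="ACME")
    # 3. The model would draft the final response from context + tool results.
    return f"Task: {task}\nContext: {context}\nCRM: {crm_data}"

print(answer_with_agent("Prepare a renewal offer for ACME"))
```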

MCP is poised to meet the industry’s expectations for simplifying the deployment of LLMs in practical applications.

Bridging the Gap in Enterprise-Level Agents

As Shen Dou pointed out, the deployment of LLMs necessitates full-stack, system-level support, extending from underlying computing power to applications. This includes high-performance hardware and cluster optimization, as well as flexible development toolchains and scenario-based solutions.

Baidu Intelligent Cloud’s system-level capabilities encompass a computing power layer, including the newly announced 30,000-card Kunlunxin cluster and the upgraded Baige GPU computing platform. The model development layer features over 100 models on the Qianfan platform, including Baidu’s Wenxin 4.5 Turbo and Wenxin X1 Turbo, as well as third-party models like DeepSeek, Llama, and Vidu.

In the application development layer, Baidu Intelligent Cloud offers Qianfan Enterprise-Level Agent and MCP services, enhancing the ability of agents to solve complex problems. These services are complemented by a comprehensive model development toolchain that supports the customization and fine-tuning of deep-thinking models and multi-modal models.

Baidu Intelligent Cloud is focusing on the application development layer, with significant updates to the Qianfan platform’s enterprise-level agent development toolchain. The platform introduces a new inference-based intelligent agent, Intelligent Agent Pro, which extends capabilities from quick question answering to deep deliberation and supports customized intelligent agents for each enterprise.

Real-World Applications of Baidu’s MCP Ecosystem

Consider the example of Sewage Treasure, which uses Qianfan’s Agentic RAG capabilities to combine enterprise-specific data with knowledge bases. This allows agents to formulate retrieval strategies based on their understanding of a task, significantly reducing model hallucinations.

Intelligent Agent Pro also supports a Deep Research mode, enabling agents to autonomously plan complex tasks, filter and organize information, and gather exploratory knowledge by browsing web pages. It can also use various tools to create charts and produce structured, informative professional reports.

MCP empowers developers and enterprises to better leverage industry data and tools when developing agents, thereby addressing critical gaps in enterprise-level agent capabilities.

Developers can embrace MCP in two ways: by exposing their own resources, data, and capabilities as MCP Servers for AI applications to use, or by consuming existing MCP Servers when building AI applications. Both approaches reduce development effort and significantly enhance capabilities.
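
For the second path, consuming an existing MCP Server, a minimal client might look like the sketch below, built on the open-source MCP Python SDK; the server command and tool name are assumptions for illustration only.

```python
# Sketch of an MCP client consuming an existing MCP Server over stdio,
# using the open-source MCP Python SDK. The server command ("python server.py")
# and tool name ("lookup_order") are illustrative assumptions.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(command="python", args=["server.py"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()   # discover what the server offers
            print([tool.name for tool in tools.tools])
            result = await session.call_tool("lookup_order", {"order_id": "42"})
            print(result.content)

if __name__ == "__main__":
    asyncio.run(main())
```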

Baidu Intelligent Cloud’s Qianfan platform is the first large model platform to support MCP. Before MCP, large models and tools were scattered and lacked standardization. MCP fosters interconnection and facilitates ecosystem prosperity.

The Competitive Landscape of MCP

MCP, and large models in general, represent a competition between platforms and ecosystems. In the early stages of new technologies, various paradigms are immature, requiring end-to-end optimization to achieve optimal performance. This explains why the deployment of large model applications relies heavily on leading vendors.

For these vendors, the challenge lies not in excelling in one area but in having no significant weaknesses. They must build robust platform capabilities and foster thriving ecosystems to attract more participants, pitting one large model ecosystem against another.

Baidu’s strategy in the MCP domain involves three steps.

  1. Launching MCP Servers: Baidu was among the first to launch MCP Servers, including the world’s first e-commerce transaction MCP and search MCP. Developers can add Baidu AI Search and Baidu Youxuan’s MCP Servers to the ‘Universal Intelligent Agent Assistant’ on the Baidu Intelligent Cloud Qianfan platform, enabling intelligent agents to complete the entire process from information queries and product recommendations to direct order placement. This combines e-commerce transaction support with top-tier search capabilities.
  2. Supporting MCP Service Development: The Baidu Intelligent Cloud Qianfan platform officially launched China’s first enterprise-grade MCP service, with over 1,000 MCP Servers available for enterprises and developers. Developers can create their own MCP Servers on Qianfan, publish them to the MCP Square, enjoy free hosting, and gain exposure and usage opportunities through Baidu Search.
  3. AI Open Plan: The Baidu Search Open Platform launched the ‘AI Open Plan’ (sai.baidu.com) to provide traffic and monetization opportunities for developers of intelligent agents, H5 applications, mini-programs, and independent apps through various content and service distribution mechanisms. This plan also allows users to easily discover and use the latest AI services.

By enabling more enterprises and developers to open up their capabilities through MCP, Baidu is fostering its ecosystem while enabling its partners to realize commercial value. The ultimate winner in the large model competition may not necessarily be the most technologically advanced vendor, but the one with the most thriving ecosystem.