YouTube Deep Summary



Connect any LLM to any MCP server without any MCP client.

AI LABS • 2025-04-17 • 6:12 minutes • YouTube

🤖 AI-Generated Summary:

Unlocking Direct MCP Server Communication with the New Python Client Library

Communicating with Model Context Protocol (MCP) servers has traditionally required a dedicated MCP client such as Windsurf, Cursor, or Claude Desktop. But what if you could streamline this process directly within your code, using any large language model (LLM) you prefer? This is now possible thanks to a new MCP client library that lets developers integrate MCP servers seamlessly with LLMs, all within a flexible, modular Python framework.

In this blog post, we'll walk you through what this library offers, how to set it up, and some creative ways to use it to build powerful autonomous agents.


What Is the New MCP Client Library?

This new MCP client library is a Python-based tool designed to simplify communication between your code and MCP servers. Unlike previous solutions that required specific clients, this library lets you bind any LLM (OpenAI, Anthropic, Groq, Llama, and more) directly to MCP servers using an agent-based architecture.

Key Features:

  • LLM Agnostic: Use any preferred LLM provider with simple configuration.
  • Agent-Based: The agent handles communication logic, including prompt management and step limits.
  • Multi-Server Support: Manage multiple MCP servers in a single file and intelligently route requests.
  • Modular & Extensible: Easily modify code to build unique, autonomous applications.
  • Localhost HTTP Support: Connect to MCP servers running locally or remotely.
  • Easy Installation: Python package with straightforward setup steps.

How to Get Started: Installation and Setup

Step 1: Verify Python Installation

Make sure you have Python installed on your system. You can check this by running:

```bash
python --version
```

or for Python 3:

```bash
python3 --version
```

Step 2: Create and Activate a Virtual Environment

Creating a virtual environment isolates your project dependencies. Use the following commands:

  • Windows:

```bash
python -m venv env
.\env\Scripts\activate
```

  • macOS/Linux:

```bash
python3 -m venv env
source env/bin/activate
```

If you're unfamiliar with these commands, you can easily get them from ChatGPT or other online resources.

Step 3: Install the MCP Client Library and LLM Dependencies

Use the package manager pip3 to install the MCP client library and any LLM-specific packages you need:

```bash
pip3 install mcp-use
pip3 install langchain-openai     # For OpenAI
pip3 install langchain-anthropic  # For Anthropic
```

Check the GitHub repository's documentation for other providers like Groq or Llama.


Configuring Your Project with API Keys

After setup, open your project folder in an IDE or editor like Cursor:

```bash
cursor .
```

Create a new .env file and add the API key for the LLM provider you intend to use. For example, if you're using OpenAI:

```
OPENAI_API_KEY=your_openai_api_key_here
```

Only add the key for the provider you plan to use.
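Libraries such as python-dotenv handle this loading step for you. As a stdlib-only illustration of what that loading conceptually does (the `load_env` helper and `DEMO_API_KEY` name below are mine, not part of any library's API):

```python
import os
from pathlib import Path

def load_env(path=".env"):
    """Minimal sketch of .env loading: parse KEY=value lines
    and place them into the process environment."""
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        # Skip blank lines, comments, and lines without an '=' sign.
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # setdefault avoids clobbering variables already set in the shell.
        os.environ.setdefault(key.strip(), value.strip())
```

In a real project you would simply call `load_dotenv()` from python-dotenv instead.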


How the Code Works: A Quick Overview

Hereโ€™s a simplified explanation of the setup:

  1. Load Environment Variables: Your .env file is loaded to access API keys.
  2. Create MCP Client: The client connects to the MCP server using configurations in a separate file.
  3. Define LLM: Set which LLM model you want to use (e.g., OpenAI GPT-4).
  4. Create Agent: The agent binds the LLM and MCP client, sets max steps, and provides prompts.
  5. Execute and Receive Output: The agent communicates with the MCP server and returns results.

This architecture eliminates the need for separate MCP clients and allows you to build modular applications, such as autonomous WhatsApp agents or real estate listing filters.
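Putting the five steps together, the script looks roughly like the following sketch. It assumes the `mcp_use` package exposes `MCPAgent` and `MCPClient` classes and that your server configuration lives in a JSON file (the file name, model name, and prompt below are illustrative); check the repository's README for the exact, current API:

```python
import asyncio

from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

async def main():
    # Step 1: load API keys (e.g., OPENAI_API_KEY) from the .env file.
    load_dotenv()

    # Step 2: create the MCP client from a separate configuration file.
    client = MCPClient.from_config_file("airbnb_mcp.json")  # hypothetical file name

    # Step 3: define which LLM to use.
    llm = ChatOpenAI(model="gpt-4o")

    # Step 4: bind the LLM and client into an agent with a step limit.
    agent = MCPAgent(llm=llm, client=client, max_steps=30)

    # Step 5: run a prompt against the MCP server and print the result.
    result = await agent.run(
        "Find Airbnb listings with a pool and high ratings"
    )
    print(result)

if __name__ == "__main__":
    asyncio.run(main())
```

Running this requires the library installed, a valid API key, and network access, so treat it as a template rather than a copy-paste answer.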


Real-World Example: Airbnb Listing Filter

Using the Airbnb MCP server, you can create an agent that fetches listings filtered by your preferencesโ€”like properties with pools and high ratings. The agent queries the MCP server, processes the data through the LLM, and returns curated results. This showcases the power and flexibility of the framework.


Enhancing Your Development with Cursor and Docs Integration

If you use Cursor as your code editor, you might notice it lacks context about the MCP framework by default. To fix this:

  1. Go to Cursorโ€™s features section and open the docs.
  2. Add a new doc and paste the link to the MCP libraryโ€™s README file from the GitHub repository.
  3. Cursor will index this documentation, enabling it to generate context-aware code snippets.

Additionally, you can convert the entire repository into a format digestible by LLMs by editing the URL: replacing hub with ingest turns github.com into gitingest.com. This opens the repo in GitIngest, which renders it as plain text you can query directly for clarifications or help.
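The URL rewrite amounts to a one-line string substitution. A tiny helper (the function name is mine, purely illustrative):

```python
def to_gitingest_url(github_url: str) -> str:
    """Turn a GitHub repository URL into its GitIngest equivalent
    by replacing 'hub' with 'ingest' in the domain."""
    return github_url.replace("github.com", "gitingest.com", 1)

print(to_gitingest_url("https://github.com/user/repo"))
# https://gitingest.com/user/repo
```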


Expanding Possibilities: Multi-Server and Service Manager Support

The library supports defining multiple MCP servers in a single configuration file. You can either:

  • Specify which server should handle each request, or
  • Enable the server manager, which intelligently routes requests to the appropriate server.

You can also control which tools the agent can access, opening up possibilities for complex, multi-faceted applications.
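Multi-server setups typically follow the standard MCP JSON configuration layout, with one entry per server under a shared key. The server names and packages below are illustrative placeholders; consult the repository's example configurations for the exact schema it expects:

```json
{
  "mcpServers": {
    "airbnb": {
      "command": "npx",
      "args": ["-y", "@openbnb/mcp-server-airbnb"]
    },
    "playwright": {
      "command": "npx",
      "args": ["-y", "@playwright/mcp"]
    }
  }
}
```

With both servers defined in one file, the agent (or the server manager, if enabled) decides which one should handle each request.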


Why You Should Try This Framework

This MCP client library is a game-changer for developers looking to leverage MCP servers and LLMs together. It offers:

  • Flexibility to use your favorite LLM.
  • Simplified architecture without needing separate MCP clients.
  • Modular design for creative application development.
  • Support for local and remote servers.
  • Intelligent multi-server management.

The code is open-source and available on GitHub, with example projects like using Playwright with Airbnb or running the Blender MCP server.


Final Thoughts

Whether youโ€™re an experienced developer or new to coding, this library provides the tools and flexibility to build powerful, autonomous agents that interact with MCP servers directly. Use ChatGPT or Cursor to help write or modify your code, and explore the vast possibilities this framework unlocks.

Ready to dive in? Check out the GitHub repo, try the examples, and start building your own MCP-powered applications today!


Useful Links


If you found this post helpful, consider subscribing and supporting the ongoing development through donations linked in the repository. Happy coding!


๐Ÿ“ Transcript (168 entries):

I'm going to show you a library that lets you communicate with your MCP servers directly in your code. You can use it with any LLM you prefer. Previously, communicating with MCP servers required specific MCP clients. We already have clients like Windsurf, Cursor, and Claude Desktop. Now, you can use this new MCP client library. It works by using an agent and comes with some pretty cool features. It's also really easy to install. I'll show you how to set it up. If you're not familiar with code, keep in mind that this does require some coding knowledge to use properly. Even if you're new to it, that's not a big issue. I'll also show you how to vibe code with it. Let's get into the video. Let's look at the installation of the library. It's a Python-based library. So, the first step is to check whether Python is installed on your system. Then, you'll create a virtual environment. Once it's created, you need to activate it. The commands for both Windows and macOS are shown here, and I'll paste them in the description below as well. If you don't know how to code, you can also get these commands from ChatGPT. These are the commands to install the library. They're available on the GitHub repo, too. If you're using Python version 3, which you'll know from the Python version command mentioned earlier, then make sure to run everything using pip3, not pip. If you're planning to use an OpenAI key or model to interact with your MCPs, you need to install langchain-openai. For Anthropic, you need to install langchain-anthropic. For other providers like Groq or Llama, you can find the required libraries listed at this link. So once everything is installed, open up your terminal. You can see that I'm in a Python environment. I've installed the pip packages. And now I'll show you how to move forward. First open this directory in Cursor using this command. This will launch your project directly in Cursor. Once it opens in Cursor, create a new file named .env.
Your project will only have the virtual environment folder. At this point, you need to create the other files yourself and add the code manually. Create the .env file and paste the following line with your API key. You only need to paste the key for the provider you're using. I'm using OpenAI in this example. So, I pasted that key. Let me quickly explain the code and how it works. At the top, you can see different imports from mcp_use, LangChain, and OpenAI. As we're using OpenAI for the LLM, we define a function, and load_dotenv() loads the environment variables from the .env file. Then we create an MCP client using the MCP configuration from a separate file. I've placed it here. You can see the Airbnb MCP configuration is right here. Next, we define the LLM we want to use. If you're using Anthropic, the setup will be a bit different. Then we choose our model and create an agent. This agent takes the LLM and the client we created, and defines the maximum number of steps it can take. It also gives a prompt to the LLM which is used on the MCP server. It then prints the result and gives it to us. This is a very basic example of using the Airbnb MCP. You can modify it however you like and build really interesting applications. You don't need a separate client anymore. You can bind an LLM to an MCP and create modular applications. If you've seen our WhatsApp MCP video, that same concept can be used here to make fully autonomous WhatsApp agents. Now, let me run it for you. The server has started and it's running. It looks like there was some kind of error, but we still got the output. We received the listings from the Airbnb MCP. It gave us the links because we added a feature that filters listings by our preferences, like having a pool and good ratings. It handpicked based on those conditions. This is a cool implementation. It works, and the possibilities for creating different agents are endless.
The code you just saw is already in the GitHub repository, so there's no need to include it in the description. If you want to modify the code, you can either write it yourself or ask Cursor to do it. One issue you might run into is that Cursor doesn't have the context of this framework. To give it that context, scroll down to the features section and go to docs. Add a new doc, and in the link field, go back to the GitHub repo and open the readme file. You don't need to provide the link to the entire repository. Just use the readme file since it contains the full documentation. Copy the link and paste it into the doc section. Cursor will read it, index it, and use it as context. To use it in code, type the at sign, go into docs, and select the mcp-use docs. It will reference that and generate code based on the framework properly. Another thing you can do is convert the repo into an LLM-ingestible format if you have any questions about it. To do that, replace "hub" with "ingest" in the URL, turning github.com into gitingest.com. This will open the repository in GitIngest, which converts the entire repo into readable text that you can use with any LLM. You can then ask questions about it if you're ever confused or need clarification. You've seen it in action, and you can check the repo for other example use cases like using Playwright and Airbnb. I used the Airbnb one but with OpenAI. The Blender MCP server can also be used. This framework also supports HTTP connections, which means you can connect to servers running on localhost. It includes multi-server support too, allowing multiple servers to be defined in a single file. If you're working with multiple MCP servers, you can either specify which result should come from which server or handle it dynamically. By setting use_server_manager to true, the agent will intelligently choose the right MCP server. You can also control which tools it has access to.
This is a solid framework, and I'm already thinking of all the wild ways to build new applications with the MCP library. You should check it out too. I'm working on a few projects with it right now. If you don't fully understand it, I've already shown how you can use an LLM to make sense of everything. You can also ask ChatGPT or let Cursor write the code for you. If you like the video, consider donating through the link in the description and do subscribe. Thanks for watching.