
MCP To LangChain Tools Conversion Utility

This package is intended to simplify the use of Model Context Protocol (MCP) server tools with LangChain / Python.

Model Context Protocol (MCP), an open source technology announced by Anthropic, dramatically expands generative AI’s scope by enabling external tool and resource integration, including Google Drive, Slack, Notion, Spotify, Docker, PostgreSQL, and more…

Over 450 functional components are available as MCP servers.

The goal of this utility is to make these 450+ MCP servers readily accessible from LangChain.

It contains a utility function convert_mcp_to_langchain_tools().
This async function initializes the specified MCP servers in parallel and converts their available tools into a list of LangChain-compatible tools.


A TypeScript equivalent of this utility is available here.

Prerequisites

  • Python 3.11+

Installation

pip install langchain-mcp-tools

Quick Start

The convert_mcp_to_langchain_tools() utility function accepts MCP server configurations in the same format as Claude for Desktop, but limited to the contents of the mcpServers property and expressed as a dict, e.g.:

mcp_configs = {
    'filesystem': {
        'command': 'npx',
        'args': ['-y', '@modelcontextprotocol/server-filesystem', '.']
    },
    'fetch': {
        'command': 'uvx',
        'args': ['mcp-server-fetch']
    }
}

# from langchain_mcp_tools import convert_mcp_to_langchain_tools
tools, cleanup = await convert_mcp_to_langchain_tools(
    mcp_configs
)

This utility function initializes all specified MCP servers in parallel, gathers their available MCP tools, and wraps them into LangChain tools, returning them as tools: List[BaseTool]. It also returns an async callback function (cleanup: McpServerCleanupFn) that should be invoked to close all MCP server sessions when finished.
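Since cleanup closes the underlying MCP server sessions, it is usually worth making sure it runs even if the agent invocation raises. A minimal sketch of that pattern; the run_agent wrapper and its body are illustrative, not part of this library's API:

# from langchain_mcp_tools import convert_mcp_to_langchain_tools

async def run_agent() -> None:
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_configs)
    try:
        ...  # build an LLM / agent with `tools` and run queries here
    finally:
        # Close all MCP server sessions even if the agent run fails
        await cleanup()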

The returned tools can be used with LangChain, e.g.:

# from langchain.chat_models import init_chat_model
llm = init_chat_model(
    model='claude-3-5-haiku-latest',
    model_provider='anthropic'
)

# from langgraph.prebuilt import create_react_agent
agent = create_react_agent(
    llm,
    tools
)
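From there the agent can be invoked asynchronously. The sketch below is illustrative: the query text, the messages format, and reading the last returned message are assumptions about typical LangGraph usage, not part of this library's API:

# Illustrative invocation (query text and result handling are assumptions)
result = await agent.ainvoke({
    'messages': [('user', 'Read ./README.md and summarize it briefly')]
})
# The final message in the returned state carries the agent's answer
print(result['messages'][-1].content)

# When finished, close all MCP server sessions
await cleanup()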

Find complete, minimal working usage examples here.

For hands-on experimentation with MCP server integration, try this LangChain application built with the utility.

For detailed information on how to use this library, please refer to the following document:
"Supercharging LangChain: Integrating 450+ MCP with ReAct"

Limitations

Currently, only text results of tool calls are supported.