Convert LangChain tools to FastMCP tools in one line of code.
Stop rewriting your tools. Just adapt them.
lc2mcp is a lightweight adapter that converts existing LangChain tools into FastMCP tools, enabling you to quickly build MCP servers accessible to Claude, Cursor, and any MCP-compatible client.
| Feature | Description |
|---|---|
| Instant Conversion | One function call converts any LangChain tool to a FastMCP tool |
| Ecosystem Access | Unlock 1000+ LangChain community tools (search, Wikipedia, SQL, APIs, ...) |
| Zero Boilerplate | Automatic Pydantic → JSON Schema conversion |
| Context Injection | Pass auth, user info, and request context to tools |
| Progress & Logging | Full support for MCP progress notifications and logging |
| Namespace Support | Prefix tool names and handle conflicts automatically |
```bash
pip install lc2mcp
```

```python
from langchain_core.tools import tool
from fastmcp import FastMCP
from lc2mcp import register_tools

@tool
def get_weather(city: str) -> str:
    """Get current weather for a city."""
    return f"Sunny, 25°C in {city}"

mcp = FastMCP("weather-server")
register_tools(mcp, [get_weather])  # ← That's it!

if __name__ == "__main__":
    mcp.run()
```

Your tool is now available to Claude, Cursor, and any MCP client.
To include parameter descriptions in the MCP tool schema, you have two options:
Option 1: Use `parse_docstring=True` (recommended for simplicity)

```python
from langchain_core.tools import tool

@tool(parse_docstring=True)
def get_weather(city: str, unit: str = "celsius") -> str:
    """Get current weather for a city.

    Args:
        city: The name of the city to query
        unit: Temperature unit (celsius or fahrenheit)
    """
    return f"Sunny, 25°C in {city}"
```

Option 2: Use `args_schema` (recommended for complex types)
```python
from langchain_core.tools import tool
from pydantic import BaseModel, Field

class WeatherInput(BaseModel):
    city: str = Field(description="The name of the city to query")
    unit: str = Field(default="celsius", description="Temperature unit")

@tool(args_schema=WeatherInput)
def get_weather(city: str, unit: str = "celsius") -> str:
    """Get current weather for a city."""
    return f"Sunny, 25°C in {city}"
```

Note: Without `parse_docstring=True` or `args_schema`, parameter descriptions from docstrings will not be extracted.
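To see what actually lands in the tool schema, you can inspect the JSON Schema that Pydantic derives from `args_schema` (Pydantic v2 shown; this is plain Pydantic behavior, independent of lc2mcp):

```python
from pydantic import BaseModel, Field

class WeatherInput(BaseModel):
    city: str = Field(description="The name of the city to query")
    unit: str = Field(default="celsius", description="Temperature unit")

# The field descriptions and defaults survive into the JSON Schema,
# which is what MCP clients see for the tool's parameters.
schema = WeatherInput.model_json_schema()
print(schema["properties"]["city"]["description"])  # The name of the city to query
print(schema["properties"]["unit"]["default"])      # celsius
```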
```
┌──────────────────┐      ┌─────────────┐      ┌──────────────────┐
│  LangChain Tool  │ ───▶ │   lc2mcp    │ ───▶ │   FastMCP Tool   │
│  (@tool, etc.)   │      │  (adapter)  │      │                  │
└──────────────────┘      └─────────────┘      └────────┬─────────┘
                                                        │
                                                        ▼
                                               ┌──────────────────┐
                                               │  FastMCP Server  │
                                               └────────┬─────────┘
                                                        │
                                                        ▼
                                            ┌───────────────────────┐
                                            │      MCP Clients      │
                                            │ (Claude, Cursor, ...) │
                                            └───────────────────────┘
```
LangChain and MCP ecosystems can be connected in both directions:
| Direction | Tool | Description |
|---|---|---|
| LangChain → MCP | lc2mcp | Convert LangChain tools to MCP tools (this project) |
| MCP → LangChain | langchain-mcp-adapters | Convert MCP tools to LangChain tools (official) |
When to use lc2mcp:
- You have existing LangChain tools and want to expose them via MCP
- You want to build an MCP server using LangChain's rich tool ecosystem
- You need to serve tools to Claude, Cursor, or other MCP clients
When to use langchain-mcp-adapters:
- You have MCP servers and want to use them in LangChain agents
- You want to call MCP tools from LangGraph workflows
Using both together:
```
┌───────────────────────────────────────────────────────────────────┐
│                         Your Application                          │
├───────────────────────────────────────────────────────────────────┤
│                                                                   │
│   LangChain Tools ──── lc2mcp ────▶ MCP Server ──▶ MCP Clients    │
│         │                                          (Claude, etc)  │
│         │                                                │        │
│         ▼                                                ▼        │
│   LangChain Agent ◀── langchain-mcp-adapters ◀────── MCP Tools    │
│                                                                   │
└───────────────────────────────────────────────────────────────────┘
```
Both libraries are complementary and can be used together to build powerful AI applications that bridge the LangChain and MCP ecosystems.
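A rough, untested sketch of what the combined setup could look like (the server name, URL, and `my_app.tools` module are hypothetical; `MultiServerMCPClient` is langchain-mcp-adapters' documented entry point, assumed here as of its 0.1.x API):

```python
import asyncio

from fastmcp import FastMCP
from langchain_mcp_adapters.client import MultiServerMCPClient
from lc2mcp import register_tools

from my_app.tools import my_langchain_tools  # hypothetical module of LangChain tools

# Outbound: expose your LangChain tools to MCP clients via lc2mcp
mcp = FastMCP("my-server")
register_tools(mcp, my_langchain_tools)

# Inbound: pull tools from external MCP servers into LangChain
async def load_external_tools():
    client = MultiServerMCPClient({
        "search": {"url": "http://localhost:8000/mcp", "transport": "streamable_http"},
    })
    return await client.get_tools()  # LangChain-compatible tools

external_tools = asyncio.run(load_external_tools())
```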
Instantly expose DuckDuckGo search and Wikipedia to MCP clients:
```bash
pip install lc2mcp langchain-community duckduckgo-search wikipedia
```

```python
from fastmcp import FastMCP
from langchain_community.tools import DuckDuckGoSearchRun, WikipediaQueryRun
from langchain_community.utilities import WikipediaAPIWrapper
from lc2mcp import register_tools

mcp = FastMCP("knowledge-server")
register_tools(mcp, [
    DuckDuckGoSearchRun(),
    WikipediaQueryRun(api_wrapper=WikipediaAPIWrapper()),
])

if __name__ == "__main__":
    mcp.run()
```

Inject user authentication and app context into your tools:
```python
from dataclasses import dataclass

from fastmcp import Context, FastMCP
from langchain_core.tools import tool
from langgraph.prebuilt import ToolRuntime
from lc2mcp import register_tools

@dataclass(frozen=True)
class UserContext:
    user_id: str
    tenant_id: str

@tool
def whoami(runtime: ToolRuntime[UserContext]) -> str:
    """Return the current user."""
    return f"Hello, user {runtime.context.user_id} from {runtime.context.tenant_id}"

def runtime_adapter(mcp_ctx: Context) -> ToolRuntime[UserContext]:
    return ToolRuntime(
        context=UserContext(
            user_id=mcp_ctx.get_state("user_id") or "anonymous",
            tenant_id=mcp_ctx.get_state("tenant_id") or "default",
        ),
        state={}, config={}, stream_writer=lambda x: None,
        tool_call_id=None, store=None,
    )

mcp = FastMCP("auth-server")
register_tools(mcp, [whoami], runtime_adapter=runtime_adapter)

if __name__ == "__main__":
    mcp.run()
```

Use MCP context for real-time progress updates and logging:
```python
from fastmcp import Context, FastMCP
from langchain_core.tools import tool
from lc2mcp import register_tools

@tool
async def process_data(data: str, mcp_ctx: Context) -> str:
    """Process data with progress reporting."""
    await mcp_ctx.info(f"Starting: {data}")
    await mcp_ctx.report_progress(0, 100, "Starting")
    # ... processing steps ...
    await mcp_ctx.report_progress(50, 100, "Processing")
    await mcp_ctx.info("Complete!")
    await mcp_ctx.report_progress(100, 100, "Done")
    return f"Processed: {data}"

mcp = FastMCP("processor")
register_tools(mcp, [process_data], inject_mcp_ctx=True)

if __name__ == "__main__":
    mcp.run()
```

Organize tools with prefixes and handle name collisions:
```python
from fastmcp import FastMCP
from lc2mcp import register_tools

mcp = FastMCP("multi-domain")

# Prefix all finance tools
register_tools(mcp, finance_tools, name_prefix="finance.")

# Auto-suffix on collision: tool → tool_2 → tool_3
register_tools(mcp, ops_tools, name_prefix="ops.", on_name_conflict="suffix")

if __name__ == "__main__":
    mcp.run()
```

Convert and register LangChain tools as FastMCP tools on a server.
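For intuition, the `on_name_conflict="suffix"` behavior shown in the namespace example can be modeled in a few lines of plain Python (a sketch of the naming rule only, not lc2mcp's actual implementation):

```python
def suffixed_name(name: str, taken: set[str]) -> str:
    """Return `name`, or the first free `name_2`, `name_3`, ... variant."""
    if name not in taken:
        return name
    i = 2
    while f"{name}_{i}" in taken:
        i += 1
    return f"{name}_{i}"

# A second registration of "ops.deploy" gets the next free suffix
print(suffixed_name("ops.deploy", {"ops.deploy"}))  # ops.deploy_2
```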
```python
register_tools(
    mcp: FastMCP,
    tools: list[BaseTool | Callable],
    *,
    name_prefix: str | None = None,           # e.g. "finance." → "finance.get_stock"
    on_name_conflict: str = "error",          # "error" | "overwrite" | "suffix"
    inject_mcp_ctx: bool = False,             # inject mcp_ctx: Context
    runtime_adapter: Callable | None = None,  # Context → ToolRuntime[...]
)
```

Convert a single LangChain tool to a FastMCP tool for manual registration.
```python
to_mcp_tool(
    tool: BaseTool | Callable,
    *,
    name: str | None = None,
    description: str | None = None,
    args_schema: Type[BaseModel] | None = None,
    inject_mcp_ctx: bool = False,
    runtime_adapter: Callable | None = None,
) -> Callable
```

| Component | Supported Versions |
|---|---|
| Python | 3.10, 3.11, 3.12+ |
| LangChain | >= 1.0.0 |
| FastMCP | >= 2.0.0 |
| Tool Type | Status |
|---|---|
| `@tool` decorated functions | ✅ Full support |
| `@tool(parse_docstring=True)` | ✅ Full support (with parameter descriptions) |
| `@tool(args_schema=...)` | ✅ Full support (with parameter descriptions) |
| `StructuredTool` | ✅ Full support |
| `BaseTool` subclasses | ✅ Supported (requires `args_schema`) |
Contributions are welcome! Please feel free to submit a Pull Request.
MIT License - see LICENSE for details.
Made with ❤️ for the LangChain and MCP communities