Memgraph MCP Server is a lightweight server implementation of the Model Context Protocol (MCP) designed to connect Memgraph with LLMs.
1. Install uv and create a virtual environment with `uv venv`. Activate it with `.venv\Scripts\activate` on Windows or `source .venv/bin/activate` on MacOS/Linux.
2. Install dependencies: `uv add "mcp[cli]" httpx`
3. Run the Memgraph MCP server: `uv run server.py`
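To give a sense of what such a server looks like internally, below is a minimal sketch of a `server.py` built on the MCP Python SDK's `FastMCP` helper and the neo4j Bolt driver. The tool name, connection URI, and empty credentials are illustrative assumptions rather than the repository's actual implementation, and the neo4j driver would be an extra dependency (e.g. `uv add neo4j`).

```python
# server.py -- illustrative sketch only; the real server.py in the repository may differ.
# Assumes the neo4j Bolt driver is installed (e.g. `uv add neo4j`) and that
# Memgraph is reachable at bolt://localhost:7687 with no authentication.
import json

from mcp.server.fastmcp import FastMCP
from neo4j import GraphDatabase

mcp = FastMCP("memgraph")
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("", ""))

@mcp.tool()
def run_query(query: str) -> str:
    """Run a Cypher query against Memgraph and return the result rows as JSON."""
    with driver.session() as session:
        rows = [record.data() for record in session.run(query)]
    return json.dumps(rows, default=str)

if __name__ == "__main__":
    # Claude Desktop launches the server and talks to it over stdio.
    mcp.run(transport="stdio")
```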
To use the server from Claude for Desktop, open the Claude Desktop config file:

MacOS/Linux: `code ~/Library/Application\ Support/Claude/claude_desktop_config.json`

Windows: `code $env:AppData\Claude\claude_desktop_config.json`
Example config:
```json
{
  "mcpServers": {
    "mpc-memgraph": {
      "command": "/Users/katelatte/.local/bin/uv",
      "args": [
        "--directory",
        "/Users/katelatte/projects/mcp-memgraph",
        "run",
        "server.py"
      ]
    }
  }
}
```
> [!NOTE]
> You may need to put the full path to the uv executable in the command field. You can get this by running `which uv` on MacOS/Linux or `where uv` on Windows. Make sure you pass in the absolute path to your server.
Start Memgraph with schema info enabled:

```bash
docker run -p 7687:7687 memgraph/memgraph-mage --schema-info-enabled=True
```
The `--schema-info-enabled` configuration setting is set to `True` so that the LLM is allowed to run the `SHOW SCHEMA INFO` query.

The server exposes two operations: run a Cypher query against Memgraph, and get Memgraph schema information (prerequisite: `--schema-info-enabled=True`).
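As a quick sanity check outside of MCP, you can exercise the same capabilities directly over Bolt. The snippet below is a sketch that assumes the neo4j Python driver, an unauthenticated Memgraph instance on `bolt://localhost:7687`, and that Memgraph was started with `--schema-info-enabled=True`.

```python
# Standalone sketch (not part of server.py): verify that Memgraph answers
# Cypher queries and that SHOW SCHEMA INFO is available.
# Assumes: `pip install neo4j`, Memgraph on bolt://localhost:7687, no auth.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("", ""))

with driver.session() as session:
    # Any Cypher query works here, e.g. counting nodes in the graph.
    count = session.run("MATCH (n) RETURN count(n) AS n").single()["n"]
    print("nodes:", count)

    # Requires --schema-info-enabled=True on the Memgraph side.
    for record in session.run("SHOW SCHEMA INFO"):
        print(record.data())

driver.close()
```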
The Memgraph MCP Server is just at its beginnings. We're actively working on expanding its capabilities and making it even easier to integrate Memgraph into modern AI workflows. In the near future, we'll be releasing a TypeScript version of the server to better support JavaScript-based environments. Additionally, we plan to migrate this project into our central AI Toolkit repository, where it will live alongside other tools and integrations for LangChain, LlamaIndex, and MCP. Our goal is to provide a unified, open-source toolkit that makes it seamless to build graph-powered applications and intelligent agents with Memgraph at the core.