"primitive" RAG-like web search model context protocol (MCP) server that runs locally. ✨ no APIs ✨
This server requires uv: https://docs.astral.sh/uv/

Just paste this directly into your Claude config. You can find the configuration paths here: https://modelcontextprotocol.io/quickstart/user
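If uv is not installed yet, the standalone installer from the uv documentation can be used (macOS/Linux; Windows users should see the uv docs for the PowerShell equivalent):

```shell
# Install uv via the official standalone installer.
curl -LsSf https://astral.sh/uv/install.sh | sh
```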
```json
{
  "mcpServers": {
    "mcp-local-rag": {
      "command": "uvx",
      "args": [
        "--python=3.10",
        "--from",
        "git+https://github.com/nkapila6/mcp-local-rag",
        "mcp-local-rag"
      ]
    }
  }
}
```
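For reference, with the config above Claude launches the server using the equivalent of the following one-liner — this is just the same uvx invocation spelled out, not an extra step you need to run:

```shell
# Fetch mcp-local-rag from GitHub and run it on Python 3.10
# (identical to the "command" + "args" in the config above).
uvx --python=3.10 --from git+https://github.com/nkapila6/mcp-local-rag mcp-local-rag
```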
Alternatively, clone the repository and point your Claude config at the local copy:

```shell
git clone https://github.com/nkapila6/mcp-local-rag
```
```json
{
  "mcpServers": {
    "mcp-local-rag": {
      "command": "uv",
      "args": [
        "--directory",
        "<path where this folder is located>/mcp-local-rag/",
        "run",
        "src/mcp_local_rag/main.py"
      ]
    }
  }
}
```
When asked to fetch, look up, or search the web, the model prompts you to allow the MCP server for the chat. In the example, I asked it about Google's latest Gemma models, released the day before. This is new information that Claude is not aware of. The result from the local `rag_search` tool helps the model answer with up-to-date information.
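Under the hood, MCP tool invocations are JSON-RPC 2.0 `tools/call` requests sent over stdio. A minimal sketch of what the client might send — the argument name and query are illustrative assumptions; only the `rag_search` tool name comes from the example above:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "rag_search",
    "arguments": { "query": "latest Google Gemma models" }
  }
}
```

The server replies with a result payload containing the retrieved web content, which the model then uses to ground its answer.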