A lightweight MCP (Model Context Protocol) server for Garak.
Example:
https://github.com/user-attachments/assets/f6095d26-2b79-4ef7-a889-fd6be27bbbda
Available tools:

| Name | Description |
|---|---|
| list_model_types | List all available model types (ollama, openai, huggingface, ggml) |
| list_models | List all available models for a given model type |
| list_garak_probes | List all available Garak attacks/probes |
| get_report | Get the report of the last run |
| run_attack | Run an attack with a given model and probe |
Tool details:

**list_model_types**

**list_models**

- `model_type` (string, required): The type of model to list (ollama, openai, huggingface, ggml)

**list_garak_probes**

**get_report**

**run_attack**

- `model_type` (string, required): The type of model to use
- `model_name` (string, required): The name of the model to use
- `probe_name` (string, required): The name of the attack/probe to use
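As a sketch of how an MCP client invokes these tools, a standard `tools/call` request for `run_attack` could look like the following. The argument values (an Ollama `llama3` model and the `dan` probe) are illustrative assumptions, not defaults shipped with this server:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_attack",
    "arguments": {
      "model_type": "ollama",
      "model_name": "llama3",
      "probe_name": "dan"
    }
  }
}
```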
Requirements:

1. **Python 3.11 or higher**: This project requires Python 3.11 or newer.

   ```bash
   # Check your Python version
   python --version
   ```

2. **Install uv**: A fast Python package installer and resolver.

   ```bash
   pip install uv
   ```

   Or use Homebrew:

   ```bash
   brew install uv
   ```

3. **Optional: Ollama**: If you want to run attacks on Ollama models, make sure the Ollama server is running.

   ```bash
   ollama serve
   ```
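   If the model you want to attack is not yet available locally, you can pull it first. The model name below is an illustrative example:

   ```bash
   # Pull a model so it is available for attacks (llama3 is just an example)
   ollama pull llama3
   ```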
Installation:

```bash
git clone https://github.com/BIGdeadLock/Garak-MCP.git
```
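As a quick sanity check, you can try starting the server directly with uv from the cloned directory; this mirrors the `command` and `args` used in the client configuration below:

```bash
cd Garak-MCP
# Launch the MCP server the same way the client config below does
uv run garak-server
```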
Then add the server to your MCP client configuration, replacing `path-to/Garak-MCP` with the path where you cloned the repository:

```json
{
  "mcpServers": {
    "garak-mcp": {
      "command": "uv",
      "args": ["--directory", "path-to/Garak-MCP", "run", "garak-server"],
      "env": {}
    }
  }
}
```
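Where exactly this JSON goes depends on your MCP client. As one example (an assumption, since this section does not name a specific client), Claude Desktop reads it from `claude_desktop_config.json`:

```bash
# macOS path for Claude Desktop's config (differs per OS and per client)
open ~/Library/Application\ Support/Claude/claude_desktop_config.json
```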
Tested on: