A Model Context Protocol (MCP) server that enables seamless generation of high-quality images using the Flux.1 Schnell model via Together AI. This server provides a standardized interface to specify image generation parameters.
npm install together-mcp
Or run directly:
npx together-mcp@latest
Add to your MCP server configuration:
{
"mcpServers": {
"together-image-gen": {
"command": "npx",
"args": ["-y", "together-mcp@latest"],
"env": {
"TOGETHER_API_KEY": "<API KEY>"
}
}
}
}
The server provides one tool: generate_image
This tool has only one required parameter - the prompt. All other parameters are optional and use sensible defaults if not provided.
{
// Required
prompt: string; // Text description of the image to generate
// Optional with defaults
model?: string; // Default: "black-forest-labs/FLUX.1-schnell-Free"
width?: number; // Default: 1024 (min: 128, max: 2048)
height?: number; // Default: 768 (min: 128, max: 2048)
steps?: number; // Default: 1 (min: 1, max: 100)
n?: number; // Default: 1 (max: 4)
response_format?: string; // Default: "b64_json" (options: ["b64_json", "url"])
image_path?: string; // Optional: Path to save the generated image as PNG
}
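For illustration, the defaults and bounds listed above could be applied like this. This is a sketch only; `withDefaults` and `clamp` are hypothetical helpers, not part of the server's actual code:

```typescript
// Sketch: apply the documented defaults and clamp values to their
// documented ranges before forwarding arguments to the API.
interface GenerateImageArgs {
  prompt: string;
  model?: string;
  width?: number;
  height?: number;
  steps?: number;
  n?: number;
  response_format?: "b64_json" | "url";
  image_path?: string;
}

const clamp = (value: number, min: number, max: number): number =>
  Math.min(max, Math.max(min, value));

function withDefaults(args: GenerateImageArgs) {
  return {
    prompt: args.prompt,
    model: args.model ?? "black-forest-labs/FLUX.1-schnell-Free",
    width: clamp(args.width ?? 1024, 128, 2048),
    height: clamp(args.height ?? 768, 128, 2048),
    steps: clamp(args.steps ?? 1, 1, 100),
    n: clamp(args.n ?? 1, 1, 4),
    response_format: args.response_format ?? "b64_json",
  };
}
```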
Only the prompt is required:
{
"name": "generate_image",
"arguments": {
"prompt": "A serene mountain landscape at sunset"
}
}
Override any defaults and specify a path to save the image:
{
"name": "generate_image",
"arguments": {
"prompt": "A serene mountain landscape at sunset",
"width": 1024,
"height": 768,
"steps": 20,
"n": 1,
"response_format": "b64_json",
"model": "black-forest-labs/FLUX.1-schnell-Free",
"image_path": "/path/to/save/image.png"
}
}
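Under the hood, requests like this are forwarded to Together AI's image generation API. The sketch below assumes the standard `/v1/images/generations` endpoint and Node's built-in `fetch`; the function names are illustrative, not the server's actual implementation:

```typescript
// Sketch: build and send a Together AI image generation request.
// buildRequest and generateImage are hypothetical names for illustration.
function buildRequest(apiKey: string, args: Record<string, unknown>) {
  return {
    url: "https://api.together.xyz/v1/images/generations",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    } as Record<string, string>,
    body: args,
  };
}

async function generateImage(apiKey: string, args: Record<string, unknown>) {
  const req = buildRequest(apiKey, args);
  const res = await fetch(req.url, {
    method: "POST",
    headers: req.headers,
    body: JSON.stringify(req.body),
  });
  return res.json(); // shape matches the response documented below
}
```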
The response will be a JSON object containing:
{
"id": string, // Generation ID
"model": string, // Model used
"object": "list",
"data": [
{
"timings": {
"inference": number // Time taken for inference
},
"index": number, // Image index
"b64_json": string // Base64 encoded image data (if response_format is "b64_json")
// OR
"url": string // URL to generated image (if response_format is "url")
}
]
}
If image_path was provided and the save was successful, the response will include confirmation of the save location.
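Saving the image amounts to decoding the base64 payload and writing it to disk. A minimal sketch of that step (the helper name `saveBase64Png` is hypothetical, not the server's actual function):

```typescript
import { writeFileSync } from "node:fs";

// Sketch: decode a b64_json payload and write it to the given path,
// roughly what happens when image_path is supplied.
function saveBase64Png(b64: string, path: string): void {
  writeFileSync(path, Buffer.from(b64, "base64"));
}
```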
If not specified in the request, these defaults are used:
- model: "black-forest-labs/FLUX.1-schnell-Free"
- width: 1024
- height: 768
- steps: 1
- n: 1
- response_format: "b64_json"
Note: the prompt parameter is always required.
The server depends on:
{
"@modelcontextprotocol/sdk": "0.6.0",
"axios": "^1.6.7"
}
Clone and build the project:
git clone https://github.com/manascb1344/together-mcp-server
cd together-mcp-server
npm install
npm run build
Available npm scripts:
- npm run build - Build the TypeScript project
- npm run watch - Watch for changes and rebuild
- npm run inspector - Run the MCP inspector
Contributions are welcome! Please follow these steps:
Create a feature branch (e.g. feature/my-new-feature) and open a pull request.
Feature requests and bug reports can be submitted via GitHub Issues. Please check existing issues before creating a new one.
For significant changes, please open an issue first to discuss your proposed changes.
This project is licensed under the MIT License. See the LICENSE file for details.