An MCP server implementation for retrieving data such as PDFs from S3.

It exposes AWS S3 data through Resources (think of these as roughly analogous to GET endpoints; they are used to load information into the LLM's context). Currently only PDF documents are supported, and listings are limited to 1000 objects.
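As a rough sketch of what that behavior could look like internally (illustrative Python using boto3, not the server's actual code; the function names and bucket handling are assumptions):

```python
import boto3

s3 = boto3.client("s3")

def list_pdf_resources(bucket_name: str, max_keys: int = 1000) -> list[str]:
    """List up to max_keys objects in the bucket and keep only PDF keys.

    Illustrative only: the real server's listing and filtering may differ.
    """
    response = s3.list_objects_v2(Bucket=bucket_name, MaxKeys=max_keys)
    return [
        obj["Key"]
        for obj in response.get("Contents", [])
        if obj["Key"].lower().endswith(".pdf")
    ]

def read_pdf_resource(bucket_name: str, key: str) -> bytes:
    """Return the raw bytes of a single PDF object for use as resource content."""
    return s3.get_object(Bucket=bucket_name, Key=key)["Body"].read()
```

A single list_objects_v2 call returns at most 1000 keys, which lines up with the 1000-object limit noted above.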
To use the server with Claude Desktop, add it to your claude_desktop_config.json:

On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%\Claude\claude_desktop_config.json
For a local development checkout, run the server from its source directory with uv:

```json
{
  "mcpServers": {
    "s3-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "/Users/user/generative_ai/model_context_protocol/s3-mcp-server",
        "run",
        "s3-mcp-server"
      ]
    }
  }
}
```
For the published package, run it with uvx:

```json
{
  "mcpServers": {
    "s3-mcp-server": {
      "command": "uvx",
      "args": [
        "s3-mcp-server"
      ]
    }
  }
}
```
To prepare the package for distribution, sync dependencies and build:

```bash
uv sync
uv build
```

This will create source and wheel distributions in the dist/ directory. Then publish to PyPI:

```bash
uv publish
```
Note: You'll need to set PyPI credentials via environment variables or command flags: a token (--token or UV_PUBLISH_TOKEN), or a username and password (--username / UV_PUBLISH_USERNAME and --password / UV_PUBLISH_PASSWORD).
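For example, publishing with an API token could look like either of the following (the token value is a placeholder):

```bash
# Pass the token directly on the command line
uv publish --token <your-pypi-token>

# Or provide it via the environment variable instead
UV_PUBLISH_TOKEN=<your-pypi-token> uv publish
```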
Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.
You can launch the MCP Inspector via npm with this command:

```bash
npx @modelcontextprotocol/inspector uv --directory /Users/user/generative_ai/model_context_protocol/s3-mcp-server run s3-mcp-server
```
Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.
See CONTRIBUTING for more information.
This library is licensed under the MIT-0 License. See the LICENSE file.