zz ( -.-) > ^ <
> what if i told you your AI assistant is just a cat?
> MCP server wraps httpcat CLI. cat now has tools. cat can trade.
// Model Context Protocol = cat-approved way for AI to use httpcat
// your LLM can create tokens, buy, sell, and check balances directly
{
  "mcpServers": {
    "httpcat": {
      "command": "npx",
      "args": ["-y", "httpcat-cli", "mcp-server"],
      "env": {
        "HTTPCAT_PRIVATE_KEY": "0x..."
      }
    }
  }
}

> that's it. your AI assistant can now trade tokens directly.
Model Context Protocol (MCP) is a standard way for AI assistants to interact with external tools and data sources. It uses stdio transport and JSON-RPC for communication.
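Because the transport is JSON-RPC over stdio, a tool invocation is just a serialized request object written to the server's stdin. A minimal sketch in Python: `tools/call` is MCP's standard tool-invocation method, but the tool name and argument fields below are illustrative, not httpcat's exact schema.

```python
import json

def jsonrpc_request(req_id, method, params):
    """Serialize one JSON-RPC 2.0 request, as an MCP client writes it to the server's stdin."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id, "method": method, "params": params})

# "tools/call" is the standard MCP method; the tool name and
# arguments here are illustrative placeholders.
line = jsonrpc_request(1, "tools/call", {
    "name": "buy_token",
    "arguments": {"token": "0x...", "amount": "0.1"},
})
print(line)
```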
> 1. Add httpcat to your MCP client configuration
> 2. Set your private key in the env (or pass per-tool)
> 3. Your LLM can now call httpcat tools directly
> 4. Tools return structured JSON responses
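The steps above can be sketched as a minimal hand-rolled client: launch the server command from the config and speak JSON-RPC over its stdin/stdout. The `initialize` handshake is standard MCP; the protocol version string and client info are assumptions for illustration.

```python
import json
import os
import shutil
import subprocess

# Step 1's config, as the command an MCP client would launch.
cmd = ["npx", "-y", "httpcat-cli", "mcp-server"]

# Standard MCP "initialize" handshake; protocolVersion is an assumed revision.
initialize = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "demo-client", "version": "0.0.1"},
    },
})

if shutil.which("npx"):  # only spawn if Node's npx is installed
    env = {**os.environ, "HTTPCAT_PRIVATE_KEY": "0x..."}  # step 2: key via env
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                            env=env, text=True)
    proc.stdin.write(initialize + "\n")
    proc.stdin.flush()
    proc.kill()  # demo only: don't leave a trading server running
```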
• Cursor (AI code editor)
• Claude Desktop (Anthropic)
• Any MCP-compatible client
Private keys are resolved in priority order: tool parameter > HTTPCAT_PRIVATE_KEY env var > config file. Never commit your private key to version control.
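The resolution order reads as a simple fallback chain. A sketch of that precedence; the function name and signature are hypothetical, not httpcat's actual API:

```python
import os

def resolve_private_key(tool_param=None, config_file_key=None):
    """Hypothetical sketch of the documented precedence:
    tool parameter > HTTPCAT_PRIVATE_KEY env var > config file."""
    return tool_param or os.environ.get("HTTPCAT_PRIVATE_KEY") or config_file_key

os.environ["HTTPCAT_PRIVATE_KEY"] = "0xfrom-env"
assert resolve_private_key("0xfrom-param") == "0xfrom-param"  # tool parameter wins
assert resolve_private_key() == "0xfrom-env"                  # falls back to env var
del os.environ["HTTPCAT_PRIVATE_KEY"]
assert resolve_private_key(config_file_key="0xfrom-config") == "0xfrom-config"
```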
> ask your AI assistant:
All tools return structured JSON responses with:
{
  "success": true,
  "operation": "buy_token",
  "responseId": "resp_...",
  "timestamp": "2024-...",
  "data": { ... }
}

Errors include structured error information with a message, code, and details.
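A consumer of this envelope can branch on the `success` flag. A hedged sketch: the success fields match the envelope above, but the exact key under which error details arrive is an assumption based on the description (message, code, details).

```python
import json

def handle_result(raw):
    """Branch on the success flag of the response envelope.
    The "error" key name is an assumption; the error fields
    (message, code, details) follow the text's description."""
    resp = json.loads(raw)
    if resp.get("success"):
        return resp["data"]
    err = resp.get("error", {})
    raise RuntimeError(f"{err.get('code')}: {err.get('message')}")

ok = handle_result('{"success": true, "operation": "buy_token", "data": {"filled": true}}')
print(ok)  # {'filled': True}
```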