MCPConn is a powerful Python library that serves as a bridge between your applications and numerous AI models through the Model Context Protocol (MCP). Whether you're building a chatbot, an AI-powered tool, or integrating AI capabilities into an existing application, MCPConn provides a streamlined, secure, and standardized way to interact with AI models.
MCPConn acts as a high-level client library that simplifies the integration of AI models into Python applications. It implements the Model Context Protocol (MCP), which provides a standardized way to communicate with different AI providers such as Anthropic and OpenAI. Think of it as a universal adapter for AI: instead of dealing with a different API for each provider, you get a consistent interface that works across all of them.
- Switch between AI providers (Anthropic, OpenAI) with minimal code changes
- Consistent interface regardless of the underlying AI model
- Future-proof your application against provider changes
- STDIO transport for local development and testing
- Server-Sent Events (SSE) for real-time streaming
- Streamable HTTP for standard web integration

Note: OpenAI only supports remote MCP endpoints
- Comprehensive guardrails system for content filtering
- PII detection and masking
- Injection attack prevention
- Custom word filtering and response blocking
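The PII-masking guardrail above can be pictured as a filter pass over text before it reaches the model or the end user. The snippet below is a minimal conceptual sketch in plain Python, not MCPConn's actual guardrails API; the patterns are deliberately simplified (emails and US-style phone numbers only):

```python
import re

# Simplified illustrative patterns; a real guardrail covers far more PII types.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def mask_pii(text: str) -> str:
    """Replace each detected PII span with a placeholder tag."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(mask_pii("Reach me at jane.doe@example.com or 555-123-4567."))
```

The same filter-pass shape also fits injection prevention and custom word blocking: each guardrail inspects or rewrites the text and either passes it through or blocks the response.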
- Built-in conversation history tracking
- Session management with unique conversation IDs
- Persistent context across multiple interactions
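Conceptually, session management boils down to keying message history by a unique conversation ID so context survives across turns. The following is an illustrative plain-Python sketch of that idea, not MCPConn's internal implementation:

```python
import uuid

class ConversationStore:
    """Toy per-conversation history store keyed by a unique ID."""

    def __init__(self):
        self._histories = {}

    def start(self) -> str:
        # Each conversation gets a unique ID so its context persists across turns.
        conv_id = str(uuid.uuid4())
        self._histories[conv_id] = []
        return conv_id

    def append(self, conv_id: str, role: str, content: str) -> None:
        self._histories[conv_id].append({"role": role, "content": content})

    def history(self, conv_id: str) -> list:
        return self._histories[conv_id]

store = ConversationStore()
conv = store.start()
store.append(conv, "user", "Hello!")
store.append(conv, "assistant", "Hi, how can I help?")
print(len(store.history(conv)))  # two turns recorded
```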
- Built with `asyncio` for high-performance, non-blocking I/O
- Efficient handling of concurrent requests
- Streaming response support
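The payoff of non-blocking I/O is that in-flight requests overlap instead of queuing. The sketch below simulates three concurrent requests with `asyncio.gather`; `fake_query` is a stand-in coroutine for a real client call, not part of mcpconn:

```python
import asyncio
import time

async def fake_query(prompt: str, delay: float) -> str:
    # Stand-in for an awaitable client call; sleeps instead of doing network I/O.
    await asyncio.sleep(delay)
    return f"response to {prompt!r}"

async def main() -> None:
    start = time.perf_counter()
    # All three "requests" run concurrently, so total time is ~0.1s, not ~0.3s.
    results = await asyncio.gather(
        fake_query("a", 0.1),
        fake_query("b", 0.1),
        fake_query("c", 0.1),
    )
    elapsed = time.perf_counter() - start
    print(results)
    print(f"elapsed: {elapsed:.2f}s")

asyncio.run(main())
```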
- Standardized way to expose external tools to AI models
- Consistent tool usage across different providers
- Easy integration of custom tools and APIs
```bash
pip install mcpconn
```
```python
import asyncio
from mcpconn import MCPClient

async def main():
    # Set your OpenAI API key in the environment before running:
    # export OPENAI_API_KEY="your-key-here"

    # Connect to a remote MCP server using OpenAI and the streamable_http transport.
    # NOTE: OpenAI only supports remote MCP endpoints (not local/stdio/localhost).
    # See: https://platform.openai.com/docs/guides/tools-remote-mcp
    client = MCPClient(llm_provider="openai")
    await client.connect("https://mcp.deepwiki.com/mcp", transport="streamable_http")

    # Send a message and get a response
    response = await client.query("give me list of tools provided")
    print(f"AI: {response}")

    # Disconnect from the server
    await client.disconnect()

if __name__ == "__main__":
    asyncio.run(main())
```