Context
You want to integrate JIRA, Slack, and Confluence Model Context Protocol (MCP) servers with LangChain to enable AI agents to interact with these services using natural language. This integration allows you to streamline workflows and increase productivity by letting AI handle technical execution instead of manually navigating interfaces.
Answer
Yes, LangChain supports MCP integration. Here's how to connect each service:
General MCP Integration with LangChain
LangChain supports MCP through the langchain-mcp-adapters library. You can integrate existing MCP servers using the following code:
```python
from langchain_mcp_adapters.client import MultiServerMCPClient

# Register one or more MCP servers, each with a transport and endpoint.
client = MultiServerMCPClient(
    {
        "your-mcp-server": {
            "transport": "streamable_http",
            "url": "http://your-mcp-server-url/mcp",
        }
    }
)

# Returns LangChain-compatible tools exposed by the registered servers.
tools = await client.get_tools()
```

The returned tools can then be passed to any LangChain or LangGraph agent.

Slack Integration
For Slack, you have two options:
LangSmith Agent Builder Slack App (recommended): a prebuilt integration that can:
- Send DMs and post to channels
- Read threads
- Work without any custom MCP setup
- Keep Slack data private (not used for training)
Custom MCP Integration: If you have an existing Slack MCP server, use the general MCP integration method above.
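For the custom route, a Slack MCP server is simply another entry in the `MultiServerMCPClient` configuration. Here is a minimal sketch of assembling that configuration; the server name, URL, and token are placeholders, not a real Slack MCP endpoint:

```python
# Hypothetical Slack MCP server entry; the name, URL, and token are placeholders.
slack_server = {
    "slack": {
        "transport": "streamable_http",
        "url": "http://localhost:8002/mcp",
        "headers": {"Authorization": "Bearer YOUR_SLACK_MCP_TOKEN"},
    }
}

def build_client_config(*entries: dict) -> dict:
    """Merge per-service entries into one MultiServerMCPClient config dict."""
    config: dict = {}
    for entry in entries:
        config.update(entry)
    return config

# The merged dict is what you would pass to MultiServerMCPClient(...).
config = build_client_config(slack_server)
```

Merging per-service entries this way lets you keep each service's connection details in its own module and combine them at startup.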
Confluence Integration
For Confluence, you can use LangChain's built-in Confluence document loader, or connect via MCP if you have a Confluence MCP server.
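For the built-in route, a minimal sketch using `ConfluenceLoader` from `langchain-community` looks like this. The URL, credentials, and space key are placeholders, and exact parameter names vary slightly between library versions:

```python
def load_confluence_space(space_key: str):
    """Fetch pages from a Confluence space as LangChain documents."""
    # Requires `langchain-community` (and `atlassian-python-api`); imported
    # lazily so the sketch reads standalone.
    from langchain_community.document_loaders import ConfluenceLoader

    loader = ConfluenceLoader(
        url="https://your-domain.atlassian.net/wiki",  # placeholder
        username="you@example.com",                    # placeholder
        api_key="YOUR_API_TOKEN",                      # placeholder
        space_key=space_key,
        limit=50,  # pages fetched per request
    )
    return loader.load()
```

The returned documents can be indexed into a vector store or passed to an agent as context.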
JIRA Integration
For JIRA, you can connect through an MCP server. If the instance is self-hosted on a private network, ensure the MCP server has network access to it and that authentication is configured.
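Putting the pieces together, here is a hedged end-to-end sketch for a self-hosted JIRA instance. The server URL, token, and model id are assumptions, and the third-party imports are kept inside the function so the sketch reads standalone:

```python
# Hypothetical config for a self-hosted JIRA MCP server on a private network.
JIRA_SERVERS = {
    "jira": {
        "transport": "streamable_http",
        "url": "https://jira.internal.example.com/mcp",       # placeholder URL
        "headers": {"Authorization": "Bearer YOUR_API_TOKEN"},  # placeholder token
    }
}

async def ask_jira(prompt: str):
    """Load JIRA tools over MCP and hand them to a ReAct-style agent."""
    # Requires `langchain-mcp-adapters` and `langgraph`; imported lazily so the
    # config above can be inspected without those packages installed.
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langgraph.prebuilt import create_react_agent

    client = MultiServerMCPClient(JIRA_SERVERS)
    tools = await client.get_tools()
    agent = create_react_agent("openai:gpt-4o", tools)  # model id is a placeholder
    return await agent.ainvoke(
        {"messages": [{"role": "user", "content": prompt}]}
    )
```

The `headers` entry is how the bearer token for a private instance reaches the MCP server on every request.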
Example MCP Configuration
Here's an example configuration for connecting to an MCP server with authentication:
```json
{
  "mcpServers": {
    "your-service": {
      "url": "https://your-mcp-server.com/mcp/sse",
      "transport": "sse",
      "headers": {
        "Authorization": "Bearer YOUR_API_TOKEN"
      }
    }
  }
}
```

Deployment on LangSmith
If you're deploying agents on LangSmith, refer to the guide on how to deploy an MCP server on LangSmith deployments.