Integrations overview | LangChain Reference
Welcome! These pages include reference documentation for all langchain-* Python integration packages.
To learn more about integrations in LangChain, visit the Integrations overview.
Model Context Protocol (MCP)
LangChain supports the Model Context Protocol (MCP). This lets external tools work with LangChain and LangGraph applications through a standard interface.
To use MCP tools in your project, see langchain-mcp-adapters.
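As a minimal sketch of what this looks like, the example below uses the MultiServerMCPClient class from langchain-mcp-adapters to load tools from a single MCP server over stdio. The server name, script path, and configuration values are placeholders, and the client API may vary between package versions.

```python
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient


async def main() -> None:
    # Connect to a single MCP server over stdio. The server name and the
    # "math_server.py" script are illustrative placeholders.
    client = MultiServerMCPClient(
        {
            "math": {
                "command": "python",
                "args": ["math_server.py"],
                "transport": "stdio",
            }
        }
    )
    # Load the server's tools as LangChain tool objects, ready to pass to
    # an agent or to a chat model via bind_tools().
    tools = await client.get_tools()
    print([tool.name for tool in tools])


asyncio.run(main())
```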
Popular providers
* langchain-openai
Interact with OpenAI (Completions, Responses) and OpenAI-compatible APIs.
Reference
* langchain-anthropic
Interact with Claude (Anthropic) APIs.
Reference
* langchain-google-genai
Access Google Gemini models via the Google Gen AI SDK.
Reference
* langchain-aws
Use integrations for the AWS platform, including Bedrock, S3, and more.
Reference
* langchain-huggingface
Access Hugging Face-hosted models in LangChain.
Reference
* langchain-groq
Interface to Groq Cloud.
Reference
* langchain-ollama
Use locally hosted models via Ollama; a brief usage sketch for these provider packages follows the list.
Reference
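As a rough illustration of how these provider packages are used, the sketch below instantiates chat models from three of them. The model names are examples only, and each provider needs its own setup (an OPENAI_API_KEY or ANTHROPIC_API_KEY environment variable, or a locally running Ollama server).

```python
# Each provider package exposes chat model classes that share the same
# Runnable interface (invoke, stream, batch, ...).
from langchain_anthropic import ChatAnthropic
from langchain_ollama import ChatOllama
from langchain_openai import ChatOpenAI

# Model names are examples only; check each provider's docs for current models.
gpt = ChatOpenAI(model="gpt-4o-mini")                     # needs OPENAI_API_KEY
claude = ChatAnthropic(model="claude-3-5-sonnet-latest")  # needs ANTHROPIC_API_KEY
local = ChatOllama(model="llama3.1")                      # needs a running Ollama server

print(gpt.invoke("Say hello in one word.").content)
```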
Other providers, including langchain-community, are listed in the section navigation (left sidebar).
"I don't see the integration I'm looking for"
LangChain has hundreds of integrations, but not all of them are documented on this site. If you don't see the integration you're looking for, refer to its provider page in the LangChain docs. Many community-maintained integrations are also available in the langchain-community package.
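For example, community integrations are imported from the langchain_community namespace. The sketch below assumes the WebBaseLoader document loader, which also requires the beautifulsoup4 package.

```python
# Community-maintained integrations are imported from the
# langchain_community namespace. WebBaseLoader is one example; it also
# requires the beautifulsoup4 package to be installed.
from langchain_community.document_loaders import WebBaseLoader

loader = WebBaseLoader("https://example.com")
docs = loader.load()
print(docs[0].metadata)
```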
Create new integrations
For information on contributing new integrations, see the guide.