Connect LangChain & LangGraph
`from canopy_ai.langchain import to_langchain_tools` returns ready-to-use `StructuredTool` instances. Pass them straight to LangGraph's `create_react_agent` or any LangChain agent that accepts a tool list.
Run `npx @canopy-ai/sdk connect` in your project root. It opens a consent page in your browser, then writes credentials to `~/.config/canopy/credentials` and merges a `canopy` MCP server entry into the config of any installed Claude Code, Cursor, Claude Desktop, Windsurf, Cline, VS Code, or Zed. If you take this route, skip Steps 2 and 4 below.

Step 1 — Connect your agent in the dashboard
Canopy is bring-your-own-agent. This step doesn't create the agent itself — you've already built that, or are about to. It registers a Canopy-side record that pairs your agent with a spending policy and gives you an `agt_…` ID to use in your code.
Sign in at trycanopy.ai and go to Agents → Connect agent. Give the agent a name and pick (or create) a policy. The policy controls the spend cap, recipient allowlist, and approval threshold every payment from this agent will be evaluated against.
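Conceptually, every payment is checked against those three policy controls in order. The sketch below is purely illustrative: the `Policy` dataclass and `evaluate` function are hypothetical stand-ins, not the real Canopy schema or evaluation logic.

```python
from dataclasses import dataclass, field

# Hypothetical, illustrative policy shape -- not the actual Canopy schema.
@dataclass
class Policy:
    spend_cap_cents: int                    # hard ceiling per payment
    recipient_allowlist: set = field(default_factory=set)
    approval_threshold_cents: int = 0       # above this, a human must approve

def evaluate(policy: Policy, recipient: str, amount_cents: int) -> str:
    """Return 'denied', 'pending_approval', or 'approved'."""
    if recipient not in policy.recipient_allowlist:
        return "denied"                     # recipient not on the allowlist
    if amount_cents > policy.spend_cap_cents:
        return "denied"                     # over the spend cap
    if amount_cents > policy.approval_threshold_cents:
        return "pending_approval"           # allowed, but needs a human sign-off
    return "approved"

policy = Policy(
    spend_cap_cents=500,
    recipient_allowlist={"0x1234..."},
    approval_threshold_cents=100,
)
print(evaluate(policy, "0x1234...", 10))    # -> approved
print(evaluate(policy, "0x1234...", 250))   # -> pending_approval
print(evaluate(policy, "0xdead...", 10))    # -> denied
```

The three outcomes mirror the succeed/pend/deny results your agent sees at payment time.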
Step 2 — Copy your credentials
You need two values in your code:

- Org API key (`ak_live_…` or `ak_test_…`) — from Settings → API Keys. Copy it the moment you create it; the plaintext is shown only once.
- Agent ID (`agt_…`) — from the agent's detail page in /dashboard/agents.
Step 3 — Install the package

```shell
pip install 'canopy-ai[langchain]' langgraph langchain-openai
```

Step 4 — Set your environment variables
```shell
CANOPY_API_KEY=ak_live_xxxxxxxxxxxxxxxx
CANOPY_AGENT_ID=agt_xxxxxxxx
```

Use a `.env` file locally and your platform's secret manager in production. Never commit credentials.
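For local development you might use the `python-dotenv` package; the stdlib-only sketch below shows the same idea, loading `KEY=VALUE` lines into the process environment before the agent starts (the demo writes a throwaway `.env` so the snippet is self-contained):

```python
import os
import tempfile

def load_env_file(path: str) -> None:
    """Minimal .env loader: KEY=VALUE lines, '#' comments; no quoting rules."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Don't clobber variables already set by the real environment.
            os.environ.setdefault(key.strip(), value.strip())

# Demo: write a throwaway .env and load it.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("CANOPY_API_KEY=ak_live_xxxxxxxxxxxxxxxx\n")
    f.write("CANOPY_AGENT_ID=agt_xxxxxxxx\n")
load_env_file(f.name)

print(os.environ["CANOPY_AGENT_ID"])  # -> agt_xxxxxxxx
```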
Step 5 — Connect in your agent code

Paste the snippet below into your existing LangGraph agent.

```python
# 1. Add to your .env:
# CANOPY_API_KEY=ak_live_xxxxxxxxxxxxxxxx
# CANOPY_AGENT_ID=agt_xxxxxxxx

# 2. In your agent code:
import os

from canopy_ai import Canopy
from canopy_ai.langchain import to_langchain_tools
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

canopy = Canopy(
    api_key=os.environ["CANOPY_API_KEY"],
    agent_id=os.environ["CANOPY_AGENT_ID"],
)

agent = create_react_agent(
    ChatOpenAI(model="gpt-4o"),
    tools=to_langchain_tools(canopy),
)

agent.invoke({
    "messages": [{"role": "user", "content": "Send 10 cents to 0x1234..."}],
})
```

Step 6 — Verify the connection
Run your agent once. As soon as Canopy receives a request from it, the dashboard flips the agent to connected and shows the first event captured. If nothing happens after a minute, see Troubleshooting.
Install

`canopy_ai.langchain` requires the optional dependency `langchain-core`. Install it with `pip install 'canopy-ai[langchain]'`.
Async LangGraph
For async workflows (FastAPI, asyncio agent loops), use `AsyncCanopy` instead — `to_langchain_tools(async_canopy)` detects the async executors and binds the `coroutine=` argument on each `StructuredTool`.
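This follows the usual LangChain convention that a `StructuredTool` can carry a sync `func`, an async `coroutine`, or both. The stdlib-only sketch below illustrates how such detection can work; `SyncClient`, `AsyncClient`, and `wrap_executor` are hypothetical stand-ins for the SDK's internals, not its actual code:

```python
import asyncio
import inspect

class SyncClient:
    def send_payment(self, to: str, cents: int) -> str:
        return f"sent {cents} to {to}"

class AsyncClient:
    async def send_payment(self, to: str, cents: int) -> str:
        return f"sent {cents} to {to}"

def wrap_executor(fn):
    """Bind fn as a sync func or an async coroutine, as a tool factory might."""
    if inspect.iscoroutinefunction(fn):
        return {"func": None, "coroutine": fn}
    return {"func": fn, "coroutine": None}

sync_tool = wrap_executor(SyncClient().send_payment)
async_tool = wrap_executor(AsyncClient().send_payment)

print(sync_tool["func"]("0x1234...", 10))                     # called directly
print(asyncio.run(async_tool["coroutine"]("0x1234...", 10)))  # awaited
```

With the async variant bound to `coroutine=`, LangGraph can drive the tool via `await agent.ainvoke(...)` without blocking the event loop.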
Where to go next
- Payment outcomes — what the agent gets back when a payment succeeds, pends, or is denied
- Python SDK reference — `to_langchain_tools` and other helpers