LangGraph is the primary framework integration in Idun Agent Platform. It supports full AG-UI streaming, CopilotKit, and persistent checkpointing through in-memory, SQLite, or PostgreSQL backends.
Want to start from working code? The agent templates include 7 LangGraph examples covering tool calling, structured I/O, and multi-step workflows.
Create a LangGraph agent
The fastest path: create the agent in the Manager, then connect your code.

Open the agent wizard
From the Agent Dashboard, click Create an agent. In step 1, enter a name and select LangGraph.
Configure the framework
In step 2, fill in:
- Graph Definition: path to your graph file and variable, e.g. `./agent/graph.py:graph`
- Agent Host: Localhost (for local development) or Remote
- Server Port: the port your agent will listen on
The graph definition format is `file_path:variable_name`. The file path is relative to where you run the agent. The variable must be an uncompiled StateGraph (the engine compiles it with the checkpointer).

Enroll your agent
Step 3 gives you the enrollment instructions. In your agent's project directory:

```shell
pip install idun-agent-engine langgraph langchain-openai
export IDUN_MANAGER_HOST=http://localhost:8000
export IDUN_AGENT_API_KEY=<your-api-key>
idun agent serve --source manager
```
Get the API key from the API Integration tab on your agent's detail page. The agent connects to the Manager, fetches its config, and starts. The Manager sets in-memory checkpointing by default. To switch to SQLite or PostgreSQL, see Memory.

Configure from a file

Alternatively, you can run the agent from a local YAML config instead of fetching it from the Manager. Set agent.type to LANGGRAPH and provide the graph_definition field:

```yaml
server:
  api:
    port: 8008
agent:
  type: LANGGRAPH
  config:
    name: "my-langgraph-agent"
    graph_definition: "my_agent/agent.py:graph"
    checkpointer:
      type: memory
```
Then run:

```shell
pip install idun-agent-engine langgraph langchain-openai
idun agent serve --source file --path config.yaml
```
Write your agent code
Create a file at the path matching your graph_definition:

```python
from typing import Annotated, TypedDict

from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    messages: Annotated[list, add_messages]


llm = ChatOpenAI(model="gpt-4o-mini")


def chatbot(state: State):
    return {"messages": [llm.invoke(state["messages"])]}


graph = StateGraph(State)
graph.add_node("chatbot", chatbot)
graph.add_edge(START, "chatbot")
graph.add_edge("chatbot", END)

# Do NOT call graph.compile() here.
# Export the uncompiled StateGraph.
```
You must export an uncompiled StateGraph. The engine compiles the graph internally to attach the checkpointer and platform features. Exporting a CompiledStateGraph (the result of .compile()) will raise an error at startup.
The graph_definition field
The value follows the format `file_path:variable_name`:

- File path: relative path to the Python file (e.g., `my_agent/agent.py`)
- Variable name: the StateGraph variable in that file (e.g., `graph`)
The engine tries the file path first, then falls back to Python module import notation.
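As an illustration of that resolution order, a loader along these lines (a hypothetical sketch, not the engine's actual code) would try the filesystem path first and fall back to dotted module import notation:

```python
import importlib
import importlib.util
from pathlib import Path


def load_graph(graph_definition: str):
    """Resolve a 'file_path:variable_name' spec.

    Illustrative only: tries the path as a file on disk first,
    then falls back to importing it as a Python module.
    """
    target, _, var_name = graph_definition.rpartition(":")
    path = Path(target)
    if path.is_file():
        # Load the module directly from its file path.
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
    else:
        # Fall back to module notation, e.g. "my_agent.agent:graph".
        module = importlib.import_module(target)
    return getattr(module, var_name)
```

Either way, the named variable is looked up in the loaded module, which is why it must be defined at module top level.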
Checkpointing
LangGraph agents can persist conversation state across restarts using checkpointers. The Manager sets in-memory checkpointing by default. You can switch to SQLite or PostgreSQL for durability. See Memory and checkpointing for LangGraph for backend options and configuration.
Next steps