LangGraph is the primary framework integration in Idun Agent Platform. It supports full AG-UI streaming, CopilotKit, and persistent checkpointing through in-memory, SQLite, or PostgreSQL backends.
Want to start from working code? The agent templates include 7 LangGraph examples covering tool calling, structured I/O, and multi-step workflows.

Create a LangGraph agent

The fastest path. Create the agent in the Manager, then connect your code.
1

Open the agent wizard

From the Agent Dashboard, click Create an agent. In step 1, enter a name and select LangGraph.

(Screenshot: Agent creation wizard with LangGraph selected)
2

Configure the framework

In step 2, fill in:
  • Graph Definition: path to your graph file and variable, e.g. ./agent/graph.py:graph
  • Agent Host: Localhost (for local development) or Remote
  • Server Port: the port your agent will listen on

(Screenshot: LangGraph framework configuration step)
The graph definition format is file_path:variable_name. The file path is relative to where you run the agent. The variable should be a StateGraph (the engine compiles it with the checkpointer). A CompiledStateGraph is also accepted but will be recompiled.
3

Enroll your agent

Step 3 gives you the enrollment instructions. In your agent’s project directory:
pip install idun-agent-engine langgraph langchain-openai
export IDUN_MANAGER_HOST=http://localhost:8000
export IDUN_AGENT_API_KEY=<your-api-key>
idun agent serve --source manager
Get the API key from the API Integration tab on your agent’s detail page. The agent connects to the Manager, fetches its config, and starts.
The Manager sets in-memory checkpointing by default. To switch to SQLite or PostgreSQL, see Memory.

Write your agent code

Create a file at the path matching your graph_definition:
my_agent/agent.py
from typing import Annotated, TypedDict

from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    messages: Annotated[list, add_messages]


llm = ChatOpenAI(model="gpt-4o-mini")


def chatbot(state: State):
    return {"messages": [llm.invoke(state["messages"])]}


graph = StateGraph(State)
graph.add_node("chatbot", chatbot)
graph.add_edge(START, "chatbot")
graph.add_edge("chatbot", END)

# Exporting the uncompiled StateGraph is recommended.
# The engine compiles it with the configured checkpointer.
Exporting an uncompiled StateGraph is recommended. If you export a CompiledStateGraph (the result of .compile()), the engine will extract the original StateGraph via .builder and recompile it with the engine-managed checkpointer and store. Compile options like interrupt_before and interrupt_after are preserved. A warning is logged when this happens.

The graph_definition field

The value follows the format file_path:variable_name:
  • File path: relative path to the Python file (e.g., my_agent/agent.py)
  • Variable name: the StateGraph variable in that file (e.g., graph)
The engine tries the file path first, then falls back to Python module import notation.
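One way this two-step lookup could work is sketched below. This is illustrative only; load_graph is a hypothetical helper, and the engine's actual loader may differ in details:

```python
import importlib
import importlib.util
from pathlib import Path


def load_graph(graph_definition: str):
    """Resolve a 'file_path:variable_name' string to the graph object.

    Hypothetical sketch -- not the engine's actual implementation.
    """
    path_part, _, var_name = graph_definition.rpartition(":")
    candidate = Path(path_part)
    if candidate.is_file():
        # First attempt: load the module directly from the file path.
        spec = importlib.util.spec_from_file_location(candidate.stem, candidate)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
    else:
        # Fallback: treat the left part as module import notation,
        # e.g. "my_agent.agent:graph".
        module = importlib.import_module(path_part)
    return getattr(module, var_name)
```

With this resolution order, both "my_agent/agent.py:graph" and "my_agent.agent:graph" would point at the same variable.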

Checkpointing

LangGraph agents can persist conversation state across restarts using checkpointers. The Manager sets in-memory checkpointing by default. You can switch to SQLite or PostgreSQL for durability. See Memory and checkpointing for LangGraph for backend options and configuration.

Next steps

Last modified on April 15, 2026