# Migrate from LangChain

Move a LangChain or LangGraph project into Connic. The `connic migrate` CLI handles the initial conversion; afterwards, you clean up the LangGraph-specific patterns that need manual attention.
## Concept mapping
LangChain keeps agent definitions, tools, and orchestration in Python. Connic separates them into YAML configuration and standalone Python tool modules. The table below shows how the main LangChain concepts translate.
| LangChain / LangGraph | Connic | Notes |
|---|---|---|
| `create_agent()` / `create_react_agent()` | YAML file in `agents/` with `type: llm` | Both signatures are detected |
| `system_prompt` / `prompt` | `system_prompt` | Extracted from keyword arguments |
| `ChatOpenAI`, `init_chat_model`, etc. | `model` with provider prefix | `gpt-4.1` becomes `openai/gpt-4.1` |
| `@tool` functions / plain callables | Python functions in `tools/` | Decorator is removed; the function body is kept |
| `tools=[...]` | `tools:` list in agent YAML | Resolved to `module.function` references |
| LangGraph `StateGraph` / workflows | Sequential agents or custom tool logic | Requires manual restructuring |
| Checkpointers / stores | Connic sessions, knowledge base, or database | Requires manual redesign |
| LangSmith tracing | Connic observability (built in) | No migration needed; remove LangSmith integration code |
| Retrieval chains / RAG | Knowledge tools or custom tools | Restructure as a Connic tool or use the built-in knowledge base |
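As a rough sketch of what the `StateGraph` row can look like after restructuring: a linear two-node graph becomes a pair of agents run in sequence. The `type: sequential` and `agents:` fields below are assumptions for illustration only — the fields shown elsewhere on this page are `version`, `name`, `type`, `model`, `system_prompt`, and `tools` — so check the Connic docs for the actual sequential-agent schema.

```yaml
# Hypothetical sketch — `type: sequential` and `agents:` are assumed
# field names, not the documented schema.
version: "1.0"
name: research_pipeline
type: sequential
agents:
  - researcher   # was the first StateGraph node
  - summarizer   # was the second StateGraph node
```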
## Complex LangGraph projects
If your project is built primarily around LangGraph state graphs, handoffs, or heavily customized retrieval chains, `connic migrate` will extract the agent and tool definitions it finds, but the graph orchestration itself will need manual restructuring. You can use a coding agent (Cursor, Windsurf, Claude Code, Codex, etc.) to handle the full migration: run `connic migrate` first for the scaffold, then let the coding agent finish the cleanup using the Connic docs as context.
### Example prompt for a coding agent

```text
You are migrating a Python LangChain or LangGraph project to Connic.

1. Inspect the existing project at ./my-langchain-project and explain how agents, tools, prompts, retrieval, memory, and orchestration are structured.
2. Run `connic migrate --source ./my-langchain-project --dest ./my-connic-project`.
3. Review the generated Connic project and fix any issues listed in MIGRATION_REPORT.md.
4. Run `connic lint` inside the migrated project and resolve any errors.
5. Summarize what migrated cleanly and what still needs manual work.

Prefer Connic conventions: YAML agents in agents/, Python tools in tools/, middleware/ for hooks, schemas/ for structured output.
Use the Connic docs at https://connic.co/docs/v1 as a reference.
```

## Run the migration
Install the SDK and run `connic migrate`. It will prompt you for the path to your existing LangChain project and a destination for the new Connic project.

```shell
pip install connic-composer-sdk
connic migrate
```

You can also pass both paths directly:

```shell
connic migrate --source ./my-langchain-project --dest ./my-connic-project
```

The migrator scans every Python file for `create_agent` and `create_react_agent` calls, extracts tools, resolves model names and prompts (including across imports), generates the Connic project, and runs `connic lint` on the result.
## Review the generated project

The migrator creates the following structure. Start by reading `MIGRATION_REPORT.md`, which lists every agent that was migrated, what tools were resolved, and any items that need manual work.
```text
my-connic-project/
├── agents/
│   └── agent.yaml
├── tools/
│   └── tools.py            # Extracted tool functions
├── middleware/
├── schemas/
├── requirements.txt
├── README.md
└── MIGRATION_REPORT.md     # Review this first
```

## Understand the output
Below are examples of how typical LangChain definitions translate to Connic.
### Agent definition

**LangChain (before):**

```python
from langchain.agents import create_agent
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4.1")
agent = create_agent(
    llm,
    tools=[search_docs, fetch_weather],
    system_prompt="You are a helpful research assistant.",
)
```

**Connic (after):**

```yaml
version: "1.0"
name: agent
type: llm
model: openai/gpt-4.1
description: "You are a helpful research assistant."
system_prompt: |
  You are a helpful research assistant.
tools:
  - tools.search_docs
  - tools.fetch_weather
```

### Tool function
**LangChain (before):**

```python
import requests
from langchain_core.tools import tool

@tool
def fetch_weather(city: str) -> str:
    """Return the current weather for a city."""
    return requests.get(f"https://api.weather.example/{city}").text
```

**Connic (after):**

```python
import requests

def fetch_weather(city: str) -> str:
    """Return the current weather for a city."""
    return requests.get(f"https://api.weather.example/{city}").text
```

Connic uses the function signature and docstring directly, so the `@tool` decorator is not needed. The migrator extracts the function body along with its imports and dependencies.
## Clean up

### Migrates automatically

- `create_agent()` and `create_react_agent()` calls
- Plain Python tools and `@tool`-decorated functions
- Static `system_prompt` / `prompt` keyword arguments
- Model names from `ChatOpenAI`, `init_chat_model`, and similar
- Cross-file imports (variables and functions are resolved)
- Tool dependencies (helper functions and local modules)
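To illustrate cross-file resolution (the module and variable names here are hypothetical): a prompt defined as a static string constant in another file is still recoverable, because the migrator can follow the import back to the literal and inline its value into the agent YAML.

```python
# Hypothetical illustration of cross-file resolution.
# prompts.py — the literal lives in a separate module:
RESEARCH_PROMPT = "You are a helpful research assistant."

# main.py would then contain:
#   from prompts import RESEARCH_PROMPT
#   agent = create_agent(llm, tools=[search_docs], system_prompt=RESEARCH_PROMPT)
#
# Because RESEARCH_PROMPT is a static string literal, the migrator can
# follow the import and inline its value as system_prompt in the YAML.
```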
### Needs manual work

- LangGraph `StateGraph` workflows and handoffs
- Dynamic prompt construction (factories, templates with runtime values)
- Checkpointers and custom persistence stores
- Retrieval chains and RAG pipelines
- LangSmith integration code (tracing, evaluations, prompt registries)
- Custom agent wrappers or subclassed agents
- Agents that are not assigned to a top-level variable
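To make the dynamic-prompt case concrete (the function and argument names below are invented): a prompt built by a factory has no literal string for static analysis to extract, so the migrator leaves it to you. One manual fix is to evaluate the factory once with the values you actually use and paste the result into `system_prompt` in the agent YAML.

```python
# Hypothetical "before" pattern the migrator cannot resolve statically:
# the prompt only exists at runtime, so there is no literal to extract.

def build_prompt(domain: str, tone: str) -> str:
    """Prompt factory — invisible to static analysis."""
    return f"You are a {tone} assistant specializing in {domain}."

# Manual fix: evaluate once with your real values, then paste the
# result into system_prompt in the agent YAML.
print(build_prompt("tax law", "formal"))
```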
## Checklist after migration

- Read `MIGRATION_REPORT.md` and address every follow-up item.
- Open each agent YAML in `agents/` and verify the system prompt and tool list.
- Check that tool modules in `tools/` still import everything they need (relative imports from the original project may break).
- Remove LangSmith, LangServe, or other LangChain platform code that is no longer needed.
- Restructure LangGraph workflows into sequential agents or custom tools.
- Replace checkpointer-based persistence with Connic sessions or the knowledge base.
- Run `connic lint` after each cleanup pass.
- Run `connic test` to verify agents work end-to-end.