
Migrate from LangChain

Move a LangChain or LangGraph project into Connic. The connic migrate CLI handles the initial conversion, then you clean up LangGraph-specific patterns that need manual attention.

Concept mapping

LangChain keeps agent definitions, tools, and orchestration in Python. Connic separates them into YAML configuration and standalone Python tool modules. The table below shows how the main LangChain concepts translate.

LangChain / LangGraph → Connic (notes)

create_agent() / create_react_agent() → YAML file in agents/ with type: llm (both signatures are detected)
system_prompt / prompt → system_prompt (extracted from keyword arguments)
ChatOpenAI, init_chat_model, etc. → model with a provider prefix (gpt-4.1 becomes openai/gpt-4.1)
@tool functions / plain callables → Python functions in tools/ (the decorator is removed; the function body is kept)
tools=[...] → tools: list in agent YAML (resolved to module.function references)
LangGraph StateGraph / workflows → sequential agents or custom tool logic (requires manual restructuring)
Checkpointers / stores → Connic sessions, knowledge base, or database (requires manual redesign)
LangSmith tracing → built-in Connic observability (no migration needed; remove LangSmith integration code)
Retrieval chains / RAG → knowledge tools or custom tools (restructure as a Connic tool or use the built-in knowledge base)

Complex LangGraph projects

If your project is built primarily around LangGraph state graphs, handoffs, or heavily customized retrieval chains, connic migrate will extract the agent and tool definitions it finds, but the graph orchestration itself will need manual restructuring. You can use a coding agent (Cursor, Windsurf, Claude Code, Codex, etc.) to handle the full migration. Run connic migrate first for the scaffold, then let the coding agent finish the cleanup using the Connic docs as context.
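
As a sketch of what that manual restructuring can look like, a small two-node state graph often collapses into a plain tool that runs its former node functions in order. The helper functions below are hypothetical stand-ins for your graph's node logic, not anything connic migrate produces:

```python
# A sketch of one manual restructuring path: a two-node LangGraph
# pipeline (retrieve -> summarize) collapsed into a single Connic tool
# that runs the former node functions sequentially. The helpers are
# hypothetical stand-ins for your graph's node logic.

def retrieve_docs(query: str) -> list[str]:
    """Former 'retrieve' node: look up matching documents."""
    corpus = {"weather": ["Forecasts use pressure data."]}
    return corpus.get(query, [])

def summarize(docs: list[str]) -> str:
    """Former 'summarize' node: reduce documents to one string."""
    return " ".join(docs) if docs else "No documents found."

def research(query: str) -> str:
    """Connic tool replacing the StateGraph: nodes run in order."""
    return summarize(retrieve_docs(query))
```

Conditional edges and loops need more thought, but linear graphs usually reduce to exactly this kind of sequential composition.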

Example prompt for a coding agent
prompt.txt
You are migrating a Python LangChain or LangGraph project to Connic.

1. Inspect the existing project at ./my-langchain-project and explain how agents, tools, prompts, retrieval, memory, and orchestration are structured.
2. Run `connic migrate --source ./my-langchain-project --dest ./my-connic-project`.
3. Review the generated Connic project and fix any issues listed in MIGRATION_REPORT.md.
4. Run `connic lint` inside the migrated project and resolve any errors.
5. Summarize what migrated cleanly and what still needs manual work.

Prefer Connic conventions: YAML agents in agents/, Python tools in tools/, middleware/ for hooks, schemas/ for structured output.
Use the Connic docs at https://connic.co/docs/v1 as a reference.

1. Run the migration

Install the SDK and run connic migrate. It will prompt you for the path to your existing LangChain project and a destination for the new Connic project.

terminal
pip install connic-composer-sdk

connic migrate

You can also pass both paths directly:

terminal
connic migrate --source ./my-langchain-project --dest ./my-connic-project

The migrator scans every Python file for create_agent and create_react_agent calls, extracts tools, resolves model names and prompts (including across imports), generates the Connic project, and runs connic lint on the result.
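
For intuition, the call-detection step can be approximated with the standard library's ast module. This is an illustrative sketch, not the migrator's actual implementation:

```python
# Illustrative sketch: locate create_agent / create_react_agent calls
# in Python source, roughly as a migrator's scanning pass might.
import ast

AGENT_CALLS = {"create_agent", "create_react_agent"}

def find_agent_calls(source: str) -> list[str]:
    """Return the constructor names of agent calls found in the source."""
    names = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            func = node.func
            # Handle bare names and attribute access like module.create_agent.
            name = func.id if isinstance(func, ast.Name) else getattr(func, "attr", None)
            if name in AGENT_CALLS:
                names.append(name)
    return names
```

For example, find_agent_calls("agent = create_agent(llm, tools=[])") returns ["create_agent"]. The real migrator additionally resolves the model, prompt, and tool arguments across imports.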

2. Review the generated project

The migrator creates the following structure. Start by reading MIGRATION_REPORT.md, which lists every agent that was migrated, what tools were resolved, and any items that need manual work.

project-structure
my-connic-project/
├── agents/
│   └── agent.yaml
├── tools/
│   └── tools.py              # Extracted tool functions
├── middleware/
├── schemas/
├── requirements.txt
├── README.md
└── MIGRATION_REPORT.md       # Review this first

3. Understand the output

Below are examples of how typical LangChain definitions translate to Connic.

Agent definition

LangChain (before)

agent.py
from langchain.agents import create_agent
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4.1")

agent = create_agent(
    llm,
    tools=[search_docs, fetch_weather],
    system_prompt="You are a helpful research assistant.",
)

Connic (after)

agents/agent.yaml
version: "1.0"
name: agent
type: llm
model: openai/gpt-4.1
description: "You are a helpful research assistant."
system_prompt: |
  You are a helpful research assistant.
tools:
  - tools.search_docs
  - tools.fetch_weather

Tool function

LangChain (before)

tools.py
import requests
from langchain_core.tools import tool

@tool
def fetch_weather(city: str) -> str:
    """Return the current weather for a city."""
    return requests.get(f"https://api.weather.example/{city}").text

Connic (after)

tools/fetch_weather.py
import requests

def fetch_weather(city: str) -> str:
    """Return the current weather for a city."""
    return requests.get(f"https://api.weather.example/{city}").text

Connic uses the function signature and docstring directly, so the @tool decorator is not needed. The migrator extracts the function body along with its imports and dependencies.
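
Because the signature and docstring become the tool's interface, precise type hints and argument docs pay off. A small self-contained illustration (this function is hypothetical, not migrator output):

```python
# Hypothetical Connic tool: the type hints and docstring are what the
# platform reads, so keep them accurate. No decorator is required.
def convert_temperature(value: float, unit: str = "celsius") -> float:
    """Convert a temperature to Fahrenheit.

    Args:
        value: The temperature to convert.
        unit: Either "celsius" or "kelvin".
    """
    if unit == "kelvin":
        value -= 273.15
    return value * 9 / 5 + 32
```

A plain function like this needs no framework imports at all, which is why stripped @tool decorators migrate cleanly.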

4. Clean up

Migrates automatically

- create_agent() and create_react_agent() calls
- Plain Python tools and @tool decorated functions
- Static system_prompt / prompt keyword arguments
- Model names from ChatOpenAI, init_chat_model, and similar
- Cross-file imports (variables and functions are resolved)
- Tool dependencies (helper functions and local modules)

Needs manual work

- LangGraph StateGraph workflows and handoffs
- Dynamic prompt construction (factories, templates with runtime values)
- Checkpointers and custom persistence stores
- Retrieval chains and RAG pipelines
- LangSmith integration code (tracing, evaluations, prompt registries)
- Custom agent wrappers or subclassed agents
- Agents that are not assigned to a top-level variable

Checklist after migration

  1. Read MIGRATION_REPORT.md and address every follow-up item.
  2. Open each agent YAML in agents/ and verify the system prompt and tool list.
  3. Check that tool modules in tools/ still import everything they need (relative imports from the original project may break).
  4. Remove LangSmith, LangServe, or other LangChain platform code that is no longer needed.
  5. Restructure LangGraph workflows into sequential agents or custom tools.
  6. Replace checkpointer-based persistence with Connic sessions or the knowledge base.
  7. Run connic lint after each cleanup pass.
  8. Run connic test to verify agents work end-to-end.