InitRunner

Import from LangChain

Already have a LangChain agent? InitRunner can convert it into a native role file automatically — mapping your model, system prompt, and tools to InitRunner's YAML format. Custom @tool functions are extracted into a sidecar Python module so they keep working without changes.

You can import via the dashboard (paste your code) or the CLI (point at a file).

Dashboard Import

1. Start a new agent and select Import

Open the dashboard, go to Agents → New Agent, type a name, and select Import under "Start From". Paste your LangChain Python code into the source editor. Choose the model that will power the conversion (this is the builder model, not your agent's model — the agent's model is read from your code).

Paste your LangChain source code and select a builder model

2. Review the converted agent

InitRunner parses your code and generates a complete role definition. Review the YAML — your model, system prompt, and tools are mapped automatically. If any LangChain features couldn't be converted (e.g. memory, LCEL chains), you'll see warnings at the top explaining what to configure manually.

Review the generated InitRunner agent YAML and any import warnings

3. Save

Click Save Agent to write the role file. Your imported agent is ready to run from the dashboard or CLI.

CLI Import

Point initrunner new at your LangChain Python file:

initrunner new --langchain my_agent.py

InitRunner reads the file, extracts the agent configuration, and generates a role.yaml in the current directory. If your code has @tool functions, a sidecar module (e.g. role_tools.py) is created alongside.
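On the runtime side, picking up functions from a sidecar module is straightforward reflection. This is a minimal sketch of how a runner *could* discover callables in a `type: custom` module, not InitRunner's actual loading code; the demo runs against a stdlib module in place of a generated role_tools.py:

```python
import importlib
import inspect

def discover_tools(module_name: str) -> dict:
    """Collect public top-level functions defined in a module."""
    mod = importlib.import_module(module_name)
    return {
        name: fn
        for name, fn in inspect.getmembers(mod, inspect.isfunction)
        # Skip private names and functions imported from elsewhere
        if not name.startswith("_") and fn.__module__ == module_name
    }

# Demo: discover functions in textwrap instead of a generated role_tools.py
print(sorted(discover_tools("textwrap")))
```

The `fn.__module__` check matters: without it, anything the sidecar module itself imports would be picked up as a tool.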

By default you enter an interactive refinement loop where you can tweak the generated YAML. Skip it with --no-refine:

# Import without interactive refinement
initrunner new --langchain my_agent.py --no-refine

# Custom output path
initrunner new --langchain my_agent.py --output math-agent.yaml

# Use a specific builder model for conversion
initrunner new --langchain my_agent.py --provider anthropic --model claude-sonnet-4-6
Flag                 Description
--langchain PATH     Path to the LangChain Python file
--output PATH        Output file path (default: role.yaml)
--provider TEXT      Builder model provider (auto-detected if omitted)
--model TEXT         Builder model name
--no-refine          Skip the interactive refinement loop
--force              Overwrite existing file without prompting

Before and After

Here's a concrete example. This LangChain agent has two custom tools, a system prompt, and a model configuration:

LangChain agent (math_agent.py):

from langchain.agents import create_agent
from langchain.tools import tool
import math

@tool
def calculate(expression: str) -> str:
    """Evaluate a mathematical expression safely.

    Args:
        expression: A math expression like '2 + 2' or 'sqrt(16) * 3'
    """
    allowed = {
        "sqrt": math.sqrt, "sin": math.sin, "cos": math.cos,
        "pi": math.pi, "e": math.e, "abs": abs, "round": round, "pow": pow,
    }
    try:
        return str(eval(expression, {"__builtins__": {}}, allowed))
    except Exception as e:
        return f"Error: {e}"

@tool
def convert_units(value: float, from_unit: str, to_unit: str) -> str:
    """Convert between common units of measurement.

    Args:
        value: The numeric value to convert
        from_unit: Source unit (km, mi, kg, lb, c, f)
        to_unit: Target unit (km, mi, kg, lb, c, f)
    """
    table = {
        ("km", "mi"): lambda v: v * 0.621371,
        ("mi", "km"): lambda v: v * 1.60934,
        ("kg", "lb"): lambda v: v * 2.20462,
        ("lb", "kg"): lambda v: v * 0.453592,
        ("c", "f"): lambda v: v * 9 / 5 + 32,
        ("f", "c"): lambda v: (v - 32) * 5 / 9,
    }
    key = (from_unit.lower(), to_unit.lower())
    if key not in table:
        return f"Unknown conversion: {from_unit} -> {to_unit}"
    return f"{value} {from_unit} = {round(table[key](value), 4)} {to_unit}"

agent = create_agent(
    model="openai:gpt-4.1-mini",
    tools=[calculate, convert_units],
    system_prompt="You are a precise math and unit conversion assistant. Always use the calculate tool for math and convert_units for conversions. Show your work clearly.",
)

Run the import:

initrunner new --langchain math_agent.py --no-refine

Generated role.yaml:

apiVersion: initrunner/v1
kind: Agent
metadata:
  name: math-unit-assistant
  description: Precise math and unit conversion assistant using custom tools.
spec:
  role: |
    You are a precise math and unit conversion assistant.
    Always use the calculate tool for math and convert_units
    for conversions. Show your work clearly.
  model:
    provider: openai
    name: gpt-4.1-mini
  tools:
    - type: custom
      module: role_tools

Generated role_tools.py:

"""Custom tools extracted from LangChain agent."""
import math

def calculate(expression: str) -> str:
    """Evaluate a mathematical expression safely."""
    allowed = {
        "sqrt": math.sqrt, "sin": math.sin, "cos": math.cos,
        "pi": math.pi, "e": math.e, "abs": abs, "round": round, "pow": pow,
    }
    try:
        return str(eval(expression, {"__builtins__": {}}, allowed))
    except Exception as e:
        return f"Error: {e}"

def convert_units(value: float, from_unit: str, to_unit: str) -> str:
    """Convert between common units of measurement."""
    table = {
        ("km", "mi"): lambda v: v * 0.621371,
        ("mi", "km"): lambda v: v * 1.60934,
        ("kg", "lb"): lambda v: v * 2.20462,
        ("lb", "kg"): lambda v: v * 0.453592,
        ("c", "f"): lambda v: v * 9 / 5 + 32,
        ("f", "c"): lambda v: (v - 32) * 5 / 9,
    }
    key = (from_unit.lower(), to_unit.lower())
    if key not in table:
        return f"Unknown conversion: {from_unit} -> {to_unit}"
    return f"{value} {from_unit} = {round(table[key](value), 4)} {to_unit}"

The model, system prompt, and tools carry over automatically. Your @tool functions are extracted into role_tools.py with the @tool decorator removed — InitRunner discovers them as type: custom tools.
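The extraction step can be approximated in a few lines with Python's ast module. This is an illustrative sketch of the idea, not InitRunner's actual importer (it only handles bare `@tool` decorators, for instance):

```python
import ast

source = '''
from langchain.tools import tool

@tool
def calculate(expression: str) -> str:
    """Evaluate a mathematical expression."""
    return str(eval(expression))

def helper():
    pass
'''

tree = ast.parse(source)
# Collect top-level functions that carry a bare @tool decorator;
# undecorated helpers like helper() are left alone.
tool_funcs = [
    node.name
    for node in tree.body
    if isinstance(node, ast.FunctionDef)
    and any(isinstance(d, ast.Name) and d.id == "tool"
            for d in node.decorator_list)
]
print(tool_funcs)  # ['calculate']
```

Working on the syntax tree rather than importing the file means the source is never executed during conversion.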

What Gets Converted

LangChain → InitRunner
create_agent("openai:gpt-4.1", ...) → spec.model: {provider: openai, name: gpt-4.1}
ChatAnthropic(model="...", temperature=0.7) → spec.model: {provider: anthropic, temperature: 0.7}
init_chat_model("...", max_tokens=1000) → spec.model: {max_tokens: 1000}
system_prompt="..." → spec.role
@tool decorated functions → type: custom + sidecar .py module
DuckDuckGoSearchRun, TavilySearchResults, BraveSearch → type: search
PythonREPLTool → type: python
ShellTool → type: shell
ReadFileTool, WriteFileTool, ListDirectoryTool → type: filesystem
RequestsGetTool, RequestsPostTool → type: http
WikipediaQueryRun → type: web_reader
create_agent (ReAct pattern) → spec.reasoning: {pattern: react}
response_format=MySchema → spec.output: {type: json_schema}
CallLimitMiddleware(max_calls=15) → spec.guardrails: {max_iterations: 15}
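Taken together, an agent built with create_agent, a ReAct loop, and a call-limit middleware could map to a spec shaped like this. This is an illustrative composite of the rows above; the exact fields depend on what appears in your source:

```yaml
spec:
  role: |
    You are a helpful assistant.
  model:
    provider: openai
    name: gpt-4.1
  reasoning:
    pattern: react
  guardrails:
    max_iterations: 15
```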

What to Configure Manually

Some LangChain features don't have a direct 1:1 mapping and need manual configuration after import. The importer warns you about these:

ConversationBufferMemory, ConversationSummaryMemory → spec.memory (see the Memory guide)
LCEL pipelines (prompt | model | parser) → describe the pipeline in spec.role (see the Configuration guide)
LangGraph state machines → flow.yaml multi-agent orchestration (see the Flow guide)
Retrievers / VectorStores → spec.ingest for document ingestion + RAG (see the Ingestion guide)
Callback handlers → spec.observability (see the Observability guide)
