A ready-to-run example is included at the end of this page (see Ready-to-run Example below).

Your First Agent

This is the most basic example showing how to set up and run an OpenHands agent.

Step 1: LLM Configuration

Configure the language model that will power your agent:
from pydantic import SecretStr

from openhands.sdk import LLM

llm = LLM(
    model=model,                 # e.g. "anthropic/claude-sonnet-4-5-20250929"
    api_key=SecretStr(api_key),
    base_url=base_url,           # Optional
    service_id="agent",
)
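
The model, api_key, and base_url values above are placeholders. One way to fill them in is to read them from environment variables, as the full example at the bottom of this page does:
import os

model = os.getenv("LLM_MODEL", "anthropic/claude-sonnet-4-5-20250929")
api_key = os.getenv("LLM_API_KEY")    # required: the API key for your provider
base_url = os.getenv("LLM_BASE_URL")  # optional: None uses the provider default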

Step 2: Select an Agent

Use the preset agent with common built-in tools:
agent = get_default_agent(llm=llm, cli_mode=True)
The default agent includes the terminal, file editor, and task tracker tools, among others.
For the complete list of available tools, see the tools package source code.
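
If you want explicit control over the toolset instead of using the preset, you can construct the Agent directly with the same tools; this mirrors the full example at the bottom of this page:
from openhands.sdk import Agent, Tool
from openhands.tools.file_editor import FileEditorTool
from openhands.tools.task_tracker import TaskTrackerTool
from openhands.tools.terminal import TerminalTool

# Same idea as the preset, with the tool list spelled out explicitly
agent = Agent(
    llm=llm,
    tools=[
        Tool(name=TerminalTool.name),
        Tool(name=FileEditorTool.name),
        Tool(name=TaskTrackerTool.name),
    ],
)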

Step 3: Start a Conversation

Start a Conversation to manage the agent's lifecycle. Send it a task, then run it to completion:
import os

cwd = os.getcwd()  # use the current directory as the agent's workspace
conversation = Conversation(agent=agent, workspace=cwd)
conversation.send_message(
    "Write 3 facts about the current project into FACTS.txt."
)
conversation.run()
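
A conversation is not limited to a single turn. As a sketch (assuming you keep the same conversation object around after run() returns), you can send a follow-up message and run it again:
# Follow-up turn on the same conversation object created above
conversation.send_message("Now add a fourth fact about the project's dependencies.")
conversation.run()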

Step 4: Expected Behavior

When you run this example, the agent:
  1. Analyzes the current directory
  2. Gathers information about the project
  3. Creates FACTS.txt with 3 relevant facts
  4. Completes and exits
Example output file:
FACTS.txt
---------
1. This is a Python project using the OpenHands Software Agent SDK.
2. The project includes examples demonstrating various agent capabilities.
3. The SDK provides tools for file manipulation, bash execution, and more.
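
To confirm the result programmatically, a quick check in plain Python (nothing SDK-specific) could look like this:
from pathlib import Path

# Print the file the agent was asked to create, if it exists in the workspace
facts = Path("FACTS.txt")
print(facts.read_text() if facts.exists() else "FACTS.txt was not created")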

Ready-to-run Example

This example is available on GitHub: examples/01_standalone_sdk/01_hello_world.py
import os

from openhands.sdk import LLM, Agent, Conversation, Tool
from openhands.tools.file_editor import FileEditorTool
from openhands.tools.task_tracker import TaskTrackerTool
from openhands.tools.terminal import TerminalTool


# Configure the LLM from environment variables
llm = LLM(
    model=os.getenv("LLM_MODEL", "anthropic/claude-sonnet-4-5-20250929"),
    api_key=os.getenv("LLM_API_KEY"),
    base_url=os.getenv("LLM_BASE_URL", None),
)

# Build an agent with the terminal, file editor, and task tracker tools
agent = Agent(
    llm=llm,
    tools=[
        Tool(name=TerminalTool.name),
        Tool(name=FileEditorTool.name),
        Tool(name=TaskTrackerTool.name),
    ],
)

# Run a single task in the current working directory, then exit
cwd = os.getcwd()
conversation = Conversation(agent=agent, workspace=cwd)

conversation.send_message("Write 3 facts about the current project into FACTS.txt.")
conversation.run()
print("All done!")
You can run the example code as-is.
The model name should follow the LiteLLM convention: provider/model_name (e.g., anthropic/claude-sonnet-4-5-20250929, openai/gpt-4o). LLM_API_KEY should be the API key for your chosen provider.
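
As a sketch of switching providers, the same LLM constructor takes any LiteLLM-style model string; openai/gpt-4o from the convention above is used here purely as an illustration:
import os

from openhands.sdk import LLM

llm = LLM(
    model="openai/gpt-4o",             # LiteLLM convention: provider/model_name
    api_key=os.getenv("LLM_API_KEY"),  # the API key for your OpenAI account
)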
ChatGPT Plus/Pro subscribers: You can use LLM.subscription_login() to authenticate with your ChatGPT account and access Codex models without consuming API credits. See the LLM Subscriptions guide for details.

Next Steps