Get started
Install LangChain
```shell
pip install -U langchain
```
Or, with uv:
```shell
uv add langchain
```
Quickstart
Build a basic agent
Start by creating a simple agent that can answer questions and call tools. The agent will use Claude Sonnet 4.5 as its language model, a basic weather function as a tool, and a simple prompt to guide its behavior.
INFO
For this example, you will need to set up a Claude (Anthropic) account and get an API key. Then, set the ANTHROPIC_API_KEY environment variable in your terminal.
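For example, in a bash-compatible shell (substitute your real key for the placeholder value):

```shell
export ANTHROPIC_API_KEY="your-api-key-here"
```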
```python
from langchain.agents import create_agent

def get_weather(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"

agent = create_agent(
    model="claude-sonnet-4-5-20250929",
    tools=[get_weather],
    system_prompt="You are a helpful assistant",
)

# Run the agent
agent.invoke(
    {"messages": [{"role": "user", "content": "what is the weather in sf"}]}
)
```
Build a real-world agent
Next, build a practical weather forecasting agent that demonstrates key production concepts:

- Detailed system prompts for better agent behavior
- Tools that integrate with external data
- Model configuration for consistent responses
- Structured output for predictable results
- Conversational memory for chat-like interactions

Let's walk through each step:
Detailed system prompts for better agent behavior
The system prompt defines your agent's role and behavior. Keep it specific and actionable:
```python
SYSTEM_PROMPT = """You are an expert weather forecaster, who speaks in puns.

You have access to two tools:

- get_weather_for_location: use this to get the weather for a specific location
- get_user_location: use this to get the user's location

If a user asks you for the weather, make sure you know the location. If you can tell from the question that they mean wherever they are, use the get_user_location tool to find their location."""
```
Create tools that integrate with external data
Tools let a model interact with external systems by calling functions you define. Tools can depend on runtime context and also interact with agent memory. Notice below how the get_user_location tool uses runtime context:
```python
from dataclasses import dataclass

from langchain.tools import tool, ToolRuntime

@tool
def get_weather_for_location(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"

@dataclass
class Context:
    """Custom runtime context schema."""
    user_id: str

@tool
def get_user_location(runtime: ToolRuntime[Context]) -> str:
    """Retrieve user information based on user ID."""
    user_id = runtime.context.user_id
    return "Florida" if user_id == "1" else "SF"
```
TIP
Tools should be well-documented: their name, description, and argument names become part of the model's prompt. LangChain's @tool decorator adds metadata and enables runtime injection via the ToolRuntime parameter.
Model configuration for consistent responses
Set up your language model with the right parameters for your use case:
```python
from langchain.chat_models import init_chat_model

model = init_chat_model(
    "claude-sonnet-4-5-20250929",
    temperature=0.5,
    timeout=10,
    max_tokens=1000
)
```
Structured output for predictable results
Optionally, define a structured response format if you need the agent responses to match a specific schema.
```python
from dataclasses import dataclass

# We use a dataclass here, but Pydantic models are also supported.
@dataclass
class ResponseFormat:
    """Response schema for the agent."""
    # A punny response (always required)
    punny_response: str
    # Any interesting information about the weather if available
    weather_conditions: str | None = None
```
Conversational memory for chat-like interactions
Add memory to your agent to maintain state across interactions. This allows the agent to remember previous conversations and context.
```python
from langgraph.checkpoint.memory import InMemorySaver

checkpointer = InMemorySaver()
```
In production, use a persistent checkpointer that saves to a database. See Add and manage memory for more details.
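Conceptually, a checkpointer stores conversation state keyed by `thread_id`, so later turns on the same thread see earlier messages. The toy class below is a hypothetical illustration of that idea only, not LangGraph's actual `InMemorySaver` API:

```python
class TinyCheckpointer:
    """Toy illustration: persist message history per conversation thread."""

    def __init__(self):
        self._threads: dict[str, list[dict]] = {}

    def load(self, thread_id: str) -> list[dict]:
        # Unknown threads start with an empty history.
        return self._threads.get(thread_id, [])

    def save(self, thread_id: str, messages: list[dict]) -> None:
        self._threads[thread_id] = list(messages)

saver = TinyCheckpointer()
history = saver.load("1")  # empty on the first turn
history.append({"role": "user", "content": "what is the weather outside?"})
saver.save("1", history)

# A later turn on the same thread sees the earlier message.
print(len(saver.load("1")))  # 1
```

Separate `thread_id` values keep conversations isolated, which is why the agent below is invoked with a `config` carrying the thread ID.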
Create and run the agent
Now assemble your agent with all the components and run it!
```python
agent = create_agent(
    model=model,
    system_prompt=SYSTEM_PROMPT,
    tools=[get_user_location, get_weather_for_location],
    context_schema=Context,
    response_format=ResponseFormat,
    checkpointer=checkpointer
)

# `thread_id` is a unique identifier for a given conversation.
config = {"configurable": {"thread_id": "1"}}

response = agent.invoke(
    {"messages": [{"role": "user", "content": "what is the weather outside?"}]},
    config=config,
    context=Context(user_id="1")
)

print(response['structured_response'])
# ResponseFormat(
#     punny_response="Florida is still having a 'sun-derful' day! The sunshine is playing 'ray-dio' hits all day long! I'd say it's the perfect weather for some 'solar-bration'! If you were hoping for rain, I'm afraid that idea is all 'washed up' - the forecast remains 'clear-ly' brilliant!",
#     weather_conditions="It's always sunny in Florida!"
# )

# Note that we can continue the conversation using the same `thread_id`.
response = agent.invoke(
    {"messages": [{"role": "user", "content": "thank you!"}]},
    config=config,
    context=Context(user_id="1")
)

print(response['structured_response'])
# ResponseFormat(
#     punny_response="You're 'thund-erfully' welcome! It's always a 'breeze' to help you stay 'current' with the weather. I'm just 'cloud'-ing around waiting to 'shower' you with more forecasts whenever you need them. Have a 'sun-sational' day in the Florida sunshine!",
#     weather_conditions=None
# )
```
Full example code
```python
from dataclasses import dataclass

from langchain.agents import create_agent
from langchain.chat_models import init_chat_model
from langchain.tools import tool, ToolRuntime
from langgraph.checkpoint.memory import InMemorySaver

# Define system prompt
SYSTEM_PROMPT = """You are an expert weather forecaster, who speaks in puns.

You have access to two tools:

- get_weather_for_location: use this to get the weather for a specific location
- get_user_location: use this to get the user's location

If a user asks you for the weather, make sure you know the location. If you can tell from the question that they mean wherever they are, use the get_user_location tool to find their location."""

# Define context schema
@dataclass
class Context:
    """Custom runtime context schema."""
    user_id: str

# Define tools
@tool
def get_weather_for_location(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"

@tool
def get_user_location(runtime: ToolRuntime[Context]) -> str:
    """Retrieve user information based on user ID."""
    user_id = runtime.context.user_id
    return "Florida" if user_id == "1" else "SF"

# Configure model
model = init_chat_model(
    "claude-sonnet-4-5-20250929",
    temperature=0
)

# Define response format
@dataclass
class ResponseFormat:
    """Response schema for the agent."""
    # A punny response (always required)
    punny_response: str
    # Any interesting information about the weather if available
    weather_conditions: str | None = None

# Set up memory
checkpointer = InMemorySaver()

# Create agent
agent = create_agent(
    model=model,
    system_prompt=SYSTEM_PROMPT,
    tools=[get_user_location, get_weather_for_location],
    context_schema=Context,
    response_format=ResponseFormat,
    checkpointer=checkpointer
)

# Run agent
# `thread_id` is a unique identifier for a given conversation.
config = {"configurable": {"thread_id": "1"}}

response = agent.invoke(
    {"messages": [{"role": "user", "content": "what is the weather outside?"}]},
    config=config,
    context=Context(user_id="1")
)
print(response['structured_response'])
# ResponseFormat(
#     punny_response="Florida is still having a 'sun-derful' day! The sunshine is playing 'ray-dio' hits all day long! I'd say it's the perfect weather for some 'solar-bration'! If you were hoping for rain, I'm afraid that idea is all 'washed up' - the forecast remains 'clear-ly' brilliant!",
#     weather_conditions="It's always sunny in Florida!"
# )

# Note that we can continue the conversation using the same `thread_id`.
response = agent.invoke(
    {"messages": [{"role": "user", "content": "thank you!"}]},
    config=config,
    context=Context(user_id="1")
)
print(response['structured_response'])
# ResponseFormat(
#     punny_response="You're 'thund-erfully' welcome! It's always a 'breeze' to help you stay 'current' with the weather. I'm just 'cloud'-ing around waiting to 'shower' you with more forecasts whenever you need them. Have a 'sun-sational' day in the Florida sunshine!",
#     weather_conditions=None
# )
```
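A closing note on the earlier tip that a tool's name, description, and argument names become part of the model's prompt: the sketch below is a framework-free, hypothetical illustration of the kind of metadata a decorator like `@tool` can derive from a plain function. The `describe_tool` helper is our own invention, not a LangChain API:

```python
import inspect

def describe_tool(fn):
    """Collect the metadata a tool decorator typically exposes to the model:
    the function name, its docstring, and its argument names/annotations."""
    sig = inspect.signature(fn)
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn),
        "args": {
            name: getattr(param.annotation, "__name__", str(param.annotation))
            for name, param in sig.parameters.items()
        },
    }

def get_weather_for_location(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"

print(describe_tool(get_weather_for_location))
# {'name': 'get_weather_for_location', 'description': 'Get weather for a given city.', 'args': {'city': 'str'}}
```

This is why a clear name, docstring, and typed arguments matter: they are the only description of the tool the model ever sees.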
