LangGraph

This section shows how to implement agents using LangGraph.

Getting Started

Prerequisites

Setup

  1. Clone the repository:

    git clone https://github.com/GDP-ADMIN/gen-ai-examples.git 
  2. Change into the example directory:

    cd gen-ai-examples/examples/aip-agent-quickstart
  3. Install dependencies:

    poetry install
  4. Set up your OpenAI API key:

    export OPENAI_API_KEY='your-openai-api-key-here'
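Before running the examples, you can verify the key is visible to Python with a quick sanity check (standard library only):

```python
import os

# A missing key will cause the OpenAI-backed examples to fail at request
# time, so check for it up front in the same shell session.
key = os.environ.get("OPENAI_API_KEY")
status = "set" if key else "NOT set"
print(f"OPENAI_API_KEY is {status}.")
```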

Basic implementation

We can use LangGraphAgent to run LangGraph's framework through our Agent Interface.

from gllm_agents.agent.langgraph_agent import LangGraphAgent
from langchain_openai import ChatOpenAI

from aip_agent_quickstart.config import CALCULATOR_AGENT_INSTRUCTION
from aip_agent_quickstart.tools import langchain_add_numbers

agent = LangGraphAgent(
    name="LangGraphArithmeticAgent",
    instruction=CALCULATOR_AGENT_INSTRUCTION,
    model=ChatOpenAI(model="gpt-4.1", temperature=0),
    tools=[langchain_add_numbers],
)
response = agent.run(
   query="What is the sum of 23 and 47? And then add 10 to that, then add 5 more."
)
print(response["output"])
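The example imports langchain_add_numbers from the quickstart's tools module, whose implementation is not shown here. As an assumption, such an addition tool reduces to a plain typed function (in the repository it is presumably wrapped for LangChain tool calling):

```python
def add_numbers(a: float, b: float) -> float:
    """Add two numbers and return the sum.

    The docstring matters: tool-calling models typically read it
    to decide when to invoke the tool.
    """
    return a + b

# The query above chains three additions: (23 + 47) + 10 + 5
result = add_numbers(add_numbers(add_numbers(23, 47), 10), 5)
print(result)  # 85, matching the expected final answer
```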

Running the Agent

Run the following to test the agent:

poetry run python hello_world_langgraph.py

You should see the following output:

The sum of 23 and 47 is 70. Adding 10 gives 80, and adding 5 more gives a final result of 85.

Multi-agent

We provide multi-agent capabilities out of the box for LangGraph agents. The following example showcases:

  1. How to define multiple specialized agents (WeatherAgent, MathAgent).

  2. How to set up a CoordinatorAgent that can delegate to these specialized agents.

  3. How the CoordinatorAgent uses dynamically created tools to call sub-agents.

  4. How the CoordinatorAgent can delegate tasks to the appropriate sub-agents.
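How the coordinator turns sub-agents into callable tools is internal to LangGraphAgent. The following is only a conceptual sketch of the dynamic-tool idea using stub agents, not the library's actual implementation:

```python
from typing import Callable


class StubAgent:
    """Minimal stand-in for a specialized agent."""

    def __init__(self, name: str, handler: Callable[[str], str]):
        self.name = name
        self.handler = handler

    def run(self, query: str) -> str:
        return self.handler(query)


def make_delegation_tool(agent: StubAgent) -> Callable[[str], str]:
    """Wrap an agent as a plain callable the coordinator can invoke as a tool."""
    def tool(query: str) -> str:
        return agent.run(query)
    tool.__name__ = f"ask_{agent.name}"
    return tool


weather_stub = StubAgent("WeatherAgent", lambda q: "25°C with clear skies")
math_stub = StubAgent("MathAgent", lambda q: "5 + 7 equals 12")

# One delegation tool per sub-agent, created dynamically from the agent list.
tools = {t.__name__: t for t in map(make_delegation_tool, [weather_stub, math_stub])}
print(tools["ask_WeatherAgent"]("What is the weather in Tokyo?"))
print(tools["ask_MathAgent"]("Add 5 and 7."))
```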

from gllm_agents.agent.langgraph_agent import LangGraphAgent
from langchain_openai import ChatOpenAI

from aip_agent_quickstart.config import (
    COORDINATOR_MULTI_AGENT_INSTRUCTION,
    MATH_AGENT_INSTRUCTION,
    WEATHER_AGENT_INSTRUCTION,
)
from aip_agent_quickstart.tools import langchain_add_numbers, langchain_weather_tool

weather_agent = LangGraphAgent(
    name="WeatherAgent",
    instruction=WEATHER_AGENT_INSTRUCTION,
    model=ChatOpenAI(model="gpt-4.1", temperature=0),
    tools=[langchain_weather_tool],
)
math_agent = LangGraphAgent(
    name="MathAgent",
    instruction=MATH_AGENT_INSTRUCTION,
    model=ChatOpenAI(model="gpt-4.1", temperature=0),
    tools=[langchain_add_numbers],
)
coordinator_agent = LangGraphAgent(
    name="CoordinatorAgent",
    instruction=COORDINATOR_MULTI_AGENT_INSTRUCTION,
    model=ChatOpenAI(model="gpt-4.1", temperature=0),
    agents=[weather_agent, math_agent],
)
response = coordinator_agent.run(
    query="What is the weather in Tokyo and what is 5 + 7?"
)
print(response.get("output"))

Running the Agent

You can run the following to test the capability:

poetry run python hello_world_multi_agent_langgraph.py

You should see the following output:

Agent 'CoordinatorAgent' (Coordinator) configured with 2 total tools. Can delegate to: ['WeatherAgent', 'MathAgent'].

Coordinator asynchronously delegating to agent 'WeatherAgent' with query: What is the weather in Tokyo?
Coordinator asynchronously delegating to agent 'MathAgent' with query: Add 5 and 7.

The weather in Tokyo is currently 25°C with clear skies.
5 + 7 equals 12.

MCP

LangGraph agents can connect to MCP servers out of the box.

Start the MCP server

In a new terminal window, execute:

poetry run python aip_agent_quickstart/mcp_servers/mcp_server_stdio.py

Upon success, the window remains open without displaying any output.

Run agent using MCP-based tools

The following shows how to configure a LangGraph agent to use MCP-based tools.

from gllm_agents.agent.langgraph_agent import LangGraphAgent
from langchain_openai import ChatOpenAI

from aip_agent_quickstart.config import DEFAULT_AGENT_INSTRUCTION
from aip_agent_quickstart.mcp_configs.configs import mcp_config_stdio

langgraph_agent = LangGraphAgent(
    name="langgraph_mcp_example",
    instruction=DEFAULT_AGENT_INSTRUCTION,
    model=ChatOpenAI(model="gpt-4.1", temperature=0),
)
langgraph_agent.add_mcp_server(mcp_config_stdio)

response = langgraph_agent.run(query="What's the weather forecast for monday?")
print(f"Response: {response.get('output')}")
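The imported mcp_config_stdio lives in aip_agent_quickstart/mcp_configs/configs.py and is not reproduced here. As an assumption based on common MCP client conventions, a stdio server config names the server and tells the client which subprocess to spawn:

```python
# Hypothetical shape of a stdio MCP server config; the repository's
# actual mcp_config_stdio may use different keys.
mcp_config_stdio = {
    "weather_tools": {
        "command": "python",
        "args": ["aip_agent_quickstart/mcp_servers/mcp_server_stdio.py"],
        "transport": "stdio",
    }
}

server_name = next(iter(mcp_config_stdio))
print(server_name)  # weather_tools — the name the connection logs refer to
```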

To test the agent, in the current terminal window run the following command:

poetry run python hello_world_langgraph_mcp_stdio.py

You should see the following output:

Attempting to connect to MCP server 'weather_tools'...
Successfully connected to MCP server 'weather_tools'...
Agent 'langgraph_mcp_example': Retrieved 1 tools from MCP server.

Response: The weather forecast for Monday is sunny, with temperatures ranging between 28°C and 32°C.

A2A (Agent-to-Agent)

LangGraph agents can act as A2A agents and can also coordinate agents hosted on remote A2A servers.

Deploy Agent as A2A server

To start the specialized Weather Agent server, follow these instructions:

Start the Server

To start the Weather Agent server with Python and Poetry, first open a new terminal window and export your OpenAI API key:

export OPENAI_API_KEY='your-openai-api-key-here'
export framework='langgraph'

Then run this command from your project's root directory:

poetry run python aip_agent_quickstart/agents/weather_agent/server.py

By default, the server starts and listens on http://localhost:8001.

You can customize the host and port using the --host and --port command-line options.

This script performs the following key actions:

  1. Imports weather_tool.

  2. Builds an Agent Card with metadata.

  3. Wraps a LangGraph agent as an A2A app.
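The Agent Card built in step 2 is the metadata an A2A client discovers. As an assumption based on the A2A protocol's published card format, its contents resemble the following (field values here are illustrative):

```python
# Hypothetical Agent Card for the Weather Agent server; field names
# follow the public A2A card format, values are illustrative.
agent_card = {
    "name": "WeatherAgent",
    "description": "Answers weather questions using weather_tool.",
    "url": "http://localhost:8001",
    "version": "1.0.0",
    "skills": [
        {"id": "weather", "name": "Weather lookup"},
    ],
}
print(agent_card["name"], agent_card["url"])
```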

Call Agents via the A2A client

Once the Weather Agent server is running, you can call it from a client.

The following shows how to call an A2A agent in non-streaming mode:

from gllm_agents.agent.langgraph_agent import LangGraphAgent
from gllm_agents.agent.types import A2AClientConfig
from langchain_openai import ChatOpenAI

from aip_agent_quickstart.config import DEFAULT_AGENT_INSTRUCTION

agent = LangGraphAgent(
    name="AssistantAgent",
    instruction=DEFAULT_AGENT_INSTRUCTION,
    model=ChatOpenAI(model="gpt-4.1"),
)
client_a2a_config = A2AClientConfig(discovery_urls=["http://localhost:8001"])
agent_cards = agent.discover_agents(client_a2a_config)
response = agent.send_to_agent(agent_cards[0], message="What is the weather in Jakarta?")
print(response["content"])

Test the script by running:

poetry run python hello_world_a2a_langgraph_client.py

This script does the following:

  1. Instantiates a LangGraphAgent.

  2. Discovers the Weather Agent at its URL.

  3. Sends the message "What is the weather in Jakarta?".

  4. Receives the full response when it's ready.
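Discovery against a base URL conventionally fetches the Agent Card from a well-known path. A small sketch, assuming the A2A /.well-known/agent.json convention (the library may resolve discovery differently):

```python
def agent_card_url(base_url: str) -> str:
    """Build the conventional well-known Agent Card URL for a discovery base URL."""
    return base_url.rstrip("/") + "/.well-known/agent.json"

# An A2A client would then GET this URL and parse the returned card.
print(agent_card_url("http://localhost:8001"))
```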

Expected Output:

Discovered agents (1 Agents): 
1. WeatherAgent
...
The weather in Jakarta City is currently 32°C, partly cloudy, and there is high humidity

Stopping the A2A Server

If the server is running in the foreground (as it will with the command above), press Ctrl+C in the terminal where it's running.
