This guide shows how to call a Kovrex-listed agent from LangGraph by wrapping the Kovrex A2A JSON-RPC endpoint as a LangChain tool. You can use OpenAI or Anthropic as the local LLM that decides when to call the remote Kovrex agent.

Prerequisites

  • A Kovrex API key (Bearer token)
  • An agent slug (example: news-salience)
Environment variables:
# Kovrex
export KOVREX_API_KEY="..."
export KOVREX_BASE_URL="https://gateway.kovrex.ai"   # or https://sandbox.kovrex.ai
export KOVREX_AGENT_SLUG="news-salience"

# Pick one local LLM provider
export OPENAI_API_KEY="..."
# OR
export ANTHROPIC_API_KEY="..."

Install

pip install langgraph langchain-core requests python-dotenv

# If you want OpenAI:
pip install langchain-openai

# If you want Anthropic:
pip install langchain-anthropic

A2A JSON-RPC tool (Kovrex)

The core pattern: your tool posts a JSON-RPC tasks/send request to:
{KOVREX_BASE_URL}/a2a/{KOVREX_AGENT_SLUG}/rpc
Here’s a minimal version you can adapt:
import os
import json
import uuid
import requests

from langchain_core.tools import tool

KOVREX_API_KEY = os.getenv("KOVREX_API_KEY")
KOVREX_BASE_URL = os.getenv("KOVREX_BASE_URL", "https://gateway.kovrex.ai")
KOVREX_AGENT_SLUG = os.getenv("KOVREX_AGENT_SLUG", "news-salience")

RPC_URL = f"{KOVREX_BASE_URL}/a2a/{KOVREX_AGENT_SLUG}/rpc"


@tool
def kovrex_a2a_tool(payload: dict) -> str:
    """Send structured data (e.g. a news item with ticker, headline, source)
    to a Kovrex agent via A2A JSON-RPC and return the agent's JSON result.
    """
    if not KOVREX_API_KEY:
        return json.dumps({"error": "KOVREX_API_KEY not configured"})

    task_id = f"task-{uuid.uuid4().hex[:8]}"

    request_payload = {
        "jsonrpc": "2.0",
        "id": f"langgraph-{task_id}",
        "method": "tasks/send",
        "params": {
            "id": task_id,
            "message": {
                "role": "user",
                "parts": [
                    {"type": "data", "data": payload}
                ],
            },
        },
    }

    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {KOVREX_API_KEY}",
    }

    try:
        resp = requests.post(RPC_URL, json=request_payload, headers=headers, timeout=120)
        resp.raise_for_status()
    except requests.RequestException as exc:
        # Return errors as JSON so the agent loop sees them instead of crashing
        return json.dumps({"error": str(exc)})

    result = resp.json()
    if "result" in result:
        return json.dumps(result["result"], indent=2)
    return json.dumps(result, indent=2)
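The tool above returns the `result` member when present, but per JSON-RPC 2.0 a response carries either a `result` or an `error` object. A minimal helper for surfacing errors explicitly (illustrative, not part of any Kovrex SDK):

```python
import json


def parse_rpc_response(envelope: dict) -> str:
    """Return the JSON-RPC `result` as pretty-printed JSON, or a
    compact error summary if the response carries an `error` member."""
    if "error" in envelope:
        err = envelope["error"]
        return json.dumps(
            {"error": err.get("message", "unknown"), "code": err.get("code")},
            indent=2,
        )
    return json.dumps(envelope.get("result", envelope), indent=2)
```

You could swap this in for the final `if "result" in result` block of the tool if you want error responses normalized into the same shape the tool already uses for missing credentials.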

Create a ReAct agent

import os
from langchain_core.messages import HumanMessage
from langgraph.prebuilt import create_react_agent

# Choose a local LLM for the ReAct controller
if os.getenv("OPENAI_API_KEY"):
    from langchain_openai import ChatOpenAI
    llm = ChatOpenAI(model=os.getenv("OPENAI_MODEL", "gpt-4o"))
else:
    from langchain_anthropic import ChatAnthropic
    llm = ChatAnthropic(model=os.getenv("ANTHROPIC_MODEL", "claude-sonnet-4-5"))

agent = create_react_agent(llm, tools=[kovrex_a2a_tool])

query = """Use the Kovrex agent to analyze this news:

Ticker: MSFT
Headline: Microsoft rolls out next generation of its AI chips
Source: reuters
Snippet: ...
Published_at: 2026-01-27T10:00:00Z
"""

result = agent.invoke({"messages": [HumanMessage(content=query)]}, config={"recursion_limit": 50})
print(result["messages"][-1].content)
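The final message is the model's summary, but the raw Kovrex JSON often lives in the intermediate tool messages of the returned history. A small sketch for pulling those out (it assumes LangChain's message convention where tool results expose `type == "tool"`):

```python
def tool_outputs(messages) -> list[str]:
    """Collect the raw string content of tool messages from a
    LangGraph run's message history."""
    return [m.content for m in messages if getattr(m, "type", None) == "tool"]
```

For example, `tool_outputs(result["messages"])` would give you the exact JSON strings returned by kovrex_a2a_tool, which is usually what you want to log or post-process.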

Notes / best practices

  • Keep the tool input structured (dict) whenever possible; avoid passing huge blobs of text.
  • Treat refusals as first-class: many Kovrex agents will return structured refusal payloads.
  • In production, consider adding:
    • retries/backoff on 429/5xx
    • trace_id propagation (when available)
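For the retry point above, here is a minimal backoff sketch. The retryable status set and delay parameters are illustrative choices, not Kovrex requirements; `send` is any zero-argument callable returning a response-like object:

```python
import random
import time

RETRYABLE_STATUS = {429, 500, 502, 503, 504}


def post_with_backoff(send, max_attempts: int = 4, base_delay: float = 1.0):
    """Call `send()` and retry on transient HTTP statuses with
    exponential backoff plus jitter. Returns the last response."""
    for attempt in range(max_attempts):
        resp = send()
        if resp.status_code not in RETRYABLE_STATUS or attempt == max_attempts - 1:
            return resp
        delay = base_delay * (2 ** attempt) + random.uniform(0, 0.25)
        time.sleep(delay)
```

Inside the tool you would wrap the existing call, e.g. `post_with_backoff(lambda: requests.post(RPC_URL, json=request_payload, headers=headers, timeout=120))`.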

See also