Run this first

Quickstart

Run one guarded agent call locally. Add hosted later.

Three-step path

Do the local proof first

If this works, the rest makes sense.

See hosted plans
01 Install

Use one install command for the SDK and your provider client.

02 Run locally

Start with one guarded call so you can see the stop conditions before you wire any hosted service.

03 Verify

Use doctor, the CLI, and local traces to confirm the run behaved the way you expected.

Run this

Pick your integration

OpenAI is the default path.

Install

pip install agentguard47 openai

Smallest OpenAI path: init once, keep the proof local, and let AgentGuard auto-patch the client.

Code (save as agentguard_openai_quickstart.py)

import agentguard
from openai import OpenAI

agentguard.init(
    service="openai-agent",
    budget_usd=5.00,
    trace_file="traces.jsonl",
    local_only=True,
)

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Give me a one-line summary of AgentGuard."}],
)

print(response.choices[0].message.content)
print("Traces saved to traces.jsonl")
Check it
  • agentguard doctor
  • python agentguard_openai_quickstart.py
  • agentguard incident traces.jsonl
Set this
  • OPENAI_API_KEY
Keep in mind
  • local_only=True keeps trace output local. Your OpenAI call still uses OPENAI_API_KEY.
  • Auto-patching is on by default in agentguard.init().
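
Because traces.jsonl is plain JSON Lines, you can skim it with the Python stdlib alone. A minimal sketch, assuming each line is one JSON object and guessing a cost_usd field; the real trace schema may differ:

```python
import json

def summarize_traces(path="traces.jsonl"):
    """Count trace records and sum estimated spend.

    Assumes each line is a JSON object with an optional "cost_usd"
    field. These field names are guesses, not documented schema.
    """
    calls = 0
    total = 0.0
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            record = json.loads(line)
            calls += 1
            total += record.get("cost_usd", 0.0)
    return calls, total
```

Run it after the quickstart script to sanity-check what landed in the file before you reach for the incident CLI.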

Hosted handoff

Need hosted control? Add it later.

The SDK stays first. Hosted is for shared operations.

Dashboard install

pip install agentguard47 openai

Dashboard connection

from agentguard import BudgetGuard, HttpSink, Tracer, patch_openai
from openai import OpenAI

guard = BudgetGuard(max_cost_usd=50.00, warn_at_pct=0.8)
http_sink = HttpSink(
    url="https://app.agentguard47.com/api/ingest",
    api_key="ag_YOUR_KEY_HERE",
    batch_size=5,
    flush_interval=0.5,
)
tracer = Tracer(
    sink=http_sink,
    service="openai-agent",
)
patch_openai(tracer, budget_guard=guard)

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize the latest support ticket."}],
)

print(response.choices[0].message.content)
Why teams add it
  • Hosted alerts
  • Retained incidents
  • Remote kill
  • Shared visibility
What stays true
  • Keep the local SDK proof first. Add HttpSink when you want retained history in the hosted dashboard.
  • The budget guard is passed to patch_openai so provider usage feeds the guard directly.
  • The dashboard is the paid control plane for alerts, remote kill, retention, and team workflows.
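
The threshold math behind max_cost_usd=50.00 and warn_at_pct=0.8 (warn at $40, stop at $50) can be sketched in plain Python. This is a hypothetical illustration of the pattern, not the library's BudgetGuard implementation:

```python
class SimpleBudgetGuard:
    """Hypothetical sketch of the budget-guard pattern.

    Tracks cumulative spend, warns past warn_at_pct of the cap,
    and signals a stop once the cap is reached.
    """

    def __init__(self, max_cost_usd, warn_at_pct=0.8):
        self.max_cost_usd = max_cost_usd
        self.warn_at = max_cost_usd * warn_at_pct
        self.spent = 0.0

    def record(self, cost_usd):
        """Add one call's cost and return "ok", "warn", or "stop"."""
        self.spent += cost_usd
        if self.spent >= self.max_cost_usd:
            return "stop"
        if self.spent >= self.warn_at:
            return "warn"
        return "ok"
```

In the real SDK, patch_openai feeds provider usage into the guard for you; the sketch only shows the arithmetic those thresholds imply.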

Links

Need more?

Only the essentials.

Prove one guarded run first.

SDK first. Hosted later.