Use one install command for the SDK and your provider client.
Run one guarded agent call locally; add the hosted service later.
Three-step path

If this works, the rest makes sense.

1. Install: use one install command for the SDK and your provider client.
2. Run: start with one guarded call so you can see the stop conditions before you wire any hosted service.
3. Verify: use doctor, the CLI, and local traces to confirm the run behaved the way you expected.
Run this
OpenAI is the default path.
```
pip install agentguard47 openai
```
Smallest OpenAI path: init once, keep the proof local, and let AgentGuard auto-patch the client.
```python
import agentguard
from openai import OpenAI

# Initialize once: cap spend at $5, keep traces on disk, stay local-only.
agentguard.init(
    service="openai-agent",
    budget_usd=5.00,
    trace_file="traces.jsonl",
    local_only=True,
)

# AgentGuard auto-patches the client, so this call is guarded transparently.
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Give me a one-line summary of AgentGuard."}],
)
print(response.choices[0].message.content)
print("Traces saved to traces.jsonl")
```
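Because local_only=True keeps everything in traces.jsonl, you can inspect the run with nothing but the standard library. A minimal sketch, assuming one JSON record per line; the sample records and field names below are illustrative, not AgentGuard's documented trace schema:

```python
import json
from pathlib import Path

# Hypothetical sample records standing in for a real traces.jsonl;
# the actual field names come from AgentGuard, not this sketch.
Path("traces.jsonl").write_text(
    '{"event": "llm_call", "model": "gpt-4o-mini", "cost_usd": 0.0002}\n'
    '{"event": "llm_call", "model": "gpt-4o-mini", "cost_usd": 0.0003}\n'
)

# JSONL means one JSON object per line, so parse line by line.
records = [json.loads(line) for line in Path("traces.jsonl").read_text().splitlines()]
print(f"{len(records)} trace records")
print(f"total cost: ${sum(r.get('cost_usd', 0) for r in records):.4f}")
```

Counting records and summing cost is usually enough to confirm the run stayed inside the budget you set in init.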
```
agentguard doctor
python agentguard_openai_quickstart.py
agentguard incident traces.jsonl
```

Hosted handoff
The SDK stays first. Hosted is for shared operations.
```
pip install agentguard47 openai
```
```python
from agentguard import BudgetGuard, HttpSink, Tracer, patch_openai
from openai import OpenAI

# Guard: stop at $50, warn at 80% of the budget.
guard = BudgetGuard(max_cost_usd=50.00, warn_at_pct=0.8)

# Sink: ship traces to the hosted ingest endpoint in small batches.
http_sink = HttpSink(
    url="https://app.agentguard47.com/api/ingest",
    api_key="ag_YOUR_KEY_HERE",
    batch_size=5,
    flush_interval=0.5,
)

tracer = Tracer(
    sink=http_sink,
    service="openai-agent",
)

# Patch the OpenAI client so every call is traced and budget-checked.
patch_openai(tracer, budget_guard=guard)

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize the latest support ticket."}],
)
print(response.choices[0].message.content)
```
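The batch_size and flush_interval settings trade latency for fewer HTTP requests: a batch ships when it holds 5 events or when 0.5 s has passed since the last flush, whichever comes first. A rough sketch of that pattern in plain Python; this is not AgentGuard's implementation, and the BufferedSink class and its emit/flush names are made up for illustration:

```python
import time

class BufferedSink:
    """Toy batching sink: flush on a size or elapsed-time threshold."""

    def __init__(self, batch_size=5, flush_interval=0.5):
        self.batch_size = batch_size
        self.flush_interval = flush_interval
        self.buffer = []
        self.last_flush = time.monotonic()
        self.flushed_batches = []  # stands in for HTTP requests actually sent

    def emit(self, event):
        self.buffer.append(event)
        interval_due = time.monotonic() - self.last_flush >= self.flush_interval
        if len(self.buffer) >= self.batch_size or interval_due:
            self.flush()

    def flush(self):
        if self.buffer:
            # A real sink would POST self.buffer to the ingest URL here.
            self.flushed_batches.append(list(self.buffer))
            self.buffer.clear()
        self.last_flush = time.monotonic()

sink = BufferedSink(batch_size=5, flush_interval=0.5)
for i in range(12):
    sink.emit({"event": i})
sink.flush()  # drain whatever is left on shutdown
print([len(batch) for batch in sink.flushed_batches])
```

With 12 fast emits this produces batches of 5, 5, and 2: the size threshold drives the first two flushes, and the explicit shutdown flush drains the remainder, which is why flushing before exit matters for any batching sink.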
Links
Only the essentials.