Installation
Install the AITracer SDK and capture your first execution trace in minutes.
AITracer captures execution traces, cost metadata, governance signals, and verification records across AI workflows.
This guide walks through:
- installing the SDK
- configuring authentication
- sending your first trace
- validating your first execution record
Install the SDK
Install the Python SDK:
```shell
pip install aitracer-sdk
```

For JavaScript and TypeScript environments, use AITracer's ingestion API or direct trace endpoints.
Configure authentication
Set your API credentials:
```shell
export AITRACER_API_KEY="your-api-key"
export AITRACER_BASE_URL="https://api.aitracer.app"
```

For local development:

```shell
export AITRACER_BASE_URL="http://localhost:3000"
```

Do not hardcode API credentials in production applications.
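In application code, read these variables from the environment instead of embedding them. A minimal sketch (the `load_config` helper is hypothetical, not part of the SDK):

```python
import os

def load_config(env=None):
    """Resolve AITracer credentials from the environment.

    Hypothetical helper: reads the two variables exported above and
    falls back to the hosted API URL when AITRACER_BASE_URL is unset.
    """
    env = os.environ if env is None else env
    api_key = env.get("AITRACER_API_KEY")
    if not api_key:
        raise RuntimeError("AITRACER_API_KEY is not set")
    base_url = env.get("AITRACER_BASE_URL", "https://api.aitracer.app")
    return api_key, base_url
```

Passing the resulting `api_key` into the client keeps credentials out of source control.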
Send your first trace
```python
from aitracer import AITracer

client = AITracer(
    api_key="your-api-key"
)

trace = client.trace(
    model="gpt-4o",
    input="Summarize this quarterly financial report",
    metadata={
        "workflow": "finance-summary",
        "environment": "production"
    }
)

print(trace.id)
```

This automatically captures:
- model metadata
- token usage
- latency metrics
- cost attribution
- execution metadata
- trace identifiers
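Cost attribution of this kind is typically derived from token counts and per-token prices. A sketch of the arithmetic, using illustrative placeholder prices (not AITracer's or any provider's actual rates):

```python
def estimate_cost(input_tokens, output_tokens,
                  input_price_per_1k=0.005, output_price_per_1k=0.015):
    """Token-based cost attribution sketch.

    The per-1k-token prices are hypothetical placeholders; real
    attribution uses the provider's published pricing per model.
    """
    return (input_tokens / 1000) * input_price_per_1k + \
           (output_tokens / 1000) * output_price_per_1k
```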
Verify the trace
Open the AITracer dashboard and confirm your trace was successfully recorded.
Review:
- trace ID
- request metadata
- model usage
- latency metrics
- cost details
- verification metadata
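If you prefer to verify from a script rather than the dashboard, ingestion may lag the `client.trace` call slightly, so a short polling loop helps. A generic sketch (`fetch` is any zero-argument callable you supply that returns the record once visible, or `None` while it is still in flight):

```python
import time

def wait_for_record(fetch, attempts=5, delay=0.5):
    """Poll until `fetch` returns a record (generic helper sketch).

    Raises TimeoutError if the record never appears within `attempts`
    polls spaced `delay` seconds apart.
    """
    for _ in range(attempts):
        record = fetch()
        if record is not None:
            return record
        time.sleep(delay)
    raise TimeoutError("trace not visible after polling")
```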
Enable governance controls
Optional governance checks can be enabled before execution records are stored.
Available controls include:
- PII detection
- credential exposure detection
- payment data detection
- policy validation
- risk scoring
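To illustrate the shape of such pre-storage checks, here is a toy scanner. The regexes are deliberately simple stand-ins; AITracer's actual detectors are not shown here:

```python
import re

# Illustrative patterns only, not AITracer's real detectors.
PATTERNS = {
    "pii_email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "payment_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
}

def scan_for_risks(text):
    """Return the names of all checks that flag `text`."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(text)]
```

A pipeline can run a check like this before an execution record is persisted and block or redact flagged content.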
Expand usage
After validating your first trace, AITracer can be deployed across:
- multi-agent workflows
- internal copilots
- customer-facing applications
- automation pipelines
- enterprise AI systems
Supported integrations
AITracer works with common model providers and your own stacks:
- OpenAI
- Anthropic
- Azure OpenAI
- AWS Bedrock
- Google Vertex AI
- OpenAI Agents
- Custom LLM and pipeline stacks via the trace ingestion API and OTLP/HTTP JSON at /api/integrations/otlp/v1/traces (OpenAPI reference)
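For the OTLP/HTTP JSON route, a minimal payload follows the OpenTelemetry OTLP JSON encoding. A sketch that builds such a payload (field values and the service name are illustrative; real exports carry more attributes):

```python
def build_otlp_payload(trace_id, span_id, name, start_ns, end_ns):
    """Assemble a minimal OTLP/HTTP JSON trace payload (sketch).

    Layout follows the OpenTelemetry OTLP JSON encoding:
    resourceSpans -> scopeSpans -> spans. Only the fields shown are
    included here.
    """
    return {
        "resourceSpans": [{
            "resource": {"attributes": [
                {"key": "service.name",
                 "value": {"stringValue": "my-llm-app"}},  # illustrative
            ]},
            "scopeSpans": [{
                "spans": [{
                    "traceId": trace_id,      # 32 hex chars
                    "spanId": span_id,        # 16 hex chars
                    "name": name,
                    "startTimeUnixNano": str(start_ns),
                    "endTimeUnixNano": str(end_ns),
                }],
            }],
        }],
    }
```

POST the JSON-serialized payload to /api/integrations/otlp/v1/traces with your usual HTTP client, authenticating with your AITracer API key.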
Deployment support
Enterprise deployment support is available at: