Architecture Overview

What is Agentflare?

Agentflare provides protocol-level observability for AI agent tool servers. Track every decision, measure tool selection performance, and validate compliance in real time. Build transparent, reliable AI agents your team can trust.

Why Agentflare?

The Challenge

| Without Observability | With Agentflare |
| --- | --- |
| 🚫 Opaque Decisions - Why did the agent choose this tool? | ✅ Complete Reasoning - See exactly why each decision was made |
| 🚫 Unknown Costs - How much is this costing me? | ✅ Real-time Cost Tracking - Track costs per tool, session, and model |
| 🚫 Performance Blindness - Where are the bottlenecks? | ✅ Performance Analytics - Identify bottlenecks and optimize |
| 🚫 Debugging Nightmare - Agent failed, but why? | ✅ Rich Error Context - Detailed error information and context |
| 🚫 Complex Setup - Manual instrumentation required | ✅ Simple Proxy - Add tool server, get proxy URL, done |

The Solution

Agentflare provides complete visibility into your agent’s decision-making process:
1. Add Tool Server - Add your MCP tool server to Agentflare via the dashboard.
2. Get Proxy URL - Receive a unique proxy URL for instant observability.
3. Configure Agent - Point your AI agent to use the Agentflare proxy.
4. Observe Everything - See tool calls, reasoning, performance, and costs in real time.

Core Features

Hosted proxy service providing instant observability for any tool server:
  • Custom domains (mcp.company.com)
  • Zero code changes required
  • Automatic reasoning capture
  • Cost estimation and tracking

Comprehensive visualization of agent behavior:
  • Live feed of tool calls with reasoning
  • Performance metrics and bottleneck identification
  • Cost analytics per tool/session/model
  • Usage analytics for tool servers

Schema Studio lets you fine-tune tool schemas and decision pathways:
  • Adjust tool descriptions in real time
  • Test schema changes instantly
  • Optimize tool selection without redeployment
  • A/B test different configurations

Complete reconstruction of agent decision processes:
  • Step-by-step replay of tool sequences
  • Reasoning visibility at each decision point
  • Cost breakdown per step
  • Debugging and analysis tools

Send data to your existing observability stack:
  • Jaeger, Datadog, New Relic, Grafana
  • Standard OpenTelemetry format
  • Custom exporters supported
  • No vendor lock-in
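Because the export uses standard OpenTelemetry, the data can be routed through an ordinary OpenTelemetry Collector. The pipeline below is a minimal hypothetical sketch - endpoints, the API key variable, and the fan-out targets are placeholders, not Agentflare-specific settings:

```yaml
# Hypothetical Collector pipeline: receive OTLP traces (e.g. from an
# Agentflare export) and fan them out to Jaeger and Datadog.
receivers:
  otlp:
    protocols:
      http:

exporters:
  otlp/jaeger:
    endpoint: jaeger:4317        # placeholder Jaeger OTLP endpoint
  datadog:
    api:
      key: ${DD_API_KEY}         # placeholder; set in your environment

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlp/jaeger, datadog]
```

Swapping or adding exporters here is how "no vendor lock-in" plays out in practice: the agent-side setup never changes, only the Collector config.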

How It Works

1. Proxy Architecture

Agentflare sits between your AI agents and tool servers:
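In sketch form, every tool call passes through the proxy, which forwards it unchanged while capturing telemetry:

```
  AI Agent  ──▶  Agentflare Proxy  ──▶  Tool Server
                       │
                       ▼
       Telemetry (dashboard, replay, OpenTelemetry export)
```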

2. Zero Configuration

No SDK to install, no code to change.

Before Agentflare:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Direct connection - no observability
const client = new Client({ name: "my-agent", version: "1.0.0" });
await client.connect(
  new StreamableHTTPClientTransport(new URL("https://your-tool-server.com/mcp"))
);
```

With Agentflare: point the same client at your proxy URL instead - the client code is otherwise unchanged.
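The proxy URL itself comes from the Agentflare dashboard. As a sketch of how small the change is, here is a hypothetical URL rewrite - the `*.proxy.agentflare.com` hostname format is an illustrative assumption, not the documented scheme:

```typescript
// Hypothetical: map a direct tool-server URL to its Agentflare proxy URL.
// The real proxy URL is issued by the dashboard; this format is illustrative.
function toProxyUrl(directUrl: string, serverId: string): string {
  const u = new URL(directUrl);
  // Keep the original path (e.g. /mcp); swap only the host.
  return `https://${serverId}.proxy.agentflare.com${u.pathname}`;
}

// The MCP client code stays identical - only the transport URL changes.
console.log(toProxyUrl("https://your-tool-server.com/mcp", "your-server"));
// → https://your-server.proxy.agentflare.com/mcp
```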

3. Rich Observability

Automatically captures:
  • Tool calls - Every tool invocation with arguments and results
  • Tool reasoning - Why the agent selected each tool
  • Agent reasoning - Step-by-step decision process
  • Performance - Latency, throughput, error rates
  • Cost - API costs per call, session, and model
  • Confidence - Decision confidence scores and alternatives
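The captured fields above can be pictured as one event record per tool call. This TypeScript sketch is illustrative only - the field names are assumptions, not Agentflare's actual wire format:

```typescript
// Hypothetical shape of a captured tool-call event.
interface ToolCallEvent {
  tool: string;                       // tool name the agent invoked
  arguments: Record<string, unknown>; // arguments passed to the tool
  result?: unknown;                   // result returned by the tool server
  reasoning?: string;                 // why the agent selected this tool
  latencyMs: number;                  // round-trip latency
  costUsd?: number;                   // estimated API cost of the call
  confidence?: number;                // 0..1 decision confidence
  alternatives?: string[];            // other tools the agent considered
}

const event: ToolCallEvent = {
  tool: "search_web",
  arguments: { query: "weather in Paris" },
  reasoning: "User asked about current conditions; search_web has fresh data.",
  latencyMs: 212,
  costUsd: 0.0004,
  confidence: 0.91,
  alternatives: ["get_weather"],
};
console.log(event.tool, event.latencyMs);
```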

Use Cases

Development & Debugging

  • Debug agent behavior - Understand why agents make specific choices
  • Optimize tool selection - Identify and improve poor tool choices
  • Test prompt variations - A/B test different prompts and schemas
  • Validate reasoning - Ensure agents think as expected

Production Monitoring

  • Real-time alerting - Get notified of issues immediately
  • Performance tracking - Monitor SLAs and response times
  • Cost control - Stay within budget and optimize spending
  • Compliance - Audit agent decisions for regulatory requirements

Tool Server Optimization

  • Usage analytics - Understand how tools are being used
  • Schema refinement - Optimize tool descriptions with Schema Studio
  • Performance tuning - Identify slow or failing tools
  • Cost analysis - Find expensive tools and optimize

Getting Started

Supported Protocols

Agentflare works with multiple protocols:
  • MCP (Model Context Protocol) - full support for MCP tool servers: HTTP/HTTPS transport, WebSocket transport, stdio transport (hosted), and all MCP methods and capabilities
  • Custom protocols

Join the Community

Have questions? Want to contribute? Join our growing community.

Agentflare is designed to be minimally invasive, with under 1 ms of latency overhead and asynchronous data collection. Your agents run at full speed while you get complete observability.