# Anthropic SDK Integration

Use the Anthropic SDK with AI SpendOps for accurate usage tracking of Claude models.
## Python

```python
from anthropic import Anthropic

client = Anthropic(
    api_key="sk-ant-your-key",
    base_url="https://proxy.aispendops.com/v1/anthropic",
    default_headers={
        "X-ASO-API-Key": "aso_k_yourkey.secret",
        "X-ASO-Dims": "team=ml,app=assistant",
    },
)

message = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}],
)
print(message.content[0].text)
```
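The `X-ASO-Dims` header packs cost-attribution dimensions into a single comma-separated string. If you set dimensions dynamically per request, a small helper keeps the encoding consistent; `format_aso_dims` is a hypothetical helper sketched here, not part of either SDK — the `key=value` comma-separated format is inferred from the examples in this guide:

```python
def format_aso_dims(dims: dict[str, str]) -> str:
    """Encode a dict of dimensions as an X-ASO-Dims header value.

    Hypothetical helper: joins pairs as comma-separated key=value,
    matching the format shown in the examples above.
    """
    return ",".join(f"{key}={value}" for key, value in dims.items())


print(format_aso_dims({"team": "ml", "app": "assistant"}))  # team=ml,app=assistant
```

You could then pass the result as `default_headers={"X-ASO-Dims": format_aso_dims(...)}` when constructing the client, or per request via `extra_headers`.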
## Streaming

```python
with client.messages.stream(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a haiku"}],
) as stream:
    for text in stream.text_stream:
        print(text, end="")
```
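Streamed usage is derived from the native streaming events rather than counted client-side: `message_start` carries the input token count, and `message_delta` events carry the running output token count (the last value wins). The event shapes below follow Anthropic's documented streaming format, but the accumulation logic and the sample values are illustrative assumptions about how a proxy like AI SpendOps tallies usage:

```python
def accumulate_usage(events: list[dict]) -> dict:
    """Sum token usage from native Anthropic streaming events.

    message_start carries the input token count; each message_delta
    carries the cumulative output token count, so the last one wins.
    """
    totals = {"input_tokens": 0, "output_tokens": 0}
    for event in events:
        if event["type"] == "message_start":
            totals["input_tokens"] = event["message"]["usage"]["input_tokens"]
        elif event["type"] == "message_delta":
            totals["output_tokens"] = event["usage"]["output_tokens"]
    return totals


# Hypothetical sample events mirroring the native streaming shape
events = [
    {"type": "message_start", "message": {"usage": {"input_tokens": 12}}},
    {"type": "message_delta", "usage": {"output_tokens": 5}},
    {"type": "message_delta", "usage": {"output_tokens": 17}},
]
print(accumulate_usage(events))  # {'input_tokens': 12, 'output_tokens': 17}
```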
## TypeScript / Node.js

```typescript
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  apiKey: "sk-ant-your-key",
  baseURL: "https://proxy.aispendops.com/v1/anthropic",
  defaultHeaders: {
    "X-ASO-API-Key": "aso_k_yourkey.secret",
    "X-ASO-Dims": "team=backend,app=assistant",
  },
});

const message = await client.messages.create({
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello" }],
});

// Content blocks are a union type; narrow before reading .text
const block = message.content[0];
if (block.type === "text") {
  console.log(block.text);
}
```
## Use the native endpoint

Always use Anthropic's native `/v1/messages` endpoint (which the Anthropic SDK uses by default) rather than the OpenAI-compatible `/v1/chat/completions`. The native endpoint provides accurate streaming usage data, including cache token breakdowns.
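When prompt caching is in use, the native usage object splits input into regular, cache-creation, and cache-read tokens (`input_tokens` excludes the cached portions). The field names below match Anthropic's documented usage object; the `summarize_usage` helper and the sample values are hypothetical:

```python
def summarize_usage(usage: dict) -> dict:
    """Roll up a native /v1/messages usage payload.

    input_tokens excludes cached tokens, so the total input is the sum
    of the regular, cache-creation, and cache-read counts.
    """
    total_input = (
        usage.get("input_tokens", 0)
        + usage.get("cache_creation_input_tokens", 0)
        + usage.get("cache_read_input_tokens", 0)
    )
    return {
        "total_input_tokens": total_input,
        "output_tokens": usage.get("output_tokens", 0),
    }


# Hypothetical usage payload for a request that wrote a cache entry
usage = {
    "input_tokens": 20,
    "cache_creation_input_tokens": 1000,
    "cache_read_input_tokens": 0,
    "output_tokens": 64,
}
print(summarize_usage(usage))  # {'total_input_tokens': 1020, 'output_tokens': 64}
```

The OpenAI-compatible endpoint has no place for the cache fields in its usage schema, which is why the native endpoint is required for a full breakdown.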