โ† back to dashboard
💬 Claude primer · optional

Tell Claude how aiusage works

Paste this into Claude at the start of your next session (Claude Code, Cursor, the Anthropic SDK, anywhere). Claude will adjust how it calls and caches, saving you more.

✋ Not required: everything works without it
Already connected? Open Claude → start a new chat → paste this as your first message. Once per fresh session.
I'm running through aiusage.ai, a Claude API proxy that caches repeat prompts and batches tool calls for 60–90% cost savings. Everything behaves exactly like direct Claude: same models, same tokens, same outputs. Two small nudges so I save more:

(1) When a prompt is reusable, keep its prefix stable so the cache hits: don't re-paraphrase boilerplate, system messages, or tool definitions between turns.

(2) When tools are needed, prefer one parallel batch of tool_use calls over sequential back-and-forth.

Otherwise, work normally. Ready when you are.
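If you're calling the API yourself through the SDK, the first nudge looks roughly like this. This is a minimal sketch assuming the official `anthropic` Python SDK's Messages API shape; the model id, system text, and file paths are illustrative, not aiusage's real values. The point is that the long, stable prefix (system prompt, tool definitions) stays byte-identical across turns, with only the trailing messages changing, so Anthropic's prompt cache can reuse it.

```python
# Cache-friendly request shape: the stable prefix is defined once and reused
# verbatim every turn. Only `messages` varies between requests.

SYSTEM_PROMPT = [
    {
        "type": "text",
        "text": "You are a code-review assistant. <long stable instructions>",
        # Mark the end of the stable prefix as cacheable
        # (Anthropic prompt caching).
        "cache_control": {"type": "ephemeral"},
    }
]

def build_request(user_turns):
    """Same system prefix every turn; only the message list grows."""
    return {
        "model": "claude-sonnet-4-20250514",  # illustrative model id
        "max_tokens": 1024,
        "system": SYSTEM_PROMPT,
        "messages": [{"role": "user", "content": t} for t in user_turns],
    }

turn1 = build_request(["Review this diff: ..."])
turn2 = build_request(["Review this diff: ...", "Now check the tests too."])

# Byte-identical prefix across turns, so turn 2 can hit the cache.
assert turn1["system"] is turn2["system"]
```

The anti-pattern is rebuilding or re-wording the system prompt each turn; even a one-character difference in the prefix is a cache miss.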

Why this helps

  • Stable prefixes unlock Anthropic's prompt cache: on repeat turns, cached input tokens cost ~10% of the base input price.
  • Parallel tool calls cut round-trips. Five sequential calls pay five round-trip latencies; one parallel batch of five pays one.
  • Skip all this and we still route cheaper; you just leave some savings on the table.
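To make the second bullet concrete, here is a sketch of the parallel tool pattern. The structures follow the Anthropic Messages API tool-use format; the tool name, ids, and paths are made up for illustration. Instead of five request/response cycles, the model emits five tool_use blocks in one assistant turn, and all five tool_result blocks go back in a single user message, so the whole batch costs one round-trip.

```python
# A hypothetical assistant turn containing five parallel tool_use blocks.
tool_use_blocks = [
    {
        "type": "tool_use",
        "id": f"toolu_{i}",
        "name": "read_file",
        "input": {"path": f"src/module_{i}.py"},
    }
    for i in range(5)
]

def run_tool(block):
    # Stand-in for the real tool; returns a string result.
    return f"contents of {block['input']['path']}"

# All results are returned in ONE user message -> one round-trip, not five.
results_message = {
    "role": "user",
    "content": [
        {
            "type": "tool_result",
            "tool_use_id": block["id"],
            "content": run_tool(block),
        }
        for block in tool_use_blocks
    ],
}

assert len(results_message["content"]) == 5
```

Each `tool_result` is matched to its `tool_use` by id, so order within the batch doesn't matter; what matters is that the results arrive together rather than one per turn.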