home.social

#anthropicapi — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #anthropicapi, aggregated by home.social.

  1. I added prompt caching to my Anthropic Batch API workflow. The hit rate was 0%.

    Each model has a minimum cacheable token count — 4,096 for Haiku 4.5. If your cache_control block is below that, the API silently ignores it. Successful response, zero cache reads, no warning.

    My IAB taxonomy prompt was 1,064 tokens. Well under the threshold.

    Full write-up:

    mikenoe.com/posts/prompt-cachi

    #AnthropicAPI #LLM #PromptCaching #AIEngineering

  2. 💼 Early adopter Thomson Reuters saw improved accuracy in their #CoCounsel platform, while #Endex reduced hallucinations from 10% to 0%

    🚀 Now available for #Claude35 Sonnet and Haiku on #AnthropicAPI and @GoogleCloud Vertex #AI
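The pitfall in the first post above can be sketched as a pre-flight check: compare the prompt's token count against the model's minimum cacheable size before relying on `cache_control`. The 4,096-token floor for Haiku 4.5 and the 1,064-token prompt are taken from the post; the other model entry and the model-name keys are illustrative assumptions, not Anthropic's documented identifiers.

```python
# Sketch: warn before a cache_control block gets silently ignored.
# The 4,096-token floor for Haiku 4.5 comes from the post above;
# the other entry is an assumption -- verify against Anthropic's docs.
MIN_CACHEABLE_TOKENS = {
    "claude-haiku-4-5": 4096,   # from the post
    "claude-sonnet-4-5": 1024,  # assumed, check the docs
}

def cache_will_be_ignored(model: str, prompt_tokens: int) -> bool:
    """True when a prompt is below the model's cacheable minimum,
    meaning the API would return success with zero cache reads."""
    floor = MIN_CACHEABLE_TOKENS.get(model)
    if floor is None:
        raise ValueError(f"unknown model: {model}")
    return prompt_tokens < floor

# The 1,064-token IAB taxonomy prompt from the post, on Haiku 4.5:
print(cache_will_be_ignored("claude-haiku-4-5", 1064))  # True: 0% hit rate
```

In a real workflow you would confirm the outcome after the call by inspecting the response's usage fields (cache read/creation token counts of zero indicate the block was dropped), since the API itself returns no warning.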