Anthropic Python SDK 0.83.0 Ships Automatic Cache Control
Anthropic published version 0.83.0 of their Python SDK on February 19, 2026. The release contains one feature commit and one documentation update. The feature—automatic cache control at the client level—allows SDK users to enable response caching globally rather than configuring it per request.
The release followed v0.82.0 with a focused scope: a single feature addition (commit a940123) and a mock server documentation update (commit 34ef48c).
What Changed
The new automatic caching mechanism operates at the top level of the SDK client. Prior to this release, any caching behavior required explicit configuration on individual API calls. The cache_control parameter can now be set once during client initialization and applied automatically to eligible requests.
| Version | Cache Configuration | Scope |
|---|---|---|
| ≤ 0.82.0 | Per-request manual | Individual API calls |
| 0.83.0+ | Top-level automatic | Client-wide default |
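As an illustration of the mechanism only (not the SDK's actual internals, which the release notes do not detail), a client-wide default that gets merged into each request's parameters, with explicit per-request values taking precedence, could be sketched like this:

```python
# Illustration: how a client-level cache_control default might be applied.
# Function and parameter names here are hypothetical, not the SDK's own.
from typing import Optional


def merge_cache_control(request_params: dict, client_default: Optional[dict]) -> dict:
    """Apply a client-level cache_control default unless the request sets its own."""
    merged = dict(request_params)
    if client_default is not None and "cache_control" not in merged:
        merged["cache_control"] = client_default
    return merged


# Client-wide default, conceptually set once at client initialization.
default = {"type": "ephemeral"}

# A request with no explicit cache settings picks up the default...
auto = merge_cache_control({"model": "claude-sonnet", "max_tokens": 256}, default)
assert auto["cache_control"] == {"type": "ephemeral"}

# ...while an explicit per-request value still wins.
manual = merge_cache_control(
    {"model": "claude-sonnet", "cache_control": {"type": "ephemeral", "ttl": "1h"}},
    default,
)
assert manual["cache_control"]["ttl"] == "1h"
```

The precedence rule matters: a global default should never silently override a deliberate per-request choice.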
Practical Implications
For teams running high-volume Claude workloads—automated pipelines, batch processing, repeated prompt patterns—this reduces both boilerplate code and token spend. Identical or near-identical requests that previously hit the API fresh on every call can now return cached responses transparently.
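To make "transparently" concrete, here is a client-side analogy, not the SDK's implementation: a wrapper that fingerprints each request and skips the underlying call on an exact repeat.

```python
# Toy analogy for transparent caching; the SDK handles this internally.
import hashlib
import json


class CachingClient:
    def __init__(self, send):
        self._send = send    # underlying call, e.g. an API request function
        self._cache = {}     # request fingerprint -> response
        self.hits = 0
        self.misses = 0

    def request(self, **params):
        # Fingerprint the request so identical calls map to the same key.
        key = hashlib.sha256(
            json.dumps(params, sort_keys=True).encode()
        ).hexdigest()
        if key in self._cache:
            self.hits += 1
            return self._cache[key]
        self.misses += 1
        response = self._send(**params)
        self._cache[key] = response
        return response


# Usage: the second identical call never reaches the "API".
client = CachingClient(send=lambda **p: f"response to {p['prompt']}")
client.request(prompt="summarize this log")
client.request(prompt="summarize this log")
assert (client.hits, client.misses) == (1, 1)
```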
The cost impact depends entirely on workload characteristics. Repetitive prompt structures (template-based generation, testing harnesses, retry loops) benefit most. Conversational or unique-per-request workloads see minimal change.
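A rough cost model shows why. Anthropic's published prompt-caching multipliers at the time of writing are roughly 1.25x the base input rate for cache writes and 0.1x for cache reads (verify current pricing before relying on these numbers):

```python
def cached_input_cost(base_rate: float, write_tokens: int,
                      read_tokens: int, uncached_tokens: int) -> float:
    """Estimate input-token cost under prompt caching.
    Assumes cache writes bill at 1.25x and cache reads at 0.1x the base
    input rate -- Anthropic's published multipliers at time of writing."""
    return base_rate * (1.25 * write_tokens + 0.1 * read_tokens + uncached_tokens)


# Example: a 2,000-token template reused across 100 calls at $3.00 / MTok.
rate = 3.00 / 1_000_000
with_cache = cached_input_cost(rate, write_tokens=2_000,
                               read_tokens=2_000 * 99, uncached_tokens=0)
without_cache = rate * 2_000 * 100
assert with_cache < without_cache  # repetitive workloads come out well ahead
```

For a unique-per-request workload, every call is a cache write, so the same formula can come out slightly more expensive than no caching at all.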
The upgrade path is straightforward: run `pip install --upgrade "anthropic>=0.83.0"` (quote the version specifier so the shell does not treat `>=` as a redirection), then pass the cache control parameter at client instantiation.
Caveats
Details above reflect the release as documented at the time of writing. Implementation specifics vary with tooling versions and organizational requirements; validate against current Anthropic documentation before deployment.