Describe the solution you'd like

Note: Anthropic cache control is currently in a pre-GA (pre-Generally Available) state on Google Vertex AI. For more information, see the Google Vertex AI documentation on Anthropic prompt caching.
const llmResponse = await ai.generate({
  model: claude3Sonnet, // or another Anthropic model
  messages: [
    {
      role: 'system',
      content: [
        {
          text: 'This is an important instruction that can be cached.',
          custom: {
            cacheControl: {
              type: 'ephemeral',
            },
          },
        },
      ],
    },
    {
      role: 'user',
      content: [{ text: 'What should I do when I visit Melbourne?' }],
    },
  ],
});
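For context, the custom field above would ultimately have to map onto Anthropic's cache_control property on individual content blocks in the Messages request that Vertex AI accepts. The sketch below shows that request body shape; the field names follow Anthropic's public prompt-caching documentation, while the exact payload the Vertex AI plugin would emit is an assumption here.

// Sketch of the Anthropic-on-Vertex request body the plugin would need to emit.
// Field names come from Anthropic's prompt-caching docs; the plugin mapping is assumed.
const requestBody = {
  anthropic_version: 'vertex-2023-10-16',
  max_tokens: 1024,
  system: [
    {
      type: 'text',
      text: 'This is an important instruction that can be cached.',
      // Marks this block (and everything before it) as a cacheable prefix.
      cache_control: { type: 'ephemeral' },
    },
  ],
  messages: [
    {
      role: 'user',
      content: [{ type: 'text', text: 'What should I do when I visit Melbourne?' }],
    },
  ],
};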
Additional context
The Anthropic Claude models offer prompt caching to reduce latency and costs
when reusing the same content in multiple requests. When you send a query, you
can cache all or specific parts of your input so that subsequent queries can use
the cached results from the previous request. This avoids additional compute and
network costs. Caches are unique to your Google Cloud project and cannot be used
by other projects.
For details about how to structure your prompts, see the Anthropic prompt caching documentation.
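To illustrate the mechanics independently of Genkit, here is a minimal sketch that calls Claude on Vertex directly, assuming the @anthropic-ai/vertex-sdk package is installed, Application Default Credentials are configured, and the model version shown is enabled for the project and region (both are placeholders). A second call with an identical cached prefix should report cache hits via cache_read_input_tokens in the usage metadata.

import { AnthropicVertex } from '@anthropic-ai/vertex-sdk';

// Hypothetical project and region; Claude model availability varies by region.
const client = new AnthropicVertex({
  projectId: 'my-gcp-project',
  region: 'us-east5',
});

// A long, stable prefix: Anthropic only caches blocks above a minimum token
// count (on the order of 1024 tokens for Sonnet-class models).
const cachedInstructions =
  'This is an important instruction that can be cached. '.repeat(200);

const response = await client.messages.create({
  model: 'claude-3-5-sonnet-v2@20241022', // example Vertex model version
  max_tokens: 1024,
  system: [
    {
      type: 'text',
      text: cachedInstructions,
      // Everything up to and including this block becomes a cacheable prefix.
      // Older SDK versions may additionally require the
      // `anthropic-beta: prompt-caching-2024-07-31` request header.
      cache_control: { type: 'ephemeral' },
    },
  ],
  messages: [
    { role: 'user', content: 'What should I do when I visit Melbourne?' },
  ],
});

// cache_creation_input_tokens > 0 on the first call (cache write);
// cache_read_input_tokens > 0 on later calls that reuse the identical prefix.
console.log(response.usage);

In a Genkit integration, the custom.cacheControl field from the proposed snippet above would presumably be passed through to this cache_control property by the Vertex AI plugin.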