
You can browse the available Claude models in Anthropic's documentation. To use them, get an API key from the Anthropic Console.

Configuration

config.yaml
name: My Config
version: 0.0.1
schema: v1

models:
  - name: <MODEL_NAME>
    provider: anthropic
    model: <MODEL_ID>
    apiKey: <YOUR_ANTHROPIC_API_KEY>
See the Continue documentation for a more advanced example configuration.
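As a concrete illustration, here is the block above with the placeholders filled in. The model name and ID are assumptions for the sake of example (`claude-3-5-sonnet-latest` is one of Anthropic's model aliases; substitute whichever Claude model you intend to use), and the `${{ secrets.* }}` reference is one way to keep the API key itself out of the file:

```yaml
name: My Config
version: 0.0.1
schema: v1

models:
  # Illustrative values — replace the name and model ID with your own choices.
  - name: Claude 3.5 Sonnet
    provider: anthropic
    model: claude-3-5-sonnet-latest
    apiKey: ${{ secrets.ANTHROPIC_API_KEY }}
```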

How to Enable Prompt Caching with Claude

Anthropic supports prompt caching with Claude, which lets Claude models reuse cached system messages and conversation history across requests, improving latency and reducing cost. To cache the system message and the turn-by-turn conversation, update your model configuration as follows:
config.yaml
name: My Config
version: 0.0.1
schema: v1

models:
  - name: <MODEL_NAME>
    provider: anthropic
    model: <MODEL_ID>
    apiKey: <YOUR_ANTHROPIC_API_KEY>
    roles:
      - chat
    defaultCompletionOptions:
      promptCaching: true
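Putting it together, a filled-in caching configuration might look like the sketch below. As before, the model name and ID are illustrative assumptions, and the `${{ secrets.* }}` API-key reference is one option rather than a requirement; the caching flag itself lives under `defaultCompletionOptions` for the model assigned the `chat` role, exactly as in the template above:

```yaml
name: My Config
version: 0.0.1
schema: v1

models:
  # Illustrative values — substitute your own model name and ID.
  - name: Claude 3.5 Sonnet
    provider: anthropic
    model: claude-3-5-sonnet-latest
    apiKey: ${{ secrets.ANTHROPIC_API_KEY }}
    roles:
      - chat
    defaultCompletionOptions:
      # Cache the system message and conversation history between requests.
      promptCaching: true
```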