Anthropic integration
Connect your agents to LLMs from Anthropic.
The Anthropic integration allows Beamlit users to call Anthropic models through a Beamlit endpoint, unifying access control, credentials, and observability management.
The integration must be set up by an admin in the Integrations section in the workspace settings.
Set up the integration
To use this integration, you must register an Anthropic access token in your Beamlit workspace settings. The scope of this access token (i.e. the Anthropic workspace it is allowed to access) defines the scope that Beamlit has access to.
First, generate an Anthropic API key from your Anthropic organization settings. Select the workspace to use for this key.
On Beamlit, in Workspace Settings > Anthropic integration, create a new connection and paste the API key into the “Access token” section.
Connect to an Anthropic model
Once you’ve set up the integration in the workspace, any workspace member can use it to reference an Anthropic model as an external model API.
When creating a model API, select Connect an external model > Anthropic. You can search for any model from the Anthropic catalog.
After the model API is created, you will receive a dedicated global Beamlit endpoint to call the model. Beamlit will forward inference requests to Anthropic, using your Anthropic credentials for authentication and authorization.
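As a rough sketch, calling the model API from your own code could look like the snippet below. The endpoint URL, authentication header, and request body shape are assumptions for illustration only; use the exact endpoint and credentials shown for your model API in the Beamlit console.

```python
import os
import requests

# Placeholder endpoint and API key: replace with the values displayed
# for your model API in the Beamlit console (assumed format).
BEAMLIT_ENDPOINT = "https://run.beamlit.com/<your-workspace>/models/<your-model-api>"
BEAMLIT_API_KEY = os.environ["BEAMLIT_API_KEY"]

# The payload is assumed to follow Anthropic's Messages API shape,
# since Beamlit forwards the inference request to Anthropic.
response = requests.post(
    BEAMLIT_ENDPOINT,
    headers={
        "Authorization": f"Bearer {BEAMLIT_API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "model": "claude-3-5-sonnet-20240620",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello from Beamlit!"}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```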