You can bring agents developed in TypeScript with any framework (LangChain, CrewAI, or any custom framework) and deploy them on Beamlit by integrating a few lines of the Beamlit SDK and leveraging our other developer tools (the Beamlit CLI, GitHub action, etc.).

Check out this Getting Started tutorial in order to develop and deploy your first Hello World AI agent globally in less than 5 minutes.

Beamlit agent file structure

Overview

To deploy a custom AI agent on Beamlit, you need the following file in your repository:

  • agent.ts (or src/agent.ts): This file must contain an async function that runs your agent. This function serves as the default entry point during serving and deployment.
    • You can specify a different module to serve using the flag bl serve --module your-function-path.your-function-name
    • The function accepts these arguments, which Beamlit passes when a request is made to your deployed agent:
      • request: The Fastify Request that triggers the agent’s execution.
      • args: (Optional) Argument object containing the core agent algorithm. By default, it uses a LangChain ReAct agent.

Mark your agent’s main function with Beamlit’s wrapAgent wrapper to serve it. You can override the default argument values that Beamlit passes upon requesting the agent by using the wrapper parameters (see below).
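
For illustration, here is a minimal sketch of such an entry point; the agent name and the typing of args are placeholders, not values prescribed by Beamlit:

import { FastifyRequest } from "fastify";
import { wrapAgent } from "@beamlit/sdk";

// agent.ts — minimal entry point served by Beamlit
export const agent = wrapAgent(
  async (request: FastifyRequest, args: { agent: any }) => {
    // request: the Fastify request that triggered the execution
    // args.agent: the agent instance (a LangChain ReAct agent by default)
    // ...
  },
  {
    agent: { metadata: { name: "my-agent" } }, // placeholder name
  }
);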

Some folders are optional but designed to handle specific parts of the agent code on Beamlit:

  • src/functions: every file in this folder is interpreted as a custom tool and automatically made available to the agent when serving or deploying it. The tool’s main TypeScript function is identified with the wrapFunction wrapper (see the sketch below).
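
For example, a custom tool file might look like the minimal sketch below; the single-argument call to wrapFunction and the tool logic are assumptions rather than a confirmed SDK signature:

import { wrapFunction } from "@beamlit/sdk";

// src/functions/helloworld.ts — a custom tool that Beamlit picks up automatically
// (the single-argument call form below is an assumption)
export const helloworld = wrapFunction(async () => {
  // Placeholder tool logic; replace with your own implementation
  return "Hello from a Beamlit function!";
});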

Quickstart

You must have npm installed to use the following command.

You can quickly initialize a new project from scratch by using the CLI command bl create-agent-app. This creates a pre-scaffolded local repository where you can add your code. By default, this command creates a boilerplate agent that has:

  • A LangChain ReAct agent as the core agent logic. This agent can be overridden by passing a custom agent as overrideAgent in the options of the wrapAgent wrapper in /src/agent.ts.
  • A hello-world tool (a function) that returns a placeholder text string. This function can be customized by changing the code in the helloworld function in /src/functions/helloworld.ts, and additional functions can be created in the /src/functions folder.

The wrapAgent higher-order function

The wrapAgent wrapper identifies the main function that handles your core agent logic, such as how queries and responses are processed and which tools are called.

import { wrapAgent } from "@beamlit/sdk";

export const agent = wrapAgent(async (request, args) => { ... }, {
  agent: {
    metadata: { name: "example-agent" }
  }
});

You can pass an options object as the second argument to the wrapper:

  • agent: this can be either the Beamlit SDK’s Agent object or an equivalent plain object, and lets you specify all the parameters for deploying your agent on Beamlit. Examples include:
    • metadata.name: the name of the agent on Beamlit
    • spec.description: the description of the agent on Beamlit
    • spec.model: the name of a model API deployed in your Beamlit workspace, to use as core chat model for the agent
  • overrideAgent: You can override the default LangChain ReAct agent by passing any custom agent instance. When the deployed agent receives a request, Beamlit will pass this as an argument to the agent’s handler function.
  • overrideModel: You can query an LLM provider directly instead of routing through Beamlit by passing a BaseChatModel object from LangChain. When the deployed agent receives a request, Beamlit will pass this as the llm to the default LangChain ReAct agent. This is used as a more granular alternative to overrideAgent (see the sketch after this list).
  • remoteFunctions: You can pass additional Beamlit functions that are already deployed so your agent can access them too, in addition to the functions from src/functions/. This parameter takes an array of strings representing the names of the functions on Beamlit. This is used as a more granular alternative to overrideAgent.
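
For instance, here is a minimal sketch of overrideModel; the agent name is illustrative and the handler body is left as a placeholder:

import { ChatOpenAI } from "@langchain/openai";
import { wrapAgent } from "@beamlit/sdk";

// Query OpenAI directly instead of routing through a Beamlit model API
const llm = new ChatOpenAI();

export const agent = wrapAgent(
  async (request, args) => {
    // args.agent is the default LangChain ReAct agent, built on the llm above
    // ...
  },
  {
    agent: { metadata: { name: "override-model-example" } }, // illustrative name
    overrideModel: llm,
  }
);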

Example

Here’s an example that demonstrates overriding the default agent object.

import { FastifyRequest } from 'fastify';
import { ChatOpenAI } from '@langchain/openai';
import { createReactAgent } from '@langchain/langgraph/prebuilt';
import { wrapAgent } from '@beamlit/sdk';
import helloworld from './customfunctions/helloworld';

// Initialize the chat model
const chat = new ChatOpenAI();

// Create a ReAct agent with custom tools
const customAgent = createReactAgent({
  llm: chat,
  tools: [helloworld]
});

// Create the main agent handler
const handler = async (request: FastifyRequest, args: { agent: any }) => {
  // Your main agent logic here
  // ...
};

// Export the wrapped agent
export const agent = wrapAgent(handler, {
  agent: {
    metadata: {
      name: "langchain-example",
    },
    spec: {
      model: "sandbox-openai",
    },
  },
  overrideAgent: customAgent,
}); 

When developing custom agents, you may also need to retrieve instantiated model APIs and functions in your code. The following section shows you how.

Retrieve existing functions and model APIs

Get model APIs

You can use model APIs deployed on Beamlit by calling getChatModel(). This function returns the model API as a LangChain BaseChatModel.

import { getChatModel } from '@beamlit/sdk';
import { createReactAgent } from "@langchain/langgraph/prebuilt";

const chat = await getChatModel("model-name-on-beamlit");

const customAgent = createReactAgent({ llm: chat, tools: [] });

Get functions

You can use functions, both those defined in your code and those deployed on Beamlit, by calling getFunctions(). This function returns an array of tools as LangChain StructuredTool objects.

import { getChatModel, getFunctions } from "@beamlit/sdk";
import { createReactAgent } from "@langchain/langgraph/prebuilt";

// Retrieve a chat model and the available functions, then build the agent
const chat = await getChatModel("model-name-on-beamlit");
const functions = await getFunctions();

const customAgent = createReactAgent({ llm: chat, tools: functions });

By default, this returns the functions defined in the local folder src/functions/. You can pass optional parameters:

  • remoteFunctions: to retrieve additional Beamlit functions that are already deployed so your agent can access them too, in addition to the functions from src/functions/. This parameter takes an array of strings representing the names of the functions on Beamlit (see the sketch below).
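
As an example, the sketch below retrieves both local and already-deployed functions; the options-object call form and the function names are assumptions, not confirmed SDK behavior:

import { getChatModel, getFunctions } from "@beamlit/sdk";
import { createReactAgent } from "@langchain/langgraph/prebuilt";

// Retrieve local functions from src/functions/ plus already-deployed Beamlit
// functions (the options-object form and the names here are assumptions)
const functions = await getFunctions({
  remoteFunctions: ["my-deployed-function"],
});

const chat = await getChatModel("model-name-on-beamlit");
const customAgent = createReactAgent({ llm: chat, tools: functions });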

Deploy an agent

Learn how to deploy your custom AI agents on Beamlit as a serverless endpoint.