The `Composio` class is the core entry point of the SDK: it initializes the client and holds the global configuration, including the provider that formats tools for your AI framework. The example below initializes Composio with the Vercel provider, creates a tool router session for a user, fetches the session's tools (formatted for the Vercel AI SDK), and streams a response with `streamText`. The model id and the step callback body shown are representative.

```typescript
import { anthropic } from "@ai-sdk/anthropic";
import { Composio } from "@composio/core";
import { VercelProvider } from "@composio/vercel";
import { stepCountIs, streamText } from "ai";

// Initialize Composio with the Vercel provider (API key from env var COMPOSIO_API_KEY)
const composio = new Composio({
  provider: new VercelProvider(),
});

// Unique identifier of the user
const userId = "user_123";

// Create a tool router session and get native tools for the user,
// formatted for the Vercel AI SDK (requires a provider in the Composio constructor)
const session = await composio.create(userId);
const tools = await session.tools();

console.log("Fetching GitHub issues from the Composio repository...");

// Stream the response with tool calling
const stream = await streamText({
  // Any Claude model supported by @ai-sdk/anthropic works here
  model: anthropic("claude-sonnet-4-5"),
  tools,
  prompt:
    "Fetch all the open GitHub issues on the composio repository and group them by bugs/features/docs.",
  stopWhen: stepCountIs(10),
  onStepFinish: ({ toolCalls }) => {
    // Log the tool calls made during each step
    if (toolCalls.length > 0) {
      console.log("Tools called:", toolCalls.map((call) => call.toolName).join(", "));
    }
  },
});

// Print the generated text deltas as they arrive
for await (const chunk of stream.textStream) {
  process.stdout.write(chunk);
}

console.log("\n\n---");
console.log("Tip: If prompted to authenticate, complete the auth flow and run again.");
```
You can also connect to the tool router over MCP instead of using native tools. Here Composio is initialized with its default provider, a tool router session is created for the user, and an MCP client (the Vercel AI SDK's experimental MCP client) is pointed at the session URL; the client's tools are then passed to `streamText`. The model, prompt, transport configuration, and log messages below are representative and mirror the native-tools example above.

```typescript
import { openai } from "@ai-sdk/openai";
import { Composio } from "@composio/core";
import {
  experimental_createMCPClient as createMCPClient,
  stepCountIs,
  streamText,
} from "ai";

// Initialize Composio (API key from env var COMPOSIO_API_KEY or pass explicitly: { apiKey: "your-key" })
const composio = new Composio();

// Unique identifier of the user
const userId = "user_123";

// Create a tool router session for the user; the session exposes an MCP server URL
const { url } = await composio.create(userId);

// Create an MCP client to connect to the Composio tool router
// (an SSE transport pointed at the session URL is one option)
const client = await createMCPClient({
  transport: {
    type: "sse",
    url,
  },
});

// Fetch the session's tools over MCP
const tools = await client.tools();

console.log("Fetching GitHub issues from the Composio repository...");

// Stream the response with tool calling
const stream = await streamText({
  // Any AI SDK model works here; gpt-5 is used for illustration
  model: openai("gpt-5"),
  tools,
  prompt:
    "Fetch all the open GitHub issues on the composio repository and group them by bugs/features/docs.",
  stopWhen: stepCountIs(10),
  onStepFinish: ({ toolCalls }) => {
    if (toolCalls.length > 0) {
      console.log("Tools called:", toolCalls.map((call) => call.toolName).join(", "));
    }
  },
});

for await (const chunk of stream.textStream) {
  process.stdout.write(chunk);
}

console.log("\n\n---");
console.log("Tip: If prompted to authenticate, complete the auth flow and run again.");
```
Tools can also be fetched individually, outside a tool router session. The example below assumes an auth config for Gmail and a connected account already exist for the external user id; it fetches a Gmail tool by slug, passes it to `generateText`, and asks the model to send an email. The tool slug shown is representative.

```typescript
import { openai } from "@ai-sdk/openai";
import { Composio } from "@composio/core";
import { VercelProvider } from "@composio/vercel";
import { generateText } from "ai";

const composio = new Composio({
  provider: new VercelProvider(),
});

// create an auth config for gmail
// then create a connected account with an external user id that identifies the user
const externalUserId = "your-external-user-id";

// Fetch the Gmail tool for this user (GMAIL_SEND_EMAIL is a representative slug)
const tools = await composio.tools.get(externalUserId, "GMAIL_SEND_EMAIL");

const { text } = await generateText({
  model: openai("gpt-5"),
  messages: [
    {
      role: "user",
      content: `Send an email to soham.g@composio.dev with the subject 'Hello from composio' and the body 'Congratulations on sending your first email using AI Agents and Composio!'`,
    },
  ],
  tools,
});

console.log("Email sent successfully!", { text });
```