
OpenAssistant


OpenAssistant is a JavaScript library for building AI assistants with powerful tools and an interactive React chat component.

Why OpenAssistant?

OpenAssistant is built on the Vercel AI SDK and provides:

  • a uniform interface for different AI providers
  • a set of powerful LLM tools (data analysis, visualization, mapping, etc.)
  • an interactive React chat component (optional)

for building your own AI assistant. Its design also allows you to easily create your own tools by:

  • providing your own context (e.g. data, callbacks) for the tool execution
  • providing your own UI component for rendering the tool result
  • passing the result from the tool execution to the tool UI component or to the next tool execution
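The three points above can be sketched as a plain object. Note that the field names below (`description`, `context`, `execute`, `component`) are illustrative assumptions modeled on common LLM-tool conventions, not the exact OpenAssistant API:

```typescript
// Illustrative shape of a custom tool; field names are assumptions, not the
// exact OpenAssistant API.
type ToolContext = { getGreeting: (name: string) => string };

const greetTool = {
  description: 'Greet a user by name',
  // your own context: data and callbacks the tool needs at execution time
  context: {
    getGreeting: (name: string) => `Hello, ${name}!`,
  } as ToolContext,
  // execution: the result goes back to the LLM and can also feed a UI
  // component or the next tool call
  execute: async (args: { name: string }, context: ToolContext) => {
    return { text: context.getGreeting(args.name) };
  },
  // your own UI component for rendering the result (sketched here as a
  // function returning a string instead of a real React component)
  component: (result: { text: string }) => `<div>${result.text}</div>`,
};

greetTool.execute({ name: 'Ada' }, greetTool.context).then((r) =>
  console.log(greetTool.component(r))
);
```

The key design point is that `execute` receives the context separately from the LLM-generated arguments, so your data and callbacks never have to pass through the model.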

Check out the examples below to see OpenAssistant in action:

Getting Started

Installation

Install the core packages:

npm install @openassistant/core

Usage

Then you can use OpenAssistant in your application. For example:

import { createAssistant } from '@openassistant/core';

// get the singleton assistant instance
const assistant = await createAssistant({
  name: 'assistant',
  modelProvider: 'openai',
  model: 'gpt-4o',
  apiKey: 'your-api-key',
  version: '0.0.1',
  instructions: 'You are a helpful assistant',
  // functions: {},
  // abortController: null
});

// now you can send prompts to the assistant
await assistant.processTextMessage({
  textMessage: 'Hello, how are you?',
  streamMessageCallback: ({ isCompleted, message }) => {
    console.log(isCompleted, message);
  },
});

See the source code of the example 🔗 here.

tip

If you want to use Google Gemini as the model provider, you can do the following:

Install the Vercel AI SDK's Google Gemini provider:

npm install @ai-sdk/google

Then, update the assistant configuration to use Google Gemini.
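A hedged sketch of the switch, assuming `modelProvider: 'google'` is the accepted value and using a hypothetical Gemini model id (check the provider table below and your account for the exact model names supported):

```typescript
import { createAssistant } from '@openassistant/core';

// same configuration as above, pointing at Google Gemini instead of OpenAI
const assistant = await createAssistant({
  name: 'assistant',
  modelProvider: 'google', // requires @ai-sdk/google to be installed
  model: 'gemini-1.5-flash', // hypothetical model id; use one your account supports
  apiKey: 'your-google-api-key',
  version: '0.0.1',
  instructions: 'You are a helpful assistant',
});
```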

OpenAssistant also supports the following model providers:

| Model Provider | Models | Dependency |
| -------------- | ------ | ---------- |
| OpenAI | link | @ai-sdk/openai |
| Google | models | @ai-sdk/google |
| Anthropic | models | @ai-sdk/anthropic |
| DeepSeek | models | @ai-sdk/deepseek |
| xAI | models | @ai-sdk/xai |
| Ollama | models | ollama-ai-provider |

Add a React Chat Component to your App

You can build your own chat interface on top of the assistant instance created by createAssistant(). OpenAssistant also provides a pre-built chat component that you can use in your React application.

Installation

npm install @openassistant/ui

Usage

import { AiAssistant } from '@openassistant/ui';
// for React app without tailwindcss, you can import the css file
// import '@openassistant/ui/dist/index.css';

function App() {
  return (
    <AiAssistant
      modelProvider="openai"
      model="gpt-4"
      apiKey="your-api-key"
      version="v1"
      welcomeMessage="Hello! How can I help you today?"
      instructions="You are a helpful assistant."
      functions={{}}
      theme="dark"
      useMarkdown={true}
      showTools={true}
      onMessageUpdated={({ messages }) => {
        console.log(messages);
      }}
    />
  );
}

See the source code of the example 🔗 here.

tip

If you are using Tailwind CSS, add the following configuration to your tailwind.config.js file:

import { nextui } from '@nextui-org/react';
...

module.exports = {
  content: [
    ...,
    './node_modules/@nextui-org/theme/dist/**/*.{js,ts,jsx,tsx}',
    './node_modules/@openassistant/ui/dist/**/*.{js,ts,jsx,tsx}',
  ],
  theme: {
    extend: {},
  },
  darkMode: 'class',
  plugins: [nextui()],
};

See the source code of the example 🔗 here.

Use Tools

OpenAssistant provides a set of tools that help you build your AI application.

For a quick example:

localQuery in @openassistant/duckdb

This tool lets the LLM query any data loaded in your application based on the user's prompt.

  • the data in your application is loaded into a local DuckDB instance temporarily
  • the LLM generates a SQL query against the data based on the user's prompt
  • the SQL query is executed in the local DuckDB instance
  • the query result is displayed in a React table component

In your application, the data could be loaded from a CSV/JSON/Parquet/XML file. For this example, we use the SAMPLE_DATASETS in dataset.ts to simulate the data.

export const SAMPLE_DATASETS = {
  myVenues: [
    {
      index: 0,
      location: 'New York',
      latitude: 40.7128,
      longitude: -74.006,
      revenue: 12500000,
      population: 8400000,
    },
    ...
  ],
};
  • Import the localQuery tool from @openassistant/duckdb and use it in your application.
  • Provide the getValues function in the context to get the values from your data.
  • Use the tool in your AI assistant chat component.
import { localQuery, LocalQueryTool } from '@openassistant/duckdb';

// load your data
// pass the metadata of the data to the assistant instructions
const instructions = `You are a helpful assistant. You can use the following datasets to answer the user's question:
datasetName: myVenues,
variables: index, location, latitude, longitude, revenue, population
`;

// use `LocalQueryTool` for type safety
const localQueryTool: LocalQueryTool = {
  ...localQuery,
  context: {
    ...localQuery.context,
    getValues: (datasetName: string, variableName: string) => {
      // each dataset is an array of rows, so extract the requested column
      return SAMPLE_DATASETS[datasetName].map((row) => row[variableName]);
    },
  },
};

// use the tool in the chat component
<AiAssistant
  modelProvider="openai"
  model="gpt-4o"
  apiKey="your-api-key"
  version="0.0.1"
  welcomeMessage="Hello! How can I help you today?"
  instructions={instructions}
  functions={{ localQuery: localQueryTool }}
/>
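The getValues callback above is the only bridge between your data and the tool: given a dataset name and a column name, it should return that column as a flat array. A minimal standalone sketch of that contract (the second row and the exact return shape are assumptions for illustration, not part of the library API):

```typescript
// Standalone sketch of the getValues contract; names mirror the example
// above, and the second row is invented purely to make the sketch runnable.
const SAMPLE_DATASETS: Record<string, Record<string, unknown>[]> = {
  myVenues: [
    { index: 0, location: 'New York', revenue: 12500000 },
    { index: 1, location: 'Chicago', revenue: 9800000 },
  ],
};

// Returns one column of a dataset as a flat array, which the localQuery
// tool can load into its temporary DuckDB table.
function getValues(datasetName: string, variableName: string): unknown[] {
  return SAMPLE_DATASETS[datasetName].map((row) => row[variableName]);
}

console.log(getValues('myVenues', 'revenue')); // → [12500000, 9800000]
```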

🚀 Try it out!

See the source code of the example 🔗 here.

🌟 Features

  • 🤖 One interface for multiple AI providers
    • DeepSeek (Chat and Reasoner)
    • OpenAI (GPT models)
    • Google Gemini
    • Ollama (local AI models)
    • xAI Grok
    • Anthropic Claude
    • AWS Bedrock*
    • Azure OpenAI*

* via server API only, see how-to documentation here

  • 🌟 Easy-to-use Tools to extend your AI assistant

    • DuckDB: query data in the browser using DuckDB via prompt
    • ECharts: visualize data using ECharts via prompt
    • KeplerGl: create maps using Kepler.gl via prompt
    • GeoDa: apply spatial data analysis using GeoDa WASM via prompt
  • 🎯 Built-in React chat component

    • Pre-built chat interface
    • Pre-built LLM configuration interface
    • Theme support
    • Take screenshot to ask [Demo]
    • Talk to ask [Demo]
    • Function calling support [Demo]

See the tutorial for more details.

  • 📦 Easy integration
    • CLI tool for adding components
    • TypeScript support
    • Tailwind CSS integration

🎯 Examples

Check out our example projects:

📄 License

MIT © Xun Li