
This guide walks you through setting up OpenBunny in fully local mode with SQLite. By the end, you'll have a working task-extraction system connected to at least one messaging channel.

Prerequisites

  • Node.js 22 or later
  • An OpenRouter API key
  • Credentials for at least one messaging channel (Slack, WhatsApp, Telegram, or Gmail)

Setup

1. Clone the repository

   git clone https://github.com/openbunny/openbunny.git
   cd openbunny
2. Install dependencies

   cd service
   npm install
3. Configure environment

   Copy the example environment file and add your API key:

   cp .env.example .env

   Then set the following values in .env:

   BACKEND=local
   SQLITE_PATH=./data/openbunny.db
   OPENROUTER_API_KEY=sk-or-v1-your-key-here
   OPENROUTER_MODEL=anthropic/claude-sonnet-4
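To make the role of each variable concrete, here is a small TypeScript sketch of how a service might read and validate this configuration. It is illustrative only, with assumed defaults; the function name loadConfig and the Config shape are not part of OpenBunny's actual code.

```typescript
// Illustrative config loader; variable names match the .env above,
// but the defaults and the Config shape are assumptions.
interface Config {
  backend: string;          // BACKEND: "local" selects SQLite mode
  sqlitePath: string;       // SQLITE_PATH: where the SQLite file lives
  openRouterApiKey: string; // OPENROUTER_API_KEY: required, no sensible default
  openRouterModel: string;  // OPENROUTER_MODEL: LLM used for task extraction
}

function loadConfig(env: Record<string, string | undefined>): Config {
  const apiKey = env.OPENROUTER_API_KEY;
  if (!apiKey) {
    // Fail fast: without an API key the LLM extraction step cannot run.
    throw new Error("OPENROUTER_API_KEY is required");
  }
  return {
    backend: env.BACKEND ?? "local",
    sqlitePath: env.SQLITE_PATH ?? "./data/openbunny.db",
    openRouterApiKey: apiKey,
    openRouterModel: env.OPENROUTER_MODEL ?? "anthropic/claude-sonnet-4",
  };
}
```

Failing fast on a missing API key is the useful pattern here: a misconfigured key surfaces at startup rather than on the first extraction attempt.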
4. Connect a channel

   Run the interactive setup to configure your first channel:

   npm run setup

   This walks you through connecting Slack, WhatsApp, Telegram, or Gmail. See the Channels docs for detailed setup instructions per channel.
5. Start the service

   npm run dev

   The service starts on http://localhost:3100 and begins listening to your configured channels immediately.
6. Start the client (optional)

   In a new terminal:

   cd ../client
   npm install
   npm run dev

   Open http://localhost:3000 to see the Kanban board UI.

Verify it’s working

Check the service health endpoint:

   curl http://localhost:3100/health

Then send a message in your connected channel that contains an actionable task. Within a few minutes (after the debounce window), OpenBunny will process the conversation and create a task.

View your tasks:

   curl http://localhost:3100/tasks | jq
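The debounce window mentioned above is why tasks do not appear instantly: messages arriving close together are grouped into one conversation batch before a single extraction pass runs. The sketch below illustrates the idea in TypeScript; it is a simplified model, not OpenBunny's actual implementation, and the real window length and grouping rules may differ.

```typescript
// Simplified debounce-window batching: timestamps (ms) that arrive within
// windowMs of the previous message join the same batch; a longer gap
// closes the batch and starts a new one.
function batchByDebounce(timestamps: number[], windowMs: number): number[][] {
  const batches: number[][] = [];
  let current: number[] = [];
  for (const t of [...timestamps].sort((a, b) => a - b)) {
    if (current.length > 0 && t - current[current.length - 1] > windowMs) {
      batches.push(current); // gap exceeded the window: close this batch
      current = [];
    }
    current.push(t);
  }
  if (current.length > 0) batches.push(current);
  return batches;
}

// With a hypothetical 60-second window, three rapid messages form one batch
// and a message two minutes later starts a second one:
// batchByDebounce([0, 1000, 2000, 120000], 60_000)
//   → [[0, 1000, 2000], [120000]]
```

Batching like this trades latency for quality: one extraction call sees the whole exchange instead of fragments, which is why a brief wait before the task appears is expected behavior.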

Next steps

  • Architecture: understand how the components fit together
  • Configuration: fine-tune the service behavior
  • Deploy to cloud: set up multi-tenant cloud deployment
  • LLM agent: learn how the agent processes conversations