A minimalistic Telegram assistant with customizable LLM API providers. Installation is simple: download the binary for your architecture and run it. On first start it creates an SQLite3 database file, runs migrations, and starts the bot.
Built with Rust and SQLite3.
Go to BotFather and enter `/newbot`. Fill in the description and save the token to the `TELEGRAM_TOKEN` environment variable. To define the commands for autocomplete, enter `/setcommands`, select your bot, and then paste:
new - Clear the current context and start a new chat.
get_behavior - Display the current system message that defines the bot's behavior.
set_behavior - Set a new system message defining the bot's behavior.
get_model - Get the current completion model.
set_model - Set the completion model for your bot.
version - Display the current version.
Make sure you have Docker & Docker Compose. On desktop, you can use Docker Desktop or OrbStack.
The Docker Compose file expects your environment variables to be loaded:
cp .env.example .envrc
# edit .envrc
source .envrc
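For reference, a minimal `.envrc` might look like the following. Only `TELEGRAM_TOKEN` is named in this guide; the provider variables below are hypothetical placeholders, so check `.env.example` for the actual names:

```shell
# Minimal .envrc sketch. Only TELEGRAM_TOKEN is documented above;
# the LLM provider variable names below are hypothetical placeholders.
export TELEGRAM_TOKEN="123456789:replace-with-your-botfather-token"
export LLM_API_KEY="replace-with-your-provider-key"     # hypothetical name
export LLM_API_BASE_URL="https://api.example.com/v1"    # hypothetical name
```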
Build the image and run the container:
docker build -t telegram-llm-assistant .
docker-compose up
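If you need to adapt the Compose setup, here is a sketch of what such a `docker-compose.yml` might contain. The service name, volume path, and variable list are assumptions, not the repository's actual file; the key point is that variables are passed through from your shell environment:

```yaml
# Hypothetical docker-compose.yml sketch (not the repository's actual file).
services:
  bot:
    image: telegram-llm-assistant     # built above with `docker build`
    environment:
      - TELEGRAM_TOKEN=${TELEGRAM_TOKEN}   # passed through from your shell
    volumes:
      - ./data:/data                  # persist the SQLite database file
```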
- Install Rust via RustUp
- Copy the example environment file: `cp .env.example .envrc`
- Edit `.envrc` to set the environment variables
- Load the environment variables from `.envrc` using direnv, or `source .envrc` (or the equivalent in your shell)
- Now you can compile with `cargo build`
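Before building and running, it can help to confirm the environment actually loaded. A small illustrative check (only `TELEGRAM_TOKEN` is named in this guide):

```shell
# Illustrative sanity check: fails fast when .envrc was not loaded.
check_env() {
  [ -n "${TELEGRAM_TOKEN:-}" ] || { echo "TELEGRAM_TOKEN is not set" >&2; return 1; }
  echo "environment looks OK"
}
TELEGRAM_TOKEN="dummy-token" check_env   # prints "environment looks OK"
```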
- Install Rust via RustUp
- Clone the repository
- Run `cargo build --release`
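Cargo places the optimized build in its default output directory. A sketch of running it (the binary name here is an assumption; check the `name` under `[[bin]]` or `[package]` in `Cargo.toml`):

```shell
# Cargo puts release builds under target/release/.
# "telegram-llm-assistant" is a hypothetical binary name.
BIN=./target/release/telegram-llm-assistant
if [ -x "$BIN" ]; then
  "$BIN"   # first start creates the SQLite file and runs migrations
else
  echo "build it first: cargo build --release"
fi
```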