Short-term memory. Long-term friendship.
RediBuddy is a lightweight, memory-backed AI assistant built with Redis and OpenAI. It mimics ChatGPT's multi-chat interface, using Redis as a short-term memory store so users can have quick, ephemeral conversations.
- Interactive chatbot interface
- Leverages Redis to provide super-fast conversational context to the LLM (a minimal sketch of this pattern follows this list).
- Conversations are tied to a unique session name and cleared at the end of the session.
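RediBuddy's own modules aren't reproduced in this README, but the pattern the bullets above describe — one Redis list of messages per session, cleared when the session ends — can be sketched roughly like this. The key scheme, the one-hour TTL, and the use of the `node-redis` client are illustrative assumptions, not the app's actual code:

```js
// Sketch of per-session short-term memory in Redis (illustrative assumptions, not RediBuddy's actual code).
import { createClient } from "redis";

const redis = createClient({ url: process.env.REDIS_URL });
await redis.connect();

// Hypothetical key scheme: one Redis list per session name.
const memoryKey = (session) => `chat:${session}:messages`;

// Append a message and keep the list ephemeral with a TTL (one hour here, purely an assumption).
async function remember(session, role, content) {
  const key = memoryKey(session);
  await redis.rPush(key, JSON.stringify({ role, content }));
  await redis.expire(key, 60 * 60);
}

// Read the whole conversation back to hand to the LLM as context.
async function recall(session) {
  const entries = await redis.lRange(memoryKey(session), 0, -1);
  return entries.map((entry) => JSON.parse(entry));
}

// Clearing a conversation at the end of a session is then a single delete.
async function forget(session) {
  await redis.del(memoryKey(session));
}
```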
Built with:

- Node.js + Express (Backend API)
- Redis (Conversation store)
- OpenAI API (GPT-4o-mini)
- HTML + CSS + Vanilla JS (Frontend)
To run RediBuddy, set up the following:

- OpenAI API Key: create an account and generate an API key.

- Clone this repository:

  ```bash
  git clone https://github.com/booleanhunter/redibuddy.git
  cd redibuddy
  ```

- Configure environment variables: create a `.env` file at the root:

  ```
  OPENAI_API_KEY=your_openai_api_key
  REDIS_URL=your_redis_connection_string
  SERVER_PORT=3000
  ```

  📝 Make sure to replace these placeholders with your real values before running the app.
- Node.js (v18 or higher): Download & Install Node.js.

- Redis. You can either:

  - Install Redis locally (see the Redis installation guide), or

  - Use Docker:

    ```bash
    docker pull redis:8.0
    docker run --name redibuddy-redis -p 6379:6379 redis:8.0
    ```

  - Or use Redis Cloud (no installation required).
Then, to set up and run the app manually:

- Install dependencies: `cd` into the project root and run:

  ```bash
  npm install
  ```

- Configure environment variables: replace the values in your `.env` file with your actual API key, Redis connection string, and desired server port (a quick way to sanity-check these values is sketched after these steps).

- Start the server:

  ```bash
  npm start
  ```
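Before starting the server (or if startup fails), you can sanity-check your configuration with a small throwaway script like the one below. It is not part of the repository; it simply loads the same `.env` values and pings Redis, assuming the `dotenv` and `redis` packages are available:

```js
// check-config.js — a hypothetical throwaway script, not part of RediBuddy itself.
import "dotenv/config"; // loads OPENAI_API_KEY, REDIS_URL, SERVER_PORT from .env
import { createClient } from "redis";

const { OPENAI_API_KEY, REDIS_URL, SERVER_PORT } = process.env;

if (!OPENAI_API_KEY) console.warn("OPENAI_API_KEY is missing from .env");
if (!REDIS_URL) console.warn("REDIS_URL is missing from .env");

// Verify the Redis connection string actually reaches a server.
const redis = createClient({ url: REDIS_URL });
await redis.connect();
console.log("Redis says:", await redis.ping()); // expect "PONG"
await redis.quit();

console.log(`Server will listen on port ${SERVER_PORT ?? 3000}`);
```

Run it as an ES module with `node check-config.js`; if it prints PONG, your REDIS_URL is reachable.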
Alternatively, you can skip manual setup and run everything using Docker (requires Docker to be set up):

```bash
docker compose up
```

To stop the containers later:

```bash
docker compose down -v
```
Visit http://localhost:3000 in your browser (or use the port specified in `.env`).
Features at a glance:

- 🔁 Multiple chats per user (like tabs)
- 🧠 Short-term memory support using Redis
- 🤖 Powered by GPT-4o-mini (a sketch of how a single chat turn could combine these is shown after this list)
- 🧩 Clean modular JS architecture
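To make the pieces above concrete, here is a rough sketch of what one chat turn could look like on the backend: pull the chat's history from Redis, send it to GPT-4o-mini, and store the exchange back. The route path, key scheme, and helper names are hypothetical — RediBuddy's real implementation lives in the repository's source — and this illustration assumes the `express`, `redis`, and `openai` packages:

```js
// Illustrative sketch only — not RediBuddy's actual code. Assumes express, redis, and openai are installed.
import express from "express";
import { createClient } from "redis";
import OpenAI from "openai";

const app = express();
app.use(express.json());

const redis = createClient({ url: process.env.REDIS_URL });
await redis.connect();

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Hypothetical endpoint: one Redis list per chat backs each "tab".
app.post("/api/chats/:chatId/messages", async (req, res) => {
  const key = `chat:${req.params.chatId}:messages`;

  // 1. Recall the chat's short-term memory from Redis.
  const history = (await redis.lRange(key, 0, -1)).map((m) => JSON.parse(m));

  // 2. Ask GPT-4o-mini for a reply, with the prior messages as context.
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [...history, { role: "user", content: req.body.message }],
  });
  const reply = completion.choices[0].message.content;

  // 3. Remember both sides of the exchange for the next turn.
  await redis.rPush(key, JSON.stringify({ role: "user", content: req.body.message }));
  await redis.rPush(key, JSON.stringify({ role: "assistant", content: reply }));

  res.json({ reply });
});

app.listen(process.env.SERVER_PORT ?? 3000);
```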
If you find a bug or have a feature request, open an issue.