A GPT-powered chatbot that suggests games you'll love
```shell
python -m venv env
source env/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
```
On Windows, instead of using the `source` command above to activate the virtual environment, use `env\Scripts\activate.bat` (cmd) or `env\Scripts\Activate.ps1` (PowerShell):

```shell
env\Scripts\activate.bat
```
Install Docker using the repository method described in Docker's official documentation.
Prerequisites:
- Windows 11 or Windows 10 version 2004 or higher to support WSL2: How to Install Linux on Windows with WSL
- VS Code with WSL, Remote Development, and Python extensions
Optional:
Mind the End of Your Line (link): GitHub Desktop for Windows may change end-of-line characters from `LF` to `CRLF` to match Windows defaults. Scripts in this repo must use `LF` to run correctly in Docker; in particular, `run.sh` will fail with `^M` errors if `CRLF` eol characters are used. In the file `C:\Users\[username]\.gitconfig`, under the section labeled `[core]`, add `autocrlf = false` to preserve line endings in the repo. Install the VS Code extension eol for a visual indicator of `LF` and `CRLF` in the editor. A `.gitattributes` file in this repo prevents `CRLF` eol characters from being checked into `.sh` files.
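The two settings described above can be sketched as follows (the `.gitattributes` pattern is illustrative; check the file in this repo for the actual rules):

```
# C:\Users\[username]\.gitconfig
[core]
    autocrlf = false
```

```
# .gitattributes — force LF line endings for shell scripts
*.sh text eol=lf
```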
- From a PowerShell prompt, run `wsl --install` and restart Windows.
- Install Ubuntu from the Microsoft Store and set up a root user (a prompt should launch automatically after installing Ubuntu, or launch Ubuntu manually from Windows Terminal).
- Clone the repository in the Linux filesystem for improved performance, either in an Ubuntu shell or with the Windows GitHub Desktop app referencing the folder `\\wsl.localhost\Ubuntu\<path to repo>`.
- Launch VS Code from the repository location in WSL with `code .` and observe that VS Code is connected to WSL via the green status bar icon at lower-right: `>< WSL: Ubuntu`.
- Or: Launch VS Code from Windows, choose "Connect to WSL" in the command palette (Ctrl+Shift+P). Then open the repo folder in VS Code Explorer.
- Ensure python, pip, and venv are installed in Ubuntu. In the VS Code Terminal or Windows Terminal (Ubuntu):

```shell
sudo apt install python3 python3-pip python3.10-venv
```

- Note: Python 3.10 was the default version installed in Ubuntu 22.04 LTS as of this writing (9/4/2023).
- In the VS Code command palette, run `Python: Create Environment`. This will create a new venv, in the `.venv` folder by default.
- After VS Code creates the environment, navigate to the VS Code Terminal window and install dependencies. Note that the terminal prompt is prefixed by `(.venv)`, indicating you are working in the virtual environment created by VS Code:

```shell
pip install --upgrade pip
pip install -r requirements.txt
```

- VS Code setup is complete; main.py can be run in the debugger with F5.
```shell
cp config.template.json config.json
```
Replace values from the template with your actual values:
- YouTube Data API v3 Key
- channel ID(s) (if desired)
- OpenAI API Key
- [Optional] LangSmith API Key (https://smith.langchain.com/settings)
Note: `channel_id` can be a single string for one channel ID, or a list of strings if you want to extract video transcripts from multiple channels.
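A filled-in `config.json` might look like the sketch below. The exact key names come from `config.template.json` in this repo; the names shown here are illustrative assumptions, except `channel_id`, which the note above describes:

```
{
  "youtube_api_key": "YOUR_YOUTUBE_DATA_API_V3_KEY",
  "channel_id": ["UCxxxxxxxxxxxxxxxx", "UCyyyyyyyyyyyyyyyy"],
  "openai_api_key": "YOUR_OPENAI_API_KEY"
}
```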
- Create file: `.streamlit/secrets.toml`
- Add your OpenAI API key to secrets.toml:

```toml
OPENAI_API_KEY = "your_api_key"
```
Run `python main.py -h` for more info and options.
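The modes and flags described below can be pictured as an argparse surface like the following. This is a hypothetical sketch for orientation only; the actual flag handling lives in main.py and may differ:

```python
# Hypothetical sketch of the CLI described in this README; the real
# parser in main.py may define different names, defaults, or modes.
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Game recommendation chatbot")
    parser.add_argument(
        "-m", "--mode",
        choices=["end-to-end", "extract-transcipts", "create-embeddings",
                 "summary-demo", "chat-demo"],
        default="end-to-end",
        help="pipeline stage to run",
    )
    parser.add_argument(
        "-tf", "--transcript-file",
        default="transcripts.json",
        help="JSON file to read/write transcripts",
    )
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args(["-m", "create-embeddings"])
    print(args.mode, args.transcript_file)
```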
```shell
python main.py -m end-to-end
```

This extracts transcripts to a `.json` file, creates and saves FAISS embeddings to the `faiss_index` directory, then starts an interactive chat session with the extracted documents.
```shell
python main.py -m extract-transcipts
```

Extracts transcripts to a `.json` file; defaults to `transcripts.json`, specify with the `-tf` argument.
```shell
python main.py -m create-embeddings
```

A `.json` file with extracted transcripts must exist to create embeddings; specify with `-tf`, defaults to `transcripts.json`.
Get an LLM-generated summary of one random video.

```shell
python main.py -m summary-demo
```

A `.json` file with extracted transcripts must exist to run the summary demo; specify with `-tf`, defaults to `transcripts.json`.
```shell
python main.py -m chat-demo
```

FAISS embeddings must have already been created and placed in the `faiss_index` directory.
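Conceptually, the `faiss_index` lookup finds the stored transcript chunks whose embedding vectors are closest to the query's embedding. The toy sketch below illustrates that idea with stdlib cosine similarity; in this repo the real embeddings come from OpenAI and the index is built by FAISS, and the vectors and chunk names here are made up:

```python
# Toy illustration of embedding similarity search -- the operation the
# FAISS index performs (at far greater scale and speed) during chat.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Pretend 3-d "embeddings" for three transcript chunks.
index = {
    "roguelike deck-builder review": [0.9, 0.1, 0.0],
    "cozy farming sim playthrough":  [0.1, 0.9, 0.1],
    "horror game jump scares":       [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # pretend embedding of "card-based dungeon crawler"

# Retrieve the chunk whose embedding is most similar to the query.
best = max(index, key=lambda name: cosine(query, index[name]))
print(best)  # the deck-builder chunk is the closest match
```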
```shell
streamlit run Chat.py
```
```shell
docker build -t gamerec:v1 .
docker run --rm -p 8880:8501 gamerec:v1
```

Navigate to http://localhost:8880 in your browser to view the application.
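The repo's actual Dockerfile may differ, but a minimal sketch consistent with the commands above would look like this (8501 is Streamlit's default port, matched by the `-p 8880:8501` mapping):

```
FROM python:3.10-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8501
CMD ["streamlit", "run", "Chat.py", "--server.port=8501", "--server.address=0.0.0.0"]
```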