This repository provides a quickstart guide for using the Agent Catalog with Capella Model Services and Couchbase.
- Python 3.8+
- Poetry (Installation Guide)
- pip (Python package installer)
- Git (for repository management)
- An OpenAI API Key (or other LLM provider)
- Couchbase Capella account (or local Couchbase installation)
The fastest way to get started is using our automated setup script:
```bash
# Clone the repository
git clone https://github.com/couchbaselabs/agent-catalog-quickstart.git
cd agent-catalog-quickstart

# Make the setup script executable
chmod +x scripts/setup.sh

# Run the automated setup script
bash scripts/setup.sh
```
This script will:
- Install all Agent Catalog libraries and dependencies
- Run `poetry install` at the root and in all example directories
- Install required LlamaIndex packages
- Create template environment files for all examples
- Set up git for clean repository state
- Verify the installation
- Provide next steps
If you prefer to install manually or need to troubleshoot:
Install all libraries in the correct dependency order:
```bash
# Install core library first
pip install -e agent-catalog/libs/agentc_core

# Install integrations
pip install -e agent-catalog/libs/agentc_integrations/langchain \
    -e agent-catalog/libs/agentc_integrations/langgraph \
    -e agent-catalog/libs/agentc_integrations/llamaindex

# Install CLI and testing
pip install -e agent-catalog/libs/agentc_cli \
    -e agent-catalog/libs/agentc_testing

# Install main package
pip install -e agent-catalog/libs/agentc

# Install LlamaIndex dependencies
pip install llama-index llama-index-vector-stores-couchbase
```
Important: You must run `poetry install` in multiple locations:
```bash
# Install root dependencies
poetry install

# Install dependencies for each example agent
cd notebooks/flight_search_agent_langraph && poetry install && cd ../..
cd notebooks/hotel_search_agent_langchain && poetry install && cd ../..
cd notebooks/landmark_search_agent_llamaindex && poetry install && cd ../..
```
Verify the installation:

```bash
agentc --help
```
Create a `.env` file in each example directory with the following configuration:
For Couchbase Capella:

```
# OpenAI API Configuration
OPENAI_API_KEY="your-openai-api-key"

# Couchbase Configuration
CB_CONN_STRING="couchbases://your-cluster.cloud.couchbase.com"
CB_USERNAME="your-username"
CB_PASSWORD="your-password"
CB_BUCKET="vector-search-testing"
CB_SCOPE="agentc_data"
CB_COLLECTION="hotel_data"
CB_INDEX="hotel_data_index"

# Capella API Configuration
CAPELLA_API_ENDPOINT="https://your-endpoint.ai.cloud.couchbase.com"
CAPELLA_API_EMBEDDING_MODEL="intfloat/e5-mistral-7b-instruct"
CAPELLA_API_LLM_MODEL="meta-llama/Llama-3.1-8B-Instruct"

# Agent Catalog Configuration
AGENT_CATALOG_CONN_STRING="couchbase://127.0.0.1"
AGENT_CATALOG_BUCKET="vector-search-testing"
AGENT_CATALOG_USERNAME="your-username"
AGENT_CATALOG_PASSWORD="your-password"
AGENT_CATALOG_CONN_ROOT_CERTIFICATE=""

# Environment variable to prevent tokenizer warnings
TOKENIZERS_PARALLELISM=false
```
For a local Couchbase installation:

```
# OpenAI API Configuration
OPENAI_API_KEY="your-openai-api-key"

# Couchbase Configuration
CB_CONN_STRING="couchbase://127.0.0.1"
CB_USERNAME="Administrator"
CB_PASSWORD="password"
CB_BUCKET="default"
CB_SCOPE="_default"
CB_COLLECTION="_default"
CB_INDEX="vector_index"

# Agent Catalog Configuration
AGENT_CATALOG_CONN_STRING="couchbase://127.0.0.1"
AGENT_CATALOG_BUCKET="default"
AGENT_CATALOG_USERNAME="Administrator"
AGENT_CATALOG_PASSWORD="password"
AGENT_CATALOG_CONN_ROOT_CERTIFICATE=""

# Environment variable to prevent tokenizer warnings
TOKENIZERS_PARALLELISM=false
```
Important: Each example directory needs its own `.env` file:

- `notebooks/flight_search_agent_langraph/.env`
- `notebooks/hotel_search_agent_langchain/.env`
- `notebooks/landmark_search_agent_llamaindex/.env`
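Before running an example, it can help to sanity-check that its `.env` defines everything it needs. A minimal sketch of such a check, where the required-key set is an assumption drawn from the templates above (adjust it for your deployment):

```python
# Keys every example's .env is expected to define (a subset chosen from the
# templates above; this list is an assumption, not exhaustive).
REQUIRED_KEYS = {
    "OPENAI_API_KEY",
    "CB_CONN_STRING",
    "AGENT_CATALOG_CONN_STRING",
    "AGENT_CATALOG_BUCKET",
}


def missing_keys(env_text):
    """Return the required keys absent from a .env file's contents."""
    present = set()
    for line in env_text.splitlines():
        line = line.strip()
        # Skip blank lines and comments; keep only KEY=VALUE pairs.
        if line and not line.startswith("#") and "=" in line:
            present.add(line.split("=", 1)[0].strip())
    return REQUIRED_KEYS - present
```

Running `missing_keys(open("notebooks/hotel_search_agent_langchain/.env").read())` should return an empty set when the file is complete.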
Navigate to any example directory and initialize:
```bash
cd notebooks/hotel_search_agent_langchain
agentc init
agentc index .
```
Important: The git repository must be clean before publishing:
```bash
# Commit any changes first
git add .
git commit -m "Your commit message"

# Then publish
agentc publish
```
```bash
# Run the hotel search agent
python main.py

# Run with specific queries
python main.py "Find hotels in Paris with free breakfast"
```
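A common pattern for entry points that take an optional query argument, as above, is to fall back to a default when none is supplied. This is a hedged sketch of that pattern, not the actual `main.py` code; the default query and function name are illustrative:

```python
import sys

# Example query borrowed from the commands above; purely illustrative.
DEFAULT_QUERY = "Find hotels in Paris with free breakfast"


def get_query(argv):
    """Return the user query: the first CLI argument if given, else a default."""
    return argv[1] if len(argv) > 1 else DEFAULT_QUERY


if __name__ == "__main__":
    print(get_query(sys.argv))
```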
This quickstart includes three example agents:
- Flight Search Agent (`notebooks/flight_search_agent_langraph/`) - Searches and books flights using LangGraph
- Hotel Search Agent (`notebooks/hotel_search_agent_langchain/`) - Hotel search and support using LangChain
- Landmark Search Agent (`notebooks/landmark_search_agent_llamaindex/`) - Searches landmarks and attractions using LlamaIndex
Each example includes:
- Complete source code
- Configuration files
- Test cases
- Documentation
- Its own Poetry dependencies (requires `poetry install` in each directory)
| Command | Description |
|---|---|
| `agentc init` | Initialize agent catalog in current directory |
| `agentc index .` | Index the current agent directory |
| `agentc publish` | Publish agent to catalog (requires clean git status) |
| `agentc --help` | Show all available commands |
| `agentc env` | Show environment configuration |
- "command not found: agentc"
  - Run the setup script: `bash scripts/setup.sh`
  - Or install manually following the manual setup steps
- "No module named 'llama_index.vector_stores'"
  - Install LlamaIndex: `pip install llama-index llama-index-vector-stores-couchbase`
  - Run `poetry install` in the example directory
- "Could not find the environment variable $AGENT_CATALOG_CONN_STRING"
  - Ensure each example directory has its own `.env` file
  - Include `AGENT_CATALOG_CONN_ROOT_CERTIFICATE=""` in the `.env` file
- "Cannot publish a dirty catalog to the DB"
  - Commit all changes: `git add . && git commit -m "Your message"`
  - Ensure `git status` shows a clean repository before `agentc publish`
- Connection errors to Couchbase
  - Verify your `.env` configuration in each example directory
  - Check that your Couchbase cluster is accessible
  - Ensure proper credentials and connection strings
- "Certificate error" when connecting
  - For local installations, use `couchbase://127.0.0.1`
  - For Capella, ensure you're using the correct `couchbases://` connection string
  - Include `AGENT_CATALOG_CONN_ROOT_CERTIFICATE=""` in your `.env`
- Poetry dependency issues
  - Run `poetry install` in the root directory
  - Run `poetry install` in each example directory separately
  - Each example has its own `pyproject.toml` and requires separate installation
- Tokenizer parallelism warnings
  - Add `TOKENIZERS_PARALLELISM=false` to your `.env` files
- Poetry installed
- Agent Catalog libraries installed (`pip install -e ...`)
- LlamaIndex installed (`pip install llama-index llama-index-vector-stores-couchbase`)
- Root poetry dependencies installed (`poetry install` in root)
- Example poetry dependencies installed (`poetry install` in each example directory)
- `.env` files created in each example directory
- Environment variables configured with actual credentials
- Git repository in clean state (for publishing)
- Check the `docs/` directory for detailed guides
- Look at example implementations in `notebooks/`
- Review error messages for specific configuration issues
- Ensure you've run `poetry install` in all required directories
- Create a new directory under `notebooks/`
- Add your agent code, prompts, and tools
- Create appropriate configuration files (`pyproject.toml`, `.env`)
- Run `poetry install` in the new directory
- Run `agentc init` and `agentc index .`
```bash
cd notebooks/hotel_search_agent_langchain
python -m pytest tests/
```

Run evaluations with Arize:

```bash
python run_evaluations.py
```
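Agent evaluations typically compare an agent's answer against expected content. As a hedged illustration of the general idea (this is not the actual `run_evaluations.py` logic), a simple keyword-overlap check might look like:

```python
def contains_expected(answer, expected_terms):
    """True if the answer mentions every expected term (case-insensitive)."""
    low = answer.lower()
    return all(term.lower() in low for term in expected_terms)
```

A test case could then assert `contains_expected(agent_answer, ["Paris", "breakfast"])` for the hotel query shown earlier.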
Each example agent follows this structure:
```
notebooks/agent_name/
├── main.py           # Main agent implementation
├── pyproject.toml    # Poetry dependencies (requires poetry install)
├── .env              # Environment configuration
├── prompts/          # Agent prompts and templates
├── tools/            # Agent tools and functions
├── data/             # Data loading and processing
├── tests/            # Test cases
└── evals/            # Evaluation scripts
```
This is a quickstart repository. For contributing to the main Agent Catalog:
- Fork the repository
- Create a feature branch
- Make your changes
- Ensure all poetry dependencies are installed
- Commit changes (required for publishing)
- Submit a pull request
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.