
fix: resolve checkpointer error and improve coagents-starter docs #2

Open · wants to merge 1 commit into main
1 change: 1 addition & 0 deletions .gitignore
@@ -0,0 +1 @@
*.pckl
112 changes: 94 additions & 18 deletions README.md
@@ -4,6 +4,39 @@ This example contains a simple starter project which includes two different agen

**These instructions assume you are in the `coagents-starter/` directory**

## Quick Start (Python Agent)

1. **Setup the Python agent:**

```sh
cd agent-py
poetry install
echo "OPENAI_API_KEY=your_key_here" > .env
```

2. **Run the agent:**

```sh
langgraph dev --no-browser --port=8000 --config=langgraph.json --host=0.0.0.0
```

_If you encounter a "No checkpointer set" error:_

```sh
LANGGRAPH_API=true langgraph dev --no-browser --port=8000 --config=langgraph.json --host=0.0.0.0
```

3. **Setup and run the UI (in a new terminal):**

```sh
cd ui
pnpm i
echo "OPENAI_API_KEY=your_key_here" > .env
pnpm run dev
```

4. **Open [http://localhost:3000](http://localhost:3000)** - The UI is already configured to connect to the Python agent running on port 8000.

## Running the Agent

First, install the backend dependencies:
@@ -31,14 +64,25 @@ OPENAI_API_KEY=...
IMPORTANT:
Make sure the OpenAI API key you provide supports gpt-4o.

Then, run the demo:
### Running the Python Agent

You have two options for running the Python agent:

**Option 1: Using LangGraph Dev Server (Recommended)**

```sh
cd agent-py
LANGGRAPH_API=true langgraph dev --no-browser --port=8000 --config=langgraph.json --host=0.0.0.0
```

Python
**Option 2: Using Poetry (Local FastAPI)**

```sh
cd agent-py
poetry run demo
```

The agent code automatically detects which environment it's running in and handles checkpointer configuration accordingly.
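
For reference, here is a minimal sketch of what that detection can look like. It is not the project's exact code: the `LANGGRAPH_API` variable and the `MemorySaver` fallback come from the instructions in this README, and everything else is illustrative.

```python
# Illustrative sketch only: pick a checkpointer based on how the agent is run.
import os

from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph


def compile_graph(workflow: StateGraph):
    if os.getenv("LANGGRAPH_API", "").lower() == "true":
        # `langgraph dev` manages persistence itself, so no checkpointer is attached.
        return workflow.compile()
    # Local FastAPI (`poetry run demo`): fall back to an in-memory checkpointer.
    return workflow.compile(checkpointer=MemorySaver())
```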

## Running the UI

@@ -61,29 +105,50 @@ Then, create a `.env` file inside `./ui` with the following:
OPENAI_API_KEY=...
```

If you're using the **JS** agent, uncomment the code inside the `app/api/copilotkit/route.ts`, `remoteEndpoints` action:
## Frontend Configuration (`route.ts`)

The UI connects to your agent via the configuration in `ui/app/api/copilotkit/route.ts`. The right configuration depends on which agent you're running and how you're running it:

### For Python Agent

**If using LangGraph Dev Server (Option 1 - Recommended):**
The current `route.ts` is already configured correctly:

```ts
// Uncomment this if you want to use LangGraph JS, make sure to
// remove the remote action url below too.
//
// langGraphPlatformEndpoint({
// deploymentUrl: "http://localhost:8123",
// langsmithApiKey: process.env.LANGSMITH_API_KEY || "", // only used in LangGraph Platform deployments
// agents: [{
// name: 'sample_agent',
// description: 'A helpful LLM agent.'
// }]
// }),
langGraphPlatformEndpoint({
deploymentUrl: "http://localhost:8000", // matches langgraph dev port
langsmithApiKey: process.env.LANGSMITH_API_KEY || "",
agents: [{ name: "sample_agent", description: "A helpful LLM agent." }],
}),
```

Make sure to comment out the other remote endpoint as this replaces it.
**If using Poetry/FastAPI (Option 2):**
Comment out the `langGraphPlatformEndpoint` and uncomment the basic endpoint:

```ts
// langGraphPlatformEndpoint({ ... }), // Comment this out
{
url: process.env.REMOTE_ACTION_URL || "http://localhost:8000/copilotkit",
},
```

### For JS Agent

Change the `deploymentUrl` port and uncomment as shown:

```ts
langGraphPlatformEndpoint({
deploymentUrl: "http://localhost:8123", // JS agent runs on port 8123
langsmithApiKey: process.env.LANGSMITH_API_KEY || "",
agents: [{ name: 'sample_agent', description: 'A helpful LLM agent.' }]
}),
```

**Running the JS Agent:**

- Run this command to start your LangGraph server: `npx @langchain/langgraph-cli dev --host localhost --port 8123`
- Run this command to connect your Copilot Cloud Tunnel to the LangGraph server: `npx copilotkit@latest dev --port 8123`


## Usage

Navigate to [http://localhost:3000](http://localhost:3000).
@@ -98,5 +163,16 @@ Make sure to create the `.env` mentioned above first!

A few things to try if you are running into trouble:

1. Make sure there is no other local application server running on the 8000 port.
2. Under `/agent/greeter/demo.py`, change `0.0.0.0` to `127.0.0.1` or to `localhost`
1. **Port conflicts:** Make sure there is no other local application server running on port 8000.

2. **Network issues:** Under `/agent-py/sample_agent/demo.py`, change `0.0.0.0` to `127.0.0.1` or `localhost` (see the sketch after this list).

3. **"No checkpointer set" error:** This happens when the agent can't determine which environment it's running in. The agent code automatically detects this, but if you encounter this error:

- When using `langgraph dev`: Set `LANGGRAPH_API=true` before running: `LANGGRAPH_API=true langgraph dev ...`
- When using `poetry run demo`: The agent should automatically use MemorySaver

4. **Route configuration:** Make sure your `ui/app/api/copilotkit/route.ts` matches your agent setup:
- LangGraph dev server (port 8000) → Use `langGraphPlatformEndpoint`
- FastAPI demo (port 8000) → Use basic endpoint with `/copilotkit` path
- JS agent (port 8123) → Use `langGraphPlatformEndpoint` with port 8123
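
For troubleshooting item 2, the host setting usually lives in a `uvicorn.run(...)` call inside the demo entrypoint. A minimal sketch, assuming a FastAPI app named `app` in `sample_agent/demo.py` (the actual file may differ):

```python
# Hypothetical excerpt of the demo entrypoint, not the project's exact code.
import uvicorn


def main():
    uvicorn.run(
        "sample_agent.demo:app",  # assumed module path; verify against the real file
        host="127.0.0.1",         # change from "0.0.0.0" if you hit network issues
        port=8000,
    )
```
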
Binary file removed agent-py/.langgraph_api/.langgraph_checkpoint.1.pckl
Binary file not shown.
Binary file removed agent-py/.langgraph_api/.langgraph_checkpoint.2.pckl
Binary file not shown.
Binary file removed agent-py/.langgraph_api/.langgraph_ops.pckl
Binary file not shown.
Binary file not shown.
Binary file removed agent-py/.langgraph_api/store.pckl
Binary file not shown.
Binary file removed agent-py/.langgraph_api/store.vectors.pckl
Binary file not shown.