mcp_server/README.md
# Graphiti MCP Server

Graphiti is a framework for building and querying temporally-aware knowledge graphs, specifically tailored for AI agents operating in dynamic environments. Unlike traditional retrieval-augmented generation (RAG) methods, Graphiti continuously integrates user interactions, structured and unstructured enterprise data, and external information into a coherent, queryable graph. The framework supports incremental data updates, efficient retrieval, and precise historical queries without requiring complete graph recomputation, making it suitable for developing interactive, context-aware AI applications.
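The temporal aspect can be pictured with a small sketch (illustrative only; Graphiti's actual data model is richer): each fact carries a validity interval, so a point-in-time query filters by interval rather than recomputing the graph.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Fact:
    """An edge in a temporally-aware graph, valid from valid_at until invalid_at."""
    subject: str
    predicate: str
    obj: str
    valid_at: datetime
    invalid_at: Optional[datetime] = None  # None means the fact is still valid

def facts_as_of(facts: list[Fact], when: datetime) -> list[Fact]:
    """Historical query: return the facts that were valid at a given moment."""
    return [
        f for f in facts
        if f.valid_at <= when and (f.invalid_at is None or when < f.invalid_at)
    ]

facts = [
    Fact("alice", "works_at", "Acme", datetime(2023, 1, 1), datetime(2024, 6, 1)),
    Fact("alice", "works_at", "Beta", datetime(2024, 6, 1)),
]
```

Querying `facts_as_of` in mid-2023 returns the Acme fact; the same query in mid-2024 returns the Beta fact, with no rebuild of the data in between.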

This is an experimental Model Context Protocol (MCP) server implementation for Graphiti. The MCP server exposes Graphiti's key functionality through the MCP protocol, allowing AI assistants to interact with Graphiti's knowledge graph capabilities.
## Features
```bash
uv run graphiti_mcp_server.py
```

With options:

```bash
uv run graphiti_mcp_server.py --model gpt-4o-mini --transport sse
```
Available arguments:
### Docker Deployment

The Graphiti MCP server can be deployed using Docker. The Dockerfile uses `uv` for package management, ensuring consistent dependency installation.
#### Environment Configuration
Before running the Docker Compose setup, you need to configure the environment variables. You have two options:
1. **Using a .env file** (recommended):

   - Copy the provided `.env.example` file to create a `.env` file:

     ```bash
     cp .env.example .env
     ```

   - Edit the `.env` file to set your OpenAI API key and other configuration options:

     ```
     # Required for LLM operations
     OPENAI_API_KEY=your_openai_api_key_here
     MODEL_NAME=gpt-4o-mini
     # Optional: OPENAI_BASE_URL only needed for non-standard OpenAI endpoints
     # OPENAI_BASE_URL=https://api.openai.com/v1
     ```

   - The Docker Compose setup is configured to use this file if it exists (it's optional)
2. **Using environment variables directly**:

   - You can also set the environment variables when running the Docker Compose command:

     ```bash
     OPENAI_API_KEY=your_key MODEL_NAME=gpt-4o-mini docker compose up
     ```

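Server-side, these variables are presumably read from the process environment; a minimal sketch of that pattern, mirroring the defaults used in the examples above (this is not the server's actual code):

```python
import os

def load_llm_config() -> dict:
    """Read LLM settings from the environment, mirroring the variables above."""
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY is required for LLM operations")
    return {
        "api_key": api_key,
        # MODEL_NAME falls back to the default model used in this README's examples
        "model": os.environ.get("MODEL_NAME", "gpt-4o-mini"),
        # OPENAI_BASE_URL is optional; None means the standard OpenAI endpoint
        "base_url": os.environ.get("OPENAI_BASE_URL"),
    }

# Demo values so the sketch runs standalone:
os.environ["OPENAI_API_KEY"] = "sk-placeholder"
os.environ.pop("MODEL_NAME", None)
os.environ.pop("OPENAI_BASE_URL", None)
config = load_llm_config()
```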
#### Neo4j Configuration
To use the Graphiti MCP server with an MCP-compatible client, configure it to connect:

```
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "demodemo",
        "OPENAI_API_KEY": "${OPENAI_API_KEY}",
        "MODEL_NAME": "gpt-4o-mini"
      }
    }
  }
```
Or start the server with uv and connect to it:

```
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "demodemo",
        "OPENAI_API_KEY": "${OPENAI_API_KEY}",
        "MODEL_NAME": "gpt-4o-mini"
      }
    }
  }
```
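The fragments above show only the tail of a client configuration. For orientation, a complete entry typically looks like the sketch below; the server name, `command`, `args`, and `NEO4J_URI` value here are assumptions for illustration, not taken from this README:

```json
{
  "mcpServers": {
    "graphiti": {
      "command": "uv",
      "args": ["run", "graphiti_mcp_server.py"],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "demodemo",
        "OPENAI_API_KEY": "${OPENAI_API_KEY}",
        "MODEL_NAME": "gpt-4o-mini"
      }
    }
  }
}
```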
## Working with JSON Data

The Graphiti MCP server can process structured JSON data through the `add_episode` tool with `source="json"`. This allows you to automatically extract entities and relationships from structured data:
```
add_episode(
    name="Customer Profile",    # illustrative name and fields; only
    episode_body="{\"company\": {\"name\": \"Acme Technologies\"}}",
    source="json",              # source="json" comes from the text above
    source_description="CRM data"
)
```
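Since `source="json"` episodes carry their structured data as a JSON string, the body can be prepared with `json.dumps`; a minimal sketch with made-up field names (not a schema from this README):

```python
import json

# Illustrative record; the field names here are invented for the example.
record = {
    "company": {"name": "Acme Technologies"},
    "products": [{"id": "P001", "name": "CloudSync"}],
}

# The string form is what a source="json" episode body would contain.
episode_body = json.dumps(record)
```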
To integrate the Graphiti MCP Server with the Cursor IDE, follow these steps: