examples/gemini_example.py mentioned in the docs doesn't exist #333
Comments
We had a misconfiguration in the dependencies. Please see #336. The instructions here do work for Gemini: https://github.com/getzep/graphiti?tab=readme-ov-file#using-graphiti-with-google-gemini |
Yeah, but the docs are still wrong; please update them so other people or LLMs don't get confused. Also, your MCP instructions only work for OpenAI. |
Please upgrade to the latest version (>v0.9.3) and run `poetry add "graphiti-core[google-genai]"` or `uv add "graphiti-core[google-genai]"`. I've tested our example code for configuring Gemini and it works. We don't currently support the cross-encoder reranker with LLM providers other than OpenAI. We'll investigate adding support for Gemini, and would welcome a contribution. Please let me know if the above still doesn't work for you. |
I think it's pretty self evident that whoever is interested in using Gemini
with graphiti is not going to have another paid OpenAi account just for
reranking, or at least that's going to be a very limited subset of users.
On Wed, Apr 9, 2025 at 9:32 AM Daniel Chalef ***@***.***> wrote:

Please upgrade to the latest version >v0.9.3 and run:

```shell
poetry add "graphiti-core[google-genai]"
# or
uv add "graphiti-core[google-genai]"
```

I've tested our example code for configuring Gemini and it works:

```python
from graphiti_core import Graphiti
from graphiti_core.llm_client.gemini_client import GeminiClient, LLMConfig
from graphiti_core.embedder.gemini import GeminiEmbedder, GeminiEmbedderConfig

# Google API key configuration
api_key = "<your-google-api-key>"

# Initialize Graphiti with Gemini clients
graphiti = Graphiti(
    "bolt://localhost:7687",
    "neo4j",
    "password",
    llm_client=GeminiClient(
        config=LLMConfig(
            api_key=api_key,
            model="gemini-2.0-flash"
        )
    ),
    embedder=GeminiEmbedder(
        config=GeminiEmbedderConfig(
            api_key=api_key,
            embedding_model="embedding-001"
        )
    )
)
```

We don't currently support the cross-encoder reranker with LLM providers other than OpenAI. We'll investigate adding support for Gemini, and would welcome a contribution.

Please let me know if the above still doesn't work for you.
|
There are many different search strategies that do not rely on a cross-encoder. Please see: https://help.getzep.com/graphiti/graphiti/searching |
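One of the strategies described in that search guide for merging result lists without a cross-encoder is reciprocal rank fusion (RRF). A minimal, self-contained sketch of the idea (illustrative only, not Graphiti's actual implementation; names and data are made up):

```python
from collections import defaultdict

def reciprocal_rank_fusion(result_lists, k=60):
    """Merge several ranked result lists: each item earns 1 / (k + rank)
    from every list it appears in, then items are sorted by total score."""
    scores = defaultdict(float)
    for results in result_lists:
        for rank, item in enumerate(results, start=1):
            scores[item] += 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Example: fuse a semantic-similarity ranking with a keyword (BM25) ranking.
semantic = ["alice", "bob", "carol"]
keyword = ["bob", "dave", "alice"]
fused = reciprocal_rank_fusion([semantic, keyword])
# "bob" wins: it ranks high in both lists even though it tops neither score alone.
```

Because RRF only uses ranks, it needs no model calls at all, which is why these search recipes work without any OpenAI dependency.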
@CaliLuke Thanks for the feedback. As Daniel mentioned, the cross-encoder reranker currently only works with OpenAI. To make this work out of the box for OpenAI (by far the most popular use case, especially for people newer to GenAI), I hacked together a cross-encoder using gpt-4o-mini with logprobs, logit bias, and max tokens set to 1. A similar thing can be done with Gemini or Anthropic, but it isn't a top priority for us currently, as we recommend using an actual reranker for production use cases.

We use the open-source bge-m3 reranker for our implementation, but Cohere and Voyage AI are also popular cross-encoder reranker providers. We currently don't have pre-made clients for them, as we haven't gotten requests from the community for more reranker support. We will likely implement these eventually, though. If there is a particular cross-encoder you want to use, we would happily work with you if you want to contribute; you can use the existing cross-encoder clients as a template.

If, on the other hand, you don't want to use the cross-encoder at all for reranking, you can provide an OpenAI key from a free account; it won't be called as long as you don't use the reranker. Another workaround is to create a dummy class that inherits from the cross-encoder client interface and skips reranking.

Hope this helps with understanding how and why Graphiti works the way it does. |
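The "dummy class" workaround mentioned above can be sketched as follows. The base class here is a local stand-in with an assumed `rank(query, passages)` signature; check the cross-encoder client module in `graphiti_core` for the real interface (which may be async) before subclassing:

```python
from abc import ABC, abstractmethod

# Stand-in for Graphiti's cross-encoder interface. The real base class and
# method signature live in graphiti_core and may differ from this sketch.
class CrossEncoderClient(ABC):
    @abstractmethod
    def rank(self, query: str, passages: list[str]) -> list[tuple[str, float]]:
        ...

class NoOpRerankerClient(CrossEncoderClient):
    """Dummy reranker: preserves the retrieval order and assigns a flat score,
    so no external LLM provider is ever called."""
    def rank(self, query: str, passages: list[str]) -> list[tuple[str, float]]:
        return [(p, 1.0) for p in passages]

reranker = NoOpRerankerClient()
ranked = reranker.rank("what is graphiti?", ["doc a", "doc b"])
```

An instance of such a class would then be passed wherever Graphiti accepts a cross-encoder client, with the caveat that reranked search results simply keep their original order.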
I am still getting |
I'm having issues running Graphiti using Gemini and the example on how to do that seems to be missing, so I'm stuck. Could you please update the docs with a working procedure we can look at? This is the snippet from the readme doc.
Make sure to replace the placeholder value with your actual Google API key. You can find more details in the example file at examples/gemini_example.py.
Thanks!