examples/gemini_example.py mentioned in the docs doesn't exist #333

Open
CaliLuke opened this issue Apr 8, 2025 · 7 comments

Comments


CaliLuke commented Apr 8, 2025

I'm having issues running Graphiti with Gemini, and the example showing how to do that seems to be missing, so I'm stuck. Could you please update the docs with a working procedure we can follow? This is the snippet from the README:

Make sure to replace the placeholder value with your actual Google API key. You can find more details in the example file at examples/gemini_example.py.
Thanks!

@danielchalef (Member)

We had a misconfiguration in the dependencies. Please see: #336

The instructions here do work for Gemini: https://github.com/getzep/graphiti?tab=readme-ov-file#using-graphiti-with-google-gemini

CaliLuke (Author) commented Apr 8, 2025

Yeah, but the docs are still wrong. Please update them so other people or LLMs don't get confused. Also, your MCP instructions only work for OpenAI.

danielchalef (Member) commented Apr 9, 2025

Please upgrade to the latest version >v0.9.3 and run:

poetry add "graphiti-core[google-genai]"
# or
uv add "graphiti-core[google-genai]"

I've tested our example code for configuring Gemini and it works:

from graphiti_core import Graphiti
from graphiti_core.llm_client.gemini_client import GeminiClient, LLMConfig
from graphiti_core.embedder.gemini import GeminiEmbedder, GeminiEmbedderConfig

# Google API key configuration
api_key = "<your-google-api-key>"

# Initialize Graphiti with Gemini clients
graphiti = Graphiti(
    "bolt://localhost:7687",
    "neo4j",
    "password",
    llm_client=GeminiClient(
        config=LLMConfig(
            api_key=api_key,
            model="gemini-2.0-flash"
        )
    ),
    embedder=GeminiEmbedder(
        config=GeminiEmbedderConfig(
            api_key=api_key,
            embedding_model="embedding-001"
        )
    )
)

We don't currently support the cross-encoder reranker with LLM providers other than OpenAI. We'll investigate adding support for Gemini, and would welcome a contribution.

Please let me know if the above still doesn't work for you.

CaliLuke (Author) commented Apr 9, 2025 via email

@danielchalef (Member)

There are many different search strategies that do not rely on a cross-encoder. Please see: https://help.getzep.com/graphiti/graphiti/searching
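
For reference, here is a minimal sketch of such a search, assuming the Gemini-configured graphiti instance from the snippet above and an illustrative query string; the default search() call runs a hybrid (semantic + BM25) search reranked with reciprocal rank fusion, so the cross-encoder is never invoked on this path:

import asyncio

# Assumes `graphiti` is the instance built with GeminiClient/GeminiEmbedder above.
async def run_example_search(graphiti):
    # Default hybrid search: semantic similarity + BM25 keyword matching,
    # fused with reciprocal rank fusion (RRF). No cross-encoder is called here.
    results = await graphiti.search("What did we learn about the billing service?")
    for edge in results:
        print(edge.fact)

# asyncio.run(run_example_search(graphiti))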

prasmussen15 (Collaborator) commented Apr 9, 2025

@CaliLuke Thanks for the feedback. As Daniel mentioned, the cross_encoder is only used when performing advanced searches and selecting the cross_encoder as the reranker method.

To make this work out of the box for OpenAI (by far the most popular use case, especially for people newer to GenAI), I hacked together a cross-encoder using gpt-4o-mini with logprobs, logit bias, and max tokens set to 1. A similar thing can be done with Gemini or Anthropic, but it isn't a top priority for us currently, as we recommend using an actual reranker for production use cases.
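
Roughly, that trick looks like the sketch below: ask a small model a one-word relevance question, cap the response at a single token, and read the logprob of "True" as the score. This is a simplified illustration of the idea (it omits the logit-bias step), and the prompt and scoring details are my assumptions rather than the actual OpenAIRerankerClient implementation:

import math
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment

async def relevance_score(query: str, passage: str) -> float:
    # One-token boolean classification; the probability of "True" becomes the score.
    response = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer with exactly one word: True or False."},
            {
                "role": "user",
                "content": f"Is this passage relevant to the query?\nQuery: {query}\nPassage: {passage}",
            },
        ],
        max_tokens=1,
        logprobs=True,
        top_logprobs=5,
    )
    for candidate in response.choices[0].logprobs.content[0].top_logprobs:
        if candidate.token.strip().lower() == "true":
            return math.exp(candidate.logprob)
    return 0.0

async def rank(query: str, passages: list[str]) -> list[tuple[str, float]]:
    # Score each passage independently, then sort best-first.
    scored = [(passage, await relevance_score(query, passage)) for passage in passages]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)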

We use the open source bge-m3 reranker for our implementation, but the BGERerankerClient can be used as a template for any open source TEI-compatible reranker simply by changing the URI.

Cohere and Voyage AI are also popular cross-encoder reranker providers. We currently don't have pre-made clients for them as we haven't gotten requests from the community for more reranker support. We will likely eventually implement these though.

If there is a particular cross-encoder you want to use, we would happily work with you if you want to contribute. You can use the existing cross-encoder clients as a template.

If, on the other hand, you don't want to use the cross-encoder at all for reranking, you can provide an OpenAI key from a free account; it won't be used as long as you don't use the reranker. Another workaround is to create a dummy class that inherits from CrossEncoderClient and implements a default behavior for rank. Either approach works as long as you don't use the cross-encoder option in search.
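
A minimal sketch of that dummy class, assuming the base class lives at graphiti_core.cross_encoder.client and that rank takes a query plus a list of passages and returns (passage, score) pairs; check CrossEncoderClient in your installed version before relying on this:

from graphiti_core.cross_encoder.client import CrossEncoderClient

class NoOpCrossEncoderClient(CrossEncoderClient):
    """Dummy reranker that never calls an external model.

    Safe only as long as the cross-encoder reranker is never selected in search.
    """

    async def rank(self, query: str, passages: list[str]) -> list[tuple[str, float]]:
        # Keep the original order and hand back a neutral score for each passage.
        return [(passage, 1.0) for passage in passages]

The instance can then be passed to Graphiti(..., cross_encoder=NoOpCrossEncoderClient()), assuming your version's constructor accepts a cross_encoder argument.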

Hope this helps with understanding how and why Graphiti works the way it does.


faisal00813 commented Apr 30, 2025

I am still getting openai.OpenAIError: The api_key client option must be set with graphiti_core 0.10.5 (installed in /usr/local/lib/python3.11/dist-packages).
The workaround that worked for me was setting os.environ['OPENAI_API_KEY'] = "<GEMINI_KEY>".
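
For anyone hitting the same error, a minimal sketch of that workaround; it only needs to run before Graphiti (and its default OpenAI-backed cross-encoder) is constructed, and the value is never used as long as the cross-encoder reranker isn't selected:

import os

# Placeholder so the default OpenAI client can be constructed; it is not called
# as long as the cross-encoder reranker is never used.
os.environ.setdefault("OPENAI_API_KEY", "placeholder-not-used")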
