
Does Graphiti Support DeepSeek LLM? #397


Open
SPk127 opened this issue Apr 25, 2025 · 1 comment

@SPk127

SPk127 commented Apr 25, 2025

Hello Graphiti Team! 👋

I'm exploring integrating alternative models and would appreciate your guidance on two aspects:

  1. DeepSeek LLM Support

    • Does Graphiti officially support DeepSeek language models?
    • If yes, would replacing OpenAIClient's config with model="deepseek-chat" be the correct approach?
    • Are there any working examples available?
  2. HuggingFace Embeddings

    • For embedding models, how should we properly replace the default with HuggingFace open-source models (e.g., intfloat/multilingual-e5)?
    • Would you have a concrete example of configuring this through GraphitiCore initialization?

Thank you for your time and for building such a valuable tool. Looking forward to your insights.

@prasmussen15
Collaborator

Hey,

What service are you using for DeepSeek? If you are deploying it locally with something like Ollama, you should be able to expose it through an OpenAI-compatible endpoint. Then you can just use the openai_generic_client and set the model name to your model (the deployment should also give you the necessary API key), as sketched below.
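A minimal sketch of that wiring, assuming the graphiti_core module layout below (the LLMConfig / OpenAIGenericClient import paths and the Neo4j credentials are placeholders to verify against your installed version):

```python
from graphiti_core import Graphiti
from graphiti_core.llm_client.config import LLMConfig
from graphiti_core.llm_client.openai_generic_client import OpenAIGenericClient

# Ollama serves an OpenAI-compatible API under /v1; the API key is only a
# placeholder because a local Ollama instance does not validate it.
llm_client = OpenAIGenericClient(
    config=LLMConfig(
        api_key="ollama",
        model="deepseek-chat",  # the model tag you pulled into Ollama
        base_url="http://localhost:11434/v1",
    )
)

graphiti = Graphiti(
    "bolt://localhost:7687",  # your Neo4j connection details
    "neo4j",
    "password",
    llm_client=llm_client,
)
```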

If you are using a third-party API service for DeepSeek, we support that as well. We have a Groq client, for example, but you can also use any OpenAI-compatible API with the generic client above. If the service you use isn't supported, we can also help you submit a new llm_client for your use case.
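With a hosted API the same generic client should work; only the base URL and API key change. A short variant of the sketch above, assuming DeepSeek's OpenAI-compatible endpoint:

```python
llm_client = OpenAIGenericClient(
    config=LLMConfig(
        api_key="<your DeepSeek API key>",
        model="deepseek-chat",
        base_url="https://api.deepseek.com/v1",  # OpenAI-compatible endpoint
    )
)
```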

For the HuggingFace embeddings, we don't currently have an example client, which is an oversight on our part (we actually use HuggingFace models for our own deployment of Graphiti as part of Zep). I will work on adding one to Graphiti today. In the meantime, you can look at how we implement an HF model for the cross-encoder in graphiti_core/cross_encoder/bge_reranker_client.py and do something similar for the embedder.
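Until that example lands, here is a rough sketch of a custom embedder in the same spirit as the BGE reranker client. The EmbedderClient base class path and the create() signature are assumptions to check against graphiti_core/embedder/client.py in your installed version, and it requires the sentence-transformers package:

```python
from sentence_transformers import SentenceTransformer

from graphiti_core.embedder.client import EmbedderClient  # path assumed; verify locally


class HuggingFaceEmbedder(EmbedderClient):
    """Sketch of an embedder backed by a sentence-transformers model."""

    def __init__(self, model_name: str = "intfloat/multilingual-e5-base"):
        self.model = SentenceTransformer(model_name)

    async def create(self, input_data) -> list[float]:
        # Assumes input_data is a single string; adjust if your version of the
        # base class passes a list of strings instead. Note that E5 models
        # expect "query: " / "passage: " prefixes for best retrieval quality.
        embedding = self.model.encode(input_data, normalize_embeddings=True)
        return embedding.tolist()
```

You would then pass it at initialization alongside the LLM client, e.g. Graphiti(uri, user, password, llm_client=llm_client, embedder=HuggingFaceEmbedder()).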
