E5-mistral-7b-instruct embedding support #2936

@DavidPeleg6

Description

Hi :)
I noticed on the roadmap that embedding support is planned, and was wondering whether it will include LLMs such as Mistral as well.

Specifically, e5-mistral has the added benefit of shipping only the adapter in its HF repo, so in this case we could deploy a single pod for both inference and truly SOTA embeddings at no added cost.

I assume it would be relatively easy to implement, since decoder-only architectures are already supported.
I think for e5-mistral the tweak would be to add a function to LLMEngine that returns the last hidden state rather than sampling from the output, yes? If so, I could try to add the PR myself.

Please let me know if there's anything I can do to help.
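To make the suggestion concrete: the pooling e5-mistral uses takes the final layer's hidden state at the last non-padding token of each sequence as the embedding. A minimal sketch of that step, on dummy tensors rather than the actual model (`last_token_pool` is a hypothetical helper name, not an existing vLLM function):

```python
import torch

def last_token_pool(hidden_states: torch.Tensor,
                    attention_mask: torch.Tensor) -> torch.Tensor:
    """Return the hidden state of the last non-padding token per sequence.

    hidden_states: (batch, seq_len, hidden_dim) from the decoder's final layer.
    attention_mask: (batch, seq_len), 1 for real tokens, 0 for padding.
    """
    # Index of the last real token in each sequence.
    last_idx = attention_mask.sum(dim=1) - 1           # (batch,)
    batch_idx = torch.arange(hidden_states.size(0))
    return hidden_states[batch_idx, last_idx]          # (batch, hidden_dim)

# Dummy example: batch of 2 right-padded sequences, hidden size 4.
hs = torch.randn(2, 5, 4)
mask = torch.tensor([[1, 1, 1, 0, 0],
                     [1, 1, 1, 1, 1]])
emb = last_token_pool(hs, mask)
print(emb.shape)  # torch.Size([2, 4])
```

In the engine, this would replace the sampling step on the final hidden states for embedding requests; the resulting vectors are typically L2-normalized before use.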
