
System.TypeLoadException: 'Could not load type 'LLama.Native.NativeApi' from assembly 'LLamaSharp, #1119

Open
CommanderLake opened this issue Mar 1, 2025 · 1 comment

Comments


CommanderLake commented Mar 1, 2025

I built my own llama.cpp b4743, which works fine with LM Studio, but when loading a model with LLamaSharp I get this:
System.TypeLoadException: 'Could not load type 'LLama.Native.NativeApi' from assembly 'LLamaSharp, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null' because the method 'llama_backend_free' has no implementation (no RVA).'

llama_backend_free exists with the correct signature and is properly exported, and both are compiled as x64.

I am limited to Visual Studio 2017.
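
A quick way to rule out export problems from the managed side is to probe the DLL directly with LoadLibrary/GetProcAddress. A minimal sketch, assuming Windows and a 64-bit process; the "llama.dll" path is a placeholder for the custom-built binary:

```csharp
using System;
using System.Runtime.InteropServices;

static class ExportCheck
{
    [DllImport("kernel32", SetLastError = true, CharSet = CharSet.Ansi)]
    private static extern IntPtr LoadLibrary(string fileName);

    [DllImport("kernel32", SetLastError = true, CharSet = CharSet.Ansi)]
    private static extern IntPtr GetProcAddress(IntPtr module, string procName);

    static void Main()
    {
        // Load the custom-built binary (placeholder path) into this process.
        var module = LoadLibrary(@"llama.dll");
        if (module == IntPtr.Zero)
        {
            Console.WriteLine("LoadLibrary failed: " + Marshal.GetLastWin32Error());
            return;
        }

        // If this returns IntPtr.Zero, the export is missing or name-mangled.
        var export = GetProcAddress(module, "llama_backend_free");
        Console.WriteLine(export == IntPtr.Zero
            ? "llama_backend_free NOT found in this DLL"
            : "llama_backend_free is exported");
    }
}
```

If the probe finds the export, the missing symbol is unlikely to be the real cause, which points toward the version mismatch described in the reply below.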

@martindevans
Member

Any given version of LLamaSharp only works with exactly one version of llama.cpp. Check the table at the bottom of the readme for the exact version you need.
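
For anyone who specifically needs a custom llama.cpp build, LLamaSharp can be pointed at an external binary instead of the bundled backend package. A hedged sketch, assuming a release that exposes NativeLibraryConfig.Instance.WithLibrary (the API shape has changed between releases, and the path is a placeholder); the binary must still be built from the llama.cpp commit pinned in the README's compatibility table:

```csharp
using LLama.Native;

class Program
{
    static void Main()
    {
        // Must run before any other LLamaSharp call so the loader picks up the override.
        // The path is a placeholder for the custom-built llama.dll; the exact method name
        // and overloads vary between LLamaSharp releases, so treat this call as an assumption.
        NativeLibraryConfig.Instance.WithLibrary(@"C:\path\to\llama.dll");

        // ...then load the model as usual (e.g. via LLamaWeights.LoadFromFile)...
    }
}
```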
