I built my own llama.cpp b4743, which works fine with LM Studio, but when loading a model with LLamaSharp I get this:
System.TypeLoadException: 'Could not load type 'LLama.Native.NativeApi' from assembly 'LLamaSharp, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null' because the method 'llama_backend_free' has no implementation (no RVA).'
llama_backend_free exists with the correct signature and is properly exported, and both binaries are compiled as x64.
I am limited to Visual Studio 2017.
Any given version of LLamaSharp only works with exactly one version of llama.cpp. Check the table at the bottom of the readme for the exact version you need.
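If you need to keep using your own build once the versions match, LLamaSharp can be pointed at a custom native library explicitly. A minimal sketch, assuming the NativeLibraryConfig API present in recent LLamaSharp releases (the DLL path is a placeholder, and the exact method names vary between releases, so check the docs for your pinned version):

```csharp
using LLama.Native;

// Must run before any other LLamaSharp call; otherwise the default
// bundled backend DLL is loaded instead of the custom build.
// NOTE: placeholder path — point this at your own llama.dll.
NativeLibraryConfig.Instance.WithLibrary(@"C:\path\to\llama.dll");
```

Even with an explicit path, the DLL still has to be built from the exact llama.cpp commit listed for that LLamaSharp release; a mismatched export set produces exactly the "no implementation (no RVA)" TypeLoadException above.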