Phi-3: error loading model hyperparameters #106
Comments
You're using an old gguf. For more info: ggml-org/llama.cpp#8627 (comment)
Ah, thank you! Unfortunately this isn't a model I can easily replace, as it's a specialized model (Dutch language). I'll check if there is a new version of it. But if not, is there something I can do to override this manually?
Edit: no new version, though I've asked if one is on the horizon.
You can play with this script to add the missing metadata: https://github.com/ggerganov/llama.cpp/blob/master/gguf-py/scripts/gguf_set_metadata.py
It would be nice to have a default value in the llama.cpp code so old models won't break. I'll have a look at this later.
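For what it's worth, a minimal sketch of inspecting the file first, assuming the `gguf` Python package from llama.cpp's gguf-py; the model path and key name below are placeholders, since the thread doesn't spell out which key is missing:

```python
# Sketch: list a GGUF file's metadata keys before patching anything.
# Assumes the `gguf` package from llama.cpp's gguf-py (pip install gguf).
# MODEL_PATH and MISSING_KEY are placeholders, not values confirmed in this thread.
from gguf import GGUFReader

MODEL_PATH = "phi-3-dutch-q4.gguf"
MISSING_KEY = "phi3.context_length"  # substitute the key named in the load error

reader = GGUFReader(MODEL_PATH)

# Print every metadata key the file actually carries.
for name in reader.fields:
    print(name)

if MISSING_KEY in reader.fields:
    print(f"{MISSING_KEY} is present and can be rewritten in place.")
else:
    print(f"{MISSING_KEY} is not in this file's metadata.")
```

If the key turns out to be present but wrong, the linked script is invoked roughly as `python gguf_set_metadata.py model.gguf <key> <value>`, and it has a dry-run mode to preview the change, if I'm reading its argument parser correctly.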
This should be fixed in the latest release
Absolutely brilliant. I'm so impressed you made an upstream fix. Thank you!
Just a quick question: I take it this is an issue with the model? Or is there something I can do to fix this? Perhaps add the value manually?
Hmm, I'm actually pretty sure I was able to run this model in the past. Maybe something changed in llama.cpp?
I did just switch to preloading the model separately from starting it. My preload code:
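Roughly along these lines (a sketch with llama-cpp-python standing in for the actual wrapper, since the exact library and arguments aren't shown here; the path and settings are illustrative):

```python
# Sketch of a separate "preload" step, with llama-cpp-python standing in for the
# actual wrapper; the model path, context size, and GPU offload are illustrative.
from llama_cpp import Llama

llm = Llama(
    model_path="models/phi-3-dutch-q4.gguf",  # hypothetical path
    n_ctx=4096,                               # context window allocated up front
    n_gpu_layers=-1,                          # offload all layers if a GPU is available
    verbose=False,
)

# The model is now resident in memory; generation only starts later, e.g.:
# output = llm("Vraag: wat is de hoofdstad van Nederland?\nAntwoord:", max_tokens=32)
```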