MosaicML MPT-7B #136

Open
ehartford opened this issue May 8, 2023 · 2 comments

@ehartford

As suggested in ggml-org/llama.cpp#1333:

Create a basic inference example for the MosaicML MPT-7B model.
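For reference, here is a minimal sketch of MPT-7B inference through Hugging Face transformers (not the requested ggml example), just to show the baseline behaviour a ggml port would reproduce. It assumes the `transformers` and `torch` packages; `mosaicml/mpt-7b` ships custom modelling code, so `trust_remote_code=True` is required, and MPT reuses the EleutherAI/gpt-neox-20b tokenizer.

```python
# Minimal reference sketch (assumes transformers + torch are installed);
# the issue asks for a ggml example, this only shows the baseline model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# MPT-7B ships custom modelling code, hence trust_remote_code=True.
model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
# MPT-7B reuses the EleutherAI/gpt-neox-20b tokenizer.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

inputs = tokenizer("The MosaicML MPT-7B model is", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```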

@lukasmoellerch (Contributor)

I'll see how much my replit branch (from #131) would have to be adjusted... I think the main differences are the tokenizer and QKV clamping.
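For context, a hedged sketch of the QKV clamping being referred to: when MPT's `attn_config.clip_qkv` setting is non-null, the output of the fused QKV projection is clamped to `[-clip_qkv, clip_qkv]` before being split into query, key and value. The helper below is illustrative only (not MPT's or ggml's actual code), but it is the extra step a port of the Replit graph would need.

```python
from typing import Optional

import torch


def split_clamped_qkv(wqkv_out: torch.Tensor, clip_qkv: Optional[float]):
    """Illustrative helper: optionally clamp the fused QKV projection, then split it.

    MPT only clamps when clip_qkv is set in its attention config; when it is
    None, the projection output passes through unchanged.
    """
    if clip_qkv is not None:
        wqkv_out = wqkv_out.clamp(-clip_qkv, clip_qkv)
    query, key, value = wqkv_out.chunk(3, dim=-1)
    return query, key, value
```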

@lukasmoellerch (Contributor)

Looks decent, I'll create one PR for both then.

[Screenshot: 2023-05-10 at 20 11 37]
