Presentation on llama.cpp on 25.07.2023 at karlsruhe.ai #2281
Replies: 3 comments 6 replies
-
Thanks for sharing the presentation! The slides look fine to me - I like the "Open-source is a positive sum game" quote :) Definitely let us know how it goes!
-
I did some LLaMA vs. LLaMA 2 measurements for the presentation, in which I calculated perplexity as a function of context length on a source code dump of kafe2, another project that I'm working on. I chose this over Wikitext because I wanted text with a large number of long-distance relations. These are the results: as it turns out, the perplexity of LLaMA 2 7b at 4096 context is actually better than the perplexity of LLaMA 13b at 2048 context.
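For anyone who wants to reproduce this kind of measurement: the quantity being plotted is the standard definition of perplexity, the exponential of the negative mean per-token log-likelihood. This is a minimal Python sketch of that formula, not the actual llama.cpp `perplexity` tool (which additionally handles chunking the text into windows of the chosen context length); the function name and the toy input are my own.

```python
import math

def perplexity(token_logprobs):
    """Perplexity from per-token natural-log probabilities:
    exp(-mean(log p)). Lower is better."""
    n = len(token_logprobs)
    return math.exp(-sum(token_logprobs) / n)

# Toy check: a model that assigns probability 0.25 to every token
# has a perplexity of 4 (it is as "surprised" as a uniform
# 4-way choice at each step).
print(perplexity([math.log(0.25)] * 10))  # ≈ 4.0
```

To compare context lengths, you would evaluate the same text at each context size and compare the resulting perplexities, which is what the measurement above does for 2048 vs. 4096 tokens.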
-
The presentation was well-received; about 20 people were in the audience, mostly other KIT members. I had some interesting conversations, but unfortunately I didn't meet a language model researcher with whom I could collaborate specifically. I uploaded the slides that I used, as well as the SVG and Python files that I used for creating the graphics, in case someone wants to recycle them for documentation purposes.
-
I will give a presentation on llama.cpp (mostly the things that I'm doing myself) on 25.07.2023 at karlsruhe.ai. I am uploading the corresponding slides here. They are still missing some things that I'll measure over the weekend, but the slides should be mostly complete in terms of content. I'm currently using a low-effort Stable Diffusion gen of a llama on a motorcycle for the cover image; if someone gens me something better I'll use it, otherwise I'll probably look into locally installing Stable Diffusion again after not having used it for several months.
The current cover image
@ggerganov please check whether you agree with the way I'm representing the llama.cpp project.
@ikawrakow please check whether you agree with my descriptions of k-quants.