
Update to the latest llama.cpp ggml #118

Closed
philpax opened this issue Apr 6, 2023 · 2 comments · Fixed by #119
Labels: issue:enhancement (New feature or request), meta:good-first-issue (Good for newcomers)

Comments

@philpax
Collaborator

philpax commented Apr 6, 2023

Another day, another suite of changes. Since our last update, M1 and AVX inference have improved, along with a bunch of other changes: https://github.com/ggerganov/llama.cpp/compare/437e77855a54e69c86fe03bc501f63d9a3fddb0e..HEAD

If you're interested in tackling this, check out our contributing document: https://github.com/rustformers/llama-rs/blob/main/CONTRIBUTING.md

philpax added the issue:enhancement and meta:good-first-issue labels on Apr 6, 2023
@philpax
Collaborator Author

philpax commented Apr 6, 2023

Note that this changes the implementation of the LLaMA model - it is not a drop-in upgrade!

@KerfuffleV2
Contributor

KerfuffleV2 commented Apr 7, 2023

I'm working on this one. Just updating the bindings seemed pretty uneventful: ~~some types changed from i64 to i32; after that fix it compiles and appears to function.~~ edit: Kill meeee, I accidentally copied from the wrong llama.cpp directory and ended up reverting to an older version...
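(For illustration only: a minimal sketch of the kind of fix an integer-width change like that implies on the Rust side of a hand-written ggml binding. The struct, function name, and signature below are assumptions for the example, not the project's actual FFI declarations.)

```rust
#[allow(non_camel_case_types)]
#[repr(C)]
pub struct ggml_tensor {
    _private: [u8; 0], // opaque: only ever used behind a raw pointer
}

extern "C" {
    // Hypothetical declaration: if upstream changes the return width
    // (e.g. i32 <-> i64), this must be updated to match the C header,
    // otherwise the call is undefined behaviour.
    fn ggml_nelements(tensor: *const ggml_tensor) -> i64;
}

// Funnel call sites through one helper so a future width change is a
// one-line fix rather than a hunt through the codebase.
unsafe fn element_count(tensor: *const ggml_tensor) -> usize {
    usize::try_from(ggml_nelements(tensor)).expect("negative element count")
}
```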

Still need to do the actual model change part, which will hopefully put #67 to bed for good.
