pantry/projects/github.com/ggerganov/llama.cpp

getting started

$ pkgx +brewkit -- run llama.cpp
# ^^ default chat prompt with an appropriate hugging face model

If you want to run llama.cpp with your own args, `pkgx llama.cpp $ARGS` is your friend.
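
For instance (the model path below is illustrative; `-m`, `-p`, and `-n` are standard llama.cpp options for model file, prompt, and number of tokens to generate):

    # hypothetical model path — substitute any GGUF model you have locally
    pkgx llama.cpp -m ./models/llama-2-7b.Q4_K_M.gguf -p "Hello, world" -n 64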

converting your own models

We provide a working `convert.py` from the llama.cpp project. To use it, launch it via a tea pkgenv:

pkgx +llama.cpp -- convert.py path/to/your/model
# ^^ the -- is necessary since `convert.py` is not listed in the llama.cpp
# provides list
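
A typical conversion run might look like this (paths are illustrative, and the `--outtype`/`--outfile` flags reflect `convert.py` as shipped at the time of writing — check `convert.py --help` for your version):

    # convert a downloaded HF/PyTorch checkpoint directory to GGUF at f16 precision
    # (./models/7B is a hypothetical path to your model weights)
    pkgx +llama.cpp -- convert.py ./models/7B --outtype f16 --outfile ./models/7B/model-f16.gguf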