
getting started

$ pkgx +brewkit -- run llama.cpp
# ^^ default chat prompt with an appropriate Hugging Face model

If you want to run llama.cpp with your own args, `pkgx llama.cpp $ARGS` is your friend.
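For example, a typical invocation might look like the following. The model path is illustrative; `-m` (model file), `-p` (prompt), and `-n` (number of tokens to generate) are standard llama.cpp options.

```shell
# Run llama.cpp against a local GGUF model with a one-off prompt.
# The model path below is a placeholder — point it at a model you have downloaded.
pkgx llama.cpp \
  -m ./models/llama-2-7b.Q4_K_M.gguf \
  -p "Explain quantization in one sentence." \
  -n 128
```

Run `pkgx llama.cpp --help` to see the full list of supported options.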

converting your own models

We provide a working `convert.py` from the llama.cpp project. To use it, launch it via a pkgx pkgenv:

pkgx +llama.cpp -- convert.py path/to/your/model
# ^^ the -- is necessary since `convert.py` is not listed in the llama.cpp
# provides list
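Putting it together, a conversion workflow might look like this. The paths and the output filename are illustrative (at the time of writing, `convert.py` emits a `ggml-model-f16.gguf` by default, but check the script's output for the actual filename):

```shell
# Convert a locally checked-out Hugging Face model to GGUF,
# then run the converted model with llama.cpp.
# Paths are placeholders — substitute your own model directory.
pkgx +llama.cpp -- convert.py ./models/my-model
pkgx llama.cpp \
  -m ./models/my-model/ggml-model-f16.gguf \
  -p "Hello"
```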