
getting started

$ pkgx +brewkit -- run llama.cpp
# ^^ default chat prompt with an appropriate hugging face model

If you want to run llama.cpp with your own args, `pkgx llama.cpp $ARGS` is your friend.
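For instance, you could pass your own prompt and generation flags directly. This is a hypothetical invocation, assuming you already have a local GGUF model file; the model path is illustrative, and `-m`, `-p`, and `-n` are standard llama.cpp flags for model path, prompt, and token count.

```shell
# hypothetical example: run llama.cpp with your own flags
# -m: path to a local GGUF model (illustrative path)
# -p: prompt text
# -n: number of tokens to generate
pkgx llama.cpp -m path/to/model.gguf -p "Hello" -n 64
```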

converting your own models

We provide a working `convert.py` from the llama.cpp project. To use it, you need to launch it via a pkgx pkgenv:

pkgx +llama.cpp -- convert.py path/to/your/model
# ^^ the -- is necessary since `convert.py` is not listed in the llama.cpp
# provides list
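Putting the two halves together, a typical session might look like the sketch below. The model directory and output filename are hypothetical; `--outfile` is a `convert.py` option for naming the converted file, and the second command assumes the result is a GGUF file that llama.cpp can load with `-m`.

```shell
# hypothetical workflow: convert a local model, then chat with it
# (paths and filenames are illustrative)
pkgx +llama.cpp -- convert.py ~/models/my-model --outfile my-model.gguf
pkgx llama.cpp -m my-model.gguf -p "Hello"
```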