
getting started

$ pkgx +brewkit -- run llama.cpp
# ^^ default chat prompt with an appropriate Hugging Face model

If you want to run llama.cpp with your own args, `pkgx llama.cpp $ARGS` is your friend.
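For example, a minimal sketch of a custom invocation. The model path is hypothetical (point it at any GGUF file you have downloaded); `-m`, `-p`, and `-n` are standard llama.cpp options for model, prompt, and token count:

```shell
# hypothetical path; substitute a GGUF model you have downloaded
MODEL="$HOME/models/llama-2-7b.Q4_K_M.gguf"

# -m selects the model, -p sets the prompt, -n caps tokens generated
# (guarded so the snippet is a no-op on machines without pkgx)
if command -v pkgx >/dev/null 2>&1; then
  pkgx llama.cpp -m "$MODEL" -p "Why is the sky blue?" -n 128
fi
```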

converting your own models

We provide a working `convert.py` from the llama.cpp project. To use it, you need to launch it via a tea pkgenv:

pkgx +llama.cpp -- convert.py path/to/your/model
# ^^ the -- is necessary since `convert.py` is not listed in the llama.cpp
# provides list
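A typical follow-up is quantizing the converted model so it runs in less memory. This is a sketch under assumptions: the file names are illustrative, and the upstream `quantize` tool is assumed to be reachable through the same pkgenv (like `convert.py`, it may not be on the provides list, hence the `--`):

```shell
# hypothetical paths; convert.py emits a GGUF alongside the source weights
SRC="path/to/your/model"
OUT="$SRC/ggml-model-Q4_K_M.gguf"

# guarded so the snippet is a no-op on machines without pkgx
if command -v pkgx >/dev/null 2>&1; then
  pkgx +llama.cpp -- convert.py "$SRC"
  # shrink the f16 output to 4-bit; Q4_K_M is a common quality/size tradeoff
  pkgx +llama.cpp -- quantize "$SRC/ggml-model-f16.gguf" "$OUT" Q4_K_M
fi
```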