
getting started

$ pkgx +brewkit -- run llama.cpp
# ^^ default chat prompt with an appropriate hugging face model

If you want to run llama.cpp with your own arguments, `pkgx llama.cpp $ARGS` is your friend.
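For example, something like the following should work (`-m`, `-p`, and `-n` are standard llama.cpp flags; the model path is illustrative):

```shell
# print the full CLI help for the llama.cpp binary
pkgx llama.cpp --help

# run inference against a local GGUF model
# -m: path to the model (illustrative), -p: prompt, -n: tokens to generate
pkgx llama.cpp -m ~/models/llama-2-7b.Q4_K_M.gguf -p "Hello, world" -n 128
```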

converting your own models

We provide a working `convert.py` from the llama.cpp project. To use it you need to launch it via a pkgx package environment:

pkgx +llama.cpp -- convert.py path/to/your/model
# ^^ the -- is necessary since `convert.py` is not listed in the llama.cpp
# `provides` list
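As a fuller sketch, `convert.py` accepts `--outtype` and `--outfile` options to control the output format and destination (the paths below are illustrative):

```shell
# convert a Hugging Face model directory to an f16 GGUF file
# (paths are hypothetical; adjust to your local checkout)
pkgx +llama.cpp -- convert.py ~/models/llama-2-7b \
  --outtype f16 \
  --outfile ~/models/llama-2-7b.f16.gguf
```

The resulting file can then be passed back to `pkgx llama.cpp` via `-m`.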