llamacpp
Inference of Meta's LLaMA model (and others) in pure C/C++
The llama.cpp library provides a C++ interface for running inference with large language models (LLMs). Initially designed to support Meta's LLaMA model, it has since been extended to work with a variety of other models. This package includes the llama-cli tool to run inference using the library.
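Once the package is installed, llama-cli can run inference directly from the command line. A minimal sketch follows, assuming the package name `llamacpp` from this page and a locally downloaded GGUF model file (the model path is a placeholder; `-m`, `-p`, and `-n` are standard llama.cpp CLI flags for model path, prompt, and number of tokens to generate):

```shell
# Install the package (openSUSE; requires one of the repositories listed below)
sudo zypper install llamacpp

# Run inference: load a GGUF model, feed it a prompt, generate up to 64 tokens.
# ./model.gguf is a placeholder -- substitute any GGUF model you have downloaded.
llama-cli -m ./model.gguf -p "Explain quicksort in one sentence." -n 64
```

Community-built packages such as those listed here may lag behind upstream llama.cpp releases, so very new model architectures may require a newer build.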
There is no official package available for openSUSE Leap 16.0.

Distributions
openSUSE Tumbleweed
  home:lalala123:x86_to_arm-single_claud... (openSUSE Community, version 6428)
  home:lalala123:x86_to_arm-single_deeps... (openSUSE Community, version 6428)
  home:lalala123:x86_to_arm-single_qwen-... (openSUSE Community, version 6428)
openSUSE Slowroll
openSUSE Leap 16.0
openSUSE Leap 15.6
openSUSE Factory RISCV
SLFO 1.2
Fedora Rawhide (unstable)
Fedora 43
Fedora 42
Fedora 41
Mageia Cauldron (unstable)
Mageia 9
Unsupported distributions
The following distributions are not officially supported by the package maintainers. Use these packages at your own risk.