A Java and JVM port of llama.cpp built with jextract, enabling local large language model (LLM) inference through the Java Foreign Function & Memory (FFM) API. macOS (Apple M-series) and Linux x86_64 are supported natively, with GPU acceleration; support for other platforms and hardware (Windows, ARM, CUDA, etc.) can be added through custom builds.
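As a rough illustration of the FFM interop this port relies on, the sketch below hand-writes the kind of downcall binding that jextract generates automatically for each native function. It is a minimal, hypothetical example (assuming Java 22+ and binding libc's `strlen` rather than an actual llama.cpp symbol), not this project's API:

```java
import java.lang.foreign.*;
import java.lang.invoke.MethodHandle;

public class FfmDemo {
    // Bind and call a native C function (here: strlen from libc) via the FFM API.
    // jextract emits equivalent handles for every exported llama.cpp function.
    static long nativeStrlen(String s) throws Throwable {
        Linker linker = Linker.nativeLinker();
        MethodHandle strlen = linker.downcallHandle(
                linker.defaultLookup().find("strlen").orElseThrow(),
                FunctionDescriptor.of(ValueLayout.JAVA_LONG, ValueLayout.ADDRESS));
        try (Arena arena = Arena.ofConfined()) {
            // Copy the Java string into native memory as a NUL-terminated C string.
            MemorySegment cStr = arena.allocateFrom(s);
            return (long) strlen.invokeExact(cStr);
        } // the confined arena frees the native memory here
    }

    public static void main(String[] args) throws Throwable {
        System.out.println(nativeStrlen("llama.cpp")); // prints 9
    }
}
```

The same pattern scales up: jextract reads llama.cpp's headers and emits one such method handle per function, plus `MemoryLayout` definitions for its structs, so Java code can drive the inference loop without writing JNI glue by hand.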
Licenses
| License | URL |
|---|---|
| Apache License, Version 2.0 | http://www.apache.org/licenses/LICENSE-2.0 |
Developers
| Name | Email | Dev Id |
|---|---|---|
| Azize ELAMRANI | azize<at>graviteesource.com | aelamrani |
| David BRASSELY | david<at>graviteesource.com | brasseld |
| Nicolas GERAUD | nicolas<at>graviteesource.com | NicolasGeraud |
| Titouan COMPIEGNE | titouan<at>graviteesource.com | tcompiegne |
