A Java and JVM port of llama.cpp using jextract, enabling local large language model (LLM) inference through the Java Foreign Function & Memory (FFM) API. macOS M-series (Apple Silicon) and Linux x86_64 are supported natively with GPU acceleration; support for other platforms and hardware (Windows, ARM, CUDA, etc.) can be added through custom builds.
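To illustrate the kind of FFM interop this port relies on, here is a minimal, self-contained sketch of a native downcall. It binds the C library's `strlen` as a stand-in symbol (the actual llama.cpp function bindings generated by jextract are not shown in this listing); jextract produces handles of this shape automatically.

```java
import java.lang.foreign.*;
import java.lang.invoke.MethodHandle;

public class FfmDemo {
    // Resolve strlen from the default (libc) lookup and call it through a downcall handle.
    // jextract-generated bindings wrap native functions in the same way.
    static long nativeStrlen(String s) throws Throwable {
        Linker linker = Linker.nativeLinker();
        MethodHandle strlen = linker.downcallHandle(
                linker.defaultLookup().find("strlen").orElseThrow(),
                FunctionDescriptor.of(ValueLayout.JAVA_LONG, ValueLayout.ADDRESS));
        try (Arena arena = Arena.ofConfined()) {
            // Copy the Java string into native memory as a NUL-terminated C string.
            MemorySegment cString = arena.allocateFrom(s);
            return (long) strlen.invokeExact(cString);
        }
    }

    public static void main(String[] args) throws Throwable {
        System.out.println(nativeStrlen("llama.cpp"));
    }
}
```

Requires Java 22 or later, where the FFM API is final; no JNI glue code or native wrapper library is needed beyond the shared library itself.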
💡 Newer version available: 0.3.21.1.1


Licenses

License                      URL
Apache License, Version 2.0  http://www.apache.org/licenses/LICENSE-2.0

Developers

Name               Email                          Dev Id
Azize ELAMRANI     azize<at>graviteesource.com    aelamrani
David BRASSELY     david<at>graviteesource.com    brasseld
Nicolas GERAUD     nicolas<at>graviteesource.com  NicolasGeraud
Titouan COMPIEGNE  titouan<at>graviteesource.com  tcompiegne