React Native binding of llama.cpp
`cui-llama.rn` is a React Native binding of llama.cpp, providing on-device LLM inference with GPU/NPU acceleration, multimodal support, and parallel decoding. It is designed for developers building AI-powered mobile applications who need efficient, local language model processing without relying on cloud services.
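As a rough sketch of what on-device inference looks like with a llama.rn-style API (this assumes the `initLlama`/`completion` interface of the upstream llama.rn project; the model path and all parameter values here are illustrative, not defaults):

```typescript
import { initLlama } from 'cui-llama.rn'

async function runLocalInference() {
  // Load a GGUF model bundled with or downloaded by the app.
  // The path is a placeholder; adjust to where your app stores the model.
  const context = await initLlama({
    model: 'file:///data/local/models/model.gguf', // hypothetical path
    n_ctx: 2048,        // context window size
    n_gpu_layers: 99,   // offload as many layers as the GPU supports
  })

  // Run a completion entirely on-device; the callback streams tokens
  // as they are generated.
  const result = await context.completion(
    {
      prompt: 'Explain what a GGUF file is in one sentence.',
      n_predict: 128,
      temperature: 0.7,
    },
    (tokenData) => {
      console.log(tokenData.token) // streamed partial output
    },
  )

  console.log('Final text:', result.text)
  await context.release() // free native resources when done
}
```

Since inference runs locally, no prompt or output ever leaves the device, which is the main draw over cloud-hosted APIs.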
How the donated funds are distributed
Kivach runs on the Obyte network, so every donation is publicly traceable on-chain.