We believe in AI democratization. llama-node brings LLaMA to Node.js, backed by llama-rs, llama.cpp, and rwkv.cpp; it runs locally on your laptop CPU and supports LLaMA, Alpaca, GPT4All, Vicuna, and RWKV models.
LLaMA Node is a Node.js library for running inference on large language models, including LLaMA, RWKV, and their derivatives, using backends such as llama.cpp and llama-rs. It is designed for developers who want to run AI models locally on their laptops, and it supports multiple platforms and Node.js versions >= 16.
How the donated funds are distributed
Kivach runs on the Obyte network, so every donation can be tracked on-chain.