Believe in AI democratization. llama-node is a Node.js library backed by llama-rs, llama.cpp, and rwkv.cpp. It works locally on your laptop CPU and supports LLaMA, Alpaca, GPT4All, Vicuna, and RWKV models.
llama-node is a Node.js library for running large language models such as LLaMA, Alpaca, GPT4All, Vicuna, and RWKV locally on your laptop's CPU. It is designed for developers who want to integrate AI language models into their Node.js applications without relying on cloud services.
How the donated funds are distributed
Kivach runs on the Obyte network, so all donations can be tracked publicly.