hlhr202/llama-node

Believe in AI democratization. LLaMA for Node.js, backed by llama-rs, llama.cpp, and rwkv.cpp; runs locally on your laptop CPU. Supports LLaMA/Alpaca/GPT4All/Vicuna/RWKV models.

Rust · 867 stars · 65 forks · Apache License 2.0

LLaMA Node is a Node.js library for running inference on large language models, including LLaMA, RWKV, and their derivatives, using backends such as llama.cpp and llama-rs. It is designed for developers who want to run AI models locally on their laptops, and supports multiple platforms on Node.js versions >= 16.
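A minimal sketch of what local inference with the llama.cpp backend looks like. The import paths, config field names, sampling parameters, and model filename below are assumptions drawn from typical llama-node usage, not a verified API; check the repository's README for the exact interface of your installed version.

```typescript
// Hypothetical sketch of llama-node with the llama.cpp backend.
// All identifiers and the model filename are assumptions; verify
// against the repo's README before use.
import { LLama } from "llama-node";
import { LLamaCpp } from "llama-node/dist/llm/llama-cpp";
import path from "path";

// Path to a locally downloaded GGML-quantized model (assumed filename).
const modelPath = path.resolve(process.cwd(), "./ggml-vicuna-7b-q4.bin");

const llama = new LLama(LLamaCpp);

llama.load({
  modelPath,
  enableLogging: true,
  nCtx: 1024,      // context window size
  seed: 0,
  f16Kv: false,
  logitsAll: false,
  vocabOnly: false,
  useMlock: false,
  embedding: false,
  useMmap: true,   // memory-map the model file instead of loading it whole
});

const prompt = "Explain AI democratization in one sentence.";

// Stream generated tokens to stdout via the completion callback.
llama.createCompletion(
  {
    nThreads: 4,      // CPU threads for inference
    nTokPredict: 128, // max tokens to generate
    topK: 40,
    topP: 0.1,
    temp: 0.2,
    repeatPenalty: 1,
    prompt,
  },
  (response) => process.stdout.write(response.token)
);
```

Running this requires a GGML model file on disk and the package's native bindings, so it is a local-only sketch rather than something that runs in a sandbox.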



Top contributors

hlhr202 — 212 contributions
dinex-dev — 1 contribution
fardjad — 1 contribution
triestpa — 1 contribution
tommoffat — 1 contribution

Recent events

Kivach runs on the Obyte network, so all donations can be tracked on-chain.

No events yet