cogpy/llama-node

We believe in AI democratization. LLaMA for Node.js, backed by llama-rs, llama.cpp, and rwkv.cpp; runs locally on your laptop's CPU. Supports LLaMA/Alpaca/GPT4All/Vicuna/RWKV models.

Apache License 2.0

LLaMA Node is a Node.js library for running large language models like LLaMA, Alpaca, GPT4All, Vicuna, and RWKV locally on your laptop's CPU. It's designed for developers who want to integrate AI language models into their Node.js applications without relying on cloud services.

Top contributors

hlhr202: 212 contributions
dinex-dev: 1 contribution
fardjad: 1 contribution
triestpa: 1 contribution
tommoffat: 1 contribution

Recent events

Kivach runs on the Obyte network, so all donations can be tracked on-chain.

No events yet