Vercel AI Provider for running LLMs locally using Ollama
This project provides a Vercel AI SDK provider for running large language models locally with Ollama. It is aimed at developers who want local LLM processing backed by Ollama's runtime, and it supports text generation, object generation, and tool usage while staying compatible with the rest of the Vercel AI SDK ecosystem.
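As a rough sketch of how such a provider plugs into the Vercel AI SDK, the snippet below uses the SDK's `generateText` helper with a model supplied by this provider. The package name `ollama-ai-provider`, the `ollama(...)` factory, and the model id `llama3` are assumptions for illustration; it also assumes an Ollama server is running locally on its default port.

```typescript
// Sketch: generating text with a locally running Ollama model
// via the Vercel AI SDK. Assumes `ollama-ai-provider` exposes an
// `ollama` model factory and that Ollama serves on localhost:11434.
import { generateText } from 'ai';
import { ollama } from 'ollama-ai-provider';

async function main() {
  const { text } = await generateText({
    // Pick any model you have pulled locally, e.g. `ollama pull llama3`.
    model: ollama('llama3'),
    prompt: 'Explain in one sentence why running LLMs locally can be useful.',
  });

  console.log(text);
}

main().catch(console.error);
```

Because the model runs entirely on your machine, no prompt data leaves the local network; the trade-off is that throughput depends on local hardware rather than a hosted API.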
How the donated funds are distributed
Kivach runs on the Obyte network, so every donation is publicly traceable on-chain.