BonkLM - LLM Security Guardrails with an Interactive Setup Assistant
BonkLM is a Node.js security library that protects AI applications from prompt injection, jailbreaks, data leaks, and other malicious attacks. It provides framework-agnostic, provider-agnostic guardrails with multiple validation layers: prompt injection detection, jailbreak prevention, secret and PII guarding, and real-time streaming validation. The project targets developers who need production-ready security controls to safeguard their LLM integrations against common attack vectors.
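To make the guardrail idea concrete, here is a minimal sketch of the pattern such a validation layer implements: screening user input against known injection phrases before it ever reaches the model. This is an illustrative pattern only, not BonkLM's actual API; all function and variable names here are hypothetical.

```javascript
// Hypothetical sketch of a prompt-injection guardrail layer.
// Pattern list and function names are illustrative, not BonkLM's API.
const INJECTION_PATTERNS = [
  /ignore\s+(all\s+)?(previous|prior)\s+instructions/i,
  /disregard\s+(the\s+)?system\s+prompt/i,
  /you\s+are\s+now\s+(in\s+)?developer\s+mode/i,
];

// Returns true when the input matches a known injection phrase.
function looksLikeInjection(userInput) {
  return INJECTION_PATTERNS.some((pattern) => pattern.test(userInput));
}

// Wraps an LLM call so flagged inputs are rejected before reaching the model.
async function guardedComplete(userInput, callModel) {
  if (looksLikeInjection(userInput)) {
    throw new Error('Blocked: possible prompt injection detected');
  }
  return callModel(userInput);
}
```

A real guardrail library layers many such checks (heuristics, classifiers, output scanning) rather than relying on a single regex list, which determined attackers can easily evade.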
How the donated funds are distributed
Kivach runs on the Obyte network, so all donations can be tracked on-chain.