Easily compute CLIP embeddings and build a CLIP retrieval system with them
This project provides a complete system for building semantic search applications using CLIP embeddings. It includes tools for computing image and text embeddings, creating efficient search indices, filtering results, and serving queries through a Flask backend with a web frontend. The system is designed to handle datasets ranging from thousands to hundreds of millions of items, with GPU acceleration for embedding computation and support for distributed processing. It's aimed at developers and researchers who want to create image-text search applications without building everything from scratch.
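The core retrieval step can be sketched in a few lines. This is a minimal illustration, not the project's actual implementation: it assumes image embeddings have already been computed (e.g. with a CLIP model), uses random vectors as stand-ins for real embeddings, and the helper names `normalize` and `top_k` are hypothetical. At the scales mentioned above, a brute-force dot product is replaced by an approximate-nearest-neighbor index (e.g. faiss).

```python
import numpy as np

def normalize(v):
    """L2-normalize rows so a dot product equals cosine similarity."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def top_k(query_emb, index_emb, k=5):
    """Return indices of the k most similar items, best first."""
    sims = normalize(index_emb) @ normalize(query_emb)
    return np.argsort(-sims)[:k]

# Stand-in embeddings: 1000 "images", 512-d, like CLIP ViT-B/32 output.
rng = np.random.default_rng(0)
index = rng.normal(size=(1000, 512)).astype("float32")
# A query close to item 42 (e.g. the text embedding of its caption).
query = index[42] + 0.01 * rng.normal(size=512).astype("float32")
print(top_k(query, index, k=3)[0])  # item 42 ranks first
```

Because CLIP maps images and text into the same embedding space, the same search works whether the query embedding comes from a caption or from another image.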
How the donated funds are distributed
Kivach runs on the Obyte network, so every donation can be tracked on-chain.