The repository provides code for running inference with the Meta Segment Anything Model 2 (SAM 2), links for downloading the trained model checkpoints, and example notebooks that show how to use the model.
SAM 2 is Meta's foundation model for **promptable visual segmentation** that works on both **images and videos**, extending the original SAM with a streaming memory module for real-time video processing.