Official repository of "SAMURAI: Adapting Segment Anything Model for Zero-Shot Visual Tracking with Motion-Aware Memory"
SAMURAI is a zero-shot visual tracking system that adapts Meta's Segment Anything Model 2 (SAM 2) for object tracking by incorporating motion-aware memory, enabling it to track objects in videos without any training on tracking-specific data. Developed by researchers at the University of Washington's Information Processing Lab, it achieves state-of-the-art performance across major visual object tracking benchmarks, including LaSOT, GOT-10k, and TrackingNet. The project is intended for computer vision researchers and developers working on video analysis and object tracking applications.
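The motion-aware component can be illustrated with a constant-velocity Kalman filter over bounding boxes, which predicts where the target should be in the next frame so that memory and mask candidates can be scored against the motion prediction. The sketch below is only a minimal illustration under that assumption; the class name, state layout, and noise parameters are hypothetical and do not reflect the repository's actual implementation.

```python
import numpy as np

class SimpleKalmanBoxTracker:
    """Minimal constant-velocity Kalman filter over (cx, cy, w, h).

    Illustrative sketch only: state layout and noise settings are
    assumptions, not SAMURAI's actual motion module.
    """

    def __init__(self, box):
        # State: [cx, cy, w, h, vx, vy, vw, vh]
        self.x = np.array(list(box) + [0.0] * 4, dtype=float)
        self.P = np.eye(8) * 10.0          # state covariance
        self.F = np.eye(8)                  # transition: pos += vel
        for i in range(4):
            self.F[i, i + 4] = 1.0
        self.H = np.eye(4, 8)               # we observe only the box
        self.Q = np.eye(8) * 1e-2           # process noise
        self.R = np.eye(4) * 1e-1           # measurement noise

    def predict(self):
        # Propagate the state one frame forward.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:4]

    def update(self, box):
        # Correct the state with an observed box.
        z = np.asarray(box, dtype=float)
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(8) - K @ self.H) @ self.P
```

In a tracking loop, `predict()` would be called once per frame and the prediction compared against candidate masks, keeping in memory only frames whose boxes agree with the motion estimate.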
How the donated funds are distributed
Kivach runs on the Obyte network, so all donations can be tracked on-chain.