Authors: Singh, Prajwal; Vashishtha, Gautam; Mastan, Indra Deep; Raman, Shanmuganathan
Title: BloomCoreset: Fast Coreset Sampling using Bloom Filters for Fine-Grained Self-Supervised Learning
Type: Conference Paper (Conference Proceeding)
Date issued: 2025-01-01
Date available: 2025-08-31
ISBN: 9798350368741
DOI: 10.1109/ICASSP49660.2025.10888815
Scopus ID: 2-s2.0-105003879425
URI: https://d8.irins.org/handle/IITG2025/28371
Keywords: bloom filter | classification | coreset | open-set | representation learning | self-supervised learning

Abstract: The success of deep learning in supervised fine-grained recognition for domain-specific tasks relies heavily on expert annotations. The Open-Set for fine-grained Self-Supervised Learning (SSL) problem aims to enhance performance on downstream tasks by strategically sampling a subset of images (the Core-Set) from a large pool of unlabeled data (the Open-Set). In this paper, we propose a novel method, BloomCoreset, that significantly reduces sampling time from the Open-Set while preserving the quality of samples in the coreset. To achieve this, we utilize Bloom filters as an innovative hashing mechanism to store both low- and high-level features of the fine-grained dataset, as captured by OpenCLIP, in a space-efficient manner that enables rapid retrieval of the coreset from the Open-Set. To show the effectiveness of the sampled coreset, we integrate the proposed method into the state-of-the-art fine-grained SSL framework, SimCore [1]. The proposed algorithm substantially outperforms the baseline sampling strategy of [1], achieving a 98.5% reduction in sampling time at the cost of only a 0.83% average drop in accuracy across 11 downstream datasets. The code is publicly available.
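The abstract's central mechanism is a Bloom filter used for space-efficient membership tests over hashed image features. The following Python sketch is an illustration of that general idea only, not the authors' released implementation: the BloomFilter class, the placeholder descriptor tokens, and the step that quantizes OpenCLIP embeddings into discrete tokens are all assumptions introduced here for clarity.

import hashlib

class BloomFilter:
    """Minimal Bloom filter: k salted hashes over a fixed-size bit array."""

    def __init__(self, num_bits: int = 1 << 20, num_hashes: int = 5):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8 + 1)

    def _positions(self, item: str):
        # Derive k bit positions from salted SHA-256 digests of the item.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item: str) -> bool:
        # May return false positives, never false negatives.
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

# Usage sketch: in BloomCoreset-style sampling, descriptors would come from
# hashed/quantized features of the fine-grained target dataset (assumed here
# as placeholder tokens). Open-Set images whose descriptors hit the filter
# are kept as coreset candidates; lookups are O(k) per descriptor.
target_descriptors = ["tok_17", "tok_42", "tok_99"]  # hypothetical tokens
bf = BloomFilter()
for d in target_descriptors:
    bf.add(d)

open_set = {"img_a": ["tok_42", "tok_07"], "img_b": ["tok_55"]}
coreset = [name for name, toks in open_set.items()
           if any(t in bf for t in toks)]
print(coreset)  # ['img_a'] -- images sharing descriptors with the target set

A design note on why this fits the paper's speed claim: a Bloom filter stores only bits, never the features themselves, so the target dataset's feature signatures fit in a small fixed-size array and each Open-Set candidate is accepted or rejected with a handful of hash probes rather than a nearest-neighbor search.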