
Single-positive Multi-label Learning with Label Cardinality [TMLR 2025] [Paper]

This is the official repository for the paper Single-positive Multi-label Learning with Label Cardinality published at TMLR 2025. Follow the steps below to prepare datasets and run experiments with our implementation.


Dataset Preparation

We use the same benchmark datasets as prior work. You can follow the dataset download, formatting, and single-positive label generation instructions from the EM repository.

For convenience, the Python scripts required for preprocessing are included in our preproc folder (copied from the EM repository).

Once the datasets are prepared, follow the instructions below to run our methods.


Running Experiments

To train and evaluate a model, run:

python main.py -d {DATASET} -l {LOSS} -g {GPU} -s {PYTORCH_SEED} -m {IC_MODE}

Command-line arguments:

  1. {dataset}: Dataset to use.

    default: pascal | Options: pascal, coco, nuswide, or cub

  2. {loss}: Loss function / method for training.

    default: ic_loss | Options: bce, an, ic_loss, or cs_loss

  3. {gpu}: GPU index.

    default: 0

  4. {pytorch_seed}: PyTorch random seed.

    default: 0

  5. {ic_mode}: Mode for determining instance cardinality (only used when loss=ic_loss).

    default: true | Options: true or estimate
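Conceptually, the two ic_mode settings differ in where the per-instance cardinality (the number of positive labels for an instance) comes from. The sketch below is a hypothetical illustration of that distinction, not the repository's actual code: "true" counts positives in a full label matrix, while "estimate" stands in for any scheme that approximates cardinality when full labels are unavailable (here, simply the dataset mean).

```python
import numpy as np

def instance_cardinality(full_labels: np.ndarray, mode: str = "true") -> np.ndarray:
    """Illustrative (hypothetical) per-instance label cardinality.

    - "true": count the positives in each row of the full label matrix.
    - "estimate": use the dataset-level mean cardinality for every instance,
      as one simple stand-in when full labels are not observed.
    """
    true_card = full_labels.sum(axis=1)  # positives per instance
    if mode == "true":
        return true_card
    if mode == "estimate":
        return np.full(len(full_labels), true_card.mean())
    raise ValueError(f"unknown ic_mode: {mode}")

labels = np.array([[1, 0, 1, 0],
                   [0, 1, 1, 1],
                   [1, 0, 0, 0]])
print(instance_cardinality(labels, "true"))      # [2 3 1]
print(instance_cardinality(labels, "estimate"))  # [2. 2. 2.]
```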

Example:

Train and evaluate on the PASCAL VOC dataset using ic_loss with true instance cardinalities:

python main.py -d pascal -l ic_loss -m true
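For reference, the flag-to-argument mapping described above could be wired up with argparse roughly as follows. This is a sketch with defaults taken from this README; the actual main.py may define its arguments differently.

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Sketch of the CLI documented in this README (hypothetical, not the
    # repository's actual main.py).
    p = argparse.ArgumentParser(description="SPMLL with label cardinality")
    p.add_argument("-d", "--dataset", default="pascal",
                   choices=["pascal", "coco", "nuswide", "cub"])
    p.add_argument("-l", "--loss", default="ic_loss",
                   choices=["bce", "an", "ic_loss", "cs_loss"])
    p.add_argument("-g", "--gpu", type=int, default=0)
    p.add_argument("-s", "--pytorch_seed", type=int, default=0)
    p.add_argument("-m", "--ic_mode", default="true",
                   choices=["true", "estimate"])
    return p

# Parse the example invocation from this README.
args = build_parser().parse_args(["-d", "pascal", "-l", "ic_loss", "-m", "true"])
print(args.dataset, args.loss, args.ic_mode)  # pascal ic_loss true
```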

Citation

If you find our work useful in your research, please consider citing our paper:

@article{gharib2025singlepositive,
title={Single-positive Multi-label Learning with Label Cardinality},
author={Shayan Gharib and Pierre-Alexandre Murena and Arto Klami},
journal={Transactions on Machine Learning Research},
issn={2835-8856},
year={2025},
url={https://openreview.net/forum?id=XEPPXH2nKu},
note={Expert Certification}
}

Acknowledgement

Our code is mainly built upon EM and ROLE, which are also baselines in our experiments.

We thank the authors of these works for their open-source implementations, which facilitated the development of our SPMLL methods and ensured fair comparisons.
