
Unofficial pytorch implementation on logit adjustment loss


bodhitrii/logit_adjustment


Long-tail learning via logit adjustment

Aditya Krishna Menon, Sadeep Jayasumana, Ankit Singh Rawat, Himanshu Jain, Andreas Veit, Sanjiv Kumar.


This is an unofficial PyTorch implementation of the logit adjustment loss from the paper Long-tail learning via logit adjustment (ICLR 2021).
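The core idea of the paper is a post-hoc additive adjustment: before the softmax, each logit is shifted by tau times the log of that class's prior, which raises the training loss on tail-class examples and counteracts the head-class bias. A minimal dependency-free sketch (not the repository's code; the function name and pure-Python style are illustrative):

```python
import math

def logit_adjusted_ce(logits, label, class_priors, tau=1.0):
    """Cross-entropy on logits shifted by tau * log(class prior).

    Adding log-priors to the logits before the softmax means rare
    (tail) classes must win by a larger margin, which increases their
    loss and gradient during training.
    """
    adjusted = [z + tau * math.log(p) for z, p in zip(logits, class_priors)]
    m = max(adjusted)  # subtract the max for numerical stability
    log_norm = m + math.log(sum(math.exp(a - m) for a in adjusted))
    return log_norm - adjusted[label]
```

With tau=0 this reduces to the ordinary cross-entropy; with tau=1 and imbalanced priors, a tail-class example incurs a strictly larger loss than under plain cross-entropy.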

Dependencies

The code is built with the following libraries:

  • PyTorch 1.2
  • TensorboardX
  • scikit-learn

Dataset

  • Imbalanced CIFAR. The original data will be downloaded and converted by imbalance_cifar.py.
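The exp imbalance type used below typically subsamples class i down to max_samples * imb_factor ** (i / (num_classes - 1)), so class 0 keeps all its samples and the last class keeps a fraction imb_factor of them (imb_factor 0.01 gives a head-to-tail ratio of 100). A sketch of that count schedule (an assumption about imbalance_cifar.py based on the common recipe, not its exact code):

```python
def exp_imbalance_counts(num_classes, max_samples, imb_factor):
    """Per-class sample counts for exponential long-tailed imbalance.

    Counts decay geometrically from max_samples (class 0) down to
    max_samples * imb_factor (last class).
    """
    return [int(max_samples * imb_factor ** (i / (num_classes - 1)))
            for i in range(num_classes)]
```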

Training

We provide the following training examples with this repo:

  • To train the ERM baseline on long-tailed CIFAR with an imbalance ratio of 100:
python cifar_train.py --gpu 0 --imb_type exp --imb_factor 0.01 --loss_type CE --train_rule None
  • To train the CE loss with the DAR-BN train rule on long-tailed CIFAR with an imbalance ratio of 100:
python cifar_train.py --gpu 0 --imb_type exp --imb_factor 0.01 --loss_type CE --train_rule DAR-BN
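The adjusted loss needs class priors, which are usually estimated empirically from the (imbalanced) training labels. A small sketch of that step (the helper name is illustrative, not from the repository):

```python
from collections import Counter

def class_priors(labels, num_classes):
    """Empirical class priors: fraction of training labels in each class."""
    counts = Counter(labels)
    total = len(labels)
    return [counts.get(c, 0) / total for c in range(num_classes)]
```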

Results

                Baseline (ERM)    Logit adjustment loss
CIFAR-10 LT     79.6              80.74
CIFAR-100 LT    36.2              41.58
