CIFAR10 ResNet18


Created: 22 Nov 2022, 05:32 PM | Modified: =dateformat(this.file.mtime,"dd MMM yyyy, hh:mm a") Tags: knowledge, GeneralDL


From my hello-world project

Current progress: CIFAR10 trained to roughly 75% accuracy after 5 epochs

  • Good practice to bring it up to the usual benchmark of ~94%
  • CIFAR10 needs more work to get there
  • Understand the training procedures needed to match the benchmark
  • A well-tuned setup should reach ~94% accuracy; an ordinary one should still be above 90%
  • Use an LR scheduler! e.g. cosine annealing
  • Change the convnet's first layer: it is optimised for ImageNet with a 7x7 kernel; reduce it to 3x3 for 32x32 inputs
  • Use SGD instead of Adam to boost final accuracy
  • Cosine LR scheduler; the most recent option in torch is the one-cycle LR scheduler
  • Can train for up to 200 epochs
  • Train ImageNet to its baseline too
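The first-layer change above can be sketched like this (assuming torchvision's ResNet-18, whose stem is built for 224x224 ImageNet inputs): swap the 7x7 stride-2 stem conv for a 3x3 stride-1 conv and drop the initial max-pool so 32x32 CIFAR images aren't downsampled too aggressively before the residual blocks.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

def cifar_resnet18(num_classes: int = 10) -> nn.Module:
    model = resnet18(num_classes=num_classes)
    # ImageNet stem is Conv2d(3, 64, kernel_size=7, stride=2, padding=3);
    # replace with a 3x3 stride-1 conv suited to 32x32 inputs.
    model.conv1 = nn.Conv2d(3, 64, kernel_size=3, stride=1,
                            padding=1, bias=False)
    # The stem max-pool would halve resolution again; make it a no-op.
    model.maxpool = nn.Identity()
    return model

model = cifar_resnet18()
out = model(torch.randn(2, 3, 32, 32))  # logits of shape (2, 10)
```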
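A minimal sketch of the optimizer/scheduler recipe: SGD with momentum plus `CosineAnnealingLR` stepped once per epoch (the one-cycle alternative is `torch.optim.lr_scheduler.OneCycleLR`, stepped per batch). The linear model and hyperparameters here are placeholders, not a tuned setup.

```python
import torch
import torch.nn as nn

epochs = 200                          # common CIFAR10 training budget
model = nn.Linear(3 * 32 * 32, 10)    # placeholder for the real network
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=5e-4)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=epochs)

for epoch in range(epochs):
    # ... one full pass over the training set: forward, backward,
    # optimizer.step(), optimizer.zero_grad() ...
    scheduler.step()                  # decay LR along a cosine curve

final_lr = optimizer.param_groups[0]["lr"]  # ~0 after the full schedule
```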

http://blog.fpt-software.com/cifar10-94-of-accuracy-by-50-epochs-with-end-to-end-training