ONLINE BATCH SELECTION FOR FASTER TRAINING OF NEURAL NETWORKS


Instead of uniformly iterating over the entire training set each epoch, the idea is to:

  • focus training on the most important datapoints
  • treat a datapoint's importance as proportional to its current loss value
  • sort the datapoints by loss and assign each an exponentially decaying weight based on its rank
    • adjust the decay factor as training makes var(loss) smaller
  • randomly select the datapoints for each batch according to these weights (see the sketch below)
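Below is a minimal NumPy sketch of this rank-based weighting and sampling, not the authors' implementation; the `selection_pressure` parameter name, its default value, and the fixed-schedule usage are assumptions standing in for the paper's annealed selection pressure.

```python
import numpy as np

def selection_probabilities(losses, selection_pressure=100.0):
    """Exponentially decaying sampling weights over datapoints ranked by loss."""
    n = len(losses)
    order = np.argsort(-losses)               # order[0] = index of the largest loss
    decay = np.exp(np.log(selection_pressure) / n)
    weights = 1.0 / decay ** np.arange(n)     # weight falls off exponentially with rank
    probs = np.empty(n)
    probs[order] = weights / weights.sum()    # map rank weights back to datapoint indices
    return probs

def sample_batch(losses, batch_size, selection_pressure=100.0, rng=None):
    """Randomly draw a batch of indices according to the rank-based weights."""
    rng = rng or np.random.default_rng()
    probs = selection_probabilities(losses, selection_pressure)
    return rng.choice(len(losses), size=batch_size, replace=True, p=probs)

# Usage sketch: re-rank with the latest known per-datapoint losses, then sample
# the next batch; selection_pressure could be lowered as var(loss) shrinks.
losses = np.random.rand(10_000)               # stand-in for per-datapoint losses
batch_idx = sample_batch(losses, batch_size=64)
```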