March 6, 2017

Speed up Deep Learning with Threading - Hogwild! Training

One of the major issues in training deep learning models is that there is a TON of data, and training as a single-threaded process takes a very long time. This is where Hogwild! training comes in.

The idea is to have multiple threads, say 15, with each thread training on its own split of the dataset (n/15). In the real world the threads never start at exactly the same time: thread 1 starts, a few milliseconds later thread 2 starts, and so on. Once thread 1 finishes a step it updates the weights from W_0 to W_1; when thread 2 finishes, it updates the latest version of the weights, W_1, to W_2, and so on. The key trick (hence "lock-free" in the paper's title) is that the threads read and write the shared weights without taking any locks, and the occasional overlapping updates are simply tolerated. It's like batch processing, but with each batch pipelined.
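To make this concrete, here is a minimal sketch of Hogwild-style lock-free SGD on a toy linear-regression problem I made up for illustration (the data, parameter names, and constants are all hypothetical, not from the paper). Each thread gets its own shard of the data and writes its gradient updates straight into a shared weight vector with no locking.

import numpy as np
import threading

NUM_THREADS = 4          # the post uses 15; 4 keeps the toy example quick
LR = 0.01                # learning rate
STEPS = 2000             # SGD steps per thread

# Hypothetical toy regression data, just so there is something to fit.
rng = np.random.default_rng(0)
X = rng.normal(size=(4000, 10))
true_w = rng.normal(size=10)
y = X @ true_w + 0.01 * rng.normal(size=4000)

w = np.zeros(10)         # shared weights; no lock is ever taken around them

def worker(shard_x, shard_y, seed):
    local_rng = np.random.default_rng(seed)
    for _ in range(STEPS):
        i = local_rng.integers(len(shard_x))
        xi, yi = shard_x[i], shard_y[i]
        grad = (xi @ w - yi) * xi    # gradient of the squared error for one sample
        w[:] -= LR * grad            # lock-free, in-place update of the shared weights

# Split the dataset into one shard per thread (n / NUM_THREADS examples each).
shards_x = np.array_split(X, NUM_THREADS)
shards_y = np.array_split(y, NUM_THREADS)
threads = [threading.Thread(target=worker, args=(sx, sy, s))
           for s, (sx, sy) in enumerate(zip(shards_x, shards_y))]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("distance from true weights:", np.linalg.norm(w - true_w))

Note that because of Python's GIL this toy version won't actually run the updates in parallel; it just shows the access pattern. Real implementations run the workers as separate processes or native threads over shared memory, for example PyTorch's torch.multiprocessing with model.share_memory().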

[*] Niu, Recht, Ré, and Wright. Hogwild!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent. NIPS 2011.
