Mystique: An Adaptive Straggler Mitigation Technique for Distributed Neural Network Training

Souptik Sen

ABSTRACT

Distributed Stochastic Gradient Descent (SGD), when run synchronously, suffers from delays while waiting for the slowest learners (stragglers). Asynchronous methods can alleviate straggler delays, but they introduce gradient staleness that can adversely affect convergence. In this work I propose a new algorithm, called Mystique, which holds a middle ground between synchronous and asynchronous methods and outperforms both in high-straggler environments.
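As a rough illustration of the middle ground the abstract describes, the sketch below shows a generic bounded-synchronous update step that waits only for the fastest K of N simulated workers, rather than all of them (fully synchronous) or just one (fully asynchronous). This is not the Mystique algorithm itself; the worker simulation, the parameter K, and all function names here are illustrative assumptions.

```python
# Illustrative sketch only: a generic "wait for the fastest K of N workers"
# SGD step. Fully synchronous SGD corresponds to K = N (stalls on
# stragglers); fully asynchronous SGD is closer to K = 1 (stale gradients).
# This is NOT the Mystique algorithm; all names and parameters are assumed.
import concurrent.futures
import random
import time

import numpy as np

N_WORKERS = 8
K = 6    # proceed once the fastest K gradients arrive (assumed knob)
DIM = 4  # toy parameter dimension

def compute_gradient(worker_id: int, params: np.ndarray) -> np.ndarray:
    """Simulate one worker; worker 0 is a deliberate straggler."""
    time.sleep(random.uniform(0.01, 0.5 if worker_id == 0 else 0.05))
    return params * 0.1 + np.random.randn(DIM) * 0.01  # toy gradient

def bounded_sync_step(params: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """One SGD step that averages only the first K gradients to arrive."""
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=N_WORKERS)
    futures = [pool.submit(compute_gradient, i, params)
               for i in range(N_WORKERS)]
    grads = []
    for fut in concurrent.futures.as_completed(futures):
        grads.append(fut.result())
        if len(grads) == K:
            break  # don't wait for the remaining stragglers
    pool.shutdown(wait=False)  # return without blocking on slow workers
    return params - lr * np.mean(grads, axis=0)

if __name__ == "__main__":
    params = np.zeros(DIM)
    for step in range(3):
        params = bounded_sync_step(params)
        print(f"step {step}: params = {params}")
```

In this kind of scheme, the trade-off is tunable: raising K toward N recovers synchronous behavior, while lowering it reduces straggler stalls at the cost of averaging over fewer (and potentially less representative) gradients per step.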