Parallel Splash Belief Propagation
Joseph E. Gonzalez, Department of Machine Learning, Carnegie Mellon University, Pittsburgh, PA 15213
[email protected]
As computer architectures transition towards exponentially increasing parallelism, we are forced to adopt parallelism at a fundamental level in machine learning. In this work we focus on parallel inference in graphical models. We demonstrate that the natural, fully synchronous parallelization of belief propagation is highly inefficient. By bounding the achievable parallel performance on chain graphical models, we develop a theoretical understanding of the parallel limitations of belief propagation. We then introduce Belief Residual Splash, a new parallel belief propagation algorithm that achieves the optimal bounds and demonstrates linear to super-linear scaling on large graphical models.