Event-Triggered Communication in Parallel Machine Learning

The primary objective of this repository is to introduce EventGraD, a novel algorithm that uses event-triggered communication to reduce the number of messages exchanged in parallel machine learning. EventGraD operates in the decentralized setting, where each processor communicates only with its neighboring processors at every iteration instead of participating in an AllReduce involving every processor. The main idea is to communicate a parameter only when it has changed by more than a threshold since it was last sent. For details on how to choose an adaptive threshold, as well as convergence proofs, please refer to the publications below. EventGraD saves around 70% of the messages on MNIST and 60% of the messages on CIFAR-10. Please see /dmnist/event/ for the EventGraD code on MNIST and /dcifar10/event for the EventGraD code on CIFAR-10.
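
Conceptually, the event-triggered rule can be sketched as follows in C++ with LibTorch and MPI. This is a minimal illustration, not the repository's exact implementation: the function name maybe_send_to_neighbors, the norm-based change measure, and the float32/tag assumptions are all placeholders.

```cpp
#include <mpi.h>
#include <torch/torch.h>

#include <vector>

// Illustrative sketch of event-triggered communication: a parameter tensor is
// sent to neighbor ranks only when it has changed by more than a threshold
// since the last value that was sent.
void maybe_send_to_neighbors(const torch::Tensor& param,
                             torch::Tensor& last_sent, double threshold,
                             const std::vector<int>& neighbors) {
  // Event condition: how far the parameter has drifted since the last send.
  double change = (param - last_sent).norm().item<double>();
  if (change <= threshold) {
    return;  // no event, no messages this iteration -- this is the saving
  }

  torch::Tensor buf = param.contiguous();  // contiguous storage for MPI
  std::vector<MPI_Request> requests(neighbors.size());
  for (size_t i = 0; i < neighbors.size(); ++i) {
    // Non-blocking send of the parameter values (tag 0 chosen arbitrarily,
    // float32 parameters assumed).
    MPI_Isend(buf.data_ptr<float>(), static_cast<int>(buf.numel()), MPI_FLOAT,
              neighbors[i], /*tag=*/0, MPI_COMM_WORLD, &requests[i]);
  }
  MPI_Waitall(static_cast<int>(requests.size()), requests.data(),
              MPI_STATUSES_IGNORE);
  last_sent = param.clone();  // remember what the neighbors now hold
}
```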

PyTorch C++ API meets MPI

The secondary objective of this repository is to serve as a starting point for implementing parallel/distributed machine learning with the PyTorch C++ API (LibTorch) and MPI. Apart from EventGraD, other popular distributed algorithms are covered, such as AllReduce-based training (/dmnist/cent/) and decentralized training with neighbors (/dmnist/decent/). The AllReduce-based training code was contributed to the pytorch/examples repository through a pull request.
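
For orientation, an AllReduce gradient-averaging step with LibTorch and MPI can be sketched as below. This is a simplification rather than the exact code in /dmnist/cent/; the function name average_gradients and the float32/contiguity assumptions are illustrative.

```cpp
#include <mpi.h>
#include <torch/torch.h>

// Illustrative sketch of AllReduce-based training: after the backward pass,
// average each gradient across all ranks so every processor takes the same
// optimizer step.
void average_gradients(torch::nn::Module& model, int world_size) {
  for (auto& p : model.parameters()) {
    if (!p.grad().defined()) {
      continue;  // parameter received no gradient this iteration
    }
    torch::Tensor grad = p.mutable_grad();  // shares storage with p.grad()
    // Sum the gradient over all processors in place (assumes float32,
    // contiguous gradients), then divide by the number of processors.
    MPI_Allreduce(MPI_IN_PLACE, grad.data_ptr<float>(),
                  static_cast<int>(grad.numel()), MPI_FLOAT, MPI_SUM,
                  MPI_COMM_WORLD);
    grad.div_(world_size);
  }
}
```

Such a step would be called once per iteration, between the backward pass and the optimizer step; it is the centralized baseline that the decentralized and event-triggered variants replace with neighbor-only communication.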

Publications

  1. Soumyadip Ghosh, Bernardo Aquino, and Vijay Gupta, "EventGraD: Event-Triggered Communication in Parallel Machine Learning", Neurocomputing, November 2021 (arXiv)

  2. Soumyadip Ghosh and Vijay Gupta, "EventGraD: Event-Triggered Communication in Parallel Stochastic Gradient Descent", Workshop on Machine Learning in HPC Environments (MLHPC), Supercomputing Conference (SC), Virtual, November 2020
