
GADMM: fast and communication efficient framework for distributed machine learning

For linear regression on synthetic data, run LinearRegression_synthetic.m.

For linear regression on real data, run LinearRegression_real.m.

For logistic regression on synthetic data, run LogisticRegression_synthetic.m.

For logistic regression on real data, run LogisticRegression_real.m.
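Each script is self-contained and takes no arguments. Assuming MATLAB is started in the repository root, a run looks like:

    run('LinearRegression_synthetic.m')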

Each script runs the corresponding regression task using our proposed algorithm (GADMM) and all baseline schemes described in our paper (see the citation below).
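For orientation, below is a minimal, illustrative sketch of one GADMM round for decentralized linear regression over a chain of workers (head workers update in parallel, then tail workers, then the dual variables). It is not the repository's implementation; all variable names and parameter values in it are our own choices for the sketch.

    % Minimal GADMM sketch for decentralized linear regression (illustrative only;
    % see LinearRegression_synthetic.m for the actual experiments).
    % N workers on a chain minimize sum_n 0.5*||A{n}*theta_n - b{n}||^2
    % subject to theta_n = theta_{n+1}. Odd workers are heads, even workers tails.

    rng(0);
    N   = 6;          % number of workers (assumed even here)
    d   = 10;         % model dimension
    m   = 50;         % samples per worker
    rho = 1.0;        % penalty parameter
    K   = 200;        % number of GADMM rounds

    theta_true = randn(d, 1);
    A = cell(N, 1); b = cell(N, 1);
    for n = 1:N
        A{n} = randn(m, d);
        b{n} = A{n} * theta_true + 0.01 * randn(m, 1);
    end

    theta  = zeros(d, N);       % local models theta_n
    lambda = zeros(d, N - 1);   % dual variable on edge (n, n+1)

    for k = 1:K
        % Head workers (odd n) update in parallel, using tails' latest models.
        for n = 1:2:N
            rhs = A{n}' * b{n};
            M   = A{n}' * A{n};
            if n > 1                      % left neighbor n-1 exists
                rhs = rhs + lambda(:, n-1) + rho * theta(:, n-1);
                M   = M + rho * eye(d);
            end
            if n < N                      % right neighbor n+1 exists
                rhs = rhs - lambda(:, n) + rho * theta(:, n+1);
                M   = M + rho * eye(d);
            end
            theta(:, n) = M \ rhs;        % closed-form local update
        end
        % Tail workers (even n) update using the heads' freshly updated models.
        for n = 2:2:N
            rhs = A{n}' * b{n};
            M   = A{n}' * A{n};
            if n > 1
                rhs = rhs + lambda(:, n-1) + rho * theta(:, n-1);
                M   = M + rho * eye(d);
            end
            if n < N
                rhs = rhs - lambda(:, n) + rho * theta(:, n+1);
                M   = M + rho * eye(d);
            end
            theta(:, n) = M \ rhs;
        end
        % Dual update on every edge of the chain.
        for n = 1:N-1
            lambda(:, n) = lambda(:, n) + rho * (theta(:, n) - theta(:, n+1));
        end
    end

    fprintf('max disagreement across neighbors: %.2e\n', ...
            max(vecnorm(theta(:, 1:N-1) - theta(:, 2:N))));

The point of the grouping is communication efficiency: in each round a worker only exchanges its model with its one or two neighbors on the chain, rather than with a central parameter server.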

For linear regression using D-GADMM (regression over a dynamic network) on the synthetic dataset, run Dynamic_LinearRegression_Synthetic.m.

For linear regression using D-GADMM (regression over a dynamic network) on the real dataset, run dynamic_LinearRegression_Real.m.

To compare D-GADMM with GADMM and standard ADMM (ADMM with a parameter server, i.e., star topology) on the synthetic dataset, run LinearRegression_gadmm_vs_admm.m.

The datasets used by these scripts are available at: https://tiny.cc/gadmm_dataset

Citation

@article{elgabli2019gadmm,
  title={GADMM: Fast and communication efficient framework for distributed machine learning},
  author={Elgabli, Anis and Park, Jihong and Bedi, Amrit S and Bennis, Mehdi and Aggarwal, Vaneet},
  journal={arXiv preprint arXiv:1909.00047},
  year={2019}
}
