$D^2$: Decentralized Training over Decentralized Data

Ce Zhang, Ming Yan, Hanlin Tang, Ji Liu, Xiangru Lian


Abstract

When a machine learning model is trained by multiple workers, each of which collects data from its own data source, it is natural for the data collected by different workers to be unique and different. Ironically, recent analysis of decentralized parallel stochastic gradient descent (D-PSGD) relies on the assumption that the data hosted on different workers are not too different. In this paper, we ask the question: Can we design a decentralized parallel stochastic gradient descent algorithm that is less sensitive to the data variance across workers? We present D$^2$, a novel decentralized parallel stochastic gradient descent algorithm designed for large data variance among workers (imprecisely, "decentralized" data). The core of D$^2$ is a variance reduction extension of D-PSGD. It improves the convergence rate from $O\left(\frac{\sigma}{\sqrt{nT}} + \frac{(n\zeta^2)^{1/3}}{T^{2/3}}\right)$ to $O\left(\frac{\sigma}{\sqrt{nT}}\right)$, where $\zeta^{2}$ denotes the variance among data on different workers. As a result, D$^2$ is robust to data variance among workers. We empirically evaluated D$^2$ on image classification tasks, where each worker has access to only the data of a limited set of labels, and found that D$^2$ significantly outperforms D-PSGD.
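The variance-reduction idea behind D$^2$ can be illustrated with a small simulation. The sketch below (not the authors' implementation) uses deterministic local gradients on per-worker quadratics with very different optima, i.e. large data variance $\zeta^2$; the mixing matrix `W`, the local objectives `b`, and the learning rate are illustrative choices. The update applied in `d2_step` follows the recursion described in the paper, up to notation: each new iterate combines two previous iterates with a gradient *difference*, which is what cancels the cross-worker data variance.

```python
import numpy as np

def d2_step(W, x_t, x_prev, g_t, g_prev, lr):
    # One D^2 iteration: each entry of x is one worker's (scalar) model.
    # Update rule (up to notation):
    #   X_{t+1} = (2 X_t - X_{t-1} - lr * (G_t - G_{t-1})) @ W
    # The gradient-difference term is the variance-reduction correction
    # that removes the dependence on the data variance zeta^2 across workers.
    return (2 * x_t - x_prev - lr * (g_t - g_prev)) @ W

# Toy setup (hypothetical): 3 workers, each minimizing its own quadratic
# (x - b_i)^2 / 2, so the data variance across workers is large.
# The global optimum is mean(b) = 1.0.
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])  # symmetric, doubly stochastic mixing matrix
b = np.array([-1.0, 0.0, 4.0])      # heterogeneous local optima
grad = lambda x: x - b              # per-worker local gradients

lr = 0.1
x_prev = np.zeros(3)
g_prev = grad(x_prev)
x_t = (x_prev - lr * g_prev) @ W    # the first step is a plain D-PSGD step

for _ in range(500):
    g_t = grad(x_t)
    x_t, x_prev, g_prev = d2_step(W, x_t, x_prev, g_t, g_prev, lr), x_t, g_t

print(np.round(x_t, 3))  # all workers reach consensus at the global optimum 1.0
```

Summing the update over workers (using that `W` is doubly stochastic) shows the average iterate performs exact gradient descent on the average objective, while the gradient-difference term drives the workers to consensus despite their very different local data; plain D-PSGD on the same setup would be biased by the heterogeneity unless the step size is driven down.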

Benchmarks

| Benchmark | Methodology | Metrics |
|---|---|---|
| multi-view-subspace-clustering-on-orl | DCSC | Accuracy: 0.811 |
