Document detail
ID

oai:arXiv.org:2404.08847

Topic
Computer Science - Information Retrieval; Computer Science - Cryptography and Security; Computer Science - Machine Learning
Author
Lim, Juntaek; Kwon, Youngeun; Hwang, Ranggi; Maeng, Kiwan; Suh, G. Edward; Rhu, Minsoo
Category

Computer Science

Year

2024

Listing date

4/17/2024

Keywords
lazydp training recsys private computer
Abstract

Differential privacy (DP) is widely employed in industry as a practical standard for privacy protection.

While private training of computer vision and natural language processing applications has been studied extensively, the computational challenges of training recommender systems (RecSys) with DP have not been explored.

In this work, we first present a detailed characterization of private RecSys training using DP-SGD, root-causing several of its performance bottlenecks.

Specifically, we identify that DP-SGD's noise sampling and noisy gradient update stages suffer from severe compute and memory bandwidth limitations, respectively, causing significant performance overhead when training private RecSys.

Based on these findings, we propose LazyDP, an algorithm-software co-design that addresses the compute and memory challenges of training RecSys with DP-SGD.

Compared to a state-of-the-art DP-SGD training system, we demonstrate that LazyDP provides an average 119x training throughput improvement while guaranteeing that the trained RecSys models remain mathematically equivalent and differentially private.
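For context, the two DP-SGD stages the abstract identifies as bottlenecks can be sketched minimally. The following is a generic DP-SGD step (per-example gradient clipping, Gaussian noise sampling, dense noisy update), not LazyDP itself; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def dp_sgd_step(per_example_grads, weights, lr=0.1, clip_norm=1.0,
                noise_multiplier=1.0, rng=None):
    """One generic DP-SGD step: clip each per-example gradient to
    clip_norm, average, add Gaussian noise, and update the weights."""
    rng = np.random.default_rng(0) if rng is None else rng
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    avg = np.mean(clipped, axis=0)
    # Noise sampling stage: one Gaussian sample per model weight, which
    # becomes compute-bound for the huge embedding tables of a RecSys model.
    sigma = noise_multiplier * clip_norm / len(per_example_grads)
    noise = rng.normal(0.0, sigma, size=weights.shape)
    # Noisy gradient update stage: unlike non-private sparse training, the
    # noise makes the update dense, touching every weight (including rows no
    # example accessed), which is memory-bandwidth-bound for embeddings.
    return weights - lr * (avg + noise)
```

The comments point at why these stages are costly specifically for RecSys: the noise and update must cover the full embedding tables, not just the sparsely accessed rows.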

Lim, Juntaek; Kwon, Youngeun; Hwang, Ranggi; Maeng, Kiwan; Suh, G. Edward; Rhu, Minsoo, 2024, LazyDP: Co-Designing Algorithm-Software for Scalable Training of Differentially Private Recommendation Models
