Coded Federated Learning for Communication-Efficient Edge Computing: A Survey


Bibliographic Details
Published in: IEEE Open Journal of the Communications Society
Main Authors: Yiqian Zhang, Tianli Gao, Congduan Li, Chee Wei Tan
Format: Article in Journal/Newspaper
Language: English
Published: IEEE 2024
Subjects: DML
Online Access: https://doi.org/10.1109/OJCOMS.2024.3423362
https://doaj.org/article/d93b936f3870407498572dd7b2b05d78
Description
Summary: In the era of artificial intelligence and big data, the demand for data processing has surged, producing ever-larger datasets and heavier computation workloads. Distributed machine learning (DML) addresses this challenge by distributing tasks among multiple workers, reducing the resources required of each worker. However, in distributed systems, slow machines (commonly known as stragglers) or failed links can prolong runtimes and degrade performance. This survey explores the application of coding techniques in DML and in coded edge computing to enhance system speed, robustness, privacy, and more. Notably, the study delves into coding in Federated Learning (FL), a specialized distributed learning system. Coding introduces redundancy into the system and exploits multicast opportunities, and it entails a tradeoff between computation and communication costs: extra encoding and decoding work is exchanged for fewer or shorter waits on slow or failed nodes. The survey establishes that coding is a promising approach for building robust and secure distributed systems with low latency.
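To make the redundancy idea concrete, the sketch below shows MDS-coded matrix-vector multiplication, a canonical straggler-mitigation scheme from the coded-computing literature the survey covers: the master encodes k data blocks into n > k coded tasks, and the results from any k workers suffice to decode, so up to n - k stragglers can be ignored. This is a minimal illustrative sketch, not the survey's own construction; the function names, the Vandermonde generator, and the real-valued arithmetic are all assumptions made here for clarity.

```python
# Minimal sketch of (n, k) MDS-coded matrix-vector multiplication for
# straggler mitigation. Illustrative only: names, the Vandermonde code,
# and real-valued arithmetic are assumptions, not the survey's notation.
import numpy as np

def vandermonde(n, k):
    # Distinct evaluation points make every k x k submatrix invertible,
    # which is exactly the MDS property needed to decode from any k rows.
    pts = np.arange(1, n + 1, dtype=float)
    return np.vander(pts, k, increasing=True)

def encode(A, n, k):
    # Split A row-wise into k equal blocks, then form n coded blocks.
    assert A.shape[0] % k == 0, "pad A so its row count is divisible by k"
    blocks = A.reshape(k, A.shape[0] // k, A.shape[1])
    G = vandermonde(n, k)
    # coded[i] = sum_j G[i, j] * blocks[j]
    coded = np.einsum('ij,jrd->ird', G, blocks)
    return coded, G

def decode(partials, worker_ids, G, k):
    # Results from any k workers suffice: invert the k x k submatrix of G.
    ids = worker_ids[:k]
    Gs = G[ids]                                # (k, k) generator submatrix
    Y = np.stack([partials[i] for i in ids])   # (k, block_rows) observations
    block_results = np.linalg.solve(Gs, Y)     # recover each blocks[j] @ x
    return block_results.reshape(-1)

# Toy run: n = 5 workers, tolerating n - k = 2 stragglers.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
x = rng.standard_normal(4)
n, k = 5, 3
coded, G = encode(A, n, k)
partials = {i: coded[i] @ x for i in range(n)}  # each worker's coded task
fastest = [4, 1, 2]                             # pretend workers 0 and 3 straggle
y = decode(partials, fastest, G, k)
assert np.allclose(y, A @ x)                    # full result despite 2 stragglers
```

The redundancy factor n/k captures the tradeoff the abstract mentions: larger n means more total computation and communication across workers, but the master waits only for the fastest k responses.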