Hire an Illini

Cong Xie

  • Advisors:
      • Oluwasanmi Koyejo, Indranil Gupta
  • Departments:
  • Areas of Expertise:
      • Federated Learning
      • Distributed Systems
      • Machine Learning
  • Thesis Title:
      • Toward Communication-Efficient and Secure Distributed Machine Learning
  • Thesis abstract:
      • In recent years, the sizes of both machine-learning models and datasets have grown rapidly. Stochastic Gradient Descent (SGD) and its variants are commonly used to train large-scale deep neural networks, and it is common to distribute the computation across multiple machines to accelerate training. Distributed model training raises two main concerns: how to make distributed SGD faster and more scalable, and how to protect the learning system from malicious attacks. We propose a distributed machine-learning system that achieves better scalability through asynchrony and reduced communication overhead, and that tolerates failures and attacks on the workers. More specifically, we use both infrequent synchronization and tensor compression to reduce the communication overhead. Furthermore, we use robust approaches to protect the system from worst-case attacks and failures (also known as Byzantine failures). The goal is an efficient and trustworthy machine-learning system for both cloud and edge computing, with theoretical guarantees and good empirical performance.
  • Downloads:
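The abstract mentions robust aggregation that tolerates Byzantine workers. As a minimal illustration of the general idea (the coordinate-wise median is one standard robust rule, chosen here as an assumption; the thesis does not specify which rule it uses), the sketch below shows why a robust aggregator resists a worker that reports arbitrary values, whereas plain averaging does not:

```python
import numpy as np

def coordinate_wise_median(gradients):
    """Aggregate worker gradients by taking the median of each coordinate.

    Unlike the mean, the median is robust: a minority of Byzantine
    workers sending arbitrary values cannot pull the result far away.
    """
    return np.median(np.stack(gradients), axis=0)

# Three honest workers report similar gradients; one Byzantine worker lies.
honest = [np.array([1.0, 2.0]), np.array([1.1, 1.9]), np.array([0.9, 2.1])]
byzantine = [np.array([1e6, -1e6])]

robust = coordinate_wise_median(honest + byzantine)   # stays near [1.05, 1.95]
naive = np.mean(np.stack(honest + byzantine), axis=0) # dragged to ~[2.5e5, -2.5e5]
```

With four reports per coordinate, the median averages the two middle values, so the single outlier is ignored entirely; the plain mean, by contrast, is dominated by it.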

Contact information:
cx2@illinois.edu