Conferences
-
Haibo Yang, Peiwen Qiu, and Jia Liu,
Taming Fat-Tailed ("Heavier-Tailed" with Potentially Infinite Variance) Noise in Federated Learning,
in Proc. NeurIPS, New Orleans, LA, Dec. 2022 (acceptance rate: 25.6%).
-
Haibo Yang, Zhuqing Liu, Xin Zhang, and Jia Liu,
SAGDA: Achieving O(ε⁻²) Communication Complexity in Federated Min-Max Learning,
in Proc. NeurIPS, New Orleans, LA, Dec. 2022 (acceptance rate: 25.6%).
-
Xin Zhang, Minghong Fang, Zhuqing Liu, Haibo Yang, Jia Liu, and Zhengyuan Zhu,
NET-FLEET: Achieving Linear Convergence Speedup for Fully Decentralized Federated Learning with Heterogeneous Data,
in Proc. ACM MobiHoc, Seoul, South Korea, Oct. 2022 (acceptance rate: 19.8%).
-
Haibo Yang, Xin Zhang, Prashant Khanduri, and Jia Liu,
Anarchic Federated Learning,
in Proc. ICML, Baltimore, MD, July 2022 (Long Presentation; long presentation rate: 2%, acceptance rate: 21.9%).
-
Jiayu Mao*, Haibo Yang*, Peiwen Qiu, Jia Liu, and Aylin Yener,
CHARLES: Channel-Quality-Adaptive Over-the-Air Federated Learning over Wireless Networks,
in Proc. IEEE SPAWC, Oulu, Finland, June 2022 (* equal contribution).
-
Haibo Yang, Peiwen Qiu, Jia Liu, and Aylin Yener,
Over-the-Air Federated Learning With Joint Adaptive Computation and Power Control,
in Proc. IEEE ISIT, Espoo, Finland, June 2022.
-
Prashant Khanduri, Haibo Yang, Mingyi Hong, Jia Liu, Hoi To Wai, and Sijia Liu,
Decentralized Learning for Overparameterized Problems: A Multi-Agent Kernel Approximation Approach,
in Proc. ICLR, Virtual Event, April 2022 (acceptance rate: 32%).
-
Prashant Khanduri, Pranay Sharma, Haibo Yang, Mingyi Hong, Jia Liu, Ketan Rajawat, and Pramod Varshney,
STEM: A Stochastic Two-Sided Momentum Algorithm Achieving Near-Optimal Sample and Communication Complexities for Federated Learning,
in Proc. NeurIPS, Virtual Event, Dec. 2021 (acceptance rate: 26%).
-
Haibo Yang, Jia Liu, and Elizabeth S. Bentley,
CFedAvg: Achieving Efficient Communication and Fast Convergence in Non-IID Federated Learning,
in Proc. IEEE/IFIP WiOpt, Virtual Event, Oct. 2021.
-
Haibo Yang, Minghong Fang, and Jia Liu,
Achieving Linear Speedup with Partial Worker Participation in Non-IID Federated Learning,
in Proc. ICLR, Virtual Event, May 2021 (acceptance rate: 28.6%).
-
Haibo Yang, Xin Zhang, Minghong Fang, and Jia Liu,
Adaptive Multi-Hierarchical signSGD for Communication-Efficient Distributed Optimization,
in Proc. IEEE SPAWC, Atlanta, GA, May 2020.
-
Haibo Yang, Xin Zhang, Minghong Fang, and Jia Liu,
Byzantine-Resilient Stochastic Gradient Descent for Distributed Learning: A Lipschitz-Inspired Coordinate-wise Median Approach,
in Proc. IEEE CDC, Nice, France, Dec. 2019.