About Me

👩 I am currently an Assistant Professor at the University of South China (USC). I received my Ph.D. from Beijing University of Posts and Telecommunications in 2025, advised by Prof. Jingyu Wang. In 2023, I was a visiting student at ETH Zurich, working with Prof. Ce Zhang.

📖 My research interests include distributed machine learning and model compression.

😄 Feel free to contact me: ningwanyi@126.com

Publications

  • [TNNLS’25] Wanyi Ning, Jingyu Wang, Qi Qi, Haifeng Sun, Daixuan Cheng, Cong Liu, Lei Zhang, Zirui Zhuang, Jianxin Liao. “Federated Fine-Tuning on Heterogeneous LoRAs With Error-Compensated Aggregation”. [Paper]

  • [ACL’25] Minwei Zhang, Haifeng Sun, Jingyu Wang, Shaolong Li, Wanyi Ning, Qi Qi, Zirui Zhuang, Jianxin Liao. “ClusterAttn: KV Cache Compression under Intrinsic Attention Clustering”. [Paper]

  • [NeurIPS’24] Wanyi Ning, Jingyu Wang, Qi Qi, Mengde Zhu, Haifeng Sun, Daixuan Cheng, Jianxin Liao, Ce Zhang. “FM-Delta: Lossless Compression for Storing Massive Fine-Tuned Foundation Models”. [Paper]

  • [Euro-Par’24] Mengde Zhu*, Wanyi Ning*, Qi Qi, Jingyu Wang, Zirui Zhuang, Haifeng Sun, Jun Huang, Jianxin Liao. “FLUK: Protecting Federated Learning Against Malicious Clients for Internet of Vehicles”. [Paper]

  • [TSC’24] Wanyi Ning, Qi Qi, Jingyu Wang, Mengde Zhu, Shaolong Li, Guang Yang, Jianxin Liao. “One Teacher is Enough: A Server-Clueless Federated Learning With Knowledge Distillation”. [Paper]

  • [ICML’23 Workshop] Berivan Isik*, Hermann Kumbong*, Wanyi Ning*, Xiaozhe Yao*, Sanmi Koyejo, Ce Zhang. “GPT-Zip: Deep Compression of Finetuned Large Language Models”. [Paper]

  • [JSAC’21] Wanyi Ning, Haifeng Sun, Xiaoyuan Fu, Xiang Yang, Qi Qi, Jingyu Wang, Jianxin Liao, Zhu Han. “Following the Correct Direction: Renovating Sparsified SGD Towards Global Optimization in Distributed Edge Learning”. [Paper]