Short Bio
Anke Tang is currently pursuing his Ph.D. degree at the School of Computer Science, Wuhan University, under the supervision of Prof. Yong Luo. He received his Bachelor's degree from the School of Physics and Technology, Wuhan University, in 2020. His research interests include machine learning, transfer learning, and multi-task learning. He has published papers at leading conferences such as IJCAI, ICLR, and ICML. He has also served as a reviewer for several top-tier conferences and journals.

Publications
My Google Scholar page: Anke Tang – Google Scholar
Selected Conference Papers
- Li Shen*, Anke Tang*, Yong Luo, Tao Sun, Han Hu, Xiaochun Cao. Targeted Low-rank Refinement: Enhancing Sparse Language Model with Precision. ICML, 2025.
- Yongxian Wei, Anke Tang, Li Shen, Zixuan Hu, Chun Yuan, Xiaochun Cao. Modeling Multi-Task Model Merging as Adaptive Projective Gradient Descent. ICML, 2025.
- Jinluan Yang, Anke Tang, Didi Zhu, Zhengyu Chen, Li Shen, Fei Wu. Mitigating the Backdoor Effect for Multi-Task Model Merging via Safety-Aware Subspace. ICLR, 2025.
- Anke Tang, Li Shen, Yong Luo, Nan Yin, Lefei Zhang, Dacheng Tao. Merging Multi-Task Models via Weight-Ensembling Mixture of Experts. ICML, 2024.
- Anke Tang, Li Shen, Yong Luo, Yibing Zhan, Han Hu, Bo Du, Yixin Chen, Dacheng Tao. Parameter-Efficient Multi-Task Model Fusion with Partial Linearization. ICLR, 2024.
- Anke Tang, Yong Luo, Han Hu, Fengxiang He, Kehua Su, Bo Du, Yixin Chen, Dacheng Tao. Improving Heterogeneous Model Reuse by Density Estimation. IJCAI, 2023.
Selected Journal Papers
- Anke Tang, Li Shen, Yong Luo, Shiwei Liu, Han Hu, Bo Du, Dacheng Tao. Data-Adaptive Weight-Ensembling for Multi-Task Model Fusion. International Journal of Computer Vision, 2025.
- Hongling Zheng, Li Shen, Anke Tang, Yong Luo, Han Hu, Bo Du, Yonggang Wen, Dacheng Tao. Learning from Models Beyond Fine-Tuning. Nature Machine Intelligence, 2025.
Survey, Benchmark, Perspective, and System Papers
- Anke Tang, Li Shen, Yong Luo, Han Hu, Bo Du, Dacheng Tao. FusionBench: A Comprehensive Benchmark of Deep Model Fusion. 2024.