Publication

DHeLlam: General-Purpose, Automatic Micro-Batch Co-Execution for Distributed LLM Training

DHeLlam targets communication bottlenecks in distributed LLM training and introduces automatic micro-batch co-execution to improve overall training efficiency.

ICCD 2025 / Best Publication Award
LLM training, distributed systems, micro-batch co-execution

Authors

Haiquan Wang, Chaoyi Ruan, Jia He, Jiaqi Ruan, Chengjie Tang, Xiaosong Ma, Cheng Li
