News

[Data Science Seminar Series] Time: 2024.04.24 (Wed.) 15:30-17:20 @ EC115; Speaker: Associate Professor Hung-Hsuan Chen, Department of Computer Science and Information Engineering, National Central University; Title: Enabling Model Parallelism for Neural Network Training Based on Decoupled Local Losses


Please find below the information for the Data Science Seminar on 2024.04.24 (Wednesday). All are welcome to attend!

Time: 2024.04.24 (Wed.) 15:30-17:20

Venue: Room EC115, Engineering Building 3

Speaker/Affiliation: Associate Professor Hung-Hsuan Chen, Department of Computer Science and Information Engineering, National Central University

-----------------------------------------------------------------------------------------------------

Talk Title
Enabling Model Parallelism for Neural Network Training Based on Decoupled Local Losses

Abstract
Backpropagation (BP) is foundational in deep learning. However, its inefficiency is partially caused by backward locking, which prevents gradients in different layers from being computed simultaneously. In this talk, I will introduce our recent research on computing parameter gradients in different layers simultaneously through pipelining. This approach improves training efficiency while preserving test accuracies comparable to those of BP-trained models.
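To make the idea of decoupled local losses concrete, below is a minimal, hypothetical PyTorch sketch of local-loss training: each block optimizes its own auxiliary loss and detaches its output before passing it on, so no block's gradient computation depends on a backward pass through later layers, which is what makes pipelined or parallel gradient computation possible. The block structure, dimensions, auxiliary heads, and optimizer choices here are illustrative assumptions only and do not reflect the speaker's actual method or pipelining scheme.

```python
# Illustrative sketch of training with decoupled local losses (not the
# speaker's implementation). Each block has its own auxiliary classifier and
# loss; detaching the block output removes the backward dependency between
# blocks, so their gradients could in principle be computed in a pipeline.
import torch
import torch.nn as nn

class LocalBlock(nn.Module):
    # Hypothetical block: a linear layer plus a local auxiliary classifier.
    def __init__(self, in_dim, out_dim, num_classes):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())
        self.aux_head = nn.Linear(out_dim, num_classes)

    def forward(self, x):
        return self.body(x)

    def local_loss(self, h, y):
        return nn.functional.cross_entropy(self.aux_head(h), y)

# Assumed toy dimensions for illustration only.
blocks = nn.ModuleList([
    LocalBlock(784, 256, 10),
    LocalBlock(256, 128, 10),
    LocalBlock(128, 64, 10),
])
optimizers = [torch.optim.SGD(b.parameters(), lr=0.1) for b in blocks]

def train_step(x, y):
    h = x
    for block, opt in zip(blocks, optimizers):
        h = block(h)
        loss = block.local_loss(h, y)  # loss defined locally for this block
        opt.zero_grad()
        loss.backward()                # gradient stays within this block
        opt.step()
        h = h.detach()                 # cut the graph: next block is decoupled
    return loss.item()

# Toy usage with random data.
x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))
print(train_step(x, y))
```

In this sequential sketch the blocks are still updated one after another; the point is that, once the inter-block graph is cut, each block's update needs only its own local loss, so the per-block gradient computations can be overlapped across mini-batches in a pipeline.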

Speaker Bio
Dr. Hung-Hsuan Chen is an Associate Professor in the Department of Computer Science and Information Engineering at National Central University. He is interested in data-related research topics such as machine learning, deep learning, information retrieval, text analysis, and graph analysis. He is also interested in applying these techniques to various application domains, such as recommender systems, digital libraries, and social networks.