Asymptotic Network Independence in Distributed Stochastic Gradient Methods


Title: Asymptotic Network Independence in Distributed Stochastic Gradient Methods

Speaker: Shi Pu (The Chinese University of Hong Kong, Shenzhen)

Time: December 27, 2022 (Tuesday), 10:00-11:00 AM

Venue: Tencent Meeting: 838-263-043

 

Abstract:

We provide a discussion of several recent results which, in certain scenarios, are able to overcome a barrier in distributed stochastic optimization for machine learning (ML). Our focus is the so-called asymptotic network independence property, which is achieved whenever a distributed method executed over a network of n nodes asymptotically converges to the optimal solution at a comparable rate to a centralized method with the same computational power as the entire network; it is as if the network is not even there! We explain this property through an example involving the training of ML models and present a short mathematical analysis for comparing the performance of distributed stochastic gradient descent (DSGD) with centralized SGD. We also discuss the transient times for distributed stochastic gradient methods to achieve network independent convergence rates. Finally, we introduce some recent works on distributed random reshuffling (RR) methods.
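To make the comparison in the abstract concrete, the following is a minimal sketch of distributed stochastic gradient descent (DSGD) over a ring of n nodes alongside centralized SGD with the same total computational power (n stochastic gradients per iteration). The toy least-squares data, step-size schedule, and lazy-Metropolis mixing matrix are illustrative assumptions, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, T = 8, 5, 20000            # nodes, dimension, iterations
A = rng.normal(size=(n, 10, d))  # node i holds local data (A[i], b[i])
x_star = rng.normal(size=d)
b = A @ x_star                   # noiseless targets; the optimum is x_star

# Doubly stochastic mixing matrix for a ring (lazy Metropolis weights)
W = np.eye(n) / 2
for i in range(n):
    W[i, (i + 1) % n] = 0.25
    W[i, (i - 1) % n] = 0.25

def grad(i, x):
    """Stochastic gradient at node i: one uniformly sampled local row."""
    j = rng.integers(10)
    return A[i, j] * (A[i, j] @ x - b[i, j])

alpha = 0.05
X = np.zeros((n, d))             # DSGD iterates, one row per node
y = np.zeros(d)                  # centralized SGD iterate
for k in range(T):
    step = alpha / (k + 1) ** 0.5          # diminishing step size
    G = np.stack([grad(i, X[i]) for i in range(n)])
    X = W @ X - step * G                   # DSGD: mix with neighbors, descend
    y -= step / n * sum(grad(i, y) for i in range(n))  # same total compute

err_dsgd = np.linalg.norm(X.mean(axis=0) - x_star)
err_sgd = np.linalg.norm(y - x_star)
print(err_dsgd, err_sgd)
```

Asymptotic network independence manifests here as the averaged DSGD iterate `X.mean(axis=0)` eventually tracking the centralized iterate `y` at a comparable error level, even though each node only ever communicates with its two ring neighbors; the transient phase before this happens is exactly the transient time the abstract mentions.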

Speaker Bio:

Dr. Shi Pu is an Assistant Professor at the School of Data Science, The Chinese University of Hong Kong, Shenzhen. Prior to that, he was a postdoctoral researcher at the University of Florida, Arizona State University, and Boston University. He received his B.S. in Engineering from Peking University in 2012 and his Ph.D. in Systems Engineering from the University of Virginia in 2016. His main research interests are distributed optimization and machine learning algorithms over multi-agent networks. In 2017 he received the Louis T. Rader Outstanding Graduate Award from the University of Virginia. As first or corresponding author, he has published more than ten papers in top journals in operations research, optimization, and control, including Mathematical Programming, IEEE Transactions on Automatic Control, SIAM Journal on Control and Optimization, and Operations Research; one of these papers was selected as an ESI Highly Cited Paper. He is currently the principal investigator of projects including a Young Scientists Fund project of the National Natural Science Foundation of China, a Guangdong Province Young Top Talent project, and a Shenzhen Outstanding Scientific and Technological Innovation Talent Training project. Since 2022, he has served on the Conference Editorial Board of the IEEE Control Systems Society.

Host: Zhiguo Wang