Topic: Distributed Learning for Sketched Kernel Regression
Speaker: Heng Lian (City University of Hong Kong)
Host: Guochang Wang
Time: Thursday, December 30, 2021, 10:00–11:00
Platform: Tencent Meeting (Meeting ID: 208-993-047)
Abstract
We study distributed learning for regularized least squares regression in a reproducing kernel Hilbert space (RKHS). The divide-and-conquer strategy is a frequently used approach for handling very large datasets: it computes an estimator on each subset and then averages these local estimators. The existing theoretical constraint on the number of subsets implies that each subset can still be large, so random sketching can be used to produce the local estimator on each subset and further reduce the computation compared with vanilla divide-and-conquer. In this setting, sketching and divide-and-conquer complement each other in dealing with the large sample size. We show that optimal learning rates can be retained. Simulations are performed to compare the sketched and non-sketched divide-and-conquer methods.
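To make the strategy concrete, the following is a minimal numerical sketch (not the speaker's implementation) of divide-and-conquer kernel ridge regression with random sketching on each subset. The Gaussian RBF kernel, the Gaussian sketching matrix, and names such as sketched_krr_fit and dc_sketched_krr are illustrative assumptions, not details taken from the talk.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of A and the rows of B.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def sketched_krr_fit(X, y, lam, m, gamma=1.0, rng=None):
    # Sketched kernel ridge regression on one subset: restrict the estimator to
    # the span of K S^T for an m x n Gaussian sketch S and solve
    #   min_a (1/n) ||y - K S^T a||^2 + lam * a^T S K S^T a.
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sketching matrix
    SK = S @ K                                     # m x n
    A = SK @ SK.T / n + lam * SK @ S.T             # m x m linear system
    alpha = np.linalg.solve(A, SK @ y / n)
    beta = S.T @ alpha                             # kernel-expansion coefficients
    return (X, beta, gamma)

def krr_predict(model, Xnew):
    X, beta, gamma = model
    return rbf_kernel(Xnew, X, gamma) @ beta

def dc_sketched_krr(X, y, n_subsets, lam, m, gamma=1.0, seed=0):
    # Divide-and-conquer: fit a sketched local estimator on each random subset,
    # then return the average of the local predictions.
    rng = np.random.default_rng(seed)
    parts = np.array_split(rng.permutation(len(y)), n_subsets)
    models = [sketched_krr_fit(X[p], y[p], lam, m, gamma, rng) for p in parts]
    return lambda Xnew: np.mean([krr_predict(mod, Xnew) for mod in models], axis=0)

# Toy usage: 2000 points split into 10 subsets, sketch dimension m = 50.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(2000, 1))
y = np.sin(3.0 * X[:, 0]) + 0.3 * rng.standard_normal(2000)
fhat = dc_sketched_krr(X, y, n_subsets=10, lam=1e-3, m=50)
print(fhat(np.array([[0.0], [0.5]])))

In this sketch each subset solves only an m x m linear system instead of a full local kernel system, which is where sketching reduces the per-subset cost; the final averaging step is the standard divide-and-conquer aggregation described in the abstract.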
About the Speaker
Heng Lian is an Associate Professor in the Department of Mathematics at City University of Hong Kong. He received his B.S. degrees in Mathematics and Computer Science from the University of Science and Technology of China in 2000, and an M.S. in Computer Science, an M.A. in Economics, and a Ph.D. in Applied Mathematics from Brown University in 2007. He has worked at Nanyang Technological University in Singapore, the University of New South Wales in Australia, and City University of Hong Kong. His research interests include high-dimensional data analysis, functional data analysis, and machine learning. He has published more than 30 papers in international journals such as the Journal of the Royal Statistical Society, Series B and the Journal of the American Statistical Association.