

Academic Talk, January 7, 2020 (Dr. Kai Liu, Assistant Professor, Clemson University, USA)
Posted January 3, 2020







Dr. Kai Liu received his Ph.D. from the Colorado School of Mines, USA, in 2019, and is now a tenure-track Assistant Professor in the Computer Science Division at Clemson University, South Carolina. His research interests lie in machine learning and its applications in artificial intelligence, computer vision, natural language processing, speech recognition, data mining, and bioinformatics, with provable theoretical guarantees. He has published more than 10 papers in prestigious conferences such as CVPR, AAAI, ACL, NeurIPS, IJCAI, SDM, and RECOMB.


Optimization is one of the most important research areas in machine learning. Many problems, such as clustering, dictionary learning, principal component analysis, data recovery, and compressed sensing, can be cast as optimization problems in which matrix factorization and analysis play an important role. PALM (Proximal Alternating Linearized Minimization) was proposed recently and has been shown to be an efficient algorithm for solving multivariable optimization problems, with nice mathematical guarantees. In this talk, I will show the connection between PALM and the classical gradient descent method, and then discuss its variations, applications, and several open problems.
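To make the idea concrete, here is a minimal sketch of a PALM-style iteration on one of the problems mentioned above, nonnegative matrix factorization: minimize 0.5·||X − UV||_F² subject to U, V ≥ 0. Each block takes a linearized (gradient) step with step size 1/L, where L is the block-wise Lipschitz constant of the gradient, followed by the proximal map of the nonnegativity constraint (projection onto the nonnegative orthant). This is an illustrative assumption-laden sketch, not the speaker's code; the function name `palm_nmf` and all parameters are made up for this example.

```python
import numpy as np

def palm_nmf(X, rank, iters=200, seed=0):
    """Sketch of PALM for NMF: minimize 0.5 * ||X - U V||_F^2  s.t.  U, V >= 0.

    Alternates proximal-gradient steps on U and V. The prox of the
    nonnegativity indicator is elementwise projection: max(., 0).
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, rank))
    V = rng.random((rank, n))
    for _ in range(iters):
        # U-step: grad wrt U is (U V - X) V^T; Lipschitz constant is ||V V^T||_2
        L_u = np.linalg.norm(V @ V.T, 2) + 1e-12
        U = np.maximum(U - ((U @ V - X) @ V.T) / L_u, 0.0)
        # V-step: grad wrt V is U^T (U V - X); Lipschitz constant is ||U^T U||_2
        L_v = np.linalg.norm(U.T @ U, 2) + 1e-12
        V = np.maximum(V - (U.T @ (U @ V - X)) / L_v, 0.0)
    return U, V
```

With only the smooth term and no constraints, each step reduces to a plain block gradient step, which is the connection to classical gradient descent that the talk examines.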

Invited by: Associate Professor Feng Wang and Lecturer Hua Jiang

Copyright © 2008-2020 School of Computer Science, Wuhan University. All Rights Reserved.