Optimization is one of the most important research areas in machine learning. Many problems, such as clustering, dictionary learning, principal component analysis, data recovery, and compressed sensing, can be cast as optimization problems in which matrix factorization and analysis play an important role. PALM (Proximal Alternating Linearized Minimization) was proposed recently and has been shown to be an efficient algorithm for solving multivariable optimization problems, with strong mathematical guarantees. In this talk, I will show the connection between PALM and the classical gradient descent method, and then discuss its variations, applications, and several open problems.
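For orientation, a minimal sketch of the PALM iteration, following the standard formulation for minimizing $\Psi(x,y) = f(x) + g(y) + H(x,y)$; the step-size notation $c_k, d_k$ (tied to the partial Lipschitz moduli of $\nabla H$) is an assumption here, not taken from the talk:

% Sketch of one PALM sweep: proximal-gradient step on each block in turn.
\[
x^{k+1} \in \operatorname{prox}^{f}_{c_k}\!\Bigl(x^{k} - \tfrac{1}{c_k}\,\nabla_x H(x^{k}, y^{k})\Bigr),
\qquad
y^{k+1} \in \operatorname{prox}^{g}_{d_k}\!\Bigl(y^{k} - \tfrac{1}{d_k}\,\nabla_y H(x^{k+1}, y^{k})\Bigr).
\]

When $f$ and $g$ vanish, each proximal map is the identity and the scheme reduces to alternating gradient descent on $H$, which illustrates the connection to the classical method mentioned in the abstract.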
Host: Associate Professor 王峰, Lecturer 蔣華