
School of Science Young Scholars Academic Forum, No. 197: High-order multilevel optimization strategies and their application to the training of ANNs

Published: 2019-03-28

Speaker: Prof. Serge Gratton

Time: 09:00–10:00, April 9, 2019

Venue: Conference Room 1214, Main Building

Host: Sun Cong (孫聰)

Title: High-order multilevel optimization strategies and their application to the training of ANNs

Abstract:

Standard iterative optimization methods approximate the objective function by a model given by a truncated Taylor series; classically, models of order two are used. Recently, a unifying framework has been proposed in the literature that extends the existing theory to models of higher order. The use of such models, however, comes with higher costs. We propose a multilevel extension of these methods to reduce the dominant cost per iteration, namely the model minimization. The proposed methods rely on the knowledge of a sequence of approximations to the original objective function, defined on spaces of reduced dimension and cheaper to optimize. We also investigate the application of such techniques to problems in which the variables are not related by geometrical constraints. As an important representative of this class, we choose the training of artificial neural networks.
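
To make the multilevel idea concrete, here is a minimal Python sketch of one two-level step in the spirit of MG/OPT-type first-order methods. It is an illustration, not the speaker's algorithm: the restriction matrix R, the Galerkin coarse surrogate, and the plain gradient-descent inner solver are assumptions made for the example, and a practical method would guard the step with a line search or trust region.

import numpy as np

def multilevel_step(grad_f, grad_f_H, R, x, inner_steps=20, lr=0.1):
    """One two-level correction step with first-order coherence.

    grad_f   : gradient of the fine-level objective on R^n.
    grad_f_H : gradient of the coarse surrogate on R^m, m < n.
    R        : (m, n) restriction matrix; the prolongation is taken as R.T.
    """
    x_H0 = R @ x                          # restrict the current iterate
    # Coherence correction: the corrected coarse model psi satisfies
    # grad psi(x_H0) = R @ grad_f(x), so it "sees" the fine-level gradient.
    v = R @ grad_f(x) - grad_f_H(x_H0)
    x_H = x_H0.copy()
    for _ in range(inner_steps):          # cheap approximate minimization of
        x_H = x_H - lr * (grad_f_H(x_H) + v)  # psi by plain gradient descent
    return x + R.T @ (x_H - x_H0)         # prolongate the coarse correction

# Tiny demonstration on a strictly convex quadratic with a Galerkin coarse
# surrogate (A_H = R A R^T, b_H = R b); all data here is synthetic.
rng = np.random.default_rng(0)
n, m = 10, 4
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)               # SPD fine-level Hessian
b = rng.standard_normal(n)
R = np.eye(m, n)                          # keep the first m coordinates
A_H, b_H = R @ A @ R.T, R @ b

grad_f = lambda x: A @ x - b
grad_f_H = lambda y: A_H @ y - b_H
f = lambda x: 0.5 * x @ A @ x - b @ x

x = np.zeros(n)
lr = 1.0 / np.linalg.norm(A_H, 2)         # stable step size on the coarse level
x_new = multilevel_step(grad_f, grad_f_H, R, x, lr=lr)
print(f(x), f(x_new))                     # the coarse correction lowers f

The correction v is what makes the coarse model useful: it forces the corrected surrogate to reproduce the fine-level gradient at the restricted iterate, so the prolongated coarse step is a descent direction computed at the cost of a low-dimensional minimization. Higher-order variants, as discussed in the talk, would replace the inner model by one of order greater than two.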

About the speaker:

Prof. Serge Gratton received his PhD in Applied Mathematics in 1998 from the University of Toulouse, France. He is now a full professor (classe exceptionnelle) and head of the Parallel Algorithms and Optimization (APO) team at INPT/ENSEEIHT, France. He is an associate editor of the SIAM Journal on Optimization and of Optimization Methods and Software. His research interests include theory and algorithms for constrained and unconstrained non-convex optimization, data assimilation and filtering, numerical linear algebra, and high-performance computing. He has also developed several software packages, including CG and GMRES implementations.


