Academic Report: Differential Equations and Deep Learning

Subject: Differential Equations and Deep Learning

Speaker: Dr. Hailiang Liu (Iowa State University, USA)

Time: 16:00 on July 12th, 2019

Location: Room 217, Second Teaching Building

Introduction of the Speaker: 

  Dr. Hailiang Liu is a Professor of Mathematics at Iowa State University (ISU), where he held the Holl Chair in Applied Mathematics from 2002 to 2012. He received his Master's degree in Applied Mathematics from Tsinghua University, China, in 1988, and his Ph.D. from the Chinese Academy of Sciences in 1995. In 1996 he received an Alexander von Humboldt Research Fellowship, which supported his research in Germany from 1997 to 1999. He was a CAM Assistant Professor at UCLA from 1999 to 2002, then joined Iowa State University as an Associate Professor in 2002 and was promoted to Full Professor in 2007. His primary research interests include the analysis of applied partial differential equations; the development of novel, high-order algorithms for the approximate solution of these problems; and the interplay between the analytical theory and computational aspects of such algorithms, with applications to shock waves, kinetic transport, level set closure, propagation of critical thresholds, and recovery of high-frequency wave fields. He serves on the editorial board of the Journal of Mathematical Analysis and Applications (JMAA) and has given many invited lectures, including invited addresses at the International Conference on Hyperbolic Problems in 2002 and 2018. He has published more than 120 research papers.

Abstract:

  Deep learning is machine learning using neural networks with many hidden layers, and it has become a primary tool in a wide variety of practical learning tasks. In this talk we begin with a simple optimization problem and show how it can be reformulated as a gradient flow, which in turn leads to different optimization solvers. We further introduce the mathematical formulation of deep residual neural networks as a PDE optimal control problem. We state and prove optimality conditions for the inverse deep learning problem, using the Hamilton-Jacobi-Bellman equation and the Pontryagin maximum principle.
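
  As a rough illustration of the ideas summarized above (the notation here is ours and is not taken from the talk): minimizing a smooth objective $f(x)$ can be recast as the gradient flow

    \dot{x}(t) = -\nabla f\bigl(x(t)\bigr),

  whose forward Euler discretization with step size $h$ recovers gradient descent, $x_{k+1} = x_k - h\,\nabla f(x_k)$; other discretizations yield other optimization solvers. Similarly, a residual block $x_{k+1} = x_k + h\,F(x_k,\theta_k)$ can be read as a forward Euler step of the control system

    \dot{x}(t) = F\bigl(x(t), \theta(t)\bigr),

  so training the weights $\theta$ becomes an optimal control problem whose value function satisfies a Hamilton-Jacobi-Bellman equation and whose optimality conditions follow from the Pontryagin maximum principle.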
