Seminars at the Faculty of Informatics

Hessian Matrix-free Lagrange-Newton-Krylov methods for PDE-Constrained Optimization

Speaker:

Widodo Samyono

Jarvis Christian College, Hawkins, TX, USA

Date:

Thursday, December 15, 2016

Place:

USI Lugano Campus, room SI-008, Informatics building (Via G. Buffi 13)

Time:

15:30

Abstract:

Solving the Karush-Kuhn-Tucker (KKT) system in PDE-constrained optimization requires handling very large data sets. To overcome the storage limitation, we use Hessian matrix-free Lagrange-Newton-Krylov methods. The preconditioners have an inner-outer structure, taking the form of a Schur complement (block factorization) at the outer level and Schwarz projections at the inner level. However, building an exact Schur complement is prohibitively expensive, so we use Schur complement approximations, including the identity, probing, the Laplacian, the J operator, and a BFGS operator. For exact data the exact Schur complements are superior to the inexact approximations; for noisy data, however, the inexact methods are competitive with or even better than the exact ones in every computational aspect. We also find that nonsymmetric forms of the KKT matrices and preconditioners are competitive with or better than the symmetric forms commonly used in the optimization community.

In this study, we focus on solving an inverse problem for parameter identification. Iterative Tikhonov and Total Variation regularizations are proposed and compared to the standard regularizations and to each other. For exact data with jump discontinuities, the standard and iterative Total Variation regularizations are superior to the standard and iterative Tikhonov regularizations. For noisy data, however, the proposed iterative Tikhonov regularizations are superior to the standard and iterative Total Variation methods. We also show that in some cases the iterative regularizations outperform the non-iterative ones. To demonstrate the performance of the algorithm, including the effectiveness of the preconditioners and regularizations, synthetic one- and two-dimensional elliptic inverse problems are solved and compared with other methodologies available in the literature.
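To give a flavor of the matrix-free idea behind these methods, here is a minimal sketch (my own toy example, not the speaker's code): one Newton step for a model quadratic objective in which the Krylov solver sees the Hessian only through Hessian-vector products, approximated by finite differences of the gradient, so the Hessian is never stored.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

# Toy sketch (assumed model problem, not the talk's): one Newton step for
# f(x) = 0.5 x^T A x - b^T x without ever forming the Hessian explicitly.

n = 50
# SPD model Hessian: 1-D discrete Laplacian
A = (np.diag(2.0 * np.ones(n))
     + np.diag(-np.ones(n - 1), 1)
     + np.diag(-np.ones(n - 1), -1))
b = np.ones(n)

def grad(x):
    # Gradient of the quadratic model objective
    return A @ x - b

def hessvec(v, x, eps=1e-6):
    # Matrix-free Hessian-vector product:
    # H v ~= (grad(x + eps*v) - grad(x)) / eps  -- no Hessian storage needed
    return (grad(x + eps * v) - grad(x)) / eps

x0 = np.zeros(n)
H = LinearOperator((n, n), matvec=lambda v: hessvec(v, x0))

# Krylov solve of the Newton system H dx = -g(x0); CG only calls matvec
dx, info = cg(H, -grad(x0))
x1 = x0 + dx
print(info, np.linalg.norm(grad(x1)))  # info == 0 on convergence
```

In a real PDE-constrained setting the matvec would come from adjoint solves (or automatic differentiation) rather than a stored matrix, and a preconditioner such as the Schur complement approximations above would be supplied to the Krylov method.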
The proposed algorithm performs well with regard to robustness, reconstructs the parameter models effectively, and is easily implemented within the framework of the parallel PDE software PETSc and the automatic differentiation software ADIC. The algorithm also extends to three-dimensional problems. In addition, we will discuss our future research directions.

Biography:

Prof. Samyono has been an Assistant Professor in the Mathematics Department at Jarvis Christian College, Hawkins, TX, since 2012. He did his postdoctoral research in the Applied Mathematics Division, Department of Applied Physics and Applied Mathematics, The Fu Foundation School of Engineering and Applied Science, Columbia University.

His main research interests are PDE-constrained optimization, applied mathematics for physiology and medicine, and mathematics in synthetic biological systems.

Host:

Prof. Rolf Krause