Low-Rank Exploiting Optimization Methods for Inverse Problems and Machine Learning

Kan, Kai Fung (Spring 2022)

Permanent URL: https://etd.library.emory.edu/concern/etds/qv33rx97n
Published

Abstract

Due to rapid technological development, datasets of enormous size have emerged in various domains, including inverse problems and machine learning. Many important applications in these domains, e.g., PDE parameter estimation, data classification, and regression, are formulated as optimization problems. These problems are often large-scale and can be computationally intractable to solve directly. Fortunately, it has been empirically observed that large datasets can often be accurately approximated by low-rank representations; that is, they can be expressed through a relatively compact factorization that is far cheaper to compute and store. An effective way to circumvent the computational obstacle is therefore to exploit this low-rank structure. In addition, low-rank approximation can serve as a regularization technique that filters out irrelevant features (e.g., noise) from the data, since it captures essential features while discarding less pertinent ones.
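
As a minimal illustration of this compact-representation idea, the sketch below forms the best rank-k approximation of a matrix via the truncated SVD (the Eckart-Young theorem). The matrix, noise level, and rank are synthetic choices for demonstration, not data or parameters from the dissertation.

```python
import numpy as np

# Synthetic data matrix: a rank-r signal plus small noise (illustrative only).
rng = np.random.default_rng(0)
m, n, r = 500, 300, 10
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
A += 1e-3 * rng.standard_normal((m, n))

# The truncated SVD gives the best rank-k approximation in the Frobenius norm
# (Eckart-Young); storing the factors costs k*(m + n + 1) numbers vs. m*n.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 10
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]

rel_err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(f"relative error of the rank-{k} approximation: {rel_err:.2e}")
```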

This dissertation presents three applications of low-rank exploiting optimization methods for inverse problems and machine learning. The first is a projected Newton-Krylov method that efficiently exploits a low-rank approximation of the Hessian matrix to compute the projection in bound-constrained optimization problems. The second is a modified Newton-Krylov method geared toward log-sum-exp minimization; it scales to large problems by likewise using a low-rank approximation of the Hessian. In the third application, we apply hybrid regularization, which synergistically combines iterative low-rank approximation schemes with Tikhonov regularization, to effectively and automatically avoid the double descent phenomenon in machine learning.
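
To give a flavor of the Tikhonov component in the third application: Tikhonov regularization damps the contribution of small singular values, which is where noise typically dominates. The sketch below shows the direct, SVD-based form of this filtering on a small synthetic problem; the function name tikhonov_svd, the parameter lam, and the data are illustrative assumptions, and the dissertation's hybrid approach instead applies such regularization within iterative low-rank approximation schemes.

```python
import numpy as np

def tikhonov_svd(A, b, lam):
    """Solve min_x ||A x - b||^2 + lam^2 ||x||^2 via the SVD.

    The filter factors s_i^2 / (s_i^2 + lam^2) damp components with
    small singular values, where noise typically dominates.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    filt = s / (s**2 + lam**2)       # filtered inverse singular values
    return Vt.T @ (filt * (U.T @ b))

# Hypothetical noisy linear model (illustrative data, not from the thesis).
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 100))
x_true = rng.standard_normal(100)
b = A @ x_true + 0.05 * rng.standard_normal(200)

x_reg = tikhonov_svd(A, b, lam=0.1)
err = np.linalg.norm(x_reg - x_true) / np.linalg.norm(x_true)
print(f"relative solution error: {err:.2e}")
```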

Table of Contents

Chapter 1: Introduction

Chapter 2: Preliminaries

Chapter 3: A Projected Newton-Krylov Method for Large-Scale Bound-Constrained Optimization

Chapter 4: A Modified Newton-Krylov Method for Log-Sum-Exp Minimization 

Chapter 5: Avoiding the Double Descent Phenomenon Using Hybrid Regularization

Chapter 6: Conclusion and Future Work

About this Dissertation

Rights statement
  • Permission granted by the author to include this thesis or dissertation in this repository. All rights reserved by the author. Please contact the author for information regarding the reproduction and use of this thesis or dissertation.
Language
  • English