Serge Gratton

Papers by Serge Gratton

Second-order convergence properties of trust-region methods using incomplete curvature information, with an application to multigrid optimization

Convergence properties of trust-region methods for unconstrained nonconvex optimization are considered in the case where information on the objective function's local curvature is incomplete, in the sense that it may be restricted to a fixed set of "test directions" and may not be available at every iteration. It is shown that convergence to local "weak" minimizers can still be obtained under some additional but algorithmically realistic conditions. These theoretical results are then applied to recursive multigrid trust-region methods, suggesting a new class of algorithms with guaranteed second-order convergence properties.
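As a hedged gloss on the "weak" minimizer notion (the notation below is assumed for illustration, not taken from the paper): when curvature information is restricted to a fixed set of test directions, the second-order condition that can be certified is correspondingly weaker than the classical one.

\[
\nabla f(x^*) = 0
\qquad\text{and}\qquad
d^{\top} \nabla^2 f(x^*)\, d \;\ge\; 0
\quad \text{for all } d \in \mathcal{D},
\]

whereas the classical second-order necessary condition requires nonnegative curvature along every direction d in R^n, not only along the test set D.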

On the approximation of the solution of partial differential equations by artificial neural networks trained by a multilevel Levenberg-Marquardt method

arXiv, 2019

This paper is concerned with the approximation of the solution of partial differential equations by means of artificial neural networks. A feedforward neural network is used to approximate the solution of the partial differential equation. The learning problem is formulated as a least-squares problem with the residual of the partial differential equation as the loss function, and a multilevel Levenberg-Marquardt method is employed as the training method. This setting allows further insight into the potential of multilevel methods: when the least-squares problem arises from the training of artificial neural networks, the variables subject to optimization are not related by any geometrical constraints, so the standard interpolation and restriction operators can no longer be employed. A heuristic inspired by algebraic multigrid methods is therefore proposed to construct the multilevel transfer operators. Numerical experiments show encouraging results.
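The loss formulation described above is simple enough to sketch. Below is a minimal, hedged Python/JAX illustration under assumed choices (a 1D Poisson model problem, a one-hidden-layer network, fixed damping): it implements the PDE-residual least-squares loss and a single-level Levenberg-Marquardt update only; the multilevel method and the algebraic-multigrid-inspired transfer operators that are the paper's contribution are not reproduced here.

# Hedged sketch, not the paper's implementation: the residual of a 1D
# Poisson problem -u''(x) = f(x) on (0,1), u(0) = u(1) = 0, is used as a
# least-squares loss for a one-hidden-layer network, and the parameters
# are updated by a plain single-level Levenberg-Marquardt step. The
# problem, network size, and damping value are illustrative assumptions.
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def init_params(key, width=16):
    k1, k2 = jax.random.split(key)
    return {"W1": 0.5 * jax.random.normal(k1, (width, 1)),
            "b1": jnp.zeros((width,)),
            "W2": 0.5 * jax.random.normal(k2, (1, width)),
            "b2": jnp.zeros((1,))}

def u(params, x):
    # Ansatz x(1-x)*net(x) enforces the boundary conditions exactly.
    h = jnp.tanh(params["W1"] @ jnp.array([x]) + params["b1"])
    return x * (1.0 - x) * (params["W2"] @ h + params["b2"])[0]

def residual(params, x):
    # PDE residual; f is chosen so that the exact solution is sin(pi x).
    u_xx = jax.grad(jax.grad(u, argnums=1), argnums=1)(params, x)
    return -u_xx - jnp.pi**2 * jnp.sin(jnp.pi * x)

xs = jnp.linspace(0.05, 0.95, 32)            # collocation points
flat, unravel = ravel_pytree(init_params(jax.random.PRNGKey(0)))
res_vec = lambda v: jax.vmap(lambda x: residual(unravel(v), x))(xs)

lam = 1e-2                                   # LM damping parameter
for _ in range(50):
    r = res_vec(flat)                        # residual vector
    J = jax.jacfwd(res_vec)(flat)            # Jacobian, shape (32, n)
    # Damped normal equations: (J^T J + lam I) step = -J^T r
    step = jnp.linalg.solve(J.T @ J + lam * jnp.eye(flat.size), -J.T @ r)
    flat = flat + step
print("residual norm after training:", jnp.linalg.norm(res_vec(flat)))

Enforcing the boundary conditions through the ansatz rather than through penalty terms keeps the residual vector purely PDE-driven, which matches the abstract's choice of the equation residual as the loss.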

Algorithm 847

ACM Transactions on Mathematical Software, 2005

To recover or approximate smooth multivariate functions, sparse grids are superior to full grids due to a significant reduction of the required support nodes. The order of the convergence rate in the maximum norm is preserved up to a logarithmic factor. We describe three possible piecewise multilinear hierarchical interpolation schemes in detail and conduct a numerical comparison. Furthermore, we document the features of our sparse grid interpolation software package spinterp for MATLAB.
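For context, the cost/accuracy trade-off behind this claim can be stated as follows (standard estimates for functions with bounded mixed second derivatives on [0,1]^d, quoted from the general sparse grid literature rather than from the paper itself):

\[
\text{full grid: } N = O(2^{nd}), \qquad \|u - u_n\|_{\infty} = O(2^{-2n}),
\]
\[
\text{sparse grid: } N = O(2^{n}\, n^{d-1}), \qquad \|u - u_n^{\mathrm{s}}\|_{\infty} = O(2^{-2n}\, n^{d-1}),
\]

so the O(h^2) rate (with mesh width h = 2^{-n}) survives up to the logarithmic factor n^{d-1} = |log_2 h|^{d-1}, while the number of support nodes drops dramatically as the dimension d grows.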
