LorMart/Unconstrained_Optimization_Methods

This repository contains the implementation of different iterative optimization algorithms and a testing environment to assess their performance on three functions: the Chained Rosenbrock function, the Ascher–Russell boundary value problem, and the Broyden tridiagonal function.


Unconstrained Optimization Methods

This repository contains the implementation of various iterative optimization algorithms and a testing environment to assess their performance on three benchmark functions.


Methods Implemented

The following optimization methods are implemented:

  • Nonlinear Conjugate Gradient Methods (NLCGM):

    • Hestenes–Stiefel (HS)
    • Restarted Polak–Ribière (PR)
    • Fletcher–Reeves (FR)
  • Newton's Method
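The three conjugate gradient variants differ only in how the conjugate-direction coefficient β is updated. The following is a minimal Python sketch (the repository's own code is MATLAB; all names here are illustrative, and the simple Armijo backtracking stands in for the stronger Wolfe-condition line searches a careful implementation would use):

```python
import numpy as np

def nlcg(f, grad, x0, beta_rule="FR", max_iter=500, tol=1e-8):
    """Nonlinear CG with Hestenes-Stiefel, restarted Polak-Ribiere,
    or Fletcher-Reeves beta updates (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple Armijo backtracking line search along d
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        if beta_rule == "HS":
            beta = (g_new @ y) / (d @ y)            # Hestenes-Stiefel
        elif beta_rule == "PR":
            beta = max(0.0, (g_new @ y) / (g @ g))  # Polak-Ribiere, restarted
        else:
            beta = (g_new @ g_new) / (g @ g)        # Fletcher-Reeves
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

Clipping β at zero in the PR rule resets the search direction to steepest descent whenever β would be negative, which is presumably what "Restarted" refers to in the method list above.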

Benchmarks Available

The performance of the implemented algorithms is evaluated on the following benchmark problems:

  • Broyden Tridiagonal Function

  • Troesch Problem (the Ascher–Russell boundary value problem)

  • Chained Rosenbrock Function
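For concreteness, here are common statements of two of these benchmarks, sketched in Python (the repository implements them in MATLAB, and its exact formulations may differ):

```python
import numpy as np

def chained_rosenbrock(x):
    """Chained Rosenbrock: sum over i of 100*(x_i^2 - x_{i+1})^2 + (x_i - 1)^2."""
    return float(np.sum(100.0 * (x[:-1] ** 2 - x[1:]) ** 2 + (x[:-1] - 1.0) ** 2))

def broyden_tridiagonal(x):
    """Broyden tridiagonal function: sum of squared residuals
    f_i = (3 - 2*x_i)*x_i - x_{i-1} - 2*x_{i+1} + 1, with x_0 = x_{n+1} = 0."""
    xp = np.concatenate(([0.0], np.asarray(x, dtype=float), [0.0]))  # pad boundaries
    r = (3.0 - 2.0 * xp[1:-1]) * xp[1:-1] - xp[:-2] - 2.0 * xp[2:] + 1.0
    return float(np.sum(r ** 2))
```

The chained Rosenbrock function attains its minimum value of 0 at the all-ones vector, which makes it a convenient fixed target when assessing reliability.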

Gradient and Hessian Computation

The repository includes scripts for symbolic computation of gradients and Hessians to streamline the optimization process. These scripts rely on the syms interface of MATLAB's Symbolic Math Toolbox:

  • Gradient Computation:

    • The script "Generate [problem's name] Gradient.m" computes the generic form of the Gradient's entries and saves it as a MATLAB function.
    • This function is then applied entry by entry in "Eval [problem's name] Gradient.m" to assemble the entire Gradient efficiently.
  • Hessian Computation:

    • The Hessians are computed similarly. Since all three benchmark problems have banded Hessians, only a few symbolic derivatives (at most 10 per problem) are needed to derive general expressions for each diagonal of the Hessian matrix.
    • Specific expressions are derived and applied to the boundary entries of the Gradient and Hessian based on the problem’s definition.

By using symbolic computation only for a few entries and deriving generic functions for repeated use, the repository allows for efficient computation of both the Gradient and Hessian, speeding up the optimization process.
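The same pattern can be sketched in Python with SymPy (the repository uses MATLAB's syms; the symbol and function names below are illustrative): differentiate one generic interior term symbolically, turn the result into a fast numerical function, apply it across the whole vector, and handle the two boundary entries with their own expressions.

```python
import numpy as np
import sympy as sp

# Symbols standing for x_{i-1}, x_i, x_{i+1} in a generic interior position
# of the chained Rosenbrock sum.
xl, xc, xr = sp.symbols('x_left x_center x_right')
t_left  = 100 * (xl**2 - xc)**2 + (xl - 1)**2   # term where x_i is the later variable
t_right = 100 * (xc**2 - xr)**2 + (xc - 1)**2   # term where x_i is the earlier variable

# Differentiate once symbolically, then lambdify into reusable numerical functions.
g_interior = sp.lambdify((xl, xc, xr), sp.diff(t_left + t_right, xc), 'numpy')
g_first    = sp.lambdify((xc, xr), sp.diff(t_right, xc), 'numpy')  # boundary i = 1
g_last     = sp.lambdify((xl, xc), sp.diff(t_left, xc), 'numpy')   # boundary i = n

def chained_rosenbrock_grad(x):
    """Assemble the full gradient from the generic symbolic expressions."""
    g = np.empty_like(x)
    g[0] = g_first(x[0], x[1])
    g[1:-1] = g_interior(x[:-2], x[1:-1], x[2:])  # vectorized interior entries
    g[-1] = g_last(x[-2], x[-1])
    return g
```

Only three symbolic differentiations are performed regardless of the problem dimension, which is the efficiency gain the paragraph above describes.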


Performance Indicators

The performance of the algorithms can be assessed through the following indicators:

  • Efficiency: Quantified by both computational time and iterations to convergence.
  • Reliability: Measured using the Success Rate, which assesses the algorithm's capacity to perform well in different situations.
  • Quality of the Solution: Evaluated according to whether a known minimum point exists. If so, the Fixed-Target Method is used; otherwise, the Fixed-Cost Approach is applied, using metrics such as the norm of the gradient and the function value at the final iterate.

The repository computes all of these metrics, so that both the computational effort required and the quality of the results obtained can be evaluated.
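A possible shape for such an aggregation step, sketched in Python with illustrative field names (not the repository's actual data format):

```python
import numpy as np

def summarize_runs(runs, f_target=None, tol=1e-6):
    """Aggregate performance indicators over several optimization runs.

    Each run is a dict with keys 'time', 'iters', 'f_final', 'grad_norm'
    (hypothetical field names chosen for this sketch).
    """
    if f_target is not None:
        # Fixed-target: success means reaching the known minimum value within tol.
        successes = [r for r in runs if abs(r['f_final'] - f_target) < tol]
    else:
        # Fixed-cost: judge solution quality by the final gradient norm instead.
        successes = [r for r in runs if r['grad_norm'] < tol]
    return {
        'success_rate': len(successes) / len(runs),
        'mean_time': np.mean([r['time'] for r in successes]) if successes else np.nan,
        'mean_iters': np.mean([r['iters'] for r in successes]) if successes else np.nan,
    }
```

Averaging time and iterations over successful runs only keeps the efficiency and reliability indicators from contaminating each other.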

Starting Points

The repository provides functionality to generate random starting points for a fair comparison of the algorithms. These starting points are produced by the script "Generate [problem's name] Random start pts.m", which samples multiple Gaussian distributions with a fixed mean and standard deviation.
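In Python terms, such a generator might look like the following (a hedged sketch; the repository's MATLAB script may parameterize the distributions differently):

```python
import numpy as np

def generate_start_points(n_dim, n_points, mean=0.0, std=1.0, seed=0):
    """Draw starting points from a Gaussian with fixed mean and standard
    deviation, so every method is compared from the same initial iterates."""
    rng = np.random.default_rng(seed)  # fixed seed makes comparisons reproducible
    return rng.normal(loc=mean, scale=std, size=(n_points, n_dim))
```

Fixing the seed is what makes the comparison fair: every method starts from exactly the same set of points.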

Interface and Tools

The interface includes a plotting function to visualize the performance and progress of the optimization methods. Additionally, a Python script is available for processing the data obtained from the optimization runs, enabling further analysis and evaluation of results.

