
Update version and documentation
blankjul committed Nov 19, 2023
1 parent e43a069 commit cef2a9a
Showing 116 changed files with 2,061 additions and 4,726 deletions.
11 changes: 11 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,11 @@
+repos:
+  - repo: https://github.com/srstevenson/nb-clean
+    rev: 3.1.0
+    hooks:
+      - id: nb-clean
+        args:
+          - --remove-empty-cells
+          - --preserve-cell-metadata
+          - tags
+          - format
+          - --
130 changes: 27 additions & 103 deletions docs/source/algorithms/hyperparameters.ipynb
@@ -2,35 +2,21 @@
  "cells": [
   {
    "cell_type": "raw",
-   "metadata": {
-    "pycharm": {
-     "name": "#%% raw\n"
-    },
-    "raw_mimetype": "text/restructuredtext"
-   },
+   "metadata": {},
    "source": [
     ".. _nb_algorithms_hyperparameters:"
    ]
   },
   {
    "cell_type": "markdown",
-   "metadata": {
-    "pycharm": {
-     "name": "#%% md\n"
-    }
-   },
+   "metadata": {},
    "source": [
     "# Hyperparameters"
    ]
   },
   {
    "cell_type": "raw",
-   "metadata": {
-    "pycharm": {
-     "name": "#%% raw\n"
-    },
-    "raw_mimetype": "text/restructuredtext"
-   },
+   "metadata": {},
    "source": [
     ".. admonition:: Info\n",
     " :class: myOwnStyle\n",
@@ -40,29 +26,15 @@
   },
   {
    "cell_type": "markdown",
-   "metadata": {
-    "pycharm": {
-     "name": "#%% md\n"
-    }
-   },
+   "metadata": {},
    "source": [
     "Most algorithms have **hyperparameters**. For some optimization methods the parameters are already defined and can directly be optimized. For instance, for Differential Evolution (DE) the parameters can be found by:"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
-   "metadata": {
-    "execution": {
-     "iopub.execute_input": "2022-08-01T02:36:48.308627Z",
-     "iopub.status.busy": "2022-08-01T02:36:48.308160Z",
-     "iopub.status.idle": "2022-08-01T02:36:48.364512Z",
-     "shell.execute_reply": "2022-08-01T02:36:48.363614Z"
-    },
-    "pycharm": {
-     "name": "#%%\n"
-    }
-   },
+   "metadata": {},
    "outputs": [],
    "source": [
     "import json\n",
@@ -75,40 +47,22 @@
   },
   {
    "cell_type": "markdown",
-   "metadata": {
-    "pycharm": {
-     "name": "#%% md\n"
-    }
-   },
+   "metadata": {},
    "source": [
     "If not provided explicitly, these variables are used directly for optimization when a `HyperparameterProblem` is initialized."
    ]
   },
   {
    "cell_type": "markdown",
-   "metadata": {
-    "pycharm": {
-     "name": "#%% md\n"
-    }
-   },
+   "metadata": {},
    "source": [
     "Secondly, one needs to define what exactly should be optimized. For instance, for a single run on a problem (with a fixed random seed) using the well-known parameter optimization toolkit [Optuna](https://optuna.org), the implementation may look as follows:"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
-   "metadata": {
-    "execution": {
-     "iopub.execute_input": "2022-08-01T02:36:48.369753Z",
-     "iopub.status.busy": "2022-08-01T02:36:48.369415Z",
-     "iopub.status.idle": "2022-08-01T02:36:59.415863Z",
-     "shell.execute_reply": "2022-08-01T02:36:59.414988Z"
-    },
-    "pycharm": {
-     "name": "#%%\n"
-    }
-   },
+   "metadata": {},
    "outputs": [],
    "source": [
     "from pymoo.algorithms.hyperparameters import SingleObjectiveSingleRun, HyperparameterProblem\n",
@@ -141,29 +95,15 @@
   },
   {
    "cell_type": "markdown",
-   "metadata": {
-    "pycharm": {
-     "name": "#%% md\n"
-    }
-   },
+   "metadata": {},
    "source": [
     "Of course, you can also directly use the `MixedVariableGA` available in our framework:"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
-   "metadata": {
-    "execution": {
-     "iopub.execute_input": "2022-08-01T02:36:59.419480Z",
-     "iopub.status.busy": "2022-08-01T02:36:59.419084Z",
-     "iopub.status.idle": "2022-08-01T02:37:05.995629Z",
-     "shell.execute_reply": "2022-08-01T02:37:05.994612Z"
-    },
-    "pycharm": {
-     "name": "#%%\n"
-    }
-   },
+   "metadata": {},
    "outputs": [],
    "source": [
     "from pymoo.algorithms.hyperparameters import SingleObjectiveSingleRun, HyperparameterProblem\n",
@@ -198,29 +138,15 @@
   },
   {
    "cell_type": "markdown",
-   "metadata": {
-    "pycharm": {
-     "name": "#%% md\n"
-    }
-   },
+   "metadata": {},
    "source": [
     "Now, optimizing the parameters for a **single random seed** is often not desirable, and this is precisely what makes hyperparameter optimization computationally expensive. So instead of using just a single random seed, we can use the `MultiRun` performance assessment to average over multiple runs as follows:"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
-   "metadata": {
-    "execution": {
-     "iopub.execute_input": "2022-08-01T02:37:06.000183Z",
-     "iopub.status.busy": "2022-08-01T02:37:05.999864Z",
-     "iopub.status.idle": "2022-08-01T02:37:21.459474Z",
-     "shell.execute_reply": "2022-08-01T02:37:21.458554Z"
-    },
-    "pycharm": {
-     "name": "#%%\n"
-    }
-   },
+   "metadata": {},
    "outputs": [],
    "source": [
     "from pymoo.algorithms.hyperparameters import HyperparameterProblem, MultiRun, stats_single_objective_mean\n",
@@ -255,29 +181,15 @@
   },
   {
    "cell_type": "markdown",
-   "metadata": {
-    "pycharm": {
-     "name": "#%% md\n"
-    }
-   },
+   "metadata": {},
    "source": [
     "Another performance measure is the number of evaluations until a specific goal has been reached. For single-objective optimization, such a goal is usually reaching a minimum function value. Thus, for the termination, we use `MinimumFunctionValueTermination` with a value of `1e-5`. We run the method for each random seed until this value has been reached or at most `500` function evaluations have taken place. The performance is then measured by the average number of function evaluations (`func_stats=stats_avg_nevals`) to reach the goal."
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
-   "metadata": {
-    "execution": {
-     "iopub.execute_input": "2022-08-01T02:37:21.462989Z",
-     "iopub.status.busy": "2022-08-01T02:37:21.462728Z",
-     "iopub.status.idle": "2022-08-01T02:37:38.013305Z",
-     "shell.execute_reply": "2022-08-01T02:37:38.012403Z"
-    },
-    "pycharm": {
-     "name": "#%%\n"
-    }
-   },
+   "metadata": {},
    "outputs": [],
    "source": [
     "from pymoo.algorithms.hyperparameters import HyperparameterProblem, MultiRun, stats_avg_nevals\n",
@@ -313,7 +225,19 @@
    ]
   }
  ],
- "metadata": {},
+ "metadata": {
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3"
+  }
+ },
  "nbformat": 4,
  "nbformat_minor": 4
 }
34 changes: 23 additions & 11 deletions docs/source/algorithms/index.ipynb
@@ -2,9 +2,7 @@
  "cells": [
   {
    "cell_type": "raw",
-   "metadata": {
-    "raw_mimetype": "text/restructuredtext"
-   },
+   "metadata": {},
    "source": [
     ".. _nb_algorithms:"
    ]
@@ -18,9 +16,7 @@
   },
   {
    "cell_type": "raw",
-   "metadata": {
-    "raw_mimetype": "text/restructuredtext"
-   },
+   "metadata": {},
    "source": [
     ".. toctree::\n",
     " :hidden:\n",
@@ -52,7 +48,8 @@
     " moo/age2\n",
     " moo/rvea\n",
     " moo/sms\n",
-    " moo/dnsga2\n"
+    " moo/dnsga2\n",
+    " moo/kgb\n"
    ]
   },
   {
@@ -64,9 +61,7 @@
   },
   {
    "cell_type": "raw",
-   "metadata": {
-    "raw_mimetype": "text/restructuredtext"
-   },
+   "metadata": {},
    "source": [
     ".. admonition:: Overview\n",
     " :class: myOwnStyle\n",
@@ -77,7 +72,24 @@
    ]
   }
  ],
- "metadata": {},
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3"
+  }
+ },
  "nbformat": 4,
  "nbformat_minor": 4
 }
