Optimization

This notebook walks through the steps of setting up and running optimizations using basico. We start as usual:

[1]:

from basico import *


Model

The first step is to load a model (this can be done as usual using load_model, load_biomodel or load_example) or to create a new one. Here I'll create one with a typical optimization test problem, the Himmelblau function:

$f(x,y) = (x^2 + y - 11)^2 + (x + y^2 -7)^2$

In basico this is easily done using global parameters for x and y, and then an assignment for the function

[2]:

new_model(name="Himmelblau",
notes="""A model implementing the Himmelblau function

A local maximum is known to be at (-0.270845, -0.923039) with
value 181.617

4 minima: (3, 2), (-2.805118, 3.131312),
(-3.779310, -3.283186),
(3.584428, -1.848126), each with value 0

""");
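
As a quick sanity check outside of basico, the function and its known extrema can be verified in plain Python (the coordinates are rounded to six decimals, so small residuals remain):

```python
# Plain-Python version of the Himmelblau function, used only to
# sanity-check the extrema listed in the model notes.
def himmelblau(x, y):
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# the four minima should all evaluate to (nearly) zero
for x, y in [(3, 2), (-2.805118, 3.131312),
             (-3.779310, -3.283186), (3.584428, -1.848126)]:
    assert himmelblau(x, y) < 1e-6

# the local maximum has the stated value of about 181.617
assert abs(himmelblau(-0.270845, -0.923039) - 181.617) < 5e-3
```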

[3]:

add_parameter('x', initial_value=0)
add_parameter('y', initial_value=0)
add_parameter('f', type='assignment',
    expression='({Values[x].InitialValue}^2+{Values[y].InitialValue}-11)^2+({Values[x].InitialValue}+{Values[y].InitialValue}^2-7)^2');


The Setup

Now we set up the parameters to be varied during the optimization. For each item we need to specify what to vary, as well as the lower and upper bounds. The utility function get_opt_item_template retrieves all the global / local parameters and sets default bounds:

[4]:

get_opt_item_template(include_global=True)

[4]:

[{'name': 'Values[x].InitialValue',
'lower': 0.001,
'upper': 1000,
'start': 0.0},
{'name': 'Values[y].InitialValue',
'lower': 0.001,
'upper': 1000,
'start': 0.0}]


Let's use them:

[5]:

set_opt_parameters(get_opt_item_template(include_global=True))
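
Note that the default bounds (0.001 to 1000) exclude the three minima with negative coordinates, so the optimizer can only reach (3, 2). A minimal sketch of how the bounds could be widened before passing them on, using a hand-written copy of the template list rather than a live basico call:

```python
# a hand-written copy of what get_opt_item_template(include_global=True)
# returned above; in a live session you would use the real call
items = [{'name': 'Values[x].InitialValue', 'lower': 0.001, 'upper': 1000, 'start': 0.0},
         {'name': 'Values[y].InitialValue', 'lower': 0.001, 'upper': 1000, 'start': 0.0}]

# widen the lower bounds so the minima with negative coordinates
# fall inside the search region
for item in items:
    item['lower'] = -1000

# set_opt_parameters(items)  # then hand the adjusted list to basico
```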


The next thing is to set up the objective function. Any expression involving the names of model elements will work; here we want to minimize the value of the global parameter f:

[6]:

set_objective_function(expression='Values[f].InitialValue', minimize=True)


Additional settings can be modified using set_opt_settings, for example specifying the method to be used and its parameters:

[7]:

set_opt_settings(settings={
'method': {
'name': PE.LEVENBERG_MARQUARDT
}})


To verify the setup, you can use get_opt_parameters to retrieve all the parameters and bounds, and get_opt_settings to retrieve all settings:

[8]:

get_opt_settings()

[8]:

{'scheduled': False,
'update_model': False,
'problem': {'Maximize': False,
'Randomize Start Values': False,
'Calculate Statistics': True},
'method': {'Iteration Limit': 2000,
'Tolerance': 1e-06,
'name': 'Levenberg - Marquardt'},
'report': {'filename': '',
'report_definition': 'Optimization',
'append': True,
'confirm_overwrite': True},
'expression': 'Values[f].InitialValue'}


Running the optimization

Now that everything is set up, we can simply run the optimization:

[9]:

run_optimization()

[9]:

           lower  upper       sol
name
Values[x]  0.001   1000  2.999999
Values[y]  0.001   1000  2.000000

We got close to one of the minima. To see more information about the run, you can use:

[10]:

get_opt_statistic()

[10]:

{'obj': 8.352969862744543e-11,
'f_evals': 321,
'failed_evals_exception': 0,
'failed_evals_nan': 0,
'cpu_time': 0.015625,
'evals_per_sec': 4.867601246105919e-05}
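
As a consistency check on these statistics: the reported evals_per_sec value equals cpu_time divided by f_evals, i.e. it is actually the CPU time spent per evaluation in seconds:

```python
# statistics copied from the output above
stats = {'f_evals': 321, 'cpu_time': 0.015625,
         'evals_per_sec': 4.867601246105919e-05}

# the value matches cpu_time / f_evals (seconds per evaluation)
assert abs(stats['cpu_time'] / stats['f_evals'] - stats['evals_per_sec']) < 1e-12
```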


Custom Output

Normally run_optimization returns a data frame of the best parameters found, just as get_opt_solution does. Since we get those results in any case, there is an optional parameter you can pass to run_optimization to collect the value of any element you would like during the run.

This is an advanced feature: for many elements we only have Common Names, which are a bit unwieldy to use, but let's do that here.

NOTE: this will only work for real-valued CNs right now.

In the next run, I collect the values of x and y during the optimization; the objective function value appears as the last column:

[46]:

run_optimization(output=[
'Values[x].InitialValue',
'Values[y].InitialValue',
])

[46]:

0 0.001000 0.001000 1.699640e+02
1 0.878625 1.377625 9.616761e+01
2 2.114618 2.658079 1.973172e+01
3 2.190241 2.643311 1.741108e+01
4 2.306777 2.613025 1.395570e+01
5 2.462676 2.553946 9.611970e+00
6 2.637788 2.451821 5.248821e+00
7 2.797605 2.308513 2.017803e+00
8 2.911797 2.159174 4.605138e-01
9 2.972685 2.054027 4.865900e-02
10 2.994771 2.010190 1.717417e-03
11 2.999458 2.000959 1.609699e-05
12 2.999970 2.000046 4.086548e-08
13 2.999998 2.000001 1.517342e-10
14 2.999999 2.000000 8.410543e-11
15 2.999999 2.000000 8.353344e-11
16 2.999999 2.000000 8.352971e-11
17 2.999999 2.000000 8.352970e-11
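
The collected values show the optimizer marching from the start point to the (3, 2) minimum. As a plain-Python consistency check, the objective column can be recomputed from the reported x and y values (rows copied by hand from the output above; x and y are rounded to six decimals, so a small tolerance is needed):

```python
# Recompute the objective from the reported x/y values for a few rows
def himmelblau(x, y):
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# (x, y, objective) rows copied from the trajectory above
rows = [(0.001, 0.001, 1.699640e+02),        # first row
        (2.462676, 2.553946, 9.611970e+00),  # row 5
        (2.999999, 2.000000, 8.352970e-11)]  # last row

for x, y, obj in rows:
    assert abs(himmelblau(x, y) - obj) < 1e-3

# the run converged to a near-zero objective
assert rows[-1][2] < 1e-9
```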