Optimization

This notebook walks through the steps of setting up and running optimizations with basico. We start as usual:

[1]:
from basico import *

Model

The first step is to load a model (as usual, this can be done using load_model, load_biomodel or load_example) or create a new one. Here I'll create one with a typical optimization test problem, the Himmelblau function:

\[f(x,y) = (x^2 + y - 11)^2 + (x + y^2 -7)^2\]
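As a quick sanity check, here is the same function in plain Python (independent of basico), evaluated at two of its known critical points:

# Himmelblau's function in plain Python, used only to check the known optima
def himmelblau(x, y):
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

print(himmelblau(3, 2))                  # 0.0, one of the four global minima
print(himmelblau(-0.270845, -0.923039))  # ~181.617, the local maximum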

In basico this is easily done using global parameters for x and y, and an assignment for the function:

[2]:
new_model(name="Himmelblau",
          notes="""A model implementing the Himmelblau function

          The local maximum is at (-0.270845, -0.923039) with
          value 181.617

          4 minima: (3, 2), (-2.805118, 3.131312),
                    (-3.779310, -3.283186),
                    (3.584428, -1.848126), all with value 0

          """);
[3]:
add_parameter('x', initial_value=0)
add_parameter('y', initial_value=0)
add_parameter('f', type='assignment',
              expression='({Values[x].InitialValue}^2+{Values[y].InitialValue}-11)^2+({Values[x].InitialValue}+{Values[y].InitialValue}^2-7)^2');
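
To check that the assignment is wired up correctly, you can move x and y to one of the known minima and inspect the parameters. A minimal sketch using set_parameters / get_parameters (the exact columns of the returned data frame may vary between basico versions):

# move (x, y) to the known minimum at (3, 2); f should evaluate to ~0 there
set_parameters('x', initial_value=3)
set_parameters('y', initial_value=2)
print(get_parameters())

# restore the original start point before optimizing
set_parameters('x', initial_value=0)
set_parameters('y', initial_value=0)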

The Setup

Now we set up the parameters to be varied during the optimization. For each item we need to specify what to vary, as well as the lower and upper bounds. The utility function get_opt_item_template retrieves all the global / local parameters and sets default bounds:

[4]:
get_opt_item_template(include_global=True, default_lb=-1)
[4]:
[{'name': 'Values[x].InitialValue', 'lower': -1, 'upper': 1000, 'start': 0.0},
 {'name': 'Values[y].InitialValue', 'lower': -1, 'upper': 1000, 'start': 0.0}]
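
The template is a plain list of dictionaries, so individual entries can be edited before they are passed on, for example to tighten a bound (shown here as an illustration only):

# illustration: the template entries are plain dicts and can be adjusted
items = get_opt_item_template(include_global=True, default_lb=-1)
items[0]['upper'] = 10  # tighten the upper bound for x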

Here, though, let's use them unchanged:

[5]:
set_opt_parameters(get_opt_item_template(include_global=True, default_lb=-1))

The next thing is to set up the objective function. Any expression involving the names of model elements will work; here we want to minimize the value of the global parameter f:

[6]:
set_objective_function(expression='Values[f].InitialValue', minimize=True)

Additional settings can be modified using set_opt_settings, such as specifying the method to be used and its parameters:

[7]:
set_opt_settings(settings={
    'subtask': T.TIME_COURSE,
    'method': {
        'name': PE.LEVENBERG_MARQUARDT
    }
})

To verify the setup, you can use get_opt_parameters to retrieve all the parameters and bounds, and get_opt_settings to retrieve all settings:

[8]:
get_opt_settings()
[8]:
{'scheduled': False,
 'update_model': False,
 'problem': {'Maximize': False,
  'Randomize Start Values': False,
  'Calculate Statistics': True},
 'method': {'Iteration Limit': 2000,
  'Tolerance': 1e-06,
  'name': 'Levenberg - Marquardt'},
 'report': {'filename': '',
  'report_definition': 'Optimization',
  'append': True,
  'confirm_overwrite': True},
 'expression': 'Values[f].InitialValue',
 'subtask': 'Time-Course'}
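
Other optimizers are configured the same way through the method dictionary; the additional keys follow COPASI's method parameter names. As a sketch (not executed here, the runs below keep Levenberg-Marquardt), switching to the genetic algorithm might look like this:

# sketch: use COPASI's genetic algorithm with a larger population instead
set_opt_settings(settings={
    'method': {
        'name': PE.GENETIC_ALGORITHM,
        'Number of Generations': 200,
        'Population Size': 50,
    }})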

Running the optimization

Now that everything is set up, we can simply run the optimization:

[9]:
run_optimization()
[9]:
           lower  upper       sol
name
Values[x]     -1   1000  2.999999
Values[y]     -1   1000  2.000000

We got close to one of the minima. To see more information about the run, you can use:

[10]:
get_opt_statistic()
[10]:
{'obj': 8.352969862744543e-11,
 'f_evals': 321,
 'failed_evals_exception': 0,
 'failed_evals_nan': 0,
 'constraint_evals': 0,
 'failed_constraint_evals': 0,
 'cpu_time': 0.001112,
 'evals_per_sec': 3.46417445482866e-06}
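
If you need the best parameter values programmatically, the solution data frame from the last run can be retrieved at any time with get_opt_solution:

# the solution of the last optimization run, as in the data frame above
sol = get_opt_solution()
print(sol['sol'])  # best values found for Values[x] and Values[y]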

Custom Output

Normally, run_optimization returns a data frame of the best parameters found, the same result get_opt_solution returns (as shown above). Since we get those results in any case, there is an optional parameter you can pass to run_optimization to collect any elements you like during the run.

This is an advanced feature, as many elements are only addressable through their Common Names (CNs), which are a bit unwieldy to use. Still, let's do that here.

NOTE: this currently only works for real-valued CNs.

In the next run, I collect the values of x and y along with the best objective function value found so far:

[11]:
run_optimization(output=[
    'Values[x].InitialValue',
    'Values[y].InitialValue',
    'CN=Root,Vector=TaskList[Optimization],Problem=Optimization,Reference=Best Value'
])
[11]:
    Values[x].InitialValue  Values[y].InitialValue  TaskList[Optimization].(Problem)Optimization.Best Value
0                 0.000000                0.000000                                              1.700000e+02
1                 0.875001                1.375001                                              9.641837e+01
2                 2.108645                2.656495                                              1.987741e+01
3                 2.185752                2.641869                                              1.750903e+01
4                 2.303988                2.611787                                              1.400292e+01
5                 2.461318                2.552965                                              9.623309e+00
6                 2.637356                2.451114                                              5.245471e+00
7                 2.797570                2.308062                                              2.014153e+00
8                 2.911844                2.158941                                              4.593865e-01
9                 2.972712                2.053942                                              4.851944e-02
10                2.994778                2.010172                                              1.711901e-03
11                2.999459                2.000957                                              1.604411e-05
12                2.999970                2.000046                                              4.073483e-08
13                2.999998                2.000001                                              1.515858e-10
14                2.999999                2.000000                                              8.410451e-11
15                2.999999                2.000000                                              8.353344e-11
16                2.999999                2.000000                                              8.352971e-11
17                2.999999                2.000000                                              8.352970e-11
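
Since the result is a plain pandas data frame, the convergence is straightforward to plot. A minimal sketch, assuming matplotlib is installed:

import matplotlib.pyplot as plt

# re-run, collecting only the best objective value at each improvement
df = run_optimization(output=[
    'CN=Root,Vector=TaskList[Optimization],Problem=Optimization,Reference=Best Value'
])
df.plot(logy=True, legend=False)
plt.xlabel('improvement #')
plt.ylabel('best objective value')
plt.show()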

Constraints

You can further constrain the optimization problem by defining constraints that are evaluated when the model is simulated. This is useful, for example, when you want to ensure that concentrations stay within a specific range.

You can modify constraints using the set_opt_constraints function and retrieve them using get_opt_constraints. As an example, we constrain the solution to lie between 170 and 200 when maximizing:

[29]:
set_opt_constraints([
    {'name': 'Values[f]', 'lower': 170, 'upper': 200}
])
[30]:
get_opt_constraints()
[30]:
           lower  upper       start                                                 cn
name
Values[f]    170    200  181.616522  CN=Root,Model=Himmelblau,Vector=Values[f],Refe...
[31]:
settings = get_opt_settings()
settings['problem']['Maximize'] = True
run_optimization(settings=settings)
get_opt_statistic()
[31]:
{'obj': 181.6165215225808,
 'f_evals': 282,
 'failed_evals_exception': 0,
 'failed_evals_nan': 0,
 'constraint_evals': 33,
 'failed_constraint_evals': 0,
 'cpu_time': 0.001709,
 'evals_per_sec': 6.060283687943263e-06}

Note: the non-zero constraint_evals entry above confirms that the constraints were evaluated during the run. Keep in mind that constraints make it harder for the algorithms to find a solution, so when in doubt it might be a good idea to leave them out.
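
To drop the constraints again, you can pass an empty list (a sketch, assuming set_opt_constraints replaces the existing constraint list):

# sketch: remove all constraints by setting an empty list
set_opt_constraints([])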
