Retrieving all Task / Method Settings

This notebook lists all the settings available for each task. To change the settings, the approach is:

  • find all the methods available for a task using get_valid_methods (added in basico 0.77)

  • retrieve the current settings using get_task_settings

  • change the task's method by specifying a different one using set_task_settings(task, settings={'method': {'name': name}})

  • retrieve the new method parameters using get_task_settings again

The print_task_info function below does that.

[1]:
from basico import *
new_model()

from IPython.display import display, Markdown
import json
def print_task_info(task: str):
    """Print the valid methods, problem settings and per-method settings of a task."""
    display(Markdown(f"### {task}"))

    # list all methods that can be assigned to this task
    method_names = get_valid_methods(task)
    display(Markdown("**Valid Methods**"))
    for name in method_names:
        display(Markdown(f"- {name}"))

    # the problem settings are independent of the selected method
    settings = get_task_settings(task)
    display(Markdown("**Problem**"))
    problem = json.dumps(settings['problem'], indent=4)
    display(Markdown(f"```python\n {problem}\n```"))

    # switch the task to each method in turn and print its parameters
    for name in method_names:
        display(Markdown(f"#### Method: {name}"))
        set_task_settings(task, settings={'method': {'name': name}})
        method_settings = json.dumps(get_task_settings(task)['method'], indent=4)
        display(Markdown(f"```python\n {method_settings}\n```"))

Without further ado, here are all the tasks and their method settings currently available:

[2]:
for task in T.all_task_names():
    print_task_info(task)

Steady-State

Valid Methods

  • Enhanced Newton

Problem

 {
    "JacobianRequested": true,
    "StabilityAnalysisRequested": true
}

Method: Enhanced Newton

 {
    "Resolution": 1e-09,
    "Derivation Factor": 0.001,
    "Use Newton": true,
    "Use Integration": true,
    "Use Back Integration": false,
    "Accept Negative Concentrations": false,
    "Iteration Limit": 50,
    "Maximum duration for forward integration": 1000000000.0,
    "Maximum duration for backward integration": 1000000.0,
    "Target Criterion": "Distance and Rate",
    "name": "Enhanced Newton"
}

Time-Course

Valid Methods

  • Deterministic (LSODA)

  • Deterministic (RADAU5)

  • Stochastic (Gibson + Bruck)

  • Stochastic (Direct method)

  • Stochastic (τ-Leap)

  • Stochastic (Adaptive SSA/τ-Leap)

  • Hybrid (Runge-Kutta)

  • Hybrid (LSODA)

  • Hybrid (RK-45)

  • SDE Solver (RI5)

Problem

 {
    "AutomaticStepSize": false,
    "StepNumber": 100,
    "StepSize": 0.01,
    "Duration": 1.0,
    "TimeSeriesRequested": true,
    "OutputStartTime": 0.0,
    "Output Event": false,
    "Start in Steady State": false,
    "Use Values": false,
    "Values": ""
}

Method: Deterministic (LSODA)

 {
    "Integrate Reduced Model": false,
    "Relative Tolerance": 1e-06,
    "Absolute Tolerance": 1e-12,
    "Max Internal Steps": 100000,
    "Max Internal Step Size": 0.0,
    "name": "Deterministic (LSODA)"
}

Method: Deterministic (RADAU5)

 {
    "Integrate Reduced Model": false,
    "Relative Tolerance": 0.0001,
    "Absolute Tolerance": 1e-06,
    "Max Internal Steps": 1000000000,
    "Initial Step Size": 0.001,
    "name": "Deterministic (RADAU5)"
}

Method: Stochastic (Gibson + Bruck)

 {
    "Max Internal Steps": 1000000,
    "Subtype": 2,
    "Use Random Seed": false,
    "Random Seed": 1,
    "name": "Stochastic (Gibson + Bruck)"
}

Method: Stochastic (Direct method)

 {
    "Max Internal Steps": 1000000,
    "Use Random Seed": false,
    "Random Seed": 1,
    "name": "Stochastic (Direct method)"
}

Method: Stochastic (τ-Leap)

 {
    "Epsilon": 0.001,
    "Max Internal Steps": 10000,
    "Use Random Seed": false,
    "Random Seed": 1,
    "name": "Stochastic (\u03c4-Leap)"
}

Method: Stochastic (Adaptive SSA/τ-Leap)

 {
    "Epsilon": 0.03,
    "Max Internal Steps": 1000000,
    "Use Random Seed": false,
    "Random Seed": 1,
    "name": "Stochastic (Adaptive SSA/\u03c4-Leap)"
}

Method: Hybrid (Runge-Kutta)

 {
    "Max Internal Steps": 1000000,
    "Lower Limit": 800.0,
    "Upper Limit": 1000.0,
    "Partitioning Interval": 1,
    "Use Random Seed": false,
    "Random Seed": 1,
    "Runge Kutta Stepsize": 0.001,
    "name": "Hybrid (Runge-Kutta)"
}

Method: Hybrid (LSODA)

 {
    "Max Internal Steps": 1000000,
    "Lower Limit": 800.0,
    "Upper Limit": 1000.0,
    "Partitioning Interval": 1,
    "Use Random Seed": false,
    "Random Seed": 1,
    "Integrate Reduced Model": false,
    "Relative Tolerance": 1e-06,
    "Absolute Tolerance": 1e-12,
    "Max Internal Step Size": 0.0,
    "name": "Hybrid (LSODA)"
}

Method: Hybrid (RK-45)

 {
    "Max Internal Steps": 100000,
    "Relative Tolerance": 1e-06,
    "Absolute Tolerance": 1e-09,
    "Partitioning Strategy": "User specified Partition",
    "Use Random Seed": false,
    "Random Seed": 1,
    "name": "Hybrid (RK-45)"
}

Method: SDE Solver (RI5)

 {
    "Internal Steps Size": 0.0001,
    "Max Internal Steps": 10000,
    "Force Physical Correctness": true,
    "Absolute Tolerance": 1e-06,
    "Tolerance for Root Finder": 1e-06,
    "name": "SDE Solver (RI5)"
}

Scan

Valid Methods

  • Scan Framework

Problem

 {
    "Subtask": 1,
    "Subtask Output": "subTaskDuring",
    "Adjust initial conditions": false,
    "Continue on Error": false
}

Method: Scan Framework

 {
    "name": "Scan Framework"
}

Elementary Flux Modes

Valid Methods

  • EFM Algorithm

Problem

{}

Method: EFM Algorithm

 {
    "name": "EFM Algorithm"
}

Optimization

Valid Methods

  • Current Solution Statistics

  • Differential Evolution

  • Evolution Strategy (SRES)

  • Evolutionary Programming

  • Genetic Algorithm

  • Genetic Algorithm SR

  • Hooke & Jeeves

  • Levenberg - Marquardt

  • Nelder - Mead

  • Particle Swarm

  • Praxis

  • Random Search

  • Scatter Search

  • Simulated Annealing

  • Steepest Descent

  • Truncated Newton

Problem

 {
    "Subtask": "CN=Root,Vector=TaskList[Steady-State]",
    "Maximize": false,
    "Randomize Start Values": false,
    "Calculate Statistics": true
}

Method: Current Solution Statistics

 {
    "name": "Current Solution Statistics"
}

Method: Differential Evolution

 {
    "Number of Generations": 2000,
    "Population Size": 10,
    "name": "Differential Evolution"
}

Method: Evolution Strategy (SRES)

 {
    "Number of Generations": 200,
    "Population Size": 20,
    "Pf": 0.475,
    "name": "Evolution Strategy (SRES)"
}

Method: Evolutionary Programming

 {
    "Number of Generations": 200,
    "Population Size": 20,
    "name": "Evolutionary Programming"
}

Method: Genetic Algorithm

 {
    "Number of Generations": 200,
    "Population Size": 20,
    "name": "Genetic Algorithm"
}

Method: Genetic Algorithm SR

 {
    "Number of Generations": 200,
    "Population Size": 20,
    "Pf": 0.475,
    "name": "Genetic Algorithm SR"
}

Method: Hooke & Jeeves

 {
    "Iteration Limit": 50,
    "Tolerance": 1e-05,
    "Rho": 0.2,
    "name": "Hooke & Jeeves"
}

Method: Levenberg - Marquardt

 {
    "Iteration Limit": 2000,
    "Tolerance": 1e-06,
    "name": "Levenberg - Marquardt"
}

Method: Nelder - Mead

 {
    "Iteration Limit": 200,
    "Tolerance": 1e-05,
    "Scale": 10.0,
    "name": "Nelder - Mead"
}

Method: Particle Swarm

 {
    "Iteration Limit": 2000,
    "Swarm Size": 50,
    "Std. Deviation": 1e-06,
    "name": "Particle Swarm"
}

Method: Praxis

 {
    "Tolerance": 1e-05,
    "name": "Praxis"
}

Method: Random Search

 {
    "Number of Iterations": 100000,
    "name": "Random Search"
}

Method: Scatter Search

 {
    "Number of Iterations": 200,
    "name": "Scatter Search"
}

Method: Simulated Annealing

 {
    "Start Temperature": 1.0,
    "Cooling Factor": 0.85,
    "Tolerance": 1e-06,
    "name": "Simulated Annealing"
}

Method: Steepest Descent

 {
    "Iteration Limit": 100,
    "Tolerance": 1e-06,
    "name": "Steepest Descent"
}

Method: Truncated Newton

 {
    "name": "Truncated Newton"
}

Parameter Estimation

Valid Methods

  • Current Solution Statistics

  • Differential Evolution

  • Evolution Strategy (SRES)

  • Evolutionary Programming

  • Genetic Algorithm

  • Genetic Algorithm SR

  • Hooke & Jeeves

  • Levenberg - Marquardt

  • NL2SOL

  • Nelder - Mead

  • Particle Swarm

  • Praxis

  • Random Search

  • Scatter Search

  • Simulated Annealing

  • Steepest Descent

  • Truncated Newton

Problem

 {
    "Maximize": false,
    "Randomize Start Values": false,
    "Calculate Statistics": true,
    "Steady-State": "CN=Root,Vector=TaskList[Steady-State]",
    "Time-Course": "CN=Root,Vector=TaskList[Time-Course]",
    "Create Parameter Sets": false,
    "Use Time Sens": false,
    "Time-Sens": ""
}

Method: Current Solution Statistics

 {
    "name": "Current Solution Statistics"
}

Method: Differential Evolution

 {
    "Number of Generations": 2000,
    "Population Size": 10,
    "name": "Differential Evolution"
}

Method: Evolution Strategy (SRES)

 {
    "Number of Generations": 200,
    "Population Size": 20,
    "Pf": 0.475,
    "name": "Evolution Strategy (SRES)"
}

Method: Evolutionary Programming

 {
    "Number of Generations": 200,
    "Population Size": 20,
    "name": "Evolutionary Programming"
}

Method: Genetic Algorithm

 {
    "Number of Generations": 200,
    "Population Size": 20,
    "name": "Genetic Algorithm"
}

Method: Genetic Algorithm SR

 {
    "Number of Generations": 200,
    "Population Size": 20,
    "Pf": 0.475,
    "name": "Genetic Algorithm SR"
}

Method: Hooke & Jeeves

 {
    "Iteration Limit": 50,
    "Tolerance": 1e-05,
    "Rho": 0.2,
    "name": "Hooke & Jeeves"
}

Method: Levenberg - Marquardt

 {
    "Iteration Limit": 2000,
    "Tolerance": 1e-06,
    "name": "Levenberg - Marquardt"
}

Method: NL2SOL

 {
    "Iteration Limit": 2000,
    "name": "NL2SOL"
}

Method: Nelder - Mead

 {
    "Iteration Limit": 200,
    "Tolerance": 1e-05,
    "Scale": 10.0,
    "name": "Nelder - Mead"
}

Method: Particle Swarm

 {
    "Iteration Limit": 2000,
    "Swarm Size": 50,
    "Std. Deviation": 1e-06,
    "name": "Particle Swarm"
}

Method: Praxis

 {
    "Tolerance": 1e-05,
    "name": "Praxis"
}

Method: Random Search

 {
    "Number of Iterations": 100000,
    "name": "Random Search"
}

Method: Scatter Search

 {
    "Number of Iterations": 200,
    "name": "Scatter Search"
}

Method: Simulated Annealing

 {
    "Start Temperature": 1.0,
    "Cooling Factor": 0.85,
    "Tolerance": 1e-06,
    "name": "Simulated Annealing"
}

Method: Steepest Descent

 {
    "Iteration Limit": 100,
    "Tolerance": 1e-06,
    "name": "Steepest Descent"
}

Method: Truncated Newton

 {
    "name": "Truncated Newton"
}

Metabolic Control Analysis

Valid Methods

  • MCA Method (Reder)

Problem

{}

Method: MCA Method (Reder)

 {
    "Modulation Factor": 1e-09,
    "Use Reder": true,
    "Use Smallbone": true,
    "name": "MCA Method (Reder)"
}

Lyapunov Exponents

Valid Methods

  • Wolf Method

Problem

 {
    "ExponentNumber": 3,
    "DivergenceRequested": true,
    "TransientTime": 0.0
}

Method: Wolf Method

 {
    "Orthonormalization Interval": 1.0,
    "Overall time": 1000.0,
    "Relative Tolerance": 1e-06,
    "Absolute Tolerance": 1e-12,
    "Max Internal Steps": 10000,
    "name": "Wolf Method"
}

Time Scale Separation Analysis

Valid Methods

  • ILDM (LSODA,Deuflhard)

  • ILDM (LSODA,Modified)

  • CSP (LSODA)

Problem

 {
    "StepNumber": 100,
    "StepSize": 0.01,
    "Duration": 1.0,
    "TimeSeriesRequested": true,
    "OutputStartTime": 0.0
}

Method: ILDM (LSODA,Deuflhard)

 {
    "Deuflhard Tolerance": 0.0001,
    "name": "ILDM (LSODA,Deuflhard)"
}

Method: ILDM (LSODA,Modified)

 {
    "Deuflhard Tolerance": 0.0001,
    "name": "ILDM (LSODA,Modified)"
}

Method: CSP (LSODA)

 {
    "Integrate Reduced Model": true,
    "Ratio of Modes Separation": 0.9,
    "Maximum Relative Error": 0.001,
    "Maximum Absolute Error": 1e-06,
    "Refinement Iterations Number": 1000,
    "name": "CSP (LSODA)"
}

Sensitivities

Valid Methods

  • Sensitivities Method

Problem

 {
    "SubtaskType": 1
}

Method: Sensitivities Method

 {
    "Delta factor": 0.001,
    "Delta minimum": 1e-12,
    "name": "Sensitivities Method"
}

Moieties

Valid Methods

  • Householder Reduction

Problem

{}

Method: Householder Reduction

 {
    "name": "Householder Reduction"
}

Cross Section

Valid Methods

  • Deterministic (LSODA)

Problem

 {
    "AutomaticStepSize": false,
    "StepNumber": 100,
    "StepSize": 0.01,
    "Duration": 1.0,
    "TimeSeriesRequested": true,
    "OutputStartTime": 0.0,
    "Output Event": false,
    "Start in Steady State": false,
    "Use Values": false,
    "Values": "",
    "LimitCrossings": false,
    "NumCrossingsLimit": 0,
    "LimitOutTime": false,
    "LimitOutCrossings": false,
    "PositiveDirection": true,
    "NumOutCrossingsLimit": 0,
    "LimitUntilConvergence": false,
    "ConvergenceTolerance": 1e-06,
    "Threshold": 0.0,
    "DelayOutputUntilConvergence": false,
    "OutputConvergenceTolerance": 1e-06,
    "SingleVariable": ""
}

Method: Deterministic (LSODA)

 {
    "Integrate Reduced Model": false,
    "Relative Tolerance": 1e-06,
    "Absolute Tolerance": 1e-12,
    "Max Internal Steps": 100000,
    "Max Internal Step Size": 0.0,
    "name": "Deterministic (LSODA)"
}

Linear Noise Approximation

Valid Methods

  • Linear Noise Approximation

Problem

{}

Method: Linear Noise Approximation

 {
    "name": "Linear Noise Approximation"
}

Time-Course Sensitivities

Valid Methods

  • LSODA Sensitivities

Problem

 {
    "AutomaticStepSize": false,
    "StepNumber": 100,
    "StepSize": 0.01,
    "Duration": 1.0,
    "TimeSeriesRequested": true,
    "OutputStartTime": 0.0,
    "Output Event": false,
    "Start in Steady State": false,
    "Use Values": false,
    "Values": ""
}

Method: LSODA Sensitivities

 {
    "Integrate Reduced Model": false,
    "Relative Tolerance": 1e-06,
    "Absolute Tolerance": 1e-12,
    "Max Internal Steps": 10000,
    "Max Internal Step Size": 0.0,
    "name": "LSODA Sensitivities"
}
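The same set_task_settings call used above to switch methods also accepts the individual parameters listed in this notebook. As a minimal sketch (the parameter names are taken from the Time-Course listing above, the values are only examples), the following switches the Time-Course task to the Gibson + Bruck stochastic method and fixes its random seed:

[ ]:
# switch the Time-Course task to a stochastic method and set its parameters;
# the names match the 'Stochastic (Gibson + Bruck)' listing above
set_task_settings('Time-Course', settings={
    'method': {
        'name': 'Stochastic (Gibson + Bruck)',
        'Use Random Seed': True,
        'Random Seed': 1234,
    }
})

# reading the settings back confirms the change
get_task_settings('Time-Course')['method']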
[ ]: