

Running an Optimization

This guide walks you through the complete workflow for running an optimization in Ionworks Studio.

Prerequisites

Before starting an optimization, ensure your project contains at least one cell specification and a compatible parameterized model; both are selected in Step 2.

Step 1: Name your optimization (optional)

You can give your optimization a descriptive name to identify it in the optimizations list, or leave the name blank to have one generated automatically.
  1. Navigate to your project’s Optimizations page
  2. Click New Optimization and select a template. You can choose from the built-in system templates (Design or Charge) or any project templates you’ve created.
  3. Optionally enter a name for your optimization (e.g., “Fast charge protocol v2” or “Electrode thickness sweep”)
If you leave the name blank, Ionworks generates a unique name for you (e.g., “Optimization a1b2c3d4”). You can always rename it later from the optimization detail page.
Choose a name that describes the goal or configuration of the run. This makes it easier to compare results when you have multiple optimizations in a project.

Step 2: Choose cell and model

Select the cell specification and parameterized model for your optimization.

Cell specification

In the Cell & Model section, select your cell from the dropdown. The cell specification provides:
  • Nominal capacity (used to convert C-rates to currents)
  • Voltage limits (used as defaults in experiments)
  • Chemistry information

Parameterized model

Next, select the parameterized model that will be used for simulations during optimization.
  1. Select a model from the dropdown
  2. Only models compatible with your selected cell will appear
  3. The model’s parameter values become the baseline for optimization
The parameterized model provides the starting point for all parameters. During optimization, only the parameters you explicitly add to the optimization will be varied—all others remain at their model values.
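A minimal sketch of this override behavior, using hypothetical parameter names and values (not taken from any real model):

```python
# Illustrative only: hypothetical baseline values from a parameterized model.
model_params = {
    "Positive electrode thickness [m]": 100e-6,
    "Negative electrode thickness [m]": 85e-6,
    "Separator thickness [m]": 12e-6,
}

def effective_parameters(baseline, optimized):
    """Optimized values override the baseline; every parameter not added
    to the optimization keeps its model value."""
    return {**baseline, **optimized}

params = effective_parameters(model_params,
                              {"Positive electrode thickness [m]": 92e-6})
```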

Step 3: Configure objectives

Objectives define what you want to achieve. Each objective includes an experiment, goals, and optional constraints.

Add an objective

  1. Click Add Objective to create a new objective
  2. Give it a descriptive name (e.g., “charge_time”, “energy_density”)

Define the experiment

The experiment specifies the simulation protocol to run. Enter it in the experiment editor in UCP format:
- CC Charge:
    - Charge:
        mode: Current
        value: Input["I_charge"]
        ends:
          - "Voltage > 4.2"
- CV Hold:
    - Charge:
        mode: Voltage
        value: 4.2
        ends:
          - "C-rate < 0.05"
Use Input["parameter_name"] syntax to reference parameters you want to optimize. The editor is the same multi-line code editor used on the Protocols page. It supports syntax highlighting, multi-line entry, and large protocols. As you type, it parses your UCP so formatting issues surface before you submit. You can paste an existing UCP protocol straight in or start from a template and edit in place.
If your UCP experiment includes an initial_temperature in the global configuration, the optimizer automatically sets the simulation’s initial and ambient temperature to that value (converted from Celsius to Kelvin). Similarly, initial_state_type and initial_state_value are used to set the initial state of charge or voltage. You do not need to add these as separate parameters.
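The Celsius-to-Kelvin conversion mentioned above is straightforward; the function name below is a hypothetical stand-in, not part of the Ionworks API:

```python
def initial_temperature_to_kelvin(initial_temperature_c):
    """Convert a UCP global-config initial_temperature (given in Celsius)
    to the Kelvin value used for the simulation's initial and ambient
    temperatures."""
    return initial_temperature_c + 273.15
```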

Add custom variables (optional)

If you need to optimize based on derived quantities:
  1. Click Add Variable in the Custom Variables section
  2. Enter a Variable Name (e.g., “Anode Potential”)
  3. Enter a PyBaMM Expression (e.g., pybamm.CoupledVariable("Negative electrode surface potential difference [V]"))
  4. The expression is validated when you click away from the field
Custom variables become available in the variable dropdowns for goals and constraints.
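As a rough mental model of the on-blur validation (this is a sketch, not Studio's actual validator), a first pass could simply check that the expression is syntactically valid Python:

```python
import ast

def expression_parses(expr: str) -> bool:
    """Cheap syntactic pre-check of a custom-variable expression.
    The real validation also checks that the expression is meaningful
    to PyBaMM; this sketch only catches syntax errors."""
    try:
        ast.parse(expr, mode="eval")
        return True
    except SyntaxError:
        return False
```

A missing closing parenthesis, for example, would be flagged before you submit the form.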

Add goals

Goals define what to maximize or minimize:
  1. Click Add Goal
  2. Configure the goal:
    • Name: Descriptive name (e.g., “minimize_time”)
    • Weight: Relative importance (default: 1.0)
    • Action: “Maximize” or “Minimize”
    • Variable: The output variable to optimize
    • Metric Type: How to extract the value (see below)
    • Value (for crossing metrics): The crossing point
Metric Type Options:
Type         Use Case                                    Value Field
Maximum      Peak value (e.g., max temperature)          Not needed
Minimum      Lowest value (e.g., min anode potential)    Not needed
Mean         Average value                               Not needed
Sum          Accumulated total                           Not needed
PointBased   Single-point values                         Not needed
Last         Final value of the variable                 Not needed
Time         Value at a specific time                    Time in seconds (-1 for end)
SOC          Value at a specific SOC                     SOC as fraction (0-1)
Voltage      Value at a specific voltage                 Voltage in V
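To make the metric types concrete, here is an illustrative reimplementation of several of them over a recorded time series (not Ionworks code; PointBased, SOC, and Voltage are omitted for brevity):

```python
def extract_metric(times, values, metric_type, value=None):
    """Reduce a simulated time series to a single number, following the
    metric-type semantics in the table above (illustrative subset)."""
    if metric_type == "Maximum":
        return max(values)
    if metric_type == "Minimum":
        return min(values)
    if metric_type == "Mean":
        return sum(values) / len(values)
    if metric_type == "Sum":
        return sum(values)
    if metric_type == "Last":
        return values[-1]
    if metric_type == "Time":
        if value == -1:  # -1 means "at the end of the simulation"
            return values[-1]
        # take the closest recorded time point
        i = min(range(len(times)), key=lambda j: abs(times[j] - value))
        return values[i]
    raise ValueError(f"unsupported metric type: {metric_type}")
```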

Evaluate per cycle or step

For multi-cycle experiments you can wrap any goal or constraint metric with Evaluate per step/cycle to apply it only to selected cycles or steps:
  • Per Cycle — pick a list of cycle indices and (optionally) one step within each cycle.
  • Per Step — pick a list of absolute step indices in the flattened experiment.
The form shows the experiment’s bounds (total cycles, steps per cycle, total steps) and rejects out-of-range indices. See Iterative metrics for examples.
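The relationship between per-cycle and absolute step indices can be sketched as follows (hypothetical helper, assuming every cycle has the same number of steps):

```python
def absolute_step(cycle, step, steps_per_cycle, total_cycles):
    """Map a (cycle, step) pair to an absolute index in the flattened
    experiment, rejecting out-of-range indices as the form does."""
    if not 0 <= cycle < total_cycles:
        raise IndexError(f"cycle {cycle} out of range (0-{total_cycles - 1})")
    if not 0 <= step < steps_per_cycle:
        raise IndexError(f"step {step} out of range (0-{steps_per_cycle - 1})")
    return cycle * steps_per_cycle + step
```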

Validate against experiment steps

Each objective has a Validate against experiment steps toggle (enabled by default). When enabled, the optimizer checks that the simulation completes all steps defined in the experiment protocol. If any step is skipped or not reached, the evaluation is penalized. Disable this toggle when your optimization may cause simulations to terminate before completing all steps — for example, if the optimizer is exploring aggressive parameter combinations that trigger early cutoff conditions. In these cases, the optimizer can still extract useful information from partial simulations without being penalized for incomplete experiments.

Add Constraints

Constraints define limits that must be respected:
  1. Click Add Constraint
  2. Configure the constraint:
    • Name: Descriptive name (e.g., “no_plating”)
    • Penalty: Weight for violations (default: 1e6)
    • Action: “GreaterThan” or “LessThan”
    • Constraint Value: The threshold
    • Variable: The output variable to constrain
    • Metric Type: How to extract the value
Example: Prevent lithium plating
Name: anode_potential_constraint
Action: GreaterThan
Constraint Value: 0
Variable: Negative electrode surface potential difference [V]
Metric Type: Minimum
This ensures the minimum anode potential stays above 0 V throughout the simulation.
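Conceptually, a violated constraint adds a penalty proportional to the size of the violation. A toy version of the lithium-plating example above (illustrative, not the actual implementation):

```python
def constraint_penalty(values, action, threshold, penalty=1e6):
    """Penalty contribution of a Minimum-metric constraint: zero when
    satisfied, penalty-weighted violation magnitude when not
    (illustrative)."""
    metric_value = min(values)  # Metric Type: Minimum
    if action == "GreaterThan":
        violation = max(0.0, threshold - metric_value)
    else:  # "LessThan"
        violation = max(0.0, metric_value - threshold)
    return penalty * violation

# Anode potential dips to -0.02 V, violating the GreaterThan-0 constraint.
cost = constraint_penalty([0.05, 0.01, -0.02], "GreaterThan", 0.0)
```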

Step 4: Configure parameters

Parameters define what the optimizer can adjust.

Add parameters

  1. Click Add Parameter
  2. Select a parameter from the dropdown (grouped by category)
  3. Set the Initial Value (defaults to model value)
  4. Set the Lower Bound (minimum allowed value)
  5. Set the Upper Bound (maximum allowed value)
Parameters referenced in your experiment as Input["parameter_name"] will automatically appear in the “Experiment Inputs” group. Model parameters appear in their respective groups (Cell, Anode, Cathode, etc.).
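As a sketch, the Input["parameter_name"] references can be collected with a simple regular expression (the real editor uses a proper UCP parser):

```python
import re

def experiment_inputs(ucp_text):
    """Collect parameter names referenced as Input["name"] in a UCP
    protocol (illustrative; Studio parses the protocol properly)."""
    return sorted(set(re.findall(r'Input\["([^"]+)"\]', ucp_text)))

ucp = '''
- CC Charge:
    - Charge:
        mode: Current
        value: Input["I_charge"]
'''
```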

Parameter validation

Before the optimization runs, Ionworks validates that every parameter you add is actually used by at least one objective. Valid parameter uses include:
  • Direct model parameters (e.g., Positive electrode thickness [m])
  • Parameters referenced in experiment steps via Input["parameter_name"]
  • Parameters linked to model parameters through expressions
  • Parameters used in constraint or penalty expressions
  • Initial state-of-charge parameters (e.g., Initial SOC [%], Initial voltage [V])
  • Temperature parameters from the UCP global configuration (e.g., Initial temperature [K], Ambient temperature [K])
If validation fails, you’ll see an error listing:
  • The unused parameters
  • The parameters the model actually uses
  • Suggestions for similar parameter names (in case of typos)
Example error:
The following fit parameters are not used by any objective's model
and are not referenced by any expression that maps to a model
parameter: ['Electrolyte viscosity']. The model uses these parameters:
['Positive electrode thickness [m]', ...]. Remove the unused parameters
or add expressions that link them to model parameters.
If you need a parameter that isn’t directly in the model, you can link it via an expression. For example, define a scale_factor parameter and use it in an expression like "Positive electrode thickness [m]": 100e-6 * scale_factor.
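A sketch of how such an expression could map a fit parameter onto a model parameter (the expression string comes from the example above; the evaluation code is illustrative, not Ionworks'):

```python
def linked_value(expression, fit_params):
    """Evaluate an expression linking fit parameters to a model parameter.
    Illustrative only: eval() on a trusted expression string."""
    return eval(expression, {"__builtins__": {}}, dict(fit_params))

# "Positive electrode thickness [m]" derived from a scale_factor fit parameter.
thickness = linked_value("100e-6 * scale_factor", {"scale_factor": 1.2})
```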

Parameter bounds

Choose bounds that are:
  • Physically realistic
  • Within manufacturing capabilities
  • Wide enough to allow meaningful optimization
  • Narrow enough to avoid unphysical solutions
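A hypothetical pre-submission sanity check capturing these rules might look like:

```python
def check_bounds(name, initial, lower, upper):
    """Reject bound configurations that can never work (illustrative)."""
    if lower >= upper:
        raise ValueError(f"{name}: lower bound must be below upper bound")
    if not lower <= initial <= upper:
        raise ValueError(
            f"{name}: initial value {initial} outside [{lower}, {upper}]")
    return True
```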

Step 5: Configure algorithm

Fine-tune the optimization algorithm for your problem:

Optimizer

Select the optimization algorithm. The default is Differential Evolution, a global optimizer that reliably finds good solutions without requiring hyperparameter tuning.
  • Differential Evolution (default) — A population-based global optimizer that uses adaptive mutation and crossover to explore the search space. Robust on noisy or multimodal problems. Uses 1 multistart by default since it performs global search internally.
  • XNES / CMA-ES — Population-based evolution strategies suited for different problem structures. Use multiple multistarts (default: 4) for broader coverage.
  • PSO — A particle swarm optimizer that explores the search space broadly. Uses 1 multistart by default.
  • Nelder-Mead — A gradient-free simplex method. Fast but may converge to local optima, so multiple multistarts (default: 4) are recommended.
See the Optimization overview for a comparison table.

Async evaluation mode

When distributed workers are available, population-based optimizers automatically use async evaluation mode. In this mode, the optimizer submits candidate evaluations to workers and processes results as they arrive rather than waiting for the full batch to complete. This improves throughput when individual simulations vary in duration. You do not need to configure this manually — async mode is enabled automatically when the infrastructure supports it. See Parallelization for details.
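The idea behind async evaluation can be sketched with Python's concurrent.futures: submit every candidate, then consume results as they complete instead of waiting for the whole batch (a toy model, not the Ionworks worker protocol):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def evaluate(candidate):
    """Stand-in for one simulation; returns a cost."""
    return sum(x * x for x in candidate)

def evaluate_async(candidates, max_workers=4):
    """Process results in completion order so fast simulations are not
    held up by slow ones; return costs in candidate order."""
    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(evaluate, c): i
                   for i, c in enumerate(candidates)}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return [results[i] for i in range(len(candidates))]
```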

Number of multistarts

Controls how many independent optimization runs to perform. The default depends on the optimizer — global optimizers (Differential Evolution, PSO) default to 1, while local optimizers default to 4.
Value   Trade-off
1       Fastest; sufficient for global optimizers
2-4     Good balance for local optimizers
8-16    High reliability, longer runtime
32      Maximum reliability, for critical optimizations
Recommendation: For Differential Evolution (the default), start with 1 multistart. For optimizers that default to 4 (XNES, CMA-ES, Nelder-Mead), start with 4. Increase if results seem inconsistent across runs.
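Multistart itself is simple: run several independent searches from random starting points and keep the best result. A toy version (random restarts around a crude improvement-only random walk, not any of the optimizers listed above):

```python
import random

def multistart(objective, bounds, n_starts=4, iterations=200, seed=0):
    """Run n_starts independent searches; return the best point and cost."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_starts):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        f = objective(x)
        for _ in range(iterations):  # crude improvement-only random walk
            cand = [min(max(xi + rng.gauss(0, 0.1), lo), hi)
                    for xi, (lo, hi) in zip(x, bounds)]
            fc = objective(cand)
            if fc < f:
                x, f = cand, fc
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

x, f = multistart(lambda v: sum((vi - 0.5) ** 2 for vi in v),
                  [(0.0, 1.0), (0.0, 1.0)])
```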

Step 6: Run optimization

  1. Review your configuration in the Form Data Preview section
  2. Click Create Optimization
  3. You’ll be redirected to the optimization detail page where you can monitor progress

During optimization

While the optimization runs, you can monitor:
  • Status: Current phase (pending, processing, completed, failed, or canceled)
  • Progress: Which multistart is running
  • Iteration History: Cost function values over iterations

Optimization phases

  1. Pending: Job queued
  2. Processing: Running multistart optimizations
  3. Combining Results: Finding best result across all starts
  4. Completed: Final results available
Instead of completing, a run can end in one of two terminal states:
  • Failed: An error occurred during processing
  • Canceled: The optimization was canceled by a user before completion

Step 7: Review results

Once completed, review the optimization results:

Optimal parameters

The table shows the optimal values found for each parameter, compared to the initial/baseline values.

Performance comparison

Compare key metrics between:
  • Baseline: Results using initial parameter values
  • Optimized: Results using optimal parameter values

Iteration history

The convergence plot shows how the cost function evolved during optimization. Look for:
  • Incremental reductions (good)
  • Large positive jumps (constraint violations)
  • Long flat regions (optimizer has likely converged)
The optimization algorithms minimize cost, so goals and constraints are reformulated internally to match this convention (for example, a Maximize goal is converted into an equivalent minimization). Convergence is therefore always presented as minimization.
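A minimal sketch of that convention (illustrative; the actual weighting and aggregation details may differ):

```python
def goal_cost(value, action, weight=1.0):
    """Fold a goal into a single minimized cost: Maximize goals contribute
    their negated value, Minimize goals contribute the value itself."""
    signed = -value if action == "Maximize" else value
    return weight * signed

# Minimize charge time (120 s) while maximizing energy (3.5 Wh, weight 2).
total = goal_cost(120.0, "Minimize") + goal_cost(3.5, "Maximize", weight=2.0)
```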

Time series plots

Compare simulation outputs (voltage, current, etc.) between baseline and optimized cases to understand how the optimization improved performance.

Canceling an optimization

You can cancel a running optimization if it is still in the Pending or Processing state.

From the detail page

  1. Navigate to the optimization you want to cancel
  2. Click the Cancel button (visible only while the optimization is active)
  3. Confirm the cancellation in the dialog

From the list page

  1. Navigate to your project’s Optimizations page
  2. Select one or more active optimizations
  3. Click the Cancel bulk action
  4. Confirm the cancellation
Optimizations that have already finished are automatically skipped during bulk cancellation. Once confirmed, the optimization job and all of its child jobs (individual multistart runs) are moved to a Canceled status. Any work that was already completed before cancellation is lost.
Canceling an optimization cannot be undone. You will need to create and run a new optimization if you still need results.

Editing an optimization

After creating an optimization, you can update its name and description at any time. This is useful for keeping your optimization list organized as you iterate on different configurations.

How to edit

You can open the edit dialog from two places:
  • From the optimizations list — click the three-dot menu on any optimization row and select Edit
  • From the optimization detail page — click the Edit button in the header
In the dialog, you can change the optimization’s name and description. Click Save Changes to apply.
Only the name and description can be edited. To change parameters, objectives, constraints, or algorithm configuration, clone the optimization and create a new one with the updated settings.

Cloning an optimization

Click Clone on any completed optimization to create a new optimization pre-filled with the same cell, model, parameters, objectives, and algorithm settings. This is useful when you want to re-run with minor adjustments — for example, narrowing parameter bounds based on initial results or adding a new constraint.

Deleting an optimization

You can permanently delete an optimization and all its results from the optimization detail page.
  1. Navigate to the optimization you want to delete
  2. Click the Delete button
  3. Confirm the deletion in the dialog
This action cannot be undone. The optimization and all its results are permanently removed.

Best Practices

Start simple

Begin with:
  • Fewer parameters (2-4)
  • Wider bounds
  • Lower multistarts (2-4)
  • Fewer iterations (50)
Increase complexity once you understand the problem.

Check constraint satisfaction

If optimal results violate constraints:
  • Increase penalty values
  • Narrow parameter bounds
  • Check that constraints are physically achievable

Validate results

After optimization:
  • Run a simulation with optimal parameters to verify
  • Check that results are physically reasonable
  • Consider manufacturing tolerances

Iterate

Optimization is often iterative:
  1. Run initial optimization
  2. Review results and refine bounds
  3. Add/remove parameters or constraints
  4. Re-run with adjusted configuration

Troubleshooting

Partial Solutions on Solver Failure

When the solver encounters a numerical error mid-simulation (e.g., instabilities for certain parameter combinations), it returns a partial solution containing results up to the point of failure instead of crashing. The objective uses this partial time series directly, so the optimizer still receives a meaningful cost signal from the portion of the simulation that succeeded rather than losing the evaluation entirely, which improves convergence.
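Schematically (a toy model, not the solver's actual interface):

```python
def objective_from_solution(times, values):
    """Score whatever portion of the simulation succeeded: a partial
    time series still yields a usable cost (illustrative; uses a
    Mean metric)."""
    if not values:
        raise RuntimeError("solver produced no output at all")
    return sum(values) / len(values)

# Solver hit a numerical error after t = 20 s; the first three points
# are still scored instead of discarding the evaluation.
partial_cost = objective_from_solution([0.0, 10.0, 20.0], [4.2, 4.0, 3.9])
```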

Optimization Fails

  • Check that parameter bounds are realistic
  • Verify experiment syntax is valid
  • Ensure constraints are achievable
  • Check model compatibility with experiment

Poor convergence

  • Increase max iterations
  • Narrow parameter bounds
  • Simplify the objective function
  • Check for competing constraints

Inconsistent results

  • Increase number of multistarts
  • Check for multiple local optima
  • Review constraint penalties

Next steps