Studies
Studies are focused investigations within a project where you run simulations, compare results, and analyze battery performance. They provide a structured way to organize your experimental work.
What is a Study?
A study represents a specific investigation or analysis within a project. For example:
- Charge Protocol Optimization - Testing different charging rates and strategies
- Temperature Performance Analysis - Evaluating cell behavior across temperature ranges
- Cycle Life Comparison - Comparing degradation under different usage patterns
- Model Validation - Testing model predictions against experimental data and generating validation reports
Study Workflow
1. Create a Study
Studies live inside projects. Navigate to your project and create a new study with a descriptive name that captures what you’re investigating.
2. Run Simulations
Within a study, you can:
- Select a cell and model
- Choose a protocol or experiment template
- Configure parameters (temperature, C-rate, voltages, etc.)
- Run the simulation
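The settings chosen in this step can be pictured as a single simulation definition. A minimal sketch in Python follows; every name here is illustrative and not part of the Ionworks API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SimulationConfig:
    """Illustrative container for the settings chosen in a study run."""
    cell: str             # cell identifier
    model: str            # e.g. "SPM", "DFN"
    protocol: str         # protocol or experiment template name
    temperature_C: float  # ambient temperature
    c_rate: float         # charge/discharge rate
    v_min: float          # lower voltage cut-off
    v_max: float          # upper voltage cut-off

    def validate(self) -> None:
        if self.v_min >= self.v_max:
            raise ValueError("v_min must be below v_max")
        if self.c_rate <= 0:
            raise ValueError("c_rate must be positive")

# Hypothetical configuration for a 1C discharge at 25°C.
config = SimulationConfig(
    cell="NMC-21700", model="DFN", protocol="1C discharge",
    temperature_C=25.0, c_rate=1.0, v_min=2.5, v_max=4.2,
)
config.validate()
```

Grouping the parameters like this also makes it easy to see when two runs are identical, which matters for the simulation reuse described under Key Features.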
3. Analyze Results
Once simulations complete, you can:
- View time-series data (voltage, current, temperature, etc.)
- Compare multiple simulations side-by-side
- Export data for further analysis
- Visualize trends and patterns
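Exported time-series data can be post-processed outside the platform. For example, discharge capacity can be recovered from exported current samples by coulomb counting (trapezoidal integration); the data below is invented for illustration:

```python
def coulomb_count(time_s, current_A):
    """Integrate current over time (trapezoidal rule) to get charge in Ah.

    time_s and current_A are equal-length sequences of samples.
    """
    charge_As = 0.0
    for i in range(1, len(time_s)):
        dt = time_s[i] - time_s[i - 1]
        charge_As += 0.5 * (current_A[i] + current_A[i - 1]) * dt
    return charge_As / 3600.0  # ampere-seconds -> ampere-hours

# Hypothetical export: constant 5 A discharge for one hour, sampled every 15 min.
t = [0, 900, 1800, 2700, 3600]
i = [5.0, 5.0, 5.0, 5.0, 5.0]
print(coulomb_count(t, i))  # 5.0 Ah
```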
4. Validate against experimental data
If you have experimental measurement data uploaded for your cell, you can generate a validation report to compare simulation predictions against real-world results. See Validation report below.
Validation report
The validation report lets you compare simulation results against experimental measurement data directly within a study. This is useful when you want to verify that your model accurately reproduces real-world battery behavior.
When to use validation
Use a validation report when you want to:
- Verify that a parameterized model accurately predicts experimental results
- Quantify the agreement between simulation and measurement using error metrics
- Visually compare simulated and measured voltage, current, or other variables
- Evaluate model performance across different operating conditions (e.g., C-rates, temperatures)
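The error metrics mentioned above are standard definitions. As a point of reference, RMSE and MAE over paired simulated/measured samples can be computed like this (the voltage values are invented):

```python
import math

def rmse(sim, meas):
    """Root mean squared error between paired samples."""
    return math.sqrt(sum((s - m) ** 2 for s, m in zip(sim, meas)) / len(sim))

def mae(sim, meas):
    """Mean absolute error between paired samples."""
    return sum(abs(s - m) for s, m in zip(sim, meas)) / len(sim)

# Illustrative voltage samples at matching timestamps.
simulated = [4.20, 4.05, 3.90, 3.70]
measured  = [4.18, 4.07, 3.88, 3.71]
print(rmse(simulated, measured), mae(simulated, measured))
```

RMSE penalizes large deviations more heavily than MAE, so comparing the two gives a quick sense of whether the model's error is spread evenly or concentrated in a few regions.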
Setting up validation
Run simulations in your study
Run one or more simulations using the cell, model, and protocol that match your experimental conditions. For example, if your experimental data is a 1C constant current discharge at 25°C, run a simulation with the same protocol and conditions.
Map simulations to measurements
Open the validation setup from your study. Select which simulation to compare against which experimental measurement. Each simulation can be paired with a corresponding measurement from your uploaded cell data.
Choose initial conditions (optional)
For each validation row, select how the simulation should be initialized. The default Auto-detect option works well when the measurement starts from a known state; see Initial conditions below for the full set of options.
Your experimental data must be uploaded to Ionworks Studio before you can use it for validation. See the Data Overview for how to upload and manage measurement data.
Initial conditions
When you set up a validation, you can choose how the simulation is initialized for each simulation-to-measurement pair. This controls the starting state of the cell model so that it matches the conditions under which the experimental data was collected.
| Option | Description |
|---|---|
| Auto-detect | The initial condition is inferred from the first data point of the measurement. This is the default and works well when the measurement starts from a known state. |
| Custom voltage | Specify a starting voltage manually. Use this when the auto-detected value doesn’t match the true starting condition, or when you want to override it for consistency. |
| Custom state of charge | Specify a starting state of charge (SOC) as a value between 0 and 1. Use this when you know the SOC at the start of the experiment but the open-circuit voltage relationship makes voltage-based initialization unreliable. |
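To make the voltage/SOC trade-off concrete, here is a sketch of how a custom SOC might be translated into a starting voltage through an open-circuit-voltage (OCV) lookup. The table values are invented and the function is not part of the Ionworks API:

```python
def ocv_from_soc(soc, soc_points, ocv_points):
    """Linearly interpolate an OCV curve at a given state of charge.

    soc_points must be sorted ascending and cover [0, 1].
    """
    if not 0.0 <= soc <= 1.0:
        raise ValueError("SOC must be between 0 and 1")
    for i in range(1, len(soc_points)):
        if soc <= soc_points[i]:
            lo, hi = soc_points[i - 1], soc_points[i]
            frac = (soc - lo) / (hi - lo)
            return ocv_points[i - 1] + frac * (ocv_points[i] - ocv_points[i - 1])
    return ocv_points[-1]

# Invented OCV table for illustration only.
soc_pts = [0.0, 0.25, 0.5, 0.75, 1.0]
ocv_pts = [3.0, 3.5, 3.7, 3.9, 4.2]
print(ocv_from_soc(0.5, soc_pts, ocv_pts))  # ~3.7 V
```

Flat regions of the OCV curve are exactly where the inverse lookup (voltage to SOC) becomes unreliable, which is why the Custom state of charge option is preferable when you know the starting SOC.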
Managing validation rows
Each simulation-to-measurement pair is displayed as a row in the validation setup. You can remove individual rows: clicking the delete button on a row opens a confirmation dialog before the row is removed.
Drive cycle validation
Validation reports support drive cycle protocols in addition to standard experiment templates. This means you can validate your model against real-world driving profiles or other complex time-varying protocols. When you map a simulation that uses a drive cycle protocol to a measurement, the validation report compares the simulated response against the measured data across the full drive cycle, including variable current and power profiles, using the same overlay plots and error metrics.
Reading the report
The validation report provides:
- Overlay plots comparing simulated and measured data for each mapped pair, so you can visually inspect how well the model tracks the experiment. Grid lines are included for easier reading.
- Error metrics that quantify the agreement between simulation and measurement, such as root mean squared error (RMSE) and mean absolute error (MAE)
- Per-simulation breakdowns so you can identify which operating conditions the model handles well and where it diverges
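Overlay plots and error metrics both require that simulated and measured samples share a time base, and solvers rarely emit samples at exactly the measurement timestamps. One common approach (a sketch, not necessarily what the report does internally) is to linearly interpolate the simulated series onto the measured timestamps before comparing:

```python
def resample(sim_t, sim_v, meas_t):
    """Linearly interpolate (sim_t, sim_v) at each time in meas_t.

    sim_t must be sorted ascending; times outside its range are
    clamped to the first/last simulated value.
    """
    out = []
    for t in meas_t:
        if t <= sim_t[0]:
            out.append(sim_v[0])
        elif t >= sim_t[-1]:
            out.append(sim_v[-1])
        else:
            # Find the bracketing pair of simulated samples.
            i = next(k for k in range(1, len(sim_t)) if sim_t[k] >= t)
            frac = (t - sim_t[i - 1]) / (sim_t[i] - sim_t[i - 1])
            out.append(sim_v[i - 1] + frac * (sim_v[i] - sim_v[i - 1]))
    return out

# Invented data: simulated voltage at 0/2/4 s, measurement sampled at 1 and 3 s.
sim_t, sim_v = [0.0, 2.0, 4.0], [4.2, 4.0, 3.8]
print(resample(sim_t, sim_v, [1.0, 3.0]))  # ~[4.1, 3.9]
```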
Key Features
Smart Simulation Reuse
Before running a new simulation, the system checks if an identical one has already been completed. If found, it reuses the existing results instantly, saving time and computational resources.
Non-Destructive Associations
Studies only provide “views” into simulation results. This means:
- A simulation can be associated with multiple studies
- Removing a simulation from a study doesn’t delete the results
- You can reorganize your work without losing data
Collaborative Research
All studies within a project are accessible to organization members. This enables:
- Team collaboration on investigations
- Sharing of results across teams
- Consistent methodology across experiments
Organization Hierarchy
Best Practices
Name Studies Descriptively
Use clear, descriptive names that indicate what you’re investigating:
- ✅ “Fast Charging Safety Analysis - 25°C”
- ✅ “Model Validation vs Arbin Data”
- ❌ “Study 1”
- ❌ “Test”
Group Related Simulations
Keep related simulations together in the same study. For example, if you’re testing different C-rates at 25°C, run all those variations in one study for easy comparison.
Use Multiple Studies for Different Conditions
Create separate studies when investigating substantially different conditions:
- One study for 25°C performance
- Another study for 45°C performance
- A third study for low-temperature behavior
Next Steps
- Learn about Simulations
- Explore Protocols and built-in templates
- Understand Projects and Studies hierarchy
- Upload experimental data for use in validation — see Uploading data