Methods

Optimization

Optimization translates your experimental data into actionable process settings. Whether you are adjusting manufacturing parameters to maximize yield, balancing competing quality metrics, or identifying the exact operational range for stable production, this family identifies the specific input combinations that meet your performance requirements.

Optimization uses the mathematical models generated by Response Surface Methodology to navigate the landscape of your process data. When you ask Lattice to improve an outcome, our three-stage architecture engages: first, the LLM analyzes your request to identify which constraints and objectives are being targeted. Second, our deterministic engine executes the Derringer-Suich desirability algorithms to search the model’s coordinate space, using multi-start numerical methods to find the global optimum rather than a local trap. Finally, the LLM interprets the result, presenting the exact set-points and expected outcomes in plain language, and alerts you if the underlying model lacks the precision to support a confident recommendation.

When to choose this family

How optimization functions

Optimization works by mapping your process outcomes to a 'desirability' scale. This scale converts raw performance numbers—like percentage yield or parts-per-million defects—into a standardized score ranging from 0 (unacceptable) to 1 (ideal).
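As a rough sketch of how a single response is mapped onto that 0-to-1 scale, here is one common form of the Derringer-Suich transform for a "larger-is-better" response. The function name and the specific ramp shape are illustrative assumptions, not Lattice's internal implementation:

```python
def desirability_larger_is_better(y, low, high, s=1.0):
    """Derringer-Suich one-sided desirability for a larger-is-better
    response: 0 below `low` (unacceptable), 1 above `high` (ideal),
    and a power ramp with shape exponent `s` in between."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** s

# Example: a yield of 85% when anything under 70% is unacceptable
# and 95% or more is ideal scores 0.6 on the desirability scale.
d = desirability_larger_is_better(85.0, low=70.0, high=95.0)
```

Analogous "smaller-is-better" and "target-is-best" transforms handle responses like defect rates or dimensions that must hit a nominal value.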

By applying these scores, the system can mathematically aggregate multiple objectives. It searches the available factor space to find a single point—or a range of points—that yields the highest combined score, effectively turning complex multi-variable decisions into a specific set of operational instructions.
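The standard way to aggregate individual scores is a geometric mean, which has the useful property that any single unacceptable response drags the overall score to zero. A minimal sketch (again an illustration of the technique, not Lattice's exact aggregation code):

```python
import math

def overall_desirability(scores):
    """Combine individual desirability scores with a geometric mean.
    A single score of 0 (an unacceptable response) zeroes the total,
    so no objective can be sacrificed entirely."""
    if any(s == 0.0 for s in scores):
        return 0.0
    return math.exp(sum(math.log(s) for s in scores) / len(scores))

# Yield desirability 0.8 and purity desirability 0.5 combine to
# sqrt(0.8 * 0.5), roughly 0.632.
D = overall_desirability([0.8, 0.5])
```

The optimizer then searches the factor space for the settings that maximize this combined score.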

Optimization vs. simple predictive modeling

While predictive modeling simply tells you what might happen if you change an input, optimization takes an active role. It doesn't just show you the potential outcomes; it identifies the best possible input coordinates given your defined constraints.

Unlike standard regression that stops at showing a trend line or a contour plot, optimization tools actively navigate the response surface to return a concrete solution. It resolves the ambiguity of 'what if' by providing a direct answer to 'what should I do'.
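The multi-start search described above can be sketched in a few lines: run a local optimizer from several random starting points and keep the best result. The toy response surface and function names below are assumptions for illustration; this is the general technique, not Lattice's engine:

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for a fitted RSM model: a quadratic surface peaking
# near (2, -1) with a smaller secondary peak near (-2, 2).
def predicted_yield(x):
    return (80 - 0.5 * (x[0] - 2) ** 2 - 0.5 * (x[1] + 1) ** 2
            + 10 * np.exp(-2 * ((x[0] + 2) ** 2 + (x[1] - 2) ** 2)))

def multi_start_maximize(f, bounds, n_starts=20, seed=0):
    """Launch a bounded local optimizer from several random starting
    points and keep the best result, reducing the risk of returning
    a local trap instead of the true peak."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_starts):
        x0 = [rng.uniform(lo, hi) for lo, hi in bounds]
        res = minimize(lambda x: -f(x), x0, bounds=bounds)
        if best is None or res.fun < best.fun:
            best = res
    return best.x, -best.fun

x_opt, y_opt = multi_start_maximize(predicted_yield, [(-5, 5), (-5, 5)])
```

A single-start optimizer launched near (-2, 2) would settle on the lower secondary peak; the multi-start loop finds the higher peak near (2, -1).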

Common mistakes to avoid

A common error is requesting an optimal point based on a low-confidence model. If the R-squared value of your underlying data is low, the optimization might point to a setting that looks perfect on paper but fails in reality.

Another mistake is neglecting to account for process noise. Always look for a 'process window' rather than a single 'perfect point' if your equipment has any margin of error, as a point estimate in a highly sensitive region may be difficult to maintain in a real-world production environment.
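One simple way to find such a window is to scan a factor range and keep the interval where desirability stays above a robustness threshold. The sketch below assumes a one-dimensional factor and a contiguous window, purely for illustration:

```python
import numpy as np

def process_window(f, bounds, threshold=0.8, resolution=50):
    """Grid-scan a 1-D factor range and return the (min, max) of the
    settings where desirability `f` stays at or above `threshold`,
    i.e. a robust operating window instead of a single point.
    Assumes the acceptable region is one contiguous interval."""
    xs = np.linspace(bounds[0], bounds[1], resolution)
    ok = [x for x in xs if f(x) >= threshold]
    return (min(ok), max(ok)) if ok else None

# Toy desirability curve peaking at x = 3: the window where the
# score stays above 0.8 spans roughly 1.6 to 4.4.
window = process_window(lambda x: max(0.0, 1 - 0.1 * (x - 3) ** 2),
                        bounds=(0, 6), threshold=0.8)
```

If your equipment can only hold a setting to within, say, ±0.5 units, a window narrower than that tolerance is a warning sign even when its center scores well.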

Frequently asked questions

What happens if the model suggests an optimal point that seems unrealistic?
Lattice provides a confidence warning if the underlying statistical model has low predictive power. We suggest reviewing the R-squared metric displayed with the output and considering whether your current dataset captures enough variation to justify the suggested setting.
Can I prioritize one objective over another?
Yes. When setting up multi-objective tasks, you can assign weights to your targets. A higher weight forces the system to prioritize that specific metric, effectively allowing you to tell the system, for example, that hitting a specific purity target is more important than achieving maximum yield.
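A common way weighting is realized mathematically is a weighted geometric mean, where a larger exponent makes the overall score more sensitive to that objective. This is an illustrative sketch of the technique, not necessarily Lattice's exact weighting scheme:

```python
import math

def weighted_desirability(scores, weights):
    """Weighted geometric mean of desirability scores: each score is
    raised to its normalized weight, so a heavily weighted objective
    dominates the combined score."""
    total = sum(weights)
    return math.prod(s ** (w / total) for s, w in zip(scores, weights))

# Purity (0.6) weighted 3x over yield (0.9): the low purity score
# pulls the combined result down to about 0.66, whereas equal
# weights would give about 0.73.
D = weighted_desirability([0.9, 0.6], weights=[1, 3])
```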

Methods in this family