The Cloud Lets Engineers Access Powerful Multiphysics Solvers



April 29, 2021 Ken Strandberg

Digitally prototyping complex designs, such as large physical structures, biological features, and micro-electromechanical systems (MEMS), requires supercomputers running sophisticated multiphysics solvers.

Many physical phenomena – electrical, thermal, mechanical, material, and others – must be simultaneously simulated (possibly with thousands to millions of degrees of freedom) in 3D. Accurate digital prototypes reduce and sometimes eliminate the costs of building physical prototypes. Additionally, highly detailed studies with multi-parametric sweeps across many design options help engineers optimize new, complex designs quickly. Historically, only large-budget projects could afford these costly analyses. Now, engineers can perform them with powerful Software-as-a-Service (SaaS) offerings.

“Our multiphysics solvers describe the physical universe digitally,” explained Ian Campbell, CEO of OnScale, which offers a SaaS engineering simulation platform. “We focus on multi-dimensional problems that are very challenging in engineering. Our solvers simultaneously look at design problems where there are often combinations of interacting systems. We also focus on problems that are difficult or impossible to solve with existing desktop technologies. That usually leads us to very large problems in the high-tech space.”

OnScale’s solvers were originally developed by engineering firm Weidlinger Associates (now part of Thornton Tomasetti). The solvers were designed to run on massively parallel computing systems using the MPI library. OnScale adapted the solvers to run on Google Cloud C2 and M2 instances with Intel Xeon Scalable processors, optimizing them to scale across thousands of nodes. OnScale customer projects vary across domains and industries, including life sciences, material testing, and energy.


With OnScale’s code optimizations and Google’s instance advances, compute capacity for customer projects can be easily scaled to any workload demand. According to OnScale, this has delivered as much as a 2X speedup per unit of cost (measured in core hours). In one OnScale mechanical simulation with 2 million degrees of freedom (DoFs), for about four times the computation cost, runtime shrank from about 11 minutes to 1.5 minutes, roughly an 8X speedup.

An approximate 4X increase in cost (core hours) results in an approximately 8X faster solution. (Courtesy of OnScale.)
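The trade-off described above can be checked with simple arithmetic. A minimal sketch using the figures quoted in the article (11 minutes versus 1.5 minutes at roughly 4X the core-hour cost):

```python
# Numbers quoted in the article: baseline run vs. scaled-out run.
baseline_minutes = 11.0
scaled_minutes = 1.5
cost_multiplier = 4.0  # roughly 4x the core-hours

speedup = baseline_minutes / scaled_minutes   # ~7.3x, i.e. "about 8X"
speedup_per_cost = speedup / cost_multiplier  # ~1.8x, i.e. "as much as 2X"
print(f"speedup: {speedup:.1f}x, speedup per unit cost: {speedup_per_cost:.1f}x")
```

This is why the figure caption pairs a 4X cost increase with an 8X runtime improvement: the speedup per unit of cost is roughly 2X.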

With a cloud solution, engineers can set up a parametric sweep, simulate many different cases, create a massive simulated dataset, and then train an algorithm without ever moving the data from the cloud to their own workstations. Many of OnScale’s customers use their simulated data to train algorithms that enhance and accelerate the design process. Campbell calls it SimAI.
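The sweep-then-train workflow can be sketched in a few lines. This is a toy illustration, not OnScale’s API: `simulate` is a stand-in for a cloud solver call, and the parameter names (`freq_mhz`, `thickness_um`) are invented for the example.

```python
import itertools
import random

def simulate(freq_mhz, thickness_um):
    # Stand-in for a cloud solver run; returns a toy "response" value.
    return 0.8 * freq_mhz - 0.01 * thickness_um + random.gauss(0, 0.01)

# Parametric sweep: every parameter combination becomes one training example.
freqs = [1.0, 2.0, 5.0, 10.0]
thicknesses = [100, 200, 400]
dataset = [((f, t), simulate(f, t))
           for f, t in itertools.product(freqs, thicknesses)]
print(len(dataset))  # 12 simulated samples, ready for model training
```

In practice each `simulate` call would be a full multiphysics job, and the resulting dataset would feed a training pipeline running in the same cloud.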

“Training algorithms with data from stress-tested physical prototypes, such as sensors, is a very onerous, time-consuming, and extremely expensive process,” added Campbell. “Instead, customers create digital prototypes of not only the device, but the entire system, and then subject it to shock, thermal conditions, vibration, and other physical phenomena. They use that simulation data to train an embedded algorithm. When they build their first physical prototype, they integrate the algorithm into the embedded system, such as a smartphone, and it just works the first time.”


“We’ve benchmarked many Google Cloud configurations based on Intel Xeon processors,” said Campbell, “so we know how our software scales on their hardware. We created a machine learning engine and trained our models on a half-million simulations. That means when an engineer sets up a new simulation study, we can create the best cloud configuration based on his or her needs, optimizing for accuracy, cost, and runtime.”

Engineers choose the priorities of their study across three parameters: accuracy, budget, and time to solution. The service returns an estimate of accuracy, cost, and runtime, and the customer can then optimize the job to their business and engineering needs. At runtime, OnScale’s Cloud Orchestrator builds the best supercomputer configuration for the customer. If more capacity is needed mid-run, the job can be paused, a new instance built, and the job restarted where it left off – all transparent to the engineer.
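The selection step described above can be sketched as a simple constrained search. This is a hypothetical illustration of the idea, not OnScale’s actual orchestrator; the candidate configs and their predicted metrics are invented numbers.

```python
# Hypothetical candidate cluster configs with ML-predicted metrics.
candidates = [
    {"name": "small",  "accuracy": 0.95, "cost_usd": 12,  "runtime_min": 110},
    {"name": "medium", "accuracy": 0.97, "cost_usd": 35,  "runtime_min": 40},
    {"name": "large",  "accuracy": 0.98, "cost_usd": 120, "runtime_min": 12},
]

def pick(candidates, max_cost, max_runtime):
    """Return the most accurate config within the budget and time limits."""
    feasible = [c for c in candidates
                if c["cost_usd"] <= max_cost and c["runtime_min"] <= max_runtime]
    return max(feasible, key=lambda c: c["accuracy"]) if feasible else None

best = pick(candidates, max_cost=50, max_runtime=60)
print(best["name"])  # "medium": best accuracy among configs meeting both limits
```

A real orchestrator would draw its predictions from the trained model and choose among far more configurations, but the shape of the decision is the same: filter by constraints, then optimize the remaining objective.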

The system keeps learning, and OnScale continues to enhance its solution. After a simulation study completes, metadata about platform performance is fed back to the machine learning engine to further tune OnScale’s ability to accurately predict the best cluster configurations for future simulations.


Polytec provides non-destructive testing (NDT) services and equipment for industry and research. The company’s engineers utilize OnScale tools to compare the simulation results of a digital prototype to measurements from a physical prototype.

“Engineers doing modeling understand exactly how their structure works,” explained Jerome Eichenberger from Polytec. “But there are a lot of unknowns when it comes to predicting behavior of structures, especially critical structures. There are many unknowns with boundary conditions, dimensional tolerances, material homogeneity, and structures that behave in a linear and nonlinear manner. Customers look to us to validate their simulations.”

Polytec used their NDT technologies to experimentally compare the accuracy of a simulation against a physical design. Polytec engineers drilled holes in a stainless steel test block and mounted a transducer on top. The transducer injected high-frequency elastic waves into the material, and Polytec instruments captured reflections from the holes. When the simulation results were correlated with the physical measurements, the simulation accurately predicted the experimental data captured by Polytec.

Comparing the results of a digital prototype with a physical prototype of a stainless steel test block. (Courtesy of Polytec.)
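The pulse-echo principle behind this test reduces to a time-of-flight calculation: the echo’s round-trip delay and the material’s wave speed give the reflector’s depth. A minimal sketch with a typical longitudinal wave speed for stainless steel (~5,900 m/s) and an invented echo delay:

```python
# Pulse-echo time-of-flight: reflector depth from echo delay.
v = 5900.0            # m/s, typical longitudinal wave speed in stainless steel
echo_delay = 6.8e-6   # s, hypothetical measured round-trip delay

# Divide by 2 because the wave travels to the reflector and back.
depth_mm = v * echo_delay / 2 * 1000
print(f"{depth_mm:.1f} mm")  # 20.1 mm
```

Comparing delays like this between the simulated and measured waveforms is one way the digital and physical prototypes can be correlated.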


Elasticity and other structural properties of many human tissues determine their functions. The eye’s structural mechanics affect how it collects and transfers visual information to the brain. Ophthalmologists measure elasticity changes to diagnose eye disease or monitor treatments. However, quantitative methods to measure the eye’s mechanical properties are very limited today.

Researchers at the University of Washington use OnScale to design breakthrough eye-imaging technology based on optical coherence elastography (OCE). This non-contact, noninvasive method lets clinicians quantify and detect changes in corneal elasticity and intraocular pressure that were not measurable before. Their technology measures the mechanical properties of the cornea by tracking mechanical waves propagating across it. The cornea is excited through air by an air-coupled acoustic transducer. Using optical coherence tomography (OCT), the team then images the propagating mechanical waves, which allows corneal elasticity to be mapped. Of particular importance is the propagation of shear waves, which can be imaged in three dimensions and thus provide a noninvasive map of the tissue’s mechanical properties.
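The link between shear-wave speed and elasticity is direct: for a homogeneous, isotropic, nearly incompressible medium, the shear modulus is μ = ρc², where ρ is density and c is the measured shear-wave speed. A sketch with illustrative numbers (soft tissue density near that of water; the wave speed is a hypothetical value, not a measurement from the UW study):

```python
# Estimate tissue stiffness from measured shear-wave speed: mu = rho * c^2.
# Assumes a homogeneous, isotropic, nearly incompressible medium.
rho = 1000.0   # kg/m^3, soft tissue is close to water density
c_s = 2.0      # m/s, hypothetical corneal shear-wave speed

mu_kpa = rho * c_s**2 / 1000   # shear modulus in kPa
E_kpa = 3 * mu_kpa             # Young's modulus, incompressible approximation E = 3*mu
print(mu_kpa, E_kpa)           # 4.0 12.0
```

Mapping c point by point across the cornea therefore yields a map of its elasticity, which is what the OCE imaging provides.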

The UW team’s model for measuring shear waves across the cornea

The research team used OnScale to model excitation and propagation of shear waves in a digital prototype of the human eye, in which the cornea presents unique challenges. With OnScale, researchers were able to construct a reliable two-dimensional finite element model that closely mirrors their experimental OCE system and measurements. Their research can eventually apply to other tissues where elasticity measurement is important for diagnosis and treatment.

UW team’s simulated and experimental results


Vestas designs and manufactures wind turbine solutions for the energy industry. A major component is the very large blade (up to 80 meters in length), made from multiple composite materials such as plastics, carbon fiber, and resin. The blades are subject to many stresses across their length, especially at the mount. Vestas developed various testing methods, such as ultrasound testing, to validate the design and quality of manufacturing and to look for anomalies. But ultrasound testing of composites is not as well understood as it is for other materials.

“There are many complicated phenomena when using ultrasound on composites,” explained Jason Hawkins, a test engineer at Vestas. “Depending on which direction the wave is moving, you have different velocities, and that can be very complicated to read on your screen. You can improve the detection process if you can actually simulate it. You can see what processes, such as reflected primary and secondary waves, are resulting in what you’re measuring. We couldn’t do that before.”
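The direction-dependent velocities Hawkins describes come from the anisotropy of the laminate: stiffness along the fibers is far higher than across them, and wave speed scales with the square root of stiffness. A rough sketch using the one-dimensional rod approximation v = √(E/ρ), with illustrative (not Vestas-specific) moduli for a unidirectional carbon-fiber laminate:

```python
import math

# Direction-dependent wave speeds in a unidirectional carbon-fiber laminate,
# via the rod approximation v = sqrt(E / rho). Moduli are illustrative values.
rho = 1600.0      # kg/m^3, typical CFRP density
E_along = 135e9   # Pa, stiffness along the fibers
E_across = 10e9   # Pa, stiffness across the fibers

v_along = math.sqrt(E_along / rho)    # ~9200 m/s
v_across = math.sqrt(E_across / rho)  # 2500 m/s
print(round(v_along), round(v_across))
```

A factor of roughly 3.5X between the two directions is why echoes from the same feature can arrive at very different times depending on wave path, and why simulating the wave field makes the measurements far easier to interpret.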

Having the capability to run numerical simulations in the cloud gives engineers new insight into how the material responds to ultrasound excitation for NDT. With this new knowledge, they can create inspections that were not possible before and improve design and manufacturing.


OnScale continues to evolve its cloud software, exploring the use of emerging hardware to boost the performance of its established solvers.

“We use what we call flat buffers,” explained Campbell, “which gives us the ability to have multiple different solvers or pieces of software that can look at the same memory. We don’t need complex memory management schemes. That makes our software run incredibly efficiently. And that becomes more effective with technologies like Intel Optane persistent memory.”
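The core idea is that multiple components read and write one flat region of memory rather than exchanging copies. As a loose analogy (not OnScale’s implementation), Python’s `memoryview` shows two independent views over the same underlying buffer, where a write through one is immediately visible through the other:

```python
# Two independent views over one flat buffer: writes through one view are
# immediately visible through the other, with no copying or translation layer.
buf = bytearray(16)
solver_view = memoryview(buf)
visualizer_view = memoryview(buf)

solver_view[0] = 42
print(visualizer_view[0])  # 42: both views alias the same memory
```

Scaled up, the same aliasing idea is what lets a solver and a visualizer work on one dataset without serializing it between them.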

“Google large memory instances will allow us to move seamlessly from running a massive simulation on a large cluster to visualization of simulation results on a much smaller machine,” he said. “With persistent memory capacity, we won’t have to first move data from memory to storage and then reload it back to memory in a new container for visualization. That can save a significant amount of time and reduce costs.”

In the meantime, OnScale cloud-based multiphysics solvers give more researchers and engineers access to powerful digital prototyping capabilities to accelerate their work.

Ken Strandberg is a technical storyteller. He writes articles, white papers, seminars, web-based training, video and animation scripts, and technical marketing and interactive collateral for emerging technology companies, Fortune 100 enterprises, and multi-national corporations. Strandberg’s technology areas include software, HPC, industrial technologies, design automation, networking, medical technologies, semiconductors, and telecom. He can be reached at
