# Building Better Engines with AI

### Deep Learning Speeds Up Simulations for Cleaner Combustion

David Schmidt is protecting the environment, but not in the way he first intended. In engineering graduate school, his interest was nuclear fusion. A persuasive Ph.D. advisor guided him toward the physics of fuel injection, a process central to both inertial confinement fusion reactors and internal combustion engines, the advisor’s other line of research. With practical fusion reactors still decades away, Schmidt veered toward engines.

While electric cars may seem to be taking over, internal combustion engines (ICEs) will remain on the roads, seas, and tarmacs for decades to come. Schmidt’s work makes them cleaner and more efficient. “A new problem, even if you’re not initially that interested in it, after a while gets its hooks into you,” he says. “And pretty soon, you start thinking about it day and night.”

A mechanical and industrial engineering professor at the University of Massachusetts at Amherst, Schmidt calls fuel injection “very extreme.” Fuel goes through holes the width of a human hair “and comes screaming out at least Mach one, with enough momentum to punch a hole through your hand.” This makes studying physical engines expensive and a bit dangerous. Simulations have improved our ability to understand what happens inside them.

Running high-fidelity simulations of engines can still take a week on an expensive computing cluster. So, over the past two years, Schmidt has led ICEnet, a consortium based at UMass Amherst that accelerates the process with neural networks. His lab made significant progress in developing tools to study both turbulence and combustion. Using machine learning, he says, they’re getting “a better, more accurate answer than we deserve for the amount of compute power we’re putting in.”

### Plug It In

The consortium consists of Schmidt’s lab at UMass; Siemens, AVL, and Convergent Science, makers of engine simulation software; Cummins, maker of engines; NVIDIA, maker of graphics processing units; and MathWorks, maker of MATLAB®.

Siemens, AVL, and Convergent Science design simulation software for clients, including Cummins. The programs, called computational fluid dynamics (CFD) solvers, represent an entire engine or an individual engine component by dividing it into a million or more tiny cells. Within each cell, at each time step, the simulation combines the effects of myriad factors to decide the cell’s current state—its temperature, pressure, and so on. The factors determining this state come from a basic accounting of mass, momentum, and energy conservation, plus plug-in software modules. ICEnet is developing two modules: one calculates the physics of turbulence, and another calculates combustion chemistry.
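The plug-in idea can be sketched in a few lines. The following is a hypothetical illustration in Python (real solvers are written in C++, and these module names and state variables are invented for the example): a core conservation update stays untouched, while swappable physics modules contribute their corrections per cell, per time step.

```python
# Hypothetical sketch of a CFD solver's plug-in interface.
# The names and physics here are illustrative, not from any real solver.

def core_update(cell, dt):
    # Stand-in for the basic accounting of mass, momentum, and energy.
    return {k: v for k, v in cell.items()}

def turbulence_module(cell, dt):
    # Swappable physics module: returns a small corrective change.
    return {"velocity": 0.01 * dt * cell["velocity"]}

def combustion_module(cell, dt):
    return {"temperature": 5.0 * dt}

def step(cells, modules, dt):
    new_cells = []
    for cell in cells:
        state = core_update(cell, dt)
        for module in modules:          # modules can be swapped freely
            for key, delta in module(cell, dt).items():
                state[key] += delta
        new_cells.append(state)
    return new_cells

cells = [{"velocity": 100.0, "temperature": 900.0}]
cells = step(cells, [turbulence_module, combustion_module], dt=1e-4)
```

Swapping in a different turbulence model means passing a different function to `step`; the core solver code base never changes.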

The beauty of the plug-in system is that you don’t have to generate a new CFD solver, which can consist of millions of lines of code representing piston motion, sparkplugs, and so on. “You don’t want to change that existing code base,” Schmidt says. “So, it’s designed to let you easily swap modules.”


ICEnet uses a collection of CFD solvers called OpenFOAM, which is open source. But their modules will just as easily plug into other CFD solvers, like those used by Siemens, AVL, and Convergent Science. The three companies likely won’t use the modules exactly as written but will adapt them to suit their own needs.

To develop the modules, Schmidt is using MATLAB® for several reasons. According to Peetak Mitra, a graduate student in Schmidt’s lab, it’s user-friendly, clients are familiar with it, it’s bug-free, MathWorks provides support, and it generates C++ code, the language of CFD solvers, better than other machine learning frameworks such as PyTorch.

The project involved not just developing new algorithms but making them reliable enough for everyday use. “Typically, in academia, we get something to work once or twice and declare victory,” Schmidt says. “And it’s software that’s held together by chewing gum and string.”

“What ICEnet built,” says Shounak Mitra, the deep learning product manager at MathWorks, “was an industry-grade end-to-end workflow that users can tune to suit their specific CFD workflows in this domain.”

### All Mixed Up

Schmidt likes to quote a poem by the mathematician and meteorologist Lewis Fry Richardson: “Big whirls have little whirls that feed on their velocity, / and little whirls have lesser whirls and so on to viscosity.” Turbulence—the mixing of fluid, gas, or plasma—occurs on many scales. And what happens on one scale influences what happens on the others. Accurate simulations of turbulence are so slow because they need to capture the tiniest dynamics. ICEnet speeds up the process by estimating them.

What happens in each cell is calculated using a formula with inputs related to density, pressure, temperature, velocity, and strain. The output is the gas velocity. A simulation might have a million cells, so that’s a million of these calculations per time step (which lasts anywhere from a thousandth to a millionth of a second). ICEnet reduces the number of cells by a factor of two or more, then recovers the lost fine-grained information using fast neural networks.

The team first runs high-resolution simulations and trains a neural network to estimate the behavior of the simulation. Then, in the low-resolution simulation, they add to the formula a term calculated by the net, called a *learned correction*. This correction recovers most of the information lost by reducing the resolution. Other researchers completely replace simulations with trained networks, but Schmidt finds that that approach reduces accuracy too much. He needs the traditional formulas to enforce conservation of mass, momentum, and energy—like balancing a checkbook. “So, what we’ve done is we’ve married the existing simulation technology to machine learning.”
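A toy version of this marriage might look like the sketch below (purely illustrative: the one-dimensional grid, the diffusion stand-in for the resolved physics, and the linear stencil standing in for a trained network are all invented for the example). The traditional update does the conservation accounting; the learned term only nudges the coarse result toward the fine-grid answer.

```python
import math

# Toy illustration (not the ICEnet code) of a learned correction:
# a coarse-grid update plus a trained term that recovers lost detail.

def coarse_update(u, dt, nu=0.1):
    # Stand-in for the resolved physics on the coarse grid:
    # simple diffusion between neighboring cells (periodic boundary).
    n = len(u)
    return [u[i] + dt * nu * (u[(i - 1) % n] - 2 * u[i] + u[(i + 1) % n])
            for i in range(n)]

def learned_correction(u, weights):
    # Stand-in for a tiny trained network: a learned linear stencil
    # applied to each cell's neighborhood.
    n = len(u)
    return [sum(w * u[(i + j - 1) % n] for j, w in enumerate(weights))
            for i in range(n)]

def step(u, weights, dt=1e-3):
    base = coarse_update(u, dt)            # enforces the conservation accounting
    corr = learned_correction(u, weights)  # recovers sub-grid detail
    return [b + dt * c for b, c in zip(base, corr)]

u = [math.sin(2 * math.pi * i / 8) for i in range(8)]
u_next = step(u, weights=[0.05, -0.1, 0.05])
```

Note that the correction weights here sum to zero, so the toy correction redistributes the quantity between cells without creating or destroying it, mimicking the checkbook-balancing role of the traditional formulas.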

In initial studies, the system didn’t always accelerate simulations. They’d reduced a million cells to a few hundred thousand but still had to run the network once per cell at each of a million time steps, a total of a hundred billion runs. So Peetak Mitra found a way to simplify the net. First, he used AutoML, in which software explores different neural network architectures, finding ones that are both accurate and efficient. Because not all consortium partners are experts in machine learning, “What we wanted to do was lower that barrier,” Peetak Mitra says. “We built this one-click process where you hit Run in MATLAB, and it automatically designs the network based on your data.”
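The essence of such a search can be shown with a minimal sketch (this is a generic random architecture search, not the AutoML tooling ICEnet used, and the scoring function is invented): propose candidate architectures, score each on a proxy that rewards accuracy and penalizes cost, and keep the best trade-off.

```python
import random

# Illustrative random architecture search: the proxy_score function
# is a made-up stand-in for "train briefly and measure the error."

random.seed(0)

def proxy_score(width, depth):
    accuracy = 1.0 - 1.0 / (width * depth)  # bigger nets fit better...
    cost = width * depth                    # ...but run slower
    return accuracy - 1e-4 * cost           # penalize expensive networks

candidates = [(random.choice([8, 16, 32, 64, 128]),
               random.choice([2, 3, 4, 5])) for _ in range(20)]
best = max(candidates, key=lambda arch: proxy_score(*arch))
width, depth = best
```

Real AutoML systems are far more sophisticated about how they propose and evaluate candidates, but the accuracy-versus-cost trade-off they navigate is the same.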

Then Peetak Mitra devised a new way to prune networks, removing unimportant nodes and connections. Pruning reduced network size by 90%, making it 10 times faster, while simultaneously increasing accuracy. That’s because large networks can fit almost any pattern in the data, which makes them good at generalizing to many scenarios, but it also lets them learn from the noise. If you’re applying machine learning to a regular environment, such as similar types of cylinders, you can afford to shrink the network, thereby filtering out the noise.
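The article doesn’t detail Mitra’s exact method, but magnitude pruning, one common approach, illustrates the idea: zero out the fraction of weights with the smallest absolute values and keep the rest.

```python
# Illustrative magnitude pruning (not necessarily ICEnet's method):
# drop the 90% of weights with the smallest absolute values.

def prune(weights, fraction=0.9):
    k = int(len(weights) * fraction)  # how many weights to drop
    threshold = sorted(abs(w) for w in weights)[k - 1] if k else 0.0
    return [w if abs(w) > threshold else 0.0 for w in weights]

weights = [0.9, -0.05, 0.02, -1.2, 0.01, 0.4, -0.03, 0.002, 0.6, -0.08]
pruned = prune(weights, fraction=0.9)  # only the largest weight survives
```

In practice the surviving network is then retrained briefly so the remaining weights compensate for the removed ones.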

The team also used a process called quantization, which reduces the excessive precision of the network’s values. Looking at these ways of shrinking the network, Shounak Mitra says, “NVIDIA applauded them.”
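A minimal sketch of uniform quantization (illustrative; production toolchains do this per layer, with calibration data) maps floating-point weights to 8-bit integers and back, trading a little precision for a much smaller, faster network:

```python
# Simple uniform 8-bit quantization sketch: map float weights onto
# integers in [-127, 127], storing one shared scale factor.

def quantize(weights, bits=8):
    qmax = 2 ** (bits - 1) - 1                # e.g., 127 for 8 bits
    scale = max(abs(w) for w in weights) / qmax
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    return [qi * scale for qi in q]

weights = [0.731, -0.052, 0.004, -1.270, 0.618]
q, scale = quantize(weights)
restored = dequantize(q, scale)
# Each restored weight is within half a quantization step of the original.
```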

Challenges remain. When a network is coupled to the rest of a CFD solver, training it and optimizing its performance takes a while. “Imagine trying to learn to play soccer, and you can kick the ball only once every few hours,” Schmidt says. They’re currently working on shortcuts to speed up the process.

### Highly Combustible

Schmidt’s team has also found innovative ways to use deep learning to study combustion. Just as in turbulence modeling, a standard system applies formulas to the cells to decide their states at each time step. But here, there’s a whole system of differential equations for each cell, balancing convection, temperature, pressure, and dozens of chemicals and hundreds of chemical reactions.


Once again, they run high-fidelity simulations using these equations and train neural networks to quickly approximate what happens in a cell. But in this case, they then completely replace the chemical reactions with trained networks. For this, Majid Haghshenas, a postdoc in Schmidt’s lab, has developed a novel approach: they do not use the same trained network in every cell, nor a different one for each cell. Instead, they cluster the trained networks into about 40 groups and create one representative network per cluster. Each cell’s system of equations is then replaced by whichever of these 40 or so networks best matches the cell’s inputs. Using 40 different networks instead of a hundred thousand reduces the size of the overall system, and it is more accurate than using a hundred thousand copies of a single network.
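The routing step can be sketched as follows (a toy illustration: the centroids, state variables, and “surrogates” are invented, and two clusters stand in for the roughly 40). Each cell is assigned to the nearest cluster, and that cluster’s representative model advances its chemistry.

```python
# Illustrative cluster-based dispatch (not the published method's
# details): route each cell state to its cluster's surrogate model
# instead of solving the full chemistry.

def nearest(state, centroids):
    return min(range(len(centroids)),
               key=lambda i: sum((s - c) ** 2
                                 for s, c in zip(state, centroids[i])))

# Pretend these centroids came from clustering training states.
centroids = [(800.0, 0.2), (2200.0, 0.05)]  # (temperature, fuel fraction)

# One surrogate per cluster: trivial stand-ins for trained networks.
surrogates = [lambda T, f: (T + 1.0, f * 0.999),  # slow, cool chemistry
              lambda T, f: (T + 50.0, f * 0.9)]   # fast, hot chemistry

def advance_cell(state):
    cluster = nearest(state, centroids)
    return surrogates[cluster](*state)

hot_cell = advance_cell((2100.0, 0.06))   # routed to the "hot" surrogate
cool_cell = advance_cell((750.0, 0.18))   # routed to the "cool" surrogate
```

Because each surrogate only has to be accurate for its own regime, it can be much smaller than one network trained to cover every condition at once.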

One big challenge is that chemical concentrations can vary by a factor of a billion, and the timescales of chemical reactions can span a similar range. How do you learn about slow reactions with high concentrations without missing smaller-scale dynamics? One approach is to compute using non-linear transformations of the concentrations, but they are still perfecting this solution.
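One such transformation is logarithmic scaling (a sketch of the general idea, not necessarily the exact transform the team uses), which puts values spanning nine orders of magnitude on comparable footing for the network:

```python
import math

# Illustrative log-scaling of species concentrations so that tiny and
# huge values contribute comparably during training.

def to_log(c, floor=1e-12):
    # The floor avoids log(0) for species that are absent from a cell.
    return math.log10(max(c, floor))

def from_log(x):
    return 10.0 ** x

concentrations = [1.0, 1e-3, 1e-9]             # spans nine orders of magnitude
scaled = [to_log(c) for c in concentrations]   # roughly [0.0, -3.0, -9.0]
restored = [from_log(x) for x in scaled]
```

The trade-off is that errors in log space correspond to *relative* errors in concentration, which is often what chemistry cares about, but it changes which mistakes the network is penalized for during training.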

They still have time to work out kinks, but Schmidt estimates that the turbulence model will be about 5 times faster than the previous method, and the combustion model will be 100 times faster. With faster simulations, partners can run more experiments, quickly iterating on engine design, with the ultimate goal of increasing engine efficiency and reducing emissions. “You can try anything that comes to mind,” Haghshenas says.

### You Spin Me Round

As ICEnet winds down, Schmidt looks toward further ways to improve the environment. Fluid dynamics also governs the way air flows over and around objects. One application is car aerodynamics. As autonomous vehicles proliferate, they may begin to “platoon,” one car closely following another to stay in its slipstream and reduce drag, improving fuel efficiency. CFD solvers can compute the right distance.

Schmidt’s team also hopes to apply their methods to wind turbines. Upwind turbines block the wind and create turbulence that can reduce the efficiency of downstream turbines while increasing strain on their blades. CFD solvers can calculate the optimal angle at which to point turbines to reduce such interference.

Wind turbines are not nuclear fusion, but whirls, big and little, still contain endless complexity.
