Partial differential equations (PDEs) are workhorses of science and engineering. They describe a vast range of phenomena, from flow around a ship’s hull, to acoustics in a concert hall, to heat diffusion in a material. In the past several years, researchers have exploited the pattern-finding power of machine learning to create new frameworks for solving PDEs.
In a recent advance, a multi-disciplinary team of researchers developed a machine learning framework that adapts to changes in the geometry of the domains on which PDEs are posed. Called DIMON, the new framework was motivated by an effort to build digital twins of the human heart.
The development of DIMON is a significant step, said Reinhard Laubenbacher, Dean’s Professor and Director of the Laboratory for Systems Medicine at the University of Florida. DIMON, he said, could “make it feasible to use [cardiac digital twins] in real time during surgery, providing actionable information to surgeons.” He also noted that DIMON’s usefulness is not limited to cardiac modeling, but could extend to many applications for which the fast solution of PDEs is essential.
Combining PDE Approximation and Neural Networks
Most PDEs of interest are not solvable in closed form. For this reason, researchers have focused decades of effort on creating numerical methods for approximating PDE solutions. Given a domain over which a PDE is defined, these methods typically discretize the domain into small regions and then use simple functions like polynomials to mimic solutions in those regions. Finer discretization brings greater accuracy, but also greater cost.
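This accuracy-versus-cost tradeoff can be seen in a toy example. The sketch below (a minimal illustration, not the production solvers used in applications like cardiac modeling) applies an explicit finite-difference scheme to the one-dimensional heat equation, for which the exact solution is known; refining the grid shrinks the error, but stability forces the time step to shrink quadratically with the grid spacing, so the number of time steps grows rapidly.

```python
import numpy as np

def solve_heat_1d(n, t_final=0.1, alpha=1.0):
    """Explicit finite-difference solve of u_t = alpha * u_xx on [0, 1]
    with u(0) = u(1) = 0 and initial condition u(x, 0) = sin(pi x).
    The exact solution is exp(-pi^2 * alpha * t) * sin(pi x)."""
    dx = 1.0 / n
    dt = 0.4 * dx**2 / alpha           # stability requires dt <= dx^2 / (2 alpha)
    steps = int(np.ceil(t_final / dt))
    dt = t_final / steps               # adjust so we land exactly on t_final
    x = np.linspace(0.0, 1.0, n + 1)
    u = np.sin(np.pi * x)
    for _ in range(steps):
        u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    return x, u

# Finer grids give smaller error but require many more time steps.
for n in (10, 40, 160):
    x, u = solve_heat_1d(n)
    exact = np.exp(-np.pi**2 * 0.1) * np.sin(np.pi * x)
    print(n, np.max(np.abs(u - exact)))
```

The printed maximum error drops roughly by a factor of 16 each time the grid is refined fourfold, while the work per solve grows much faster.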
“Even for simple linear PDEs, numerical methods need a very dense discretization” and can require long run times, said DIMON team member Nicolas Charon, assistant professor in the Department of Mathematics at the University of Houston. What’s more, “Sometimes you need to solve a PDE multiple times because the solutions depend on the boundary and initial conditions of the PDE.”
Once those computations are completed, however, the results can be used as training data for a neural network. This is the basic idea behind a raft of new machine learning frameworks for solving PDEs that have been developed in recent years. One of the first and best known is called DeepONet, short for “deep operator network.”
In the context of PDE solving, an “operator” is a mathematical construction that maps input data about initial and boundary conditions to the corresponding solution of the PDE. DeepONet trains on a batch of numerical solutions for a PDE with varying initial and boundary conditions, learns the general pattern of the associated operator, and then predicts new solutions with new initial and boundary conditions.
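A minimal sketch of the branch-trunk structure that DeepONet popularized may make this concrete. The network below uses random, untrained weights, and all layer sizes are illustrative assumptions; in a real DeepONet the two subnetworks are trained jointly on numerical solutions so that their inner product approximates the operator.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    """Weights for a small multilayer perceptron (random, untrained)."""
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

# Branch net: encodes the input function u, sampled at m fixed "sensor" points.
# Trunk net: encodes the query location y at which the solution is wanted.
m, p = 50, 32                        # number of sensors; latent width
branch = mlp([m, 64, p])
trunk = mlp([1, 64, p])

def deeponet(u_sensors, y):
    """G(u)(y) ~ <branch(u), trunk(y)>: the learned operator G maps the
    initial/boundary data u to the solution, evaluated at point(s) y."""
    b = forward(branch, u_sensors)          # shape (p,)
    t = forward(trunk, np.atleast_2d(y))    # shape (k, p) for k query points
    return t @ b                            # shape (k,)

sensors = np.linspace(0, 1, m)
u = np.sin(np.pi * sensors)                 # one example input function
print(deeponet(u, np.array([[0.25], [0.5]])))  # predictions at two points
```

Once trained, evaluating such a network at new conditions costs a few matrix multiplications, which is what makes the operator-learning approach fast at inference time.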
Bringing in the Influence of Geometry
The geometry of the domain of a PDE also influences solutions. For example, a single PDE governs heat diffusion in both a sphere and a long, thin rod, but the solutions differ between the two geometries, and a numerical solver would need a different discretization and computation for each.
This is where DIMON comes in. Building on the deep operator network approach, DIMON enables fast PDE solving while adapting to changes in the geometry of PDE domains. The name DIMON stands for Diffeomorphic Mapping Operator Learning. Two domains are “diffeomorphic” if there is a mapping that smoothly transforms one domain into the other, without creating any holes or creases, and with each point in each domain mapped to exactly one point in the other domain.
DIMON works by solving a PDE over a template domain, then predicting new solutions on other domains that are diffeomorphic to the template. “We have the PDE on the template domain, and then we can go back and forth between the template to each of the domains we want to work on” by using the diffeomorphic mapping, Charon explained. “Essentially, it is extending the operator framework theory by incorporating the geometry.”
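A toy example of this back-and-forth, with an ellipse standing in for an individual domain and the unit disk as the template: rescaling the axes is a diffeomorphism, and functions can be transported through it in either direction. This sketch is only an illustration of the mapping itself; it omits DIMON's learning step and the change-of-variables terms that the transported PDE picks up.

```python
import numpy as np

# Toy diffeomorphism: an axis-aligned ellipse (semi-axes a, b) maps
# smoothly and invertibly onto the unit-disk "template" by rescaling.
a, b = 2.0, 0.5

def to_template(X, Y):
    """phi: point on the ellipse -> point on the unit disk."""
    return X / a, Y / b

def from_template(x, y):
    """phi inverse: point on the unit disk -> point on the ellipse."""
    return a * x, b * y

def u_template(x, y):
    """A function known on the template domain (here u = x^2 - y^2)."""
    return x**2 - y**2

def u_ellipse(X, Y):
    """The same data viewed on the ellipse: pull back through phi."""
    x, y = to_template(X, Y)
    return u_template(x, y)

# A point on the ellipse and its image on the template agree by construction,
# so anything computed on the template can be read off on the ellipse.
X, Y = 1.0, 0.25
x, y = to_template(X, Y)
print(u_ellipse(X, Y), u_template(x, y))
```

In DIMON the mappings are far richer than a rescaling and are estimated from data, but the principle is the same: one template computation serves a whole family of diffeomorphic domains.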
Efficient Shape Analysis Key to Speed-up
DIMON grew out of research in the group of Natalia Trayanova, a professor of biomedical engineering at Johns Hopkins University, who specializes in computational and modeling approaches to cardiac research. In December 2024, she and five co-authors published a paper about DIMON in Nature Computational Science. The lead author was Minglang Yin, a postdoctoral researcher in Trayanova’s group. Previously, Yin studied under George Karniadakis of Brown University, who was one of the creators of DeepONet and recipient of the 2021 SIAM/ACM Prize in Computational Science and Engineering. Yin wrote his doctoral thesis on computational models and scientific machine learning for aortic dissection.
The motivation for DIMON was to enable heart digital twins for improving the treatment of arrhythmia, a disorder characterized by irregularity in electrical signals in the heart. Those signals can be modeled by solving the PDEs that govern them. With traditional numerical methods, Yin said, such modeling poses a big computational challenge. “Single simulations take 12 hours or even 24 hours on hundreds of CPUs, which is very time- and resource-intensive,” he said. DIMON can drastically reduce those costs. “The training for DIMON is very simple and takes 10 minutes for the heart. And it can be done on a laptop.”
That dramatic speed-up depends on efficient representation of the geometry of PDE domains. This is where Charon’s expertise in shape analysis has played a crucial role.
A simple shape like an ellipse can be completely described with just two parameters: width and height. “But it’s not so trivial when you have a bunch of data and there are no parameters; you are given meshes, discretizations of geometric shapes, and you want to find a low-dimensional representation,” Charon said. With cardiac models, for example, shapes of individual human hearts are represented as reams of MRI data. “I came into the [Johns Hopkins collaboration] when they were asking, ‘how can we model in a simple way geometric transformations between a collection of shapes and the template domain?’”
Using a method called LDDMM (Large Deformation Diffeomorphic Metric Mapping), Charon came up with ways to reduce the number of parameters needed to represent the domains, while retaining the geometric features salient to the PDEs. Instead of millions of parameters to represent a human heart, for example, DIMON reduces the number to 128, or even 64. That reduction, said Yin, “is a key to making DIMON work.”
Limitations and Advantages
“As with most deep learning approaches, DIMON usually performs well for input data that does not differ too much from the training set,” said Charon. Venturing far from the training set can give poor results. For applications requiring high accuracy, traditional numerical methods, which can zero in on solutions with increasing precision, are needed. Nevertheless, a framework like DIMON is useful for uncovering general patterns of solutions. “It gives you a solution that is decent enough,” said Charon.
It also offers the advantage of versatility. Yin noted that DIMON is not limited to PDEs associated with cardiac electrophysiology; the Nature Computational Science paper demonstrates its use on Laplace’s equation and reaction-diffusion equations in two dimensions. He added, “DIMON can be extended to other types of PDEs, such as heat equations or even the Navier-Stokes equations that govern fluid motion.”
DIMON’s most significant contribution is to bring full, real-time cardiac digital twins one step closer to realization. Such digital twins could support cardiac surgical procedures, and perhaps eventually drug therapies. Said Yin, “I think it’s going to be a great advantage for clinicians to use in their daily pipelines.”
Allyn Jackson is a journalist specializing in science and mathematics, who is based in Germany.