In an earlier article in Express Computer, authors Romi Mahajan and Dharmesh Godha discussed the notion that verticalization is the new frontier in AI. This rings true in the area of Nuclear Fusion, specifically in accelerating the pathways to Commercially Viable Fusion (CVF), the holy grail of energy.

Over the past three years, more than $6 billion of private investment has flowed into Fusion. This is on top of the tens of billions being spent by government institutions, laboratories, and universities around the world. Fusion is, to put it simply, a hot area.

While there are many approaches to Fusion, from different “confinement” schemes to altogether different machine designs, one thing is common: all of these efforts can be enhanced with simulation, modeling, and extrapolation. In this sense, data is the watchword and is front and center in Fusion. Data is the cartilage that connects theoretical exploration with experimentation.

This is where AI and Machine Learning come in. A few areas of interest:

1. Expensive “high-fidelity” computational models for Fusion take a long time to run, driving cost overruns. As data accumulates, ML can train surrogate models that reproduce the expensive simulations rapidly; these surrogates can then be used for optimization and rapid prediction.

2. Fusion “control”: experimentalists have an abundance of external “knobs” that can influence aspects of plasma dynamics. ML enables efficient data-driven models that can be paired with control algorithms.

3. Physics-Informed Neural Networks (PINNs) incorporate physical constraints, such as the governing equations, directly into the training of a neural network. PINNs can use small subsets of observational data to generalize and predict related dynamics that are otherwise inaccessible.
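The surrogate-model idea in point 1 can be sketched in a few lines. Here the “expensive model” is a stand-in toy function, not a real Fusion code, and the surrogate is a simple polynomial fit; in practice the expensive model would be a long-running physics simulation and the surrogate a trained neural network or Gaussian process:

```python
import numpy as np

# Stand-in for an expensive high-fidelity simulation (assumption:
# in practice this is a long-running physics code, not a formula).
def expensive_model(x):
    return np.sin(3 * x) + 0.5 * x**2

# Run the expensive model at a small number of design points.
x_train = np.linspace(0.0, 2.0, 15)
y_train = expensive_model(x_train)

# Fit a cheap polynomial surrogate to the accumulated data.
coeffs = np.polyfit(x_train, y_train, deg=6)
surrogate = np.poly1d(coeffs)

# The surrogate can now be queried thousands of times for
# optimization or rapid prediction at negligible cost.
x_query = np.linspace(0.0, 2.0, 1000)
error = np.max(np.abs(surrogate(x_query) - expensive_model(x_query)))
```

The design choice is the trade: spend a fixed budget of expensive runs up front, then do all exploration and optimization against the cheap surrogate.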
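Point 2 can be illustrated with a minimal data-driven control loop. This is a toy two-state linear system standing in for plasma dynamics (all names and dynamics here are assumptions for illustration): the controller never sees the true dynamics, only experimental data, from which it learns a model by least squares and then computes an actuator command:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumption: a toy 2-state linear system standing in for plasma
# dynamics; A_true/B_true are unknown to the controller.
A_true = np.array([[0.9, 0.1], [0.0, 0.8]])
B_true = np.array([[0.0], [1.0]])

# Collect "experimental" data by exercising the actuator knob.
X, U, X_next = [], [], []
x = np.zeros(2)
for _ in range(200):
    u = rng.normal(size=1)
    x_next = A_true @ x + B_true @ u
    X.append(x); U.append(u); X_next.append(x_next)
    x = x_next

# Learn the dynamics [A B] from data by least squares:
# this is the "efficient model" the control algorithm uses.
Z = np.hstack([np.array(X), np.array(U)])
AB, *_ = np.linalg.lstsq(Z, np.array(X_next), rcond=None)
A_hat, B_hat = AB.T[:, :2], AB.T[:, 2:]

# One-step control toward a target state using the learned model.
target = np.array([0.5, 0.2])
x0 = np.array([1.0, -0.5])
u_cmd = np.linalg.pinv(B_hat) @ (target - A_hat @ x0)
```

Real plasma control replaces the linear fit with far richer learned models and the one-step command with a feedback policy, but the structure — learn a fast model from data, then optimize the knobs against it — is the same.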
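The physics-informed idea in point 3 can be sketched without a full deep-learning stack. This is a random-feature variant of a PINN, not a full backprop-trained network: the hidden tanh features are fixed, so enforcing the physics (here a stand-in toy ODE du/dx = -u with u(0) = 1, not a real Fusion equation) reduces to a linear least-squares problem:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy physics standing in for a plasma model (assumption):
# du/dx = -u with u(0) = 1, whose exact solution is exp(-x).
n_feat, n_col = 40, 40
w = rng.normal(size=n_feat)      # fixed random hidden weights
b = rng.normal(size=n_feat)      # fixed random hidden biases
x = np.linspace(0.0, 2.0, n_col)

# Model u(x) = sum_i c_i * tanh(w_i * x + b_i); only c is fit,
# so the physics-constrained problem stays linear least squares.
T = np.tanh(np.outer(x, w) + b)          # features at collocation pts
dT = w * (1.0 - T**2)                    # d/dx of each feature

# Physics rows enforce du/dx + u = 0 at the collocation points;
# one weighted row enforces the boundary condition u(0) = 1.
bc_weight = 10.0
A = np.vstack([dT + T, bc_weight * np.tanh(b)[None, :]])
rhs = np.concatenate([np.zeros(n_col), [bc_weight]])
c, *_ = np.linalg.lstsq(A, rhs, rcond=None)

# Evaluate the physics-informed fit against the exact solution.
u_pred = T @ c
max_err = np.max(np.abs(u_pred - np.exp(-x)))
```

Note that no solution data was used at all: the fit is pinned down by the governing equation plus a single boundary condition, which is exactly the leverage PINNs offer when observational data is scarce.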

There are many other highly specific and verticalized applications of AI to Fusion, which we are exploring in our partnership with Sapientai.

The quest for CVF requires cutting-edge physics combined with cutting-edge experimentation and computation. AI is a boon in this regard.