Supercomputers predict new turbulent interactions in fusion plasmas
For decades, the development of practical fusion energy has been impeded by anomalously high levels of heat loss from magnetically confined plasmas. Using data from dedicated experiments, advanced codes, and supercomputing power, researchers are developing a deeper understanding of the physics underlying these heat losses. By more completely capturing the dynamics of plasma turbulence across an unprecedented range of spatial and temporal scales, researchers have reproduced the levels of heat loss observed experimentally, something previous simulations could not do.
This work explains a half-century-old mystery and represents a significant advance in our understanding of heat loss from fusion plasmas. Ultimately, these results will enable the development of more robust and reliable models for optimizing the design and operation of fusion reactors, pushing the development of fusion energy forward.
A collaboration of researchers from MIT, the University of California, San Diego, and General Atomics has performed the first set of realistic plasma turbulence simulations that simultaneously capture the dynamics of small- and large-scale turbulence in fusion plasmas. Motivated by the decades-old observation that measured electron heat losses from fusion plasmas exceed theoretical predictions, a set of dedicated experiments was performed on the Alcator C-Mod tokamak at the MIT Plasma Science and Fusion Center.
Using the data from these experiments and more than 100 million CPU hours on National Energy Research Scientific Computing Center supercomputers, the researchers carried out what were arguably the highest-fidelity plasma turbulence simulations to date. These simulations demonstrated for the first time that large- and small-scale turbulence likely coexist in the core of typical fusion plasmas and exhibit significant nonlinear interactions.
The synergistic interactions between large-scale turbulent eddies (generally assumed to be responsible for ion heat losses) and small-scale turbulent structures, known as electron temperature gradient (ETG) streamers, enhance the total heat loss from the plasma and explain the anomalously high heat losses observed experimentally.
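For a sense of the scale gap such multiscale simulations must bridge, consider a rough estimate (an illustration assuming a deuterium plasma with comparable ion and electron temperatures, not a figure from the announcement): the ion-scale eddies and the electron-scale ETG streamers are set by the respective gyroradii, which differ by the square root of the ion-to-electron mass ratio,

\[
\frac{\rho_i}{\rho_e} = \sqrt{\frac{m_i}{m_e}} \approx \sqrt{3670} \approx 60 ,
\]

so a single run must resolve structures roughly sixty times smaller than the dominant eddies, evolving on correspondingly faster time scales. That disparity is what drives the extraordinary computational cost quoted above.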
This research demonstrated that only simulations that simultaneously capture both the small- and large-scale turbulence, as well as their interactions, can explain previously unexplained experimental measurements, providing strong evidence that the behavior observed in the simulations is indeed representative of reality.
Most importantly, these simulations provide a clear path forward for the development of models that will be used to improve the performance, operation, and design of future fusion reactors.
This work was supported by the U.S. Department of Energy (DOE) Office of Science Fusion Energy Sciences program under contracts DE-FC02-99ER54512, DE-SC0006957, and DE-FG02-06ER54871 and in part by an appointment to the DOE Fusion Energy Postdoctoral Research Program administered by ORISE. Simulations were carried out at the National Energy Research Scientific Computing Center (NERSC), supported by the DOE Office of Science under contract number DE-AC02-05CH11231. The simulations were enabled by an Advanced Scientific Computing Research Leadership Computing Challenge (ALCC) award. The research was performed as part of the activities of the Scientific Discovery through Advanced Computing (SciDAC) Center for the Simulation of Plasma Microturbulence.
Story Source:
The above post is reprinted from materials provided by the Department of Energy, Office of Science. Note: Materials may be edited for content and length.