Scaling deep learning for whole-core reactor simulation

Forrest Shriver, Justin Watson

Research output: Contribution to journal › Article › peer-review

Abstract

A deep learning architecture for predicting the normalized pin powers within 2D pressurized water reactors, called LatticeNet, has been developed and shown to be performant for a variety of relevant conditions within a single 2D reflective assembly. However, many neutronics scenarios of interest involve regions composed of multiple assemblies, up to and including full-core scenarios. It is not immediately obvious that scaling LatticeNet up to these full-core scenarios will achieve the same performance as seen in single-assembly scenarios, due to the problem-tailored nature of neural networks. It is also straightforward to show that the original implementation of LatticeNet does not easily scale up to multi-assembly regions, due to the enormous compute demands of the originally proposed architecture. In this work, we address these issues by first proposing several variants of LatticeNet that reduce its compute requirements, and we show the theoretical performance benefits gained from these architectures. We then evaluate the actual benefit of the proposed variants on multi-assembly regions containing roughly the same variation outlined in the original paper proposing LatticeNet. We show that the proposed architecture changes do not result in significantly increased error, and that they yield much more manageable training times relative to the original LatticeNet architecture.
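
The abstract does not include code, so the following is a minimal illustrative sketch, not the authors' published implementation. It shows, under the assumption of a PyTorch-style fully convolutional network with hypothetical input channels (e.g., per-pin enrichment, boron concentration, moderator density), why a convolutional pin-power regressor can in principle be applied to regions of different sizes: the same learned kernels slide over a single 17x17 assembly or a multi-assembly block, so compute grows with the number of pins rather than being fixed to one lattice size.

    # Hypothetical sketch only; names, channel counts, and layer sizes are
    # illustrative assumptions, not the LatticeNet architecture from the paper.
    import torch
    import torch.nn as nn

    class PinPowerCNN(nn.Module):
        """Fully convolutional map from per-pin state channels to pin powers."""

        def __init__(self, in_channels: int = 3, width: int = 64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(in_channels, width, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Conv2d(width, width, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Conv2d(width, 1, kernel_size=3, padding=1),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, channels, H, W) pin-wise state maps;
            # output: (batch, 1, H, W) predicted pin power map.
            return self.net(x)

    model = PinPowerCNN()
    single_assembly = torch.randn(1, 3, 17, 17)   # one 17x17 assembly
    multi_assembly = torch.randn(1, 3, 51, 51)    # 3x3 block of assemblies
    print(model(single_assembly).shape)  # torch.Size([1, 1, 17, 17])
    print(model(multi_assembly).shape)   # torch.Size([1, 1, 51, 51])

The design point this sketch is meant to convey is weight sharing: because no layer is tied to a fixed lattice dimension, parameter count stays constant while inference cost scales with region area, which is the kind of property a compute-friendly multi-assembly variant would need.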

Original language: English (US)
Article number: 104134
Journal: Progress in Nuclear Energy
Volume: 146
State: Published - April 2022

All Science Journal Classification (ASJC) codes

  • Nuclear Energy and Engineering
  • Safety, Risk, Reliability and Quality
  • Energy Engineering and Power Technology
  • Waste Management and Disposal
