Empirical Scaling of Scientific Machine Learning Models
June 4, 2024
Lawrence Livermore National Laboratory
Computer Science/Mathematics, 2023–24
Liaison(s): Helgi Ingolfsson, Robert Blake, Bridgette Michelle Davilla
Advisor(s): Naim Matasci
Student(s): Sasha Rothstein (TL-F), Megan Li, Mukta Ubale, Stephanie Huang, William Yik (TL-S)
Lawrence Livermore National Laboratory conducts multiscale scientific simulations to answer questions where experiments are infeasible. To reduce computational costs, accurate yet expensive computations can be approximated with neural networks. For this approach to be practical, it is important to know how much training data is needed to achieve a specified accuracy. Previous research shows that neural networks obey empirical scaling laws: test error typically falls off as a power law in dataset and model size. Building on this result, we developed software that measures how accuracy scales as a function of data and network size for each specific scientific domain.
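As an illustration of the kind of measurement such software automates, the sketch below fits a power-law scaling curve to measured (dataset size, test error) pairs and inverts the fit to estimate how much data a target accuracy would require. This is a minimal sketch, not the project's actual implementation: the functional form error(D) ≈ a·D^(−b) + c follows the common convention in the scaling-law literature, and the data values and names here are hypothetical placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(d, a, b, c):
    # Test error as a function of dataset size d: a * d**(-b) + c,
    # where c is the irreducible error floor.
    return a * d ** (-b) + c

# Hypothetical measurements: test error of networks trained on
# increasing amounts of simulation data (synthetic placeholder values).
dataset_sizes = np.array([1e3, 3e3, 1e4, 3e4, 1e5])
test_errors = np.array([0.31, 0.19, 0.12, 0.08, 0.06])

# Fit the scaling curve; p0 gives the optimizer a rough starting point.
(a, b, c), _ = curve_fit(power_law, dataset_sizes, test_errors, p0=[1.0, 0.3, 0.01])
print(f"fit: error(D) = {a:.3f} * D^(-{b:.3f}) + {c:.3f}")

# Invert the fit to estimate the data needed for a target error:
# solving a * D**(-b) + c = target gives D = (a / (target - c))**(1/b).
target = 0.05
if target > c:
    required = (a / (target - c)) ** (1.0 / b)
    print(f"estimated dataset size for error {target}: {required:.0f}")
else:
    print(f"target {target} is below the fitted error floor {c:.3f}")
```

In practice, fits of this kind are often performed in log-log space for numerical stability, and the same functional form can be fit against network size instead of dataset size to measure the second axis of the scaling study.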