Daniel Duffy (Lead System Architect) and William Putman (Research Meteorologist) of the NASA Center for Climate Simulation and the Global Modeling and Assimilation Office at the NASA Goddard Space Flight Center present the challenges facing global climate modeling today. In particular, they discuss the requirements of high-resolution modeling: the need for powerful computing resources and the terabytes of data that must be stored and analyzed.
Today’s climate simulations use a resolution of 50 to 100 km globally, and as fine as 25 km for global weather prediction. The goal is to increase the resolution to 1 km to reach globally cloud-resolving scales, which would in turn require nearly 10 million conventional Xeon compute cores. According to Daniel, this will be possible only by adopting accelerator hardware such as GPUs and Xeon Phis. For William, what matters most is keeping a single codebase that can be understood by all the scientists involved.
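To see why finer resolution demands so much more compute, a back-of-envelope sketch helps. The model below is an assumption for illustration, not a figure from the interview: cost is taken to scale with the cube of the inverse grid spacing, two factors from the horizontal dimensions and one because a CFL-limited timestep must shrink along with the grid.

```python
def cost_ratio(dx_old_km: float, dx_new_km: float) -> float:
    """Relative compute cost when refining a global model's grid.

    Assumes cost ~ (1/dx)**3: a factor of (dx_old/dx_new)**2 for the
    number of horizontal grid points, and another (dx_old/dx_new)
    because a CFL-limited timestep shrinks with the grid spacing.
    Vertical levels and I/O are held fixed for simplicity.
    """
    return (dx_old_km / dx_new_km) ** 3

# Refining a 50 km model to 1 km multiplies the cost by 125,000,
# which is why core counts in the millions enter the discussion.
print(cost_ratio(50, 1))  # 125000.0
```

This is only a scaling argument; real models also change physics, vertical resolution and numerics between 50 km and 1 km, so the true factor differs.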
A two-year simulation with a resolution of 10 km currently produces about 400 TB of data. Daniel explains how the convergence of HPC and big data creates new challenges for the providers of computing services.
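The 400 TB figure also hints at the storage side of the problem. A minimal sketch, assuming archived output grows with the number of horizontal grid points while the simulated period, output frequency, vertical levels and variable set stay fixed (an illustrative assumption, not a statement from the interview):

```python
def output_volume_tb(baseline_tb: float,
                     dx_base_km: float,
                     dx_new_km: float) -> float:
    """Scale archived model output with horizontal grid-point count.

    Assumes volume grows as (dx_base/dx_new)**2, i.e. purely with the
    number of horizontal grid points, everything else held constant.
    """
    return baseline_tb * (dx_base_km / dx_new_km) ** 2

# Under this assumption, the 400 TB produced by the 10 km run would
# grow to roughly 40,000 TB (about 40 PB) at 1 km resolution.
print(output_volume_tb(400, 10, 1))  # 40000.0
```

Even this crude estimate shows why Daniel frames high-resolution modeling as a big-data problem as much as a compute problem.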
(Interview recorded during SC12)