Video – Discussion Between SDSC and Argonne About the Challenges of Visualizing Large Data Sets

In the two previous posts we presented a simulation of the spatial structure of the light emitted by early galaxies. We asked Rick Wagner of SDSC to discuss with Venkatram Vishwanath of Argonne the challenges of visualizing data from very large numerical simulations.

Rick's simulation produces, for example, 256 GB of data for a small subset of the field, and up to many terabytes for the entire field. Traditionally, the snapshots are written to disk and analyzed later. According to Rick, this approach will not be sustainable in the future, since ever larger data sets will be produced.
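The video does not go into implementation detail, but the traditional workflow Rick describes can be sketched in a few lines. Everything here is illustrative: the field, the file names, and the "analysis" are stand-ins, not Rick's actual code. The point is that the full data set travels to disk and back just to produce a small summary.

```python
import numpy as np

# Simulation phase: dump every snapshot in full.
for step in range(5):
    field = np.random.rand(64, 64, 64)            # stand-in for the real field
    np.save(f"snapshot_{step:04d}.npy", field)    # the full data hits the disk

# Post-processing phase: read every byte back to produce small summaries.
for step in range(5):
    field = np.load(f"snapshot_{step:04d}.npy")
    print(step, field.mean())
```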

Venkatram agrees that one challenge for next-generation simulations is that I/O will not keep up with the growth of computing capability. His group at Argonne is working on efficient infrastructure and software to reduce the amount of data written to storage for analysis, as well as on in-situ visualization while the simulation is in progress. This will make it easier to transform the data into insight.
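To contrast with the post-processing sketch above, here is a minimal illustration of the in-situ idea: reductions are computed inside the time-stepping loop, so only a few bytes per step reach storage instead of the full field. Again, the solver and the summaries are hypothetical placeholders, not Argonne's actual software.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(field):
    """Stand-in for one timestep of the real solver."""
    return field + rng.normal(scale=0.01, size=field.shape)

field = np.zeros((128, 128, 128))   # ~16 MB per snapshot in doubles
with open("summaries.csv", "w") as out:
    out.write("step,mean,max\n")
    for i in range(100):
        field = step(field)
        # In-situ reduction: a few bytes per step are written,
        # instead of dumping the full 16 MB field every time.
        out.write(f"{i},{field.mean():.6g},{field.max():.6g}\n")
```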

Venkatram is developing methods that allow a non-intrusive integration of the simulation with the visualization: not a single line of the simulation code has to be modified. The data is buffered, staged, and written out while preserving the integrity of the data formats being produced. The same method can also be used to increase the speed at which data is written to disk.
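The video does not describe how this staging works internally. As a rough sketch of the buffering idea only: writes are captured into memory chunks and drained to disk by a background thread, so the producer never blocks on storage and its output arrives on disk byte-for-byte intact. In practice the non-intrusive part would come from intercepting I/O calls below the application, which this toy class does not attempt; the `StagedWriter` name and chunk size are invented for illustration.

```python
import io
import queue
import threading

class StagedWriter:
    """Hypothetical staging layer: buffer writes in memory, flush in
    large sequential chunks from a background thread."""

    def __init__(self, path, chunk=1 << 20):
        self._buf = io.BytesIO()
        self._chunk = chunk
        self._q = queue.Queue()      # staged chunks of bytes
        self._worker = threading.Thread(target=self._drain, args=(path,))
        self._worker.start()

    def write(self, data):
        self._buf.write(data)
        if self._buf.tell() >= self._chunk:   # a full chunk is staged
            self._q.put(self._buf.getvalue())
            self._buf = io.BytesIO()

    def close(self):
        self._q.put(self._buf.getvalue())     # flush the remainder
        self._q.put(b"")                      # sentinel: stop the worker
        self._worker.join()

    def _drain(self, path):
        with open(path, "wb") as f:
            while (data := self._q.get()) != b"":
                f.write(data)                 # large sequential writes

# Usage (hypothetical): point the write path at the staging layer.
w = StagedWriter("output.bin")
for _ in range(1000):
    w.write(b"\x00" * 4096)   # the producer keeps computing meanwhile
w.close()
```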