2015
Ahrens, James
Increasing Scientific Data Insights about Exascale Class Simulations under Power and Storage Constraints Journal Article
In: IEEE Computer Graphics and Applications, vol. 35, no. 2, pp. 8–11, 2015, ISSN: 0272-1716 (print), 1558-1756 (electronic), (LA-UR-pending).
@article{Ahrens:2015:ISD,
title = {Increasing Scientific Data Insights about Exascale Class Simulations under Power and Storage Constraints},
journal = {IEEE Computer Graphics and Applications},
author = {James Ahrens},
url = {https://datascience.dsscale.org/wp-content/uploads/2016/06/IncreasingScientificDataInsightsAboutExascaleClassSimulationsUnderPowerAndStorageConstrains.pdf},
issn = {0272-1716 (print), 1558-1756 (electronic)},
year = {2015},
date = {2015-00-01},
volume = {35},
number = {2},
pages = {8--11},
abstract = {Over the past three decades, supercomputing systems have progressed to compute the results of extremely accurate scientific simulations. These simulations help us understand complex real-world phenomena such as our climate, energy sources, and the progression of natural disasters. Additionally, computing power supports the computation of higher-quality simulations, and that in turn provides higher fidelity results. Using the number of floating-point operations per second (flops) as a measure of progress, we have progressed through terascale machines that compute 10^12 flops to petascale machines that compute 10^15 flops. A number of open source efforts, such as ParaView (www.paraview.org) and VisIt (https://visit.llnl.gov), provide a robust, scalable visualization and analysis capability at these levels of performance. These tools traditionally focus on a postprocessing approach: that is, during a simulation run, representative results are written to storage for later visualization. ...continued in full paper below.},
note = {LA-UR-pending},
keywords = {exascale, increasing scientific data insights},
pubstate = {published},
tppubtype = {article}
}