
Scientific Computing

Numerical simulation of real-world phenomena provides fertile ground for building interdisciplinary relationships. The SCI Institute has a long tradition of building these relationships in a win-win fashion – a win for the theoretical and algorithmic development of numerical modeling and simulation techniques and a win for the discipline-specific science of interest. High-order and adaptive methods, uncertainty quantification, complexity analysis, and parallelization are just some of the topics being investigated by SCI faculty. These areas of computing are being applied to a wide variety of engineering applications ranging from fluid mechanics and solid mechanics to bioelectricity.


Martin Berzins

Parallel Computing

Mike Kirby

Finite Element Methods
Uncertainty Quantification

Valerio Pascucci

Scientific Data Management

Chris Johnson

Problem Solving Environments

Ross Whitaker


Chuck Hansen


Scientific Computing Project Sites:

Publications in Scientific Computing:

Interpreting Performance Data Across Intuitive Domains
M. Schulz, J.A. Levine, P.-T. Bremer, T. Gamblin, V. Pascucci. In International Conference on Parallel Processing, Taipei, Taiwan, IEEE, pp. 206--215. 2011.
DOI: 10.1109/ICPP.2011.60

GPU-Based Interactive Cut-Surface Extraction From High-Order Finite Element Fields
B. Nelson, R. Haimes, R.M. Kirby. In IEEE Transactions on Visualization and Computer Graphics (IEEE Visualization Issue), Vol. 17, No. 12, pp. 1803--1811. 2011.

We present a GPU-based ray-tracing system for the accurate and interactive visualization of cut-surfaces through 3D simulations of physical processes created from spectral/hp high-order finite element methods. When used by the numerical analyst to debug the solver, the ability for the imagery to precisely reflect the data is critical. In practice, the investigator interactively selects from a palette of visualization tools to construct a scene that can answer a query of the data. This is effective as long as the implicit contract of image quality between the individual and the visualization system is upheld. OpenGL rendering of scientific visualizations has worked remarkably well for exploratory visualization for most solver results. This is due to the consistency between the use of first-order representations in the simulation and the linear assumptions inherent in OpenGL (planar fragments and color-space interpolation). Unfortunately, the contract is broken when the solver discretization is of higher-order. There have been attempts to mitigate this through the use of spatial adaptation and/or texture mapping. These methods do a better job of approximating what the imagery should be but are not exact and tend to be view-dependent. This paper introduces new rendering mechanisms that specifically deal with the kinds of native data generated by high-order finite element solvers. The exploratory visualization tools are reassessed and cast in this system with the focus on image accuracy. This is accomplished in a GPU setting to ensure interactivity.
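The broken "contract" the abstract describes can be seen in a minimal sketch (hypothetical names, not from the paper's renderer): linear interpolation of vertex values, as in fixed-function OpenGL shading, cannot reproduce a high-order element field, so the renderer must evaluate the element's polynomial expansion directly.

```python
# Hypothetical sketch: a quadratic field u(x) = 1 - x^2 on the element [-1, 1],
# sampled the "OpenGL way" (linear interpolation of vertex values) versus
# evaluated from its polynomial expansion, as a high-order renderer must do.

def u_exact(x):
    return 1.0 - x**2               # quadratic solution on the element

def linear_interp(x):
    # Vertex values at x = -1 and x = +1 are both 0, so linear
    # interpolation reports 0 everywhere across the element.
    uL, uR = u_exact(-1.0), u_exact(1.0)
    return uL + (uR - uL) * (x + 1.0) / 2.0

x = 0.0                             # element midpoint
print(u_exact(x))                   # 1.0 -- the true field value
print(linear_interp(x))             # 0.0 -- linear rendering misses the peak
```

The imagery agrees with the data only where the discretization is itself linear; for spectral/hp elements the sample must come from the polynomial, which is what the paper's ray-tracing mechanisms provide on the GPU.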

A Toolkit for Forward/Inverse Problems in Electrocardiography within the SCIRun Problem Solving Environment
B.M. Burton, J.D. Tate, B. Erem, D.J. Swenson, D.F. Wang, D.H. Brooks, P.M. van Dam, R.S. MacLeod. In Proceedings of the 2011 IEEE Int. Conf. Engineering and Biology Society (EMBC), pp. 267--270. 2011.
DOI: 10.1109/IEMBS.2011.6090052
PubMed ID: 22254301
PubMed Central ID: PMC3337752

Computational modeling in electrocardiography often requires the examination of cardiac forward and inverse problems in order to non-invasively analyze physiological events that are otherwise inaccessible or unethical to explore. The study of these models can be performed in the open-source SCIRun problem solving environment developed at the Center for Integrative Biomedical Computing (CIBC). A new toolkit within SCIRun provides researchers with essential frameworks for constructing and manipulating electrocardiographic forward and inverse models in a highly efficient and interactive way. The toolkit contains sample networks, tutorials and documentation which direct users through SCIRun-specific approaches in the assembly and execution of these specific problems.

Morse Set Classification and Hierarchical Refinement using Conley Index
Guoning Chen, Qingqing Deng, Andrzej Szymczak, Robert S. Laramee, and Eugene Zhang. In IEEE Transactions on Visualization and Computer Graphics (TVCG), Vol. 18, No. 5, pp. 767--782. June, 2011.
DOI: 10.1109/TVCG.2011.107
PubMed ID: 21690641

Morse decomposition provides a numerically stable topological representation of vector fields that is crucial for their rigorous interpretation. However, Morse decomposition is not unique, and its granularity directly impacts its computational cost. In this paper, we propose an automatic refinement scheme to construct the Morse Connection Graph (MCG) of a given vector field in a hierarchical fashion. Our framework allows a Morse set to be refined through a local update of the flow combinatorialization graph, as well as the connection regions between Morse sets. The computation is fast because the most expensive computation is concentrated on a small portion of the domain. Furthermore, the present work allows the generation of a topologically consistent hierarchy of MCGs, which cannot be obtained using a global method. The classification of the extracted Morse sets is a crucial step for the construction of the MCG, for which the Poincaré index is inadequate. We make use of an upper bound for the Conley index, provided by the Betti numbers of an index pair for a translation along the flow, to classify the Morse sets. This upper bound is sufficiently accurate for Morse set classification and provides supportive information for the automatic refinement process. An improved visualization technique for MCG is developed to incorporate the Conley indices. Finally, we apply the proposed techniques to a number of synthetic and real-world simulation data to demonstrate their utility.
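Why the Poincaré index alone is inadequate can be illustrated with a small numerical sketch (hypothetical code, not from the paper): the index is the winding number of the vector field around a region, and distinct flow features can share the same index, so it cannot classify Morse sets unambiguously.

```python
import numpy as np

def poincare_index(v, n=1000, r=0.1):
    # Sum the change of the vector field's angle along a small circle
    # around the origin; total rotation / 2*pi is the Poincaré index.
    t = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    pts = np.stack([r * np.cos(t), r * np.sin(t)], axis=1)
    ang = np.array([np.arctan2(*v(p)[::-1]) for p in pts])
    d = np.diff(np.append(ang, ang[0]))
    d = (d + np.pi) % (2 * np.pi) - np.pi   # unwrap jumps into (-pi, pi]
    return round(d.sum() / (2 * np.pi))

print(poincare_index(lambda p: (-p[1], p[0])))   # center field: index +1
print(poincare_index(lambda p: (p[0], -p[1])))   # saddle field: index -1
print(poincare_index(lambda p: (p[0], p[1])))    # source field: index +1
```

A source and a center both report index +1 even though the flow behavior differs entirely, which is the kind of ambiguity the Betti-number bound on the Conley index resolves.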

A wildland fire modeling and visualization environment
J. Mandel, J.D. Beezley, A. Kochanski, V.Y. Kondratenko, L. Zhang, E. Anderson, J. Daniels II, C.T. Silva, C.R. Johnson. In Proceedings of the Ninth Symposium on Fire and Forest Meteorology, pp. (published online). 2011.

Cardiac Position Sensitivity Study in the Electrocardiographic Forward Problem Using Stochastic Collocation and Boundary Element Methods
D.J. Swenson, S.E. Geneser, J.G. Stinstra, R.M. Kirby, R.S. MacLeod. In Annals of Biomedical Engineering, Vol. 39, No. 12, pp. 2900--2910. 2011.
DOI: 10.1007/s10439-011-0391-5
PubMed ID: 21909818
PubMed Central ID: PMC336204

The electrocardiogram (ECG) is ubiquitously employed as a diagnostic and monitoring tool for patients experiencing cardiac distress and/or disease. It is widely known that changes in heart position resulting from, for example, posture of the patient (sitting, standing, lying) and respiration significantly affect the body-surface potentials; however, few studies have quantitatively and systematically evaluated the effects of heart displacement on the ECG. The goal of this study was to evaluate the impact of positional changes of the heart on the ECG in the specific clinical setting of myocardial ischemia. To carry out the necessary comprehensive sensitivity analysis, we applied a relatively novel and highly efficient statistical approach, the generalized polynomial chaos-stochastic collocation method, to a boundary element formulation of the electrocardiographic forward problem, and we drove these simulations with measured epicardial potentials from whole-heart experiments. Results of the analysis identified regions on the body-surface where the potentials were especially sensitive to realistic heart motion. The standard deviation (STD) of ST-segment voltage changes caused by the apex of a normal heart, swinging forward and backward or side-to-side was approximately 0.2 mV. Variations were even larger, 0.3 mV, for a heart exhibiting elevated ischemic potentials. These variations could be large enough to mask or to mimic signs of ischemia in the ECG. Our results suggest possible modifications to ECG protocols that could reduce the diagnostic error related to postural changes in patients possibly suffering from myocardial ischemia.
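The collocation idea behind the sensitivity analysis can be sketched in a few lines (a toy stand-in, not the paper's boundary element ECG solver): the forward model is run only at quadrature nodes of the uncertain parameter, and output statistics follow from the quadrature weights rather than from thousands of Monte Carlo runs.

```python
import numpy as np

# Hedged sketch of stochastic collocation. forward_model is a placeholder
# for an expensive simulation; theta is the uncertain input, here assumed
# Uniform(-1, 1) so Gauss-Legendre nodes/weights apply directly.

def forward_model(theta):
    return theta**2                       # stand-in for the real solver

nodes, weights = np.polynomial.legendre.leggauss(5)  # 5 collocation points
weights = weights / 2.0                   # normalize for Uniform(-1, 1)

samples = np.array([forward_model(t) for t in nodes])
mean = np.sum(weights * samples)                  # E[f] = 1/3 exactly here
var = np.sum(weights * samples**2) - mean**2      # Var[f] = 4/45 exactly here
print(mean, np.sqrt(var))
```

Because the quadrature is exact for low-degree polynomial responses, a handful of solver runs can recover the mean and standard deviation that a sampling method would need far more evaluations to approximate.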

Analysis of Large-Scale Scalar Data Using Hixels
D. Thompson, J.A. Levine, J.C. Bennett, P.-T. Bremer, A. Gyulassy, V. Pascucci, P.P. Pebay. In Proceedings of the 2011 IEEE Symposium on Large-Scale Data Analysis and Visualization (LDAV), Providence, RI, pp. 23--30. 2011.
DOI: 10.1109/LDAV.2011.6092313

Scalable Parallel Building Blocks for Custom Data Analysis
T. Peterka, R. Ross, A. Gyulassy, V. Pascucci, W. Kendall, H.-W. Shen, T.-Y. Lee, A. Chaudhuri. In Proceedings of the 2011 IEEE Symposium on Large-Scale Data Analysis and Visualization (LDAV), pp. 105--112. October, 2011.
DOI: 10.1109/LDAV.2011.6092324

We present a set of building blocks that provide scalable data movement capability to computational scientists and visualization researchers for writing their own parallel analysis. The set includes scalable tools for domain decomposition, process assignment, parallel I/O, global reduction, and local neighborhood communication: tasks that are common across many analysis applications. The global reduction is performed with a new algorithm, described in this paper, that efficiently merges blocks of analysis results into a smaller number of larger blocks. The merging is configurable in the number of blocks that are reduced in each round, the number of rounds, and the total number of resulting blocks. We highlight the use of our library in two analysis applications: parallel streamline generation and parallel Morse-Smale topological analysis. The first case uses an existing local neighborhood communication algorithm, whereas the latter uses the new merge algorithm.
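The configurable round-based merge can be sketched serially (structure and names hypothetical, not the library's API): in each round, groups of k blocks are merged into one larger block, so b blocks shrink to ceil(b/k) per round.

```python
# Sketch of a round-based merge reduction. merge() stands in for an
# application-specific combine step; here it simply concatenates blocks.

def merge(blocks):
    return [x for b in blocks for x in b]

def round_reduce(blocks, k_per_round):
    # k_per_round[i] = number of blocks merged together in round i.
    for k in k_per_round:
        blocks = [merge(blocks[i:i + k]) for i in range(0, len(blocks), k)]
    return blocks

blocks = [[i] for i in range(8)]           # 8 single-element blocks
print(len(round_reduce(blocks, [2, 2])))   # two rounds of 2: 8 -> 4 -> 2
print(round_reduce(blocks, [4, 2]))        # 8 -> 2 -> 1 fully merged block
```

Choosing the per-round merge factor and the number of rounds trades communication volume against the size of intermediate blocks, which is the tuning knob the abstract describes.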

Adaptive Extraction and Quantification of Geophysical Vortices
S. Williams, M. Petersen, P.-T. Bremer, M. Hecht, V. Pascucci, J. Ahrens, M. Hlawitschka, B. Hamann. In IEEE Transactions on Visualization and Computer Graphics, Proceedings of the 2011 IEEE Visualization Conference, Vol. 17, No. 12, pp. 2088--2095. 2011.

PIDX: Efficient Parallel I/O for Multi-resolution Multi-dimensional Scientific Datasets
S. Kumar, V. Vishwanath, P. Carns, B. Summa, G. Scorzelli, V. Pascucci, R. Ross, J. Chen, H. Kolla, R. Grout. In Proceedings of The IEEE International Conference on Cluster Computing, pp. 103--111. September, 2011.

Minimum Information about a Cardiac Electrophysiology Experiment (MICEE): Standardised reporting for model reproducibility, interoperability, and data sharing
T.A. Quinn, S. Granite, M.A. Allessie, C. Antzelevitch, C. Bollensdorff, G. Bub, R.A.B. Burton, E. Cerbai, P.S. Chen, M. Delmar, D. DiFrancesco, Y.E. Earm, I.R. Efimov, M. Egger, E. Entcheva, M. Fink, R. Fischmeister, M.R. Franz, A. Garny, W.R. Giles, T. Hannes, S.E. Harding, P.J. Hunter, G. Iribe, J. Jalife, C.R. Johnson, R.S. Kass, I. Kodama, G. Koren, P. Lord, V.S. Markhasin, S. Matsuoka, A.D. McCulloch, G.R. Mirams, G.E. Morley, S. Nattel, D. Noble, S.P. Olesen, A.V. Panfilov, N.A. Trayanova, U. Ravens, S. Richard, D.S. Rosenbaum, Y. Rudy, F. Sachs, F.B. Sachse, D.A. Saint, U. Schotten, O. Solovyova, P. Taggart, L. Tung, A. Varró, P.G. Volders, K. Wang, J.N. Weiss, E. Wettwer, E. White, R. Wilders, R.L. Winslow, P. Kohl. In Progress in Biophysics and Molecular Biology, Vol. 107, No. 1, Elsevier, pp. 4--10. October, 2011.
DOI: 10.1016/j.pbiomolbio.2011.07.001
PubMed Central ID: PMC3190048

Cardiac experimental electrophysiology is in need of a well-defined Minimum Information Standard for recording, annotating, and reporting experimental data. As a step toward establishing this, we present a draft standard, called Minimum Information about a Cardiac Electrophysiology Experiment (MICEE). The ultimate goal is to develop a useful tool for cardiac electrophysiologists which facilitates and improves dissemination of the minimum information necessary for reproduction of cardiac electrophysiology research, allowing for easier comparison and utilisation of findings by others. It is hoped that this will enhance the integration of individual results into experimental, computational, and conceptual models. In its present form, this draft is intended for assessment and development by the research community. We invite the reader to join this effort, and, if deemed productive, implement the Minimum Information about a Cardiac Electrophysiology Experiment standard in their own work.

Keywords: Minimum Information Standard; Cardiac electrophysiology; Data sharing; Reproducibility; Integration; Computational modelling

Quantifying variability in radiation dose due to respiratory-induced tumor motion
S.E. Geneser, J.D. Hinkle, R.M. Kirby, Bo Wang, B. Salter, S. Joshi. In Medical Image Analysis, Vol. 15, No. 4, pp. 640--649. 2011.
DOI: 10.1016/

Using Hybrid Parallelism to improve memory use in Uintah
Q. Meng, M. Berzins, J. Schmidt. In Proceedings of the TeraGrid 2011 Conference, Salt Lake City, Utah, ACM, July, 2011.
DOI: 10.1145/2016741.2016767

The Uintah Software framework was developed to provide an environment for solving fluid-structure interaction problems on structured adaptive grids on large-scale, long-running, data-intensive problems. Uintah uses a combination of fluid-flow solvers and particle-based methods for solids together with a novel asynchronous task-based approach with fully automated load balancing. Uintah's memory use associated with ghost cells and global meta-data has become a barrier to scalability beyond O(100K) cores. A hybrid memory approach that addresses this issue is described and evaluated. The new approach based on a combination of Pthreads and MPI is shown to greatly reduce memory usage as predicted by a simple theoretical model, with comparable CPU performance.
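The memory saving behind the hybrid approach can be summarized with a simple arithmetic model (numbers illustrative, not from the paper): with one MPI rank per core, every core holds its own copy of the global meta-data; with one multithreaded rank per node, that copy is shared by all cores on the node.

```python
# Toy model of per-node meta-data memory for MPI-only vs. hybrid
# Pthreads+MPI execution. meta_mb is a hypothetical replicated-state size.

def metadata_mem_per_node(cores_per_node, meta_mb, hybrid):
    ranks_per_node = 1 if hybrid else cores_per_node
    return ranks_per_node * meta_mb

cores, meta_mb = 16, 200    # hypothetical: 200 MB of global meta-data
print(metadata_mem_per_node(cores, meta_mb, hybrid=False))  # MPI-only: 3200 MB
print(metadata_mem_per_node(cores, meta_mb, hybrid=True))   # hybrid:    200 MB
```

The replication factor grows with cores per node, which is why meta-data that is harmless at small scale becomes the scalability barrier the abstract identifies at O(100K) cores.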

Keywords: Uintah, C-SAFE, parallel computing

Establishing Multiscale Models for Simulating Whole Limb Estimates of Electric Fields for Osseointegrated Implants
B.M. Isaacson, J.G. Stinstra, R.D. Bloebaum, COL P.F. Pasquina, R.S. MacLeod. In IEEE Transactions on Biomedical Engineering, Vol. 58, No. 10, pp. 2991--2994. 2011.
DOI: 10.1109/TBME.2011.2160722
PubMed ID: 21712151
PubMed Central ID: PMC3179554

Although the survival rates of warfighters in recent conflicts are among the highest in military history, those who have sustained proximal limb amputations may present additional rehabilitation challenges. In some of these cases, traditional prosthetic limbs may not provide adequate function for service members returning to an active lifestyle. Osseointegration has emerged as an acknowledged treatment for those with limited residual limb length and those with skin issues associated with socket use. Using this technology, direct skeletal attachment occurs between a transcutaneous osseointegrated implant (TOI) and the host bone, thereby eliminating the need for a socket. While reports from the first 100 patients with a TOI have been promising, some rehabilitation regimens require 12-18 months of restricted weight bearing to prevent overloading at the bone-implant interface. Electrically induced osseointegration has been proposed as an option for expediting periprosthetic fixation, and preliminary studies have demonstrated the feasibility of adapting the TOI into a functional cathode. To assure safe and effective electric fields that are conducive for osseoinduction and osseointegration, we have developed multiscale modeling approaches to simulate the expected electric metrics at the bone-implant interface. We have used computed tomography scans and volume segmentation tools to create anatomically accurate models that clearly distinguish tissue parameters and serve as the basis for finite element analysis. This translational computational biological process has supported biomedical electrode design and implant placement, and experiments to date have demonstrated the clinical feasibility of electrically induced osseointegration.

IMPICE Method for Compressible Flow Problems in Uintah
L.T. Tran, M. Berzins. In International Journal For Numerical Methods In Fluids, Note: Published online 20 July, 2011.

Scalable parallel regridding algorithms for block-structured adaptive mesh refinement
J. Luitjens, M. Berzins. In Concurrency And Computation: Practice And Experience, Vol. 23, No. 13, John Wiley & Sons, Ltd., pp. 1522--1537. 2011.
ISSN: 1532--0634
DOI: 10.1002/cpe.1719

ZAPP – A management framework for distributed visualization systems
G. Tamm, A. Schiewe, J. Krüger. In Proceedings of CGVCVIP 2011 : IADIS International Conference on Computer Graphics, Visualization, Computer Vision And Image Processing, pp. (accepted). 2011.

Real-time magnetic resonance imaging-guided radiofrequency atrial ablation and visualization of lesion formation at 3 Tesla
G.R. Vergara, S. Vijayakumar, E.G. Kholmovski, J.J. Blauer, M.A. Guttman, C. Gloschat, G. Payne, K. Vij, N.W. Akoum, M. Daccarett, C.J. McGann, R.S. Macleod, N.F. Marrouche. In Heart Rhythm, Vol. 8, No. 2, pp. 295--303. 2011.
PubMed ID: 21034854

Association of left atrial fibrosis detected by delayed-enhancement magnetic resonance imaging and the risk of stroke in patients with atrial fibrillation
M. Daccarett, T.J. Badger, N. Akoum, N.S. Burgon, C. Mahnkopf, G.R. Vergara, E.G. Kholmovski, C.J. McGann, D.L. Parker, J. Brachmann, R.S. Macleod, N.F. Marrouche. In Journal of the American College of Cardiology, Vol. 57, No. 7, pp. 831--838. 2011.
PubMed ID: 21310320

MRI of the left atrium: predicting clinical outcomes in patients with atrial fibrillation
M. Daccarett, C.J. McGann, N.W. Akoum, R.S. MacLeod, N.F. Marrouche. In Expert Review of Cardiovascular Therapy, Vol. 9, No. 1, pp. 105--111. 2011.
PubMed ID: 21166532