Designed especially for neurobiologists, FluoRender is an interactive tool for multi-channel fluorescence microscopy data visualization and analysis.
BrainStimulator is a set of SCIRun networks used to perform simulations of brain stimulation techniques such as transcranial direct current stimulation (tDCS) and transcranial magnetic stimulation (TMS).
Developing software tools for science has always been a central vision of the SCI Institute.


We are excited to announce the new release of our software, ShapeWorks 6.2!

To download installation packages for Windows/Mac/Linux and/or the source code, please visit

In December of last year, the renowned Arecibo Observatory in Puerto Rico collapsed in spectacular fashion when the 900-ton equipment platform, suspended about 500 feet above the dish, came crashing down after a support cable failed.

Fortunately, no one was hurt, but the question remained: What about the invaluable astronomical and atmospheric science data collected over decades by the observatory’s 1,000-foot-wide reflector dish? Valerio Pascucci, a professor in the University of Utah’s Scientific Computing and Imaging (SCI) Institute and School of Computing, and other U faculty members were part of a consortium of researchers responsible for retrieving that precious data and moving it to a safe location.

Bei Wang Phillips, a faculty member at the SCI Institute and an assistant professor in the School of Computing, and Arul Mishra and Himanshu Mishra, both professors of marketing at the David Eccles School of Business, applied for the competitive Deep Tech grant offered by the State of Utah’s Office of the Commissioner of Higher Education. They were awarded a three-year grant of about $340,000 to develop courses and modules on AI ethics and fairness, bringing fair and equitable AI to the forefront of education.

Congratulations to Duong Hoang, Harsh Bhatia, Peter Lindstrom, and Valerio Pascucci on receiving best paper honorable mention at the IEEE Symposium on Large Data Analysis and Visualization (LDAV) for their paper titled "High-quality and Low-memory-footprint Progressive Decoding of Large-scale Particle Data."


oneAPI Cross-Architecture Programming & Intel® oneAPI Rendering Toolkit to Improve Large-Scale Simulations, Data Analytics & Visualization for Scientific Workflows

[Oct. 26, 2021] - The Scientific Computing and Imaging (SCI) Institute at the University of Utah is pleased to announce that it is expanding its Intel Graphics and Visualization Institute of Xellence (Intel GVI) into an Intel oneAPI Center of Excellence (CoE). The oneAPI Center of Excellence will focus on advancing research, development, and teaching of the latest visual computing innovations in ray tracing and rendering, and on using oneAPI to accelerate compute across heterogeneous architectures (CPUs, GPUs including the upcoming Intel Xe architecture, and other accelerators). Adopting oneAPI’s cross-architecture programming model provides a path to maximum efficiency in multi-architecture deployments that combine CPUs and accelerators. This open-standards-based approach allows fast, agile development and supports new, advanced features without the costly maintenance of multiple vendor-specific proprietary code bases.

The University of Utah’s Scientific Computing and Imaging (SCI) Institute is leading a new initiative to democratize data access.

The National Science Foundation (NSF) awarded a $5.6 million project to a team of researchers led by School of Computing professor Valerio Pascucci, who is also director of the Center for Extreme Data Management in the College of Engineering, to build the critical infrastructure needed to connect large-scale experimental and computational facilities and recruit others to data-driven sciences.

Congratulations to Chuck Hansen on being elected to the IEEE Board of Governors for 2022. The IEEE Computer Society relies on a fully elected Board of Governors (BOG) to drive its vision forward, provide policy guidance to program boards and committees, and review the performance of the organization to ensure compliance with its policy directions.

Valerio Pascucci won a NASA Earth Exchange (NEX) award entitled “A Flexible Encoding Framework and Autonomic Runtime System for Progressive Streaming of Scientific Data.” The one-year, $100K award will help climate scientists study several terabytes of climate simulation datasets, manage workflows, and reduce data management costs. The proposed software systems will advance the study of extreme-scale scientific data.

Valerio Pascucci has been funded by the NSF as part of the WIFIRE Commons and BurnPro3D team. A century of suppressing wildfires has created a dangerous accumulation of flammable vegetation on landscapes, contributing to megafires that risk human life and property and permanently destroy ecosystems. Small, controllable fires can dramatically reduce the risk of large, uncontrollable fires. BurnPro3D is a decision-support platform that helps the fire response and mitigation community understand risks and tradeoffs quickly and accurately so they can more effectively manage wildfires or conduct controlled burns.

Congratulations to School of Computing professor and Scientific Computing and Imaging (SCI) Institute director Manish Parashar, who was named an Association for Computing Machinery (ACM) Fellow for 2020 for contributions to high-performance parallel and distributed computing and computational science.

The ACM Fellows program recognizes the top 1% of ACM Members for their outstanding accomplishments in computing and information technology and/or outstanding service to ACM and the larger computing community, according to the organization. Fellows are nominated by their peers.

Congratulations to Bei Wang on her new NSF award, "SCALE MoDL: Advancing Theoretical Minimax Deep Learning: Optimization, Resilience, and Interpretability."

The past decade has witnessed the great success of deep learning in broad societal and commercial applications. However, conventional deep learning relies on fitting neural networks to data, an approach known to produce models that lack resilience.
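Minimax formulations address this by training against a worst case: an inner maximization perturbs each input to increase the loss, and an outer minimization updates the model against those perturbed inputs. As an illustration only (not the method developed under this award), here is a minimal NumPy sketch of one well-known instance, adversarial training of a logistic-regression model with an FGSM-style inner step; all function names and hyperparameters are assumptions for the sketch:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def adversarial_example(w, b, x, y, eps):
    """Inner maximization: one FGSM-style step that moves x in the
    direction increasing the logistic loss, bounded by eps in L-infinity."""
    p = sigmoid(x @ w + b)
    grad_x = (p - y) * w          # d(loss)/dx for the logistic loss
    return x + eps * np.sign(grad_x)

def minimax_step(w, b, X, Y, eps=0.1, lr=0.1):
    """Outer minimization: one gradient step on the loss evaluated
    at the worst-case (perturbed) inputs rather than the clean ones."""
    X_adv = np.stack([adversarial_example(w, b, x, y, eps)
                      for x, y in zip(X, Y)])
    P = sigmoid(X_adv @ w + b)
    grad_w = X_adv.T @ (P - Y) / len(Y)
    grad_b = np.mean(P - Y)
    return w - lr * grad_w, b - lr * grad_b

# Toy data: two linearly separable clusters labeled 0 and 1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
Y = np.concatenate([np.zeros(50), np.ones(50)])

w, b = np.zeros(2), 0.0
for _ in range(200):
    w, b = minimax_step(w, b, X, Y)

acc = np.mean((sigmoid(X @ w + b) > 0.5) == Y)
```

On this separable toy problem the adversarially trained model still classifies the clean data correctly; the point of the inner maximization is that the learned decision boundary is pushed to keep a margin from the training points, which is one concrete sense in which minimax training buys resilience.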