
Exascale Day 10.18.2023

The exascale computing era is here.

With the delivery of the U.S. Department of Energy's (DOE's) first exascale system, Frontier, in 2022, and the upcoming deployment of Aurora and El Capitan systems by next year, researchers will have the most sophisticated computational tools at their disposal to conduct groundbreaking research.

Exascale machines can perform more than a billion billion calculations per second, or 10^18, which helped inspire a day of celebration on October 18, or 10/18.
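Written out, the simple arithmetic behind the name (added here purely for illustration) is:

$$10^{9} \times 10^{9} = 10^{18} \ \text{calculations per second} = 1\ \text{exaFLOPS},$$

"exa" being the SI prefix for 10^18.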

Exascale Day honors scientists and researchers who use advanced computing to make breakthrough discoveries in medicine, materials sciences, energy, and beyond with the help of the fastest supercomputers in the world. It is also a day to celebrate the impact of high-performance computing at all levels.

Aurora
El Capitan

Featured Content

Video 4:52

Readying Aurora Exascale Supercomputer to Assist in the Battle Against Cancer

Source: Intel


Video 3:13

Exascale Day 2023: To celebrate those who keep asking what if, why not, and what's next.

At HPE, we see how exascale computing will change the world. More science, more discovery, and more innovation that improves the way we live and work.
Source: HPE

 

Building a Capable Computing Ecosystem for Exascale and Beyond

The ECP has been a key component of bringing exascale to fruition, opening the door to scientific discoveries beyond our current comprehension.
Source: ECP

 

Exascale Drives Industry Innovation for a Better Future

Exascale is a massive accelerator for technology, productivity, engineering, and science.
Source: ECP

 

Oak Ridge: Exascale’s New Frontier Series

A series exploring the applications and software technology for driving scientific discoveries in the exascale era.
Source: OLCF

 

Oak Ridge: Exascale Day 2023

Exascale computing is transforming our ability to solve some of the world’s most difficult and important problems.
Source: OLCF

 

Argonne installs final components of Aurora supercomputer

The ALCF’s exascale machine is one step closer to enabling transformative science.
Source: ALCF

 

The Road to El Capitan: Preparing for NNSA's First Exascale Supercomputer (Series)

This is the first article in a five-part series about Livermore Computing's efforts to stand up the NNSA's first exascale supercomputer.
Source: LLNL

 

LLNL scientists eagerly anticipate El Capitan's potential impact

While Lawrence Livermore National Laboratory is eagerly awaiting the arrival of its first exascale-class supercomputer, El Capitan, physicists and computer scientists running scientific applications on testbeds for the machine are getting a taste of what to expect.
Source: LLNL

 

Powering up: LLNL prepares for exascale with massive energy and water upgrade

A supercomputer doesn't just magically appear, especially one as large and as fast as Lawrence Livermore National Laboratory's upcoming exascale-class behemoth El Capitan.
Source: LLNL

– Exascale: Are we ready for the next generation of supercomputers? (Technology Untangled)

– How Argonne's Sunspot Testbed is Helping Advance Code Development for Aurora (Code Together Podcast)

– How DAOS Enables Large Dataset Workloads on Aurora (Code Together Podcast)

– Siting the El Capitan Exascale Supercomputer at Lawrence Livermore Lab (Let's Talk Exascale)

– Exascale: The New Frontier of Computing (The Sound of Science)

Mike Bernhardt
Strategic Communications Consultant
Akima Infrastructure Services, LLC (AIS) Supporting the Exascale Computing Project
bernhardtme@ornl.gov
Website

Katie Bethea
Oak Ridge Leadership Computing Facility Outreach and Communications Group Leader
Oak Ridge National Laboratory
betheakl@ornl.gov
Website

Justin Breaux
Social Media Manager
Argonne National Laboratory
jbreaux@anl.gov
Website

Beth Cerny
Head of Communications for Argonne Leadership Computing Facility
Argonne National Laboratory
bcerny@anl.gov
Website

Aaron Grabein
Senior PR Manager
AMD
512-602-8950
Aaron.grabein@amd.com
Website

Bats Jafferji
Communications Manager, Super Compute
Intel
Bats.jafferji@intel.com
Website

Scott Jones
Communications Manager
Computing and Computational Sciences, ORNL
jonesg@ornl.gov
Website

Nick Malaya
Principal Engineer, Exascale Application Performance
AMD
512-981-6660
nicholas.malaya@amd.com
Website

Julie Parente
Head of Communications, Computing, Environment and Life Sciences
Argonne National Laboratory
jparente@anl.gov
Website

Charity Plata
Computational Science Initiative, Communications
Brookhaven National Laboratory
cplata@bnl.gov
Website

Carol Pott
Computing Sciences Area Communications Manager
Communications Liaison to the SciData Division
Lawrence Berkeley National Laboratory
cpott@lbl.gov
Website

Paul Rosien
HPC/AI Customer Evangelism & Community Programs
HPE
888-342-2156
Website

Jeremy Thomas
Public Affairs Office
Lawrence Livermore National Laboratory
925-337-4976
thomas244@llnl.gov
Website

Perspectives

This collection of quotes represents the voices of the HPC community: scientists, researchers, application, software, and system developers, and industry opinion shapers, offering their perspectives on the impact of exascale computing and the contributions of the Exascale Computing Project, the diverse, collaborative team of more than 1,000 contributors responsible for developing the world's first capable exascale computing ecosystem.


Doug Black, Editor-in-Chief, insideHPC

The Exascale Computing Project (ECP) changed the way leadership-class supercomputers are developed, setting a permanent precedent. It's the notion of building out a comprehensive ecosystem around the generational hardware. Whereas previous leadership computing efforts were focused primarily on the system and its throughput, the ECP strategy encompassed applications, tools, and training developed in parallel with the hardware.
This began at ECP's inception in 2016 and was driven by project leaders from ECP, HPE, AMD, Intel, and the national labs determined to work in concert and in combination. The result has been the delivery of real-world, scientific value from the first exascale system, Frontier, much sooner than would otherwise be expected from such a complex and innovative supercomputer of unprecedented power and scale.

Earl Joseph, CEO, Hyperion Research

The Exascale Computing Project has made revolutionary advances in how leadership class computers can be used. For the last two decades, the applications that run on supercomputers have greatly lagged the advances in the hardware and system scale. The ability to run key applications at scale and to more fully take advantage of GPU technologies is providing an ability that would not have existed without the ECP project.
When combined with the new exascale class computers, the ECP project is allowing researchers to move their science forward more quickly and will help them incorporate new technologies like AI, machine learning and large language models. This will make the leadership computers dramatically more productive and useful to the nation.

Mark Nossokoff, Research Director, Hyperion Research

One of the biggest, if not the biggest, barriers to both initial adoption of and extracting maximum value from new technology is its impact on existing software, tools, and applications. Engineers, scientists, and researchers are loath to crack open stable code merely to migrate to new technology, regardless of potential benefits. They want to focus on their respective areas of domain expertise and not have to worry about the underlying infrastructure.
The investments made by the Exascale Computing Project paved the way for these engineers, scientists, and researchers to hit the ground running and make immediate use of the exascale capabilities of Frontier today, and of Aurora and El Capitan as they come online, to accelerate their time to results in addressing the broad scope of some of today's most complex scientific, research, and engineering challenges.

Dave Kepczynski, Chief Information Officer, GE Research

Exascale delivers the speed, scale, and fidelity to enable early design and development of new technologies with high levels of confidence in digital engineering and modeling and simulation results.
Over the last five to six years, as a result of the nation's Exascale Computing Initiative (ECI) and the significant progress made as part of ECP, we are now delivering on the promise of next-generation computing capabilities that enable these significant advances in technology and breakthrough products that have real implications for the future.

Addison Snell, CEO, Intersect360 Research

The goal wasn't to build an exascale computer just to check a box. It was to build a computer capable of addressing the next generation of scientific challenges. The role of the Exascale Computing Project (ECP) was to ensure the system did more than turn on. We had to program it. We had to power it sustainably. We had to use it.
Because of ECP, the United States' exascale computing capability is a breakthrough resource for discovery and advancement.

Pete Bradley, Principal Fellow, Digital Tools and Data Science, RTX Pratt & Whitney

The holistic approach taken by the Exascale Computing Project is already paying dividends to industry and state-of-the-art technology. As the project built the most powerful computers in history, it also built out the supporting software, technology infrastructure, and human experience necessary to hit the ground at full speed when the systems came online. The results will positively impact the lives of practically everyone on earth through advances in fields as diverse as energy efficiency, medicine, travel, climate, and physics.

Kathy Yelick, Exabiome PI, Berkeley Lab

The Exascale Computing Project has made revolutionary advances in how leadership class computers can be used. For the last two decades, the applications that run on supercomputers have greatly lagged the advances in the hardware and system scale. The ability to run key applications at scale and to more fully take advantage of GPU technologies is providing an ability that would not have existed without the ECP.
When combined with the new exascale class computers, the ECP efforts have enabled researchers to move their science forward more quickly and will help them incorporate new technologies like AI, machine learning and large language models. This will make the leadership computers dramatically more productive and useful to the nation.

Gaurav Bazaz, Founder, Spar Systems Inc.

Exascale computing has provided us with invaluable computational capabilities previously unavailable to us. This has significantly sped up our design space exploration and design optimization process. Our experience with exascale computing has given us insights that will influence our future product development strategy meaningfully. It's a significant step forward in our computational journey.

Guillaume Beardsell, Chief Science Officer, Noble Thermodynamic Systems, Inc.

Exascale computing is revolutionizing the path to clean energy solutions. At Noble Thermodynamic Systems, we're pioneering the Argon Power Cycle, a revolutionary power generation system that features a closed-loop reciprocating engine with no exhaust. The complex engine physics, characterized by high temperatures, high pressures, and turbulent in-cylinder flows, spans a vast range of time and spatial scales, demanding significant computational resources.
These challenges underscore the potential of exascale computing capabilities. They pave the way for more than just simulation; they open avenues for in-depth optimization, such as the fine-tuning of engine component designs, ultimately enabling increased system performance and accelerating the technology's development. As we strive to mitigate climate change, high-performance computing isn't just an asset; it's a catalyst, propelling groundbreaking technologies like ours into the market when they're needed most.

Qigui Wang, Technical Fellow, Advanced Materials Technology, GM

The HPC for Energy Innovation (HPC4EI) project has offered great opportunities for us to work closely with material scientists in integrated computational materials engineering (ICME) and HPC experts in ExaAM at Oak Ridge National Laboratory (ORNL) to apply ORNL's advanced materials modeling and HPC technologies to real-world applications. We've also been able to tackle some challenges in cutting-edge through-process modeling technologies for emerging manufacturing processes such as additive manufacturing (AM). The project has really helped integrate multi-scale microstructure modeling and material property models for high-performance lightweight AM components.

Olaf Weckner, Associate Technical Fellow, The Boeing Company

Access to Frontier's HPC power will enable Boeing to accelerate the design and manufacturing of impact-resistant composite fuselages for Open Rotor concepts and to compete successfully with Airbus, which plans to flight-test open rotor engines on its A380 in 2025.

Piyush Mehrotra, Chief, NASA Advanced Supercomputing Division, NASA Ames Research Center

NASA is dedicated to actively investigating the development of applications, software packages, and approaches to exploit pathfinding exascale systems for enabling future groundbreaking science and engineering discoveries. In collaboration with the US Department of Energy's Exascale Computing Project staff, we have organized workshops and seminars for agency-sponsored scientists to prepare for and leverage exascale computing systems.
Applications of interest include coupled global atmosphere-ocean modeling for seasonal weather and climate predictions; galaxy formation simulations to understand the universe and our place in it; launch vehicle and launchpad design analysis for deep space exploration; and planetary entry, descent, and landing systems for Mars exploration. All will require exponentially higher fidelity results and faster turnaround time to achieve major advances in the understanding of Earth and our solar system.

Jordan Musser, Research Scientist, National Energy Technology Laboratory

A significant impact of the Exascale Computing Project (ECP) is the connections it fostered between researchers and scientists from across the national labs and universities. It's very easy to focus on your own research and lose sight of what others are accomplishing. ECP was able to achieve so much by creating co-design centers and encouraging collaboration between application codes and software technologies. I believe it's the multidisciplinary focus of ECP that made it such a success, and I hope the collaborations and connections it facilitated are lasting.

Trish Damkroger, Chief Product Officer and Senior Vice President, HPC, AI & Labs, HPE

Exascale supercomputing has already demonstrated a significant impact on the scientific community, which spans various initiatives across public and commercial sectors. Since debuting as the world's first exascale supercomputer at Oak Ridge National Laboratory, Frontier has allowed researchers to dramatically speed time-to-insight and model complex problems at a level they could not have before. We are inspired by the great discoveries that Frontier has already unlocked, from helping scientists analyze tens of thousands of COVID-19 mutations to enabling GE Aerospace to discover a new approach to advancing jet engine performance with better fuel efficiency. We also look forward to the future discoveries we anticipate when Aurora and El Capitan come online.
At HPE, we are honored to continue closely collaborating with the US Department of Energy, the Exascale Computing Project, and national laboratories to bring exascale technology to life and into the hands of researchers, scientists, and engineers that are solving problems to advance humanity.

Rob Neely, Program Director, Weapon Simulation and Computing, Lawrence Livermore National Laboratory

The Exascale Computing Project has empowered the entire US Department of Energy (DOE) lab system, including the most ambitious minds in computing, to work together toward a common goal: a highly capable set of exascale applications aimed at solving a gamut of scientific challenges. Moreover, all those applications were built upon an open software stack that will help US industry and academia follow the path that DOE helped blaze toward accelerated GPU-based computing.
These applications and software packages will serve the nation for decades, long after our current exascale systems are retired. And this generational shift to accelerated computing would not have been possible without our industry vendor partners at HPE/Cray, AMD, and Intel joining DOE in this pursuit. This was collaboration at its best.

Douglas Kothe, Associate Laboratories Director for Advanced Science & Technology, Sandia National Laboratories

I am confident that the Exascale Computing Project's pathfinding applications will be the science and engineering tools for the nation for decades to come, both through the insights and breakthroughs they will directly provide and by showing the way for hundreds of other applications that will benefit from accelerated-node architectures.

Mark Taylor, E3SM Chief Computational Scientist, Sandia National Laboratories

Exascale computing is allowing us to simulate the earth's climate at cloud resolving resolution. This is the resolution necessary to model the local water cycle, including storms, freshwater supplies, and droughts. This has been a long-sought goal of the climate community. It will improve our ability to predict, assess, and respond to the regional challenges of climate change.

Bronis R. de Supinski, Chief Technology Officer, Livermore Computing, Lawrence Livermore National Laboratory

El Capitan will soon be making important contributions to the ASC program as it meets its stockpile stewardship mission. The efforts across our community to enable exascale systems, including under the Exascale Computing Project (ECP), have been essential to that achievement. In particular, ECP's PathForward program, which I oversaw, enabled industry advancements that have largely overcome many of the exascale challenges laid out 10 years ago. For example, those investments led to the energy efficiency of large-scale systems increasing by over 20×, as tracked in the Green500 list.

Sunita Chandrasekaran, Principal Investigator of ECP's SOLLVE project, University of Delaware/Brookhaven National Laboratory

With the ever-changing architectural landscape, exascale computing and its carefully orchestrated hardware and interconnect technology have led to a paradigm shift in software technology! This includes recreating programming abstractions that represent key architectural features while still capturing the algorithm in the application as accurately as possible, leaving the finer details to the implementations. It is an art to strike a delicate balance between reaping the best performance out of the hardware and not losing portability, and this only gets more challenging as systems undergo radical changes.
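As an editorial illustration of the kind of abstraction being described (not part of the quote, and not drawn from any ECP code), here is a minimal C++ sketch using standard OpenMP target offload, the directive-based model that the SOLLVE project works on. The single pragma expresses the algorithmic intent and the data movement, while the compiler and runtime handle the hardware-specific details.

// Illustrative only: a generic OpenMP 5.x target-offload SAXPY, not ECP code.
#include <cstdio>
#include <vector>

int main() {
    const int N = 1 << 20;
    std::vector<float> x(N, 1.0f), y(N, 2.0f);
    const float a = 2.0f;
    float *xp = x.data();
    float *yp = y.data();

    // One directive captures the parallel loop and the data movement; the
    // implementation maps it onto whatever accelerator is present, or falls
    // back to the host if no device is available.
    #pragma omp target teams distribute parallel for \
        map(to: xp[0:N]) map(tofrom: yp[0:N])
    for (int i = 0; i < N; ++i)
        yp[i] = a * xp[i] + yp[i];

    std::printf("y[0] = %.1f (expected 4.0)\n", y[0]);
    return 0;
}

Built with an offload-capable compiler, the same source can target different vendors' GPUs; the balance described above lies in how much such abstractions expose hardware features without sacrificing that portability.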

Sunita Chandrasekaran, Principal Investigator of ECP's SOLLVE project, University of Delaware/Brookhaven National Laboratory

The impact of exascale is multifold: with a goal to advance science, exascale has pushed the boundaries of not only what hardware can do but also how software needs to be reimagined to efficiently utilize the rich hardware. These efforts only emphasize the need for co-design, and that implies strong communication between the application domains, hardware, and software technologies.

Steve Conway, President, Conway Communications

For existing applications, exascale computers can slash solution times and enable more realistic solutions. Exascale systems can also make it possible to run previously intractable grand challenge problems with enormous potential benefit to science, engineering, economic growth, and society at large.

Jack Dongarra, University of Tennessee

The Exascale Computing Project has been one of the most important development efforts I have been a part of over my career.
I'm sorry to see it end just when exascale is becoming a reality.

Jacqueline Chen, Sandia National Laboratories

The Exascale Computing Project has enabled the development of high-fidelity combustion simulations, including spray, soot, and thermal radiation multiphysics with geometric complexity of real devices. These unique simulation tools will address important societal issues associated with providing clean energy while mitigating climate change and enabling the development of high-speed propulsion applications for national security.
For example, first-principles direct numerical simulations of turbulent combustion with zero-carbon alternative fuels, hydrogen, and ammonia/hydrogen blends are providing physical insights needed by US industry to develop fuel and load flexible gas turbines with minimal emissions for dispatchable power and industrial heat, offsetting the intermittency of renewable energy from wind and solar. Similarly, high-fidelity, implicit, large eddy simulations of spray combustion in aero-engine combustors are enabling the development of predictive models for the stabilization of flames under fuel-lean conditions burning sustainable aviation fuels with different ignitability properties. Lastly, in the interest of providing national security, high-fidelity combustion simulations are being used to understand fundamental aspects of high-speed propulsion in scramjets and rotating detonation engines.

William M. Tang, Princeton University

In the midst of much current optimism as well as hype, we should keep in mind the fundamental importance of experimental validation for enabling true scientific progress, as emphasized by Richard Feynman: 'If it doesn't agree with experiment, it's wrong.'
Accordingly, exascale computing's exciting advances are now accelerating vital community access to AI/machine learning-ready data, together with efficient development of associated workloads utilizing such data at scale in key US Department of Energy missions, such as clean energy from magnetic fusion.

Phil Jones, Los Alamos National Laboratory

Exascale computing provides the additional capability needed to deliver more accurate and robust projections of future climate change and actionable information as we move into an uncertain climate future. It is another step toward our goal of achieving cloud-resolving simulations of the climate system.

Katrin Heitmann, Deputy Division Director, High Energy Physics, Argonne National Laboratory

The Exascale Computing Project (ECP) is having a tremendous impact in pushing our simulation capabilities to new levels. It enabled us to navigate the complexity of the exascale architectures and connected us to crucial efforts in the community that provided important tools to fully exploit the power of the Frontier and Aurora machines. The ECP empowered us on our journey to build new simulation capabilities to explore the physics of our universe.

Katrin Heitmann, Deputy Division Director, High Energy Physics, Argonne National Laboratory

Exascale-class computers are enabling our team to carry out the largest hydrodynamic cosmological simulations ever performed at a comparable resolution. These simulations will be invaluable to understanding the impact of baryonic physics on cosmological observables and helping us interpret results from next-generation surveys. The combination of data and simulations will tell us about the beginnings of the universe and evolution of the structures we observe today. Many puzzles await to be solved, and exascale computing will play a central role in their resolution.

Paul Kent, Computational Nanoscience Researcher, Oak Ridge National Laboratory

Exascale lets us perform previously impossible calculations. While previously we were regularly limited to single pure materials, we can now perform very high accuracy calculations on interfaces between different materials, including for the very challenging quantum materials. This capability is critical for fundamental physics studies through to devices and technological applications.

Paul Kent, Computational Nanoscience Researcher, Oak Ridge National Laboratory

Within the Exascale Computing Project, the QMCPACK project is helping to mature software and techniques for performance portability. These reduce the changes needed for a program to run on different computers with different architectures, so we can focus more on the science.

Heidi Hanson, Group Lead of Biostatistics and Multiscale Modeling in Advanced Computing for Health Sciences, Oak Ridge National Laboratory

Exascale computing has enabled near real-time extraction and harmonization of diagnosis data from clinical notes. The development of deep learning architectures for automated information extraction has enabled the National Cancer Institute's Surveillance, Epidemiology, and End Results (SEER) program to extract oncologic information from pathology reports in near real-time at the national scale, allowing rapid case ascertainment for epidemiologic studies.
The use of large language modeling for automated extraction and harmonization of information from clinical text has the potential to improve patient outcomes across a diverse range of diseases at a population scale.

Heidi Hanson, Group Lead of Biostatistics and Multiscale Modeling in Advanced Computing for Health Sciences, Oak Ridge National Laboratory

The Exascale Computing Project's Cancer Distributed Learning Environment (CANDLE) project supported the development of deep neural networks optimized for exascale platforms. The partnership between CANDLE and Modeling Outcomes using Surveillance Data and Scalable Artificial Intelligence for Cancer (MOSSAIC) was critical for the development, evaluation, and deployment of exascale-enabled computational tools for cancer.

Justin Whitt, Former Program Director, Oak Ridge Leadership Computing Facility, Oak Ridge National Laboratory

The US Department of Energy's investments in exascale computing will positively transform the way we live for many years to come. We are already benefitting from a broad spectrum of scientific advances focused on improving our health and our overall well-being. As public-private partnerships drive computing technologies forward, scientists are discovering new drugs, better therapies for diseases, and materials that will revolutionize manufacturing and energy production and storage. Supercomputers and scientific software are allowing researchers to unravel the complex phenomena associated with predicting extreme weather events and gain more insight into the fundamental mechanics of the universe.
The realization of exascale computing marks an advancement in technology that is crucial to improving the conditions of people in the United States and around the world, now and well into the future.

Rob Neely, Program Director, Weapon Simulation and Computing, Lawrence Livermore National Laboratory

While the National Nuclear Security Administration (NNSA) is still building out its first exascale system, El Capitan, here at Lawrence Livermore National Laboratory, we are getting tantalizing glimpses into what will be unleashed to support the national security mission with work we've been doing for the past year on early access systems that mirror the ORNL Frontier exascale-class supercomputer nodes. We expect for the first time in history that El Capitan will enable a potent combination of high-fidelity ensembles of 3D calculations. While none of those characteristics is new in isolation, only with exascale will we be able to fuse all of them into a single analysis and have results in hours and days, not weeks and months.
This will fundamentally change how system design and analysis is performed, especially as we increasingly incorporate more AI/machine learning into our workflows. These tools and platforms will be critical in our efforts to attack the challenging problems of material aging, discovery of new polymers and high explosives, design for manufacturability, assessment of hostile and abnormal environments, and options to help accelerate one of our biggest current challenges in NNSA: plutonium pit production.

Andreas S. Kronfeld, Distinguished Scientist and Principal Investigator of ECP's LatticeQCD Project, Fermilab

The exascale era will enable us to understand the strong nuclear force with unprecedented precision, allowing us to illuminate phenomena from high-energy collisions, to neutrino scattering, to nuclear structure. Exascale calculation of lattice quantum chromodynamics (the modern theory of the strong nuclear force) will complement a wide range of state-of-the-art experiments in high-energy physics and nuclear physics, leading to a breathtaking understanding of the fundamental interactions of matter.

Andreas S. Kronfeld, Distinguished Scientist and Principal Investigator of ECP's LatticeQCD Project, Fermilab

Advanced exploitation of the exascale machines, Frontier and Aurora, will be immediately possible thanks to the Exascale Computing Project (ECP). Support for domain scientists updating software and developing new algorithms, together with information about and later access to prototype hardware, made this possible. It is difficult to imagine life without ECP; some sort of successor will be needed to help researchers squeeze the most out of these amazing supercomputers.

Sanjiv Shah, Vice President of Developer Software, Intel

HPC has long unlocked proprietary, vendor-locked programming models to establish multiplatform industry standards. Now, exascale is driving accelerator programming with SYCL and oneAPI to all the popular accelerator platforms, just as OpenMP and MPI standardized parallel programming over the past 3 decades. SYCL/oneAPI solutions are becoming available on top-10 supercomputers as well as on petascale systems, such as Argonne's Polaris, running on a variety of architectures, including AMD, ARM, Intel, and NVIDIA.
More than 35 exascale applications have been ported, bringing to life oneAPI's vision for a single model to span different architectures. This will allow scientists to focus on science, not programming, to power the next generation of scientific discoveries.
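For readers unfamiliar with the model, here is a minimal, illustrative SYCL 2020 vector-add sketch (an editorial example, not taken from any of the ported applications). The same single-source C++ kernel can be compiled for CPUs or for AMD, Intel, and NVIDIA GPUs, depending on the available backend.

// Illustrative only: a minimal SYCL 2020 vector add, not an ECP application.
#include <sycl/sycl.hpp>
#include <cstdio>

int main() {
    constexpr size_t N = 1024;
    sycl::queue q;  // selects a default device: a CPU or any vendor's GPU

    // Unified shared memory is accessible from both host and device.
    float *a = sycl::malloc_shared<float>(N, q);
    float *b = sycl::malloc_shared<float>(N, q);
    float *c = sycl::malloc_shared<float>(N, q);
    for (size_t i = 0; i < N; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // The same kernel source runs on whichever backend the queue targets.
    q.parallel_for(sycl::range<1>(N), [=](sycl::id<1> i) {
        c[i] = a[i] + b[i];
    }).wait();

    std::printf("c[0] = %.1f on %s\n", c[0],
                q.get_device().get_info<sycl::info::device::name>().c_str());

    sycl::free(a, q);
    sycl::free(b, q);
    sycl::free(c, q);
    return 0;
}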

Sanjiv Shah, Vice President of Developer Software, Intel

The Exascale Computing Project is driving an accelerated path for science by growing development skills in portable programming models. Their investment in oneAPI ensures smooth migration of existing exascale applications as more GPUs, such as the Intel Data Center GPU Max Series, become available on the market. They have advanced the skills of hundreds of developers by hosting 20 hackathons, five dungeons, and five workshops, all directed at porting, optimizing, and tuning real science applications to be exascale ready. Enabling applications with portable models allows scientists to spend their time on science rather than isolated ports to the architecture du jour.

Ulrike Meier Yang, Leader of the xSDK4ECP (Extreme-scale Scientific Software Development Kit for ECP) and Group Leader, Mathematical Algorithms & Computing, Center for Applied Scientific Computing, LLNL

Exascale computing allows larger simulations and greater science results than were possible before. It could not have been achieved without the collaboration between application teams, software technology, and hardware developers forged by the Exascale Computing Project. It enabled new partnerships and improved software technologies and coordination between software used in combination. Application codes achieved significant performance improvements through their own code modifications and the improved libraries and tools.
Our efforts to achieve exascale computing throughout the last years have shown how far we can go when we all work together toward a common goal.

Frank Alexander, Project Director, ExaLearn, Argonne National Laboratory

With the exascale hardware and expansive software portfolio that the Exascale Computing Project has created, I fully expect significant, practical impacts throughout computational science, including within critical research areas (climate, energy systems design, health, and beyond) in the next few years. These advances, combined with the spectacular progress in AI, machine learning, and other advanced hardware platforms, make this the most exciting era for computing in my lifetime.

Addison Snell, CEO, Intersect360 Research

Look at the grand challenges of our time: mitigating climate change, developing new energy sources, fighting pandemics, surviving natural disasters, exploring the universe... Every one of these fields is supported by advanced computation, whether through scientific modeling or using AI to make advanced predictions. Exascale isn't about computing; it's about advancing.

Jean-Luc Vay, Physicist, Senior Scientist, Accelerator Modeling Program Head; Principal Investigator of ECP's WarpX Project, Berkeley Lab

Exascale computing is enabling us to conduct predictive simulations of novel concepts of charged particle beam production and acceleration that resolve the physics accurately in fully realistic 3D, which was not possible before. With this, we can explore new ideas that hold the promise to open new fields of application with more precise, more powerful, smaller, cheaper, and greener particle accelerators. We are also excited to use the WarpX code that we developed during the Exascale Computing Project for other fields, including fusion energy.

Axel Huebl, Computational Physicist, Principal Investigator, Software Architect for ECP's WarpX Project, Berkeley Lab

With the Exascale Computing Project (ECP), we were able to pioneer accelerator and plasma laser modeling for first-of-their-kind machines. The efficient management, the common goal to deliver an integrated exascale ecosystem, and the tight exchange between scientists across national labs, academia, and industry partners formed the most productive collaborations we have ever had. ECP's long-term view, with an execution spanning seven years, enabled us to form long-lasting partnerships within the United States and abroad and to establish highly scalable, integrated, open community software that is now highly sought after. By demonstrating leadership in being the first to exascale, this project attracted talent and accelerated progress in computational research, advancing the state of the art in the involved fields and readying the community to address increasingly complex scientific challenges (e.g., designing and operating complex particle accelerator facilities).

Douglas Kothe, Associate Laboratories Director for Advanced Science & Technology, Sandia National Laboratories

Exascale is yet another ephemeral milepost, and yet it isn't. I remember thinking back in the mid-1990s that solving for >1 million unknowns was crazy! It seemed like a moonshot to me at the time, as we (the US Department of Energy) worked hard to field a 1 TF system in the late 1990s capable of solving that system, which now seems "simple." Such is the fast-paced field of supercomputing: to think about a system 1,000× more powerful is incredible!

Douglas Kothe, Associate Laboratories Director for Advanced Science & Technology, Sandia National Laboratories

The Exascale Computing Project (ECP) has driven the creation and deployment of a game-changing Extreme-scale Scientific Software Stack (E4S, e4s.io) that embodies 100+ software components with advanced algorithms built and optimized for accelerated-node compute architectures ranging from laptop to desktop to clusters to exascale. E4S is an incredible technology that lowers the barrier of entry for any researcher interested in building and deploying new computational and data science applications. I am confident that E4S will continue to evolve and be one of the long-lasting legacies of ECP.

Michael Heroux, Director of Software Technology, ECP; Senior Scientist, Sandia National Laboratories

The Exascale Computing Project's (ECP's) efforts have created a set of applications that demonstrate performance improvements of 100× or more over the baseline established at the beginning of the project, revealing a path that many others can follow. Furthermore, ECP has created a set of libraries and tools, available via E4S, built with Spack, and accessible from source, at US Department of Energy facilities, in containers, and in the cloud, that have been part of this success, providing performance portability across NVIDIA, AMD, and Intel GPU architectures.
This 100× potential demonstrated by ECP is ready to be leveraged by many others. Some can leverage it on the way to their own leading-edge scientific breakthroughs. At the same time, since much of the 100× improvement comes from effective utilization of GPUs and these GPUs are available to scientists at the desktop all the way to the largest leadership systems, others can leverage ECP investments by reducing cost and energy consumption of previous computations by 100×. Reducing the energy costs of scientific computations by unlocking the potential of portable GPU performance will be one of the biggest legacies of ECP.

Kathryn Mohror, Lead for the NNSA Software Technologies Portfolio, ECP, Lawrence Livermore National Laboratory

This has been an exciting year getting access to Frontier. I'm looking forward to seeing what people can accomplish now that the system is more mature; the training wheels are off!

Elaine Raybourn, Applied Information Sciences, Sandia National Laboratories

With over 1,000 researchers participating in the Exascale Computing Project, the collective pursuit of exascale has provided many opportunities for multidisciplinary teams to work together across domains toward a common goal. We may not usually think of its impact in this way, but along the path to exascale, we have learned to be better collaborators and better scientists.

Christine Sweeney, Programming Models Team Deputy Team Leader, Los Alamos National Laboratory

Exascale computing has opened many opportunities for large-data, compute-intensive analytics for scientific applications. One example is single particle imaging for reconstructing images of viruses and proteins. For this domain, the computation needs to keep up with the ever-increasing data rates of experiment sources, such as the Linac Coherent Light Source at SLAC. This technology can potentially enable future scientific capabilities, such as videos of enzymes in action.

Amedeo Perazzo, SLAC National Accelerator Laboratory

Instead of AI, what if we were able to create effective artificial photosynthesis? Imagine the possibilities around efficient food production and reduced environmental impacts. X-ray free electron lasers (XFELs) are the path forward to this end, and much more. These lasers are very powerful tools allowing scientists to reveal fundamental processes in biology and materials by providing atomic resolution details on ultrafast timescales. Most of these experiments require complex and intensive calculations, called inverse problems, that allow scientists to reconstruct these atomic details from millions of diffraction images that don't look at all like the sample under examination.
Connecting XFELs with exascale computers allows scientists to resolve these inverse problems in quasi-real time, effectively equipping the XFEL with a camera lens. Without this lens, scientists would be operating these powerful cameras without being able to see the results for days or weeks.

Amedeo Perazzo, SLAC National Accelerator Laboratory

Anyone can use an exascale computer, but using an exascale computer effectively is a challenging feat. The size, complexity, and diversity of these systems demand new paradigms that require application developers to adopt tools to profile the performance of their codes, libraries that abstract the accelerators, high-performance parallel programming frameworks, and many more.
The Exascale Computing Project (ECP) has not only granted scientists the resources to port and optimize some of the most critical scientific applications, tools, and libraries on exascale systems, but it has also fostered an environment where these different communities were encouraged to collaborate with each other. ECP has been instrumental in the ability of the scientific community to make effective use of these powerful systems from the get-go.

Philip Mocz, Computational Physicist, Weapons Simulation and Computing, Lawrence Livermore National Laboratory

Exascale computing is reshaping our field and our ability to simulate, model, and analyze complex phenomena. The Exascale Computing Project has been instrumental in ensuring its successful development in the national lab ecosystem. Exascale physics simulations unlock new parts of parameter space captured on a computer with detailed high-fidelity models, things that are just not possible with an order of magnitude less resolution. It's a game changer.

Tom Tabor, Owner, TCI Media (formerly Tabor Communications, Inc.)

1,000,000,000,000,000,000 calculations per second could only be attained via the system-wide, cohesive, coordinated effort of the ECP. Quite frankly, achieving exascale capability and capacity will impact and improve the quality of life of every human on the planet. This monumental accomplishment broadly speaks to the passion, commitment, and professionalism of the entire HPC community and ecosystem. It most certainly took the legacy and will of the entire village to get here.

Kathy Yelick, Exabiome PI, Berkeley Lab

Exabiome's work allows computational biologists to take advantage of high-performance computing systems, for which these tools didn't previously exist. This effort, combined with advances in exascale computing, will allow these researchers to do very large-scale analyses at a much more regular pace.

David McCallen, EQSIM, Berkeley Lab

We're doing things now that we only thought about doing a decade ago, like resolving high-frequency ground motions. It is really an exciting time for those of us who are working on simulating earthquakes.

Peter Nugent, ExaLearn, Berkeley Lab

We can actually look at a simulation of the universe and tell you what the cosmology is, using machine learning, very, very quickly. We did it well, and we did it training on some of the largest machines out there.

Peter Nugent, ExaLearn, Berkeley Lab

We've stuck our toe in the water with ExaLearn. I think things are just going to explode now with AI in science on exascale-sized computers.

Peter Nugent, ExaEpi, Berkeley Lab

To enhance the calibration, workflow, and optimal decision-making of ExaEpi, we must capture a wide enough range of disease types. With these additional contagions, we will have targeted airborne, waterborne, and vector-borne diseases, bacterial and viral diseases, and diseases that are seasonal or sensitive to local climate.

Axel Huebl, WarpX, Berkeley Lab

If you are working for such a long time on a project, how do you make this development sustainable? That is something that ECP really puts an emphasis on. Our codes outlive the machines most of the time, so we intentionally plan it in a way that we can continue in five years when the next machine is coming along, and we don't have to start from scratch and can optimize the code without having to rewrite everything.

The Exascale Computing Project (ECP) is a seven-year effort (2016-2023) created to develop the nation’s first capable exascale computing ecosystem.

ECP is a unique, multi-lab collaboration bringing together some of the brightest application, software, and computational experts from coast to coast.

This unprecedented DOE research, development, and deployment project demonstrates the talent and perseverance of more than 1,000 team members composed of scientists, researchers, project management experts, and participating US high-performance computing systems companies.

– 81 total research, development, and deployment projects
– 15 US Department of Energy laboratories, 6 of which are Core Partner Labs
– 57 university partners
– 36 industry organizations

Oak Ridge National Laboratory is managed by UT-Battelle LLC for the US Department of Energy