A numerical weather prediction model such as HAFS needs accurate initial weather conditions. So when a storm is brewing, EMC coordinates with NOAA’s Office of Marine and Aviation Operations, which sends reconnaissance planes into the developing storm to collect wind and rainfall data and capture a 3D view of it, he says.
EMC also uses satellite observations and pulls in real-time ocean data from autonomous sailing drones, underwater gliders and prepositioned buoys.
“All of the data collected is fed into the model in real-time, and we improve our forecast based on that information,” Tallapragada says.
NOAA’s supercomputers take about an hour and 40 minutes to produce a five-day forecast. NOAA runs both versions of HAFS four times a day for as many as seven tropical cyclones at any given time.
Because the supercomputers are busy, NOAA turns to Azure to run 31 experimental variations of HAFS — called a Hurricane Ensemble in Real-time on the Cloud (HERC) — for high-priority storms like Hurricane Helene. These variations use slightly different physics, assumptions and initial weather condition data, Tallapragada says.
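The ensemble idea behind HERC can be pictured in a few lines of code. The snippet below is a minimal illustration, not NOAA’s actual software: it generates hypothetical ensemble members by nudging an initial wind value and a physics parameter, then reports the spread of the resulting forecasts. Every name and number in it is an assumption made for illustration.

```python
# Minimal illustration of a forecast ensemble: each member gets slightly
# perturbed initial conditions and physics settings, and the spread of the
# members' outputs conveys forecast uncertainty. Hypothetical, not HAFS code.
import numpy as np

rng = np.random.default_rng(seed=0)
N_MEMBERS = 31  # HERC runs 31 experimental variations of HAFS

def toy_forecast(initial_wind_kt: float, drag_coefficient: float) -> float:
    """Stand-in for a full model run: returns peak wind after five days (knots)."""
    # Purely illustrative relationship between inputs and intensity.
    return initial_wind_kt * (1.0 + 0.4 * np.tanh(1.0 - drag_coefficient))

base_wind = 85.0  # observed initial intensity in knots (assumed)
base_drag = 1.0   # nominal physics parameter (assumed)

peak_winds = []
for member in range(N_MEMBERS):
    wind = base_wind + rng.normal(0.0, 3.0)   # perturb initial conditions
    drag = base_drag * rng.uniform(0.9, 1.1)  # vary a physics assumption
    peak_winds.append(toy_forecast(wind, drag))

peak_winds = np.array(peak_winds)
print(f"Ensemble mean peak wind: {peak_winds.mean():.1f} kt "
      f"(spread +/- {peak_winds.std():.1f} kt)")
```

The spread across members is the payoff: when the 31 variations disagree widely, forecasters know the outcome is less certain.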
A Cloud-Based Work Environment for Simulating Storms
The U.S. Army Corps of Engineers also uses models to simulate storms, relying on its Coastal Storm Modeling System (CSTORM-MS) to accurately assess risk and build levees, floodwalls and beach dunes to manage flooding.
USACE runs the model on a supercomputer operated by the Army Engineer Research and Development Center. But within the next year, the agency plans to adopt a hybrid approach and take advantage of Microsoft Azure to enhance its modeling operations, says Chris Massey, senior research mathematician at the Engineer Research and Development Center’s Coastal and Hydraulics Laboratory.
“It’s a very accurate model, but it requires a tremendous amount of computing power,” he says.
Accuracy is important because it lets USACE build a levee or floodwall at the right height to reduce a community’s flood risk while keeping the project as cost-efficient as possible, he says.
“It allows for engineering margins of safety, but if you don’t need to build it above a certain height, then you can save on construction costs,” Massey says.
A large simulation that runs CSTORM-MS on 3,000 CPUs typically takes six hours to complete. However, the on-premises supercomputer is a shared resource; if it is operating at capacity, USACE can run the modeling system in Azure instead, Massey says.
“This allows me to quickly surge capacity in the cloud for a short time to meet a deadline,” he says.
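That surge decision amounts to a simple dispatch rule. The sketch below is hypothetical: the queue-status and submit functions are invented stand-ins, not a real scheduler or Azure API, and it only illustrates the “burst to the cloud when the supercomputer is full” logic Massey describes.

```python
# Hypothetical burst-to-cloud dispatch: run CSTORM-MS on the on-premises
# supercomputer when enough nodes are free, otherwise surge to Azure to
# meet a deadline. The query/submit functions are stand-ins, not real APIs.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    cpus: int            # e.g., 3,000 CPUs for a large CSTORM-MS simulation
    walltime_hours: int

def onprem_free_cpus() -> int:
    """Stand-in for querying the shared HPC scheduler's queue status."""
    return 1200  # pretend the shared supercomputer is mostly busy

def submit_onprem(job: Job) -> str:
    return f"on-prem job '{job.name}' queued on {job.cpus} CPUs"

def submit_azure(job: Job) -> str:
    return f"cloud job '{job.name}' launched on {job.cpus} CPUs in Azure"

def dispatch(job: Job) -> str:
    # Surge to the cloud only when the shared supercomputer can't fit the job.
    if onprem_free_cpus() >= job.cpus:
        return submit_onprem(job)
    return submit_azure(job)

print(dispatch(Job(name="cstorm-large-run", cpus=3000, walltime_hours=6)))
```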
USACE is currently building a cloud application on Azure that will serve as a next-generation work environment to run the model.
Instead of using software installed on a powerful desktop computer, as they have in the past, USACE staff will log in to the new cloud-based work environment and choose between the on-premises supercomputer and Azure to set up the model, run simulations and analyze the results, he says.
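One way to picture that workflow is as a thin driver that takes the user’s backend choice and walks through the same setup, simulation and analysis steps either way. This is a speculative sketch only; the function names, backend labels and placeholder numbers are invented for illustration and are not USACE’s actual application.

```python
# Speculative sketch of the cloud-based work environment's workflow: the user
# picks a compute backend, and the same setup -> simulate -> analyze pipeline
# runs against it. All names and values are illustrative, not USACE software.

def set_up_model(backend: str, storm_suite: str) -> dict:
    print(f"Configuring CSTORM-MS for '{storm_suite}' on {backend}")
    return {"backend": backend, "suite": storm_suite}

def run_simulations(config: dict) -> list[float]:
    print(f"Submitting simulations to {config['backend']} ...")
    return [2.1, 3.4, 2.8]  # placeholder peak surge heights, in meters

def analyze_results(surges: list[float]) -> float:
    design_height = max(surges) + 0.5  # illustrative engineering margin
    print(f"Recommended levee height: {design_height:.1f} m")
    return design_height

backend = "azure"  # or "onprem"; the choice staff make after logging in
analyze_results(run_simulations(set_up_model(backend, "coastal-storm-suite")))
```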
USACE plans to go live with the new work environment within six months to a year. In the meantime, the agency has tested CSTORM-MS in the cloud, and it’s ready to go.
“We evaluated the model to see how well it works in the cloud, and it turns out that it works quite well,” Massey says.