Sep 16 2020

NOAA’s Challenge to Improve Weather Forecast Modeling

New technology drives the National Oceanic and Atmospheric Administration’s push toward a crowdsourced approach to improving U.S. weather forecasting.

On Oct. 29, 2012, Hurricane Sandy took a sharp left turn over the Atlantic Ocean and slammed into New Jersey, bringing with it a powerful storm surge that struck the surrounding areas, including New York City, flooding streets and subways and knocking out power across much of the city. The devastating storm caused immense property damage and took 285 lives.

The toll likely would have been much worse without eight days of advance warning: the European Centre for Medium-Range Weather Forecasts (ECMWF) had correctly predicted the surprising left hook that Sandy took, giving residents time to prepare and preventing greater loss of property and lives.

On the American side of the Atlantic, the National Weather Service and National Hurricane Center confirmed Sandy’s trajectory only four days out from landfall. At the time, ECMWF’s technology made the difference, providing a more accurate forecast further in advance than NWS could. The resulting criticism of U.S.-based weather modeling led to a drive to upgrade the technology and get the U.S. forecast system on par with the European model — and perhaps even better.

Supercomputers at NOAA Get an Upgrade

Earlier this year, the National Oceanic and Atmospheric Administration took a key step toward that goal, announcing a major upgrade to the computing capacity, storage space and interconnect speed of its Weather and Climate Operational Supercomputing System. These computing improvements are poised to return the United States to the forefront of weather forecasting.

“We are committed to put America back on top of international leadership with the best weather forecasts, powered by the fastest supercomputers and world-class weather models,” Neil Jacobs, the acting NOAA administrator, said in a February NOAA announcement.

NOAA Gets the EPIC Backbone

Prior to this upgrade, the agency had access to a combined supercomputing capacity of 16 petaflops, relying on research and development supercomputers in Colorado, Mississippi, Tennessee and West Virginia.

With the acquisition of two new supercomputers from Cray (a Hewlett Packard Enterprise subsidiary), each with 12 petaflops of capacity, NOAA now has a combined 40 petaflops of supercomputing capacity for operational prediction and research. One of the Cray computers, the operational primary, will be located in Virginia, while the backup Cray will be stationed in Arizona.
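The capacity figures cited here add up straightforwardly; a trivial sketch (variable names are illustrative, not NOAA's):

```python
# Existing research and development systems (Colorado, Mississippi,
# Tennessee, West Virginia) provide a combined 16 petaflops.
research_pflops = 16

# Two new Cray systems: the operational primary (Virginia) and
# its backup (Arizona), each rated at 12 petaflops.
new_cray_pflops = [12, 12]

# Combined operational and research capacity after the upgrade.
total_pflops = research_pflops + sum(new_cray_pflops)
print(total_pflops)  # → 40
```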

The addition of the Cray supercomputers, at a cost of $505 million, will triple the computing capacity and double the storage and interconnect speed available for NOAA’s forecast model development, making it possible to build higher-resolution and more comprehensive Earth-system models using larger ensembles, advanced physics and improved data assimilation.

This will go a long way toward following through on the promise of an important NOAA initiative, the Earth Prediction Innovation Center (EPIC) program, born out of the National Integrated Drought Information System Reauthorization Act of 2018.

The Benefits of Crowdsourced Weather Modeling 

The goal of EPIC, according to DaNa Carlis, Office of Weather and Air Quality program manager for the Next Generation Global Prediction System/EPIC at NOAA, is “to accelerate scientific advancements from the research community to produce the most accurate and reliable operational modeling system in the world.”

This crowdsourcing approach to researching and developing new weather modeling will look beyond NOAA for contributors, working with federal laboratories, higher education, private companies and other members of the weather enterprise.

Taking one of its first steps toward a more collaborative approach to weather modeling, NOAA released its first batch of user-friendly code for medium-range weather prediction in March. Openly available to the public and to members of the weather community, this code will support another NOAA program, the Unified Forecast System, which is a collaboration-focused program being developed in tandem with EPIC.

This was an important step in the development of NOAA’s new community-driven approach to development, as the existing numerical weather prediction code was “unique to NOAA computers,” making it inaccessible to the wider weather enterprise, according to Jacobs in the March announcement.

Finding the Right Cloud Partner at NOAA

Later in March, NOAA released an RFP that will award $45 million over five years to a technology partner to help design and build the infrastructure for EPIC. The solicitation focused on software engineering, software infrastructure development and the ability to deliver world-class support services to EPIC’s stakeholders. The RFP closed in May, and an announcement of a single partner to oversee EPIC’s cloud-centric design is expected this fall.

NOAA’s drive to reclaim international leadership in numerical weather prediction is coming at an opportune time. The agency released its “State of the Climate: Global Climate Report for Annual 2019” in January of this year.

The year 2019 was the second-warmest year on record, with nine of the 10 warmest years having occurred since 2005. These sustained higher temperatures are yielding greater extremes in weather activity. NOAA’s increased focus on improving weather modeling in the United States will better prepare the nation for when the next hurricane strikes.
