UIFCW 2023 - Abstracts
Abstracts
Invited Presentations
Using UFS as a Teaching and Research Tool
Sarah Lu
This presentation shares our experiences in using NOAA’s Unified Forecast System (UFS) as a teaching and research tool for young scientists (graduate students and postdocs) at the University at Albany. The UFS’s Hierarchical System Development (HSD) enables university faculty members to train young scientists, conduct research studies, and potentially contribute to UFS development. The shared infrastructure, critical for research-to-operations-to-research (R2O2R) activities, enables the research community to conduct work ranging from simple tests to comprehensive experiments. This presentation will demonstrate this engagement through UFS-based aerosol studies. Our experiences should be applicable to other aspects of the UFS (e.g., other earth system components and data assimilation).
Integrating Social and Physical Sciences at the National Weather Service to Understand the Ready in Weather-Ready Nation
Valerie Were and Leticia Williams
The National Weather Service’s vision is a Weather-Ready Nation where society is prepared for and responds to weather, water, and climate-dependent events. Weather forecasts have improved significantly over the years, demonstrating advancements in knowledge of how forecasts can support the vision. However, accurate weather forecasts do not ensure optimal societal outcomes. There remains a gap in understanding the nation’s readiness for increasingly severe events. Filling this gap requires that the NWS integrate social science with ongoing, world-class physical science. In 2021, the NWS formally established a Social, Behavioral, and Economic Sciences (SBES) program in the Office of Science and Technology Integration to help with that synthesis. This talk will provide an overview of what SBES are, how they are integrated into the Weather Enterprise, and how they help the NWS realize its vision.
Data Management, Optimization, Compression
Milan Klöwer
The first data archives of weather and climate prediction centers will approach the 1 exabyte milestone within the next few years. However, climate data compression has not received enough attention to reduce storage and facilitate data sharing. Requirements for compression are manifold: Speed, size, and error have to satisfy various bounds that depend on the data, its application, possible research questions, production volume, and access frequency. The real information problem of lossy climate data compression is to find a truncation that allows qualitatively identical research results compared to the uncompressed data set. But current techniques do not distinguish the real from the false information in data. Here we define the bitwise real information content from information theory as the mutual information of bits in adjacent grid points. This automatically assesses a level of meaningful precision from the data itself. Many of the trailing mantissa bits in floating-point numbers occur independently with high information entropy, reducing the efficiency of compression algorithms. Applied to data from the Copernicus Atmospheric Monitoring Service (CAMS), most variables contain fewer than 7 bits of real information per value and are highly compressible due to spatio-temporal correlation. Rounding bits without real information to zero facilitates lossless compression algorithms and encodes the uncertainty within the data itself. The removal of bits with high entropy but low real information allows us to minimize information loss but maximize the efficiency of the compression algorithm.
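To make the mantissa-rounding idea above concrete, here is a minimal sketch (our own illustration, not the CAMS or production code): float32 values are rounded to a fixed number of kept mantissa bits, which zeroes the high-entropy trailing bits so that a standard lossless compressor can exploit the remaining redundancy. The choice of 7 kept bits and zlib are illustrative assumptions.

```python
# Illustrative sketch: round float32 data to a fixed number of mantissa bits so
# that trailing low-information bits become zero and compress well losslessly.
# The 7 kept bits and zlib are assumptions for this example, not the CAMS setup.
import zlib
import numpy as np

def round_mantissa_bits(a, keep_bits):
    """Round float32 values to `keep_bits` mantissa bits (add-half-and-truncate)."""
    bits = np.ascontiguousarray(a, dtype=np.float32).view(np.uint32)
    drop = 23 - keep_bits                      # float32 carries 23 mantissa bits
    half = np.uint32(1 << (drop - 1))          # rounding offset
    mask = np.uint32(0xFFFFFFFF ^ ((1 << drop) - 1))
    return ((bits + half) & mask).view(np.float32)

# A spatially correlated toy field: after rounding, the byte stream is far more
# redundant, so the lossless compressor produces a much smaller payload.
rng = np.random.default_rng(0)
field = np.cumsum(rng.normal(size=(256, 256)).astype(np.float32), axis=1)
raw_size = len(zlib.compress(field.tobytes()))
rounded_size = len(zlib.compress(round_mantissa_bits(field, keep_bits=7).tobytes()))
print(raw_size, rounded_size)
```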
Developing and Implementing Upgrades to the Met Office Unified Model
David Walters
As the UK’s National Met. Service, the Met Office develop and deliver operational weather and climate services underpinned by numerical weather prediction and climate projection systems based on the same underlying code. I describe the process used to develop the core modeling code (based on the Met Office Unified Model) and the Research to Operations process that we use to pull this through into operational systems and deliver their numerical output for use in our weather services.
In contrast to the US, most of the research and development leading to upgrades to our systems and their components (e.g., dynamical core, physics packages, data assimilation and observation processing) is performed either within the Met Office or through closely managed collaborations. This means that our model development challenge is largely about motivating the right development decisions and plans early on, being responsive to changes in user requirements, and then efficiently coordinating these developments into coherent improvements in a limited number of systems. This presents different challenges from those faced by organizations that source R&D from a broader group of suppliers, which increases diversity in both method and thought. I will discuss these differences and am keen to share best practices and learn from the experiences of others.
NOAA’s Testbeds and Proving Grounds: A Crucible for R2O2R
Andrea J. Ray
NOAA’s Testbeds and Proving Grounds (TBPGs) facilitate research to operations and feedback to research (R2O2R) via the development and pre-deployment testing of research for operations, as well as evaluation of suitability and operational readiness. Collectively and individually, they facilitate the orderly transition of research capabilities to operational implementation; they are thus crucial for the uptake of research into operations at NOAA and other partners, and ultimately for the realization of societal benefits. TBPGs are working relationships for developmental testing in a quasi-operational framework among researchers and operational scientists/experts (measurement specialists, forecasters, IT specialists, etc.). Typically this includes partners in academia, the private sector, and government agencies, and activities are aimed at solving operational problems or enhancing operations in the context of user needs. Increasingly, social and cognitive sciences are part of TBPG activities. TBPG activities are two-way interactions involving both R2O and O2R, and they are iterative: any particular project is generally tested multiple times and in multiple ways.
This presentation will provide a view into how Testbeds and Proving Grounds facilitate the research-to-operations pipeline and feedback to research in diverse ways. In particular, it will highlight the roles of forecasters and the assessment of usefulness in the evaluation efforts. Finally, it will discuss possible synergies of TBPGs with UFS and EPIC, how they might fit into the UFS funnel, and some challenges and opportunities.
Seeking Portability and Productivity for Numerical Weather Prediction Model Code
Christian Kühnlein
Achieving hardware-specific implementation and optimization while maintaining productivity in an increasingly diverse environment of supercomputing architectures is challenging and requires rethinking traditional numerical weather prediction model programming designs. We provide insights into the ongoing porting and development of ECMWF’s non-hydrostatic FVM atmospheric dynamical core option in Python with the domain-specific library GT4Py. The presentation highlights the GT4Py approach for implementing weather and climate models, shows preliminary high-performance computing results on CPUs and GPUs for FVM and other ECMWF-relevant code, and outlines the roadmap for the overall model porting project with partners at CSCS and ETH Zurich.
Operationalizing a Weather Forecasting System at Tomorrow.io
Luke Peffers, Ryan Honeyager, Xiaoxu Tian, JJ Guerrette, and Stylianos Flampouris
Tomorrow.io is a technology company that is revolutionizing the private weather industry by tackling the three primary components that are critical to operationalizing a weather forecasting system:
- observation systems,
- core numerical weather prediction (NWP) models, and
- information dissemination via a software as a service (SaaS) platform and API.

Tomorrow.io’s scientists and engineers bring a wealth of experience and expertise in all aspects of operational NWP systems.
The team is utilizing open-source codes such as the Weather Research and Forecasting (WRF) model, the Unified Forecasting System (UFS), and the Joint Effort for Data assimilation Integration (JEDI) next-generation data assimilation system. Tomorrow.io has enjoyed utilizing the WRF model in operations for over 4 years whereas efforts to operationalize the JEDI/UFS forecasting system started just over a year ago. The global JEDI/UFS infrastructure was chosen to enable the assimilation of Tomorrow.io’s constellation of satellite weather radars and sounders and forecasting at a global scale. The team has experience with the Model for Prediction Across Scales (MPAS) and still considers it a candidate for operations but will first launch the much more computationally-efficient UFS into operations.
The timeline for operationalizing the JEDI/UFS system is aligned with the launch schedule of Tomorrow.io’s nearly 30 satellites throughout 2023 and 2024. The first pathfinder radar (R1) is already in orbit and more will follow soon. Launches of the remaining scanning radars and microwave sounders will continue well into 2024. This gives the team time to work on:
- building the workflows and tuning the UFS ensemble system that will allow hybrid ensemble data assimilation and offer probabilistic forecasts,
- working with the public version of JEDI as the core system upon which the development of forward operators for Tomorrow.io’s radars and sounders will occur, and
- optimizing the HPC system needed to host the system in operations.
Once operational, the global forecasting system will feed into Tomorrow.io’s machine learning (ML) post-processing system (1-Forecast, a.k.a. 1F) to extract the maximum accuracy and benefit from Tomorrow.io’s satellite constellation and to enhance the probabilistic forecast analytics for our customers.
While this is a monumental task for the team, we are confident in our approach specifically because we are not building it from scratch. Instead, we are leveraging the open-source codes (UFS and JEDI), which have a wealth of support from scientists and engineers who have worked for decades on these technologies. However, there are challenges associated with this effort. Specifically, UFS and JEDI have not enjoyed the exposure to the global community of scientists and engineers that the WRF model has had over the decades. The UFS model’s community exposure is expected to increase drastically as the Earth Prediction Innovation Center (EPIC) matures, making UFS and its many auxiliary systems available and usable by the open community. JEDI and its associated systems are relatively new and will lag in their open community exposure, especially as much of the code is currently firewalled from non-government organizations.
Nonetheless, Tomorrow.io is using the tools as they are and filling in the gaps as needed. We feel that our experiences with WRF, UFS, and JEDI, along with our research experience with MPAS, serve as a good example of how the private sector is utilizing these models and making the most of the open-source efforts by our government agencies. We are eager to provide benefits back to our nation’s operational centers in the future as these open-source repositories allow community-based research to operations. This is where EPIC at its fullest potential is needed, and Tomorrow.io is doing its best to make that happen as a subcontractor on the EPIC contract, offering our expertise in multi-cloud/platform operationalization of weather models. We hope that our work serves as an example of the potential benefits of enabling the private sector not only to use these government-built systems but to improve them and bring novel ideas back to the government, thereby improving weather forecasting for our nation and the world.
The Need and Challenges of Interoperating UFS Across Multiple Clouds
Shuxia Zhang
For many private enterprises, NOAA’s Global Forecast System (GFS) has been a decision tool for the past several decades. As the UFS has upgraded the GFS with increased forecast skill and additional services, private industry’s demand for UFS products and services is tremendous. In this talk, we will discuss the needs and challenges of interoperating the UFS across multiple clouds, which arise from the characteristics of weather products and services: urgency and time sensitivity, huge data volumes, and complicated data structures, as well as the comprehensiveness of the FV3/GFS models, which require specialized computing hardware and software to process.
Learning from the Behavior of the UFS
Stefan Gary
Parallel Works provides customers with a uniform interface for high-performance clusters and workflows in the cloud; as such, Parallel Works customers may make decisions based on the output of the UFS while Parallel Works, internally, makes decisions about the behavior of the UFS. Tuning various cloud cluster parameters for faster weather model execution can make cloud clusters perform as well as or better than on-premises clusters; benchmarks with the UFS inform this tuning process and can ultimately result in cost savings. Furthermore, in the future, databases of provisioning lag time, spot pricing, and overall availability of specific cloud worker instance types can be leveraged to automatically guide users toward the cluster configurations that best match their cost and time demands for HPC workflows.
Improving Earth System Models via Hierarchical System Development
Mike Ek, Tracy Hertneky, Lulin Xue, Tara Jensen, Weiwei Li, Kathryn Newman, Louisa Nance, Xia Sun, Man Zhang, Ligia Bernardet, Jeff Beck, Grant Firl, Christiane Jablonowski, Cristiana Stan, and Lou Wicker
Hierarchical System Development (HSD) is an efficient way to effectively integrate the model development process, with the ability to test small elements (e.g., physics schemes) in an Earth System Model (ESM) first in isolation, then progressively connecting elements with increased coupling between ESM components and HSD steps. System in HSD is end-to-end: it includes data ingest/quality control, data assimilation, modeling, post-processing, and verification (e.g., via METplus). HSD includes Single Column Models (SCMs; including individual physics elements), small-domain and regional models, all the way to complex fully-coupled global ESMs with atmosphere/chemistry/aerosol, ocean/wave/sea-ice, land-hydrology/snow/land-ice, and biogeochemical cycle/ecosystem components. Datasets that are used for the different HSD steps are from observational networks and field programs, ESM output, or idealized conditions (e.g., used to “stress-test” ESM elements and components). To advance from one HSD step to the next requires appropriate verification metrics of ESM performance, many at the process level. It’s important to note that this process is concurrent and iterative such that more complex HSD steps can provide information to be used at simpler HSD steps, and vice versa. The HSD approach can also help understand spatial and temporal dependencies in model solutions, where consistency for different models and resolutions across HSD steps is required. The Common Community Physics Package (CCPP) is designed to lower the bar for community involvement in physics testing and development through increased interoperability, improved documentation, and continuous support to developers and users. Together, CCPP and its companion SCM, developed and supported by the Developmental Testbed Center, provide an enabling software infrastructure to connect HSD steps. The HSD approach and use of CCPP will be illustrated and discussed through the use of Testing and Evaluation examples. This work supports the NOAA Earth Prediction Innovation Center (EPIC) program.
Short-Range Weather Application and Rapid Refresh Forecast System
The Current State of NOAA’s Rapid Refresh Forecast System
Matthew E. Pyle, Curtis R. Alexander, Jacob R. Carley, and Stephen Weygandt
The Rapid Refresh Forecast System (RRFS) is a new regional 3-km ensemble and deterministic forecasting system, based on the Unified Forecast System (UFS), and nearing the end of a multi-year development effort. This development has benefitted from the talents of several NOAA laboratories, EMC, and academia. The RRFS is currently targeting an initial implementation into NWS operations in late 2024. Details of the RRFS analysis and modeling system will be reviewed, with an emphasis on some recent development work. Measures of RRFS skill relative to current operational modeling systems such as HRRR and HREF will be provided, and will highlight some of the ongoing challenges for RRFS to overcome ahead of an operational implementation.
Evaluation of the Rapid Refresh Forecast System During the 2023 HWT Spring Forecasting Experiment
Israel Jirak, Adam J. Clark, David Harrison, and Jake Vancil
The 2023 NOAA Hazardous Weather Testbed Spring Forecasting Experiment (SFE2023) was conducted from 1 May – 2 June with participation from forecasters, researchers, and model developers from around the world. The focus of SFE2023 was to evaluate the FV3-based Rapid Refresh Forecast System (RRFS) as a potential future operational replacement in the National Weather Service for the deterministic High Resolution Rapid Refresh (HRRR) model and the High Resolution Ensemble Forecast (HREF) system. Several deterministic and ensemble evaluations were conducted to compare the performance of the RRFS to the operational baselines and other CAM forecasts. For the deterministic evaluations, the 0000 and 1200 UTC runs of the RRFS were compared to the operational HRRR for Day 1 (i.e., valid f12-f36 and f00-f24, respectively); the 2100 and 0000 UTC runs of the RRFS were compared to the operational HRRR for the first twelve hours (i.e., valid f00-f12); and the 0000 UTC runs of the RRFS were compared to other deterministic CAMs in a blind evaluation for Day 1 (i.e., valid f12-f36) and Day 2 (i.e., valid f36-f60). For the ensemble evaluations, the 0000 UTC runs of the single-physics RRFS were compared to the HREF for Day 1 (i.e., valid f12-f36) and single-physics, mixed-physics, and time-lagged versions of the 1200 UTC RRFS were compared to the HREF for Day 1 (i.e., valid f00-f24) and Day 2 (i.e., valid f24-f48). The subjective evaluation results of these RRFS forecasts from the SFE2023 will be discussed, offering evidence regarding the optimal ensemble configuration and overall operational readiness of the RRFS for severe weather forecasting.
Assessment of Convective-Scale Attributes of the FV3 Dycore Using Idealized Simulations
Louis Wicker
Over the past several years the Rapid Refresh Forecast System (RRFS) component of the Unified Forecast System (UFS) has been continuously under development and evaluation by a number of NOAA units (i.e., the Global Systems Laboratory (GSL), the Environmental Modeling Center (EMC), and the National Severe Storms Laboratory (NSSL)). The RRFS is configured to replace the current High Resolution Rapid Refresh (HRRR) system sometime during late 2024. RRFS development by these NOAA units has resulted in considerable advancement and improvement in its forecasting performance. During the warm season, however, the depiction of individual convective storms and their characteristics has shown some noticeable differences from our current operational convection-allowing models (CAMs) such as the HRRR.
The Role of Convective-Scale Static Background Error Covariance in FV3-LAM-Based Hybrid EnVar for Direct Radar Reflectivity Data Assimilation Over the CONUS
Yue Yang, Xuguang Wang, and Yongming Wang
Previous work has shown that the utilization of a convective-scale static background error covariance (BEC) matrix in the hybrid ensemble-variational (EnVar) method can improve convective-scale analysis and prediction for the WRF-ARW (Wang and Wang 2021). In this study, the convective-scale static BEC for the FV3 limited-area model (FV3-LAM) is further developed. In contrast to Wang and Wang (2021), which includes all cross-variable correlations, this study selects and retains only the cross-variable correlations most critical for convective scales, saving computational cost without degrading the analysis and short-term forecast performance. The new BEC matrix developed for the FV3-LAM is employed to directly assimilate radar reflectivity within the GSI-based three-dimensional variational (3DVar) and EnVar frameworks, emulating the future operational Rapid Refresh Forecast System (RRFS). The role of the static BEC is explored for a case study of severe convective storm systems over the Great Plains on 26-27 May 2021.
Experiments using the full static BECs (3DVAR), the full ensemble BECs (PURE), and the blended BECs with a static/ensemble covariance weight of 30%/70% (HYBRID) are conducted. Detailed comparisons and evaluations indicate that PURE produces less weak spurious reflectivity over the Northern Plains than 3DVAR. Although 3DVAR is much cheaper than PURE, it outperforms PURE in adding the missed cell over Kansas. Compared to PURE, HYBRID improves the analyses and forecasts by maintaining the advantages of both PURE and 3DVAR. However, HYBRID produces weak spurious reflectivity in the early stage of the forecasts, similar to 3DVAR. To optimize the hybridization, the same blended BECs in HYBRID are adaptively applied depending on the ensemble quality in HYBRID_CR, where the weak spurious reflectivity is suppressed and the improved forecast skill is maintained. Efforts on further optimizing the hybridization are ongoing and will be presented at the workshop.
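For orientation, the weighting above follows the standard hybrid form of the background error covariance used in EnVar; the following is a sketch in our own notation (implementations differ in whether weights are applied to the covariances or their square roots), with the 30%/70% weights quoted above:

$$
\mathbf{B}_{\mathrm{HYBRID}} \;=\; \beta_{s}\,\mathbf{B}_{\mathrm{static}} \;+\; \beta_{e}\,\bigl(\mathbf{C}\circ\mathbf{P}^{b}_{\mathrm{ens}}\bigr),
\qquad \beta_{s}=0.3,\;\; \beta_{e}=0.7,
$$

where $\mathbf{B}_{\mathrm{static}}$ is the convective-scale static BEC, $\mathbf{P}^{b}_{\mathrm{ens}}$ the ensemble-estimated covariance, $\mathbf{C}$ the localization matrix, and $\circ$ the Schur product. The 3DVAR and PURE experiments correspond to $(\beta_{s},\beta_{e})=(1,0)$ and $(0,1)$, respectively.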
Recent Collaborative Development of the Three Dimensional Real-Time Mesoscale Analysis (3DRTMA) Using the Short-Range Weather Application
Terra Ladwig, Manuel Pondeca, Guoqing Ge, Ed Colon, Craig Hartsough, Matthew Morris, Ming Hu, Annette Gibbs, Raj Panda, Jim Abeles, Gang Zhao, Jim Purser, Miodrag Rancic, Curtis Alexander, Jacob Carley, Steve Weygandt, and Israel Jirak
Short-term forecasting (i.e., 0-3 hours), including nowcasting, requires a timely, accurate, and rapidly updating analysis of current atmospheric conditions, both for general situational awareness of the current environmental conditions and to enable forecasters to extrapolate those conditions into the future. Severe and hazardous weather phenomena can evolve on time scales of minutes, and are highly sensitive to subtle thermodynamic changes in the boundary layer. The Weather Enterprise is thus inadequately served by analysis systems that typically update hourly, have larger-than-desirable grid spacing, contain only a small set of 2-D fields, and/or only leverage surface observations.
Towards Consistent Wind-Wave-Current Analysis Products in the 3DRTMA
Malaquías Peña, Stylianos Flampouris, Enrique Curchitser, Leonel Romero, Manuel Pondeca, Jacob Carley, Daryl Kleist, Isidora Jankov, and Guillaume Vernieres
The two-dimensional RTMA suite, implemented in 2006 with numerous upgrades through 2020, has been a cornerstone modeling scheme for guidance, monitoring, and forecast verification in its role as the analysis of record. With the inclusion of Significant Wave Height (SWH) in the 2018 implementation, the RTMA expanded its capabilities to support NWS forecast offices in charge of providing coastal, marine, and ocean analyses with estimates of the state of waters at the margins of the continental U.S. Leveraging the ongoing NOAA developments of the RRFS, the EMC Data Assimilation Team, the UFS, and the JEDI project, we are conducting a research project to advance the science and technology of data assimilation of surface wind, sea waves, and surface ocean currents for the 3DRTMA. Analysis of marine variables requires effective exploitation of observational information and the integration of models that capture key processes and air-sea interactions occurring over a wide range of spatial and temporal scales. The proposed project aims to improve the blending of global and local-scale weather and oceanographic variability by expanding protocols to include marine observations, using advanced data assimilation schemes, and applying sound scientific principles of mesoscale modeling. Its outcomes will provide mesoscale data to a community of practice for observing-data impact, data integration, and coastal model evaluation. Four key analysis enhancements are addressed: physical consistency in the three-way wind-wave-current interaction, reduced bias of the background fields, higher fidelity of the fields, and improved analysis uncertainty. This talk will present an update on the data gathered and the quality control systems for marine and ocean variables, especially High-Frequency radar and satellite altimetry data; progress on the implementation of an appropriate configuration of the regional MOM6 ocean model to provide first-guess fields of surface currents; progress and challenges in implementing a bias-correction scheme for positional and amplitude error reduction in the background fields; and progress in developing and evaluating configurations of the background error covariance across the three fields.
Evaluations of Three Regional MPAS Configurations for Severe Weather Forecasting Applications During the 2023 NOAA/Hazardous Weather Testbed Spring Forecasting Experiment
Adam Clark, Kent Knopfmeier, Yunheng Wang, Larissa Reames, Israel Jirak, Louis Wicker, Pamela Heinselman, David Dowell, Craig Schwartz, Michael Duda, William Skamarock, and Patrick Burke
The Warn-on-Forecast initiative at the National Severe Storms Laboratory aims to extend warning lead times for severe weather hazards using an on-demand, adaptable domain and a rapidly updating high-resolution ensemble analysis and forecast system that assimilates radar, satellite, and other observations every 15 minutes. The current prototype of this model, the Warn-on-Forecast System (or WoFS), uses the Advanced Research WRF (ARW) configuration of the Weather Research and Forecasting model. However, NSSL has begun exploring alternative model cores for a next-generation version of WoFS that would (1) accommodate further refinements in grid-spacing (i.e., ≤ 1 km), (2) accommodate advances in data assimilation, and (3) fit within the framework of NOAA’s Unified Forecast System (UFS) initiative. Tests with the Finite Volume Cubed Sphere model (FV3) consistently yielded spurious storms at model initialization, inability to recover from early imbalances, and unrealistic storm characteristics. Thus, in collaboration with NCAR, NSSL has begun to explore the Model for Prediction Across Scales (MPAS) for its next-generation WoFS. The first step in this process is testing the model at “Day 1” lead times (i.e., 12-36 h forecasts) to assess performance characteristics relative to the current operational baseline of the High-Resolution Rapid Refresh (HRRR) model, as well as the Rapid Refresh Forecast System (RRFS), which is an FV3-based system tentatively scheduled to replace the HRRR in 2024. For these tests, three CONUS-domain, 3-km grid-spacing MPAS configurations were developed at NSSL: (1) MPAS HT, (2) MPAS HN, and (3) MPAS RT. In these names, the last two letters denote the initialization dataset and microphysics scheme, respectively. “HT” is HRRR/Thompson, “HN” is HRRR/NSSL, and “RT” is RRFS/Thompson. All three configurations use the MYNN boundary layer parameterization, RUC land surface model, and RRTMG short and long wave radiation. These configurations will run in real-time during the 5-week (1 May – 2 June) 2023 NOAA/Hazardous Weather Testbed Spring Forecasting Experiment, and daily model evaluation activities will assess their performance characteristics alongside the HRRR, RRFS, and other experimental systems. This talk will present preliminary results from these evaluations and highlight notable cases of interest.
The Rapid Refresh Forecast System: Looking Beyond the First Operational Version
Curtis Alexander
The Global Systems Laboratory (GSL) in NOAA seeks to improve numerical weather prediction (NWP) across all weather hazards including severe convective weather, intense rainfall, winter storms, landfalling tropical systems and other small-scale phenomena such as smoke from wildfires. This model development is continuing within the Unified Forecast System (UFS), including the upcoming operational transition of the first version of the Rapid Refresh Forecast System (RRFS) in collaboration with NOAA’s Environmental Modeling Center (EMC).
The RRFS will be NOAA’s flagship hourly updating, convection-allowing deterministic and ensemble prediction system and, with its implementation, will facilitate the retirement of several operational convection-allowing modeling (CAM) systems in the present production suite. This new 3-km system will extend over a large North American domain. Through support from the UFS-R2O project along with other collaborations across the UFS CAM application team, significant development progress has been made with the RRFS towards the first planned operational implementation in 2024 including a freezing of the first configuration during 2023.
With the first version code freeze approaching later this year, many additional RRFS capabilities are being planned for implementation after the first version and this presentation will provide an overview of RRFS version two and beyond. These additional capabilities will include enhancements to the data assimilation design including a transition to Joint Effort for Data Assimilation Integration (JEDI), enhanced multiscale assimilation techniques and coupling of earth system components. Complementing data assimilation advancements will be model physics improvements for more scale-aware adaptivity, sub-grid cloud interactions and other component updates such as improved dynamics.
Medium-Range Weather and Subseasonal to Seasonal Applications
Overview of the Next Global Forecast System GFSv17
Jessica Meixner, Catherine Thomas, Jongil Han, Geoffrey Manikin, Hui-Ya Chuang, Rahul Mahajan, Jun Wang, Neil Barton, Alicia Bentley, L. Gwen Chen, Yali Mao, Wen Meng, Raffaele Montuoro, Lydia Stefanova, Guillaume Vernieres, Jiande Wang, and Yuejian Zhu
The Environmental Modeling Center (EMC) is working towards the next operational implementation of the Global Forecast System, GFSv17. This implementation leverages the UFS Application for Medium-Range Weather (MRW) and Subseasonal-to-Seasonal (S2S) prediction and the UFS community at large. A major goal for GFSv17 is to employ a UFS-based fully coupled atmosphere-land-ocean-ice-wave model. The non-atmospheric components of the weakly coupled data assimilation in the Global Data Assimilation System (GDAS) will be based on the Joint Effort for Data assimilation Integration (JEDI) software. The GFS will be evaluated in the context of retiring the North American Mesoscale (NAM) and Rapid Refresh (RAP) models, and the marine GDAS components will replace the Global Ocean Data Assimilation System (GODAS), which would result in further simplification of the overall modeling suite. Lastly, we plan to have greater alignment with the Global Ensemble Forecast System (GEFS) in both model and infrastructure development. In this presentation, we will provide an overview of the GFSv17 system, our current development status, and future plans, and we will highlight the connections to the UFS community.
Evaluation of High-Resolution Prototypes for the Next Global Forecast System GFSv17
Lydia Stefanova, Jongil Han, Wei Li, Jessica Meixner, Jiayi Peng, Sulagna Ray, Mallory Row, and Catherine Thomas
Plans are underway at NOAA/NCEP/EMC to implement an upgrade of the current operational Global Forecast System, GFSv16. The operational GFSv16 is a global deterministic forecast model with the wave model one-way coupled to the atmosphere model, and prescribed ocean and sea ice forcing. The next version, GFSv17, will be a coupled atmosphere, land, ocean, sea-ice, and wave model.
Building on the previously conducted Prototypes 1-8, the same principles are used to design high-resolution prototypes, which target GFSv17 development. The first prototype of these, the High Resolution Prototype 1 (HR1), consists of three sets of runs: Winter, Summer and Hurricane. The Winter (Dec 2019-Feb 2020) and Summer (Jun-Aug 2020) sets consist of 16-day forecasts, initialized three days apart at 00z. The Hurricane set, consisting of 7-day forecasts, is initialized every day at 00z and spans the 20 Jul 2020-20 Nov 2020 period to allow assessment of forecast tropical cyclone intensity and tracks. In this presentation, we will discuss the configuration details of the high-resolution prototypes in comparison with the operational GFSv16 and present an evaluation of the biases and forecast skill in the two systems. The evaluation will focus on sensible weather, surface temperatures, radiative fluxes, upper air circulation, MJO, and tropical cyclone tracks and intensity. We will highlight the comparative improvements/degradations of forecast quality between the two systems, and discuss directions for further improvement.
Improving the Representation of Tropical Variability and Its Large-Scale Teleconnections in NOAA’s Unified Forecast System
Lisa Bengtsson, Juliana Dias, Maria Gehne, and Kyle Hall
Tropical weather acts as an engine for Earth’s atmospheric circulation; therefore, correctly modeling the seasonal and year-to-year variations in this region is crucial for improving predictions of weather and climate across the world. Atmospheric variability in the tropics is primarily driven by equatorial waves interacting with moist convective processes. These “convectively coupled” equatorial waves are important not only for the tropics, but for global subseasonal to seasonal predictions due to tropical-to-extratropical teleconnections. However, convectively coupled equatorial waves have been a major modeling challenge from weather to climate scales because the onset and propagation of these waves depends on processes that are only partially accounted for in global prediction systems. We here present research aimed at improving the coupling between cumulus convection and equatorial waves within NOAA’s Unified Forecast System (UFS) that has been transitioned to the UFS operational prototypes of GFSv17 and GEFSv13 – these updates to the convective parameterizations include representation of sub-grid convective organization using cellular automata, improved moisture coupling, stochasticity and prognostic evolution. We then use experimental versions of the UFS with improved tropical variability to discuss the imprints of these advancements for predictions outside the tropics.
Convectively Coupled Equatorial Wave Skill in the Unified Forecast System
Maria Gehne and Juliana Dias
Tropical precipitation and circulation are often coupled and span a vast spectrum of scales from a few to several thousands of kilometers and from hours to weeks. Current operational numerical weather prediction (NWP) models struggle with representing the full range of scales of tropical phenomena. Synoptic to planetary scales are of particular importance because improved skill in the representation of tropical larger scale features such as convectively coupled equatorial waves (CCEWs) have the potential of reducing forecast error propagation from the tropics to the midlatitudes.
Here we evaluate CCEW skill in two sets of model forecasts. First, two recent versions of NOAA’s Unified Forecast System (UFS): operational GFSv15 forecasts and experimental GFSv16 forecasts from April through October 2020. And second, several versions of the subseasonal-seasonal (S2S) component of the UFS: coupled prototypes 5, 7 and 8.
Results show overall better initial CCEW skill in the coupled prototypes than in the operational forecasts, indicating a positive impact from coupling to an ocean model. Kelvin and mixed Rossby-gravity wave skill falls below 0.5 by a lead time of 48 h, while Equatorial Rossby wave and Madden-Julian Oscillation forecasts retain skill out to lead times of 96-144 h in some cases. In general, CCEW precipitation skill increases somewhat for newer model versions; however, the increase is not statistically significant, leaving room for further improvement.
Wintertime Diabatic Heating Biases in UFS Prototype-P8
Benjamin Cash, Chul-Su Shin, Erik Swenson, and David Straus
We analyze a series of integrations performed by EMC using the Unified Forecast System (UFS) Prototype-P8 (P8) configuration, as well as an ensemble of P8 integrations performed at George Mason University (GMU). Diabatic heating is calculated through the residual method (Swenson and Straus, 2021) for both the UFS and the ERA5 reanalysis. We find that while agreement between the UFS and ERA5 is relatively good at lower levels, diabatic heating in the UFS is significantly more negative in the region between 200 and 50 mb than is found in ERA5. Differences are particularly acute in the region of the Pacific storm track. This discrepancy can be further isolated to differences in the vertical advection term.
The Operational Use and Local Development of the UFS MRW-GSI System at the Central Weather Bureau of Taiwan
Guo-Yuan Lien, Ling-Feng Hsiao, Chang-Hung Lin, Feng-Ju Wang, Yu-Han Chen, Jen-Her Chen, Jing-Shan Hong, Daryl Kleist, Fanglin Yang, Vijay Tallapragada
Since 2016, in collaboration with the U.S. National Centers for Environmental Prediction (NCEP), the Central Weather Bureau (CWB) of Taiwan has been working to adapt NCEP’s actively developed Global Forecast System (GFS), based on the FV3 dynamical core, with Gridpoint Statistical Interpolation (GSI)-hybrid data assimilation for operations at the bureau. After several years of work, development of the first CWB-localized version of the system has been completed, and it is scheduled for transition to operations in 2023. This system is mainly based on NCEP GFS version 15.1 and the corresponding version of the GSI. A full hybrid 4DEnVar data assimilation cycle workflow is adapted. The deterministic forecast is run at a C384 (about 25 km) horizontal resolution and the ensemble members are run at a C192 (about 50 km) resolution, both of which are half of the NCEP GFS v15.1’s operational resolution. In addition, this system includes a few of CWB’s local developments that differentiate it from the NCEP GFS v15.1. Since the model component of the system later became part of the Unified Forecast System (UFS) Medium-Range Weather (MRW) Application, the CWB may be regarded as one of the UFS MRW’s early adopters for research and operations in the Western Hemisphere. In this presentation, we will briefly describe this CWB-localized GFS, present the system’s forecast skill in real-time operation, and share our experience and gratitude in using these UFS community-based tools for building an operational prediction system in a weather prediction center.
Introducing the Next Global Ensemble Forecast System for Weather, Subseasonal, and Monthly Predictions
Yuejian Zhu, Bing Fu, Hong Guan, Eric Sinsky, Xianwu Xue, Neil Barton, Philip Pegion, and Avichal Mehra
During the past few years, the Unified Forecast System (UFS) fully coupled Global Ensemble Forecast System (GEFS) has been developed and tested for the next implementation, GEFS version 13. GEFSv13 will integrate and couple six components: atmosphere, land, wave, ocean, sea ice, and aerosol. The GEFSv13 configuration of each model component is the same as in GFSv17, except for lower horizontal resolution in the atmosphere and land and the addition of the aerosol component. GEFSv13 will introduce initial perturbations from the early cycle of a coupled EnKF for the atmosphere, land, and ocean. GEFSv13 will run 1 unperturbed member (control) and 30 perturbed members, out to 48 days at 00 UTC and 16 days at 06, 12, and 18 UTC. A 30-year (1994-2023) reforecast will be generated to support real-time GEFSv13 forecasts. This reforecast will be initialized with a coupled “replay” to ERA5 in the atmosphere and ORAS5 in the ocean, with the GFSv17 analysis used for the real-time forecasts. The reforecast is configured with 5 members at 00 UTC each day, out to 16 days, except on Mondays and Thursdays, when 11 members are run out to 35/45 days. The presentation will introduce the final configuration and a performance evaluation of GEFSv13 for weather, subseasonal, and monthly forecasts.
Developing a Strongly Coupled Land-Atmosphere Data Assimilation for UFS With JEDI
Zhaoxia Pu and Qien Huang
The overarching goal of this study is to implement a strongly coupled land-atmosphere data assimilation capability into the NCEP Unified Forecast System (UFS) using the JCSDA Joint Effort for Data Assimilation Integration (JEDI), focusing on the following areas:
- Implement into JEDI the capability to assimilate in-situ and satellite-measured soil moisture (e.g., SMAP, SMOS, etc.), conventional surface observations, and surface Mesonet observations (e.g., 2-m temperature and humidity, 10-m wind), along with other atmospheric observations into the UFS through a strongly coupled land-atmosphere data assimilation framework with the Noah-MP land surface model and GFSv17;
- Examine the influence of the strongly coupled land-atmosphere data assimilation system on near-surface and boundary layer atmospheric conditions as well as their impacts on severe weather forecasting;
- Evaluate the impacts of the strongly coupled land-atmosphere data assimilation system on medium-range weather forecasting and subseasonal to seasonal (S2S) prediction.
In early development, we conducted ensemble forecasting experiments with various land surface perturbations. We evaluated the effectiveness of different perturbation methods for medium-range weather prediction using the UFS model. We also examined the cross-covariances between land and atmosphere. The Soil Moisture Active Passive (SMAP) satellite-derived soil moisture observations and near-surface atmospheric data (2-m temperature, humidity, and 10-m wind) are simultaneously assimilated into a coupled land-atmosphere data assimilation configuration. The impacts of the coupled data assimilation on medium-range weather forecasts, especially the prediction of near-surface atmospheric conditions with the UFS model, are examined. Progress, ongoing work, and future plans will be presented.
Demystifying NCEP’s Global Workflow [GFS]
Rahul Mahajan, Walter Kolczynski, Kate Howard, Xianwu Xue, Cory Martin, Henry Winterbottom, Terry McGuinness and many others
The global workflow is the overarching application that controls the end-to-end orchestration of NCEP’s Global Forecast System (GFS). In this presentation, we will review a brief recent history of the development of the global workflow and its use in research and operational environments. We will outline its current capabilities for supporting research and development of the fully coupled UFS Weather Model (UFS WM) and its components, as well as weakly coupled data assimilation (DA) utilizing a variety of techniques ranging from variational to hybrid ensemble-variational via the Gridpoint Statistical Interpolation (GSI) and the Joint Effort for Data assimilation Integration (JEDI). Furthermore, the global workflow also employs a full suite of verification, validation, and monitoring tools for the evaluation of research experiments. We will outline current efforts in developing and employing a framework for incoming new capabilities, as well as progress on automated testing and migration to the cloud.
Better Use of Ensembles in Operations Through Clustering and Ensemble Sensitivity Analysis
Austin Coleman, Brian A. Colle, James Nelson, and Travis Wilson
Operational ensembles help forecasters determine the uncertainties of impactful weather phenomena, which need to be communicated to the public and various stakeholders. However, the standard ensemble mean and probability plots do not help forecasters communicate potential scenarios, as important nuances can be washed out amongst ensemble membership. One solution to this issue is to cluster ensemble members into groups, which reduces the amount of data that needs to be assessed while providing the forecaster with a representative view of the ensemble data.
A clustering approach, originally developed for a NOAA-CSTAR collaboration over the Northeast U.S., has been made more operational at the Weather Prediction Center (WPC). It uses ensemble-spread characteristics as represented by the leading Empirical Orthogonal Function (EOF) patterns of the spread in 500-hPa geopotential height. The EOFs are calculated across the model ensemble member dimension (GEFS+CMC+EC ensembles, or 100 members). The leading principal components (PCs) are the projections of the dominant EOF patterns onto the difference between each ensemble member and the ensemble mean. The first and second PCs for the ensemble members are used as input to a K-means clustering routine, which groups ensemble members with similar forecast scenarios. This clustering technique has recently been included in the NWS Dynamic Ensemble-based Scenarios for IDSS (DESI) software, a prototype ensemble visualization platform that now runs in many NWS forecast offices. One can create weather scenarios by examining the probability of each cluster spatially, and display many more variables (precipitable water, surface CAPE, cloud cover, etc.) using histograms, plume diagrams, and box-and-whisker plots.
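As a concrete illustration of the EOF/PC clustering step described above, here is a minimal sketch (our own, not the WPC/DESI code); the array shapes, the two retained PCs, and the five clusters are assumptions for the example.

```python
# Sketch of scenario clustering: EOFs of ensemble spread in 500-hPa height,
# projection onto the leading PCs, then K-means grouping of members.
# Shapes, 2 PCs, and 5 clusters are illustrative assumptions only.
import numpy as np
from sklearn.cluster import KMeans

def cluster_members(z500, n_pcs=2, n_clusters=5, seed=0):
    """z500: array (n_members, n_lat, n_lon) of 500-hPa heights at one valid time."""
    n_members = z500.shape[0]
    fields = z500.reshape(n_members, -1)
    anomalies = fields - fields.mean(axis=0)          # member deviations from the mean

    # SVD across the member dimension: rows of vt are the spatial EOF patterns of
    # the ensemble spread; projecting each deviation onto them gives the PCs.
    _, _, vt = np.linalg.svd(anomalies, full_matrices=False)
    pcs = anomalies @ vt[:n_pcs].T                    # (n_members, n_pcs)

    # K-means on the leading PCs groups members into forecast scenarios.
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(pcs)
    return labels, pcs
```

Each cluster can then be summarized by its member count (a rough scenario probability) and by composite maps of any forecast variable for the members it contains.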
Forecasters have requested more information and tools to understand how the atmosphere needs to evolve in the coming days for a particular cluster scenario to come to fruition. This talk will also describe a new project involving an ensemble sensitivity analysis (ESA) tool that will be implemented at WPC and, hopefully, in DESI. This tool uses the patterns in the ensemble spread of SLP to calculate upstream sensitive regions for high-impact weather, which can help forecasters better understand the origin of the ensemble uncertainty. Overall, this presentation will review the clustering and ESA approaches with a storm event and highlight the new DESI software that forecasters have access to. Some of the benefits to forecasting and stakeholder communication will be highlighted, as well as some potential future directions based on forecaster survey feedback.
Hurricane Analysis and Forecast System
Operational Implementation of NOAA’s Next-Generation Hurricane Prediction System: Hurricane Analysis and Forecast System (HAFSv1)
Zhan Zhang, Xuejin Zhang, Bin Liu, Avichal Mehra, Vijay Tallapragada, Sundararaman Gopalakrishnan, and Frank D. Marks, Jr.
The Hurricane Analysis and Forecast System (HAFS) is scheduled to become operational in the 2023 hurricane season to replace NOAA’s existing operational TC forecast systems, HWRF and HMON. HAFS is a Unified Forecast System (UFS) cloud-allowing, high-resolution regional atmosphere-ocean-wave coupled tropical cyclone forecast modeling system, featuring storm-following moving nests, vortex initialization, inner-core data assimilation, and TC-specific model physics.
The detailed configurations of the first version of HAFS and the scientific evaluation of the three-year (2020-2022) retrospective experiments will be presented. Some of the HAFS forecasts from 2023 real time parallel before operational implementation will also be evaluated and analyzed.
The National Hurricane Center Model Evaluation Process for the Operational Implementation of the Hurricane Analysis and Forecast System Version 1 (HAFSv1) Models
Jonathan Martinez, David Zelinsky, Wallace Hogsett, Kate Musgrave, John Cangialosi, and Benjamin Trabing
The National Hurricane Center (NHC) tropical cyclone guidance suite leverages a variety of dynamical and statistical models to create skillful consensus aids that support operational forecasts. Consensus aids developed and maintained by NHC range from relatively simple deterministic consensus aids formed by equally weighted model means to highly complex probabilistic consensus aids using a blend of model and observational predictors. The operational implementation of the Hurricane Analysis and Forecast System Version 1 (HAFSv1) models for the 2023 season presented a critical juncture for consensus aids in the NHC guidance suite. Comprehensive evaluations were required to assess the implications of the two HAFSv1 models supplanting their predecessor models: the Hurricane Weather Research and Forecasting (HWRF) model and the Hurricane in a Multi-scale Ocean-coupled Nonhydrostatic (HMON) model.
The model evaluation process at NHC will be discussed for the 2023 HAFSv1 operational implementation. This talk will focus on evaluating the impacts of the HAFSv1 models on NHC’s simple track, intensity, and wind radii consensus aids and on the Hurricane Forecast Improvement Program (HFIP) Corrected Consensus Approach (HCCA). The configurable NHC Forecast Verification Software provides the basis for conducting in-house model evaluations by generating homogeneous verification measures, such as errors, biases, and skill. Additional verification measures are externally generated, such as contingency tables and associated skill scores for predicting rapid intensification (RI). Opportunities to advance NHC’s model evaluation process will be discussed in the context of synthesizing NHC’s Forecast Verification Software and METplus within the Unified Forecast System framework.
Improving Hurricane Track Prediction in a Large-Domain High-Resolution Model
Kun Gao, Lucas Harris, Morris Bender, Jan-Huey Chen, Linjiong Zhou, and Thomas Knutson
High-resolution atmospheric models are powerful tools for hurricane track and intensity predictions. Although high resolution contributes to better representation of hurricane structure and intensity, its value in the prediction of steering flow and storm tracks is uncertain. Here we present experiments suggesting that biases in the predicted North Atlantic hurricane tracks in a high-resolution (approximately 3 km grid-spacing) model originate from the model’s explicit simulation of deep convection. Differing behavior of explicit convection leads to changes in the synoptic-scale pattern and thereby to the steering flow. Our results suggest that optimizing small-scale convective activity, for example through the model’s horizontal advection scheme, can lead to significantly improved hurricane track prediction (~10% reduction of mean track error) at lead times beyond 72 hours. This work calls attention to the behavior of explicit convection in the emerging large-domain and global convective-scale models, and its often overlooked role in affecting larger-scale circulations and hurricane track prediction.
Evaluation of UFS Tropical Cyclone Quantitative Precipitation Forecasts across Physics Suites and Applications
Kathryn Newman, Brianne Nelson, Evelyn Grell, Linlin Pan, Mrinal Biswas, and Weiwei Li
As the NOAA Unified Forecast System (UFS) advances towards operational implementation of two applications (Hurricane Application in 2023 and the Short-Range Weather (SRW) Application in 2024), it is important to understand the model performance for critical fields such as precipitation. Verification of tropical cyclone quantitative precipitation forecasts (QPF) not only informs improvements in process representation within models, but also helps evaluate and improve forecasts to mitigate the risks associated with extreme rainfall and flooding from landfalling tropical cyclones. Understanding model-based QPF over the ocean also provides important information regarding the forecast biases that may impact storm characteristics at the time of landfall. Additionally, a goal of the UFS is to support the development of a suite of physical parameterizations that can be applied with minimal modification across scales and applications. With this in mind, it is also important to understand the impact of physics suites on model QPFs.
This presentation will provide a comprehensive QPF evaluation of two UFS Hurricane Analysis and Forecast System version 1 (HAFSv1) configurations relative to performance of the operational Hurricane Weather Research and Forecast (HWRF) model, verified against Integrated Multi-satellitE Retrievals for GPM (IMERG) over water and the Climatology-Calibrated Precipitation Analysis (CCPA) over land. In addition to standard QPF verification, evaluations using advanced methods such as track shifting, storm-centric, and object-oriented verification will be shown using the enhanced Model Evaluation Tools (METplus). Differences between the HAFSv1 configurations will be explored in the context of different microphysics and planetary boundary layer schemes between the configurations. To further explore the impact of physics suites on model QPF performance, case studies of several landfalling hurricane simulations (e.g., Laura 2020, Ian 2022) will be compared using multiple physics suites relevant for operations (i.e., Global Forecast System (GFS)v17 prototype 8, HAFS suite-1, HAFS suite-2, Rapid Refresh Forecast System (RRFS) version 1 beta, High Resolution Rapid Refresh (HRRR)), using the SRW Application.
Progress of Development of HAFS-MOM6 Coupling and Preliminary Hurricane Forecast Results
Hyun-Sook Kim, Lew Gramer, Bin Li, Bin Liu, HeeSook Kang, Maria Aristizabal, John Steffen, Zhan Zhang, and Avichal Mehra
We introduce a coupled high-resolution regional Modular Ocean Model v6 (MOM6) in the Hurricane Analysis and Forecast System (HAFS). The HAFS-MOM6 development is a collaborative effort between NOAA/OAR/AOML and NOAA/NWS/NCEP/EMC, with a goal of transitioning to a next-generation tropical cyclone (TC) forecast system. The effort includes integration into the UFS framework, providing diversity in addition to the operational HAFS-HYCOM (HAFSv1) for weather-scale and regional applications. This presentation covers the development plan, followed by preliminary results from coupled HAFS-MOM6 simulations of the North Atlantic hurricanes in 2020-2022. Based on about 140 cases with the same configuration as HAFSv1 (i.e., no parameter tuning), the track forecast skill is slightly better than that of HAFSv1, whereas the intensity skill is comparable. The presentation includes TC validations and comparisons of the ocean impacts and responses between MOM6 and HYCOM. Future plans include optimizing the HAFS configuration to improve forecast skill and incorporating Marine JEDI data assimilation for the high-resolution regional MOM6 as the ocean model component of the coupled HAFS-MOM6.
Simultaneous Multiscale 4DEnVar With Scale-Dependent Localization (SDL) in HAFS for Hurricane Predictions
Xu Lu and Xuguang Wang
The evolution of hurricanes is complex, involving interactions across multiple scales. The movement of hurricanes is influenced by large-scale synoptic systems, while inner-core small-scale convective systems affect intensity evolution. However, traditional ensemble-based hurricane data assimilation (DA) systems rely on a single localization length scale (SSL), which is insufficient for accurately estimating the atmospheric state across multiple scales to initialize the hurricane forecast.
To address this limitation, the OU MAP lab recently implemented a scale-dependent localization (SDL) method for simultaneous multiscale 4DEnVar in the FV3-based GFS, which significantly improved global predictions. In this study, we further implement the SDL multiscale DA method into the FV3-based Hurricane Analysis and Forecast System (HAFS). Our investigations aim to answer the following scientific questions: (1) What is the optimal configuration for SDL multiscale 4DEnVar in HAFS for hurricane predictions? (2) What is the impact of the SDL multiscale 4DEnVar method relative to the single-scale localization method?
To address these questions, we first compare 3DEnVar and 4DEnVar DA configurations of different localization length scales during Hurricane Laura (2020). Next, we implement and test the SDL multiscale 4DEnVar DA method within the HAFS system, followed by further experiments on optimal SDL configurations. We compare the findings with corresponding single-scale localization configurations. We will present our detailed findings and in-depth diagnostics at the tropical conference.
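For context, here is a hedged sketch of how scale-dependent localization generalizes the single-scale case (written in our own notation, following the common SDL formulation in the literature; it is not necessarily the exact OU MAP implementation):

$$
\mathbf{P}^{b}_{\mathrm{SSL}} \;=\; \mathbf{L}\circ\overline{\mathbf{x}'\,\mathbf{x}'^{\mathsf{T}}},
\qquad\qquad
\mathbf{P}^{b}_{\mathrm{SDL}} \;=\; \sum_{j=1}^{K}\sum_{k=1}^{K}\mathbf{L}_{jk}\circ\overline{\mathbf{x}'_{j}\,\mathbf{x}'^{\mathsf{T}}_{k}},
$$

where $\mathbf{x}'$ is an ensemble perturbation, $\mathbf{x}'_{k}$ its $k$-th scale band (e.g., from a spectral band-pass decomposition), $\mathbf{L}$ and $\mathbf{L}_{jk}$ are localization matrices whose length scales depend on the bands involved (broader for larger scales), $\circ$ is the Schur product, and the overbar denotes an ensemble average. The single-scale case is recovered for $K=1$.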
HAFS as a Testbed for Non-Gaussian Data Assimilation Developments for the UFS
Jonathan Poterjoy
Most global and regional environmental prediction systems rely on combined variational and ensemble data assimilation systems for state estimation. For example, the current NOAA Global Forecast System (GFS) uses a variational “analysis” to initialize deterministic forecasts, and an ensemble Kalman filter (EnKF) to update an ensemble of model states for uncertainty estimation. Both methods often require additional constraints or heuristic modifications of model states before, after, or during data assimilation to reduce physical inconsistencies that may occur when updating state variables to reflect observations. For the case of tropical cyclones, initialization steps within HWRF and HAFS can include major ad hoc measures such as vortex “relocation” and “modification” and the spatial filtering of analysis increments to reduce asymmetries induced during data assimilation. For this presentation, we will identify Gaussian assumptions as a major obstacle for achieving dynamically consistent state estimates without the need for heuristic steps. In doing so, we will demonstrate the implications of designing future prediction systems with non-Gaussian ensemble data assimilation methods based on particle filters and show how current modeling systems that combine variational and ensemble strategies can benefit from simply replacing an EnKF with a particle filter designed for high-dimensional applications. Research findings for this work come from multi-week, convection-permitting FV3 experiments using an experimental version of the NOAA HAFS.
We will also discuss the long-term implications of bringing fully “non-parametric” data assimilation methods into community modeling facilities, such as the UFS. One advantage of particle filters, for example, is the tremendous flexibility permitted in the choice of likelihoods needed for performing data assimilation. As a motivating example, we will demonstrate how data-driven estimates of likelihoods trained on observation-space information can be leveraged to assimilate measurements that contain non-Gaussian, state-dependent uncertainty—while also bypassing the need for complex measurement operators. The same likelihood estimation strategy can also be extended for joint state-parameter estimation, which is a powerful technique for improving models. The end result is a data assimilation framework that is entirely “non-parametric” in that none of the required error distributions needed for Bayesian filtering follow specific “shapes” determined by parameters (e.g., mean and covariance for a Gaussian).
Hurricane Analysis and Forecast System Development: Future Priorities
Xuejin Zhang, Zhan Zhang, Avichal Mehra, Vijay Tallapragada, S. G. Gopalakrishnan, and Frank D. Marks, Jr.
HAFS became the new-generation multi-scale numerical model for Tropical Cyclone (TC) applications under NOAA’s Unified Forecast System (UFS) in the 2023 hurricane season. It consists of five salient components: (1) storm-following telescopic moving nests, (2) high-resolution physics configured for TC applications, (3) multi-scale Data Assimilation (DA) with vortex initialization, (4) an atmosphere-ocean-wave coupling framework, and (5) intensive hurricane observational platforms to support the multi-scale DA system as well as physics calibrations and system verifications/validations.
In this presentation, we will describe research and development priorities in the next 2-3 years after the initial operational implementation. We will describe new moving nest capabilities for the next upgrade, including multiple-storm and nest-refinement support, data assimilation strategy and development, the coupling plan and ocean model transition, and the new product plan. To accelerate the transition among development, operations, and community applications, we are developing a HAFS release plan under the guidance of HFIP and EPIC. We are looking for collaborative projects among academia, research labs, and NOAA’s operational centers to improve HAFS in the future.
Community Discussion: Insights from Industry and Academia
Insights from Tomorrow.io in Operationalizing a UFS-Based Weather Forecasting System at Tomorrow.io
Luke Peffers, Ryan Honeyager, Xiaoxu Tian, JJ Guerrette, and Stylianos Flampouris
Tomorrow.io is a technology company that is revolutionizing the private weather industry by tackling the three primary components that are critical to operationalizing a weather forecasting system: 1) observation systems, 2) core numerical weather prediction (NWP) models, and 3) information dissemination via a software as a service (SaaS) platform and API. Tomorrow.io’s scientists and engineers bring a wealth of experience and expertise in all aspects of operational NWP systems. The team is utilizing open-source code such as the Weather Research and Forecasting (WRF) model, the Unified Forecast System (UFS), and the Joint Effort for Data assimilation Integration (JEDI) next-generation data assimilation system. Tomorrow.io has used the WRF model in operations for over 4 years, whereas efforts to operationalize the JEDI/UFS forecasting system started just over a year ago. The global JEDI/UFS infrastructure was chosen to enable the assimilation of Tomorrow.io’s constellation of satellite weather radars and sounders and forecasting at a global scale. The team has experience with the Model for Prediction Across Scales (MPAS) and still considers it a candidate for operations but will first launch the more computationally efficient UFS into operations.
Experience of Using UFS for Academic Research
Xuguang Wang
A survey was performed by students and researchers at the University of Oklahoma Multiscale Data Assimilation and Predictability (MAP) lab on their experience of using the UFS for research. Aspects of the UFS that help to enable their research and a wish list will be summarized and discussed at the workshop.
Cross-Cutting Concepts #1 – Physics, Verification, and Validation
Evaluation and Process-Oriented Diagnosis of the GEFSv12 Reforecasts
Zhuo Wang, Jiacheng Ye, Fanglin Yang, Lucas Harris, Tara Jensen, Douglas E. Miller, Christina Kalb, Daniel Adriaansen, and Weiwei Li
Three levels of process-oriented model diagnostics are applied to evaluate the Global Ensemble Forecast System version 12 (GEFSv12) reforecasts. The level-1 diagnostics focus on model systematic errors and reveal that precipitation onset over tropical oceans occurs too early in terms of column water vapor accumulation. Since precipitation acts to deplete water vapor, this results in prevailing negative biases of precipitable water in the tropics. It is also associated with over-transport of moisture into the mid- and upper troposphere, leading to a dry bias in the lower troposphere and a wet bias in the mid-to-upper troposphere. The level-2 diagnostics evaluate some major predictability sources on the extended-range time scale: the Madden-Julian Oscillation (MJO) and North American weather regimes. It is found that the GEFSv12 can skillfully forecast the MJO up to 16 days ahead in terms of the Real-time Multivariate MJO indices (bivariate correlation ≥ 0.6) and can reasonably represent the MJO propagation across the Maritime Continent. The weakened and less coherent MJO signals with increasing forecast lead times may be attributed to humidity biases over the Indo-Pacific warm pool region. It is also found that the weather regimes can be skillfully predicted up to 12 days ahead, with persistence comparable to the observations. In the level-3 diagnostics, we examined some high-impact weather systems. The GEFSv12 shows reduced mean biases in the tropical cyclone genesis distribution and improved performance in capturing tropical cyclone interannual variability, and the mid-latitude blocking climatology in the GEFSv12 also shows better agreement with the observations than in the GEFSv10.
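For readers unfamiliar with the MJO skill metric quoted above, the bivariate correlation between forecast and analyzed RMM indices follows the standard formulation of Lin et al. (2008). The sketch below is a generic illustration with synthetic data, not the authors' code.

```python
import numpy as np

def bivariate_correlation(a1, a2, f1, f2):
    """Bivariate correlation between analyzed (a1, a2) and forecast (f1, f2)
    RMM1/RMM2 index time series, the standard MJO skill metric."""
    num = np.sum(a1 * f1 + a2 * f2)
    den = np.sqrt(np.sum(a1**2 + a2**2)) * np.sqrt(np.sum(f1**2 + f2**2))
    return num / den

# Example with synthetic series; skill is often deemed useful while COR >= 0.6.
rng = np.random.default_rng(0)
a1, a2 = rng.standard_normal(120), rng.standard_normal(120)
f1 = a1 + 0.3 * rng.standard_normal(120)
f2 = a2 + 0.3 * rng.standard_normal(120)
print(bivariate_correlation(a1, a2, f1, f2))
```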
Integrating JEDI and METplus for Evaluation of Atmospheric Composition Forecasts
Sarah Lu, Willem Marais, Shih-Wei Wei, Jérôme Barré, Maggie Bruckner, Maryam Abdi-Oskouei, R. Bradley Pierce, David Fillmore, and Tara Jensen
METplus, the verification and validation package for the Unified Forecast System (UFS), has limited capability for verifying atmospheric constituents. For instance, systematic evaluation of aerosol profiles and chemical species using satellite observations is not yet available. On the other hand, model verification using satellite measurements requires the development of observational simulators that account for the sensor measurement characteristics. This ensures that sensor noise, uncertainties, and sensitivities are properly accounted for in the forecast verification. Efforts are underway to improve UFS constituent prediction by integrating Joint Polar Satellite System (JPSS) atmospheric constituent observations and other observations into the METplus verification framework. The technical approach is to integrate the JCSDA Joint Effort for Data assimilation Integration (JEDI) observation processing components (i.e., IODA, the Interface for Observations Data Access, and UFO, the Unified Forward Operator) into the METplus framework. The proposed work ensures that model products and satellite observations are spatially and temporally compatible, which maximizes the benefits of JPSS data and products to downstream UFS operational and research users.
Hierarchical Physics Development With the Common Community Physics Package (CCPP) Single Column Model (SCM)
Dustin Swales, Grant Firl, Ligia Bernardet, Mike Ek, and Lulin Xue
The Common Community Physics Package (CCPP) Single Column Model (CCPP SCM) allows physics innovations to be developed in a simplified environment and later transitioned to any host model that uses the CCPP. This seamless ecosystem for physics development greatly reduces the barrier for research-to-operations and operations-to-research (R2O2R) transitions. Incorporating physics innovations for use in an operational setting requires extensive testing and evaluation to ensure that new developments do not yield unexpected results and that all computational considerations are met. From a scientific perspective, this process should be incremental and hierarchical: starting with initial testing in a simple idealized case, then progressing to fully coupled high-resolution global forecasts on high-performance computing systems.
The CCPP SCM provides much of the infrastructure needed to facilitate physics development from inception to operations. Here we will provide a brief overview of the hierarchical development capabilities enabled by the CCPP SCM, highlighting current limitations. We will also introduce new capabilities intended to aid the Unified Forecast System (UFS) Weather Model (UWM) CCPP physics modeling effort. These include driving the CCPP SCM with UWM output files and the ability to turn physics processes within physics suites on and off.
DTC Contributions to the UFS Physics Development and Advanced Testing and Evaluation Towards UFS Physics Unification
Weiwei Li, Man Zhang, Evelyn Grell, Tracy Hertneky, Bri Nelson, and Kathryn Newman
Under the UFS-R2O physics subproject, the Developmental Testbed Center (DTC) contributed to UFS physics development by working closely with the physics developers to provide testing and evaluation support. The targeted physics include pre-implementation enhancements and innovations/updates toward the long-term goal of the UFS becoming a unified, convection-allowing Earth system modeling system. The work is focused on three major physics suites, P8 [GFS_v17_p8 in Common Community Physics Package (CCPP) v6.0; an experimental suite for GFS v17/GEFS v13], RRFS v1beta, and HRRR, along with several physics enhancements. The enhancements include the refactored RRTMGP radiation scheme, the prognostic and scale-adaptive cumulus convection closure (dubbed progsigma), and the updated Thompson microphysics, MYNN PBL, and gravity wave drag (GWD) schemes. Real cases over both land and ocean were simulated using the limited-area model (LAM) provided by the UFS Short-Range Weather (SRW) Application and the CCPP single-column model (SCM) at both 13- and 3-km grid spacings to facilitate investigations of physics unification and scale adaptiveness. These cases capture processes of moist physics, the atmospheric boundary layer, surface conditions, GWD, and their interplay, and represent phenomena under various cloud/weather/climate regimes. Additional in-depth diagnostics demonstrate physics sensitivity to the vertical coordinate configuration, another important aspect of physics scale adaptiveness. Selected findings and highlights will be presented during the workshop. We hope that our evaluations against relatively reliable benchmarks can help constrain parameterized processes and inform further physics improvement.
Developing Next-Generation Physics for UFS Applications: The Microphysics Parameterization Challenges
Jian-Wen Bao, Fanglin Yang, Georg Grell, James Doyle, Greg Thompson, Ligia Bernardet, Lisa Bengtsson, Hendrik Tolman, and Neil Jacobs
As the Unified Forecast System (UFS) has become available to both research and operations communities for applications, and its further development is underway, it is timely for the UFS steering committee Physics Working Group to start organizing annual UFS Physics Workshops to discuss the latest advancements in physical parameterization research and development. The expected outcome of these workshops is scientific and technical guidance for future UFS physics development and implementation to address research and operational needs of the UFS community, as well as the numerical weather prediction (NWP) community at large.
This presentation summarizes the outcome of the first Annual UFS Physics Workshop series. This first workshop features presentations on state-of-the-art research and development that address questions about how to best parameterize microphysical processes of clouds and precipitation and represent their dynamical impact across an increasing range of UFS and NWP applications, from global to convective scales. This presentation will summarize the workshop findings including the scientific and technical recommendations that address key challenges for microphysics parameterizations and future directions for development of these parameterizations for the UFS, and more broadly NWP models in the community.
A New Double-Moment Cloud Microphysics Parameterization Scheme for the UFS
Songyou Hong, Haiqin Li, Jian-Wen Bao, and Jimy Dudhia
A new double-moment microphysics parameterization has been developed for global forecasting with NOAA’s Unified Forecast System (UFS). A main ingredient of the scheme is a representation of the partial-cloudiness effect on the microphysical processes, following the study of Kim and Hong (2018). The underlying assumption is that all the microphysical processes occur in the cloudy part of the grid box. Based on long-term evaluation of the WRF Single-Moment (WSM) and WRF Double-Moment (WDM) schemes by the WRF community, several revisions have been made in the microphysics terms, along with a newly introduced aerosol effect in the ice processes. A mass-conserving semi-Lagrangian sedimentation is reconfigured for double-moment physics. The new scheme reproduces the storm structure in an idealized 2D testbed, with better-organized front-to-rear jets, cold pools, and convective updrafts compared to the results with conventional microphysics. In NOAA’s GFS model, the wall-clock time is about half that of the Thompson scheme.
Improving Snow Cover Modeling in the UFS/Noah-MP Land Component
Cenlin He, Ronnie Abolafia-Rosenzweig, Fei Chen, Michael Barlage, and Peter Romanov
Snow cover plays a key role in modulating the surface energy and water balance and further affects subseasonal-to-seasonal (S2S) weather predictions through surface albedo feedback and land-atmosphere interaction. The snow cover simulation in the Noah-MP land surface model, a key land component of the UFS, has been known to suffer from systematic overestimation, which introduces additional uncertainty to weather and climate predictions in the UFS/Noah-MP coupled modeling system. This study aims to improve the snow cover formulation and parameters in Noah-MP by leveraging high-resolution observational datasets, including MODSCAG snow cover, SNODAS snow depth, SNOTEL snow depth and SWE, as well as a recently developed global 4-km snow cover and snow depth product from Dr. Peter Romanov. We first analyze the observational data to obtain the snow depletion curves that connect snow cover to snow depth in Noah-MP, and we then optimize the relevant parameters as a function of land type in Noah-MP based on the observed snow depletion curve. We will also conduct these analyses at different spatial resolutions to test the scale dependence of the snow depletion curve and parameters. We will quantify the impact of the updated snow cover parameterization in Noah-MP on surface albedo and snowpack evolution as well as feedback to the atmosphere in both offline Noah-MP and coupled UFS/Noah-MP simulations. Our goal is to provide an updated Noah-MP parameter lookup table or a new spatially varying input parameter dataset for Noah-MP to be used in the future UFS/Noah-MP system.
Cross-Cutting Concepts #2 – System Architecture
Better Compression for UFS With Support from the NetCDF Community
Edward Hartnett
The Unified Forecast System (UFS) produces large amounts of data from each run. Some of these data are compressed to reduce the needed disk space, but this increases the time required for I/O.
To help meet current and future operational requirements, better and faster compression was needed. The compression options available in the netCDF library were limited: they were introduced in 2008, and since then, faster and better compression libraries and techniques have been developed.
Working with community collaborators, we have added new compression features to the netCDF C and Fortran libraries, including lossy compression, zstandard, and parallel I/O support. These new compression methods are supported in the current releases of netcdf-c and netcdf-fortran. These features will help science data producers such as NOAA, NCAR, NASA, and ESA process, store and distribute the large scientific datasets produced by higher-resolution models and instruments.
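As a hedged illustration of how such options can be exercised from a high-level interface, the sketch below uses the netCDF4-python wrapper; the compression="zstd" and significant_digits keywords are assumed to be available in a recent netCDF4-python/netcdf-c build with the zstandard filter plugin installed, and the file and variable names are placeholders.

```python
# Sketch: writing a variable with zstandard compression plus lossy quantization.
# Assumes a recent netCDF4-python build with zstd filter support installed.
import numpy as np
from netCDF4 import Dataset

with Dataset("example.nc", "w") as nc:
    nc.createDimension("lat", 180)
    nc.createDimension("lon", 360)
    var = nc.createVariable(
        "t2m", "f4", ("lat", "lon"),
        compression="zstd",        # lossless zstandard compression
        complevel=4,
        significant_digits=3,      # lossy quantization: keep ~3 significant digits
    )
    var[:] = 280.0 + 10.0 * np.random.rand(180, 360)
```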
Pace: A GPU-Enabled Implementation of FV3GFS Using GT4Py
Oliver Elbert, Johann Dahm, Eddie Davis, Florian Deconinck, Rhea George, Jeremy McGibbon, Tobias Wicky, Elynn Wu, Christopher Kung, Tal Ben-Nun, Lucas Harris, Linus Groner, and Oliver Fuhrer
As Moore’s law ends, HPC centers are turning to new compute architectures to drive performance gains. The next generation of leadership-class supercomputers use GPU accelerators to achieve maximum performance. Weather and climate modelers will need to adapt their codes to run efficiently on GPUs in order to fully harness the power of these platforms. While this can be done by rewriting models in low-level languages such as CUDA, or by making use of compiler directives, domain specific languages (DSLs) present an exciting alternative path. GridTools for Python (GT4Py) is a Python-based DSL built specifically for weather and climate models that translates code into C++ or CUDA before optimizing and compiling to run at high speeds on multiple hardware platforms. This allows scientists to maintain a single codebase without sacrificing performance (performance-portability). GT4Py also alleviates the need for model scientists to consider the performance implications of details such as loop-ordering and memory management, as the compiler toolchain naturally separates the concerns of scientific model development and performance engineering.
We present Pace, a GT4Py implementation of the nonhydrostatic FV3 dynamical core and the GFDL cloud microphysics that achieves a 3.5-4x speedup over Fortran on GPU-accelerated systems. Pace also improves developer productivity by enabling novel workflows and test cases: we can subtract the Pace dynamical core from itself at different model timesteps to ensure it is not stateful, easily incorporate machine-learning emulators, and directly integrate any Python pre- or post-processing routines. Work is ongoing to increase the model’s capabilities, but already Pace demonstrates the power of DSL programming and shows great promise for the future of numerical modeling.
JCSDA Next-Generation Earth System Data Assimilation for the UFS
Tom Auligné and Team
The Joint Center for Satellite Data Assimilation (JCSDA) is a multi-agency research center to improve the use of satellite data for analyzing and predicting the weather, the ocean, the climate and the environment. Recent effort has focused on the development, delivery, and support of next-generation Earth system data assimilation for research and operations. Two flagship projects of the JCSDA are the Joint Effort for Data Assimilation Integration (JEDI) and the Community Radiative Transfer Model (CRTM). The main partners involve NOAA, NASA, U.S. Air Force, U.S. Navy, UK Met Office, UCAR/NCAR, as well as several collaborations with academia and the private sector. Using open-science, agile and collaborative best practices, the JCSDA delivers turnkey solutions to the Earth system science community to optimize and accelerate the use of observations of the atmosphere, ocean, cryosphere, land, aerosols, and constituents. This presentation summarizes recent development at the JCSDA, the various products available to the community, as well as opportunities for training and collaboration.
Model Infrastructure Development in UFS Weather Model
Arun Charla, Jun Wang, Denise Worthen, Dusan Jovic, Raffaele Montuoro, Gerhard Theurich, Dan Rosen, Ufuk Turuncoglu, Brian Curtis, Sadegh Sadeghi Tabas, Rahul Mahajan, Hang Lei, Dom Heinzeller, Jiande Wang, Matthew Masarik, Jessica Meixner, Bin Liu, Wen Meng, Ligia Bernardet, Rusty Benson, Thomas Robinson, Barry Baker, Tom Clune, and Weiyuan Jiang
The Unified Forecast System Weather Model is a community-based coupled Earth system modeling system. Through the collaboration of NOAA laboratories and the broad research community, the coupling infrastructure has been developed, new model components have been integrated into the system, and the coupling capability has been set up to support various configurations. Currently the UFS coupled model consists of the FV3 dynamical core with the Common Community Physics Package (CCPP) for the atmosphere, MOM6 and HYCOM for the ocean, CICE6 for sea ice, WW3 for ocean waves, Noah-MP for land, GOCART and CMAQ for aerosols and chemistry, and the Community Mediator for Earth Prediction Systems (CMEPS), based on ESMF/NUOPC, as the coupling framework. The UFS Weather Model supports the Hurricane Analysis and Forecast System (HAFS) v1 and Regional Air Quality Model (AQM) v7 implementations, as well as the upcoming Rapid Refresh Forecast System (RRFS) v1, Global Forecast System (GFS) v17, and Global Ensemble Forecast System (GEFS) v13 implementations.
In this presentation, an overview of the general model infrastructure for the UFS coupled model will be provided. Major infrastructure achievements will be presented, including upgrading and developing nine NUOPC caps for the flagship Earth component models, transitioning to the CMEPS mediator, building UFS applications, and developing the UFS Weather Model test framework. New capabilities of the write grid component to output history and restart files, and the extension of inline post-processing to multiple output domains with moving nests, will be described. New approaches that were implemented to improve the computational performance of the coupled model forecast, and code updates bridging the model configuration with the global workflow, will also be discussed.
Simultaneous Multiscale Data Assimilation to Improve Convective-Scale Forecasts Over the Continental US (CONUS)
Yongming Wang and Xuguang Wang
Convection-allowing models (CAMs) and a myriad of in-situ and remote sensing observations respectively resolve and sample a wide range of scales. Data assimilation (DA) algorithms are therefore required to analyze the state across multiple scales, hereafter “Multiscale Data Assimilation” (MDA; Wang et al. 2021). For the initialization of regional CAMs, such as those over the Continental US (CONUS), past studies adopted a sequential MDA approach (Baseline) that analyzes larger scales with a broader localization radius by assimilating in-situ observations, followed by the assimilation of radar observations with a tight localization radius to update the convective scales. This study introduces and adopts a simultaneous MDA approach (Simultaneous_MDA), in which all observations are assimilated at once to update all resolved scales simultaneously. Simultaneous_MDA is enabled through scale-dependent and variable-dependent localization in model space in the GSI-based EnVar. Simultaneous_MDA allows observations to be utilized more effectively and reduces the computational cost associated with I/O.
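As a purely conceptual sketch (not the GSI-based EnVar or the implementation described here), scale-dependent localization can be illustrated in one dimension by splitting ensemble perturbations into large- and small-scale bands, localizing each band's sample covariance with its own length scale, and summing the results. The filter choice, length scales, and Gaussian localization below are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def localization(n, length):
    """Gaussian localization matrix for an n-point 1-D grid."""
    x = np.arange(n)
    d = np.abs(x[:, None] - x[None, :])
    return np.exp(-0.5 * (d / length) ** 2)

def sdl_covariance(perts, split_sigma=10.0, loc_large=40.0, loc_small=5.0):
    """Scale-dependent localization (conceptual): low-pass each member to get the
    large-scale part, take the residual as the small-scale part, localize each
    band's covariance with its own length scale, and sum the localized bands."""
    large = gaussian_filter1d(perts, split_sigma, axis=1)   # (n_members, n_grid)
    small = perts - large
    n = perts.shape[1]
    cov_large = np.cov(large, rowvar=False) * localization(n, loc_large)
    cov_small = np.cov(small, rowvar=False) * localization(n, loc_small)
    return cov_large + cov_small

# Usage with a synthetic 20-member, 200-point ensemble
rng = np.random.default_rng(1)
perts = gaussian_filter1d(rng.standard_normal((20, 200)), 8.0, axis=1) \
        + 0.5 * rng.standard_normal((20, 200))
B = sdl_covariance(perts)
```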
The two approaches, Baseline and Simultaneous_MDA, are first compared for analyses and forecasts of the 3 May 2018 squall-line case. During DA cycling, Simultaneous_MDA yields improvements over Baseline in the first guess verified against observations. Simultaneous_MDA outperforms Baseline during most forecast lead times for composite reflectivity at all thresholds and for conventional variables (winds, temperature, and moisture). Diagnostics show that, compared to Baseline, Simultaneous_MDA improves the large-scale convergence along the front and dry line by eliminating noisy correlations when assimilating in-situ observations, and improves the near-storm environment by retaining larger-scale increments when assimilating radar reflectivity observations.
Implementation of Simultaneous_MDA within JEDI is ongoing, and related initial results will be presented at the conference.
Unifying Workflows for UFS Applications
Christina Holt, Fredrick Gabelmann, Brian Weir, Venita Hagerty, Emily Carpenter, Janet Derrico
The Earth Prediction Innovation Center (EPIC) has formed a multi-institutional team with contributors from many NOAA programs and projects, including EPIC, the NOAA Joint Technology Transfer Initiative (JTTI), and Software Engineering for Novel Architectures (SENA), that serves stakeholders from across the Unified Forecast System (UFS) community in academia and the public sector to meet the needs of research and operations at NOAA. The Unified Workflow Team plans to design and implement a Unified Workflow toolbox to begin the process of workflow unification across UFS Applications – Medium-Range Weather (MRW), Short-Range Weather (SRW), and the Hurricane Analysis and Forecast System (HAFS), to name a few. Each of these Applications is specifically configured for the weather systems in which it specializes, requiring customized sequences of tasks to produce the end forecast products. By linking the individual tools together in a specific, consistent way across the Applications, we will introduce a Unified Workflow Framework that will give UFS Application users a common experience no matter their forecasting area of interest.
In this presentation, we will summarize our plans and progress toward delivering a set of flexible, modular, reusable tools for common tasks: configuration management, batch system submission, and file handling, to name a few. We will also provide the plan and roadmap for introducing a Unified Workflow Framework in UFS Applications.
Cross-Cutting Concepts #3 – Dynamics and Nesting
The Worldwide, Federated FV3 Community
Lucas Harris
The GFDL Finite-Volume Cubed-Sphere Dynamical Core (FV3) is the dynamical core unifying weather and climate modeling in NOAA, NASA, and beyond. In this talk, I will describe the community resources available to developers and users and how they enable developers to tailor their own FV3-based models to carry out their missions. I will show new and emerging capabilities within FV3, including new grid capabilities, turbulent mixing, and integrated physics. I will also describe results and plans for FV3-based models at kilometer and sub-kilometer scales. The presentation will close with advice for improving community and cooperation in the broader modeling community.
Forecasting and Hindcasting Capabilities in the Simple Cloud-Resolving E3SM Atmosphere Model (SCREAM)
Paul Ullrich, Weiran Liu, Peter Caldwell, Colin Zarzycki, and Jianfeng Li
The Simple Cloud-Resolving E3SM Atmosphere Model (SCREAM) is a new global atmospheric model aimed at cloud-permitting scales, with typical grid spacing around 3.25km. The model is written in performance-portable C++ and designed for efficient operation on both CPU and GPU architectures. Regional refinement in SCREAM allows us to effectively treat the model as a regional forecasting and climate model. When used in forecast or hindcast mode, the Betacast toolset allows us to easily draw initial conditions from GFS, HRRR, ERA5, HWRF, or other sources. In this presentation I will discuss our application of SCREAM to several major historical events, including the 2012 North American Derecho and Super Typhoon Mawar. The pseudo-global warming method is also highlighted, which makes use of forecasting capabilities to investigate climate change impacts.
A Scale-Aware Three-Dimensional TKE Turbulent Mixing Parameterization for the Hurricane Analysis and Forecast System (HAFS)
Ping Zhu, Jian-wen Bao, Xuejin Zhang, Jun Zhang, and Zhan Zhang
One of the most challenging problems in numerical weather prediction (NWP) is how to use a physically coherent parameterization scheme to represent the three-dimensional (3D) sub-grid scale (SGS) transport induced by the unresolved turbulent eddies in numerical simulations. In current operational NWP models, the SGS horizontal and vertical turbulent mixings are treated separately in a model’s dynamic solver and a physics package outside the dynamic core, respectively. Such a separated parameterization strategy partially reflects the nature of anisotropic horizontal and vertical mixing induced by large turbulent eddies and partially is due to the fact that the vertical turbulent mixing is much stronger than the horizontal mixing in fair-weather conditions. However, as model grid resolution increases to the order of one kilometer, the so-called gray zone, this method becomes problematic as the unresolved turbulent eddies possess both anisotropic and isotropic characteristics. While the anisotropic part of turbulent mixing may be treated properly by the separate horizontal and vertical mixing schemes, the isotropic component must be treated three dimensionally as the turbulent mixing is the same in all directions under the isotropic assumption. This is particularly true in the numerical forecast of tropical cyclones (TCs). One of the unique features of turbulent processes in the TC inner core is that turbulent eddies experience large lateral contrast across the eyewall, rainbands, and the moat in-between. Large turbulent eddies across the edge of the eyewall generate internally connected comparable horizontal and vertical turbulent mixing, which cannot be appropriately treated by the separated horizontal and vertical turbulence schemes.
Moreover, theoretical, observational, and simulation studies show that the dry moat air laterally entrained into the eyewall and rainbands by the large turbulent eddies meets the instability criterion and thus sinks unstably as convective downdrafts to generate turbulent kinetic energy (TKE) in the eyewall and rainbands (Zhu et al. 2023). Both the comparable, interconnected horizontal and vertical turbulent mixing and the positive feedback between the lateral entrainment instability and TKE generation in the eyewall and rainbands are unique to turbulence development and transport in the TC inner core, and therefore they must be represented realistically in numerical models for TC prediction.
To resolve this problem, in this collaborative project involving Florida International University (FIU) and NOAA’s Physical Sciences Laboratory (PSL), Hurricane Research Division (HRD), and Environmental Modeling Center (EMC), we have been developing a scale-aware (SA) ‘moist’ 3D TKE turbulence scheme and implementing it in the Hurricane Analysis and Forecast System (HAFS), one of the Unified Forecast System (UFS) models used for TC prediction. The implementation of the scheme involves both HAFS’s dynamic solver and the Common Community Physics Package (CCPP). This SA 3D turbulence scheme allows for a unified treatment of the horizontal and vertical turbulent mixing induced by small isotropic and large anisotropic eddies, adaptively applicable from LES to mesoscale simulations. Our initial results show that the new scheme can appropriately represent the internally connected horizontal and vertical turbulent mixing in the inner core of TCs.
The Motivation, Specification, and Optimization of the Extended Schmidt Gnomonic (ESG) Grid for Limited Area Applications
James Purser, Dusan Jovic, and Jili Dong
The FV3 cubed-sphere model now used for global forecasting applications at NOAA originally came equipped with different styles of the gnomonic grid specifications applied to its six tiles, together with the traditional conformal Schmidt mapping procedure allowing an enhancement of the resolution targeting a chosen tile. The various gnomonic grids differed primarily in their implied profiles of the spacing of the great-circle grid lines, prior to any subsequent Schmidt refinement. Unfortunately, it became apparent that no choice of these existing mapping parameters, with or without the conventional Schmidt refinement, could adequately provide suitable grids for very large rectangular domains such as those needed to cover North America together with broad swathes of both the Atlantic and Pacific Oceans, owing to unacceptably severe map distortions around the periphery of these regions.
However, a resolution of these difficulties was found to be possible through a reinterpretation, and extension, of the original parameter space that could be formally regarded as being equivalent to allowing the square of the conventional Schmidt refinement factor to range over both positive and negative real values – i.e., as if the Schmidt factor itself were permitted to become an imaginary value. The formalism involved in this extended transformation necessitates some concomitant changes in the other mapping parameters to render the geometrical transformations real for all parameter choices, and in general, cannot be applied to any fully global mapping without introducing some mapping singularities. But since these singularities can lie far outside the large rectangular regions of relevance, and the resulting mappings, judiciously tuned, may lead to a spectacular reduction in the integrated distortion over these large domains, the new Extended Schmidt Gnomonic (ESG) system of grids turns out to be a perfectly acceptable way of generalizing the original suite of transformations.
We shall describe the original suite of FV3 gnomonic and Schmidt-transformed grids and show how algebraic and projective geometric methods can be brought to bear to expand the range of transformations in the desirable manner described above. We shall also describe how it is possible to prescribe a formal criterion of local or integrated map distortion, and to employ such a criterion to automatically optimize the ESG mapping parameters for a large rectangular domain of given dimensions.
Novel Grid Capabilities in GFDL’s Dynamical Core FV3
Joseph Mouallem, Lucas Harris, and Rusty Benson
We present the latest grid-related capabilities implemented in the Geophysical Fluid Dynamics Laboratory (GFDL) Finite-Volume Cubed-Sphere Dynamical Core (FV3): multiple nested grids and the Duo-Grid. First, two-way multiple same-level and telescoping grid nesting allows various independent weather events to be simulated in greater detail by resolving smaller-scale flow structures. Nested grids run concurrently on different sets of processors to optimize the overall computational performance. Second, the Duo-Grid algorithm remaps a tile’s halo data from neighboring tiles from kinked to natural locations along great-circle lines. Grid-imprinting artifacts are practically eliminated in the numerical solutions, at the expense of an increase in computational cost.
Moving Nest Features for HAFS
William Ramstrom, Ghassan J. Alaka Jr., Xuejin Zhang, and Sundararaman G. Gopalakrishnan
Storm-following moving nest functionality has been implemented in HAFS v1.0 for operational use by NOAA for the 2023 hurricane season. The initial version enables a high-resolution nest to follow a single storm in regional and global configurations. Several new moving nest features are under development to expand the capabilities of HAFS in future seasons.
In-Person Posters
Evaluating and Enhancing Snow Compaction Process in the Noah-MP Land Surface Model
Ronnie Abolafia-Rosenzweig, Cenlin He, Fei Chen, and Michael Barlage
Interactions between the land surface and atmosphere can provide an important source of predictability for weather and climate at subseasonal-to-seasonal time scales. Simulated snowpack, which has a particularly long memory and important land-atmosphere feedback, is a key source of uncertainty in land surface model (LSM) simulations. This study targets the accuracy of snow density in LSMs, which affects the accuracy of simulated terrestrial water and energy budgets. A baseline snow simulation with the widely used Noah-MP LSM systematically overestimates snow depth by 53 mm even after removing biases in snow water equivalent (SWE). To reduce uncertainties associated with snow compaction, we enhance the most sensitive Noah-MP snow compaction parameter, the empirical parameter for compaction due to overburden (Cbd), such that Cbd is calculated as a linear function of surface air temperature rather than being a fixed value as in the baseline simulation. This enhancement improves the accuracy of simulated snow compaction across the majority of western U.S. (WUS) SNOTEL sites (RMSE reduced at 85% of sites), with modest improvements in cooler accumulation periods (RMSE reduced at 59% of sites) and substantial improvements during warmer ablation periods (RMSE reduced at 93% of sites). The relatively larger improvements during warm conditions are attributable to the default Cbd value being well suited for cold temperatures (≤ -5°C). Comparisons of simulated snow depth and density with observations outside of the optimization period also show improvements for simulations using the enhanced compaction parameter, supporting the conclusion that the snow compaction enhancement is temporally transferable. Differences between enhanced and baseline gridded simulations across the full WUS indicate that the enhancement can have important impacts on snowpack evolution, snow albedo feedback, and snow hydrology.
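To make the form of the enhancement concrete, here is a minimal sketch of a temperature-dependent overburden-compaction coefficient; the coefficients, temperature bounds, and exact linear form below are hypothetical placeholders, not the calibrated Noah-MP values.

```python
def overburden_compaction_coeff(tair_c, cbd_cold=0.026, slope=0.001,
                                t_cold=-5.0, t_warm=5.0):
    """Hypothetical sketch: overburden-compaction parameter Cbd as a linear
    function of surface air temperature (deg C), replacing a fixed constant.
    All numbers are placeholders, not the calibrated Noah-MP coefficients."""
    t = min(max(tair_c, t_cold), t_warm)      # clamp to a plausible range
    return cbd_cold + slope * (t - t_cold)    # linear increase toward warm temps

print(overburden_compaction_coeff(-10.0))  # cold conditions: default-like value
print(overburden_compaction_coeff(3.0))    # warm conditions: larger coefficient
```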
Enhancing Community UFS Land Model Development Through Advancing Land Component and Land Data Assimilation Capabilities
Michael Barlage, Clara Draper, and Ufuk Turuncoglu
A collaborative effort is currently underway to develop NOAA’s next-generation Unified Forecast System (UFS) framework. Within the UFS, there are multiple major Earth system components, including atmosphere, ocean, and land. UFS applications span local to global domains and predictive time scales from sub-hourly to seasonal. These wide-ranging applications pose challenges and provide opportunities for the development and evaluation of UFS land components. This presentation will discuss ongoing, coordinated efforts within the UFS on land model physics advances, both within the CCPP physics repository and through a separate land component, and on a JEDI-based land data assimilation capability.
To facilitate UFS community engagement and accelerate R2O transition, a hierarchical testing approach is being developed that involves a spectrum of LSM-only simulations, a single-column coupled land-atmosphere modeling system, and coupled simulations both without and with a prognostic ocean. This approach is used to isolate and quantify the impacts of individual components before systematically increasing complexity and inherently introducing non-linear, difficult to track interactions. This provides a direct pathway for candidate models to diagnose problem areas in the model process chain, which enables identification of specific parameterizations that are the source of poor model performance.
The presentation will focus more deeply on two ongoing efforts in UFS land development with a community focus: creation of a component land model capability in the UFS and creation of a JEDI-based land data assimilation capability. A successful UFS land effort will expedite community involvement in land model development and contribute to looking beyond the land surface model as a boundary condition by providing land surface process-level information to expanding user communities.
Utilizing Machine Learning Methods for the Objective Tropical Cyclogenesis Classification in Numerical Weather Forecasts
Brammer, Haynes K., Libardoni A., Grogan D., Schumacher A., and Dunion J.
The Ensemble-based Globally available Genesis Index for Tropical Cyclones (EGGI-TC) is a novel hybrid dynamical ensemble-statistical model designed to provide probabilistic forecasts of Tropical Cyclogenesis. This model combines machine learning techniques with numerical weather forecast outputs to accurately identify the occurrence of tropical cyclone genesis in ensemble forecasts. By leveraging the reliability and improved skill of machine learning methods with ensemble outputs from the NOAA Global Ensemble Forecast System (GEFS), EGGI-TC enhances the previous basin-specific linear statistical model. Currently, the real-time model operates on the GEFS ensemble outputs, enabling the estimation of uncertainty in pre-genesis tracks, as well as providing information on the timing, location, and probability of cyclone genesis. This approach represents a powerful integration of ensemble and statistical forecasts, combining the skill of statistical models with the dynamical uncertainty offered by numerical ensembles.
Results from an expanded global pre-genesis database will be presented, along with forecasts from 2022 real-time demonstrations. This expanded dataset facilitated a comparative analysis between linear discriminant analysis statistical models and non-linear machine learning methods, evaluating the most effective real-time model and uncovering variations in pre-genesis disturbances across all basins. Verification results will also be presented for forecasts generated in both retrospective and real-time runs. These results demonstrate the added value of track spread and genesis uncertainty obtained from ensembles, providing valuable insights beyond deterministic inputs alone. Furthermore, we will showcase example graphics and forecaster aids from the 2022 real-time demonstrations, along with preliminary verification of their skill during this demonstration period.
Propagating Meteorological Forecast Uncertainty Through a High-Fidelity Hydrodynamics Framework for Hurricane-Induced Storm Surge Prediction: Hurricane Ian (2022) Case Study
Albert Cerrone, Dylan Wood, Benjamin Pachev, Maria Teresa Contreras Vargas, Damrongsak Wirasaet, Clint Dawson, and Joannes Westerink
Coupling a high-fidelity hydrodynamics driver with an ensemble meteorological product enables the generation of probabilistic storm surge guidance (PSSG). While computationally expensive, exercising a high-fidelity driver for this purpose is critical to reduce the epistemic uncertainty in the system. In other words, by leveraging a hydrodynamics driver with minimal model discrepancy, the resulting PSSG’s bounds are dictated almost entirely by the meteorological forcing itself.
To render PSSG in this fashion, the spatial and temporal resolutions of the forcing meteorological product must be accommodated; they cannot be homogenized or decimated in any way, because these actions would introduce more uncertainty into the system. Our approach retains the full complement of a given product (whether deterministic or ensemble) while also retaining adequate coastal mesh resolution to quantify hurricane-induced storm surge levels. Specifically, we adopt the National Oceanic and Atmospheric Administration’s (NOAA) Global Surge and Tide Operational Forecast System (G-STOFS). G-STOFS is built around the hydrodynamics driver ADCIRC, a scalable finite element circulation model. At the University of Notre Dame, we maintain a shadow of G-STOFS that renders global 7-day water level guidance once per day.
In support of this study, we first forced a reanalysis wind product from Oceanweather Inc. (OWI) through our G-STOFS shadow for the period corresponding to the Hurricane Ian event. We then validated the resulting water levels, ζ, in Southwest Florida against United States Geological Survey (USGS) high water marks (HWMs) and peaks. The predicted maximum water levels correspond well to the observed levels, with a mean absolute percentage error (MAPE) of less than 10% and a coefficient of determination (R²) greater than 0.9.
This suggests that G-STOFS does not have an intrinsic modeling deficiency for the Hurricane Ian event. In other words, for the purpose of producing PSSG within G-STOFS, the only major source of epistemic uncertainty in the system is the meteorological forcing. Based on this result, we began forcing our shadow system with dynamical meteorological products: NOAA’s Global Forecast System (GFS), High-Resolution Rapid Refresh (HRRR), and Global Ensemble Forecast System (GEFS). For the Hurricane Ian event (and other hurricanes in the recent past), these products tend to have muted wind intensities near the hurricane core. Consequently, we must amplify them in a manner that does not compromise the track variability expressed in ensemble products like GEFS. For this, we consult the track forecast of the National Hurricane Center (NHC). NHC’s forecast has consistently proven to be one of the best products for tracking and wind intensity. We elevate the winds within each dynamical product in the vicinity of the core to the NHC levels throughout the forecast horizon. Regarding variability, we do not consult the NHC track forecast cone, as its size does not change over a given year. Rather, we leverage GEFS directly by forcing each of its members individually through G-STOFS. While the computational expense is excessive, we leverage a novel ensemble workflow manager on the Texas Advanced Computing Center’s (TACC) Frontera for automated meteorological forcing acquisition and job execution.
In general, G-STOFS forced by deterministic dynamical products with NHC-assimilated winds indicated heightened risk to metropolitan areas in Southwest Florida like Fort Myers approximately one day before Hurricane Ian made US landfall. Consulting PSSG generated with an ensemble of NHC-assimilated GFS, HRRR, and GEFS, dangerous water levels were indicated approximately two days before landfall. This result suggests that PSSG can be used to qualify “best” water level predictions amidst high variability in ensemble meteorological products.
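For reference, the two validation statistics quoted above (MAPE and R²) can be computed as in the generic sketch below; the high-water-mark values shown are hypothetical, not the study's data.

```python
import numpy as np

def mape(obs, pred):
    """Mean absolute percentage error (%)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return 100.0 * np.mean(np.abs((pred - obs) / obs))

def r_squared(obs, pred):
    """Coefficient of determination R^2."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    ss_res = np.sum((obs - pred) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical high-water-mark comparison (meters)
obs = [2.1, 3.4, 1.8, 4.0, 2.9]
pred = [2.0, 3.6, 1.7, 3.8, 3.0]
print(mape(obs, pred), r_squared(obs, pred))
```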
A Framework for Simulating Hurricane Boundary Layers Using Large-Eddy Simulation and Its Use in Developing PBL Parameterizations for NOAA’s Hurricane Analysis and Forecast System
Xiaomin Chen, George H. Bryan, Andrew Hazelton, Frank D. Marks, Pat Fitzpatrick, Ghassan J. Alaka, Jr, and Chunxi Zhang
Accurately representing planetary boundary layer (PBL) turbulent processes in numerical models is critical for improving hurricane forecasts. However, existing PBL parameterization schemes are mostly designed for low-wind conditions, and assessing their uncertainties in hurricane conditions remains challenging, mostly due to very scarce in-situ turbulence measurements. To fill this gap, this study develops a modeling framework based on small-domain large-eddy simulation (LES) to better understand the uncertainty of PBL schemes in hurricane conditions. The novelty of this framework includes the use of a few input parameters to represent the TC vortex and the addition of a simple nudging term for temperature and moisture to account for the complex thermodynamic processes in TCs. This model setup allows for a fair comparison of PBL schemes against LES under the same controlled thermodynamic conditions. Using this framework, we improved the high-order PBL scheme used in NOAA’s next-generation hurricane forecast model, the Hurricane Analysis and Forecast System (HAFS). HAFS retrospective runs during the 2021 Atlantic hurricane season demonstrate that the improved PBL scheme leads to better structure and intensity forecasts than the original PBL scheme. Importantly, the improved PBL scheme shows promise for improving the forecast skill for rapid intensification events, which are notoriously challenging to predict. Avenues for future development of PBL parameterizations will be discussed.
Extended Range Machine-Learning Severe Weather Guidance Based on the Operational GEFS
Adam Clark, Aaron Hill, Kimberly Hoogewind, Andrew Berrington, and Eric Loken
Research by Hill et al. (2023) demonstrated extremely promising results using a random forest machine-learning algorithm with input from the Global Ensemble Forecast System v12 (GEFSv12) to generate probabilistic severe weather forecasts out to days 4-8. In their work, the GEFS Reforecast Dataset (GEFS/R) was used to train and test their random forest model, which was used to generate forecasts using operational GEFS forecasts as input. One limitation of the Hill et al. (2023) work was that, due to computational limitations, the GEFS/R forecasts only included 5 members. This work aims to build on Hill et al. (2023) by using the operational GEFSv12 dataset for training and testing. At the time of this writing, GEFSv12 has been operational for more than two years, so it may be possible to take advantage of the 31 members in the operational GEFSv12 for training and testing to get an improved result. We test this hypothesis by conducting several different experiments where the number of ensemble members in the training and testing dataset is varied between 5 and 31. We also test using individual ensemble members vs. ensemble summary measures as predictors.
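A minimal sketch of the kind of random-forest pipeline described above follows; the predictors, array shapes, labels, and hyperparameters are illustrative assumptions, not those of Hill et al. (2023) or of the experiments described here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative only: rows are forecast grid points/days, columns are GEFS-derived
# predictors (e.g., ensemble-mean CAPE, shear, precipitation); labels indicate
# observed severe-weather occurrence. All values below are synthetic placeholders.
rng = np.random.default_rng(0)
X_train = rng.standard_normal((5000, 12))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 3] + rng.standard_normal(5000)) > 1.0

model = RandomForestClassifier(n_estimators=200, min_samples_leaf=20, n_jobs=-1)
model.fit(X_train, y_train)

X_new = rng.standard_normal((100, 12))
severe_prob = model.predict_proba(X_new)[:, 1]   # probabilistic severe guidance
```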
Building Blocks for Workflows Orchestrating Message Passing Interface Simulations
Stefan F. Gary, Christopher Harrop, Alvaro Vidal Torreira, Christina Holt, Venita Hagerty, Matt Long, Ben Clifford, Yadu Babuji, Kyle Chard, Michael Wilde, and Isidora Jankov
The production of weather forecasts depends on the orchestration of many diverse applications: large numbers of ensemble members as well as data assimilation, pre-/post-processing, visualization tools, and most recently, machine learning. To enable weather research community involvement, these complex workflows need to be automated, users need to be able to manage “live” workflows (i.e., as they run) in a straightforward manner, and the workflows need to be portable so they can run in a wide range of compute environments (e.g., on-premises clusters and cloud). Separately, over the last decades, the workflow community has developed different workflow systems for expressing automation and control, but the majority of this work has been to support high-throughput workflows, i.e., thousands of single-node tasks. Workflow fabrics’ support for the coordination and management of multi-node tasks (i.e., the large MPI jobs that are typical in weather forecasting) is much less widespread and documented. Here, we aim to help bridge this interdisciplinary knowledge gap with a curated set of templates and examples that demonstrate the automation and control of workflows that launch MPI tasks with existing workflow fabrics. This project is working toward the development of a proof-of-concept workflow that manages MPI tasks and has a topology similar to real-world operational workflows. By making these documented and evolving building blocks available to the community, we hope to empower users to work in the “MPI task niche” within the broader landscape of workflow fabrics.
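The abstract does not prescribe a particular workflow fabric; as one hedged example, the sketch below uses Parsl to express a task that would normally wrap an MPI launch, here replaced by a harmless echo so the sketch runs anywhere. The executor choice, launch command, and member count are assumptions for illustration only.

```python
import parsl
from parsl.config import Config
from parsl.executors.threads import ThreadPoolExecutor
from parsl.app.app import bash_app

# Minimal local configuration; on an HPC system this would be replaced by an
# executor/provider that submits to the batch scheduler.
parsl.load(Config(executors=[ThreadPoolExecutor(max_threads=2)]))

@bash_app
def ensemble_member(member, stdout="member.out", stderr="member.err"):
    # A bash_app returns a shell command string. In a real workflow this would
    # be a multi-node MPI launch, e.g. "srun -n 64 ./ufs_model.exe" (placeholder).
    return f"echo 'launching member {member} (placeholder for an MPI job)'"

futures = [ensemble_member(m, stdout=f"member_{m}.out") for m in range(4)]
print([f.result() for f in futures])   # block until all members complete
```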
A Quick Interface for Nesting Gigantic-Data Within Git Repositories
Guoqing Ge, Ming Hu, Terra Ladwig, and Stephen Weygandt
In today’s big data landscape, we encounter enormous amounts of data on a daily basis. In many cases, we want to implement version control for large data volumes in addition to our source code. For instance, when conducting scientific/engineering simulations or working with machine learning models, we may want to version control the simulation results, prerequisite data, or training datasets, which can consist of massive amounts of data such as high-resolution images or videos. The size of these datasets can easily exceed 100 GB (some may even reach a few terabytes). For example, the UFS Short-Range Weather App already has about 169 GB of fixed files, and the UFS Medium-Range Weather App already has more than 600 GB of fixed files. Effectively managing these large volumes of fixed files within the industry-standard Git workflow is a great challenge.
Let’s consider a scenario where two teams, each consisting of 25 members, share a repository that contains 100 GB of binary data but work from two different sites. In practice, only one copy of the 100 GB of data is needed at each site, and every team member should be able to access it transparently and effortlessly. It is unreasonable for each member to clone an individual copy of the data, as it would put an enormous burden on network bandwidth as well as disk space and significantly increase the clone time, leading to a poor user experience. Unfortunately, the Git utilities currently available, such as git-lfs, cannot meet the above version control requirements.
The Quick Interface for Nesting Gigantic-data (QING, https://github.com/git-qing/git-qing) splits a repository into two different spaces: the normal lightweight Git space and the heavyweight QING space. Data integrity is fully assured by using the SHA512 hash algorithm to fingerprint each binary file. Users can choose how to transfer, archive, or back up a QING space by providing their own “download” and “upload” plugin scripts. Example plugins are provided with the interface. This allows users to leverage existing data transfer and archiving services, such as local disks, NAS, HPSS, FTP, Amazon Web Services, etc. The interface also introduces a local mirror concept so that multiple users on the same computing platform can share a single copy of the gigantic data.
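The core bookkeeping idea (content-addressing each large file by its SHA512 digest and keeping only a small pointer file in the Git space) can be sketched as follows; this is a conceptual illustration under assumed file layouts, not the git-qing implementation.

```python
import hashlib
import shutil
from pathlib import Path

def sha512_of(path, chunk=1 << 20):
    """SHA512 digest of a file, read in chunks so large files fit in memory."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def stash_large_file(path, store_dir="qing_store"):
    """Conceptual sketch: move a large binary into a content-addressed store
    keyed by its SHA512 digest and leave a small pointer file for Git to track."""
    path, store = Path(path), Path(store_dir)
    store.mkdir(exist_ok=True)
    digest = sha512_of(path)
    target = store / digest
    if not target.exists():
        shutil.copy2(path, target)                 # one shared copy per site
    Path(str(path) + ".qing").write_text(digest + "\n")
    path.unlink()                                  # only the pointer stays in Git
    return digest
```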
Impact of Changing Atmospheric Physics Schemes in Subseasonal-Length Coupled UFS Simulations
Benjamin W. Green, Eric Sinsky, Shan Sun, Georg A. Grell, and Vijay Tallapragada
Subseasonal prediction remains a uniquely challenging problem because the timescale involved cannot take much advantage of the memory imparted by atmospheric initial conditions (leveraged for predictions shorter than ~2 weeks) or of the slowly evolving boundary forcings (leveraged for predictions longer than ~3 months). Regardless, interest in subseasonal prediction has grown substantially over the past decade owing to the identification of so-called “forecasts of opportunity” and the potential benefits of these forecasts to numerous sectors of society. Recognizing the demand for subseasonal forecasts, NOAA has been developing a fully coupled Earth system model under the Unified Forecast System (UFS) framework which will be responsible for global (ensemble) predictions at lead times of 0-35 days. The development has involved several prototype coupled UFS runs consisting of bimonthly initializations over a 7-year period for a total of 168 cases. This presentation shows results from a study that leverages these existing baseline prototypes to isolate the impact of substituting (one at a time) parameterizations for convection, microphysics, and the boundary layer on 35-day forecasts. It is found that no particular configuration of atmospheric physics within the coupled UFS is uniformly better or worse for subseasonal prediction, based on several metrics including mean-state biases and skill scores for the Madden-Julian Oscillation, precipitation, and 2-m temperature. Importantly, the spatial patterns of many “first-order” biases (e.g., the impact of convection on precipitation) are remarkably similar between the end of the first week and weeks 3-4, indicating that some subseasonal biases may be mitigated through tuning at shorter timescales. An additional convective parameterization test using a different baseline shows that attempting to generalize specific results within the UFS may be misguided. Finally, an additional set of runs was able to isolate the impact of changing the default microphysics scheme between two of the coupled prototypes.
The GEFS Reforecasts to Support Subseasonal and Hydrometeorological Applications
Hong Guan, Yuejian Zhu, Bing Fu, Eric Sinsky, Xianwu Xue, Philip Pegion, Fanglin Yang, and Avichal Mehra
A 31-year (1989-2019) GEFSv12 reforecast has been generated at the National Centers for Environmental Prediction (NCEP) to support the GEFSv12 operational implementation. The dataset is used to support forecast calibration and validation for key stakeholders, including the Office of Water Prediction (OWP), the NCEP Climate Prediction Center (CPC), and other downstream applications of the National Weather Service, and also serves as a useful tool for the broader research community in different applications. The dataset is freely accessible through Amazon Web Services (AWS): https://registry.opendata.aws/noaa-gefs-reforecast/ and NCEP rzdm: ftp://ftp.emc.ncep.noaa.gov/GEFSv12 (anonymous).
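As a hedged illustration, the AWS copy can typically be read anonymously with standard S3 tooling; the bucket name below is an assumption, so consult the Open Data registry page above for the authoritative layout.

```python
# Sketch: anonymously listing a few objects from the GEFSv12 reforecast archive.
# The bucket name "noaa-gefs-retrospective" is an assumption; consult
# https://registry.opendata.aws/noaa-gefs-reforecast/ for the current layout.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
resp = s3.list_objects_v2(Bucket="noaa-gefs-retrospective", MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```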
For the coming GEFSv13 implementation, which will use a fully coupled global ensemble forecast system based on the Unified Forecast System (UFS), a 30-year (1994-2023) reforecast will be generated with a slightly different configuration from GEFSv12 to support weather and S2S applications. This study will summarize the GEFSv12 reforecast dataset and its systematic characteristics and present the development plan for the GEFSv13 reforecast. The systematic characteristics of GEFSv13 based on 2-year experiments will also be compared with the GEFSv12 reforecasts, particularly for systematic errors of the atmosphere and ocean surface and for MJO prediction.
Evaluation and Improvement of NOAA’s Hurricane Analysis and Forecast System (HAFS) Using Aircraft Observations
Andrew Hazelton, Ghassan Alaka, Sundararaman Gopalakrishnan, Xuejin Zhang, and Lew Gramer
The Hurricane Analysis and Forecast System (HAFS) is the hurricane application of the Unified Forecast System (UFS) and has been accepted for operational implementation during the 2023 hurricane season. AOML has been heavily involved in the development and evaluation of HAFS, including conducting real-time and retrospective forecasts and using airborne data to evaluate TC structure and improve model physics. One of the two operational HAFS configurations (“HAFS-B”), in addition to using the Thompson microphysics, includes PBL physics modified for TC environments through the use of aircraft observations and LES data. We perform a comprehensive evaluation of HAFS-B through sensitivity tests that independently switch the PBL and microphysics schemes, to isolate how the different physics modifications impact the forecasts for a large subset of key cases from the 2020-2022 retrospective forecasts. The results show the importance of the PBL physics modifications for improving forecasts of TC rapid intensification, as illustrated in composite RI statistics as well as careful analysis of several case studies. The microphysics differences also lead to notable changes in track skill as well as vortex depth and precipitation structure. In addition to this physics testing, the HAFS-B retrospective forecasts are directly evaluated through comparison with airborne radar data from NOAA P-3 flights into multiple Atlantic hurricanes over the last three seasons. The results from this observational comparison illustrate some of the strengths and weaknesses of the HAFS-B structure forecasts and motivate ongoing physics development to further refine and improve the forecasts.
Digitalizing Atmospheric Systems
Wei Huang
A numerical method is developed to digitalize atmospheric systems into 6 kinds:
- High pressure, with warm temperature, and high density;
- Low pressure, with cold temperature, and low density;
- High pressure, with warm temperature, but low density;
- Low pressure, with cold temperature, but high density;
- Low pressure, with warm temperature, but low density;
- High pressure, with cold temperature, but high density.
These 6 kinds can be grouped into two modes. The first 2 kinds are thermal-dynamic systems, for which the dot product of temperature and density is greater than (or equal to) zero; the other 4 kinds can be treated as non-thermal-dynamic systems, for which the dot product of temperature and density is less than zero. Further analysis divides the non-thermal-dynamic systems into two types: kinds 3 and 4 are thermal systems, where the dot product of pressure and temperature is greater than zero (pressure mainly follows temperature), meaning these 2 systems are temperature driven, or thermally driven; kinds 5 and 6 are dynamic systems, where the dot product of pressure and density is greater than zero (pressure mainly agrees with density), meaning these 2 systems are density driven, or dynamically driven.
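As a rough illustration of this grouping (a minimal sketch with hypothetical variable names, assuming the “dot product” of two fields refers to the pointwise product of their local anomalies, which is one possible reading rather than the author's exact formulation):

```python
import numpy as np

def classify_systems(p_anom, t_anom, rho_anom):
    """Assign each grid point to one of the three groups described above:
    0 = thermal-dynamic (kinds 1-2), 1 = thermal (kinds 3-4),
    2 = dynamic (kinds 5-6)."""
    thermal_dynamic = (t_anom * rho_anom) >= 0.0                 # temperature and density agree
    thermal = (~thermal_dynamic) & ((p_anom * t_anom) > 0.0)     # pressure follows temperature
    dynamic = (~thermal_dynamic) & ((p_anom * rho_anom) > 0.0)   # pressure follows density

    index = np.zeros(p_anom.shape, dtype=int)
    index[thermal] = 1
    index[dynamic] = 2
    return index

# Toy example on a small grid of random anomalies
rng = np.random.default_rng(0)
print(classify_systems(rng.standard_normal((4, 4)),
                       rng.standard_normal((4, 4)),
                       rng.standard_normal((4, 4))))
```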
GFS model analysis data are used to calculate the above indices, and the resulting plots show patterns corresponding to known subtropical high regions, tropical low regions, and others. Such an index does not change with altitude, therefore giving a clearer picture of atmospheric systems.
IFS model analysis data are also used for analysis. Surprisingly, GFS and IFS have very different indices, especially in the upper troposphere.
Further study will investigate the root cause of the difference, and will try to find a solution to reduce those differences.
Generating a Flexible Verification System for Precipitation by Assessing Precipitation Skill in the Unified Forecast System (UFS) and North American Multi-Model Ensemble (NMME) via Model Evaluation Tools (METplus)
Johnna M. Infanti, Emily Becker, Tim Eichler, Dan C. Collins, Justin Hicks, Tara Jensen, Ben P. Kirtman, and John Opatz
Integration of the NOAA Unified Forecast System (UFS) Seasonal Forecast System (SFS) into the seasonal research and forecasting communities relies on assessment of the skill and biases of precipitation over North America in both hindcasts and real-time forecasts, as well as comparison to existing seasonal model forecasts. The National Center for Atmospheric Research (NCAR)’s enhanced Model Evaluation Tools (METplus) verification framework is intended to be used to verify the UFS; it has a large library of verification metrics and a community-support approach, making it a natural choice for creating a verification system for UFS-SFS and the necessary comparisons. We aim to create a flexible verification framework utilizing METplus that allows streamlined assessment of seasonal precipitation forecast skill, including deterministic and probabilistic hindcast, real-time, and conditional skill related to key drivers of precipitation such as the El Niño-Southern Oscillation (ENSO) and soil moisture, and that can be easily expanded to any climate model ensemble. Via METplus, we are able to calculate a variety of metrics including bias, error, anomaly correlation, Brier Skill Score, Ranked Probability Skill Score, and more. These metrics are available without the need to code them in a preferred language such as Python, which can minimize code bugs in verification and allow for more consistency when calculating skill scores. The development, documentation, and demonstration of these process-based model capabilities will provide valuable feedback to the UFS model development team and community, with the potential to improve the key modes of variability that impact seasonal precipitation forecasts.
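As a point of reference for one of the metrics listed above, a hand-rolled Brier Skill Score against a climatological reference can be written in a few lines (METplus computes this internally; the sketch below is illustrative only, with made-up numbers):

```python
import numpy as np

def brier_skill_score(prob_fcst, obs_binary, clim_prob):
    """BSS = 1 - BS_forecast / BS_reference, where the reference forecast
    always issues the climatological event probability."""
    bs_fcst = np.mean((prob_fcst - obs_binary) ** 2)
    bs_ref = np.mean((clim_prob - obs_binary) ** 2)
    return 1.0 - bs_fcst / bs_ref

# Toy example: probabilities of above-normal seasonal precipitation
fcst = np.array([0.7, 0.2, 0.5, 0.9, 0.1])
obs = np.array([1, 0, 1, 1, 0])
print(brier_skill_score(fcst, obs, clim_prob=1.0 / 3.0))
```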
METplus: The Long and Winding Road to Unified Verification
Tara Jensen, John Halley Gotway, Molly Smith, Bonny Strong, John Opatz, Tina Kalb, George McCabe, Hank Fisher, Jonathan Vigh, Minna Win-Gildenmeister, Dan Adriaansen, and Julie Prestopnik
With the evolution of weather and climate prediction using the Unified Forecast System (UFS), verification and evaluation activities are critical for the success of Research to Operations (R2O) across the UFS community. The enhanced Model Evaluation Tools (MET) framework (METplus) is at the core of an expanding cross-section of the community evaluation activities.
The METplus system consists of several components, including MET, which computes verification statistics from gridded forecasts and either a gridded analysis or point-based observations. The system also incorporates an analysis component for aggregating statistics and plotting graphical results. These tools are designed to be highly flexible to allow for quick adaptation to meet additional evaluation and diagnostic needs. A suite of Python wrappers has been implemented in METplus to facilitate quick set-up and implementation of the system and to enhance the pre-existing plotting capabilities.
All components of METplus were recently accepted for installation on the National Oceanic and Atmospheric Administration’s (NOAA’s) operational high-performance computing platform and are being integrated into the Environmental Modeling Center (EMC) Verification System (EVS). METplus has also undergone several years of rapid development to better support evaluations on many temporal and spatial scales for the atmospheric component of a coupled Earth system model, into which the UFS is evolving. Development also continues to broaden its support for the other components of the UFS. This presentation will focus on the challenges and successes of developing the community software suite, including addressing the needs of operational partners, accepting contributions to facilitate R2O of evaluation methods, and refactoring for the future.
Augmented Global Background Perturbations for Mitigating Sampling Errors for Regional Applications of the UFS
Kenta Kurosawa and Jonathan Poterjoy
In convective-scale numerical weather prediction (NWP) models, handling sampling errors arising from small ensemble sizes, model errors, and nonlinear model dynamics is a substantial challenge. Increasing the ensemble size is a promising prospect for improving prediction accuracy; however, it often encounters limitations, particularly the computational resources needed for handling large-scale, high-resolution datasets. The current study introduces a method to address the issue of sampling error in convective-permitting data assimilation by enhancing the flow-dependent background error covariances from the filter update with perturbations sourced from a different model. This approach uses the advantages of a hybrid background error covariance matrix, integrating a more comprehensive representation of system behavior. Specifically, we supplement the ensemble with perturbations from the Global Data Assimilation System (GDAS), recentered on the ensemble mean of the NOAA Hurricane Analysis and Forecast System (HAFS) at each analysis time. Additionally, we conduct a comparative analysis of the Ensemble Kalman Filter (EnKF), localized particle filter (LPF), hybrid LPF-EnKF, and EnKF/LPF combined with En3DVar using the augmented perturbation strategy. Experiments that rely on 80 augmented perturbations from GDAS for updating 40-member ensembles with the EnKF and LPF are found to produce substantial improvements over benchmark experiments. Likewise, a hybrid LPF-Var method, which blends the strengths of parametric and non-parametric data assimilation, consistently demonstrates superior performance to the current EnKF/EnVar strategy. The new approaches are evaluated over multi-week cycling data assimilation experiments focusing on Hurricanes Laura and Marco in August 2020.
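In schematic form (our notation, not the authors’), each augmented member is a GDAS perturbation recentered on the HAFS ensemble mean,

$$\mathbf{x}^{\mathrm{aug}}_{j} \;=\; \bar{\mathbf{x}}^{\mathrm{HAFS}} \;+\; \bigl(\mathbf{x}^{\mathrm{GDAS}}_{j} - \bar{\mathbf{x}}^{\mathrm{GDAS}}\bigr), \qquad j = 1, \ldots, 80,$$

so the background ensemble used in the update combines the 40 native HAFS members with the 80 recentered GDAS perturbations.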
Recent Improvements and Near-Future of NCEP’s Global Workflow
Walter Kolczynski, Rahul Mahajan, Kate Howard, Xianwu Xue, Cory Martin, Henry Winterbottom, and Terry McGuinness
The global workflow has accumulated many disparate scripts over the years as different developers have contributed additional features without overarching code standards. We are now in the process of standardizing not just our code but our processes. Here we review some of the strides we have already made in standardization, from refactoring existing code to implementing new processes that ensure new code complies with standards. We also take a peek at our Python-based future.
Waves in the Unified Forecast System
Jessica Meixner, Chris Birchfield, Darin Figurskey, Michael Folmer, Matthew Masarik, Saeideh Banihashemi, Ricardo Campos, Salimi-Tarazouj, and Avichal Mehra
Within the Unified Forecast System (UFS) community, it is frequently asked why waves are modeled. Wave forecasts can be used for safe recreation: swimmers can be made aware of the potential for rip currents, surfers can learn of wave conditions, and boaters can be made aware of conditions for small watercraft. Moving offshore, wave forecasts help save life and property at sea. The next inquiry is typically about the computational cost of the wave model. In the UFS, the third-generation spectral wave model WAVEWATCH III is used. The model solves for the wave action density, which is a function of time, geographic space, wave direction, and wavenumber. This means that at every point in time and geographic space there are O(10^3) degrees of freedom to solve for, explaining why the wave model is more expensive than the two-dimensional surface problem many assume it to be. The follow-up question that is frequently asked is why waves are included in the coupled system. Currently, waves are actively developed as part of the coastal, hurricane, and global UFS applications. In coastal applications, waves can contribute up to a meter to the total water height in coupled wave-surge models. In global models, waves at the sea surface contribute to mixing in the ocean and can provide a sea-state-dependent roughness to the atmospheric model. Feedback from the wave model to the ocean and atmosphere in the hurricane application is planned as future work. Additionally, looking to the future, the wave models in UFS coupled systems could provide sea spray to the atmospheric and aerosol models and wave states to the ice model to improve the representation of wave-ice interaction. Here, we explain the importance of investing resources in these complex wave models and how they are being used operationally by weather forecasters and public safety decision-makers to keep the community safe.
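For a sense of scale (illustrative numbers only, not the operational spectral grid): the prognostic variable is the wave action density $N(t, x, y, k, \theta)$, and with, say, 25 wavenumber (frequency) bins and 36 directional bins, each horizontal grid point carries

$$25 \times 36 = 900 \approx O(10^{3})$$

spectral degrees of freedom, compared with only a handful of prognostic variables per point in a purely two-dimensional surface model.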
Offshore Wind and Wave Energy Resource Characterization Using the Real-Time Mesoscale Analysis (RTMA)
Panagiotis Mitsopoulos and Malaquias Peña
Wind speed and significant wave height are the most critical metocean parameters for characterizing offshore wind and wave climate. Although the two variables are physically coupled, they are usually studied independently, and their relationship is not fully understood under several marine conditions. At the same time, wind speed and significant wave height are critical parameters for engineering applications. Offshore renewables, specifically energy generated from wind turbines and marine power, are at the forefront of current and future energy management developments.
On the one hand, the nascent offshore wind energy industry is at the cutting edge of offshore renewables due to its mature technology. On the other hand, ocean wave-generated power devices are advancing towards commercialization as the first wave power testing sites (Freeman et al.) are being permitted. Therefore, accurate wind speed and significant wave height estimations from state-of-the-art analysis models and their synergy with in-situ observations from buoys, remotely sensed satellite data, and hindcast or reanalysis estimations (Guillou et al.) provide essential complementary information for energy resource assessment.
The 3D-RTMA provides an analysis of surface meteorological variables with potential use for offshore wind and wave power characterization. Its high spatial (2.5 km) and temporal (15-minute) resolution can be leveraged to accurately estimate the wind and wave power, expanding the value of in-situ observations to characterize resource spatial and temporal variability. In addition, the expected physically consistent wind-wave model of the 3D-RTMA may help detect potential sites that could be ideal locations for wind and wave energy cogeneration.
This presentation will use the National Blend of Models (NBM) data, which includes hourly 80-meter wind speed estimations and significant wave heights, to show the potential use of the upcoming 3D-RTMA data and the requirements of the coastal and offshore renewable energy industry. Essential information for wind energy resource assessment and the evaluation of forecasts relies on data from scarce lidar buoys or offshore wind towers; thus, the 80-meter wind products are useful for filling important geographical gaps. However, considering that the current generation of wind turbines can reach up to 280-meter levels, it is imperative to consider additional model levels for better representation of offshore wind speed and direction and energy production. The importance of the 3D-RTMA marine component is also emphasized, considering the deterioration of satellite altimeter significant wave height quality within 15 km of the coast and in areas where land contamination of the altimeter signal is significant (Mitsopoulos et al.). Therefore, in addition to the hourly significant wave height estimations, we suggest disseminating the peak wave period or, ideally, adding the wave spectrum to the marine component, which will be pivotal to estimating the wave power flux accurately. Wave power flux estimations based on National Data Buoy Center data are introduced, including the wave spectrum and empirical parameterizations using the ERA5 significant wave height and peak period. We also stress the need for a sea-state analysis with higher spatial and temporal resolution.
Contributions of this study include identifying geographical locations where the co-located wave and wind energy sources complement each other based on their temporal variability. These locations are optimal for maximizing the complementarity of the wind and wave resources, and providing this information is critical for assessing energy generation, storage, and several other engineering applications.
The study’s impact relies on describing how a real-time analysis system can benefit resource assessment and reduce the errors inherent in estimations using reanalysis or hindcast data. In addition, we emphasize the RTMA areas of improvement in describing wind/wave mesoscale variability and the scales of variability it currently does not capture, which are crucial for managing power production.
Developing the NOAA National Ocean Service Coastal Ocean Models Coupling Infrastructure (ufs-coastal)
Saeed Moghimi, Maoyi Huang, Panagiotis Velissariou, Yunfang Sun, Ed Myers, Corey Allen, Tracy Fanara, Derrick Snowden, Patrick Burke, Carolyn Lindley, Arun Chawla, Ufuk Turuncoglu, Dan Rosen, Carsten Lemmen, Joseph Zhang, Damrongsak Wirasaet, Joannes Westerink, Jianhua Qi, Changsheng Chen, Hernan G. Arango, John Wilkin, Ayumi Fujisaki-Manome, Chris Domanti, and Keven Blackman
NOAA’s National Ocean Service (NOS) is partnering with Oceanic and Atmospheric Research (OAR), the National Weather Service (NWS), and the coastal ocean modeling community to develop its next-generation coastal ocean coupling infrastructure for integration into the Unified Forecast System portfolio. In this presentation, we will share our draft roadmap to build the UFS COASTAL (ufs-coastal-model) coupling infrastructure and its downstream applications (ufs-coastal-apps). This effort is reinforced by critical support provided through the Bipartisan Infrastructure Law (BIL), under which multiple projects were funded to support the coastal ocean modeling community, the ESMF/NUOPC development team, and NOAA industry partners.
Using Deep Learning to Forecast Marine Ecosystems
Gian Giacomo Navarra
Several studies have documented the strong link between North Pacific marine populations and climate variability. Yet the extent to which these relations lead to improved ecosystem forecasts remains unclear. Using a deep learning tool (neural network), we explored the extent to which physical predictability leads to multi-year prediction of dominant fisheries indicators. We found that an Encoder-Decoder (ED) deep learning model was able to reproduce the testing data with significant skill up to 3 years of lead time. Our results suggest an important role played by the physical drivers, SST and SSH, in the forecast of the ecological time series, even if they are not the only contribution. The ability of deep learning tools to produce skillful forecasts further into the future represents a promising approach for many socioeconomic communities to adapt to climate change.
The Weather Prediction Center Development and Training Branch: R2O Activities Within the Hydrometeorological Testbed (HMT)
James Nelson
The Weather Prediction Center (WPC) vision of being America’s go-to center for high-impact precipitation and hazardous weather events is supported by the Development and Training Branch (DTB) of WPC. WPC forecast operations range from short term (<6hrs) to long term (up to 7 days). WPC is focused on high-impact precipitation (heavy rainfall, heavy snowfall, and ice) in the short term. At longer time ranges, WPC forecasters prepare forecasts of sensible weather elements (temperature, dewpoint, wind, sky, precipitation, and chance of precipitation) with particular focus on potential weather hazards.
In order to support and advance WPC’s forecast operations, a cornerstone of DTB is the R2O process. DTB engages in collaborative efforts with many scientists throughout the meteorological community focused on heavy rainfall, heavy snowfall, and medium-range forecasts. One method for engaging the Weather Enterprise (UFS, academia, NOAA labs, private sector, etc.) is through the Hydrometeorological Testbed (HMT) at WPC. The DTB also leverages the NCEP Visiting Scientist Program and other avenues to stay abreast of the latest science and technology. Particular focus is given to the use of ensembles in the forecast process as well as the production of probabilistic products. This presentation will highlight the HMT experimental forecast and verification activities supporting R2O that enhance the National Weather Service’s Weather-Ready Nation initiative.
Joint Technology Transfer Initiative: Building a Bridge Between the Weather Research and Operational Communities
Chandra Kondragunta, Aaron Pratt, Valbona Kunkel, Kevin Garrett, Nicole Kurkowski, and Wendy Sellers
In FY2016, the National Oceanic and Atmospheric Administration (NOAA) Office of Oceanic and Atmospheric Research’s (OAR) appropriation included an increase of $6M to create a new program called the Joint Technology Transfer Initiative (JTTI). OAR received increased funding in subsequent years in support of this program. OAR has carried out this program in coordination with the National Weather Service (NWS) and in cooperation with the American Weather Enterprise. JTTI’s main mission is to transition promising, mature weather research from the American Weather Enterprise into NWS operations.
Within OAR, the Weather Program Office (WPO) is responsible for managing the JTTI program. Promising transition projects are selected through external competitions (Notices of Funding Opportunity (NOFO) via Grants.gov) for non-NOAA scientists and internal competitions for NOAA scientists. More than half of JTTI-funded R2O projects support advancement of the Unified Forecast System (UFS) through NOFOs, internal competitions, and the NWS Office of Science and Technology Integration-led UFS R2O project. To further advance the UFS, this year JTTI partnered with WPO’s Earth Prediction Innovation Center (EPIC) program in the Innovations for Community Modeling competition to fund projects that accelerate the transition of scientific research and modeling contributions to NOAA operations. In addition, FY23 appropriations directed JTTI to support faster adoption of operationalizable weather model upgrades. In this paper, we present how JTTI has built a bridge connecting the weather research community to the operational community, describe JTTI-funded UFS R2O projects, and explain how JTTI supports the EPIC mission.
High-Resolution Simulation of Smoke Transport Within NOAA’s Rapid-Refresh Forecasting System: Verification of the Smoke 3D Distribution Using Aircraft Measurements
Johana Romero, Ravan Ahmadov, Haiqin Li, Jordan Schnell, Eric James, Shobha Kondragunta, Xiaoyang Zhang, Fangjun Li, Chelsea Stockwell, Siyuan Wang, and Georg Grell
Accurate smoke forecasting helps minimize population exposure to hazardous pollutants, improves weather forecasting, and guides wildfire-fighting operations. NOAA’s Global Systems Laboratory is developing a new experimental weather forecast model, the Rapid-Refresh Forecasting System (RRFS), that simulates the transport and mixing of smoke and dust aerosols. This experimental RRFS-Smoke-Dust model covers the CONUS domain at a 3 km resolution. The smoke emissions are based on satellite fire radiative power data from the hourly Regional ABI and VIIRS fire Emissions product (RAVE) that fuses FRP from the Advanced Baseline Imager (ABI) on the Geostationary Operational Environmental Satellites (GOES-R) and the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Joint Polar Satellite System (JPSS) satellites. Fire plume rise and dry and wet removal parameterizations are also included in the model.
Here we use in-situ and remote sensing measurements taken onboard the DC-8 aircraft during the 2019 FIREX-AQ field campaign to evaluate the modeled mixing layer, fire injection heights, and aloft smoke concentration distribution.
Simulated smoke vertical distribution compares well with that estimated using the aircraft’s high-spectral-resolution lidar measurements. The distribution of smoke near the surface is also well-represented. Evaluation of the performance of RRFS-SD smoke predictions via comparisons with operational HRRR-Smoke is also discussed.
Management and Contributions of Supplemental Projects to the Advancement of the Unified Forecast System (UFS)
Christopher Spells and Ben Woods
Each year, extreme weather including floods, wildfires, hurricanes, and heavy precipitation has massive societal and economic impacts on the United States. As extreme weather events have become more costly, frequent, and devastating, it is critical that resources are allocated to improve weather forecast accuracy, helping to mitigate the impacts on life, property, and the economy. As detailed in this poster, the National Oceanic and Atmospheric Administration’s (NOAA) Weather Program Office (WPO) is managing research portfolios that contribute to the advancement of the Unified Forecast System (UFS).
Following active 2017 and 2018 hurricane seasons and an active 2019 wildfire season, Congress passed the Bipartisan Budget Act in February 2018 and the Additional Supplemental Appropriations for Disaster Relief Act in June 2019.
Variational Assimilation of Surface Particulate Matter Observations in the Experimental Rapid Refresh Forecast System Coupled With Smoke and Dust Model
Hongli Wang, Stephen Weygandt, Ravan Ahmadov, Ruifang Li, Johana Romero-Alvarez, Haiqin Li, Youhua Tang, Georg Grell, Mariusz Pagowski, and Cory Martin
The NOAA Global Systems Laboratory (GSL) has developed an experimental FV3-based limited-area Rapid Refresh Forecast System (RRFS) Smoke and Dust model (RRFS-SD) aimed at operational application at NCEP/EMC. This presentation describes the recent development of a surface particulate matter (PM2.5 and PM10) assimilation scheme for providing accurate smoke and dust initial conditions to RRFS-SD in the framework of the Gridpoint Statistical Interpolation (GSI) three-dimensional variational (3D-Var) data assimilation system. The impact of the developed PM2.5 assimilation on fire prediction is evaluated for the heavy fire events that took place in the US during September 2020. In general, it is found that the assimilation of PM2.5 from the AirNow or PurpleAir observing networks reduces the bias in 24-h PM2.5 simulations during the heavy fire events. Challenges in assimilating surface PM2.5 will also be discussed.
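For context, a 3D-Var analysis of this kind minimizes a cost function of the standard variational form (shown schematically; the operational GSI implementation includes additional terms and preconditioning),

$$J(\mathbf{x}) \;=\; \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\top}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b) \;+\; \tfrac{1}{2}\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\top}\mathbf{R}^{-1}\bigl(\mathbf{y}-H(\mathbf{x})\bigr),$$

where $\mathbf{x}_b$ is the background aerosol state, $\mathbf{y}$ the surface PM2.5/PM10 observations, $H$ the observation operator, and $\mathbf{B}$ and $\mathbf{R}$ the background and observation error covariances.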
Atmosphere-Ocean Coupled Energetics of Shallow and Deep Tropical Convective Discharge-Recharge Cycles in the UFS
Brandon Wolding
An energy budget combining atmospheric moist static energy (MSE) and upper-ocean heat content (OHC) is used to examine how day-to-day convective variability couples to flows of energy into, out of, and between the tropical atmosphere and ocean. Two slowly evolving states of approximate MSE balance are identified, corresponding to the suppressed and enhanced convective equilibrium states identified by previous multi-equilibria studies of the tropical atmosphere. Feedbacks arising from atmospheric and oceanic transport processes, surface fluxes, and radiation drive the cyclical amplification and decay of convection around these suppressed and enhanced convective equilibrium states, referred to as shallow and deep convective discharge-recharge (D-R) cycles, respectively. Variations in the flows of energy into and out of the atmosphere are comparable in magnitude to, but considerably more balanced than, those experienced by the upper ocean. Variations in the quantity of atmosphere-ocean coupled static energy (MSE + OHC) result primarily from atmospheric and oceanic transport processes, but are mainly realized as changes in OHC. Changes in the amount, type, and organization of clouds occurring throughout D-R cycles are characterized using satellite-derived precipitation and cloud-type datasets and related to variations in the atmosphere-ocean coupled energy budget. This analysis is repeated for the coupled UFS and used to provide actionable guidance to model developers seeking to improve the representation of tropical convective variability and atmosphere-ocean coupling.
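For reference, a common definition of the moist static energy referred to above (not necessarily the exact formulation used in the study) is

$$\mathrm{MSE} \;=\; c_p T \;+\; g z \;+\; L_v q,$$

with $c_p$ the specific heat at constant pressure, $T$ temperature, $g$ gravity, $z$ height, $L_v$ the latent heat of vaporization, and $q$ specific humidity; the coupled quantity examined here combines column-integrated MSE with the upper-ocean heat content (OHC).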
A Novel Dynamically Coupled Land-River-Ocean Modeling Suite for Hurricane-Induced Compound Flooding
George Xue, Daoyang Bao, and John C Warner
We introduced WRF-Hydro into the Coupled Ocean-Atmosphere-Wave-Sediment Transport (COAWST) Modeling System. The river model (WRF-Hydro) is coupled with the ocean model (ROMS) along the land-ocean boundary, where water level and velocity information is exchanged dynamically. The Model Coupling Toolkit was applied at the hydrological and ocean model boundary to ensure the seamless exchange of water level and momentum. We applied the system to simulate the water dynamics during the compound flooding of several hurricanes, including Harvey in 2017, Florence in 2018, and Ida in 2021. A series of diagnostic experiments were conducted to assess the contributions of the ocean and river to the compound effect. Our findings reveal that the compound effect during flooding events acts as a double-edged sword, yielding contrasting outcomes. In certain locations, the compound effect amplifies the peak flood level, while in others, it diminishes it.
Satellite Radiance Data Assimilation in the NOAA’s Next-Generation Regional Model-Rapid Refresh Forecast System (RRFS)
Xiaoyan Zhang, Ting Lei, Shun Liu, Haidao Lin, and Jacob R. Carley
The next-generation operational regional forecast and data assimilation system at the National Oceanic and Atmospheric Administration (NOAA) is a unified, hourly-updated, storm-scale ensemble data assimilation and forecasting system based on the FV3 dynamic core, which will be called the Rapid Refresh Forecast System (RRFS). As satellite radiance data increase in both spatial and temporal resolution, effectively assimilating them in a high-resolution regional model to improve weather forecasts remains a challenging problem. Although satellite radiance data have been assimilated in the operational regional models (NAM and RAP) for a decade, their use needs to be re-addressed within the new RRFS system by evaluating the performance of each satellite instrument. To do so, a series of data sensitivity experiments have been performed within RRFS, adding one type of satellite data in each experiment in the order ABI, AMSUA, ATMS, MHS, CrIS, and IASI. Across these experiments, the best forecast scores for relative humidity within 12 hours were obtained from GOES-16/18 ABI clear-sky data. Polar satellite radiance data have a relatively small impact on all forecasts. More details about each type of satellite data’s performance in RRFS will also be discussed in the presentation.
Virtual Posters – Please see the UIFCW 2023 Slack #virtual-poster workspace to view the posters (abstracts below)
Evaluation of Land-Atmosphere Coupling Processes and Climatological Bias in the UFS Global Coupled Model
Eunkyo Seo, Paul A. Dirmeyer, and Michael Barlage
This study investigates the performance of the later NCEP Unified Forecast System (UFS) Coupled Model prototype simulations (P5-P8) during boreal summer 2011-2017 with regard to coupled land-atmosphere processes and their effect on model bias. Major land physics updates were implemented during the course of model development: namely, the Noah land surface model was replaced with Noah-MP, and the global vegetation dataset was updated starting with P7. These changes occurred alongside many other UFS improvements. This study investigates the UFS’s ability to simulate observed surface conditions in 35-day predictions based on the fidelity of model land surface processes. Several land surface states and fluxes are evaluated against flux tower observations across the globe, and segmented coupling processes are also diagnosed using process-based multivariate metrics. Near-surface meteorological variables generally improve, especially surface air temperature, and the land-atmosphere coupling metrics better represent the observed covariance between surface soil moisture and surface fluxes of moisture and radiation. Moreover, this study finds that temperature biases over the contiguous United States are connected to the model’s ability to simulate the different balances of coupled processes between water-limited and energy-limited regions. Sensitivity to land initial conditions is also implicated as a source of forecast error. Above all, this study presents a blueprint for the validation of coupled land-atmosphere behavior in forecast models, which is a crucial model development task to assure forecast fidelity from day one through subseasonal timescales.
As a specific example, the accompanying figure shows the spatial distribution of the land coupling regime during JJA over CONUS. Panels (a) and (b) represent validation targets constructed from the Global Land Evaporation Amsterdam Model (GLEAM; Martens et al., 2017) gridded latent heat flux estimates. In (a), surface soil moisture is from the time-filtered satellite COMBINED European Space Agency (ESA) Climate Change Initiative (CCI) Soil Moisture v06.1 dataset (Dorigo et al., 2017) as corrected by Seo and Dirmeyer (2022). In (b), both latent heat flux and surface soil moisture are from GLEAM. The remaining panels show UFS performance from successive prototypes as derived from 35-day subseasonal suites of retrospective forecasts: (c) P5, (d) P6, (e) P7, and (f) P8. Shading indicates the correlations shown in the colored square: latent heat flux (LH) to surface soil moisture (SSM) on the x-axis and net radiation (Rn) on the y-axis. The kernel density estimations from GLEAM (black), P6 (orange), P7 (green), and P8 (cyan) along the corresponding axes are shown as marginal distributions along the edges of the color square. Each curve has been normalized to have the same maximum value. P7 and P8 show marked differences, including a more distinct area of strong positive correlation of LH with Rn over the eastern half of the US, consistent with the validation panels. P7 and P8 appear to have a more widespread area of high correlation between LH and SSM across the Southwest, but this correlation remains too weak in the Northwest.
Evaluating the Multiscale Implementation of Valid Time Shifting for a Convective-Scale FV3-LAM EnVar System During the 2022 Hazardous Weather Testbed
Nicholas Gasperoni, Xuguang Wang, Yongming Wang, and Tsung Han Li
The valid time shifting (VTS) method is a cost-effective way to increase the background ensemble size for ensemble-based data assimilation (DA) systems. This is accomplished by including ensemble information at valid times before and after the central analysis time, taken from ensemble forecasts initialized from the base ensemble analysis at the previous DA cycle. In this study, VTS was implemented within the multiscale GSI-based hybrid ensemble-variational (EnVar) system for the real-time 2022 Hazardous Weather Testbed (HWT) Spring Forecasting Experiment (SFE). This hybrid system is run by the Multi-scale data Assimilation and Predictability (MAP) laboratory. The system includes hourly sequential multiscale DA of conventional in-situ observations (mesoscale environment) and radar reflectivity (storm-scale) observations.
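Schematically, VTS simply stacks the ensemble forecasts valid before, at, and after the analysis time into one larger background ensemble; the sketch below assumes a 36-member base ensemble and a single time shift, which is one way to arrive at the 108-member ensembles described in the next paragraph (the actual configuration may differ):

```python
import numpy as np

def vts_expand(fcst_minus, fcst_center, fcst_plus):
    """Concatenate ensembles valid at t - dt, t, and t + dt into a single
    VTS-expanded background ensemble.
    Inputs: arrays of shape (n_members, n_state); output: (3*n_members, n_state)."""
    return np.concatenate([fcst_minus, fcst_center, fcst_plus], axis=0)

# Toy example: 36 members at each of three valid times -> 108 members
n_members, n_state = 36, 1000
rng = np.random.default_rng(1)
expanded = vts_expand(*[rng.standard_normal((n_members, n_state)) for _ in range(3)])
print(expanded.shape)  # (108, 1000)
```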
During the 2022 SFE, VTS was tested for both the conventional and radar DA components, with the goal of identifying the relative impacts of VTS for different scale components of the DA on forecasts of the mesoscale environment and convective systems. Three real-time configurations were run during the 2022 HWT SFE with 108-member VTS-expanded ensembles: VTS for the individual mesoscale conventional DA (ConVTS) or storm-scale radar DA (RadVTS), and VTS integrated into both scale DA components (BothVTS). Systematic verification demonstrated that BothVTS can capture the DA spread and accuracy of the best-performing individual-component VTS. Ten-member ensemble forecasts showed BothVTS performs similarly to ConVTS, with RadVTS having better skill at forecast hours 1-6 while Both/ConVTS had better skill at later hours 7-15. An objective splitting of cases by 2-m temperature cold bias revealed that RadVTS performed significantly better than Both/ConVTS out to hour 10 for cold-biased cases, while BothVTS performed best overall for less-biased cases. The latter benefits of BothVTS were demonstrated subjectively with representative cases: BothVTS had improved upscale growth and structure of convective systems compared to RadVTS at mid-range forecast hours, helping to correct fast propagation errors, while cases with mature convective systems at initialization time showed BothVTS correcting slow propagation errors from ConVTS in the early hours. A soil-replacement sensitivity experiment demonstrated improved performance of BothVTS when the underlying extreme cold model bias was reduced. Diagnostics revealed that enhanced spurious convection in BothVTS was tied to larger analysis increments in temperature than in moisture in the presence of the cold bias, resulting in erroneously high convective instability compared to RadVTS. This study is the first to examine the benefits of a multiscale VTS implementation, showing that if model biases are not extreme, BothVTS can be utilized to improve the overall performance of a multiscale convection-allowing ensemble system.
Sailfish: A ROMS Compatible, Ocean Numerical Model for the GPU
Jose M. Gonzalez-Ondina
GPUs are, by orders of magnitude, the most computationally powerful part of modern computers. Modern GPUs contain thousands to tens of thousands of cores and can perform tens of teraflops. On the memory side, modern GPUs like Nvidia’s A100 and H100 can have up to 80 GB of fast VRAM, enough to hold large models without the need for domain partitioning and message passing.
Unfortunately, the scientific community has been slow to adapt its numerical models to use GPUs, and the most common approach is to combine GPU code with MPI or to make heavy use of non-VRAM memory. These hybrid approaches incur the penalty of slow memory transfer, either through MPI message passing or between CPU and GPU memory, negating some of the speedups obtained by using GPUs.
We are developing a new numerical ocean model, written in Python, that can be used as a drop-in replacement for the Regional Ocean Modeling System (ROMS). This code, called Sailfish, solves the primitive equations using algorithms very similar to those used in ROMS, but runs almost entirely on the GPU. Sailfish will be able to read ROMS inputs and write the same type of outputs, but it is an entirely new code, using the CuPy library plus some CUDA kernels written in C++. The popularity of Python in academia, combined with it being a modern language, allows for simple and clear code that will be easy to understand and modify.
Preliminary tests show that Sailfish can be hundreds of times faster than sequential ROMS (or ~10x in typical parallel configurations). In a cluster like UF’s HiPerGator, with nodes of four A100 GPUs, it could be possible to run ~100 Sailfish simulations per node in the same time it takes to run a single ROMS simulation using 100 CPUs. This speedup will make it possible to run large ensembles, allowing for better uncertainty analysis and better-informed data assimilation algorithms, as well as increasing the number of scenarios one can analyze by at least an order of magnitude.
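The flavor of the CuPy approach (not Sailfish code; just a minimal sketch of NumPy-style array operations executed entirely in GPU memory) looks like this:

```python
import cupy as cp  # requires an NVIDIA GPU and the cupy package

def diffuse(eta, kappa=0.1, nsteps=100):
    """Toy explicit diffusion of a 2D field; every array operation below
    runs on the GPU, with no CPU-GPU transfers inside the loop."""
    for _ in range(nsteps):
        lap = (cp.roll(eta, 1, axis=0) + cp.roll(eta, -1, axis=0) +
               cp.roll(eta, 1, axis=1) + cp.roll(eta, -1, axis=1) - 4.0 * eta)
        eta = eta + kappa * lap
    return eta

eta0 = cp.random.random((1024, 1024)).astype(cp.float32)
eta = diffuse(eta0)
print(float(eta.mean()))  # only a single scalar is copied back to the CPU
```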
Agile DevOps Processes of Continuous Integration and Deployment for UFS System Development
Jong Kim, Zach Shrader, Mike Lueken, Fernando Andrade-Maldonado, and Rhae-Sung Kim
Source code decomposition shows that various modeling components and external software libraries make up the architectural elements of the UFS Weather Model (WM) and applications: atmosphere, ocean, sea ice, wave, and aerosol components, coupling infrastructure, etc. EPIC is mandated to maintain a common code base and modeling infrastructure for the UFS system for both research and operational forecasts. Software integrity and the quality of incremental code changes are continuously ensured through agile, community-based development and operations (DevOps) processes that adopt continuous integration (CI) and continuous deployment (CD) tools to maintain the required UFS baseline test cases. Various CI/CD tools, such as Jenkins, Git workflow CI, pyGithub, and Docker containers, are currently utilized in daily UFS code management and DevOps practices to improve code delivery time without compromising quality. With these CI/CD tools, code changes are efficiently coordinated through an automated pipeline that manages repetitive builds and tests to capture potential issues. With benchmarking examples and test results, we review the current status of the application of the CI/CD tools and UFS code management practices.
Assimilating GOES-16 All-Sky ABI Radiances With the HAFS Dual-Resolution EnVar DA System for Hurricane Predictions
Xu Lu and Xuguang Wang
Hurricanes spend the majority of their lifespan over the open ocean, making the optimal utilization of all-sky radiance observations from satellites essential to improving their initializations in numerical weather prediction models. Despite this, the assimilation of all-sky radiances in convection-allowing hurricane prediction is still in its early stages, with limited studies exploring the assimilation of all-sky Advanced Baseline Imager (ABI) observations on-board GOES-16 for hurricane predictions, even after the launch of the first GOES-R series of the next-generation geostationary weather satellites in 2016.
To address this issue, we explore the optimal configurations of all-sky ABI radiance data assimilation using the state-of-the-art Hurricane Analysis and Forecast System (HAFS), the next-generation hurricane modeling and data assimilation system. Specifically, we investigate two scientific questions: (1) What is the best way to estimate observation errors and correct observation biases in HAFS for all-sky ABI assimilation? (2) What is the impact of all-sky ABI assimilation on the storm inner-core structure evolution and intensity prediction?
Multiple experiments have been conducted with Hurricane Laura (2020) prior to its rapid intensification onset, and the results demonstrate the positive impacts of assimilating all-sky ABI radiances in HAFS and the value of bias corrections and adaptive observation errors. These findings will be presented at the conference, along with in-depth diagnostics.
Improving the Background Ensemble Covariance at the Air-Sea Interface for the Fully Coupled Data Assimilation System in HAFS
Xu Lu, Xuguang Wang, Hyun-Sook Kim, Jun A. Zhang, HeeSook Kang, and Yongzuo Li
Oceans are vital heat and energy sources that play a critical role in the development of hurricanes, working in conjunction with atmospheric changes. However, current data assimilation (DA) practices tend to treat the ocean and atmospheric components of numerical models independently. This approach can create significant inconsistencies at the air-sea interfaces, leading to potentially degraded hurricane predictions in the Hurricane Analysis and Forecast System (HAFS).
To address this issue, our study aims to improve the background ensemble covariance at the air-sea interface for the fully coupled data assimilation system in HAFS. We will begin by running a control experiment with a self-cycled HAFS ensemble and vortex initialization (VI) coupled with the HYCOM ocean model. The preliminary investigation will focus on the air-sea background and ensemble status of this control experiment.
To verify the model’s performance, we will incorporate a range of inner-core observations for both the ocean and atmosphere from different field campaigns. These observations include TDR and dropsondes from NOAA P-3 aircraft, SailDrone observations, AXBT probes, and Gliders. Our study will discuss the verification of the control experiment in more detail, with further findings to be presented at the meeting.
An Innovative Approach to Verify Ocean Models and Tools
Quang-Hung Luu and Pavel Tkalich
The reliability of Ocean Models and Tools (OMMT) depends not only on their predictive precision but also on the assurance that their outputs are free from software bugs and defects. The detection of such flaws in OMMT is often a challenging endeavor. This complexity stems from several factors, including the non-linear interplay of multiple physical variables, assumptions made during the modeling process, and the intricacies of discretization and algorithmic resolution. For instance, ascertaining the validity of a sea level elevation prediction at a given grid point in an ocean model or a tidal tool presents significant challenges. How do we distinguish between inaccuracies originating from software defects and those resulting purely from model performance? From the software testing perspective, such models suffer from the so-called test oracle problem: the absence of a test oracle for verifying their correctness. This paper introduces a methodology to address this issue effectively by leveraging the capabilities of the metamorphic testing technique. To this end, we identify metamorphic relations based on the properties of physical phenomena and utilize these to validate the software’s correctness. The findings from our experiments and the derived insights are presented and discussed. Our study paves a new way to significantly enhance coastal and marine model quality and reliability.
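As an illustration of the technique (a hypothetical example; `predict_tide` is a stand-in for the software under test, not part of any real OMMT package), one metamorphic relation for a single-constituent tidal prediction is that shifting the input times by exactly one tidal period must leave the output unchanged, with no conventional “expected value” oracle required:

```python
import numpy as np

M2_PERIOD_HOURS = 12.4206012  # principal lunar semidiurnal (M2) period

def predict_tide(t_hours, amplitude=1.0, phase=0.0):
    """Stand-in single-constituent tidal 'model' used only to demonstrate
    the metamorphic test; a real test would call the model or tool under test."""
    omega = 2.0 * np.pi / M2_PERIOD_HOURS
    return amplitude * np.cos(omega * t_hours - phase)

def test_periodicity_relation():
    """Metamorphic relation: outputs at t and at t + one M2 period must agree."""
    t = np.linspace(0.0, 48.0, 200)
    assert np.allclose(predict_tide(t), predict_tide(t + M2_PERIOD_HOURS), atol=1e-8)

test_periodicity_relation()
print("metamorphic relation satisfied")
```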
Impacts of Different Physics Schemes on Hurricane Forecasts
Linlin Pan, Kathryn Newman, Mrinal Biswas, and Brianne Nelson
This study investigates the impacts of different physics schemes on hurricane forecasts using the Unified Forecast System (UFS)-based Hurricane Analysis and Forecast System (HAFS). The physics suites examined include those of the planned operational HAFSv1a and HAFSv1b configurations. The HAFS results are compared with observations. The impacts of different physics schemes on wind, temperature, and QPF during landfall (e.g., Hurricane Ian) are investigated. The influence of changing the land surface model, planetary boundary layer (PBL) scheme, and microphysics on hurricane intensity and track will be investigated through sensitivity studies. The model QPF (Quantitative Precipitation Forecast) is verified against observations (e.g., Multi-Radar/Multi-Sensor System gauge-corrected data). The forecast wind, specific humidity, and temperature during hurricane landfall are validated with North American Model (NAM) Data Assimilation System (NDAS) data. More detailed results will be reported in this presentation.
Cloud-Based Workflows for RRFS and 3DRTMA
Raj Panda, Jim Abeles, Ben Blake, Jacob Carley, Annette Gibbs, Ed Colon, Matthew Morris, Guoqing Ge, Terra Ladwig, Manuel Pondeca, Bryan Schuknecht, Kenneth Sperow, Youngsung Jung, Patrick Keown, Curtis Alexander, Arun Chawla and RRFS and RTMA Development Teams
The elastic nature of a large public cloud such as AWS (Amazon Web Services) enables access to compute and storage resources that can be utilized to accelerate critical HPC projects. In this vein, AWS resources and services were utilized to support the development of two different NOAA projects, the RRFS (Rapid Refresh Forecast System) and 3DRTMA (3D Real-Time Mesoscale Analysis). The RRFS is NOAA’s next-generation convection-allowing ensemble forecast system, the core components of which were developed, built, and tested in real-time NOAA testbeds. A 9-member ensemble forecast system using the prototype RRFS was implemented in its entirety on AWS. Initial and boundary condition data were taken from operational GFS and GEFS simulations after suitable data transformations for the chosen RRFS compute domain. The workflow designed for the prototype RRFS implementation on AWS was customized to address the requirements of three different testbed experiments: the HWT-SFE (Hazardous Weather Testbed Spring Forecasting Experiment), FFaIR (Flash Flood and Intense Rainfall experiment), and WWE (Winter Weather Experiment). For the 3DRTMA project, a cloud workflow was designed to address multiple steps of data and compute processing, including (1) observation prep, (2) data assimilation, (3) modeling, (4) post-processing, and (5) plotting. Input data files for the cloud workflow were generated and uploaded from an operational on-prem HPC system. This is an example of a workflow encompassing both an on-prem HPC system and a public cloud, which comes with its own challenges. While implementing these two cloud projects, it became evident that special consideration and planning are needed for (1) workflow, (2) debugging, and (3) cost and performance. On the cloud, the types of resources and their availability require serious consideration. A properly chosen cloud configuration for a specific workflow should also be optimized in terms of cost and performance, allowing the fulfillment of the project requirements. While the primary focus of the presentation is the implementation of compute workflows, suggestions will be provided for addressing the remaining two items.
Nowcasting Weather Forecast Setup at Met Simulation
Hari Ome Kumar Pandey
Met Simulation Private Limited, Hyderabad, India, is a startup company producing nowcasts at four synoptic cycles per day using 3D-VAR data assimilation with GFS initial conditions. We produce forecasts for the entire globe by dividing the world into six regions: North America (NoA), South America (SoA), Africa (Afr), Europe (Eur), Asia (Asi), and Australia (Aus). All these models run at coarse resolutions to meet our clients’ requests. We produce forecasts out to 18 hours at a 15-minute interval.
In this presentation, I will present validation statistics comparing model reanalysis data with observations at 12 different locations around the globe. For this, a statistical experiment is set up with 5-km-resolution ECMWF reanalysis data at 3-hour intervals.
Radar Data Assimilation Within JEDI Research Coupled With the UFS SRW App Targeting the Rapid Refresh Forecast System
Jun Park, Ming Xue, Chengsi Liu, Tao Sun, and Chong-chi Tong
OU/CAPS has been working on several research projects employing the Unified Forecast System (UFS) for convection-allowing applications. These include testing and optimal design of a multi-physics ensemble targeting the UFS-based Rapid Refresh Forecast System (RRFS), the implementation and testing of assimilation capabilities for radar and lightning data within the Joint Effort for Data assimilation Integration (JEDI), and the impact of Incremental Analysis Update (IAU) with high-frequency radar data assimilation (DA).
Specifically, OU/CAPS has been collaborating with NOAA/GSL and EMC to develop and test direct radar DA capabilities targeting the future RRFS based on the limited-area FV3 (FV3-LAM) from the UFS Short-Range Weather Application (SRW App) and the JEDI DA framework. Recently, the following accomplishments have been achieved: (1) migrating radar DA capabilities from GSI into JEDI-FV3; (2) testing and tuning the Unstructured Mesh Package (BUMP) in JEDI to produce optimal model static background error covariances (BEC) and to localize the ensemble BEC of hydrometeors for radar DA; (3) preliminary testing of radar DA using JEDI coupled with FV3-LAM and comparison with results from HRRRv4 and RRFSp2; (4) testing radar DA using JEDI-4DEnVar; (5) implementing the IAU initialization technique within the FV3-LAM framework and evaluating its impact on high-frequency radar DA cycling. The detailed evaluations will be presented at the conference.
Geometric Approaches for Adaptive Domain Decomposition With an Application to the Estimation of the Variance of Analysis Error
James Purser, Miodrag Rancic, Manuel Pondeca, Edward Colon, and Ting Lei
An important, but particularly difficult, requirement in setting up a variational analysis scheme, such as the Real-Time Mesoscale Analysis (RTMA), is obtaining an objective estimate of the variance of error in the analysis. This might seem surprising, since the precision (inverse of the covariance of error) of the analysis is simply the sum of the precisions of the background, observations, and any other independent sources of information that go into the optimal analysis. The difficulty stems from the very large size of the matrices and associated linear systems that need to be inverted, making the adoption of simplifying approximations a practical necessity. One attractive practical approach is to estimate the analysis error variance in the dual space of observations and to interpolate it to the grid in a final step. The inversion process is thereby carried out in the dual space, where it becomes practical to reduce the difficulty by an adaptive domain decomposition such that each geometric domain is a polygon or polygonal prism containing a roughly equal amount of data.
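In the standard linear-Gaussian setting this statement reads

$$\mathbf{A}^{-1} \;=\; \mathbf{B}^{-1} \;+\; \mathbf{H}^{\top}\mathbf{R}^{-1}\mathbf{H},$$

where $\mathbf{A}$, $\mathbf{B}$, and $\mathbf{R}$ are the analysis, background, and observation error covariances and $\mathbf{H}$ is the (linearized) observation operator; the practical difficulty is that recovering $\mathbf{A}$ still requires inverting this very large matrix sum.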
We can further improve upon the method in a way inspired by the preconditioning technique proposed by Daley and Barker (2000) for the Navy’s NAVDAS assimilation system and create more than a single domain decomposition. We do this in such a way that the polygons of the different partitionings overlap in nonredundant ways, ensuring that each observation is centrally located with respect to a polygon of at least one of these partitionings. The dual-space estimates of analysis error variance, carried out by direct computation locally in each available polygonal domain of the complementary polygonal decompositions, can be averaged with a systematic nonuniform weighting to yield a better estimate than would be obtained from a single decomposition.
Our presentation will emphasize the geometrical methods of decomposition relying on a fixed lattice framework to provide a guiding scaffold in this two-dimensional case to produce a triplet of complementary domain-sets, and briefly touch upon an extension of the technique that allows a fully three-dimensional quartet of domain decompositions into polyhedral parts to be created in the corresponding fully three-dimensional case, also adapting to the local data density.
Improvements in Tropical Pacific Sea Surface Temperature Forecasts in Prototype 8 Compared to CFSv2
Sulagna Ray, Lydia Stefanova, Jiande Wang, and Avichal Mehra
The next-generation NWS/NCEP operational GFS and GEFS systems are being developed as fully coupled UFS applications. Among these applications, the Coupled Seasonal-to-Subseasonal system (UFS-S2S) has been developed iteratively by testing configurations with increasing complexity, gradually adding necessary features, and updating individual components. The latest of the fully coupled prototypes (version 8; P8) is evaluated here with a particular focus on SST forecasts of the tropical Pacific and compared to operational forecasts from CFSv2. The configuration of P8 involves an atmospheric component with the FV3 dynamical core and an advanced physics package (GFSv16) resolved on a C384 (~25 km) grid with 127 vertical levels; an ocean component, MOM6, at 0.25° with 75 hybrid levels; a sea-ice component, CICE6, on the exact same grid as the ocean; a wave component, WAVEWATCH III, on a 0.5° grid; and an aerosol component, GOCART, that does not include any feedback to the atmosphere. CFSv2, on the other hand, consists of a spectral model (GSM) as its atmospheric component on a T126 grid with 64 levels; an ocean component, MOM4, with a 0.5° grid that refines to 0.25° in the tropics and 40 z-levels; and a SIS1 sea-ice component on the same grid as the ocean. P8 has a set of 35-day free forecast runs initialized twice a month spanning April 2011-March 2018 (168 runs in total), whereas CFSv2 forecasts are initialized four times a day. The corresponding start dates between P8 and CFSv2 are matched to assess the forecasts over the tropical Pacific. Weeks 3 & 4 forecasts of the upper ocean in the two forecast systems are assessed, in particular using SST skill scores and mean biases of surface variables such as SST, wind stress, and SSH. Assessments of differences in the mean biases in terms of physical processes, including the effect of initial ocean conditions, are also discussed.
The Earth System Modeling Executable (ESMX): A Tool for Building and Testing NUOPC-Based Coupled Earth System Modeling Applications
Daniel Rosen, Gerhard Theurich, and Ufuk Turuncoglu
The Earth System Modeling Framework (ESMF) is a library of Fortran and C utilities used to drive and couple Earth system modeling applications. ESMF provides building blocks for coupling applications, but it does not provide specifications for how applications are coupled. The addition of the National Unified Operational Prediction Capability (NUOPC) layer to ESMF provided those specifications along with generic components, but still lacked software that could be executed. The newest layer, the Earth System Modeling Executable (ESMX), completes the software stack by introducing customizable applications.
At its simplest, ESMX is an Earth system model application-building package. It uses Python to process a build configuration file and CMake to link model libraries. In its most recent iteration it has been enhanced with Git repository cloning, model building, and application testing. The combination of Python and YAML simplifies the build configuration for end users, while CMake finds files and libraries and links them into the ESMX executable. The test infrastructure is built upon CTest and provides the missing piece for testing coupled applications: data. The ESMX package also includes a lightweight, runtime-configurable data component, the ESMX Data component, which can be configured with any desired import and export data. A uniform constant is used to fill the export state, and value bounds are used to check the import state. Individual modeling components can now be built and tested without building a full coupled system, and entire coupled modeling systems can be built and tested using just a YAML file.
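To make the fill-and-check idea concrete, here is a minimal, framework-free Python analog of the behavior described for the Data component: export fields are filled with a uniform constant and import fields are verified against value bounds. The field names, bounds, and structure are illustrative assumptions, not the actual ESMX API or configuration schema.

```python
import numpy as np

def make_export_state(fields, fill_value=1.0, shape=(10, 10)):
    """Fill every export field with a uniform constant (stand-in for the data component's export)."""
    return {name: np.full(shape, fill_value) for name in fields}

def check_import_state(state, bounds):
    """Verify each imported field lies within its configured value bounds."""
    failures = []
    for name, (lo, hi) in bounds.items():
        data = state[name]
        if data.min() < lo or data.max() > hi:
            failures.append(name)
    return failures

# Example: exercise a single coupled component without building the full system.
export = make_export_state(["sea_surface_temperature"], fill_value=300.0)
received = {"air_pressure_at_sea_level": np.full((10, 10), 101325.0)}
bad = check_import_state(received, {"air_pressure_at_sea_level": (80000.0, 110000.0)})
print("import-state check failed for:", bad if bad else "none")
```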
The ESMF team strives to enhance the ESMF library through state-of-the-art features, better performance, and easier usability. The newest ESMX package supports our mission to improve the process of creating Earth system modeling applications.
Optimizing the Use of Small Uncrewed Aircraft Observations in NOAA’s Next-Generation Hurricane Analysis and Forecast System (HAFS)
Kathryn Sellwood, Altuğ Aksoy, Jonathan Poterjoy, Jason Sippel, and Dan Wu
Small uncrewed aircraft systems (sUASs) are capable of filling an important observational gap at low levels of the atmosphere in the tropical cyclone (TC) inner core. As the technology improves and becomes more cost-effective, sUAS TC observations are expected to become more common, and it will be important to understand how best to use them. To address this problem, we look at how quality control, observation density, and advanced data assimilation techniques might help derive the maximum benefit from these observations. Although some preliminary work has been performed to assimilate sUAS data into the HWRF model/DA system, the experiments discussed in this talk are the first to use HAFS. Observations obtained with the Altius sUAS during Hurricane Ian (2022) are assimilated either as low-resolution data transmitted in real time or as full-resolution post-processed data. The impact of these data on TC initial conditions and subsequent forecasts is evaluated within both the operational HAFS-A and HAFS-B model configurations. Various experiments examine the impact of the horizontal density of observations, error estimation, covariance specification, and a novel online quality control technique. Horizontal density and error estimation are controlled within the GSI-based data assimilation system, and covariance specification is addressed using different methods to produce the ensemble of forecasts used in the calculations. The online quality control is implemented as part of an upgrade to the DA system that is expected to be adopted operationally.
The Hierarchical Testing Framework for the Unified Forecast System (UFS): The UFS Offline Land DA System Example
Yi-Cheng Teng, Stelios Flampouris, and Kim Jong
The Earth Prediction Innovation Center (EPIC) was launched recently by NOAA/OAR/WPO to accelerate the community's development and integration of innovations into the Unified Forecast System (UFS), NOAA's community-based, coupled, comprehensive Earth modeling framework. To facilitate physics innovations and model development in the UFS, a common testing infrastructure for all UFS components and applications is necessary, one that can accelerate the transfer of innovations and model improvements into operations. A prototype of the hierarchical testing framework (HTF) for the UFS is currently under development at EPIC, with the potential to serve as a platform for component tuning, unit testing, sanity checks of model subsystems, and multiscale case studies. Using the prototype, researchers can selectively disable some of the feedbacks within the UFS coupled system, reducing the number of nonlinear interactions in the model forecasts and making it easier to isolate and understand phenomena of interest. The HTF has been implemented in the recent release of the UFS Offline Land Data Assimilation System v1.1.0 as a demonstration. The framework covers not only unit testing itself but also integrates with a containerized CI/CD pipeline for continuous release practices.
HAFSv1 Physics Schemes and Future Plans
Weiguo Wang, Xu Li, Ruiyu Sun, Bin Liu, Zhan Zhang, Jongil Han, and Fanglin Yang
Physics schemes used in the two configurations of the first version of the operational HAFS are reviewed. While there are minor differences in domain size, vortex initialization threshold, and model parameters between the two operational configurations, one main difference is in the microphysics schemes: the GFDL single-moment microphysics scheme is used in the HAFS-A configuration, while the Thompson double-moment microphysics scheme is used in HAFS-B. The other main difference is in the adjustment of the mixing length scale and model coefficients in the TKE-based EDMF PBL scheme. The impacts of the different microphysics schemes and PBL adjustments on HAFS performance are assessed. Future upgrades focus on increasing physics diversity, improving intensity forecasts at longer lead times, and increasing the detection rate of rapid intensification. To this end, different schemes for major parameterized physical processes such as the PBL, convection, and land models are being considered for use in the two configurations of the next version of HAFS. We will also present some ongoing experiments, including different condensation time scales in the microphysics schemes and subgrid flux parameterization, and discuss future plans.
Emerging Applications: Coastal and Marine
JEDI-Based Data Assimilation for the UFS Marine Components
Travis Sluka
The Sea-ice, Ocean, and Coupled Assimilation (SOCA) project is a JEDI-based data assimilation system for the marine UFS components: the ocean (MOM6), sea ice (CICE6), waves (WAVEWATCH III), and ocean biogeochemistry. By leveraging JEDI (the Joint Effort for Data assimilation Integration), a wide range of advanced data assimilation methods, from 3DVAR to hybrid-EnVar/EDA, and observation operators are available for these components. The current capabilities, near-term operational plans, and longer-range development vision for SOCA will be presented. The SOCA system has been built with both NOAA/NASA operations and the research community in mind. JCSDA's role in developing the forthcoming hybrid-EnVar marine DA systems for NOAA/EMC's global medium-range and subseasonal forecasting systems (GFSv17/GEFSv13), as well as for the high-resolution regional Hurricane Analysis and Forecast System (HAFS), will be presented.
Longer-term plans for SOCA beyond the current year will also be discussed; these include coupled data assimilation of ocean-surface-sensitive radiances (infrared, microwave, and visible) for assimilating sea surface temperature and ocean color observations, utilization of coupled covariances between the various marine Earth system components, and possible applications of AI/machine learning.
UFS-Arctic: A Pan-Arctic Regional Application of the UFS
Aaron Wang, Christopher Cox, Amy Solomon, Janet Intrieri, Lisa Bengtsson, Philip Pegion, and Jeffrey Whitaker
Rapid changes in Arctic sea ice increase the need for actionable forecast information to support decisions governing marine and coastal community safety in this challenging environment. In response, the NOAA Physical Sciences Laboratory (PSL) has provided experimental, daily forecasts of Arctic weather and sea ice conditions to stakeholders through the Coupled Arctic Forecast System (CAFS) model since 2016. The system includes dynamical ocean, sea ice, land, and atmospheric models coupled through a flux coupler. It is initialized with GFS boundary conditions and satellite-derived sea ice concentration and sea surface temperatures, and is run daily to produce pan-Arctic, 0- to 10-day forecasts of sea ice, oceanic, and atmospheric fields. Alaska-specific regional products and (by request) observational campaign support products are made available through a website designed in collaboration with National Weather Service-Alaska Region (NWS-AR) partners.
Compared to other coupled forecast systems, CAFS has been shown to provide more realistic representations of energy exchanges between the ice and atmosphere, which are essential to the thermodynamic and dynamic tendencies of sea ice. However, the architecture of CAFS differs from that of the Unified Forecast System (UFS). To help align research, forecast product, and model development activities related to sea ice forecasting between OAR and NWS, PSL is currently developing a regional UFS application for the Arctic and working to make physics packages used within CAFS and at PSL available in the Common Community Physics Package (CCPP). The new UFS-Arctic is being developed from the Hurricane Analysis and Forecast System (HAFS) UFS application and will be a three-way coupled system (FV3-MOM6-CICE6). The domain is pan-Arctic and will be run at a spatial resolution similar to or higher than that of CAFS. In this presentation we provide an overview of CAFS and our transition activities toward the UFS-Arctic.
Development of the Next-Generation UFS Coastal Modeling Framework
Panagiotis Velissariou, Ufuk Turuncoglu, Yunfang Sun, Saeed Moghimi, Ali Abdolali, and Edward Myers
The NOS Storm Surge Modeling Team, in close partnership with its agency, academic, and industry partners, developed CoastalApp, a fully coupled multi-model coastal application (https://github.com/noaa-ocs-modeling/CoastalApp), to advance our understanding and operational modeling capabilities of coastal ocean processes following Unified Forecast System (UFS) best practices. The goal is to provide a flexible and portable modeling framework for coastal applications that supports, but is not limited to, storm surge, surface waves, sea ice, ocean-atmosphere-ice-land interactions, sediment transport, and water quality studies. The coupling framework can be utilized for deep-water to shallow-water transformation and for coastal applications on various spatial and temporal scales, from short-range to subseasonal-to-seasonal climate variability.
CoastalApp contains multiple model and data components coupled one-way and/or two-way using the NUOPC/ESMF coupling infrastructure. In its current state, the application supports (a) the ocean models ADCIRC, SCHISM, and FVCOM; (b) the atmospheric components ATMESH and PAHM; and (c) the wave components WAVEWATCH III (WW3) and WW3DATA. CoastalApp and all its modeling components are extensively tested after each upgrade using its companion application, the CoastalApp testsuite (https://github.com/noaa-ocs-modeling/CoastalApp-testsuite). Both applications have their own native build system that supports multiple HPC cluster configurations as well as other Linux clusters and desktops.
The NCAR/ESMF and the NOS Storm Surge modeling teams are actively working with coastal ocean modeling communities to optimize the coupling infrastructure and code management framework of CoastalApp and CoastalApp-testsuite and to bring them closer to the UFS System Architecture, with the aim of developing the Unified Forecast System (UFS) Coastal Application (ufs-coastal). CoastalApp will be transitioned to ufs-coastal (https://github.com/oceanmodeling/ufs-coastal/tree/feature/coastal_app) by first forking ufs-weather-model (https://github.com/ufs-community/ufs-weather-model) as the basis for the next-generation coastal ocean model coupling framework. The ongoing work mainly aims to port the existing CoastalApp components (i.e., ADCIRC, SCHISM, WW3) one by one to the UFS Weather Model and make them compatible with the CMEPS mediator, creating an up-to-date version of the model named ufs-coastal. The existing CoastalApp configurations, found in the CoastalApp-testsuite repository, will also be defined as Regression Tests (RTs), similar to the UFS Weather Model, using the existing testing framework capabilities so that they can easily be run on supported platforms. This will also enable testing of various model configurations alongside development and ensure that development in other UFS Weather Model applications does not affect development in ufs-coastal.
Incorporating Thermohaline Circulation and Hydrology Into Global STOFS 2D+, NOAA’s Fast Integrated Multi-Scale Multi-Process Operational Water Level Model
Joannes J. Westerink, Coleman Blakely, Maria Teresa Contreras Vargas, Guoming Ling, Damrongsak Wirasaet, Al Cerrone, Dylan Wood, William Pringle, Zach Cobell, Shintaro Bunya, Rick Luettich, Edward Myers, Saeed Moghimi, Greg Seroka, Yuji Funakoshi, Liujuan Tang, Lei Shi, Kendra Dresback, Chris Szpilka, Randy Kolar, Margaret Owensby, and Chris Massey
The operational version of NOAA's Global Surge and Tide Operational Forecast System (STOFS) has been running with variable finite-element resolution between 80 m and 24 km. The model incorporates optimized high resolution along all U.S. coastlines, extends onto the coastal floodplain, and is driven by tides and NOAA's Global Forecast System (GFS-FV3) winds, sea ice, and atmospheric pressure. Resolution is critical in complex inlet systems with jetties such as the St. Johns River and Sabine Lake entrances, intricate shoal systems such as Shinnecock Bay, and complex intra-tidal cross-cut shoals such as the Biscayne Flats in Florida. In addition, as both inlet connections to the ocean and upland dendritic floodplain channel systems narrow, increased resolution becomes increasingly important. Further mesh developments include an optimized global shell and improvements in regional and global bathymetry. Bathymetry in key locations, both globally and locally, remains the most important control on model fidelity. Overall, Global STOFS 2D is the most accurate global non-data-assimilated model, with an M2 tide mean absolute error in deep water of 1.95 cm. Along the U.S. East and Gulf of Mexico coasts, the M2 tide errors at available NOS tidal stations are summarized as R2 = 0.9848, a mean absolute error of 2.5 cm, and a normalized RMS error of 0.089. Global STOFS 2D incorporates 13.6 million finite-element nodes and is fast, running at 2.4 wall-clock minutes per day of simulation on 240 TACC Frontera cores.
Process integration advances that have now been implemented include coupling with NOAA's global ocean circulation model, Global RTOFS, in order to incorporate the impact of the ocean's thermohaline drivers, including large current systems such as the Gulf Stream, warm- and cold-core eddies impinging on the coast, and seasonal steric expansion and contraction, all of which significantly affect coastal water levels. In addition, coupling to the National Water Model accounts for upland hydrology and stream flows into the coastal zone. With these drivers in place, Global STOFS 2D+ will be a total water level model, and mean water levels will balance locally to correct local mean sea levels with respect to a geodetic vertical reference.
Emerging Applications: Space Weather
A Whole Earth System View for Heliophysics: A NASA/Goddard View
Sassi, R. Lieberman, and S. Pawson
A collaborative effort between the Global Modeling and Assimilation Office (GMAO) and the Ionosphere, Thermosphere, and Mesosphere (ITM) Physics Laboratory at NASA's Goddard Space Flight Center (GSFC) aims to extend the lid of GEOS to the thermosphere and eventually include space weather effects. The development pathway of the new model is designed to increase the vertical domain of the GEOS model system incrementally and sequentially, in order to achieve:
- combined use of high-altitude observations in NASA’s comprehensive Earth system models,
- resolution of thermospheric processes to support NASA missions and geospace physics challenges, and
- whole-atmosphere modeling to include electrodynamics and space weather.
In this talk we present plans for combining capabilities developed separately in the Earth science and heliophysics domains. This phased collaboration will lead to a whole-atmosphere modeling and analysis activity that maximizes the science impact of NASA's existing and future observations of the mesosphere, thermosphere, and space weather.
The Whole Atmosphere Model (WAM) Application of NOAA’s Unified Forecast System
Kevin Viner
The UFS-WAM application is being developed by NOAA’s EMC (Environmental Modeling Center) and SWPC (Space Weather Prediction Center) as a replacement for the existing GSM-WAM (Global Spectral Model) application. The UFS-WAM extends the GSM-WAM capabilities to include:
- the unified paradigm of the UFS (which uses the FV3 dynamics core and CCPP physics),
- both nonhydrostatic and deep atmosphere effects that are vital to accurate thermospheric prediction, and
- the multi-species chemistry effects that dominate in the thermosphere.
Comparisons between the first iteration of the UFS-WAM and GSM-WAM will be shown, along with examples of unique capabilities of the new system.
Development of WACCM-X as the SIMA Geospace Component
Hanli Liu
The System for Integrated Modeling of the Atmosphere (SIMA) is a cross-lab effort at NCAR to build a unified community atmospheric modeling framework for use in an Earth system model such as the Community Earth System Model (CESM). It is envisioned as a flexible and interoperable system that enables diverse configurations of the atmosphere model within CESM, for applications spanning minutes to centuries and cloud to global scales, including atmospheric forecasts and projections of atmospheric state and composition from the surface into geospace. The Whole Atmosphere Community Climate Model with thermosphere/ionosphere extension (WACCM-X) is being developed as the geospace component of SIMA. I will discuss three ongoing efforts in this development: the global high-resolution capability, the adaptation of the non-hydrostatic dynamical core (MPAS-A), and the coupling with the magnetosphere model (GAMERA). These developments will pave the way for building a whole-geospace model system.
Going Beyond the Terrestrial: Space Weather Verification Using METplus
Jonathan L. Vigh, Terrance G. Onsager, Tara L. Jensen, Dominic J. Fuller-Rowell, Jun Wang, Mihail Codrescu, Tibor Durgonics, Charlotte Martinkus, Naomi Maruyama, Frank Centinello, Timothy J. Fuller-Rowell, and Robert Steenburgh
NOAA’s Space Weather Prediction Center (SWPC) and the National Center for Atmospheric Research (NCAR) have been working together since 2018 to advance verification capabilities for space weather. The capabilities are being built using the Model Evaluation Tools (MET) and METplus. MET is an efficient, configurable, state-of-the-art suite of verification tools developed by NCAR. METplus is a corresponding suite of python wrappers and other supporting capabilities which allow for complex real-time and retrospective verification workflows to be simplified and codified for robustness and reproducibility. This abstract provides an overview of the capabilities which have been developed thus far.
The evaluation of space weather prediction techniques presents unique challenges and opportunities. One challenge is the use of parameters and forecast outputs that are not used in terrestrial weather prediction (e.g., Total Electron Content, TEC; electron density profiles; magnetic field direction and strength; heights of ionospheric reflectivity layers; the ENLIL spiral), yet many of these parameters have analogs in terrestrial weather (e.g., TEC is somewhat analogous to precipitable water, and the heights of ionospheric reflectivity layers are analogous to cloud ceiling heights). Another challenge is that many of the observational and model formats used in space weather prediction are quite different from those used in terrestrial prediction. These are surmountable through METplus's powerful Python embedding features, which allow the system to read any model data source with user-written Python code. Like terrestrial weather, space weather has distinct areas of interest, such as the location of the auroral oval, areas of high TEC values, or areas of scintillation. MET's object-based MODE tool is well suited to identifying such objects and comparing the resulting object attributes, such as displacement error and intensity, offering insights beyond what traditional verification metrics provide.
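As an illustration of the kind of user-written reader that the Python embedding capability supports, the sketch below follows MET's general convention of exposing a 2D data plane and an attribute dictionary. The TEC values, grid parameters, and times are invented stand-ins, not an actual SWPC dataset.

```python
"""Illustrative user-written reader for MET python embedding.
Only the general convention of defining `met_data` and `attrs` follows MET;
all field values, grid parameters, and times below are invented for illustration."""
import numpy as np

# Stand-in for a real global TEC analysis (latitude x longitude, TEC units).
met_data = (np.random.default_rng(0).random((73, 72)) * 50.0).astype(np.float64)

attrs = {
    "valid": "20230727_120000", "init": "20230727_060000",
    "lead": "060000", "accum": "000000",
    "name": "TEC", "long_name": "Total Electron Content",
    "level": "ionosphere", "units": "TECU",
    "grid": {
        "type": "LatLon", "name": "GlobalTEC",
        "lat_ll": -90.0, "lon_ll": 0.0,
        "delta_lat": 2.5, "delta_lon": 5.0,
        "Nlat": met_data.shape[0], "Nlon": met_data.shape[1],
    },
}
```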
The capabilities being developed through this effort are unified through the Space Weather Prediction Center Real-time (SWPC-RT) system, a platform-independent verification system we have developed to apply advanced methods and techniques to space weather verification. The system is containerized, allowing for easy installation on Windows, macOS, and Linux platforms without the complexities of installing and compiling MET. One main configuration file allows users to set up and configure multiple workflows and to select which capabilities to run, which models to compare, and which graphical outputs to generate. Users can run the system as is or develop their own use case by patterning it after any of the eight distinct use cases currently supported. The system is being structured to allow both real-time and retrospective evaluation workflows.
SWPC-RT currently offers the capability to conduct gridded comparisons of ionospheric TEC between various analyses and models and stratify the resulting verification statistics by complex mask regions and time-varying quality flags, which SWPC has used to evaluate the efficacy of commercial space weather radio occultation observations. At the time of this writing, several new capabilities are being added. The first capability involves the point-wise comparison of scintillation observations to gradients in TEC fields which foster the dynamical imbalances that lead to scintillation. The second capability is the verification of SWPC’s official (human-produced) forecasts of the Kp index. Ultimately, SWPC-RT will provide routine and robust statistics, comparisons, and diagnostics to assist SWPC forecasters in evaluating model guidance (e.g., WAM-IPE, GloTEC, CTIPe) and SWPC’s official forecast products.
Emerging Applications: Air Quality
Simulating Radiative Effect of Smoke and Dust Aerosols Using NOAA’s Next-Generation Storm-Scale Numerical Weather Prediction Model
Ravan Ahmadov and Team
The National Oceanic and Atmospheric Administration (NOAA) has been transitioning its numerical weather prediction (NWP) models to the Unified Forecast System (UFS) (https://ufscommunity.org/). As part of this effort, the NOAA Global Systems Laboratory (GSL) and other teams have developed a storm-scale NWP model called the Rapid Refresh Forecast System (RRFS). To enhance its capabilities, we have integrated smoke and dust simulation capabilities into the RRFS model. This coupled model, known as RRFS-SD, is currently being tested in real time over the CONUS domain, and the experimental smoke and dust forecast products are made available to users through a public website (https://rapidrefresh.noaa.gov/RRFS-SD/). Notably, the RRFS-SD model includes the radiative feedback of the smoke and dust aerosols.
The summer of 2020 witnessed the most severe fire season in modern US history, with numerous devastating wildfires burning in the western region. The resulting smoke emissions led to poor air quality, affecting millions of people across the country. Moreover, the widespread smoke plumes significantly reduced solar radiation, thereby impacting weather conditions. To investigate the effects of smoke on meteorology, we conducted extensive retrospective simulations with the RRFS-SD model at 3 km spatial resolution over the CONUS domain for September 2020. Multiple sensitivity simulations were performed to estimate the influence of smoke on meteorological variables. The model simulations were extensively evaluated using ground-based PM2.5 and aerosol optical depth observations, and the weather and visibility simulations were evaluated using surface meteorological and radiosonde observations. Our findings indicate that the RRFS-SD model, incorporating direct aerosol feedback, demonstrates substantial improvements in downward solar radiation and temperature simulations, thus improving weather forecasting overall. Additionally, we discuss the uncertainties involved in estimating the smoke radiative feedback within the RRFS-SD model.
Overall, this study provides insights into the integration of smoke and dust simulations into the RRFS model, showcasing its potential to enhance air quality and weather predictions, and contribute to a better understanding of the impacts of smoke on meteorology.
Enhancing Wildfire Predictions With the UFS-AQM Online Prediction System: A Case Study of Alberta Fires
Jianping Huang, Ivanka Stajner, Fanglin Yang, Jeff McQueen, Ho-Chun Huang, Kai Wang, Chan-Hoo Jeon, Raffaele Montuoro, Brian Curtis, Hyundeok Choi, Haixia Liu, Barry Baker, Daniel Tong, Youhua Tang, Patrick Campbell, James Wilczak, Dave Allured, Irina Djalalova, Shobha Kondragunta, Chuanyu Xu, Fangjun Li, and Xiaoyang Zhang
Wildfires are a significant source of primary fine particulate matter (PM2.5) emissions to the atmosphere in the United States (US). Accurate representation of wildfire emissions is crucial for reliable air quality predictions. Recently, a series of wildfires originating in Alberta, Canada, produced smoke that was transported into the US, substantially impacting air quality over the midwestern US beginning May 16, 2023, and persisting for approximately one week.
To improve air quality predictions, NOAA developed and tested a novel air quality prediction system, a regional Air Quality Model (AQM) coupled to the Finite-Volume Cubed-Sphere (FV3) dynamical-core-based atmospheric model. This system is based on the EPA Community Multiscale Air Quality (CMAQ) model and is built within the framework of the Unified Forecast System (UFS). However, using NESDIS' Regional hourly Advanced Baseline Imager (ABI) and Visible Infrared Imaging Radiometer Suite (VIIRS) Emissions (RAVE) fire data, the UFS-AQM system under-predicted surface PM2.5 concentrations during the significant Alberta fire event. In contrast, NOAA's operational air quality prediction system, an offline CMAQ system driven by the Global Forecast System (GFS) that incorporates the NESDIS operational Global Biomass Burning Emissions Product (GBBEPx) for wildfire emission calculations, produced more reasonable PM2.5 predictions.
This study aims to evaluate various aspects of wildfire emissions, including intensity, emission scaling factors, speciation, diurnal variation, and plume-rise algorithm, as well as assess their impact on PM2.5 predictions through various numerical sensitivity experiments. The real-time predictions from both the online-CMAQ and operational systems will be assessed using AirNow hourly observational data provided by the Environmental Protection Agency. The primary objective of this research is to enhance the representation of wildfire emissions and improve air quality predictions during wildfire events.
Extending JEDI-Based Global Aerosol Assimilation System to Improve Aerosol Prediction in Unified Forecast System
Bo Huang, Mariusz Pagowski, Cory Martin, Andrew Tangborn, Maryam Abdi-Oskouei, Jérôme E. Barré, Shobha Kondragunta, Georg Grell, and Gregory Frost
A Joint Effort for Data assimilation Integration (JEDI)-based three-dimensional ensemble-variational (3DEnVar) aerosol assimilation system (Figure 1; Huang et al., 2023, JAMES; https://doi.org/10.1029/2022MS003232) was developed for the operational Global Ensemble Forecast System – Aerosols (GEFS-Aerosols) at NOAA/NWS/NCEP. The system has been evaluated in near real-time (NRT) experiments using the Common Community Physics Package (CCPP)-based GEFS-Aerosols at NOAA/OAR/GSL. Our NRT experiments show that assimilating 550 nm aerosol optical depth (AOD) retrievals from the Visible Infrared Imaging Radiometer Suite (VIIRS) instruments significantly improves the model's skill in simulating AOD when verified against various AOD retrievals or analyses, though its constraint on specific aerosol species remains very limited owing to the integral nature of AOD.
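As a reminder of why a single AOD observation constrains individual species only weakly, AOD is one column integral summed over all species (the schematic relation below uses notation introduced here for illustration, not the operator implemented in UFO):

```latex
% AOD at wavelength \lambda as the vertically integrated extinction,
% summed over aerosol species s (illustrative notation only):
\tau(\lambda) \;=\; \sum_{s}\int_{0}^{z_{\mathrm{top}}} \beta_{\mathrm{ext},s}(z,\lambda)\,\mathrm{d}z
```

Many different partitions of aerosol mass among species and vertical layers can produce the same column value, so the analysis increment for any single species is only weakly determined by AOD alone.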
One of the ongoing UFS-R2O (Unified Forecast System – Research to Operations) efforts at NOAA aims to integrate aerosol prediction within the UFS (UFS-Aerosols). It is planned to replace the standalone GEFS-Aerosols with UFS-Aerosols for operational global aerosol forecasting at NCEP in the near future. Compared to GEFS-Aerosols, UFS-Aerosols is coupled with NASA's updated Goddard Chemistry Aerosol Radiation and Transport (GOCART) model, which includes additional nitrate aerosols in three size bins, adopts improved biomass burning and dust emissions, and allows for aerosol-radiation interactions. To support future operational global aerosol prediction, the JEDI-based aerosol assimilation system for GEFS-Aerosols was recently extended to UFS-Aerosols. The AOD forward operator and its tangent-linear and adjoint models in the JEDI Unified Forward Operator (UFO) were further extended to account for the added nitrate aerosol species in UFS-Aerosols. The relevant developments have been tested and merged into JEDI's internal repository. Preliminary results over a three-week cycling period suggest that assimilating VIIRS 550 nm AOD retrievals improves the model's AOD simulation when verified against various AOD retrievals or reanalyses. To obtain a robust evaluation over prolonged periods, we are configuring an NRT experiment at NOAA/OAR/GSL using the extended aerosol assimilation system for UFS-Aerosols. The NRT aerosol assimilation results using UFS-Aerosols will be presented, and challenges and plans to further improve this aerosol assimilation system will be discussed.
Development of the Configurable ATmospheric Chemistry (CATChem) Model and Its Application Within the Unified Forecast System Forming a Unified UFS-Chem
Barry Baker, Rebecca Schwantes, Georg Grell, Jian He, Jordan Schnell, Siyuan Wang, Meng Li, Brian McDonald, Greg Frost, Li Zhang, Shan Sun, Ravan Ahmadov, Patrick Campbell, Youhua Tang, Quazi Rasool, Colin Harkins, Louisa Emmons, Gabriele Pfister, and Matthew Dawson
NOAA’s Unified Forecasting System (UFS) is a community-based Earth modeling system that plans to provide a framework to efficiently incorporate research advances into NOAA’s operational forecasts. Currently, there is an array of modules of varying complexity coupled within the UFS to cover all of NOAA air composition needs. The simplified aerosol scheme, GOCART, is incorporated into the UFS with minimal gas-phase chemistry for seasonal to subseasonal scales, while the Community Multiscale Air Quality Model (CMAQ) is coupled to forecast air quality. We will discuss our plans for adding innovative research capabilities for improving chemistry and aerosol processes and thereby predictions of air quality and atmospheric composition into the UFS. These enhanced research capabilities will include:
- Options to use gas and aerosol chemical mechanisms of varying complexity.
- Ability to easily couple different mechanisms to different physics options.
- Development of a more flexible emissions processing system.
- Creation of a unified inline component to process emissions of varying complexity.
- Further investment in model evaluation tools such as MELODIES-MONET (https://melodies-monet.readthedocs.io) that efficiently compare model results against a variety of observations.
CATChem (Configurable ATmospheric Chemistry) is a modeling component that includes all chemical and aerosol processes needed to perform atmospheric chemistry and composition simulations within a model through a flexible, easy-to-modify, and well-documented infrastructure. CATChem will include the following processes: chemical kinetics, aerosols, photolysis, wet deposition, dry deposition, connections to emissions, and connections to physics schemes. The first use of CATChem will be connecting it to NOAA's Unified Forecast System to create UFS-Chem. CATChem and UFS-Chem are currently under development, and more information will be provided soon. In this work, we will use the Model Independent Chemistry Module, a component of the MUlti-Scale Infrastructure for Chemistry and Aerosols led by NCAR. By discussing our plans, we hope to get input early in the process on whether these enhancements will meet the needs of the research and operational communities, so as to ensure the UFS accomplishes one of its main goals: efficiently updating NOAA's operational forecasts with research advances.
Towards a State-of-the-Art Greenhouse Gas Data Assimilation/Flux Inversion Modeling System
Lori Bruhwiler and Andrew Schuh
The White House has recently formed a new interagency working group to coordinate measurement, monitoring, reporting, and verification (MMRV) of the atmospheric budgets of greenhouse gases. Greenhouse gas data assimilation and flux inversion modeling systems are an important component of a US MMRV strategy because they could allow inventories of emissions and removals to be independently verified by checking their consistency with atmospheric observations. NOAA already has greenhouse gas DA/flux inversion systems (CarbonTracker, CarbonTracker-CH4), but they are limited in spatial and temporal resolution. This limitation has implications for the use of high-frequency continental in-situ observations in inversions, because these observations are difficult to simulate at coarse resolution and therefore must be de-weighted, throwing out potentially useful information about fluxes. The NOAA Unified Forecast System presents an opportunity to use significantly higher spatial and temporal resolution and an online modeling approach in which small-scale transport features are allowed to evolve in response to large-scale forcing. In addition, prior flux estimates can potentially be estimated using the same driving meteorology as the transport. Furthermore, the GEFS reanalysis ensemble can be used to estimate model transport error, which has not been possible in the current modeling framework. We describe progress using the UFS model to simulate CO2, CH4, and SF6, and discuss important limitations that must be resolved in order to build an operational NOAA MMRV system.
Emerging Technologies: AI
Hybrid AI for Multiscale Wildfire Risk Assessment and Mitigation
Jared Goldman
The risk of wildfires has increased significantly in recent years and has touched communities not previously at high risk (Burke et al., 2021; Radeloff et al., 2018). Effective mitigation of wildfire risk is essential to reduce the potential for catastrophic losses, and accurate assessment of wildfire risk at the property level will enable fire departments and homeowners to take appropriate mitigation measures and reduce losses. Under a NASA-funded project, we developed the Wildfire Integrated Modeling, Prediction, and Learning Environment (WIMPLE), a hybrid AI (HAI) tool for wildfire risk assessment. WIMPLE is based on our Scruff HAI framework (Pfeffer & Lynn, 2018), which provides integration of different types of AI models, sharing and composition of models, and spatiotemporal flexibility.
WIMPLE combines models of different spatiotemporal resolutions, from regional to property level, to generate a stochastic risk assessment of wildfire occurrence and impact on a property. The user provides the WIMPLE system with the location and layout of their property, and WIMPLE returns information on the risk posed by wildfires to that property. These risk metrics include the risk of a wildfire in the surrounding region, the risk of a wildfire on the property, and the expected damage to the property (in USD). These values are calculated using several distinct models, all integrated under the Scruff HAI framework.
The foundational component of the HAI model is the environmental condition model, which generates a probability distribution of likely future environmental conditions by marginalizing the NASA Earth Exchange Global Daily Downscaled Projections (Thrasher et al., 2022) over year and season. The advantage of using this dataset is that it incorporates different climate change trajectories in its estimates of future environmental conditions, allowing the predicted conditions that underlie the other model components to reflect the progress of climate change.
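The sketch below illustrates the "marginalize by year and season" idea in its simplest form: pooling daily values across scenarios and years for a given season to form an empirical distribution. The array layout, variable, and season definition are illustrative assumptions, not the actual WIMPLE implementation or NEX-GDDP file handling.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2030, 2060)
# daily_tmax[scenario, year, day_of_year]: stand-in for downscaled projections
daily_tmax = 25 + 10 * rng.random((3, years.size, 365))

def seasonal_distribution(data, season_days):
    """Pool values across scenarios and years for the given days of year,
    returning an empirical sample approximating the marginal distribution."""
    return data[:, :, season_days].ravel()

summer = np.arange(152, 244)  # roughly June-August
sample = seasonal_distribution(daily_tmax, summer)
print(f"summer Tmax: mean={sample.mean():.1f}, p95={np.percentile(sample, 95):.1f}")
```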
The distributions of environmental conditions generated by the environmental condition model are used in the regional fire likelihood, regional fire spread, and property fire spread models. The regional fire likelihood model uses a Random Forest trained on data from past fires and climatic conditions in a region to determine the likelihood of fires occurring in that region in the future; it generates the probability of a fire occurring in a region based on the predicted environmental conditions from the environmental condition model. The regional fire spread model uses the environmental conditions in combination with the USFS regional wildfire simulation tool FlamMap (Finney, 2006) to generate a likely area of effect of a fire impacting the target region. Finally, the property fire spread model uses the environmental conditions and a layout of the property supplied by the user to simulate a fire spreading across the property with the Fire Dynamics Simulator (FDS) (McGrattan et al., 2013). This component simulates the damage and trajectory of a fire through the target property under different initial ignition points and uses the results to determine the expected damage to the property in USD.
The goal of WIMPLE is to provide homeowners with information on the risk wildfires pose to their property and to suggest mitigation approaches that lessen that risk. WIMPLE uses explainable AI (XAI) techniques to visualize the outputs of the model components in the user interface at both the regional and property-level scales, each of which informs different characteristics of wildfire risk. The regional risk visualization conveys the risk of wildfires around the property, potentially motivating the end user to consider alternative house locations. The property-level visualization shows simulations of different paths that a fire could take across a property, highlighting potential hazards that property owners should address.
In future development, we hope to extend the property level simulation to include counterfactual simulations to highlight the important role of mitigating fire hazards on a homeowner’s property. For instance, a counterfactual could read: “Under the current property layout, the expected damage from wildfires is $300,000. However, if you remove the trees at points A and B, the expected damage would decrease by 30% to $210,000.” This statement would be accompanied by simulations of a fire in the property without the trees mentioned in the mitigation suggestion above. This allows the end user to see the visual and statistical impacts of wildfire mitigation measures, hopefully motivating them to take action to protect their property.
Application of ML/AI for Calibration of Parameters of the Multigrid Beta Filter (MGBF) and Its Performance Within the Three-Dimensional Real-Time Mesoscale Analysis (3D RTMA) Project
Miodrag Rancic, R. James Purser, Manuel De Pondeca, Edward Colon, and Ting Lei
The Multigrid Beta Filter (MGBF), a technique developed at EMC for the synthesis of covariances in a data assimilation system, is incorporated in the Three-Dimensional Real-Time Mesoscale Analysis (3D RTMA) and applied to the modeling of background error covariance. The 3D RTMA is configured to run on the North American Rapid Refresh Forecast System (RRFS) domain, using the RRFS model to generate the background fields. The method for determining the filter parameters, that is, its calibration, uses as input the correlation lengths derived from error statistics generated by the NMC method. This paper describes a Machine Learning (ML) version of the calibration approach, based on a Deep Neural Network (DNN) that replaces the original lookup-table method in the hope of improving computational efficiency, and presents the first results obtained within the 3D RTMA project.
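To illustrate the general pattern of replacing a lookup table with a learned regression, the sketch below trains a small neural network to map correlation lengths to filter parameters on synthetic data. The mapping, feature layout, and network size are illustrative assumptions only, not the actual MGBF calibration used in 3D RTMA.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
# Synthetic stand-in: NMC-derived correlation lengths (e.g., x, y, z; km) ...
corr_lengths = rng.uniform(5.0, 300.0, size=(5000, 3))
# ... and stand-in "true" filter parameters the lookup table would have returned.
filter_params = np.log(corr_lengths) + 0.1 * rng.standard_normal((5000, 3))

# Small DNN regression that replaces table interpolation at analysis time.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(corr_lengths[:4000], filter_params[:4000])

print("held-out R^2:", model.score(corr_lengths[4000:], filter_params[4000:]))
```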
Emerging Technologies: Cloud Computing
Spack-stack: A Reproducible, Many-Platform Software Stack for Operational Weather Applications
Alexander J.W. Richert, Dominikus Heinzeller, Cameron Book, Edward Hartnett, Stephen Herbener, Hang Lei, and Mark A. Potts
Large-scale numerical weather prediction applications such as the Unified Forecast System (UFS) and the Joint Effort for Data assimilation Integration (JEDI) depend on dozens of software libraries, and providing those dependencies in a stable and reproducible manner is a significant challenge. Spack is a widely used package manager specifically designed for installing scientific software on high-performance computing (HPC) systems and cloud platforms. We present spack-stack, a set of configurations and extensions for Spack. By leveraging Spack’s extensive built-in capabilities, repository of build recipes, and community support, spack-stack achieves robust, reproducible software stacks that are readily implemented across the development, HPC, and cloud resources of the National Weather Service, the Joint Center for Satellite Data Assimilation, and other partner organizations. We demonstrate spack-stack’s capabilities in supporting NOAA models, as well as the JCSDA JEDI framework. We discuss configurations and customizations available through spack-stack, as well as future plans and challenges.
An Overview of the IOOS Coastal Modeling Cloud Sandbox: An Emerging Technology
Ali Abdolali, Graeme Aggett, Nels Frazier, Kelly Knee, Michael Lalime, Katherine Moore Powell, Derrick Snowden, Patrick Tripp, Tiffany C. Vance, Micah Wengren, and Zach Wills
The IOOS Coastal Modeling Cloud Sandbox is an infrastructure designed to support collaboration and enhance model development. It provides a framework for developing, modifying, and running models in the cloud, including repeatable configurations, model code and required libraries, input data, and analysis of model outputs. The Sandbox supports not only the development of services and models but also cloud HPC to run and validate models. This project aims to enhance the platform's accessibility by establishing a version of the Sandbox on NOAA cloud resources, and it will also provide resources for further development, refinement, and testing of the Sandbox to support future collaborations (hindcast, nowcast, and forecast). This endeavor will promote collaboration among NOAA Line Offices and eventually extend to the wider coastal modeling research community.
The conference presentation will feature an overview of the project, including the timeline for deploying a NOAA instance of the Sandbox and improving associated utilities. This will involve integrating coastal applications based on the aforementioned model components, model pre- and post-processing steps, and statistical analysis, as well as engaging the user community to improve and utilize the platform.
Augmenting Covariance Operators With Machine Learning: Generating Dedicated Datasets in the Cloud and a Prototype Model
Sergey Frolov, Timothy A. Smith, Peter Vaillancourt, Jeffrey Whitaker, Zofia Stanley, Wei Huang, Henry R. Winterbottom, and Clara Draper
NOAA’s suite of data assimilation (DA) and forecasting applications rely on ensemble-based methods to obtain credible estimates of model uncertainty. However, the immense computational cost of propagating an ensemble will impede the implementation of model improvements, such as grid refinement and model coupling. In this talk, we discuss plans to use machine learning methods to augment the ensemble, in order to improve the uncertainty representation without increasing the number of members. We highlight a successful prototype model that estimates vertical correlations, a necessary quantity for strongly coupled DA. The prototype estimates atmosphere-ocean surface temperature correlations based on a 5-member ensemble average of near surface fields like 2-meter humidity and the ocean mixed layer depth.
The training dataset used to develop our prototype is based on an 80-member ensemble from a single 24-hour forecast. Our goal is to generate operators that evolve dynamically in time with the underlying flow and represent statistical properties more accurately than we can achieve with the existing dataset. Therefore, we discuss our plans to generate two dedicated datasets using the weakly coupled UFS modeling framework: one will use a 1-degree horizontal grid spacing with ~800 ensemble members, while the second will use a finer, 1/4-degree horizontal grid spacing with ~240 ensemble members. The limitations inherent to our currently available high-tenancy, queue-bound computing systems motivate us to exercise our reanalysis workflow in the cloud, and we discuss some of the advantages and challenges associated with this shift. Finally, the datasets will be made freely available in Zarr format, which is friendly to both machine learning and traditional weather and climate analysis workflows.
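For readers unfamiliar with the target quantity, the sketch below computes the kind of ensemble-derived, point-wise correlation field (here, between an atmospheric and an ocean surface temperature proxy) that a machine-learning emulator would be trained to reproduce. Array shapes, variable names, and the random data are illustrative assumptions, not actual UFS ensemble output.

```python
import numpy as np

rng = np.random.default_rng(1)
n_members, ny, nx = 80, 45, 90
t2m = rng.standard_normal((n_members, ny, nx))                     # 2-m air temperature proxy
sst = 0.6 * t2m + 0.8 * rng.standard_normal((n_members, ny, nx))   # ocean surface temperature proxy

def ensemble_correlation(a, b):
    """Pearson correlation between two fields across the ensemble dimension."""
    a_anom = a - a.mean(axis=0)
    b_anom = b - b.mean(axis=0)
    cov = (a_anom * b_anom).mean(axis=0)
    return cov / (a_anom.std(axis=0) * b_anom.std(axis=0))

corr = ensemble_correlation(t2m, sst)   # the target field an ML emulator would learn
print("domain-mean correlation:", float(corr.mean()))
```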
Updates & Challenges of UFS Applications: S2S/GEFS/SFS
Impact of Stochastic Physics in Coupled Simulations of the UFS and CESM
Philip Pegion, Judith Berner, and Xiao-Wei Quan
The Weather Program Office funded a study to evaluate the impact of stochastic physics on S2S timescales in two coupled models. As part of this project, both groups produced long control simulations (50 years for the UFS and 100 years for CESM2) along with an additional simulation with stochastic physics active. The UFS simulations were carried out with a 1-degree version of Prototype 8 with SKEB and SPPT as the active stochastic schemes, while the CESM2 simulations added only SPPT.
The representation of ENSO and other modes of climate variability generally improved with the addition of stochastic physics. The deterministic UFS simulation has too-weak ENSO variability and CESM2 has too-strong ENSO variability; both move closer to observations with stochastic physics, consistent with previous studies using CESM1 and the ECMWF coupled model. The presentation will show the value of 1-degree simulations both for developing stochastic parameterizations and as a scientific tool for improving S2S predictions with the UFS.
Diagnosing Sea Ice in the Unified Forecast System (UFS)
Neil Barton, Robert Grumbine, Dmitry Dukhovskoy, Philip Pegion, and Avichal Mehra
The Unified Forecast System (UFS) is NOAA's fully coupled Earth system modeling framework for weather and climate scales, coupling multiple models/components for accurate representation and prediction of the Earth system. The UFS is currently being developed for future versions of the Global Forecast System (GFS) and the Global Ensemble Forecast System (GEFS). A new aspect of the UFS compared to previous versions of the GFS and GEFS is the addition of the CICE sea ice model and the Modular Ocean Model version 6 (MOM6) ocean model. This presentation will focus on results from the CICE model, in particular large-scale sea ice characteristics from development runs of the UFS. In general, the large-scale sea ice is reasonable in the UFS runs. During Northern Hemisphere summer months, Arctic sea ice has a negative bias in the first couple of days of the forecast, while Antarctic sea ice has a positive bias throughout most months. After examining the current results, future steps in UFS sea ice development will also be discussed.
NOAA’s Seasonal Forecast System (SFS) Development Plan and Project Management
Yan Xue, Kevin Garrett, Jessie Carman, Avichal Mehra, and Phil Pegion
NOAA has recently received funding to develop an operational Seasonal Forecast System (SFS) at the National Weather Service (NWS). The SFS will enable NWS to provide critical long-range predictions for water resources, including flood and drought; storm severity and frequency; hurricane intensity and frequency; marine heat waves; extreme heat and cold waves; extreme winds; fire severity and danger; and other environmental factors, nationally and globally.
The SFS will build upon and extend the capabilities of the subseasonal forecast system based on the atmosphere-land-ocean-sea ice-wave-aerosol coupled Unified Forecast System (UFS). An accurate SFS requires better physical descriptions of slowly changing processes on the land, in the oceans, for ice, and for atmospheric composition. Data assimilation improvements for the land, ocean, and sea ice states are needed in order to more accurately represent the initial states of the model components that provide the long-term memory of the Earth system. A historical reanalysis and reforecast will be produced for model calibration and, together with post-processing methods including machine learning, to further improve seasonal forecast outlooks.
The Office of Science and Technology Integration Modeling Program Division (OSTI-Modeling) of NWS and the Subseasonal-to-Seasonal (S2S) Program of OAR's Weather Program Office are jointly funding an SFS Project. The SFS team, composed of scientists from NCEP centers, OAR labs, and NCAR, is developing an SFS Development Plan and an SFS Implementation Plan, which document the development requirements and needs as well as the work plans toward implementation. We will present the key features of the SFS Development Plan and solicit feedback and comments from the community.
The Development of Coupled GEFS: Status and Challenges
Bing Fu, Yuejian Zhu, Philip Pegion, Hong Guan, Eric Sinsky, Xianwu Xue, Jiayi Peng, Fanglin Yang and Avichal Mehra
NOAA/NCEP is working toward the next implementation, around 2025, which introduces a fully coupled UFS Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) to replace the current operational uncoupled GFS/GEFS. In preparation, a series of coupled ensemble prototypes has been developed targeting the final version of GEFSv13. Four coupled ensemble prototypes (EP1-EP4) have been developed, incorporating model innovations from all the coupled model components, especially the atmospheric physics and land surface model. Updates to the atmospheric stochastic physics schemes and the introduction of ocean stochastics are also integrated into the coupled model to better represent coupled model uncertainties. To evaluate model performance from weather to subseasonal time scales, a 2-year experiment with weekly initializations, 11 members, and 35-day forecasts has been completed for the first three ensemble prototypes (EP1-EP3); the EP4 experiment is planned and will be carried out soon. In this presentation, we will update the status of the coupled GEFS, present evaluations of the first three ensemble prototypes and preliminary results from EP4, and discuss the challenges associated with computational resources and model stability.
Examining the Sensitivity of SST to Ocean Initial Conditions in Seasonal Forecast
Shan Sun, Rainer Bleck, Ben Green, Yuejian Zhu, Bing Fu, Philip Pegion, and Wanqiu Wang
We examined the sea surface temperature (SST) in a suite of seasonal forecast experiments utilizing NOAA’s fully coupled Unified Forecast System (UFS) model. The UFS incorporates the atmospheric model FV3 with the Global Forecast System (GFS) physics package V17, as well as the MOM6 ocean model, WW3 wave model, and CICE6 sea ice model. The control experiment was based on the UFS coupled model Prototype 8, except with a 50 km mesh instead of 25 km in the atmospheric model. Discrepancies in global energy fluxes at the top and bottom of the model atmosphere emphasize the need to examine conservation of energy and gaseous/condensed mass field constituents in the physics equations.
We conducted a second experiment identical in all respects to the control experiment except for the use of ocean initial conditions from the Ocean Reanalysis System 5. The results showed a reduction in bias for both ocean temperature and the atmospheric energy budget compared to the control experiment, highlighting the long-term impact of ocean subsurface initial conditions on the evolution of the coupled system. We will also discuss the differences in cloud and precipitation patterns seen in the two experiments.
Making the Unified Forecast System Cool
Investigating the Radiative Impact of Saharan Dust Aerosols on Medium-Range Forecasts for African Easterly Waves in the Unified Forecast System
Christian Barreto-Schuler, Dustin Grogan, Christopher Thorncroft, and Sarah Lu
Aerosols produce some of the largest uncertainties in numerical weather prediction due to their efficient absorption and scattering of solar and terrestrial radiation, which modifies the thermodynamic structure and energy budgets of the atmosphere. During summertime, the hot and dry central-west Sahara (15-25°N, 15-5°W) has some of the largest dust emissions globally. In this region, dry convection mixes the dust throughout the Saharan boundary layer (1000-500 hPa). Meanwhile, differential heating by the Saharan Desert sets up a mid-tropospheric African easterly jet (AEJ) across North Africa (at ≈650 hPa, 15°N) that transports the dust westward across the Atlantic. The AEJ also facilitates the development of synoptic-scale African easterly waves (AEWs), which can interact directly with the dust aerosols.
In this study, we use the Unified Forecast System (UFS), which incorporates aerosol effects into the radiation scheme, to investigate their impacts on medium-range (3-7 day) weather forecasts. The model is used to examine an AEW that developed near the end of July 2020 and interacted with a synoptic-scale dust plume near the coast of West Africa. Our goal is to quantify the impact of the aerosol radiative effects on this synoptic weather system. To do this, we conduct sensitivity experiments by running forecasts with different aerosol configurations, which are then compared with each other as well as with the MERRA-2 reanalysis. Preliminary results at 850 hPa (where the dust influence is expected to be stronger) indicate that the amount of aerosol plays an important role in the spatial and temporal evolution of the AEW (Figs. 1 and 2). For instance, differences are seen in the AEW's amplitude and in the horizontal configuration of the cyclonic relative vorticity, which also lead to variations in the precipitation forecast.
Developing Guidelines for Measuring, Defining and Fostering Innovations in Earth Prediction Systems at NOAA’s Weather Program Office
Laura Dailey, Jose-Henrique Alves, Jessie Carman, Jordan Dale, Maoyi Huang, Chandra Kondragunta, and John Ten Hoeve
This investigation is part of the research activities for my 2023 William M. Lapenta summer internship at NOAA's Weather Program Office (WPO) and was inspired by the WPO FY2023 Innovations for Community Modeling Competition. Through this competition, WPO funds dynamic research to advance forecasting methods for the Unified Forecast System (UFS) and to promote innovations in numerical weather prediction that enhance accuracy, efficiency, and reliability. My research addresses the following questions to help set guidelines for how WPO and the weather community as a whole measure innovation: What is innovation? How can one define innovation in the weather and climate enterprise of Earth scientists, numerical modelers, forecasters, and the general public? How do government, private industry, academia, and other organizations in the Weather Enterprise understand innovation? How can innovation be measured? How do we describe the impact or value of innovative research? How can we measure progress and learn from failures? The outcomes of my research will help WPO fund innovative approaches to developing Earth prediction systems rather than incremental changes. This preliminary framework will provide guidelines for reviewers to score grant proposals based on innovation and will bring new and fresh ideas into operational forecasting. It will help identify opportunities in WPO initiatives, such as the WPO Innovation for Next-Generation Scientists fellowship, to better align research with WPO's mission of improving Earth prediction systems such as the UFS and to gain new perspectives from all members of the community. This presentation will also highlight additional activities and efforts that WPO can implement to better foster innovation and help draw in the rising generation.
Representation Matters: Insights, Strategies, and Perspectives from the Inaugural UFS/EPIC Student Ambassador
Alekya Srinivasan, Jennifer Vogt, Krishna Kumar, Maoyi Huang, and Neil Jacobs
Interpreting and visualizing the future of programming and technological advancements is essential for adapting to the ever-changing scientific community. Creating the intern position of Student Ambassador for the Unified Forecast System (UFS) signifies the importance of engaging with a diverse range of users across the Weather Enterprise. Composed of industry, the private sector, and academia, the Weather Enterprise is a crucial component of maintaining community engagement and technical support. However, recent research shows that academic participation levels are lower than those of industry or the private sector. To promote innovative change, this project combines two routes: (1) outreach, gathering qualitative data from universities, and (2) evaluating tutorials and training materials and reconfiguring application infrastructure and source code into Jupyter Notebooks. Contacting universities with renowned atmospheric science and/or computer science programs to discover current and future interests in programming and/or numerical weather prediction will inform scientists from different backgrounds about how they can further adapt to the community's needs. Reconfiguring the Short-Range Weather application infrastructure and source code into Jupyter Notebooks will further expand the accessibility and inclusivity of public UFS GitHub repositories. This presentation covers the Student Ambassador-designed UFS Student Engagement Plan, which will include all findings and ideas from an undergraduate student perspective.