
WWW Technical Progress Report on the Global Data Processing System 1999

THE NATIONAL CENTERS FOR ENVIRONMENTAL PREDICTION

NATIONAL WEATHER SERVICE: U.S.A.


1. Highlights For Calendar Year 1999

Application software development efforts were focused on the conversion from the Cray computer systems to the new IBM SP system throughout all of calendar year 1999. On 27 September, at the height of the conversion effort and just prior to the completion of the Cray C90 code conversion, the C90 suffered catastrophic fire damage and was taken out of service. Even though the C90 code conversion was nearly complete, the IBM SP was not available to NCEP operations because it was being physically relocated from the Suitland Federal Center to the Bowie Census Computer Center. From 27 September to 17 November, when the IBM SP installation in Bowie was completed, NCEP production was run in a degraded backup configuration, utilizing forecast products created on the Cray J916s and from the NOAA Forecast Systems Laboratory (FSL), Air Force Weather Agency (AFWA) and Navy Fleet Numerical Meteorology and Oceanography Center (FNMOC). Table 1 describes the configuration of NCEP production during that period.

Table 1. NWS computer configuration at Federal Building 4 (FB4), Suitland Federal Center, Suitland, Maryland during the period 27 September - 17 November, 1999

Forecast Model   Normal Operation                            Backup Configuration
NGM              2/day from Regional data                    2/day
                 assimilation system
RUC              Hourly                                      FSL RUC - hourly
Eta              4/day - 32 km                               2/day - 80 km (00Z & 12Z)
                                                             AFWA MM5 - 36 km (06Z & 18Z)
Aviation         4/day - T126, 84 hr fcst                    2/day - T126
MRF              1/day - 384 hr fcst                         1/day - 168 hr fcst
Hurricane        4/day GFDL 3-nested                         2/day GFDL 2-nested
                                                             FNMOC 2/day 3-nested

 

On 17 November, the NCEP production suite was returned to full production mode with all C90 applications running on the IBM SP. The conversion of Cray J916 applications continued for the remainder of the calendar year and is expected to be completed by April 1, 2000.

The major changes introduced into the NCEP Operational Production Suite in 1999 as part of the Cray to IBM SP conversion were:

+ NGM is initialized from the Eta analysis at 00Z and 12Z. The Regional Data Assimilation System (RDAS), with its attendant Optimum Interpolation (OI) analysis, was not converted to the IBM SP.

 

2. Equipment In Use

2.1 Status at the End of 1999

Within the Suitland, Maryland, Federal Center computer complex (FB4), there is an optical fiber based TCP/IP network, as well as Network Systems Corporation (NSC) routers and various networking hub equipment. Broadband communication links, both Fiber Distributed Data Interface (FDDI) and FDDI Network Service (FNS), are connected to the NOAA Science Center (NSC), Silver Spring Metro Center (SSMC2), and the Goddard Space Flight Center (GSFC). The GSFC link provides access to the Internet. Additional high speed links tie in the NCEP Centers located in Miami, Florida (Tropical Prediction Center), Kansas City, Missouri (Aviation Weather Center) and Norman, Oklahoma (Storm Prediction Center).

A large number of network-connected scientific workstations (mostly SGI, HP, and Sun machines) are used throughout NCEP. Selected UNIX workstations and UNIX-based communications servers are available through telephone access. This provides dial-in capability for NCEP and other approved users to all network attached machines including the Crays.

There are two Cray J916s in the Federal Center complex. The J-916s are connected to each other via High-Performance Parallel Interface (HIPPI, 100 MB/s) channels and switches, as well as through FDDI (10 MB/s) and Ethernet connections. Both Cray systems have access to a Redundant Array of Independent Disks (RAID) technology Network Disk Array (NDA) with a capacity of nearly 850 GB. The HIPPI channels and switches are currently used only for direct, non-shared access to the NDA from each machine, and the 850 GB of NDA storage is apportioned between the two Crays. NFS cross-mounting allows access to most file systems on both Cray systems for non-operational use.

The Census Bowie Computer Center houses the IBM SP computer system. The networking infrastructure at the Census Bowie Computer Center (CBCC) consists of two Fore Systems ASX-200BX ATM switches, one Fore Systems PowerHub 8000, and one Ascend router.

The ASX-200BXs are connected to each other by an ATM OC-12 (622 Mb/s) fiber optic link. These two switches are also connected to the PowerHub and the Ascend router via OC-3 (155 Mb/s) fiber optic links. The PowerHub provides 100BaseT communications to local computer systems, and the Ascend router provides communications to the IBM SP.

Bell Atlantic provides high-speed ATM OC-3 (155 Mb/s) communications from the ASX-200BX switches to the National Weather Service (NWS) at the World Weather Building (WWB) in Camp Springs, MD, and also provides 10 Mb/s Fast Network Service to the NWS in Silver Spring, MD.

Table 2 details the NWS computer configuration at the Suitland Federal Center and Census Bowie Computer Center existing at the end of 1999.

TABLE 2. NWS computer configuration at Federal Building 4 (FB4), Suitland Federal Center, Suitland, Maryland, and the Census Bowie Computer Center, Bowie, Maryland as of Dec. 31, 1999.

Configuration 12/31/99         IBM SP        Cray J-916     Cray J-916

Processors                     768           16             16
Memory                         208 GB        2048 MB        2048 MB
Operating System(s)            AIX           UNICOS 10      UNICOS 10
Disk Storage                   4.6 TB        116 GB         116 GB

Shared Storage Resources: 12/31/1999

Network Disk Array             1270 GB
Automated Cartridge Library    9 TB
Cray Reel Library              23,000 tapes

 

The large scale numerical weather forecast models and data assimilation systems are run on the IBM SP. In step with the model’s processing, model output is incrementally transferred to one of the J-916s. On this system, application programs generate bulletins and graphic products, which are made available to the on-site forecasters and to the National Weather Service’s Office of Systems Operations (OSO) for distribution. The second J-916 serves as a backup machine should the other Cray not be available.

A Storage Technology Corporation (STK) Automated Cartridge System Library System (ACSLS), consisting of four Library Storage Modules (LSMs), provides both Crays with access to approximately 10 terabytes of near-line storage and gives one of the J-916s access to up to an additional 50 terabytes of near-line storage. The library was installed in August 1997. These devices support almost all of the tape processing performed on the Crays. One supported function is hierarchical data migration using Cray's Data Migration Facility (DMF) software, which provides the user community with 91 GB of online storage backed by 2.1 TB of near-line storage. The Automated Cartridge Libraries also manage the 23,000-tape Cray Reel Library repository.

2.2 Future Plans

NCEP procured an IBM SP computer system which will replace the current Cray systems in early 2000. The contract was awarded in October 1998, and the system was accepted in June 1999. However, the Cray J-916s will continue to be utilized for operational purposes until early 2000. Additional UNIX equipment is planned to be purchased to replace the data ingest workstations, and the Supervisor Monitor Scheduler (SMS) workstations.

 

3. Observational Data Ingest and Access System

3.1 Status at the End of 1999

3.1.1 Observational Data-Ingest

NCEP receives the majority of its data from the Global Telecommunications System (GTS), the NOAA Environmental Satellite, Data, and Information Service (NESDIS), and aviation data circuits. Table 3 contains a summary of the types and amounts of data available to NCEP’s global data assimilation system during January 2000. The GTS and aviation circuit bulletins are transferred from the NWS Telecommunications Gateway to NCEP’s Central Operations (NCO) over two 56 kbps lines. Each circuit is interfaced through an X.25 pad connected to a PC running the LINUX operating system with software to accumulate the incoming data-stream in files. Each file is open for 20 seconds, after which the aged file is queued to the Distributive Brokered Network (DBNet) server for distributive processing. Files containing GTS observational data are networked to one of two Silicon Graphics Origin 200 workstations. There, the data-stream file is parsed for bulletins, which are then passed to the Local Data Manager (LDM). The LDM controls continuous processing of a bank of on-line decoders by using a bulletin header pattern-matching algorithm. Files containing GTS gridded data are parsed on the LINUX PC, "tagged by type" for identification, and then transferred directly to the Cray J916s by DBNet. There, all data are stored in appropriate accumulating data files according to the type of data. Some observational data and gridded data from other producers (e.g., satellite observations from NESDIS) are processed in batch mode on the Cray J916s as the data become available.
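As an illustration of the header-based dispatch step, the sketch below routes bulletins to decoders by pattern-matching their WMO abbreviated headings. The bulletin delimiter, header patterns, and decoder names are assumptions for illustration only, not the operational LDM configuration.

    import re

    # Illustrative header patterns; the operational pattern table is far larger.
    DECODERS = {
        re.compile(r"^S[IM]"):   "synoptic_decoder",   # assumed: surface synoptic bulletins
        re.compile(r"^SA"):      "metar_decoder",      # assumed: METAR bulletins
        re.compile(r"^U[SKLP]"): "upperair_decoder",   # assumed: TEMP/PILOT bulletins
    }

    def dispatch_bulletins(stream_text):
        """Split an accumulated data-stream file into bulletins and route each
        one to a decoder by matching its WMO abbreviated heading."""
        for bulletin in stream_text.split("\x03"):      # assumed end-of-message delimiter
            lines = bulletin.strip().splitlines()
            if not lines:
                continue
            heading = lines[0]
            for pattern, decoder in DECODERS.items():
                if pattern.match(heading):
                    yield decoder, bulletin
                    break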

 

Table 3. Summary of data used in NCEP’s global data assimilation system (GDAS). Data counts are averages for January 2000.

GDAS Cycle Run                               0000 UTC   0600 UTC   1200 UTC   1800 UTC   Daily Total
GDAS Cycle Data Cutoff Time                  0600 UTC   0940 UTC   2000 UTC   2200 UTC

Data Category     Data Sub-Category
Land Sfc          Synoptic                     16,348     16,224     16,823     16,424        65,819
                  METAR                        24,172     24,490     26,444     26,370       101,476
                  Sub-total                    40,520     40,714     43,267     42,794       167,295
Marine Sfc        Ship                            814        787        798        781         3,180
                  Drifting Buoy                 1,720      1,545      2,177      2,117         7,559
                  Moored Buoy                     732        736        735        732         2,935
                  CMAN                            405        405        395        404         1,609
                  Sub-total                     3,671      3,473      4,105      4,034        15,283
Land Soundings    Fixed Land RAOB                 609        118        606        111         1,444
                  Mobile Land RAOB                  4          4          4          3            15
                  Dropsonde                         5          1          0          1             7
                  Pibal                            75         91         78         76           320
                  Profiler                        214        214        218        212           858
                  NEXRAD Wind                   1,125      1,121      1,100      1,143         4,489
                  Sub-total                     2,032      1,549      2,006      1,546         7,133
Aircraft          AIREP                           884      1,002      1,093      1,107         4,086
                  PIREP                           323         69        165        444         1,001
                  AMDAR                         2,951      3,678      4,350      4,163        15,142
                  ACARS                        11,547      8,023      6,534     11,261        37,365
                  RECCO                             5          1          1          3            10
                  Sub-total                    15,710     12,773     12,143     16,978        57,604
Satellite         GOES                         22,701     21,897     24,426     21,143        90,167
Radiances         SBUV                            319        250        332        314         1,215
                  TOVS1B - HIRS                67,356     59,680     77,144     65,686       269,866
                  TOVS1B - HIRS3               35,909     19,771     36,912     30,872       123,464
                  TOVS1B - MSU                  8,340      7,325      9,369      8,780        33,814
                  TOVS1B - AMSUA               66,625     34,590     66,321     56,626       224,162
                  Sub-total                   201,250    143,513    214,504    183,421       742,688
Satellite         US High Density              37,461     22,898     25,537     34,985       120,881
Cloud Winds       US Picture Triplet              459        224        486        275         1,444
                  Japan                           557        604        671        664         2,496
                  Europe                        1,141      1,275      1,500      1,035         4,951
                  Sub-total                    39,618     25,001     28,194     36,959       129,772
Satellite         SSM/I Neural Net 3
Surface             Wind Speeds                14,067     11,552     16,612     13,196        55,427
                  ERS Scatterometer Wind       60,916     41,084     81,493     55,513       239,006
                  Sub-total                    74,983     52,636     98,105     68,709       294,433
Overall Total                                 377,784    279,659    402,324    354,441     1,414,208

 

3.1.2 Decoder Processing

The decoder software is designed to divide processing into two independent parts: an observation-parser and an application-encoder. In between, a common data-interface is utilized so that different encoding software can be conveniently interchanged to meet the requirements of different applications. The two primary data representation forms used by application software at NCO are the World Meteorological Organization (WMO) Binary Universal Form for the Representation of meteorological data (BUFR) for numerical weather prediction (NWP) modeling needs and the GEneral Meteorological PAcKage (GEMPAK) form for interactive forecasting needs. Both are flexible, compact, self-defining data representation forms. The same observation-parser software can produce both of these data representation forms using the common in-memory interface and the appropriate application encoding software.
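A minimal sketch of this split is shown below: one observation-parser fills a common in-memory record, and interchangeable encoders produce the application forms. The class, field, and function names are illustrative assumptions, not NCO's actual software, and the encoders only stand in for the real BUFR and GEMPAK encoding.

    from dataclasses import dataclass

    @dataclass
    class Observation:
        station: str
        time: str            # e.g. "1999-12-31T12:00Z"
        values: dict         # element name -> decoded value

    def parse_bulletin(text: str) -> Observation:
        # A real observation-parser would decode the WMO code form here.
        station, time, *pairs = text.split()
        return Observation(station, time, dict(p.split("=") for p in pairs))

    def encode_bufr_like(ob: Observation) -> bytes:
        # Stand-in for the BUFR application-encoder (NWP database path).
        return repr((ob.station, ob.time, sorted(ob.values.items()))).encode()

    def encode_gempak_like(ob: Observation) -> str:
        # Stand-in for the GEMPAK application-encoder (interactive forecasting path).
        return f"{ob.station} {ob.time} " + " ".join(f"{k}={v}" for k, v in ob.values.items())

    # Same parsed record, two output forms:
    ob = parse_bulletin("72403 1999-12-31T12:00Z T=271.3 P=1013.2")
    nwp_record = encode_bufr_like(ob)
    gempak_record = encode_gempak_like(ob)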

3.1.3 NWP Database Ingest

The observational decoders process in parallel, parsing observations and encoding them into the WMO BUFR data representation form. Each decoder stores its encoded observations in memory until the array reaches 10,000 bytes, the decoder’s observation type or subtype changes, or the end of a bulletin is reached. The contents of the decoder’s old array are then saved in a file on disk, and a new array in memory is acquired. Special binding software is used to manage the decoder files so that a file accumulates encoded observations until the file’s time window has expired. Once a new file is automatically opened, the old file is available for transfer. This file aging technique allows the decoding of new observations and the transfer of decoded observations to be executed in parallel without fracturing a file or an observation. Files are aged for two minutes, so there is an average one-minute delay in the availability of an observation after it has been decoded. Aged files from all decoders are accumulated into a single file before being transferred to the Cray J916s. The automatic DBNet transfer process triggers the release of a job on the Cray J916s which parses each message in the BUFR file by type, subtype and date/time information, opens the appropriate standard UNIX sequential file, and appends the message to the end of that file. The J916 BUFR database consists of UNIX subdirectories and files beneath an arbitrary "database root" directory, which facilitates parallel testing and recovery on other platforms. Each file is described by the UNIX path/filename convention "yyyymmdd/bmmm/xxsss", where "yyyymmdd" is the date during which each observation is valid, "mmm" is the message type, and "sss" is the message subtype. Each file contains all BUFR messages with a particular message type and subtype valid on a particular day. Observational files remain on-line for several days before migration to off-line cartridges. During the on-line period, there is open access to them for accumulating late-arriving observations and for research and study.
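For concreteness, a sketch of that file-naming convention follows; the database root and the type/subtype values shown are illustrative assumptions.

    import os

    def bufr_tank_path(root: str, valid_date: str, msg_type: int, msg_subtype: int) -> str:
        """Return the database path for all BUFR messages of one type/subtype
        valid on one day, relative to an arbitrary "database root" directory."""
        return os.path.join(root, valid_date, f"b{msg_type:03d}", f"xx{msg_subtype:03d}")

    # e.g. all messages of (assumed) type 001, subtype 001 valid on 31 Dec 1999:
    path = bufr_tank_path("/dbroot", "19991231", 1, 1)   # -> /dbroot/19991231/b001/xx001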

3.1.4 Data Access

The process of accessing the observational data base and retrieving a certain set of data is accomplished in several stages by a number of FORTRAN codes. This process is operationally run many times a day to assemble data for model assimilation and dissemination. The script that manages the retrieval of observations provides users with a wide range of options. These include observational date/time windows, specification of geographic regions, data specification and combination, duplicate checking and part merging, and parallel processing. The primary retrieval code (DUMPMD) performs the initial stage of all data dumping by retrieving subsets of the database that contain all the database messages valid for the time window requested by a user. DUMPMD looks only at the date in BUFR section one to determine which messages to copy. This will result in a set containing possibly more data than was requested, but allows DUMPMD to function very efficiently. A final ‘winnowing’ of the data to a set with the exact time window requested is done by the duplicate checking and merging codes applied to data as the second stage of the process. Finally, manual quality marks are applied to the data extracted. The quality marks are provided by two NCEP groups: the NCO Senior Duty Meteorologists (SDMs) and the Marine Prediction Center (MPC).
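The two-stage idea can be sketched as follows; the in-memory message representation and field names are assumptions for illustration, not the operational FORTRAN interfaces.

    def coarse_dump(messages, start, end):
        # Stage 1 (DUMPMD-like): filter on the BUFR section-one date only, so the
        # result may contain somewhat more data than requested but is cheap to build.
        # Each message is assumed to be a dict with datetime fields.
        return [m for m in messages
                if start.date() <= m["section1_date"].date() <= end.date()]

    def winnow(messages, start, end):
        # Stage 2: keep only observations inside the exact time window requested,
        # dropping duplicates (here keyed on station and observation time).
        seen, kept = set(), []
        for m in sorted(messages, key=lambda m: m["obs_time"]):
            key = (m["station"], m["obs_time"])
            if start <= m["obs_time"] <= end and key not in seen:
                seen.add(key)
                kept.append(m)
        return kept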

3.2 Future Plans

There are several major changes anticipated for the observational ingest system in 2000. The first involves moving the database ingest and data access from the Cray J916s to the IBM RS/6000 SP. The second involves migrating the observational data ingest processing from the SGI Origin 200s to the IBM RS/6000 SP. The benefits of these changes are faster processing and quicker recovery from outages.

 

4. Quality Control System

4.1 Status at the End of 1999

Quality control (QC) of data is performed at NCEP, but the quality-controlled data are not disseminated on the GTS. However, QC information is included in various monthly reports disseminated by NCEP. The data quality control system for numerical weather prediction at NCEP has been designed to operate in two phases: interactive and automated. The nature of the quality control procedure is somewhat different for the two phases.

4.1.1 Interactive Phase

During the first phase, interactive quality control is accomplished by the MPC and the SDMs. MPC personnel use an interactive system that evaluates the quality of the marine surface data provided by ships and by drifting and moored buoys, based on comparisons with the model’s first guess, the provider’s track, and a history file for the observation provider. MPC personnel can flag the quality of the data, and these flags are stored in a file on the mainframe for use during the assimilation phase. The SDM performs a similar quality assessment for radiosonde temperature, wind and height data and for aircraft temperature and wind reports. The SDMs use an interactive program which initiates the "off-line" running of two of the automated quality control programs (described in the next section) and review the programs’ decisions before adding or negating quality assessment decisions. The SDMs use satellite pictures, meteorological graphics, continuity of data, past station performance and horizontal data comparisons or "buddy checks" to decide whether or not to override automatic data QC flags.

4.1.2 Automated Phase

In the automated phase, the first step is to include any manual quality marks attached to the data by MPC personnel and the SDMs. This occurs when time-windowed BUFR data dump files are created from the NCEP BUFR observational data base. Next is the preprocessing program (PREPDATA), which makes some simple quality control decisions to handle special problems and re-codes the data in a special BUFR format with descriptors to handle and track quality control changes. In the process, certain classes of data, e.g., surface marine reports over land and velocity azimuth display (VAD) Doppler radar winds, are flagged for non-use in the assimilation but are included for monitoring purposes. A subsequent program (PREVENTS) attaches the first-guess (background) values and observational errors to the observations. Under special conditions (e.g., data too far under the model surface), observations are flagged for non-use by the assimilation.

Separate automated quality control algorithms for radiosonde, non-automated aircraft and wind profiler reports are run next. The purpose of these algorithms is to eliminate or correct erroneous observations that arise from location, transcription or communications errors. Attempts are made, when appropriate, to correct commonly occurring types of errors. Radiosonde temperature and height data pass through the Complex Quality Control of Heights and Temperatures (CQCHT) program (Gandin, 1989), which makes extensive hydrostatic, baseline, and horizontal and vertical consistency checks based upon differences from the 6-hour forecast. Corrections and quality values are then applied to the radiosonde data. In April 1997, a new CQCHT algorithm was installed that performs the quality control for all levels as a whole, rather than considering the mandatory levels first, and then the significant levels. In addition, an improvement was made to the way in which the hydrostatic residuals are calculated and used (Collins, 1998). AIREP, PIREP, and AMDAR aircraft reports are also quality controlled through a track-checking procedure by the Aircraft Quality Control (ACQC) program. In addition, AIREP and PIREP reports are quality controlled in two ways: isolated reports are compared to the first guess and groups of reports in close geographical proximity are inter-compared. Both CQCHT and ACQC are also run "offline" by the SDM. Finally, wind profiler reports are quality controlled with a complex quality control program using multiple checks based on differences from a 6-hour forecast, including a height-time check.
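For reference, the hydrostatic check in such schemes compares the reported thickness of a layer with its hypsometric value; in the standard form (a generic sketch, not NCEP's exact formulation),

    r = (z_2 - z_1) - \frac{R_d}{g}\,\bar{T}_v \,\ln\!\left(\frac{p_1}{p_2}\right),

where p_1 > p_2 bound the layer, z_1 and z_2 are the reported heights, \bar{T}_v is the layer-mean virtual temperature from the reported temperatures, R_d is the gas constant for dry air, and g is gravity. A residual r much larger than the expected observational error points to a height or temperature error within the layer.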

The final part of the assimilation is for all data types to be checked using an optimum interpolation based quality control (OIQC) algorithm, which uses the results of both phases of quality control. As with any complex quality control procedures, this program operates in a parallel rather than a serial mode. That is, a number of independent checks (horizontal, vertical, geostrophic) are performed using all admitted observations. Each observation is subjected to an optimum interpolation formalism using all observations but itself in each check. A final quality decision (keep, toss, or reduced confidence weight) is made based on the results from all individual checks and any manual quality marks attached to the data by the duty personnel. Results from all the checks are kept in an annotated observational data base.
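Schematically (a generic form of such a leave-one-out check, not the exact operational equations), each check compares an observation's increment with the optimum-interpolation estimate of that increment built from all other admitted observations:

    \Delta_i = (y_i - b_i) - \sum_{j \ne i} w_{ij}\,(y_j - b_j),

where y are the observations, b the 6-hour forecast (background) values, and the weights w_{ij} come from the usual OI solution based on the background and observation error covariances. The observation is kept, given reduced weight, or tossed according to the size of \Delta_i in each check and any manual quality marks.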

4.2 Future Plans

A review of the quality control system will be conducted in 2000 with the purpose of planning upgrades to various parts of the system. One part of this review will be to investigate the methodology for including quality control functions within the three-dimensional variational (3DVAR) analysis. This should lead to a method that unifies the analysis and the quality control and that is applicable to all data types. If found to be successful, a separate, automated quality control step for all data will no longer be needed, although separate QC for radiosonde heights and temperatures, aircraft and profiler data will remain useful. A separate quality control procedure for velocity azimuth display (VAD) wind reports from the NEXRAD system will be implemented in spring 2000.

 

5. Monitoring System

5.1 Status at the End of 1999

5.1.1 Real-time Monitoring

As mentioned in the previous paragraphs, "real-time" monitoring of the incoming GTS and satellite data is performed by a number of computer programs which run automatically before each assimilation, or are run interactively by the NCEP Central Operations SDMs, and provide information for possible action. If there are observational types or geographic areas devoid of data, the SDM will request assistance from the Washington DC Regional Telecommunications Hub (RTH) in obtaining the observations. The SDM may also delay starting an NWP model to ensure sufficient data are available. Four times a day, a web site is updated with reports on what data have been received from US-supported upper air sites.

5.1.2 Next-day Monitoring

"Next-day" data assessment monitoring is accomplished by routinely running a variety of diagnostics on the previous day's output from the decoders, the operational quality control programs, and the NWP analyses to detect problems. When problems are detected, steps are taken to determine the origin of the problem(s), to delineate possible solutions or improvements, and to contact appropriate data providers if it is an encoding or instrument problem.

5.1.3 Delayed-time Monitoring

"Delayed-time" monitoring includes a twice weekly automated review of our production reject list and monthly reports on the quantity and quality of data in accordance with the WMO/CBS that are shared with other GDPS centers. A monthly report is prepared showing the quality, quantity, and timeliness of US supported sites. Monthly statistics on hydrostatic checks and guess values of station pressure are used to help find elevation or barometric problems at upper air sites. This monitoring system includes statistics on meteorological data that can be used for maintaining our reject list and for contacting sites with problems.

5.2 Future Plans

The operational capability to find current upper air reports that are in reality duplicates of old data and to track-check ACARS aircraft data will be improved. New software will be developed to provide the capability to automatically diagnose deficiencies in the numbers of reports within various data categories and subcategories and alert the SDMs of deficiencies. New procedures and software will be added to improve real-time monitoring.

 

6. Forecasting System

6.1 Global Forecasting System

6.1.1 Status of the Global Forecasting System at the End of 1999

Global Forecast System Configuration: The global forecasting system consists of:

a) The final (FNL) Global Data Assimilation System (GDAS), an assimilation cycle with 6-hourly updates and late data cut-off times;

b) The aviation (AVN) analyses and 84-hour forecasts, run at 0000, 0600, 1200, and 1800 UTC with a data cut-off of 2 hours and 45 minutes using the 6-hour forecast from the FNL as the first guess;

c) A once per day 16-day medium-range forecast (MRF) from 0000 UTC using FNL initial conditions and producing high resolution T126 predictions to 7 days and lower-resolution T62 predictions from 7 to 16 days; and

d) Ensembles of global 16-day forecasts from perturbed FNL initial conditions (five forecasts from 1200 UTC, and twelve forecasts from 0000 UTC).

Global Data Assimilation System: Global data assimilation for the FNL and AVN is done with a multivariate Spectral Statistical Interpolation (SSI). This is a 3-dimensional variational technique in which a linear balance constraint is incorporated, eliminating the need for a separate initialization step. The analyzed variables are the associated Legendre spectral coefficients of temperature, vorticity, divergence, water vapor mixing ratio, and the natural logarithm of surface pressure (lnPsfc). All global analyses are done on 28 sigma levels at a T126 spectral truncation. Two new data sources added in 1999 were radiances from the NOAA-15 Advanced Microwave Sounding Unit (AMSU-A) and the High Resolution Infrared Radiation Sounder (HIRS/3). Data cut-off times are 0600, 0930, 2100, and 2130 UTC for the 0000, 0600, 1200, and 1800 UTC FNL analyses, respectively, and 0245, 0845, 1445, and 2045 UTC for the 0000, 0600, 1200, and 1800 UTC AVN analyses.
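For orientation, the SSI minimizes a cost function of the standard three-dimensional variational form (written generically here; the operational formulation works in spectral space and includes the balance constraint noted above):

    J(x) = \tfrac{1}{2}(x - x_b)^{\mathrm T} B^{-1}(x - x_b)
         + \tfrac{1}{2}\bigl(H(x) - y\bigr)^{\mathrm T} R^{-1}\bigl(H(x) - y\bigr),

where x_b is the 6-hour forecast background, y the observations, H the observation operator (including the radiative transfer for radiances), and B and R the background and observation error covariances.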

Global Forecast Model: The global forecast model (Sela 1980, 1982) has the associated Legendre coefficients of lnPsfc, temperature, vorticity, divergence and water vapor mixing ratio as its prognostic variables. The vertical domain includes the entire depth of the atmosphere and is discretized with 28 sigma layers. The Legendre series for all variables are truncated at either T126 (FNL, AVN and the first seven days of the MRF) or T62 (MRF for days 8 through 16 and the ensemble forecasts) triangular truncation. A semi-implicit time integration scheme is used. The model includes a full set of parameterizations for physical processes, including moist convection, cloud-radiation interactions, stability dependent vertical diffusion, evaporation of falling rain, similarity theory derived boundary layer processes, land surface vegetation effects, surface hydrology, and horizontal diffusion. See the references in Kalnay et al (1994) for details.
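As a reminder of what the T126 and T62 truncations mean, each prognostic field is represented by a truncated spherical harmonic (associated Legendre) expansion of the standard form:

    X(\lambda, \mu) = \sum_{m=-M}^{M} \; \sum_{n=|m|}^{M} X_n^m \, P_n^m(\mu)\, e^{im\lambda},

with M = 126 or 62, where \lambda is longitude, \mu is the sine of latitude, and P_n^m are the associated Legendre functions; the triangular limit n \le M applies the same total-wavenumber cutoff in all directions.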

Global Forecast System Products: Products from the global system include:

a) Gridded (GRIB) Sea level pressure (SLP) and height (H), temperature (T), zonal wind component (U), meridional wind component (V), and relative humidity (R) at a number of constant pressure levels every 6 hours for the first 60 hours and at 72 hours of the twice daily AVN forecasts;

b) Specialized aviation grids (GRIB) with tropopause H, T, and pressure as well as fields depicting the altitude and intensity of maximum winds;

c) Extended forecasts (3.5-10 days, every 12 hrs) of SLP, H, U, V, and R at 1000, 850 and 500 hPa issued once per day; and

d) A large number of graphic products.

6.1.2 Future Plans for the Global Forecasting System

Near term changes planned for the production suite include:

a) Increasing the resolution of the global system (FNL and AVN analyses and forecasts and the MRF forecast) from T126 to T170 and increasing the number of vertical levels to 42; extending the AVN forecasts to 126 hours at 0000 and 1200 UTC to provide guidance and boundary forcing for the hurricane model.

b) Higher vertical resolution (60 levels) to make better use of satellite radiances.

c) Modifying convection and tropical storm initialization procedures to reduce false alarms and to improve guidance for tropical storms.

d) Including GOES-10 sounder and QuikSCAT observations.

e) Incorporating prognostic cloud water for the MRF model.

f) Implementing a new quality control procedure for radiosondes.

6.2 Regional Forecast System

6.2.1 Status of the Regional Forecasting Systems at the End of 1999

Regional Forecast System Configuration: The regional systems are:

a) The Mesoscale Eta Forecast Model, which provides high resolution (32 km and 45 levels) forecasts over North America four times daily (0000, 0600, 1200, 1800 UTC) out to 48 hours;

b) The Rapid Update Cycle (RUC) System, which generates (40 km and 40 level) analyses and 3-hour forecasts for the contiguous United States every hour with 12-hr forecasts eight times per day on a 3-hourly cycle; and

c) The Nested Grid Model (NGM), whose North American grid has approximately 90 km resolution on 16 layers, and which generates twice daily 48-hr forecasts for the Northern Hemisphere.

Regional Forecast System Data Assimilation: Initial conditions for the four Meso Eta forecasts are produced by a multivariate 3-dimensional variational (3DVAR) analysis which uses as its first guess a 3 hour Meso Eta forecast from the Eta-based Data Assimilation System (EDAS - Rogers, et al., 1996). The EDAS is a fully cycled system using 3-hour Meso Eta forecasts as a background and global fields only for lateral boundary conditions. Data cut-off is at 1 hour and 10 minutes past the nominal analysis times. No initialization is applied.

Initial conditions for the RUC Model are provided by an optimum interpolation methodology which analyzes directly on the model grid points and on its hybrid sigma-theta vertical coordinate surfaces. Hourly data cut-off times are 18 minutes with special dumps at 55 minutes after 0000 and 1200 UTC. A digital filter initialization is applied.
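In generic terms (a sketch of the standard technique, not the RUC-specific filter coefficients), a digital filter initialization replaces the initial state with a low-pass-filtered combination of model states spanning the initial time:

    x^{\mathrm{DFI}}(t_0) = \sum_{k=-K}^{K} h_k \, x(t_0 + k\,\Delta t),

where the h_k are the coefficients of a low-pass digital filter. High-frequency gravity-wave noise is damped while the meteorological signal is retained, so no separate normal mode initialization is required.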

Until the fire in NCEP’s operational Cray C-90 computer, initial conditions for the twice-daily NGM forecasts came from a hemispheric optimum interpolation analysis which used as its first guess a 3-hour NGM forecast from the Regional Data Assimilation System (RDAS). The RDAS performed 3-hour updates during a 12-hour pre-forecast period but started from the global fields every 12 hours and so was not a fully cycled system. Data cut-off times were 2 hours past the synoptic time. An implicit normal mode initialization was used. With the loss of the C-90 computer in September 1999, the remaining operational computers no longer provided the resources needed to run the RDAS, so initial conditions for the NGM are now provided by a static optimum interpolation analysis using the 6-hour GDAS forecast as a first guess.

Regional Forecast System Models: The Mesoscale Eta forecast model (Black et al., 1993 & 1994; Mesinger et al., 1988) has surface pressure, temperature, u, v, turbulent kinetic energy, water vapor mixing ratio and cloud water/ice mixing ratio as its prognostic variables. The vertical domain is discretized with 45 eta layers, with the top of the model currently set at 25 hPa. The horizontal domain is a 32 km semi-staggered Arakawa E-grid covering all of North America. An Euler-backward time integration scheme is used. The model is based on precise dynamics and numerics (Janjic 1974, 1979, 1984; Mesinger 1973, 1977) and a step-mountain terrain representation (Mesinger 1984), and includes a full set of parameterizations for physical processes, including Janjic (1994) modified Betts-Miller convection, Mellor-Yamada turbulent exchange, Fels-Schwarzkopf (1975) radiation, a land surface scheme with 4 soil layers (Chen et al. 1996) and a predictive cloud scheme (Zhao and Carr 1997, Zhao et al. 1997). The lateral boundary conditions are derived from the prior global AVN forecast at a 3-hour frequency.

The RUC system was developed by the NOAA/Forecast Systems Laboratory under the name of Mesoscale Analysis and Prediction System (MAPS) (Benjamin et al, 1991). The RUC run provides high-frequency, short-term forecasts on a 40-km resolution domain covering the lower 48 United States and adjacent areas of Canada, Mexico, and ocean. Run with a data cutoff of 18 minutes, the analysis relies heavily on asynoptic data from surface reports, profilers, and especially ACARS aircraft data. One of its unique aspects is its use of a hybrid vertical coordinate that is primarily isentropic. Most of its 40 levels are isentropic except for layers in the lowest 1-2 km of the atmosphere where terrain-following coordinates are used. The two types of surfaces change smoothly from one to another. A full package of physics is included with 5 cloud / precipitation species carried as history variables of the model.

The NGM model (Phillips, 1979) uses a flux formulation of the primitive equations and has surface pressure, P, and Pu, Pv, PQ and Pq as prognostic variables, where Q is potential temperature and q is specific humidity. The finest of the nested grids has a resolution of 85 km at 45°N and covers North America and the surrounding oceans. The coarser hemispheric grid has a resolution of 170 km. Fourth-order horizontal differencing and a Lax-Wendroff time integration scheme are used. Vertical discretization is done using 16 sigma levels. Parameterized physical processes include surface friction, grid-scale precipitation, dry and moist convection, and vertical diffusion.

Regional Forecast System Products: Products from the various regional systems include:

a) Heights, winds, temperatures:

(1) Meso Eta (to 48 hours) every 25 hPa and every 3 hours at winds aloft altitudes;

(2) RUC (to 12 hours) every 25 hPa and hourly; and

(3) NGM (to 48 hours) every 50 hPa and every 6 hours.

b) 3, 6 and 12 hour precipitation totals;

c) Freezing level;

d) Relative humidity;

e) Tropopause information;

f) Many model fields in GRIB format;

g) Hourly soundings in BUFR; and

h) Hundreds of graphical output products and alphanumeric bulletins

Operational Techniques for Application of Regional Forecast System Products: Model Output Statistics (MOS) forecasts of an assortment of weather parameters such as probability of precipitation, maximum and minimum temperatures, indicators of possible severe convection, etc. are generated from NGM model output. These forecasts are made using regression techniques based on statistics from many years of NGM forecasts.
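A toy illustration of the regression step follows. The operational MOS equations use many predictors, long developmental samples, and stratification by station and season; the predictor choices and numbers below are illustrative assumptions only.

    import numpy as np

    # Columns: intercept, model 850 hPa temperature (K), model 1000-500 hPa thickness (dam)
    X_train = np.array([[1.0, 278.2, 546.0],
                        [1.0, 283.1, 552.0],
                        [1.0, 270.4, 538.0],
                        [1.0, 275.9, 543.0]])
    y_train = np.array([287.0, 293.5, 279.0, 284.5])   # observed max temperature (K)

    # Fit the regression equation by least squares.
    coeffs, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

    # Apply the equation to predictors from a new model run to get the MOS forecast.
    x_new = np.array([1.0, 276.5, 544.0])
    mos_forecast = x_new @ coeffs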

6.2.2 Future Plans for the Regional Forecast System

Near term plans for the Regional Forecasting System are:

a)    Extend Eta 0000 and 1200 runs to 60 hours initially and then out to 84 hours.

b)    Implement new 3D-VAR analysis to analyze radiances directly from polar-orbiting and geostationary satellites and radial velocities from WSR-88D radars and increase analysis update frequency to every hour.

c)    Increase horizontal resolution of Eta model to 22 km and vertical levels to 50.

d)    Make improvements to Eta model physics in land-surface interactions, convection, grid-scale precipitation and diffusion.

e)    Initialize NGM using Eta 3D-VAR analysis over North America and the AVN over the rest of the grid. Data cutoff will be 1 hour and 10 minutes.

f)    Implement 10 member Short-Range Ensemble Forecasting system to run twice per day to 60 hours on 48 km North American domain.

g)    Continue development of the nonhydrostatic version of the Eta model
(http://sgi62.wwb.noaa.gov:8080/mesojanjic/)

6.3 Specialized Forecasts

Specialized forecasts and systems include the following:

a) A Hurricane (HCN) run is performed when requested by NCEP's Tropical Prediction Center (TPC). The HCN forecast model is the Geophysical Fluid Dynamics Laboratory (GFDL) Hurricane Model (GHM), a triply-nested model with resolutions of 1.0, 1/3, and 1/6 degree latitude and 18 vertical levels. The outermost domain extends 75° in the meridional and longitudinal directions. Initial conditions are obtained from the current AVN run. Input parameters for each storm are provided by the TPC and include the latitude and longitude of the storm's center, the current storm motion, the central pressure, and the radii of 15 m/s and 50 m/s winds. Output from the model consists primarily of forecast track positions and maximum wind speeds, but also includes various horizontal fields on pressure surfaces (such as winds and sea-level pressure) and graphic products such as swaths of the maximum wind speed and total precipitation occurring at each model grid point during the 72-hour forecast.

b) A Hawaii run of the Regional Spectral Model (RSM) provides forecasts over the Hawaiian Islands at a very high resolution (10 km) from 00 and 12 UTC out to 48 hours for distribution to Hawaii via FTP (INTERNET). The RSM is identical to the global spectral model used in the AVN, MRF and FNL, except it is run at much higher resolution. Initial conditions for this run are interpolated from the AVN initial conditions. During the post-fire period, the RSM was run on a smaller computer which delayed its output by several hours. It will be moved to the new IBM computer early in 2000. A 10 km nested version of the Eta is being prepared as a replacement for the RSM.

c) An easily reconfigured, multi-platform version of the Eta has been developed and made available (http://sgi62.wwb.noaa.gov:8080/wrkstn_eta/). This system allows a user to run the Eta at his own site on a UNIX/LINUX workstation and provides the ability to download initial and boundary data in real-time from NCEP models.

d) The global WAve Model (WAM) (WAMDI Group, 1988) runs twice daily with AVN forecast forcing on a 2.5 degree grid and makes global wave height predictions out to 72 hours. A new global ocean wave forecast model (NOAA WAVEWATCH III, NWW3) will be implemented early in 2000, to be run twice daily on a 1.25 x 1.00 degree latitude/longitude grid from 78N to 78S, producing wave directions, frequencies and heights out to 72 hours (http://polar.wwb.noaa.gov/waves/Welcome.html).

e) Daily global Sea Surface Temperature (SST) analyses are made with an optimum interpolation technique which combines in-situ and satellite observations. Weekly SST analyses derived with this system are used as lower boundary conditions in the global assimilation and forecasts.

f) A storm surge model makes twice-daily predictions for the East Coast of the United States out to 48 hours. The model has also been applied to the Gulf of Mexico, the Virgin Islands, Puerto Rico, Guam, and Oahu, HI.

g) Wave models are run twice daily to provide sea state forecasts for the Gulf of Mexico and the Gulf of Alaska. Regional models based on the new NWW3, one covering the western half of the Atlantic Ocean and the Gulf of Mexico and the other covering the Gulf of Alaska and the Bering Sea, have been developed and will be implemented early in 2000 to replace the current models.

h) A once-per-day forecast of the Ultraviolet Index (UVI) is produced (Long et al., 1996).

i) A seasonal climate forecast consisting of a 20-member ensemble of an atmospheric general circulation model (AGCM) is run once per month with 28 levels and a horizontal resolution of approximately 300 km (T42). It produces seasonally averaged forecasts out to 7 months.

j) A sea ice drift model provides guidance for drift distance and direction over the Northern Hemisphere and along the ice edges in both hemispheres. This year the guidance was extended from day 7 out to day 16.

6.4 Verification of Forecast Products - 1999

Annual verification statistics are calculated for NCEP’s global models by comparing the model forecast to the verifying analysis and to the interpolated, closest verifying radiosonde (see Tables 4 and 5).
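The scores in Tables 4 and 5 are standard root-mean-square statistics; written out (generic definitions as commonly used for such verification),

    \mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} (f_i - o_i)^2}, \qquad
    \mathrm{RMSVE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} \bigl[(u_i^f - u_i^o)^2 + (v_i^f - v_i^o)^2\bigr]},

where f denotes the forecast, o the verifying value (the analysis in Table 4, the collocated radiosonde in Table 5), and u and v the wind components.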

Table 4. Verification Against Analyses for 1999.

 

                                 AVN 24 hr   AVN 72 hr   MRF 120 hr
500 hPa Geopotential RMSE (m)
  Northern Hemisphere                 12.4        35.5         62.9
  Southern Hemisphere                 18.7        46.6         75.9
250 hPa Wind RMSVE (m/s)
  Northern Hemisphere                  4.9        10.9         16.3
  Southern Hemisphere                  5.3        11.8         17.5
  Tropics                              4.2         7.3          8.9
850 hPa Wind RMSVE (m/s)
  Tropics                              2.8         4.4          5.2

 

Table 5. Verification Against Radiosondes for 1999.

                                 AVN 24 hr   AVN 72 hr   MRF 120 hr
500 hPa Geopotential RMSE (m)
  North America                       15.9        37.7         62.6
  Europe                              15.6        36.6         67.1
  Asia                                16.4        30.4         46.6
  Australia/New Zealand               12.5        27.6         43.0
250 hPa Wind RMSVE (m/s)
  North America                        7.3        13.1         18.7
  Europe                               6.7        12.3         18.6
  Asia                                 7.3        11.9         15.4
  Australia/New Zealand                7.0        11.4         15.3
  Tropics                              6.6         8.3         12.4
850 hPa Wind RMSVE (m/s)
  Tropics                              4.5         5.5          6.3

 

7. REFERENCES

Benjamin, S. G., K. A. Brewster, R. Brummer, B. F. Jewett, T. W. Schlatter, T. L. Smith, and P. A. Stamus, 1991: An isentropic three-hourly data assimilation system using ACARS aircraft observations. Mon. Wea. Rev., 119, 888-906.

Black, T. L., et al., 1993: The step-mountain eta coordinate model: 80-km "early" version and objective verifications. NWS Technical Procedures Bulletin No. 412, NOAA, U.S. Department of Commerce.

Black, T. L., 1994: The New NMC Mesoscale Eta Model: Description and Forecast Examples. Wea. Forecasting, 9, 265-278.

Chen, F., K. Mitchell, J. Schaake, Y. Xue, H.-L. Pan, V. Koren, Q.Y. Duan, M. Ek and A. Betts, 1996: Modeling of land surface evaporation by four schemes and comparison with FIFE observations. J. Geophy. Research, 101, 7251-7268.

Collins, W. G., 1998: The new complex quality control of rawinsonde heights and temperatures at the National Centers for Environmental Prediction. WMO CAS/JSC WGNE Research Activities in Atmospheric and Oceanic Modelling, Report No. 27, 1.20.

Fels, S. B., and M. D. Schwarzkopf, 1975: The simplified exchange approximation: A new method for radiative transfer calculations. J. Atmos. Sci., 32, 1475-1488.

Gandin, L. S., 1989: Complex quality control of meteorological observations. Mon. Wea. Rev., 116, 1137-1156.

Janjic, Z. I., 1974: A stable centered difference scheme free of two-grid-interval noise. Mon. Wea. Rev., 102, 319-323.

Janjic, Z. I., 1979: Forward-backward scheme modified to prevent two-grid-interval noise and its application in sigma coordinate models. Contrib. Atmos. Phys., 52, 69-84.

Janjic, Z. I., 1984: Non-linear advection schemes and energy cascade on semi-staggered grids. Mon. Wea. Rev., 112, 1234-1245.

Janjic, Z.I., 1994: The step-mountain Eta coordinate model: further developments of the convection, viscous sublayer, and turbulence closure schemes. Mon. Wea. Rev., 122, 927-945.

Kalnay, E., G. DiMego, S. Lord, M. Kanamitsu, A. Leetmaa and D. B. Rao, 1994: NMC modeling and data assimilation plans for 1994-1998. Preprint Volume, 10th Conference on Numerical Weather Prediction, Portland, OR, July 1994, AMS, 143-148.

Long, C. S., A. J. Miller, H.-T. Lee, J.D.Wild, R. Przywarty and D. Hufford, 1996: Ultraviolet index forecasts issued by the National Weather Service. Bull. Amer. Meteor. Soc., 77, 729-748.

Mesinger, F., 1973: A method for construction of second-order accuracy difference schemes permitting no false two-grid interval wave in the height field. Tellus, 25, 444-458.

Mesinger, F., 1977: Forward-backward scheme, and its use in a limited area model. Contrib. Atmos. Phys., 50, 200-210.

Mesinger, F., Z.I. Janjic, S. Nickovic, D. Gavrilov and D.G. Deaven, 1988: The step-mountain coordinate: model description and performance for cases of alpine lee cyclogenesis and for a case of an Appalachian redevelopment. Mon. Wea. Rev., 116, 1493-1518.

Phillips, N. A., 1979: The Nested Grid Model. NOAA Technical Report NWS 22, Silver Spring, MD, 80 pp.

Sela, J. G., 1980: Spectral Modeling at NMC. Mon. Wea. Rev., 108, 1279-1292.

Sela, J. G., 1982: The NMC spectral model, NOAA Technical Report NWS 30, Silver Spring, MD, 36 pp.

WAMDI Group, 1988: The WAM Model --- A third generation ocean wave prediction model. J. Phys. Oceanog., 18, 1775-1810.

Zhao, Q., and F. H. Carr, 1997: A prognostic cloud scheme for operational NWP models. Mon. Wea. Rev., 125, 1931-1953.

Zhao, Q., T. L. Black, M. E. Baldwin, 1997: Implementation of the cloud prediction scheme in the Eta model at NCEP. Wea. Forecasting, 12, 697-712.

