WWW Technical Progress Report on the Global Data Processing System 1999

THE NATIONAL CENTERS FOR ENVIRONMENTAL PREDICTION, NATIONAL WEATHER SERVICE, U.S.A.

1. Highlights For Calendar Year 1999

Application software development efforts were focused throughout calendar year 1999 on the conversion from the Cray computer systems to the new IBM SP system. On 27 September, at the height of the conversion effort and just prior to the completion of the Cray C90 code conversion, the C90 suffered catastrophic fire damage and was taken out of service. Although the C90 code conversion was nearly complete, the IBM SP was not available to NCEP operations because it was being physically relocated from the Suitland Federal Center to the Bowie Census Computer Center. From 27 September to 17 November, when the IBM SP installation in Bowie was completed, NCEP production ran in a degraded backup configuration, using forecast products created on the Cray J916s and products from the NOAA Forecast Systems Laboratory (FSL), the Air Force Weather Agency (AFWA), and the Navy Fleet Numerical Meteorology and Oceanography Center (FNMOC). Table 1 describes the configuration of NCEP production during that period.

Table 1. NWS computer configuration at Federal Building 4 (FB4), Suitland Federal Center, Suitland, Maryland, during the period 27 September - 17 November 1999.
On 17 November, the NCEP production suite was returned to full production mode with all C90 applications running on the IBM SP. The conversion of Cray J916 applications continued for the remainder of the calendar year and is expected to be completed by April 1, 2000. The major changes introduced into the NCEP Operational Production Suite in 1999 as part of the Cray to IBM SP conversion were:
2. Equipment In Use

2.1 Status at the End of 1999

Within the Suitland, Maryland, Federal Center computer complex (FB4), there is an optical-fiber-based TCP/IP network, along with Network Systems Corporation (NSC) routers and various networking hub equipment. Broadband communication links, both Fiber Distributed Data Interface (FDDI) and FDDI Network Service (FNS), connect to the NOAA Science Center (NSC), the Silver Spring Metro Center (SSMC2), and the Goddard Space Flight Center (GSFC). The GSFC link provides access to the Internet. Additional high-speed links tie in the NCEP centers located in Miami, Florida (Tropical Prediction Center), Kansas City, Missouri (Aviation Weather Center), and Norman, Oklahoma (Storm Prediction Center). A large number of network-connected scientific workstations (mostly SGI, HP, and Sun machines) are used throughout NCEP. Selected UNIX workstations and UNIX-based communications servers are available through telephone access, providing dial-in capability for NCEP and other approved users to all network-attached machines, including the Crays.

There are two Cray J916s in the Federal Center complex. The J916s are connected to each other via High-Performance Parallel Interface (HIPPI, 100 MB/s) channels and switches, as well as through FDDI and Ethernet connections. Both Cray systems have access to a Redundant Array of Independent Disks (RAID) Network Disk Array (NDA) with a capacity of nearly 850 GB, apportioned between the two machines. The HIPPI channels and switches are currently used only for direct, non-shared access to the NDA from each machine. NFS cross-mounting allows access to most file systems on both Cray systems for non-operational use. The Census Bowie Computer Center houses the IBM SP computer system.
The networking infrastructure at the Census Bowie Computer Center (CBCC) consists of two Fore Systems ASX-200BX ATM switches, one Fore Systems PowerHub 8000, and one Ascend router. The ASX-200BXs are connected to each other by an ATM OC-12 (622 Mb/s) fiber-optic link. These two switches are also connected to the PowerHub and the Ascend router via OC-3 (155 Mb/s) fiber-optic links. The PowerHub provides 100BaseT communications to local computer systems, and the Ascend router provides communications to the IBM SP. Bell Atlantic provides high-speed ATM OC-3 (155 Mb/s) communications from the ASX-200BX switches to the National Weather Service (NWS) at the World Weather Building (WWB) in Camp Springs, MD, and also provides 10 Mb/s Fast Network Service to the NWS in Silver Spring, MD. Table 2 details the NWS computer configuration at the Suitland Federal Center and the Census Bowie Computer Center at the end of 1999.

Table 2. NWS computer configuration at Federal Building 4 (FB4), Suitland Federal Center, Suitland, Maryland, and the Census Bowie Computer Center, Bowie, Maryland, as of 31 December 1999.
The large-scale numerical weather forecast models and data assimilation systems are run on the IBM SP. In step with the model processing, model output is incrementally transferred to one of the J916s. On this system, application programs generate bulletins and graphic products that are made available to the on-site forecasters and to the National Weather Service's Office of Systems Operations (OSO) for distribution. The second J916 serves as a backup machine should the other Cray become unavailable.

A Storage Technology Corporation (STK) Automated Cartridge System Library System (ACSLS), consisting of four Library Storage Modules (LSM), provides both Crays with access to approximately 10 terabytes of near-line storage, and gives one of the J916s access to up to an additional 50 terabytes of near-line storage. This library was installed in August 1997. These devices support almost all of the tape processing accomplished through the Crays. One supported function is hierarchical data migration using Cray's Data Migration Facility (DMF) software, which provides the user community with 91 GB of online storage backed by 2.1 TB of near-line storage. The Automated Cartridge Libraries also manage the repository of 23,000 Cray tape reels.

2.2 Future Plans

NCEP procured an IBM SP computer system, which will replace the current Cray systems in early 2000. The contract was awarded in October 1998, and the system was accepted in June 1999. The Cray J916s will continue to be used for operational purposes until early 2000. Additional UNIX equipment is planned to replace the data-ingest workstations and the Supervisor Monitor Scheduler (SMS) workstations.
3. Observational Data Ingest and Access System

3.1 Status at the End of 1999

3.1.1 Observational Data Ingest

NCEP receives the majority of its data from the Global Telecommunications System (GTS), the NOAA Environmental Satellite, Data, and Information Service (NESDIS), and aviation data circuits. Table 3 contains a summary of the types and amounts of data available to NCEP's global data assimilation system during January 2000. The GTS and aviation-circuit bulletins are transferred from the NWS Telecommunications Gateway to NCEP Central Operations (NCO) over two 56 kbps lines. Each circuit is interfaced through an X.25 PAD connected to a PC running the Linux operating system, with software that accumulates the incoming data stream in files. Each file is open for 20 seconds, after which the closed file is queued to the Distributive Brokered Network (DBNet) server for distributed processing. Files containing GTS observational data are networked to one of two Silicon Graphics Origin 200 workstations. There the data-stream file is parsed for bulletins, which are then passed to the Local Data Manager (LDM). The LDM controls continuous processing by a bank of on-line decoders using a bulletin-header pattern-matching algorithm. Files containing GTS gridded data are parsed on the Linux PC, tagged by type for identification, and then transferred directly to the Cray J916s by DBNet. There, all data are stored in accumulating data files appropriate to the type of data. Some observational data and gridded data from other producers (e.g., satellite observations from NESDIS) are processed in batch mode on the Cray J916s as the data become available.
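The 20-second file-accumulation window described above can be sketched as follows. This is a simplified illustration only; `window_key` and `accumulate` are hypothetical names, not the operational ingest software.

```python
WINDOW_SECONDS = 20  # each ingest file stays open for 20 seconds

def window_key(arrival_time: float) -> int:
    """Map a bulletin's arrival time (in seconds) to its 20-second window."""
    return int(arrival_time // WINDOW_SECONDS)

def accumulate(bulletins):
    """Group (arrival_time, payload) bulletins into per-window files.

    Returns {window key: [payloads]}; in the real system each closed
    window's file would be queued to the DBNet server for transfer.
    """
    files = {}
    for t, payload in bulletins:
        files.setdefault(window_key(t), []).append(payload)
    return files
```

Bulletins arriving within the same 20-second span land in the same file; the first bulletin after the window boundary starts a new one.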
Table 3. Summary of data used in NCEP's global data assimilation system (GDAS). Data counts are averages for January 2000.
3.1.2 Decoder Processing

The decoder software divides processing into two independent parts: an observation parser and an application encoder. In between, a common data interface is used so that different encoding software can be conveniently interchanged to meet the requirements of different applications. The two primary data representation forms used by application software at NCO are the World Meteorological Organization (WMO) Binary Universal Form for the Representation of meteorological data (BUFR), for numerical weather prediction (NWP) modeling needs, and the GEneral Meteorological PAcKage (GEMPAK) form, for interactive forecasting needs. Both are flexible, compact, self-defining data representation forms. The same observation-parser software can produce both forms through the common in-memory interface and the appropriate application encoding software.

3.1.3 NWP Database Ingest

The observational decoders run in parallel, parsing observations and encoding them into the WMO BUFR data representation form. Each decoder stores its encoded observations in memory until the array reaches 10,000 bytes, the decoder's observation type or subtype changes, or the end of a bulletin is reached. The contents of the decoder's old array are then saved in a file on disk, and a new array is acquired in memory. Special binding software manages the decoder files so that a file accumulates encoded observations until the file's time window has expired. Once a new file is automatically opened, the old file is available for transfer. This file-aging technique allows the decoding of new observations and the transfer of decoded observations to proceed in parallel without fracturing a file or an observation. Files are aged for two minutes, so there is an average one-minute delay in the availability of an observation after it has been decoded.
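The decoder's in-memory accumulation rule can be sketched as below. This is an illustrative toy, not the operational code; the class name and the list standing in for the aged disk file are assumptions.

```python
FLUSH_BYTES = 10_000  # flush the array once it reaches 10,000 bytes

class DecoderBuffer:
    """Sketch of the decoder accumulation rule: encoded observations
    collect in an in-memory array until it reaches 10,000 bytes or the
    message type/subtype changes; the old array is then written out
    (here, appended to `flushed`) and a fresh array is started."""

    def __init__(self):
        self.buf = bytearray()
        self.key = None      # current (message type, subtype)
        self.flushed = []    # stands in for writes to the aging file

    def add(self, msg_type, subtype, payload: bytes):
        key = (msg_type, subtype)
        if self.buf and (key != self.key or len(self.buf) >= FLUSH_BYTES):
            self.flush()
        self.key = key
        self.buf.extend(payload)

    def flush(self):
        if self.buf:
            self.flushed.append(bytes(self.buf))
            self.buf = bytearray()
```

A type or subtype change forces the pending array to disk before the new observations are buffered, so no observation is split across files.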
Aged files from all decoders are accumulated into a single file before being transferred to the Cray J916s. The automatic DBNet transfer process triggers the release of a job on the Cray J916s that parses each message in the BUFR file by type, subtype, and date/time information, opens the appropriate standard UNIX sequential file, and appends the message at the end of that file. The J916 BUFR database consists of UNIX subdirectories and files under an arbitrary "database root" directory, which facilitates parallel testing and recovery on other platforms. Each file is described by the UNIX path/filename convention "yyyymmdd/bmmm/xxsss", where "yyyymmdd" is the date during which each observation is valid, "mmm" is the message type, and "sss" is the message subtype. Each file contains all BUFR messages with a particular message type and subtype valid on a particular day. Observational files remain on-line for several days before migration to off-line cartridges. During the on-line period, they remain openly accessible for accumulating late-arriving observations and for research and study.

3.1.4 Data Access

The process of accessing the observational database and retrieving a particular set of data is accomplished in several stages by a number of FORTRAN codes. This process is run operationally many times a day to assemble data for model assimilation and dissemination. The script that manages the retrieval of observations provides users with a wide range of options, including observational date/time windows, specification of geographic regions, data specification and combination, duplicate checking and part merging, and parallel processing. The primary retrieval code (DUMPMD) performs the initial stage of all data dumping by retrieving subsets of the database that contain all the database messages valid for the time window requested by a user. DUMPMD looks only at the date in BUFR Section 1 to determine which messages to copy.
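The "yyyymmdd/bmmm/xxsss" convention above can be expressed as a small helper. This is a hypothetical sketch; it assumes the "b" and "xx" prefixes in the convention are literal characters of the directory and file names.

```python
from datetime import datetime

def bufr_path(db_root: str, valid: datetime, msg_type: str, subtype: str) -> str:
    """Build the "yyyymmdd/bmmm/xxsss" path used by the J916 BUFR
    database, relative to the arbitrary database root directory."""
    return f"{db_root}/{valid:%Y%m%d}/b{msg_type}/xx{subtype}"
```

Because the root is arbitrary, pointing `db_root` at a test area reproduces the full layout for parallel testing or recovery on another platform.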
Selecting on the Section 1 date alone may yield a set containing more data than was requested, but it allows DUMPMD to function very efficiently. A final winnowing of the data to the exact time window requested is done by the duplicate-checking and merging codes applied in the second stage of the process. Finally, manual quality marks are applied to the extracted data. The quality marks are provided by two NCEP groups: the NCO Senior Duty Meteorologists (SDMs) and the Marine Prediction Center (MPC).

3.2 Future Plans

Several major changes are anticipated for the observational ingest system in 2000. The first involves moving the database ingest and data access from the Cray J916s to the IBM RS/6000 SP. The second involves migrating the observational data-ingest processing from the SGI Origin 200s to the IBM RS/6000 SP. The benefits of these changes are faster processing and quicker recovery from outages.
4. Quality Control System

4.1 Status at the End of 1999

Quality control (QC) of data is performed at NCEP, but the quality-controlled data are not disseminated on the GTS. However, QC information is included in various monthly reports disseminated by NCEP. The data quality control system for numerical weather prediction at NCEP operates in two phases, interactive and automated, and the nature of the quality control procedure differs somewhat between the two.

4.1.1 Interactive Phase

During the first phase, interactive quality control is performed by the MPC and the SDMs. MPC personnel use an interactive system that evaluates the quality of the marine surface data provided by buoys (drifting and stationary) and ships, based on comparisons with the model's first guess, the provider's track, and a history file for the observation provider. MPC personnel can flag the data for quality, and the flags are stored in a file on the mainframe for use during the assimilation phase. The SDM performs a similar quality assessment for radiosonde temperature, wind, and height data and for aircraft temperature and wind reports. The SDMs use an interactive program that initiates "off-line" runs of two of the automated quality control programs (described in the next section) and review the programs' decisions before making additional quality assessment decisions or negating existing ones. The SDMs use satellite pictures, meteorological graphics, continuity of data, past station performance, and horizontal data comparisons or "buddy checks" to decide whether or not to override automatic data QC flags.

4.1.2 Automated Phase

In the automated phase, the first step is to include any manual quality marks attached to the data by MPC personnel and the SDMs. This occurs when time-windowed BUFR data dump files are created from the NCEP BUFR observational database.
Next, the preprocessing program (PREPDATA) makes some simple quality control decisions to handle special problems and re-codes the data into a special BUFR format with descriptors that carry and track quality control changes. In the process, certain classes of data, e.g., surface marine reports over land and vertical-azimuth Doppler radar winds, are flagged for non-use in the assimilation but are retained for monitoring purposes. A subsequent program (PREVENTS) attaches the first-guess background values and observational errors to the observations. Under special conditions (e.g., data too far below the model surface), observations are flagged for non-use by the assimilation.

Separate automated quality control algorithms for radiosonde, non-automated aircraft, and wind profiler reports are run next. The purpose of these algorithms is to eliminate or correct erroneous observations that arise from location, transcription, or communications errors. Attempts are made, when appropriate, to correct commonly occurring types of errors. Radiosonde temperature and height data pass through the Complex Quality Control of Heights and Temperatures (CQCHT) program (Gandin, 1989), which makes extensive hydrostatic, baseline, and horizontal and vertical consistency checks based upon differences from the 6-hour forecast. Corrections and quality values are then applied to the radiosonde data. In April 1997, a new CQCHT algorithm was installed that performs the quality control for all levels as a whole, rather than considering the mandatory levels first and then the significant levels. In addition, an improvement was made to the way in which the hydrostatic residuals are calculated and used (Collins, 1998). AIREP, PIREP, and AMDAR aircraft reports are also quality controlled through a track-checking procedure by the Aircraft Quality Control (ACQC) program.
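The hydrostatic consistency check at the heart of CQCHT can be illustrated with the hypsometric equation. This is a generic sketch under simplifying assumptions (a plain layer-mean temperature rather than virtual temperature, and no forecast differences); the function name is illustrative, not the operational algorithm.

```python
import math

R_DRY = 287.05   # gas constant for dry air, J kg^-1 K^-1
G = 9.80665      # standard gravity, m s^-2

def hydrostatic_residual(p_lo, z_lo, t_lo, p_hi, z_hi, t_hi):
    """Difference (m) between a layer's reported thickness and the
    hypsometric thickness implied by its mean temperature.

    p in hPa, z in m, t in K; _lo is the lower (higher-pressure) level.
    A large residual flags a likely height or temperature error.
    """
    t_mean = 0.5 * (t_lo + t_hi)                      # simple layer mean
    expected = (R_DRY / G) * t_mean * math.log(p_lo / p_hi)
    return (z_hi - z_lo) - expected
```

A consistent sounding yields a residual near zero; a gross height error of, say, 1000 m at the upper level shows up almost directly in the residual.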
Within ACQC, AIREP and PIREP reports are additionally quality controlled in two ways: isolated reports are compared to the first guess, and groups of reports in close geographical proximity are inter-compared. Both CQCHT and ACQC are also run "off-line" by the SDM. Finally, wind profiler reports are quality controlled with a complex quality control program using multiple checks based on differences from a 6-hour forecast, including a height-time check.

The final part of the assimilation checks all data types with an optimum interpolation based quality control (OIQC) algorithm, which uses the results of both phases of quality control. As with other complex quality control procedures, this program operates in a parallel rather than a serial mode: a number of independent checks (horizontal, vertical, geostrophic) are performed using all admitted observations, and each observation is subjected to an optimum interpolation formalism using all observations but itself in each check. A final quality decision (keep, toss, or reduced confidence weight) is made based on the results from all individual checks and any manual quality marks attached to the data by the duty personnel. Results from all the checks are kept in an annotated observational database.

4.2 Future Plans

A review of the quality control system will be conducted in 2000 with the purpose of planning upgrades to various parts of the system. One part of this review will investigate the methodology for including quality control functions within the three-dimensional variational (3DVAR) analysis. This should lead to a method that unifies the analysis and the quality control and that is applicable to all data types. If successful, a separate automated quality control step for all data will no longer be needed, although separate QC for radiosonde heights and temperatures, aircraft, and profiler data will remain useful.
A separate quality control procedure for vertical-azimuth display wind reports from the NEXRAD system will be implemented in spring 2000.
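The leave-one-out comparison at the core of the OIQC step described in section 4.1.2 can be illustrated with a toy buddy check. Inverse-distance weights stand in for the full optimum interpolation formalism, and all names and thresholds are illustrative assumptions.

```python
import math

def buddy_check(obs, tol):
    """Leave-one-out check in the spirit of OIQC.

    Each observation is compared against an inverse-distance-weighted
    estimate built from every observation except itself.  `obs` is a
    list of (x, y, value) tuples; returns a parallel list of
    'keep'/'toss' decisions (the real OIQC also allows reduced weight).
    """
    decisions = []
    for i, (xi, yi, vi) in enumerate(obs):
        wsum = vsum = 0.0
        for j, (xj, yj, vj) in enumerate(obs):
            if j == i:
                continue
            w = 1.0 / (math.hypot(xi - xj, yi - yj) + 1e-6)
            wsum += w
            vsum += w * vj
        estimate = vsum / wsum
        decisions.append("keep" if abs(vi - estimate) <= tol else "toss")
    return decisions
```

Because every check uses all observations but the one being tested, one bad value cannot shield itself, which is the point of running the checks in parallel rather than serially.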
5. Monitoring System

5.1 Status at the End of 1999

5.1.1 Real-time Monitoring

As mentioned in the previous sections, "real-time" monitoring of the incoming GTS and satellite data is performed by a number of computer programs that either run automatically before each assimilation or are run interactively by the NCEP Central Operations SDMs, and that provide information on possible action. If observational types or geographic areas are devoid of data, the SDM will request assistance from the Washington DC Regional Telecommunications Hub (RTH) in obtaining the observations. The SDM may also hold up the start of an NWP model run to ensure that sufficient data are available. Four times a day, a web site is updated with reports on what data have been received from US-supported upper-air sites.

5.1.2 Next-day Monitoring

"Next-day" data assessment monitoring is accomplished by routinely running a variety of diagnostics on the previous day's output from the decoders, the operational quality control programs, and the NWP analyses to detect problems. When problems are detected, steps are taken to determine their origin, to delineate possible solutions or improvements, and to contact the appropriate data providers if the cause is an encoding or instrument problem.

5.1.3 Delayed-time Monitoring

"Delayed-time" monitoring includes a twice-weekly automated review of our production reject list and monthly reports, prepared in accordance with WMO/CBS requirements, on the quantity and quality of data; these reports are shared with other GDPS centers. A monthly report is prepared showing the quality, quantity, and timeliness of US-supported sites. Monthly statistics on hydrostatic checks and guess values of station pressure are used to help find elevation or barometric problems at upper-air sites. This monitoring system includes statistics on meteorological data that can be used for maintaining our reject list and for contacting sites with problems.
5.2 Future Plans

The operational capability to identify current upper-air reports that are in fact duplicates of old data, and to track-check ACARS aircraft data, will be improved. New software will be developed to automatically diagnose deficiencies in the numbers of reports within various data categories and subcategories and to alert the SDMs to those deficiencies. New procedures and software will be added to improve real-time monitoring.
6. Forecasting System

6.1 Global Forecasting System

6.1.1 Status of the Global Forecasting System at the End of 1999

Global Forecast System Configuration: The global forecasting system consists of:
Global Data Assimilation System: Global data assimilation for the FNL and AVN is done with a multivariate Spectral Statistical Interpolation (SSI). This is a 3-dimensional variational technique in which a linear balance constraint is incorporated, eliminating the need for a separate initialization step. The analyzed variables are the associated Legendre spectral coefficients of temperature, vorticity, divergence, water vapor mixing ratio, and the natural logarithm of surface pressure (lnPsfc). All global analyses are done on 28 sigma levels at T126 spectral truncation. Two new data sources added in 1999 were radiances from the NOAA-15 Advanced Microwave Sounding Unit (AMSU-A) and the High Resolution Infrared Radiation Sounder (HIRS/3). Data cut-off times are 0600, 0930, 2100, and 2130 UTC for the 0000, 0600, 1200, and 1800 UTC FNL analyses, respectively, and 0245, 0845, 1445, and 2045 UTC for the 0000, 0600, 1200, and 1800 UTC AVN analyses.

Global Forecast Model: The global forecast model (Sela 1980, 1982) has the associated Legendre coefficients of lnPsfc, temperature, vorticity, divergence, and water vapor mixing ratio as its prognostic variables. The vertical domain includes the entire depth of the atmosphere and is discretized with 28 sigma layers. The Legendre series for all variables are truncated at either T126 (FNL, AVN, and the first seven days of the MRF) or T62 (MRF for days 8 through 16, and the ensemble forecasts) triangular truncation. A semi-implicit time integration scheme is used. The model includes a full set of parameterizations for physical processes, including moist convection, cloud-radiation interactions, stability-dependent vertical diffusion, evaporation of falling rain, similarity-theory-derived boundary layer processes, land surface vegetation effects, surface hydrology, and horizontal diffusion. See the references in Kalnay et al. (1994) for details.

Global Forecast System Products: Products from the global system include:
d) A large number of graphic products.

6.1.2 Future Plans for the Global Forecasting System

Near-term changes planned for the production suite include:
6.2 Regional Forecast System

6.2.1 Status of the Regional Forecasting Systems at the End of 1999

Regional Forecast System Configuration: The regional systems are:
Regional Forecast System Data Assimilation: Initial conditions for the four Meso Eta forecasts are produced by a multivariate 3-dimensional variational (3DVAR) analysis which uses as its first guess a 3-hour Meso Eta forecast from the Eta-based Data Assimilation System (EDAS; Rogers et al., 1996). The EDAS is a fully cycled system using 3-hour Meso Eta forecasts as a background and global fields only for lateral boundary conditions. Data cut-off is 1 hour and 10 minutes past the nominal analysis times. No initialization is applied.

Initial conditions for the RUC model are provided by an optimum interpolation methodology that analyzes directly on the model grid points and on its hybrid sigma-theta vertical coordinate surfaces. Hourly data cut-off times are 18 minutes, with special dumps at 55 minutes after 0000 and 1200 UTC. A digital filter initialization is applied.

Until the fire in NCEP's operational Cray C-90 computer, initial conditions for the twice-daily NGM forecasts came from a hemispheric optimum interpolation analysis which used as its first guess a 3-hour NGM forecast from the Regional Data Assimilation System (RDAS). The RDAS performed 3-hour updates during a 12-hour pre-forecast period but restarted from the global fields every 12 hours, and so was not a fully cycled system. Data cut-off times were 2 hours past the synoptic time. An implicit normal mode initialization was used. With the loss of the C-90 computer in September 1999, the remaining operational computers no longer provided the resources needed to run the RDAS, so initial conditions for the NGM are now provided by a static optimum interpolation analysis using the 6-hour GDAS forecast as a first guess.

Regional Forecast System Models: The Mesoscale Eta forecast model (Black et al., 1993 and 1994; Mesinger et al., 1988) has surface pressure, temperature, u, v, turbulent kinetic energy, water vapor mixing ratio, and cloud water/ice mixing ratio as its prognostic variables.
The vertical domain is discretized with 45 eta layers, with the top of the model currently set at 25 mb. The horizontal domain is a 32-km semi-staggered Arakawa E-grid covering all of North America. An Euler-backward time integration scheme is used. The model is built on carefully constructed dynamics and numerics (Janjic 1974, 1979, 1984; Mesinger 1973, 1977) and a step-mountain terrain representation (Mesinger 1984), and includes a full set of parameterizations for physical processes, including Janjic (1994) modified Betts-Miller convection, Mellor-Yamada turbulent exchange, Fels-Schwarzkopf (1975) radiation, a land surface scheme with 4 soil layers (Chen et al. 1996), and a predictive cloud scheme (Zhao and Carr 1997; Zhao et al. 1997). The lateral boundary conditions are derived from the prior global AVN forecast at a 3-hour frequency.

The RUC system was developed by the NOAA Forecast Systems Laboratory under the name Mesoscale Analysis and Prediction System (MAPS) (Benjamin et al., 1991). The RUC run provides high-frequency, short-term forecasts on a 40-km resolution domain covering the lower 48 United States and adjacent areas of Canada, Mexico, and ocean. Run with a data cutoff of 18 minutes, the analysis relies heavily on asynoptic data from surface reports, profilers, and especially ACARS aircraft data. One of its unique aspects is its use of a hybrid vertical coordinate that is primarily isentropic. Most of its 40 levels are isentropic, except for layers in the lowest 1-2 km of the atmosphere where terrain-following coordinates are used; the two types of surfaces blend smoothly into one another. A full package of physics is included, with 5 cloud/precipitation species carried as history variables of the model.

The NGM model (Phillips, 1979) uses a flux formulation of the primitive equations and has surface pressure, P, and Pu, Pv, PQ, and Pq as prognostic variables, where Q is potential temperature and q is specific humidity.
The finest of the nested grids has a resolution of 85 km at 45°N and covers North America and the surrounding oceans. The coarser hemispheric grid has a resolution of 170 km. Fourth-order horizontal differencing and a Lax-Wendroff time integration scheme are used. Vertical discretization is done using 16 sigma levels. Parameterized physical processes include surface friction, grid-scale precipitation, dry and moist convection, and vertical diffusion.

Regional Forecast System Products: Products from the various regional systems include:
Operational Techniques for Application of Regional Forecast System Products: Model Output Statistics (MOS) forecasts of an assortment of weather parameters, such as probability of precipitation, maximum and minimum temperatures, and indicators of possible severe convection, are generated from NGM model output. These forecasts are made using regression techniques based on statistics from many years of NGM forecasts.

6.2.2 Future Plans for the Regional Forecast System

Near-term plans for the Regional Forecasting System are:
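The regression behind the MOS products described above can be illustrated with a toy single-predictor least-squares fit. This is purely illustrative: the operational MOS screens many predictors over multi-year forecast samples, and all names here are assumptions.

```python
def fit_mos(predictors, predictand):
    """Ordinary least-squares fit of a one-predictor MOS-style equation,
    y = a + b * x, via the normal equations.

    `predictors` are archived model-output values; `predictand` are the
    matched observed values of the forecast weather element.
    """
    n = len(predictors)
    mx = sum(predictors) / n
    my = sum(predictand) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(predictors, predictand))
    sxx = sum((x - mx) ** 2 for x in predictors)
    b = sxy / sxx          # regression slope
    a = my - b * mx        # intercept
    return a, b
```

Once fitted on the historical sample, the equation is applied to each new model run's output to produce the station forecast.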
6.3 Specialized Forecasts

Specialized forecasts and systems include the following:
6.4 Verification of Forecast Products - 1999

Annual verification statistics are calculated for NCEP's global models by comparing the model forecast to the verifying analysis and to the interpolated, closest verifying radiosonde (see Tables 4 and 5).

Table 4. Verification Against Analyses for 1999.
Table 5. Verification Against Radiosondes for 1999.
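The bulk scores behind verification tables of this kind are typically the mean error (bias) and root-mean-square error over matched forecast/observation pairs; a generic sketch (function name illustrative):

```python
import math

def bias_and_rmse(forecasts, observations):
    """Mean error (bias) and root-mean-square error of a set of
    forecasts against their matched verifying values (analysis grid
    points or collocated radiosondes)."""
    diffs = [f - o for f, o in zip(forecasts, observations)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse
```

Bias exposes systematic model drift, while RMSE measures the total error magnitude; compensating errors can give near-zero bias with a large RMSE.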
7. REFERENCES

Benjamin, S. G., K. A. Brewster, R. Brummer, B. F. Jewett, T. W. Schlatter, T. L. Smith, and P. A. Stamus, 1991: An isentropic three-hourly data assimilation system using ACARS aircraft observations. Mon. Wea. Rev., 119, 888-906.

Black, T. L., et al., 1993: The step-mountain eta coordinate model: 80-km "early" version and objective verifications. NWS Technical Procedures Bulletin No. 412, NOAA, U.S. Department of Commerce.

Black, T. L., 1994: The new NMC Mesoscale Eta Model: Description and forecast examples. Wea. Forecasting, 9, 265-278.

Chen, F., K. Mitchell, J. Schaake, Y. Xue, H.-L. Pan, V. Koren, Q. Y. Duan, M. Ek, and A. Betts, 1996: Modeling of land surface evaporation by four schemes and comparison with FIFE observations. J. Geophys. Res., 101, 7251-7268.

Collins, W. G., 1998: The new complex quality control of rawinsonde heights and temperatures at the National Centers for Environmental Prediction. WMO CAS/JSC WGNE Research Activities in Atmospheric and Oceanic Modelling, Report No. 27, 1.20.

Fels, S. B., and M. D. Schwarzkopf, 1975: The simplified exchange approximation: A new method for radiative transfer calculations. J. Atmos. Sci., 32, 1475-1488.

Gandin, L. S., 1989: Complex quality control of meteorological observations. Mon. Wea. Rev., 116, 1137-1156.

Janjic, Z. I., 1974: A stable centered difference scheme free of two-grid-interval noise. Mon. Wea. Rev., 102, 319-323.

Janjic, Z. I., 1979: Forward-backward scheme modified to prevent two-grid-interval noise and its application in sigma coordinate models. Contrib. Atmos. Phys., 52, 69-84.

Janjic, Z. I., 1984: Non-linear advection schemes and energy cascade on semi-staggered grids. Mon. Wea. Rev., 112, 1234-1245.

Janjic, Z. I., 1994: The step-mountain Eta coordinate model: Further developments of the convection, viscous sublayer, and turbulence closure schemes. Mon. Wea. Rev., 122, 927-945.

Kalnay, E., G. DiMego, S. Lord, M. Kanamitsu, A. Leetmaa, and D. B. Rao, 1994: NMC modeling and data assimilation plans for 1994-1998. Preprint Volume, 10th Conference on Numerical Weather Prediction, Portland, OR, July 1994, AMS, 143-148.

Long, C. S., A. J. Miller, H.-T. Lee, J. D. Wild, R. Przywarty, and D. Hufford, 1996: Ultraviolet index forecasts issued by the National Weather Service. Bull. Amer. Meteor. Soc., 77, 729-748.

Mesinger, F., 1973: A method for construction of second-order accuracy difference schemes permitting no false two-grid interval wave in the height field. Tellus, 25, 444-458.

Mesinger, F., 1977: Forward-backward scheme, and its use in a limited area model. Contrib. Atmos. Phys., 50, 200-210.

Mesinger, F., Z. I. Janjic, S. Nickovic, D. Gavrilov, and D. G. Deaven, 1988: The step-mountain coordinate: Model description and performance for cases of Alpine lee cyclogenesis and for a case of an Appalachian redevelopment. Mon. Wea. Rev., 116, 1493-1518.

Phillips, N. A., 1979: The Nested Grid Model. NOAA Technical Report NWS 22, Silver Spring, MD, 80 pp.

Sela, J. G., 1980: Spectral modeling at NMC. Mon. Wea. Rev., 108, 1279-1292.

Sela, J. G., 1982: The NMC spectral model. NOAA Technical Report NWS 30, Silver Spring, MD, 36 pp.

WAMDI Group, 1988: The WAM model - a third generation ocean wave prediction model. J. Phys. Oceanogr., 18, 1775-1810.

Zhao, Q., and F. H. Carr, 1997: A prognostic cloud scheme for operational NWP models. Mon. Wea. Rev., 125, 1931-1953.

Zhao, Q., T. L. Black, and M. E. Baldwin, 1997: Implementation of the cloud prediction scheme in the Eta model at NCEP. Wea. Forecasting, 12, 697-712.