ANNEXES

Annex I

Annex to paragraph 4.3.6 of the general summary

INSTRUMENT CATALOGUE

The Commission agreed that the content and format of the Instrument Catalogue should be based on the following example:

Questionnaire to manufacturers of meteorological instruments and observing systems

Please provide the following information where applicable:

Name of Manufacturer:

Address: [street or P.O. Box]

[city, state]

[postcode]

[country]

Telephone: +

Fax: +

Telex:

E-mail:

Internet URL: http://...

If you are manufacturing instruments or observing systems for measurement of one or more of the following variables, please "tick" the appropriate boxes:

Item¹ | Variable/system | x | Name/type of instrument or sensor
I.2 | Temperature | |
I.3 | Pressure | |
I.4 | Humidity | |
I.5 | Surface wind | |
I.6 | Precipitation | |
I.7 | Radiation | |
I.8 | Sunshine duration | |
I.9 | Visibility | |
I.10 | Evaporation | |
I.11 | Soil moisture | |
I.12 | Upper-air pressure, temperature, humidity | |
I.13 | Upper wind | |
I.14 | Present and past weather and state of the ground | |
I.15 | Clouds | |
I.16 | Ozone | |
II.1 | Automatic weather stations | |
II.2 | Aeronautical meteorological stations | |
II.5 | Profiling techniques (for the boundary layer and troposphere) | |
II.6 | Rocket measurements (in the stratosphere and mesosphere) | |
II.7 | Locating the sources of atmospherics | |
II.8 | Satellite observations | |
II.9 | Radar measurements | |
II.10 | Balloon techniques | |
III.1 | Data acquisition techniques | |
III.2 | Quality control and calibration | |

Place: ............................... Date: .................... Signature: .............................................

1 The numbering applied is in conformity with the parts and chapters of the sixth edition of the Guide to Meteorological Instruments and Methods of Observation (WMO-No. 8).

Example

WMO Instrument Catalogue Information Sheet²

Variable: Radiation
Instrument: Light bulb
Manufacturer: Bulbs Inc., xxx
Type: FRT/1997
Catalogue No.: x.y.z
Updated: ... 1998

- Example only -

General information

1. Principle of operation³ (e.g. platinum resistance thermometer):

2. Main technical characteristics

2.1 Application:

2.2 Measuring range:

2.3 Uncertainty:

2.4 Time constant:

2.5 Averaging time:

2.6 Reliability

2.6.1 Mean time between failures:

2.6.2 Calibration and maintenance interval:

2.7 Interface and output details:

2.8 Power requirements:

2.9 Quantitative information on the operating environment (e.g. temperature, humidity, wind conditions, or suitability for severe weather conditions such as icing, sand storms or air pollution):

Experiences and other information

3. Intercomparisons and tests performed:

4. Costs

4.1 Recommended price:

4.2 Annual operating costs:

5. Name, address, etc. of the manufacturer:

Tel.:

Fax:

Telex:

E-mail:

URL (Internet server): http://

6. References, patents and registered marks information:

7. Information on standard references (ISO, ASTM, IEC, CEN, DIN):

________

2 The World Meteorological Organization (WMO) does not accept responsibility for the information contained in this sheet.

3 See Guide to Meteorological Instruments and Methods of Observation.
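By way of illustration only, the fields of such an information sheet lend themselves to a simple structured record, which would ease compilation and searching of the catalogue. The sketch below is a minimal example; the field names and the record layout are assumptions chosen to mirror the headings of the sheet above, not an agreed WMO format.

```python
# Minimal sketch of a structured catalogue record (field names are assumptions
# mirroring the information sheet headings, not an agreed WMO format).
from dataclasses import dataclass

@dataclass
class InformationSheet:
    variable: str                 # e.g. "Radiation"
    instrument: str               # name/type of instrument or sensor
    manufacturer: str
    model_type: str               # e.g. "FRT/1997"
    catalogue_no: str
    updated: str
    principle_of_operation: str   # e.g. "platinum resistance thermometer"
    measuring_range: str = ""
    uncertainty: str = ""
    time_constant: str = ""
    mean_time_between_failures: str = ""
    calibration_interval: str = ""
    power_requirements: str = ""
    standards: str = ""           # ISO, ASTM, IEC, CEN, DIN references

# The fictitious example sheet above, expressed as such a record
example = InformationSheet(
    variable="Radiation",
    instrument="Light bulb",
    manufacturer="Bulbs Inc., xxx",
    model_type="FRT/1997",
    catalogue_no="x.y.z",
    updated="1998",
    principle_of_operation="(example only)",
)
```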

 


Annex II

Annex to paragraph 5.1.8 of the general summary

GUIDELINES FOR ORGANIZING RADIOSONDE INTERCOMPARISONS AND FOR THE ESTABLISHMENT OF TEST SITES

PART I — GUIDELINES FOR ORGANIZING RADIOSONDE INTERCOMPARISONS

 

1. Introduction

1.1 These guidelines assume that procedures that may be established by various test facilities are consistent with procedures established by other national and international organizations. They also assume that an Organizing Committee (OC) will be formed of participants (Members) interested in comparing radiosondes and that it will include at least one non-participant able to provide guidance on conducting the intercomparison. The involvement of an independent non-participant is important in order to avoid bias during the planning of the intercomparison. Consideration must also be given to whether radiosonde manufacturers’ personnel should actively participate or whether independent operational personnel of the host should prepare and fly such radiosondes.

1.2 All intercomparisons differ from each other to some extent; these guidelines should therefore be construed only as a generalized checklist of the tasks to be accomplished. Modifications should be made by the OC, as required, but the validity of the results and the scientific evaluation should not be compromised.

1.3 Final reports of previous intercomparisons and organizational meeting reports of other OCs may serve as an example of the methods that can be adopted for the intercomparison. These previous reports should be maintained and made available by the Secretariat.

2. Objectives of intercomparisons

2.1 The intercomparison objectives must be clear, must state what is expected from the intercomparison and must identify how the results will be disseminated. The OC is tasked to examine the achievements to be expected from the radiosonde intercomparison and to identify and anticipate any potential problems. The OC’s role is to provide guidance, but it must also prepare clear and detailed statements of the main objectives and agree on the criteria to be used in evaluating the results. The OC should also determine how best to guarantee the success of the intercomparison by drawing on background knowledge and accumulated experience from previous intercomparisons.

3. Place, date and duration of intercomparison

3.1 The host facility should provide to the OC and to the participants a description of the proposed intercomparison site and facilities (locations, etc.), environmental and climatological conditions, and site topography. The host facility should also name a Project Leader (PL) or Project Manager who will be responsible for the day-to-day operation and act as the facility point of contact.

3.2 The OC should visit the proposed site to determine the suitability of its facilities and to propose changes, as necessary. After the OC agrees that the site and facilities are adequate, a site and environmental description should be prepared by the PL for distribution to the participants. The PL, who is familiar with his facility’s schedule, must decide the date for the start of the intercomparison, as well as its duration. A copy of this schedule shall be delivered to the OC.

3.3 In addition to the starting date of the intercomparison, the PL should propose a date when his facility will be available for the installation of the participants’ equipment and arrange for connections to the data acquisition system. Time should be allowed for all of the participants to check and test their equipment prior to starting the intercomparison, with additional time to familiarize the operators with the procedures of the host facility.

4. Participation

4.1 As required, the PL and/or OC should invite, through the Secretary-General of WMO, participation of Members. However, once participants are identified, the PL should handle all further contacts.

4.2 The PL should draft a detailed questionnaire to be sent by the Secretary-General to each participant in order to obtain information on each instrument type proposed to be intercompared. Participants are expected to provide information on their space, communication, unique hardware hookup requirements, and software characteristics. They also should provide adequate documentation describing their ground and balloon-borne instrumentation.

4.3 It is important that participants provide information about their radiosonde calibration procedures against recognized standards. Although it is expected that operational radiosondes will be intercompared, this may not always be the case; new or research-type radiosondes may be considered for participation with the agreement of all of the participants, the PL, and the OC.

5. Responsibilities

5.1 Participants

5.1.1 The participants shall be responsible for the transportation of their own equipment and costs associated with this transportation.

5.1.2 The participants should install and remove their own equipment with the cognizance of the PL. The host facility shall assist with unpacking and packing, as appropriate.

5.1.3 The participants shall provide all necessary accessories, mounting hardware for ground equipment, signal and power cables, spare parts and expendables unique to their system. The participants shall have available (in the event assistance from the host facility becomes necessary) detailed instructions and manuals needed for equipment installation, operation, maintenance and, if applicable, calibration.

5.2 Host facility

5.2.1 The host facility should assist participants in the unpacking and installation of equipment as necessary, and provide storage capability to house expendables, spare parts, manuals, etc.

5.2.2 The host facility should provide auxiliary equipment as necessary, if available.

5.2.3 The host facility should assist the participants with connections to the host facility’s data acquisition equipment, as necessary.

5.2.4 The host shall ensure that all legal obligations relating to upper-air measurements (e.g. the host country’s aviation regulations, frequency utilization, etc.) are properly met.

5.2.5 The host facility may provide information on accommodations, local transportation, daily logistics support, etc., but is not obligated to subsidize costs associated with personnel accommodations.

6. Rules during the intercomparison

6.1 The PL shall exercise control of all tests. He will keep a record of each balloon launch, together with all the relevant information on the radiosondes used in the flight and the weather conditions.

6.2 Changes in equipment or software will be permitted with the cognizance and concurrence of the PL. Notification to the other participants is necessary. The PL shall maintain a log containing a record of all the equipment participating in the comparison and any changes that occur.

6.3 Minor repairs (e.g. fuse replacement) not affecting instrumentation performance are allowed. The PL should be made aware of these minor repairs, and the information should also be entered in the record log.

6.4 Calibration checks and equipment servicing by participants requiring a specialist or specific equipment will be permitted after notification to the PL.

6.5 Any problem that compromises the intercomparison results or the performance of any equipment shall be addressed by the PL.

7. Data acquisition

7.1 The OC should agree on appropriate data acquisition procedures such as measurement frequency, sampling intervals, data averaging, data reduction (this may be limited to individual participants’ capabilities), data formats, real-time quality control, post-analysis quality control, data reports, etc. An illustrative sketch of such arrangements is given at the end of this section.

7.2 All data acquisition hardware and software provided by the host facility should be well tested before commencement of the intercomparison.

7.3 The time delay between observation and delivery of data to the PL shall be established by the PL and agreed on by the participants. One hour after the end of the observation (balloon burst) should be considered to be adequate.

7.4 The responsibility for checking data prior to analysis, the quality control steps to follow, and delivery of the final data rests with the PL.

7.5 The choice of data storage media shall be the PL’s decision, taking into consideration the capability of the host facility, but the media used to return final test data to participants may vary in accordance with each participant’s computing capability. The PL should be cognizant of these requirements.

7.6 The PL has responsibility for providing final data to all participants and, therefore, the host facility must be able to receive all individual data files from each participant.
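By way of illustration only, the agreed sampling interval, record format and real-time quality control referred to in paragraph 7.1 might be expressed as follows. The field names, sampling interval and plausibility limits in this sketch are assumptions to be settled by the OC, not an established WMO format.

```python
# Illustrative sketch of a per-sample flight record and a basic range check.
# Field names, sampling interval and plausibility limits are assumptions to be
# agreed by the Organizing Committee; this is not an established WMO format.
from dataclasses import dataclass

SAMPLING_INTERVAL_S = 2  # assumed sampling interval agreed for the test

@dataclass
class Sample:
    time_s: float                # seconds since launch
    pressure_hpa: float
    temperature_c: float
    relative_humidity_pct: float

# Crude plausibility limits used for a real-time quality-control flag
LIMITS = {
    "pressure_hpa": (1.0, 1100.0),
    "temperature_c": (-95.0, 50.0),
    "relative_humidity_pct": (0.0, 100.0),
}

def plausible(sample: Sample) -> bool:
    """Return True if every checked field lies within its crude limits."""
    return all(lo <= getattr(sample, name) <= hi
               for name, (lo, hi) in LIMITS.items())

# Example: an obviously bad relative-humidity reading is flagged
print(plausible(Sample(120.0, 850.0, 12.3, 45.0)))   # True
print(plausible(Sample(122.0, 850.0, 12.1, 140.0)))  # False
```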

8. Data processing and analysis

8.1 Data analysis

8.1.1 A framework for data analysis should be decided upon, preferably even before the actual intercomparison begins. This framework should be included as part of the experimental plan.

8.1.2 There must be agreement among the participants on the methods of data conversion, the calibration and correction algorithms, the terms and abbreviations, the constants to be used, and a comprehensive description of the proposed statistical analysis methods (a minimal illustration is sketched at the end of this subsection).

8.1.3 The OC should verify the appropriateness of the analysis procedures selected.

8.1.4 The results of the intercomparisons should be reviewed by the OC, who should consider the contents and recommendations given in the final report.
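The choice of statistical methods rests with the OC and the participants (see paragraph 8.1.2). As a minimal, purely illustrative sketch, the most elementary evaluation is the mean difference (bias) and standard deviation between a test radiosonde and the chosen reference at agreed pressure levels, aggregated over several flights; the sample values below are invented.

```python
# Illustrative only: paired temperature differences (test minus reference)
# at agreed pressure levels, aggregated over several flights. The sample
# values are invented and serve only to show the form of the calculation.
from statistics import mean, stdev

# {pressure level in hPa: list of test-minus-reference differences in K}
differences = {
    500.0: [0.1, -0.2, 0.3, 0.0, 0.2],
    100.0: [0.4, 0.6, 0.3, 0.5, 0.7],
    30.0:  [0.9, 1.1, 0.8, 1.2, 1.0],
}

for level, diffs in sorted(differences.items(), reverse=True):
    print(f"{level:6.1f} hPa  bias = {mean(diffs):+.2f} K  "
          f"std dev = {stdev(diffs):.2f} K  n = {len(diffs)}")
```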

8.2 Data processing and database availability

8.2.1 All essential meteorological and environmental data shall be stored in a database for further use and analysis by the participants. The PL shall exercise control of these data.

8.2.2 After completion of the intercomparison, the PL shall provide a complete set of all of the participants’ data to each participant.

9. Final report of the intercomparison

9.1 The PL shall prepare the draft final report, which shall be submitted to the OC and to the participating Members for their comments and amendments. A time limit for replies should be specified.

9.2 Comments and amendments should be returned to the PL with copies also going to the OC.

9.3 When the amended draft final report is ready, it should be submitted to the OC, who may wish to meet for discussions, if necessary, or who may agree to the final document.

9.4 After the OC approves the final document for publication, it should then be sent to the Secretariat for publication and distribution by WMO.

10. Final comments

10.1 The OC may agree that intermediate results may be presented only by the PL and that participants may present limited data at technical conferences, although participants may use their own test data without limitation. Once the WMO Secretariat has scheduled the final report for publication, WMO shall make the data available to all Members who request them. Members are then free to analyse the data and present the results at meetings and in publications.

 

PART II — GUIDELINES FOR THE ESTABLISHMENT OF TEST SITES

1. Introduction

1.1 In order to support the long-term stability of the global upper-air observing system, it is essential to retain the capability of performing quantitative radiosonde comparisons. Current and new operational radiosonde systems must be checked against references during flight on a regular basis. Members must ensure that a minimum number of test sites with the necessary infrastructure for performing radiosonde comparison tests are retained.

1.2 Experience with the series of WMO Radiosonde Intercomparisons since 1984 has shown that it is necessary to have a range of sites in order to compare the radiosondes over a variety of flight conditions.

1.3 Relative humidity sensor performance is particularly dependent on the conditions during a test, e.g. the amount of cloud and rain encountered during ascents, or whether surface humidity is high or low.

1.4 Daytime temperature errors depend on the reflected (backscattered) solar radiation, and hence on the surface albedo and cloud cover. Thus, temperature errors found at coastal sites may differ significantly from those found at continental sites. Infrared errors of temperature sensors will depend not only on surface conditions and cloud distribution, but also on atmospheric temperature. Thus, infrared temperature errors in the tropics (for instance, near the tropopause) will be quite different from those at mid-latitudes.

1.5 The errors of many upper-wind observing systems depend on the distance the balloon travels from the launch site (and also on the elevation of the balloon as seen from the launch site). Thus, comparison tests must cover situations with both weak and strong upper winds.

2. Facilities required at locations

2.1 Locations suitable for testing should have enough buildings/office space to provide work areas to support the operations of at least four different systems.

2.2 The site should have good-quality surface measurements of temperature, relative humidity, pressure and wind, made near the radiosonde launch sites. Additional reference-quality measurements of temperature, pressure and relative humidity would be beneficial.

2.3 The test site should have a method of providing absolute measurements of geopotential height during test flights (either a tracking radar or a Global Positioning System (GPS) radiosonde capable of producing accurate heights); a minimal sketch of such a height cross-check is given at the end of this section.

2.4 Supplementary observing systems, such as laser ceilometers, aerosol lidars, relative humidity lidars, ground-based radiometers and interferometers, may also prove useful.

2.5 The site must be cleared by the national air traffic control authorities for launching larger balloons (3000 g) with payloads of up to 5 kg. Balloon sheds must be able to cope with launching these large balloons.
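As a purely illustrative sketch of the height cross-check mentioned in paragraph 2.3: geopotential heights derived from the radiosonde pressure and temperature profile (here by a simple dry-air hypsometric integration over an invented sounding, ignoring humidity and assuming a station elevation of 0 m) can be compared level by level with the independent radar or GPS heights to reveal any systematic height or pressure bias.

```python
# Illustrative sketch only: deriving geopotential heights from an invented
# pressure/temperature sounding with the hypsometric equation, for comparison
# against independent radar or GPS heights at the same levels.
import math

RD = 287.06   # gas constant for dry air, J kg-1 K-1
G0 = 9.80665  # standard gravity, m s-2

def layer_thickness(p_bottom, p_top, t_mean):
    """Geopotential thickness (m) of a layer, dry-air hypsometric equation."""
    return (RD * t_mean / G0) * math.log(p_bottom / p_top)

# Hypothetical sounding: (pressure in hPa, temperature in K)
levels = [(1000.0, 288.0), (850.0, 280.0), (700.0, 270.0), (500.0, 252.0)]

z = 0.0  # assumed geopotential height of the lowest level, m
for (p1, t1), (p2, t2) in zip(levels, levels[1:]):
    z += layer_thickness(p1, p2, 0.5 * (t1 + t2))
    print(f"{p2:6.1f} hPa  ->  {z:7.1f} m")
# These derived heights would be compared with the radar or GPS heights for
# the same levels; an operational check would use virtual temperature.
```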

3. Suggested geographical locations

3.1 In order to facilitate testing by the main manufacturers, it is suggested that test sites should be retained or established in mid-latitudes in North America, Europe and Asia. Ideally, each of these regions would have a minimum of two sites, one representing coastal (marine) conditions, and another representing conditions in a mid-continent location.

3.2 In addition, it is suggested that a minimum of two test locations should be identified in tropical locations, particularly for tests of relative humidity sensors.

3.3 If the main test sites noted above do not provide adequate samples of extreme conditions for relative humidity sensors (e.g. very dry low-level conditions), it may be necessary to identify further test sites in an arid area, or where surface temperatures are very cold (less than –30°C in winter).


Annex III

Annex to paragraph 9.4 of the general summary

PROVISIONAL PROGRAMME OF WMO INTERNATIONAL COMPARISONS AND EVALUATIONS OF METEOROLOGICAL INSTRUMENTS (1998–2002)

Number | Title of proposed WMO intercomparisons | Year(s) | Site(s)
1 | Ninth International Pyrheliometer Comparison (IPC-IX) | 2000 | WRC, Switzerland
2 | Regional Pyrheliometer Comparisons (RPCs)¹ | 2000–2002 | Either in conjunction with IPC-IX or at the RPCs concerned
3 | Regional/National Thermometer Screen/Shielding Intercomparisons | Ongoing (as required) | In various climatic regions
4 | International Hygrometer Intercomparison | (Tentative) | In various climatic regions
5 | International Long-wave Radiometers and Sunphotometer Intercomparison | (Tentative) |
6 | International/National Radiosonde Intercomparisons | (Tentative) | Preferably in tropical regions
7 | Testing and Evaluation of GPS Radiosonde Systems | (Tentative) |
8 | International Intercomparison of UV Radiation Instruments (preferably in conjunction with an ozonesonde comparison) | (Tentative) |
9 | International Rainfall Intensity Measurement Intercomparison² | (Tentative) | In various climatic regions
10 | Intercomparison designed to establish precipitation correction procedures for the special conditions encountered in the polar regions | (Tentative) | In polar regions
11 | National/Regional Evaporation Pan Intercomparisons | Ongoing (as required) | In various climatic regions

1 Contained in the programmes of the Regional Associations concerned.

2 Depending on the recommendations of an expert meeting to be organized on this issue.


Annex IV

Annex to paragraph 12.2 of the general summary

FIFTH WMO LONG-TERM PLAN

 

Programme 1.6 — Instruments and Methods of Observation (IMOP)

Purpose and scope

6.1.32 The purpose of the Instruments and Methods of Observation Programme (IMOP) is to coordinate, standardize and advance technology, systems and methods for observing meteorological and related environmental variables, so as to ensure the required availability and high quality of the relevant systems and techniques, which are fundamental to all WMO Programmes. The programme ensures the publication of technical regulations and guidance material on observing practices and methods and on the performance characteristics of instruments.

6.1.33 IMOP supports implementation and operation of all types of observational systems through coordinated calibrations and intercomparisons, and development of quality-control standards and procedures. It also develops guidelines and proposals for supporting activities in capacity building in that specific area and arranges and supports training programmes for instrument experts and technicians.

Main long-term objectives

6.1.34 The main long-term objectives of IMOP are:

(a) Improvements in the quality of observations and measurements of meteorological and related environmental variables through coordinating and promoting the use of modern methods and technology to meet the requirements of operational applications and research;

(b) Promoting the effective and economic use of observing technology/systems through training and technology transfer for coping with specific needs of developing countries.

Implementation for the period 2000–2003

6.1.35 Implementation components will include:

(a) The development of methodologies and reference instruments to ensure the global availability of observations of high quality;

(b) Reviewing and developing guidance material and recommendations for instrument performance, observing methodology, data processing algorithms, calibration, installation, maintenance and quality assurance;

(c) Planning, coordinating and conducting instrument intercomparisons, calibrations and other trials in accordance with standardized procedures, and publishing the results for use by Members and manufacturers;

(d) Monitoring and promoting calibration and validation activities of surface- and space-based remote sensing techniques;

(e) Promotion of Members’ participation in relevant activities of the International Organization for Standardization (ISO) and of other international organizations to ensure consistency in the relevant standardizations of ISO and those of WMO;

(f) Development and periodic distribution of information on new/improved observing and data acquisition technologies, as regards their technical and quality specifications and related economic considerations to satisfy the needs of both developed and developing countries;

(g) Promotion of research and development of new or improved solutions in the field of measuring technology through interaction between Members’ experts and instrument manufacturers and other appropriate organizations. This will also include development of instrumentation that is more efficient in the use of radio-frequency bandwidths and more cost-effective with respect to investment, maintenance and/or operation;

(h) Capacity building in developing countries in the field of instrumentation and methods of observation, to be addressed through training events at the regional or subregional level and through technology transfer activities centred on the most pressing instrumentation problems;

(i) Development of guidance material to assist Members in selecting the most cost-effective and economical observational technology, and conducting studies to identify specific regional needs and priorities with the essential involvement of the Regional Instrument Centres (RICs).