The RODEO project develops a user interface and Application Programming Interfaces (APIs) for accessing meteorological datasets declared as High Value Datasets (HVD) by Implementing Regulation (EU) 2023/138 under the EU Open Data Directive (EU) 2019/1024. The project also fosters engagement between data providers and data users to improve understanding of the technical solutions available for sharing and accessing the HVD datasets. The project provides a sustainable and standardized system for sharing real-time surface weather observations in line with the HVD regulation and the WMO WIS 2.0 strategy. The real-time surface weather observations are made available through open web services so that anyone can access them.
Weather radar data are also considered HVDs, and therefore one of the goals of RODEO is to supply near-real-time weather radar observations. The radar data will be published both on a message queue using MQTT and through OGC EDR-compliant APIs, with metadata made available through OGC Records APIs. The system architecture is portable, scalable, and modular to accommodate possible future extensions to existing networks and datasets.
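As a rough illustration of the message-queue side of this architecture, the sketch below subscribes to an MQTT broker and prints incoming product notifications. The broker host and topic are placeholders chosen for the example, not the actual ORD or WIS 2.0 endpoints.

```python
# Minimal sketch of listening for radar product notifications over MQTT
# (paho-mqtt >= 2.0). The broker address and topic are illustrative
# placeholders, not the actual ORD/WIS 2.0 endpoints.
import json
import paho.mqtt.client as mqtt

BROKER_HOST = "mqtt.example.org"                   # placeholder broker
TOPIC = "origin/a/wis2/+/data/weather/radar/#"     # assumed WIS 2.0-style topic

def on_message(client, userdata, msg):
    # Each notification is expected to be a JSON document describing one product.
    notification = json.loads(msg.payload)
    print(msg.topic, notification)

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_message = on_message
client.connect(BROKER_HOST, 1883)
client.subscribe(TOPIC)
client.loop_forever()
```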
Three types of data are available via ORD (Open Radar Data):
- European single-site radar data are available through the EUMETNET OPERA programme, both as a 24-hour rolling cache and as an extensive archive. The data are provided in BUFR format for older datasets and in ODIM HDF5 format for more recent ones.
- European composite products — including maximum reflectivity factor, instantaneous rain rate, and 1-hour rainfall accumulation — are available both as a 24-hour rolling cache and as a long-term archive dating back to 2012. These products are provided by the EUMETNET OPERA programme in ODIM HDF5 and cloud-optimized GeoTIFF formats.
- National radar products, e.g. national radar composites, rain rate composites, accumulation products, and echo tops. These are provided as links for download from the national interfaces, typically in ODIM HDF5 or cloud-optimized GeoTIFF format. For national products:
  - The interface to ORD is a posted JSON-structured message containing the required metadata and a download link for each ready product (see the sketch after this list).
  - National radar volume data are not shared via the ORD API, only products; volume data sharing happens via the OPERA-to-ORD API (if data sharing is authorized).
  - The product format needs to be either ODIM HDF5 or cloud-optimized GeoTIFF.
  - Products are hosted locally in a national data store.
  - Products can be made accessible via an API or a public data store.
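As an illustration of that JSON interface, the sketch below assembles a notification message as a Python dictionary. The field names (`datetime`, `station`, `product`, `level`, `quantity`, `link`) and the download URL are assumptions chosen to match the metadata discussed in this document; the authoritative field list is defined by the `json_upload_schema` described below.

```python
# Illustrative only: field names are assumptions, not the authoritative
# json_upload_schema; the download link is a fictitious national URL.
import json

message = {
    "datetime": "2024-10-08T05:10:05Z",    # nominal product time
    "station": "0-20000-0-08556",          # WIGOS-style identifier
    "product": "SCAN",                     # product type
    "level": 50,                           # e.g. int(elangle * 100), see the table below
    "quantity": "DBZH",                    # ODIM quantity name
    "link": "https://radar.example-nms.eu/products/scan_20241008T051005.h5",
}

print(json.dumps(message, indent=2))
```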
The ORD system includes three endpoints for ingesting and sharing data:
- BUFR endpoint: used for uploading and sharing BUFR files.
  - For OPERA to ingest the European single-site data into the European Weather Cloud S3 storage.
  - The ingester module:
    - Extracts metadata from BUFR files and stores it in the database.
    - Uploads the original (or renamed) BUFR file to the ORD S3 bucket.
- ODIM endpoint: processes ODIM files.
  - For OPERA to ingest the European single-site data and OPERA composites into the European Weather Cloud S3 storage.
  - The ingester module:
    - Extracts metadata from ODIM files and stores it in the database.
    - Uploads the original (or renamed) ODIM file to the ORD S3 bucket.
- JSON endpoint: enables sharing locally stored radar data.
  - For National Meteorological Services (NMSs) to provide national products via ORD.
  - Users provide radar metadata through the JSON endpoint (see the example after this list).
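As a sketch of how a national product notification might be delivered to the JSON endpoint, the snippet below posts such a message with `requests`. The endpoint URL and the authentication header are placeholders; the real endpoint and credentials are provided by the ORD operators.

```python
# Hypothetical upload of a national product notification to the ORD JSON
# endpoint; the URL and token are placeholders, not real service details.
import requests

ORD_JSON_ENDPOINT = "https://ord.example.eu/json"   # placeholder URL
message = {
    "datetime": "2024-10-08T05:10:05Z",
    "product": "COMP",
    "level": 0,
    "quantity": "DBZH",
    "link": "https://radar.example-nms.eu/products/comp_latest.h5",
}

response = requests.post(
    ORD_JSON_ENDPOINT,
    json=message,
    headers={"Authorization": "Bearer <token>"},    # placeholder credentials
    timeout=30,
)
response.raise_for_status()
```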
The openradardata-validator tool includes a JSON message generator for creating custom `json_upload_schema` files and a validator script to verify them against the schema. The message generator creates a distinct JSON message for each quantity at each level; the level value depends on the product type, as summarized in the table below.
| Product | Level value | Notes |
|---|---|---|
| SCAN | int(elangle * 100) | Elevation of the SCAN |
| PVOL | int(elangle * 100) | Elevation of the current dataset |
| PPI | Product parameter | Elevation angle used |
| CAPPI | Product parameter | Layer height above the radar |
| PCAPPI | Product parameter | Layer height above the radar |
| ETOP | Product parameter | Reflectivity level threshold |
| EBASE | Product parameter | Reflectivity level threshold |
| RHI | int(Product parameter * 100) | Azimuth angle |
| VIL | Product parameter | Top heights of the integration layer |
| COMP | 0 | Other composites: CMAX, HMAX, etc. |
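To make the SCAN/PVOL rows concrete, the snippet below reads the elevation angle from an ODIM HDF5 scan with `h5py` and derives the level value as `int(elangle * 100)`. The attribute path follows the ODIM convention (`/dataset1/where/elangle`); this is a sketch, not the generator's exact implementation.

```python
# Sketch: derive the "level" value for a SCAN/PVOL dataset from ODIM metadata.
# Assumes the ODIM convention of storing the elevation angle as the 'elangle'
# attribute of /datasetN/where; not necessarily how odim2ordmsg.py does it.
import h5py

def scan_level(odim_path: str, dataset: str = "dataset1") -> int:
    with h5py.File(odim_path, "r") as f:
        elangle = float(f[f"{dataset}/where"].attrs["elangle"])  # degrees
    return int(elangle * 100)  # e.g. 0.5 deg -> 50

# Example (hypothetical file name):
# print(scan_level("T_PAZA43_C_LPMG_20241008051005.h5"))
```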
Clone the repository:

```
git clone https://github.com/EUMETNET/openradardata-validator.git
```

Create a new Python 3 environment:

```
cd openradardata-validator
python3 -m venv .venv
source .venv/bin/activate
```

Install the requirements:

```
pip install --upgrade pip
pip install -r ./requirements-odim.txt
pip install -r ./requirements-validator.txt
```

Set the environment variable:

```
export ORD_VALIDATOR_DIR=/path_to_validator_dir/
```

Generate a message from an ODIM file:

```
python3 ./odim2ordmsg.py /path_to_ODIM_file/ODIM_file.h5
```

Run the schema validator:

```
python3 ./ord_validator.py ./examples/odim/T_PAZA43_C_LPMG_20241008051005.h5.json
```
The output:

```
./examples/odim/T_PAZA43_C_LPMG_20241008051005.h5.json
Schemas: ['./schemas/openradardata-spec.json', './examples/odim/T_PAZA43_C_LPMG_20241008051005.h5.json']
Read Openradar schema: ./schemas/openradardata-spec.json
Read msg: ./examples/odim/T_PAZA43_C_LPMG_20241008051005.h5.json
Validation OK: 2024-10-08T05:10:05Z 0-20000-0-08556 0 TH
Validation OK: 2024-10-08T05:10:05Z 0-20000-0-08556 0 DBZH
Validation OK: 2024-10-08T05:10:05Z 0-20000-0-08556 0 VRADH
```
Where:
- `Schemas`: the list of schemas; the first two are the validator schemas.
- `Read ...`: reads the (validator) schemas.
- `Validation OK`: validates each measure in the `json_upload_schema` and prints its date, level, and quantity.
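For context, a comparable validation step can be reproduced with the `jsonschema` package. This is only a minimal sketch assuming the repository layout shown above, not the actual `ord_validator.py` implementation.

```python
# Minimal sketch of validating a generated message against the ORD spec with
# the jsonschema package; not the actual ord_validator.py code.
import json
from jsonschema import ValidationError, validate

with open("./schemas/openradardata-spec.json") as f:
    spec = json.load(f)

with open("./examples/odim/T_PAZA43_C_LPMG_20241008051005.h5.json") as f:
    msg = json.load(f)

try:
    validate(instance=msg, schema=spec)
    print("Validation OK")
except ValidationError as err:
    print("Validation failed:", err.message)
```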