J5.2 Serving Users with Sentinel Data

Monday, 23 January 2017: 1:45 PM
611 (Washington State Convention Center)
Oystein Godøy, Norwegian Meteorological Institute, Oslo, Norway; and N. Budewitz and L. A. Breivik

Introduction

The European Space Agency has developed a new series of satellites, the Sentinels, each dedicated to a specific mission and focused on the operational needs of the Copernicus programme. A number of satellites have been planned and several have been launched, producing data that can be used for both operational and scientific purposes.

The Norwegian Meteorological Institute is developing and implementing two systems supporting efficient utilisation of Sentinel data by various user communities. The Norwegian Space Centre is establishing a National Ground Segment (NBS) for satellite data, initially focusing on easy access to Sentinel data for scientific and operational users. In parallel, ESA is developing the Collaborative Ground Segment, through which a number of European nodes share the load of serving users with Sentinel data – a demanding task given the data volumes involved.

The requirements on these systems, their current status and the underlying infrastructure used to serve the data are reported.

Collaborative Ground Segment

The Collaborative Ground Segment (CGS) is an effort, coordinated by ESA, to spread the load of serving a large number of users with large amounts of data across a number of contributing nodes. All nodes run the same software, which is developed and maintained centrally by ESA. A node contributing to the Collaborative Ground Segment has been established at the Norwegian Meteorological Institute; running on a new hardware configuration, it has proven reliable in supporting high data throughput. The functionality offered is the same as that of the other nodes in the Collaborative Ground Segment. More specific user services for these data are under development within the National Ground Segment.

National Ground Segment

NBS targets non-expert as well as expert satellite users. Following this requirement, the NBS setup is designed using lessons learned from data management efforts such as the International Polar Year and from a number of national e-infrastructure projects supported by the Research Council of Norway. This implies a metadata-driven system following the same approach as the WMO Information System and INSPIRE, but emphasising the need for semantic translations and dynamic transformation of datasets upon user request. The initial setup will use the same software as CGS while the non-expert functionality is being developed. The system under development will, however, use much the same approach as the DHuS software, although with a different emphasis on user functionality and on integration with other types of data (e.g. weather information). Functionality under development includes transformation services for data, such as reprojection and subsetting, offered through web services that are either integrated in the human front end (implemented using Drupal) or used directly. Metadata indexing is done using Solr, and data are served through THREDDS Data Servers, allowing aggregation across physical files and access to the data as data streams (read only what you need of a dataset).
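The "read what you need" pattern can be illustrated with a short sketch: THREDDS Data Servers expose datasets via OPeNDAP, where an index constraint appended to the request URL restricts the transfer to a subset of a variable. The host name, dataset path and variable name below are hypothetical, not actual NBS identifiers.

```python
# Sketch: building an OPeNDAP constraint expression for a THREDDS
# Data Server, so only the requested subset of a dataset is
# transferred over the network.

def opendap_subset_url(base, dataset, variable, ranges):
    """Build an OPeNDAP URL requesting `variable` restricted to the
    given index ranges, e.g. {"time": (0, 9), "lat": (100, 199)}.
    Each range is an inclusive (start, stop) index pair, in the
    order of the variable's dimensions."""
    constraint = variable + "".join(
        f"[{start}:{stop}]" for start, stop in ranges.values()
    )
    return f"{base}/dodsC/{dataset}.dods?{constraint}"

# Hypothetical example: one time step and a 100x100 spatial window.
url = opendap_subset_url(
    "https://thredds.example.met.no",   # hypothetical TDS host
    "sentinel/s2a_scene.nc",            # hypothetical dataset path
    "reflectance",                      # hypothetical variable name
    {"time": (0, 0), "lat": (100, 199), "lon": (100, 199)},
)
print(url)
```

In practice a client library (e.g. netCDF with OPeNDAP support) constructs such constraints transparently when a user slices a remote dataset; the point is that subsetting happens server-side, so only the requested bytes cross the network.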

NBS is in itself also a collaborative ground segment, as it is by design physically distributed and allows integration of a large number of production and/or data centres (see illustration below).
Hardware infrastructure

Given the large volumes of data to be served, lessons learned from similar efforts at the Norwegian Meteorological Institute have shown that an efficient hardware infrastructure is required. Bottlenecks identified previously have been removed, including modifications to the network topology and the file systems used, and the introduction of a dedicated pre- and post-processing environment with full redundancy across data rooms. The processing infrastructure enables the Norwegian Meteorological Institute to scale up processing (e.g. Sentinel product ingestion chains) via Docker, and to scale storage capacity by adding further storage servers when required. Furthermore, the system provides end-to-end checksums of all data/metadata and network traffic.

In addition to serving the needs of the two efforts mentioned above, this infrastructure is also used to serve the Copernicus Marine Environment Monitoring Service Arctic and North-West Shelf Marine Forecasting Centres as well as the Ocean and Sea Ice Thematic Assembly Centre.

Summary

Based on lessons learned in a number of internal and external projects, a new hardware and job control environment has been implemented. This environment is able to support the large data volumes required by users of the emerging Sentinel satellites, both nationally and regionally (in Europe), and has proven an efficient tool for scientific and operational data management and dissemination.
