
Version 4 (modified by lnerger, 10 years ago)


Welcome to the home of PDAF - the Parallel Data Assimilation Framework

PDAF is developed, hosted, and maintained at the Computing Center of the Alfred Wegener Institute.

PDAF enables data assimilation.

Data assimilation is used to combine observations of some system, like the ocean or the atmosphere, with a numerical model simulating this system. Since both the observations and the numerical model are inexact, the combined information of both sources can provide a better estimate of the true state of the system than either source alone.
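This combination can be illustrated with the simplest possible case: one scalar state, one forecast, and one observation, each with a known error variance. The following sketch (a generic illustration, not PDAF code; the function name `combine` is made up for this example) shows the weighted average that underlies the Kalman-type updates mentioned below.

```python
# Minimal scalar illustration of data assimilation: combine a model forecast
# with an observation, weighting each by its error variance (scalar Kalman update).
def combine(forecast, obs, var_forecast, var_obs):
    """Return the analysis estimate and its error variance."""
    gain = var_forecast / (var_forecast + var_obs)  # weight given to the observation
    analysis = forecast + gain * (obs - forecast)   # weighted combination
    var_analysis = (1.0 - gain) * var_forecast      # uncertainty is reduced
    return analysis, var_analysis

# Forecast 10.0 with variance 4.0; observation 12.0 with variance 1.0.
# The more accurate observation dominates: analysis = 11.6, variance = 0.8.
analysis, var_analysis = combine(10.0, 12.0, 4.0, 1.0)
```

The analysis variance (0.8) is smaller than that of either the forecast (4.0) or the observation (1.0), which is precisely why combining the two sources pays off.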

Performing data assimilation with advanced algorithms and large-scale models is computationally extremely demanding. However, when ensemble-based data assimilation algorithms are used, the assimilation system can use supercomputers very efficiently. PDAF simplifies the implementation of data assimilation systems for supercomputers based on existing model code in the following ways:

  1. PDAF provides fully implemented, parallelized, and optimized ensemble-based algorithms for data assimilation. Currently, these are ensemble-based Kalman filters such as the LSEIK, LETKF, and EnKF methods.
  2. PDAF is attached to the model source code by minimal changes to the code. These changes concern only the general structure of the code, not its numerics.
  3. PDAF is called through a well-defined standard interface. This allows one, for example, to switch between the LSEIK and LETKF methods without additional coding. In addition, different observational data sets can be used within a single executable.
  4. PDAF provides parallelization support for the data assimilation system. If your numerical model is already parallelized, PDAF enables the data assimilation system to run several model tasks in parallel within a single executable.
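
To give a flavor of what the ensemble filters in point 1 compute, the following sketch shows the analysis step of a stochastic EnKF. This is a generic textbook-style illustration in Python, not PDAF's Fortran interface; the function name `enkf_analysis` and its arguments are invented for this example.

```python
import numpy as np

# Generic sketch of the stochastic EnKF analysis step (illustration only,
# not PDAF code). X holds the forecast ensemble; each column is one state.
def enkf_analysis(X, y, H, R, rng):
    """X: (n, m) ensemble of model states, y: (p,) observations,
    H: (p, n) observation operator, R: (p, p) observation error covariance."""
    n, m = X.shape
    A = X - X.mean(axis=1, keepdims=True)          # state anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)       # anomalies in observation space
    # Covariances estimated from the ensemble
    Pyy = (HA @ HA.T) / (m - 1) + R
    Pxy = (A @ HA.T) / (m - 1)
    K = Pxy @ np.linalg.inv(Pyy)                   # Kalman gain
    # Perturbed observations, one set per ensemble member (stochastic EnKF)
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=m).T
    return X + K @ (Y - HX)                        # analysis ensemble
```

After the update, the ensemble mean is pulled toward the observations and the ensemble spread shrinks, reflecting the reduced uncertainty of the analysis. PDAF implements such analysis steps (and their localized variants) in parallelized, optimized form behind its standard interface.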

This Wiki contains growing documentation for PDAF. We will also add the possibility to register for and download PDAF.

Documentation