To compile the extended model code with PDAF, one has to extend the Makefile for the model by adding the additional user-supplied routines. The core part of PDAF can be compiled separately as a library and then simply linked to the model code; this is the strategy followed in the PDAF package. While all of the user-supplied routines need to exist, not all of them need to be fully implemented at this stage if the following procedure is used.

At this implementation stage one can use the preprocessor definition `PDAF_NO_UPDATE` (available from Version 1.6.1). With this definition, the actual analysis step of the chosen filter algorithm is not executed. Accordingly, only the user-supplied routines used in `PDAF_get_state` as well as the routine `U_collect_state` need to be implemented with functionality. The other routines will not be executed, because they are only called during the analysis step. With `PDAF_NO_UPDATE`, the program performs just an ensemble integration: PDAF is initialized by `PDAF_init`; then a forecast is computed using `PDAF_get_state` and the chosen `PDAF_put_state_*` routine. At the initial time, `U_prepoststep` is executed by `PDAF_get_state`; `U_next_obs` provides the number of time steps to be computed by the model, and `U_distribute_state` initializes the model fields. Subsequently, the ensemble integration is performed and the forecast fields are written back to the ensemble array by `U_collect_state`. Upon completion of the forecast phase, the routine `U_prepoststep` is executed twice. The first call is the regular call before the analysis; it allows one to access the forecast ensemble. If the analysis were not deactivated, the second call to `U_prepoststep` would occur after the analysis, giving access to the ensemble directly after the analysis. As the analysis is deactivated here, the ensemble in the second call is the same as in the first.
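
The call sequence described above can be illustrated by a minimal sketch of the forecast phase. The sketch assumes that the ensemble loop is wrapped around the model's time stepping, that the SEIK filter was chosen in `PDAF_init`, and that the routine names ending in `_pdaf` as well as the model routine `integrate_model` are placeholders for the actual implementation; the exact argument list of the `PDAF_put_state_*` routine depends on the chosen filter and PDAF version. The definition `PDAF_NO_UPDATE` itself is set as a preprocessor flag when compiling the PDAF library, so it does not appear in this code.

```fortran
! Sketch of the forecast phase of the ensemble integration. Routine names
! ending in '_pdaf' and 'integrate_model' are placeholders for user code.
SUBROUTINE forecast_ensemble()

  IMPLICIT NONE

  ! User-supplied call-back routines (example names)
  EXTERNAL :: next_observation_pdaf, distribute_state_pdaf, prepoststep_ens_pdaf
  EXTERNAL :: collect_state_pdaf, init_dim_obs_pdaf, obs_op_pdaf, init_obs_pdaf
  EXTERNAL :: prodRinvA_pdaf, init_obsvar_pdaf

  INTEGER :: nsteps       ! Number of time steps to compute, set via U_next_obs
  INTEGER :: doexit       ! Whether to exit the forecast loop
  INTEGER :: status_pdaf  ! PDAF status flag
  REAL    :: timenow      ! Current model time

  pdaf_forecast: DO

     ! Initialize the model fields from the ensemble state held by PDAF; this
     ! calls U_prepoststep (at the initial time), U_next_obs and U_distribute_state
     CALL PDAF_get_state(nsteps, timenow, doexit, next_observation_pdaf, &
          distribute_state_pdaf, prepoststep_ens_pdaf, status_pdaf)

     IF (doexit == 1 .OR. status_pdaf /= 0) EXIT pdaf_forecast

     ! Model-specific routine performing nsteps time steps
     CALL integrate_model(nsteps, timenow)

     ! Write the forecast fields back into the ensemble array via U_collect_state;
     ! with PDAF_NO_UPDATE the analysis step inside this routine is skipped
     CALL PDAF_put_state_seik(collect_state_pdaf, init_dim_obs_pdaf, obs_op_pdaf, &
          init_obs_pdaf, prepoststep_ens_pdaf, prodRinvA_pdaf, init_obsvar_pdaf, &
          status_pdaf)

  END DO pdaf_forecast

END SUBROUTINE forecast_ensemble
```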
This test allows checking the following:
* Is `U_prepoststep` working correctly?
* Does `U_next_obs` work correctly, and is the information from this routine used correctly for the model integration?
* Do `U_distribute_state` and `U_collect_state` work correctly (see the sketch after this list)?

One could also comment out the actual time stepping part of the model. This would allow one to test only the interfacing between PDAF and the model.
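
For the check of `U_distribute_state` and `U_collect_state`, it can help to look at a minimal pair of these routines. The sketch below assumes a single 2-dimensional model field `field(nx, ny)` held in a hypothetical module `mod_model`; real models will have several fields and possibly a parallel decomposition, but the essential operation is just the copying between the state vector and the model fields.

```fortran
! Minimal sketch of the distribute/collect pair for a single 2D field.
! 'mod_model' with 'field(nx, ny)' is a placeholder for the model's own storage.
SUBROUTINE distribute_state_pdaf(dim_p, state_p)

  USE mod_model, ONLY: nx, ny, field   ! hypothetical model module

  IMPLICIT NONE

  INTEGER, INTENT(in) :: dim_p            ! Dimension of the state vector
  REAL, INTENT(inout) :: state_p(dim_p)   ! State vector provided by PDAF

  INTEGER :: i, j

  ! Initialize the model field from the state vector (column-wise ordering)
  DO j = 1, ny
     DO i = 1, nx
        field(i, j) = state_p(i + (j-1)*nx)
     END DO
  END DO

END SUBROUTINE distribute_state_pdaf

SUBROUTINE collect_state_pdaf(dim_p, state_p)

  USE mod_model, ONLY: nx, ny, field   ! hypothetical model module

  IMPLICIT NONE

  INTEGER, INTENT(in) :: dim_p            ! Dimension of the state vector
  REAL, INTENT(inout) :: state_p(dim_p)   ! State vector to be filled for PDAF

  INTEGER :: i, j

  ! Write the forecast field back into the state vector (inverse of distribute)
  DO j = 1, ny
     DO i = 1, nx
        state_p(i + (j-1)*nx) = field(i, j)
     END DO
  END DO

END SUBROUTINE collect_state_pdaf
```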
It is important to ensure that the ensemble integration performs correctly. The simplest case should be a parallel configuration in which the number of model tasks equals the ensemble size, as then the model tasks always compute forward in time. If the number of model tasks is smaller than the ensemble size, some model tasks have to integrate multiple states of the ensemble. If a model task has to integrate two states, the model will have to jump back in time for the integration of the second state. It might be that some arrays of the model need to be re-initialized to ensure that the second integration is consistent. Also, one might need to check whether the initialization of forcing fields (e.g. wind stress over the ocean) performs correctly for the second integration. (Sometimes models are implemented with the constraint that the model time always increases, which is the normal case for pure model simulations without assimilation.) A useful test is to initialize an ensemble in which all states are equal. If this ensemble is integrated, the forecast states of the ensemble should, of course, still be equal; a simple check for this is sketched below.
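
The following diagnostic is a minimal sketch of such an equality check, assuming it is called from within `U_prepoststep`, where the ensemble array is available; the routine name and argument list are not part of PDAF. It computes the maximum deviation of all ensemble members from the first member, which should remain at (numerically) zero throughout the forecast if all initial states were equal.

```fortran
! Sketch of a diagnostic for the equal-ensemble test. It is not part of PDAF;
! it can be called from U_prepoststep with the forecast ensemble array.
SUBROUTINE check_ensemble_spread(dim_p, dim_ens, ens_p)

  IMPLICIT NONE

  INTEGER, INTENT(in) :: dim_p                  ! Local state dimension
  INTEGER, INTENT(in) :: dim_ens                ! Ensemble size
  REAL, INTENT(in)    :: ens_p(dim_p, dim_ens)  ! Ensemble array

  INTEGER :: member
  REAL    :: maxdev   ! Maximum deviation from the first ensemble member

  maxdev = 0.0
  DO member = 2, dim_ens
     maxdev = MAX(maxdev, MAXVAL(ABS(ens_p(:, member) - ens_p(:, 1))))
  END DO

  ! For an ensemble initialized with identical states the deviation should
  ! stay at (numerically) zero during the forecast with PDAF_NO_UPDATE.
  WRITE (*, '(a, es12.4)') 'Maximum deviation from member 1: ', maxdev

END SUBROUTINE check_ensemble_spread
```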