7 Model Implementation
7.1.1
Institutions must consider model implementation as a separate phase within the model life-cycle process. The model development phase must take into account the potential constraints of model implementation. However, successful model development does not guarantee a successful implementation. Consequently, the implementation phase must have its own set of documented and approved principles.
7.2 Project Governance
7.2.1
The implementation of a model must be treated as a project with clear governance, planning, funding, resources, reporting and accountabilities.
7.2.2
The implementation of a model must be approved by Senior Management and must only occur after the model development phase is complete and the model is fully approved.
7.2.3
The implementation project must be fully documented and, at a minimum, must include the following components:
(i) Implementation scope,
(ii) Implementation plan,
(iii) Roles and responsibilities of each party,
(iv) Roll-back plan, and
(v) User Acceptance Testing with test cases.
7.2.4
The roles and responsibilities of the parties involved in the model implementation must be defined and documented. At a minimum, the following parties must be identified: (i) the system owner, (ii) the system users, and (iii) the project manager. All parties must be jointly responsible for the timely and effective implementation.
7.2.5
For model implementation, institutions should produce the following key documents, at a minimum:
(i) User specification documentation: this document should specify requirements regarding the system functionalities from the perspective of users.
(ii) Functional and technical specification documentation: this document should specify the technological requirements based on the user specifications.
(iii) A roll-back plan: this document should specify the process by which the implementation can be reversed, if necessary, so that the institution can rely on its previous model.
7.3 Implementation Timing
7.3.1
Institutions must be conscious that models are valid for a limited period of time. Any material delay in implementation diminishes the period during which the model can be used. Newly developed models must be implemented within a reasonable timeframe after the completion of the development phase. This timeframe must be decided upfront and fully documented in the implementation plan.
7.4 System Infrastructure
7.4.1
The IT system infrastructure must be designed to cope with the demands of the model's sophistication and the volume of regular production. Institutions must assess that demand during the planning phase. Institutions should be in a position to demonstrate that the technological constraints have been assessed.
7.4.2
The IT system infrastructure should include, at a minimum, three environments: (i) development, (ii) testing, and (iii) production.
7.4.3
Institutions must have a management plan for systems failure. A system that does not comply with the business requirements must be replaced.
7.4.4
In the case of systems provided by a third party, institutions must have a contingency plan to address the risks that may arise if the third party is no longer available to support the institution.
7.4.5
If a system is designed to produce a given set of metrics, then institutions must use that system for the production and reporting of those metrics. If an implemented system proves not fit for purpose, institutions must not keep the deficient system in place while producing its intended metrics through a shadow or parallel system. Instead, institutions must decommission any deficient system and fully replace it with a functioning system.
7.5 User Acceptance Testing
7.5.1
Institutions must ensure that a User Acceptance Testing (“UAT”) phase is performed as part of the system implementation plan. The objective of this phase is to ensure that the models are suitably implemented according to the agreed specifications.
7.5.2
The model implementation team must define a test plan and test cases to assess the full scope of the system functionalities, from both a technical and a modelling perspective. The test cases should be constructed with gradually increasing complexity. In particular, the test cases should be designed to assess each functionality, first independently and then jointly. The test cases should also capture extreme and erroneous inputs. Partial model replication must be used as much as possible.
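For illustration only, the following sketch (in Python, pytest style) shows how such test cases might be organised: simple cases first, then joint portfolio behaviour, then extreme and erroneous inputs, with an independent partial replication as the benchmark. The model, function names (`score_pd`, `reference_pd`) and tolerances are hypothetical placeholders, not prescribed by these Standards.

```python
import math
import pytest

# Hypothetical model under test: a one-factor logistic PD model.
# In a real UAT, score_pd would call the implemented system, and
# reference_pd would be an independent partial replication of it.

COEFFS = {"intercept": -2.0, "ltv": 1.5}

def score_pd(ltv: float) -> float:
    """Stand-in for the system implementation being tested."""
    z = COEFFS["intercept"] + COEFFS["ltv"] * ltv
    return 1.0 / (1.0 + math.exp(-z))

def reference_pd(ltv: float) -> float:
    """Independent partial replication used as the UAT benchmark."""
    z = -2.0 + 1.5 * ltv
    return 1.0 / (1.0 + math.exp(-z))

# Step 1: each functionality independently, simple case first.
def test_single_score_simple_case():
    assert score_pd(0.0) == pytest.approx(reference_pd(0.0), abs=1e-9)

# Step 2: functionalities jointly, at portfolio level.
def test_portfolio_scores_match_replication():
    ltvs = [0.2, 0.5, 0.8, 1.1]
    assert [score_pd(x) for x in ltvs] == pytest.approx(
        [reference_pd(x) for x in ltvs]
    )

# Step 3: extreme inputs must still yield a valid probability.
@pytest.mark.parametrize("ltv", [0.0, 10.0, 1e6])
def test_extreme_inputs_bounded(ltv):
    assert 0.0 <= score_pd(ltv) <= 1.0

# Step 4: erroneous inputs must be rejected, not scored silently.
def test_erroneous_input_rejected():
    with pytest.raises(TypeError):
        score_pd("not a number")
```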
7.5.3
There must be at least two (2) rounds of UAT to guarantee the correct implementation of the model. Generally, the first round is used to identify issues, while the second round is used to verify that the issues have been remediated.
7.5.4
The UAT test cases and results must be fully documented as part of the model implementation documentation. The test case inputs, results and computation replications must be stored and must be available for as long as the model is used in production.
7.5.5
Institutions must ensure that UAT tests and results are recorded and can be presented to the CBUAE, other regulators and/or auditors to assess whether a model has been implemented successfully. In particular, all rounds of UAT test cases and results must be available upon request from the CBUAE, as long as a model is used in production.
7.5.6
The UAT must be considered successful only upon the sign-off from all identified stakeholders on the UAT results. The UAT plan and results must be approved by the Model Oversight Committee.
7.5.7
Institutions must ensure that the model being implemented remains unchanged during the testing phase.
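One simple way to evidence that the model has not changed during testing is to record a cryptographic fingerprint of the model artefact when UAT starts and re-verify it before sign-off. A minimal sketch follows, assuming the model is deployed as a single file; the path is illustrative.

```python
import hashlib
from pathlib import Path

def artefact_hash(path: str) -> str:
    """SHA-256 fingerprint of the model artefact under test."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def assert_model_unchanged(path: str, baseline: str) -> None:
    """Raise if the artefact differs from the hash recorded at UAT start."""
    current = artefact_hash(path)
    if current != baseline:
        raise RuntimeError(
            f"Model artefact changed during UAT: {current} != {baseline}"
        )

# Illustrative usage: record at the start of UAT, re-check before sign-off.
baseline = artefact_hash("models/pd_model_v3.pkl")  # hypothetical path
assert_model_unchanged("models/pd_model_v3.pkl", baseline)
```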
7.6 Production Testing
7.6.1
Institutions must ensure that a production testing phase is performed as part of the system implementation plan. The objective of this phase is to guarantee the robustness of the system from a technology perspective according to the functional and technical specification documentation.
7.6.2
In particular, the production testing phase must ensure that systems can cope with the volume of data in production and can run within an appropriate execution time.
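As a sketch of what such a check might look like, the snippet below runs a production-scale batch and fails if an agreed execution-time budget is exceeded or records are dropped. The batch size, time budget and `run_batch` entry point are illustrative assumptions, not requirements.

```python
import random
import time

def run_batch(records):
    """Stand-in for the system's batch scoring entry point."""
    return [r * 0.01 for r in records]  # placeholder computation

def production_volume_test(n_records: int = 5_000_000,
                           budget_seconds: float = 3600.0) -> float:
    """Run a production-scale batch and check the execution-time budget."""
    records = [random.random() for _ in range(n_records)]
    start = time.perf_counter()
    outputs = run_batch(records)
    elapsed = time.perf_counter() - start
    assert len(outputs) == n_records, "batch dropped records"
    assert elapsed <= budget_seconds, (
        f"batch took {elapsed:.1f}s, exceeding the {budget_seconds:.0f}s budget"
    )
    return elapsed
```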
7.7 Spreadsheet Implementation
7.7.1
Institutions are not recommended to use spreadsheet tools to implement material models or to produce metrics used for regular decision-making. More robust systems are preferred. Nevertheless, if spreadsheets are initially the only modelling environment available, the standards in 7.7.2 must apply, at a minimum.
7.7.2
Spreadsheet implementation should follow a quality standard as follows:
(i) The spreadsheet should be constructed with a logical flow,
(ii) Formulae should be easily traceable,
(iii) Formulae should be short and constructed in a way that they are easily interpreted. It is recommended to split long formulae into separate components,
(iv) Tables should include titles, units and comments,
(v) Inputs should not be scattered across the sheets; instead, they should be grouped in one worksheet/table,
(vi) Hardcoded entries (i.e. fixed inputs) should be clearly identified,
(vii) Tabs should be clean, i.e. when the implementation is completed, all work in progress should be removed,
(viii) Instructions should be included in one or several tabs, and
(ix) Wherever suitable, cells should be locked and worksheets protected, preferably by password (a scripted sketch follows this list).
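As a sketch of item (ix), cell locking and worksheet protection can be scripted rather than applied by hand. The example below uses the openpyxl library; the workbook layout, cell ranges, password and filename are illustrative assumptions.

```python
from openpyxl import Workbook
from openpyxl.styles import Protection

wb = Workbook()
ws = wb.active
ws.title = "Inputs"

# Inputs grouped in one table (items v and vi): label, value, hardcoded flag.
ws.append(["Parameter", "Value", "Hardcoded?"])
ws.append(["LGD floor", 0.05, "Yes"])

# Protect the sheet: every cell is locked by default once protection is on.
ws.protection.sheet = True
ws.protection.password = "change-me"  # illustrative password

# Unlock only the cells users are meant to edit (the input values).
for row in ws.iter_rows(min_row=2, min_col=2, max_col=2):
    for cell in row:
        cell.protection = Protection(locked=False)

wb.save("model_tool_protected.xlsx")  # illustrative filename
```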
7.7.3
Models implemented in spreadsheets that deviate from the above criteria must not be employed for regular production.
7.7.4
To ensure their robust implementation, spreadsheet tools must include consistency checks. Common consistency checks include: (i) computing the same results through different methods, (ii) ensuring that a specific set of inputs leads to the correct expected output values, and (iii) ensuring that the sensitivities of outputs to changes in inputs match expected values.
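A minimal sketch of the three checks, assuming a simple expected-loss calculation; the figures and tolerance are illustrative, not prescribed.

```python
TOL = 1e-9

def expected_loss(pd: float, lgd: float, ead: float) -> float:
    return pd * lgd * ead

# (i) Same result through a different method:
# per-exposure function calls vs. a direct re-computation of the total.
exposures = [(0.02, 0.45, 1000.0), (0.05, 0.40, 500.0)]
method_a = sum(expected_loss(pd, lgd, ead) for pd, lgd, ead in exposures)
method_b = sum(pd * lgd * ead for pd, lgd, ead in exposures)
assert abs(method_a - method_b) < TOL

# (ii) A known set of inputs must reproduce the expected output.
assert abs(expected_loss(0.02, 0.45, 1000.0) - 9.0) < TOL

# (iii) Sensitivities must match expectations: expected loss is linear
# in EAD, so doubling EAD must exactly double the output.
base = expected_loss(0.02, 0.45, 1000.0)
assert abs(expected_loss(0.02, 0.45, 2000.0) - 2 * base) < TOL
```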
7.7.5
If an institution employs spreadsheets for regular production, a rigorous maker-checker process must be implemented and documented. The review of spreadsheet tools must be included in the scope of the independent validation process. In addition, clear version control should be implemented.
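A maker-checker record combined with version control can be as simple as the sketch below, which ties each released version of a spreadsheet tool to a file hash and two distinct sign-offs. The field names, roles and workflow are illustrative assumptions.

```python
import hashlib
from dataclasses import dataclass
from pathlib import Path

@dataclass(frozen=True)
class SpreadsheetRelease:
    version: str
    sha256: str
    maker: str    # prepared the tool
    checker: str  # independently reviewed it

def register_release(path: str, version: str,
                     maker: str, checker: str) -> SpreadsheetRelease:
    """Record a released spreadsheet version with maker-checker sign-off."""
    if maker == checker:
        raise ValueError("maker and checker must be different people")
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    return SpreadsheetRelease(version, digest, maker, checker)

# Illustrative usage: the register forms part of the tool's documentation.
release = register_release(
    "model_tool_protected.xlsx", "v1.2", "a.analyst", "b.reviewer"
)
```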