
UK Ministry of Defence Completes Projects Within Time and Cost Budgets Set With Risk Analysis Forecasts

Lumivero
Published: Dec. 17, 2021

UK Ministry of Defence projects can be large and complex, including major equipment procurement projects. Using a Project Risk Maturity Model alongside @RISK has increased the number of projects delivered on time and within budget.

The Ministry of Defence (MoD) has seen a significant increase in the number of projects delivered within their approved schedule and budget as a result of using the Project Risk Maturity Model (RMM) to enhance its project risk modelling.

Using the MoD example to explain the principle behind the Project RMM, this case study demonstrates that combining @RISK (which has proved itself a good tool for modelling risk on other major defence projects) with the Project RMM brings added assurance that risk models have been developed using a capable risk management process. The same approach can be applied across a variety of industry sectors.

Validating the Quality of Inputs to Monte Carlo Models

The validity of forecasts made by risk-based models depends on the quality of the input data. Managers are well aware of this, and therefore need assurance that the forecasts produced by Monte Carlo analysis are realistic. One way to obtain this assurance is to measure the capability of the risk management process used to produce the risk models. The Project Risk Maturity Model (RMM) performs this measurement and has been shown to help produce risk models that yield realistic forecasts, including models built with Palisade’s risk analysis software, @RISK.
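
To illustrate the principle, the short Python sketch below (a generic stand-in, not @RISK itself, which runs as a spreadsheet add-in; all figures are invented for illustration) sums three triangular cost distributions twice: once with narrow, over-optimistic input ranges and once with wider, more realistic ones. The model structure is identical in both runs; only the credibility of the inputs changes, yet the P80 forecast shifts markedly.

    # Illustrative Monte Carlo cost model -- hypothetical figures, not MoD data.
    # Shows how the quality of the input ranges drives the risk-based forecast.
    import numpy as np

    rng = np.random.default_rng(seed=1)
    N = 100_000  # Monte Carlo iterations

    def simulate(cost_ranges):
        """Sum triangular (low, most likely, high) cost elements in GBP millions."""
        total = np.zeros(N)
        for low, mode, high in cost_ranges:
            total += rng.triangular(low, mode, high, size=N)
        return total

    # Over-optimistic inputs: narrow ranges hugging the planned estimates.
    optimistic = [(90, 100, 115), (45, 50, 58), (28, 30, 34)]
    # More realistic inputs: same most-likely values, much wider upside risk.
    realistic = [(90, 100, 150), (45, 50, 75), (28, 30, 45)]

    for label, ranges in (("optimistic", optimistic), ("realistic", realistic)):
        total = simulate(ranges)
        p50, p80 = np.percentile(total, [50, 80])
        print(f"{label:>10}: P50 = £{p50:.0f}m, P80 = £{p80:.0f}m")

The credibility of exactly these input ranges is what an RMM assessment is designed to probe.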

This case study is based on major defence equipment procurement projects owned by the UK Ministry of Defence. After the MoD applied the RMM to projects with a combined value of more than £60 billion, the risk models used for project approvals became demonstrably more reliable and realistic.

Reinforcing @RISK Analysis with Project RMM

Martin Hopkinson was the lead developer of the Project Risk Maturity Model and oversaw the MoD’s use of the RMM while working as a consultant. He comments: “@RISK is a good tool for cost and net present value (NPV) risk analysis. But its capability is further enhanced by using the RMM to check that the process used to develop the input data has been good enough. This is particularly important on high-risk major projects such as those sponsored by the MoD, since project authorisation decisions can be very costly.”
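
To make “cost and net present value risk analysis” concrete, here is a minimal, generic NPV simulation in Python. It is not how @RISK is driven (that is done within a spreadsheet model); the discount rate, cash-flow figures and distributions are assumptions chosen purely for illustration.

    # Minimal NPV risk analysis sketch -- illustrative assumptions only.
    import numpy as np

    rng = np.random.default_rng(seed=2)
    N = 100_000
    discount_rate = 0.035  # assumed annual discount rate

    # Uncertain up-front capital cost and ten years of uncertain annual benefits (£m).
    capital = rng.triangular(180, 200, 260, size=N)
    years = np.arange(1, 11)
    benefits = rng.normal(loc=35, scale=8, size=(N, years.size))

    factors = (1 + discount_rate) ** -years            # discount factor per year
    npv = (benefits * factors).sum(axis=1) - capital   # NPV for each iteration

    print(f"Mean NPV: £{npv.mean():.0f}m")
    print(f"P10 / P90 NPV: £{np.percentile(npv, 10):.0f}m / £{np.percentile(npv, 90):.0f}m")
    print(f"Probability NPV < 0: {(npv < 0).mean():.1%}")

The decision-relevant outputs are the spread of outcomes and the downside probability, which is why the quality of the inputs feeding them matters so much.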

Risk Analysis at the MoD

MoD equipment projects include the development and manufacture of new military equipment for the UK’s Army, Royal Navy and Royal Air Force. These large and complex projects often carry heightened risk because their objectives tend to push the limits of technical feasibility. As a result, the MoD is highly committed to its project risk management process, and risk-based forecasts produced by Monte Carlo analysis are required as part of the project approval process.

"Quantitative risk analysis forecasts can be used successfully as a key tool in an organisation’s process for approving major projects, and @RISK has been proven to be a good tool for modelling risk on major UK defence projects."Andrew Evans
Decision Analyst, Unilever

Guarding Against Over-Optimistic Risk Forecasts

In 2001, the MoD recognised that too many of its projects were running late and over budget. It traced this to over-optimistic risk analysis forecasts in the early stages, which allowed projects to pass approval points without adequate scrutiny. This realisation prompted the MoD to invest in a tool to measure risk management capability and identify actions for improvement. The Project RMM was chosen as the best tool for the task.

Linking RMM Assessments, Monte Carlo Analysis and Project Approvals

The Project RMM assesses each project against four levels of risk management capability, with Level 4 being the highest. It also identifies which aspects of the project’s process should be prioritised for improvement. When the RMM was first used to assess MoD projects, it became apparent that improving Monte Carlo modelling skills was a priority for many of them. In response, the MoD invested in new guidance, skills training and modelling assurance activities.

In 2004, the MoD introduced a new rule stipulating that, before passing through the main project approval point (Main Gate), projects had to be assessed as having reached Project RMM Level 3 or 4. They also had to demonstrate a minimum of Level 3 for risk estimating capability, based on a selected subset of RMM questions.
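
The case study does not reproduce the RMM questionnaire or its scoring scheme, so the sketch below only shows the shape of the 2004 rule under assumed inputs: an overall maturity of Level 3 or 4, plus at least Level 3 on the risk estimating subset, before Main Gate. The scores, the averaging scheme and the function names are hypothetical.

    # Hypothetical sketch of the 2004 Main Gate rule; the real RMM questionnaire
    # and its scoring method are not reproduced in this case study.
    from statistics import mean

    def maturity_level(scores):
        """Map assumed 1-4 question scores to an overall RMM level."""
        return min(4, max(1, int(mean(scores))))

    def main_gate_ready(all_scores, risk_estimating_scores):
        """Rule: overall Level 3 or 4, and at least Level 3 on the estimating subset."""
        overall = maturity_level(all_scores)
        estimating = maturity_level(risk_estimating_scores)
        return overall >= 3 and estimating >= 3, overall, estimating

    # Invented example scores for a single project.
    ready, overall, estimating = main_gate_ready(
        all_scores=[3, 4, 3, 3, 2, 4, 3, 3],
        risk_estimating_scores=[3, 3, 4],
    )
    print(f"Overall Level {overall}, risk estimating Level {estimating}, Main Gate ready: {ready}")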

Combining improved Monte Carlo modelling practice with RMM-based assurance has paid dividends. Since 2004, there has been a marked increase in the number of projects delivered within their approved schedule and cost objectives. This improvement can be seen in the annual Major Projects Report (MPR) publications issued by the National Audit Office.

Lessons Learned

Hopkinson concludes: “Quantitative risk analysis forecasts can be used successfully as a key tool in an organisation’s process for approving major projects, and @RISK has been proven to be a good tool for modelling risk on major UK defence projects. However, risk analysis work is enhanced with the assurance that risk models have been developed using a capable risk management process. The Project RMM provides this important validation.”

Martin Hopkinson is the author of the book ‘The Project Risk Maturity Model: Measuring and Improving Risk Management Capability’, published in 2011 by Gower. Further information is available at rmcapability.com.
