Common Mathematical Mistakes in Quantitative Modeling
Quantitative finance models are becoming increasingly sophisticated, and quants are taking on responsibility for larger and larger decisions. Is this a safe and stable scenario? The speaker will argue that some quants' lack of real-world experience and blind belief in mathematical modelling is a dangerous combination. He will outline some of the common mistakes that quants keep making and show how these can be rectified using relatively simple mathematics.
a. Jensen's inequality arbitrage
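The Jensen's inequality point can be sketched in a few lines of Python (purely illustrative; the payoff and distribution below are invented for the example, not taken from the talk). For a convex payoff f, Jensen's inequality gives E[f(S)] ≥ f(E[S]), so valuing at the average scenario understates a convex claim:

```python
import random
import statistics

random.seed(0)

# A convex, option-like payoff: f(S) = max(S - 100, 0).
def payoff(s):
    return max(s - 100.0, 0.0)

# Simulate the underlying: normal around 100 with some volatility (assumed).
samples = [random.gauss(100.0, 20.0) for _ in range(100_000)]

f_of_mean = payoff(statistics.fmean(samples))             # f(E[S]): the naive "plug in the mean"
mean_of_f = statistics.fmean(payoff(s) for s in samples)  # E[f(S)]: the correct expectation

# Jensen's inequality for convex f: E[f(S)] >= f(E[S]).
print(f"f(E[S]) = {f_of_mean:.2f}")
print(f"E[f(S)] = {mean_of_f:.2f}")
```

Here the "plug in the mean" value is near zero while the true expectation is materially positive, which is exactly the kind of gap a naive model misses.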
Overview of @RISK 5.0 and DecisionTools Suite 5.0
Sam will give a guided tour of the stunning new DecisionTools Suite 5.0 for Microsoft Excel. The DecisionTools Suite has been rewritten and expanded, adding new products to provide the most comprehensive set of quantitative tools available anywhere. All component products offer streamlined interfaces, robust reporting, new features and tighter integration with Excel and each other. The result is a powerful, cohesive package that is more than the sum of its parts. The Suite incorporates flagship @RISK 5.0 for risk analysis using Monte Carlo simulation, and it also adds new versions of PrecisionTree and TopRank. PrecisionTree provides decision analysis with decision trees, while TopRank performs fast, convenient "what-if" sensitivity analysis. Furthermore, the data analysis software NeuralTools and StatTools have been added. NeuralTools performs predictive analysis using neural networks, while StatTools provides time-series forecasting and a wide range of other statistical functions. Rounding out the new Suite is Evolver 5.0, the genetic algorithm optimization tool. Sam will touch on the most important new aspects of the DecisionTools Suite products, and answer any questions you may have.
From Geological Knowledge to Good Decisions
Knut Hollund
Industry Focus: Oil and Gas
Some of the steps in the decision process for an ongoing offshore field development are presented. The challenge was to turn relatively small geological structures, consisting of both unproven prospects and proven oil and gas reserves, into a field development with good economic performance. An offshore field development typically involves a huge investment, and it is no surprise that oil companies take several months to develop geophysical interpretations, geological models, flow simulation models and economic models in order to support decisions properly. This workflow may lead to one good estimate, but may fail to describe the upside potential and risk involved. A simple @RISK model was built specifically to support decisions in the early phases of the field development. The model includes a tool for displaying important upside potential seen in geological and geophysical evaluations, and a parametric model for the oil and gas production, resulting in a "fast model" that could be applied as part of an @RISK simulation handling all essential elements in the cash flow for various field development concepts. One of the challenges was to make a model that realistically accounted for the information we knew would appear before production start. This was solved using optimization inside the Monte Carlo simulation. The model made it possible to explore important upside potential seen in geophysical evaluations, and to answer whether, and in what sequence, further wells should be drilled. An improved understanding of the value of different drilling strategies was gained by studying distributions of in-place oil and gas volumes for various scenarios. The model would quickly give comparable economic figures for various development concepts. The upside potential and risk associated with the development could also be studied.
Another important property of the model was the ability to incorporate the results of the appraisal wells and narrow the uncertainty range in the modeling as soon as the information became available.
Economic Capital Modelling for Operational Risk
Dr Raj Nataraja
The Basel II Accord requires larger financial institutions to use the advanced measurement approach to derive their regulatory capital for operational risk. This capital is the reserve that banks should hold in addition to those for market and credit risk. The operational risk capital should take account of internal loss data, external loss data, risk and control assessment data, and scenarios. Further, the frequency of loss events and the severity of losses have to be modelled probabilistically. In addition, the capital should be computed across several business lines and loss event types, which may involve different levels of granularity and correlations. Integrating all these elements, together with some business-specific qualitative adjustments, in a way that is logical, consistent and believable is far from simple. With its extensive library of distribution functions, efficient Monte Carlo simulation engine, reporting facilities, GUI and interaction with VBA, @RISK for Excel provides a model development and validation platform that is flexible enough for research studies and robust enough for full-fledged business applications. The present research and consultancy application, originally developed in @RISK 4.5, has been migrated to @RISK 5.0 to take advantage of several of the new functionalities, including those specific to the financial industry and those for visualising and validating models. The Economic Capital Model for a generic case study will be demonstrated as part of the presentation, illustrating that @RISK 5.0 is a development tool of choice for probabilistic modelling of financial applications.
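The frequency/severity modelling described above can be sketched in miniature (a minimal Python illustration with an assumed Poisson frequency and lognormal severity; real AMA models are far richer and use institution-specific data):

```python
import math
import random

random.seed(42)

def poisson(lam):
    # Knuth's algorithm for a Poisson draw (adequate for small lambda).
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

def annual_loss(freq_lambda=3.0, sev_mu=10.0, sev_sigma=1.5):
    # Compound loss: a Poisson number of events, each with lognormal severity.
    n = poisson(freq_lambda)
    return sum(random.lognormvariate(sev_mu, sev_sigma) for _ in range(n))

losses = sorted(annual_loss() for _ in range(20_000))
mean_loss = sum(losses) / len(losses)
var_999 = losses[int(0.999 * len(losses))]   # 99.9th percentile of aggregate loss
print(f"expected annual loss: {mean_loss:,.0f}")
print(f"99.9% capital quantile: {var_999:,.0f}")
```

The large gap between the mean and the 99.9th percentile is the reason operational risk capital is driven by the tail, not the average.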
Brand Risk – Towards a Meeting of Minds
David Abrahams
Brands are now widely acknowledged as vital contributors to value creation. Yet if their form and function are to be adequately reflected in models of business risk, brands need to be understood as more than trademarks and 'reputations'. Decision analysts can usefully apply a more precise anatomy of the brand in their capacity as advisers and facilitators. Meanwhile, there is scope for marketing professionals to develop their familiarity with decision trees and Monte Carlo simulation, both in brand planning and in their advocacy of particular policies. This presentation will consider the reasons for an increasing interest in structured assessments of brand risk. It will offer evaluative frameworks designed to establish common ground between decision analysts and marketing decision-makers. It will explore the role of intermediate methods in promoting the value of stochastic modelling to non-specialists who hold marketing responsibilities.
Using @RISK for Traffic Forecast Analysis
Inês Teles and Fátima Santos
The risk analysis of road infrastructure traffic demand studies is traditionally supported by estimated demand values under three scenarios – pessimistic, reference, and optimistic – allowing only a very limited, deterministic analysis of financing risk. In this presentation, the approach developed draws on the @RISK software and a traffic assignment model (supported by the PTV VISUM software), which makes it possible to set up financing risk analysis in stochastic terms and to benefit from the results coming from @RISK. In this way, we improve the quality of decision-making by representing traffic demand results as a probability distribution histogram, which allows us to determine the confidence interval for the mean value of traffic demand (or revenue) and to identify the most influential input variables for traffic demand through tornado graph interpretation.
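As a rough illustration of the kind of output described, here is a minimal Python sketch computing a confidence interval for mean demand from simulated draws (the demand distribution below is an invented stand-in, not from the study):

```python
import math
import random
import statistics

random.seed(7)

# Hypothetical simulated annual average daily traffic (vehicles/day) from a
# stochastic assignment model; lognormal draws stand in for model output.
demand = [random.lognormvariate(9.6, 0.25) for _ in range(5_000)]

mean = statistics.fmean(demand)
stderr = statistics.stdev(demand) / math.sqrt(len(demand))

# 95% confidence interval for the mean traffic demand.
lo, hi = mean - 1.96 * stderr, mean + 1.96 * stderr
print(f"mean demand: {mean:,.0f} vehicles/day, 95% CI [{lo:,.0f}, {hi:,.0f}]")
```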
Post Denmark Uses @RISK to Reduce Insurance Premiums
Frank Lyhne Hansen
In order to proactively manage the risk in its everyday business, Post Denmark held a risk mapping workshop following its ownership change in 2005. The event, attended by the risk officers from each of the organisation's business units, identified the risks facing Post Denmark; rated those risks in terms of the likelihood of their occurring, the severity of the consequences should they arise, and the degree of control available to prevent them taking place; and then outlined activities to reduce their impact, should they occur. Post Denmark pinpointed risks for which there was a direct correlation to its insurance premium, such as anything connected with its property and automotive fleet, as well as industrial accidents, business liability, post office robberies, theft (both internal and external), and fraud. The organisation also identified the risk of competition from private couriers, and the risk of a sharp downturn in business-to-business mail as a result of the increased use of digital communication.
@RISK gives data value
@RISK predicts chance of accidents
@RISK's accuracy reduces insurance premiums Mester explains: “Previously, we were not comfortable taking on more risk as an organisation because we could not be certain how likely it was that a potentially costly incident would occur. We therefore always had to insure against the worst-case scenario. That has now changed – the combination of our data and @RISK's technology has given us the confidence to predict exactly what risks we face, as a result of which we do not need to pay so much for our insurance.”
Successfully Valuing Storage: A Real Options Approach to the Valuation of Real Assets
Dr Jennifer I Considine
This presentation is concerned with the role of strategic planning in the optimisation of real assets in the energy industry. Specifically, it presents a real options approach using @RISK Monte Carlo simulation methods as part of real options valuation strategies. The approach is used to evaluate and value storage facilities in the natural gas industry. Beginning with an overview of standard storage valuation techniques, the paper moves to a real options approach and an in-depth look at dynamic hedging strategies, such as back-to-back call and put options.
What’s In a Decision?
Sven Roden This is an interactive group decision-making game to show the fundamental principles of decision-making and decision analysis. This exercise will illustrate how seemingly simple decisions can become complex. A simple analysis of the problem will show how to make the best decisions in the face of uncertainty whilst gaining commitment to action. This is a fun session, but involves a real investment decision. You may be advised to bring some money!
Risk Aggregation: Calculating the Cash Flow @ Risk from Sub-Segment to Segment Level at ArcelorMittal
Douglas Cardoso
The presentation shows how to use a bottom-up risk assessment approach to obtain the Group's top risks. Based on data collected from 25 sub-segments via a Risk Reporting Bundle, the risks were aggregated into 6 segments using @RISK to calculate the total Cash Flow at Risk. In a second step, these risks were aggregated to obtain the Top 10 Risks at Group level. Probability distributions were used for the financial impact (I) and the likelihood (P) of each risk, and the final Cash Flow at Risk was the sum of the different I × P contributions. For each type of risk (risk domain), a Latin Hypercube simulation was run to obtain the range of possible cash flow at risk in MUSD.
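The aggregation step can be sketched as follows (a hypothetical four-risk register in Python, with a simple one-dimensional Latin Hypercube sampler; the real model's figures and structure differ):

```python
import random

random.seed(1)

def latin_hypercube_uniform(n):
    # One Latin Hypercube sample of size n on (0, 1): one draw per stratum, shuffled.
    points = [(i + random.random()) / n for i in range(n)]
    random.shuffle(points)
    return points

# Hypothetical risk register: (impact in MUSD if it occurs, annual probability).
risks = [(120.0, 0.10), (45.0, 0.30), (200.0, 0.05), (15.0, 0.60)]

n = 10_000
totals = [0.0] * n
for impact, prob in risks:
    u = latin_hypercube_uniform(n)      # stratified uniforms for this risk
    for i in range(n):
        if u[i] < prob:                 # the risk materialises in iteration i
            totals[i] += impact

totals.sort()
expected = sum(totals) / n              # ~ the sum of I x P across the register
cfar_95 = totals[int(0.95 * n)]        # 95th-percentile cash flow at risk
print(f"expected loss: {expected:.1f} MUSD, 95% CFaR: {cfar_95:.1f} MUSD")
```

Latin Hypercube sampling keeps each input's marginal distribution almost exactly right even at modest iteration counts, which is why the expected loss here lands very close to the analytic sum of I × P.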
So You Think You Have the Right Data?
Andrea Dickens Have you often wondered how good your data is? Collecting data about future uncertainties from experts has a number of hidden traps. In this interactive session we will make you aware of some of the most common sources of bias, and suggest ways to overcome them.
Delivering Client Value through Uncertainty Management
Tim Wells BSc
Adding client value increasingly means more than simply 'providing the deliverable'. In the face of significant uncertainty, clients need to be informed about the robustness of future investment decisions. The application of @RISK to develop project risk budgets at contract stage, together with ongoing review and monitoring, allows project partners to be aware of their ongoing risk exposure and informs project management decisions. The following paper explores the application of @RISK in a project risk management context, and also introduces additional applications where probabilistic analysis has helped clients appreciate the role and importance of uncertainty in decision making.
Using Risk Analysis, Aided by @RISK, On a Water Supply System to Evaluate an Energy Cost Saving Project, at Águas do Douro e Paiva SA
Jaime Gabriel Silva
This presentation is a risk analysis case study of an energy cost reduction project, especially useful for companies managing a water treatment and supply system. It highlights the advantages and difficulties observed in developing the model, as well as the variability in its parameters, with a view to the exchange of information and practices between different companies. The risk analysis helped to sort out the inputs in order to improve cost estimates prior to the project's kick-off. AdDP faces increasing energy costs and must manage their impact on its activity. The company started several actions to minimise costs, mainly oriented towards improving pump efficiency and optimising pumping stations' operating hours. While the first group of measures, focused on the system's management and energy efficiency, was being developed, the largest electricity consumption points were evaluated and a second group of cost-saving actions was identified. These actions involved tariff reductions at the largest pumping stations and pump efficiency corrections at the smaller ones. AdDP's Engineering Department decided to develop a project evaluation aided by decision analysis tools. A model taking into account all the important input variables of the real problem was built using @RISK. In fact, the significant number of variables with a high level of uncertainty, and the author's previous experience with @RISK, led him to prefer an immediate approach based on a simulation model, with the main inputs described by continuous probability distributions based on the data available for each variable. This @RISK model was then used to evaluate the investment's viability within the enterprise's concession lifetime. The analysis presented here is supported by cost estimates, both for the construction and for the maintenance of the new infrastructure, as well as by estimates of the future evolution of electricity tariffs and supplied volumes.
The decision analysis is based on a discounted cash-flow model that estimates costs and benefits through stochastic simulation, allowing the analyst to estimate the project's Net Present Value and to approximate its probability distribution. Águas do Douro e Paiva, SA (AdDP) is a water supply company, serving the Oporto region for 30 years, and is now beginning a wastewater system on a similar basis.
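The discounted cash-flow simulation described above can be sketched in a few lines (a Python stand-in with invented triangular inputs; the actual model's variables, data and discount rate differ):

```python
import random
import statistics

random.seed(3)

def npv(cashflows, rate=0.08):
    # Discounted cash flow: a year-0 investment followed by yearly net benefits.
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def one_trial():
    # Hypothetical inputs: uncertain capex and uncertain annual energy savings.
    capex = random.triangular(800_000, 1_200_000, 950_000)
    annual_saving = random.triangular(90_000, 180_000, 130_000)
    return npv([-capex] + [annual_saving] * 20)

npvs = sorted(one_trial() for _ in range(10_000))
p_negative = sum(1 for v in npvs if v < 0) / len(npvs)
print(f"mean NPV: {statistics.fmean(npvs):,.0f}")
print(f"P(NPV < 0): {p_negative:.1%}")
```

The point of the stochastic approach is exactly this last line: instead of a single NPV figure, the decision-maker sees the probability that the project destroys value.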
Presenting Large Scale Forecast
David Edison
Industry Focus: Manufacturing/Tobacco
THE PROJECT: To produce a generic stochastic model making automated daily, monthly and annual sales forecasts for all tobacco products across a range of European regions, allowing users to apply subjectivity at any level to the forecasts being made. THE SYSTEM: An Excel based model using both @RISK and StatTools, deemed to be the optimal solution following a pilot study. The core model involves a regression analysis of sales volumes against a range of historic variables, followed by stochastic sampling of future values of those variables, resultant sales and error terms, as well as incorporating subjective user assumptions. The model automatically sources its 17,000 separate sets of data (product / geography combinations) from 'cubes' and the data warehouse, and forecast statistics are returned to cubes for ease of use. Cubes containing forecast data at every percentile of confidence hold approximately 100 million data points but are extremely quick and simple to use and interrogate. THE RESULTS AND BENEFIT: The client has a proven core generic forecasting model which can be used for ad hoc modeling and experimentation, but which also sits at the heart of a fully automated and integrated forecasting system. The whole system runs automatically on a monthly basis, and the user has the ability to make any subjective adjustments within the system in a simple way at any product level, at any geographical level, for any time period and at any confidence level. The end results of the automated forecasts are pre-defined summary exhibits, along with Business Intelligence 'briefing books' - a powerful way of allowing users to have access to all forecast statistics in pre-defined but flexible views, where the click of the mouse allows the same view to be seen for a different dataset, or broken down or analyzed in a different way, instantaneously.
No more need for a hundred spreadsheets, and no more bulging ring binders!
Maximising Net Present Value of Investment
Ujjwal Bharadwaj
The presentation describes and demonstrates a methodology to maximise the financial benefit of investment in asset maintenance. The risk-based methodology is demonstrated using a spreadsheet model that uses @RISK for Monte Carlo simulation (MCS). Infrastructure managers are under increasing pressure to minimise life cycle costs whilst maintaining reliability or availability targets and operating within safety and environmental regulation. This paper presents a risk-based decision-making methodology for undertaking run-repair-replace decisions with the ultimate aim of maximising the Net Present Value (NPV) of the investment in such maintenance. The methodology is based on established engineering, financial and statistical techniques that are in practice in power plant management. In this paper, the infrastructure system under consideration is assumed to consist of a number of structural components. To demonstrate the approach, a qualitative risk analysis is first conducted to highlight those components that are 'high risk'. This enables operators to get an overview risk profile of their system and thereby focus resources on the riskier system components (structures). A quantitative risk analysis is then performed on each of these risky structures. For simplicity, corrosion has been considered as the main damage mechanism affecting the structures. A basic probabilistic model using the MCS technique is developed to obtain remaining life (RL) estimates of the structure under consideration. The RL estimates are then fed into another model – the Cost Risk Optimization (CRO) model – that weighs the risk of the structure being out of service (measured in monetary units as the product of the probability of being out of service and the cost of its consequences) against the cost of risk mitigation by undertaking some action – repair or replacement.
Using this risk-based approach, the CRO model gives the optimum time of action such that financial benefit is maximized. NPV is used to assess the value of an action so that the time value of money is taken into account. In addition, the model can take into account tax credits accruing from the depreciation of the structure (if applicable).
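The optimum-time-of-action idea can be illustrated with a toy cost-risk trade-off (all numbers below are assumptions for the sketch, not the paper's; replacement is assumed to happen at the chosen year regardless of earlier failure, a deliberate simplification):

```python
# Sketch of a cost-risk optimisation: choose the replacement year that minimises
# expected discounted cost = PV(replacement) + PV(expected outage risk).

RATE = 0.07
REPLACE_COST = 500_000.0
OUTAGE_COST = 2_000_000.0     # consequence cost if the component fails in service

def failure_prob(year):
    # Hypothetical corrosion-driven annual failure probability, rising with age.
    return min(0.005 * 1.5 ** year, 1.0)

def expected_cost(action_year):
    cost = REPLACE_COST / (1.0 + RATE) ** action_year
    survival = 1.0
    for t in range(action_year):          # risk is carried only until action
        p_fail = survival * failure_prob(t)
        cost += p_fail * OUTAGE_COST / (1.0 + RATE) ** t
        survival *= 1.0 - failure_prob(t)
    return cost

costs = {year: expected_cost(year) for year in range(1, 15)}
best_year = min(costs, key=costs.get)
print(f"optimal action year: {best_year}, expected cost {costs[best_year]:,.0f}")
```

Acting too early wastes the time value of money on the replacement; acting too late carries escalating outage risk. The optimum sits where the marginal risk of one more year of deferral matches the marginal discounting benefit.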
Optimising Procurement of a Geothermal Power Plant
Viktor Thorisson
This paper deals with determining the optimal decision path for the procurement of a geothermal power plant. The aim is to determine which of the following has the highest expected profitability: ordering components on the critical path (the genset) before first drilling, after the first successful drilling, or after the second successful drilling. Basic geological estimates of the probability of success (temperature and production) are fitted to distributions, and simulation is performed to estimate the probability of failure (defined as falling below the minimum required return on equity), as well as to determine expected values in case of success. These results are used as inputs to a decision tree with expected trade-offs such as extra investment cost and expected lost production due to lower efficiency when components are pre-ordered. The result of this analysis – that the components should be ordered before first drilling – is quite robust: all input variables need to change dramatically to affect the optimal decision path. If production can start sooner, adding income early in the cash flow, this dramatically outweighs all the risk and expected cost due to pre-ordering of the turbine analysed herein. The Newsboy method is used to determine the optimal plant size if components are pre-ordered.
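The Newsboy (newsvendor) sizing rule mentioned at the end can be sketched as follows (the costs and the capacity distribution are illustrative assumptions, not the paper's figures):

```python
from statistics import NormalDist

# Newsvendor critical-fractile rule: order up to the demand/capacity quantile
# F^{-1}(cu / (cu + co)), where cu is the unit cost of being short and co the
# unit cost of being over.

cu = 3.0e6   # assumed profit lost per MW of capacity short (NPV terms)
co = 1.0e6   # assumed cost per MW of idle capacity

critical_fractile = cu / (cu + co)            # = 0.75

# Assumed geothermal resource capacity distribution (MW), e.g. from well-test fits.
capacity = NormalDist(mu=90.0, sigma=20.0)
optimal_size = capacity.inv_cdf(critical_fractile)
print(f"critical fractile: {critical_fractile:.2f}")
print(f"optimal pre-ordered plant size: {optimal_size:.1f} MW")
```

Because under-capacity is assumed three times as costly as over-capacity here, the rule sizes the plant above the median of the resource distribution.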
Artificial Neural Networks in Pharmacy and Medicine
Dr Loai Saadah
Purpose: Palivizumab™ is the first humanized monoclonal antibody used to prevent an infectious disease, namely Respiratory Syncytial Virus (RSV). It is used in premature infants, born at less than 35 weeks' gestation, to reduce hospital admission and to alleviate severe symptoms requiring intensive medical therapy. However, there is little evidence of overall benefit in the neonatal intensive care setting during an RSV outbreak. In this study, we used artificial neural networks to build models that can identify babies who might benefit from Palivizumab™ in this setting. Methods: We retrieved and documented demographics as well as other relevant clinical information for patients from four different RSV seasons in our neonatal intensive care unit. We included a total of fourteen input nodes: seven categorical (gender, Palivizumab group, whether Palivizumab was given at any time during the outbreak, season, congenital heart disease (CHD), chronic lung disease (CLD), and RSV) and seven numerical (gestational age, birth weight, age at the start of the season, Apgar scores at 1, 5, and 10 minutes, and length of stay before the index case was identified). We used NeuralTools v1.0.1 (Palisade, UK) to train three probabilistic nets (three outcome variables: days of supplemental oxygen, length of stay after the index case was identified, and survival) in batch, with a preset error limit of 0.01% and a self-generated learning rate. For the survival outcome, we used the prevalence of mortality in our patients, together with the model sensitivity and specificity, to calculate positive and negative predictive values. We also conducted a treatment-group reassignment analysis to study the effect of Palivizumab use on the three outcomes investigated. Results: Information from a total of 177 (39 Palivizumab, 138 control) patients was used in the model.
Twenty-one cases, seven for each outcome variable, were used for prediction. Of the remaining cases in each model, eighty percent were always used for training, and twenty percent were used for validation. All nets converged in less than 4 seconds and 400 training cycles. Palivizumab did not improve survival in this model. When all the prediction cases were counter-assigned to the alternative group, patients assigned to Palivizumab stayed either fewer or more days on supplemental oxygen (−0.5 to 8.0 days). Similarly, hospital stay was either shortened or prolonged with Palivizumab (−22.0 to 1.0 days). Conclusion: Although Palivizumab did not improve survival in our model, it does seem to offer a significant advantage, i.e. a reduction in length of hospital stay, in selected premature infants during an RSV nosocomial outbreak. Identification of these patients will most likely need a combination of complex artificial intelligence modeling techniques and competent clinical judgment. We suggest that trained artificial neural networks can significantly improve the cost-effective use of Palivizumab in this setting.
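The predictive-value arithmetic mentioned in the Methods section is standard Bayes' rule; here is a sketch with invented sensitivity, specificity and prevalence values (not the study's figures):

```python
# Predictive values from sensitivity, specificity and prevalence.

def predictive_values(sens, spec, prevalence):
    tp = sens * prevalence                    # true positives (per unit population)
    fp = (1.0 - spec) * (1.0 - prevalence)    # false positives
    tn = spec * (1.0 - prevalence)            # true negatives
    fn = (1.0 - sens) * prevalence            # false negatives
    ppv = tp / (tp + fp)   # P(outcome | predicted outcome)
    npv = tn / (tn + fn)   # P(no outcome | predicted no outcome)
    return ppv, npv

# Illustrative values only: 85% sensitivity, 90% specificity, 10% prevalence.
ppv, npv = predictive_values(sens=0.85, spec=0.90, prevalence=0.10)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")
```

Note how strongly the PPV depends on prevalence: with a rare outcome, even a net with good sensitivity and specificity yields many false positives.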
A Generalised Model for Valuing Early Stage Technology
Michael Brand
Investment decisions in R&D, early stage technology companies, and the licensing or sale of technology require a quantitative value to be assigned to the technology. This is usually a difficult management problem: the uncertainty surrounding the development process and subsequent product sales makes it hard to assign a value to the technology at the early stages of its development. The problem is particularly acute for biopharmaceutical products, which may cost up to $1 billion to develop, while fewer than 1 in 5,000 candidate drugs reach the market. This presentation will describe a generalised model for valuing early stage technology using @RISK. By combining two @RISK features in one model, an expected Net Present Value can be calculated. Overvaluing technology at the early stages of development is a frequent problem, particularly with university spin-out technology, and can result in valuation problems for subsequent, later rounds of investment. The model presented serves as an effective communication tool, demonstrating the impact of uncertainty on technology value at the early stages of development.
Design and Evaluation of a New Revenue Insurance Product for Strawberry Producers in Huelva, Spain
Salomon Aguado-Manzanares
Spain is the world's third largest producer and a major exporter of strawberries. Production is primarily concentrated within a single zone, which has made Huelva the most important strawberry-producing region in the world. At present, risk-management instruments in this sector do not offer strawberry farmers an adequate degree of satisfaction, and they are demanding an insurance policy that will offer them full cover for the risks that they run. The objective of the study presented here was to develop a global income insurance product that gathers together all the risks borne by the strawberry sector, responding sufficiently to the accidents and other sources of loss that affect this industry. The insurance product, whose design is based on information supplied by companies responsible for 44 percent of production in the province, copies the income model of each individual company and, by means of a Monte Carlo simulation, determines the pure premium level appropriate for different insurance strategies.
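The pure-premium calculation by Monte Carlo can be sketched as follows (the yield and price distributions below are invented for illustration; the study's actual distributions are built from the participating companies' data):

```python
import random
import statistics

random.seed(11)

# Revenue-insurance pure premium: the expected indemnity
# E[max(guaranteed income - realised income, 0)].

GUARANTEE = 0.70   # cover 70% of expected income (assumed coverage level)

def season_income():
    harvest = random.triangular(20.0, 60.0, 45.0)   # t/ha (assumed)
    price = random.lognormvariate(0.0, 0.3)         # EUR/kg (assumed)
    return harvest * 1000.0 * price                 # EUR/ha

expected_income = statistics.fmean(season_income() for _ in range(50_000))
guaranteed = GUARANTEE * expected_income

indemnities = [max(guaranteed - season_income(), 0.0) for _ in range(50_000)]
pure_premium = statistics.fmean(indemnities)
print(f"guaranteed income: {guaranteed:,.0f} EUR/ha")
print(f"pure premium: {pure_premium:,.0f} EUR/ha "
      f"({pure_premium / expected_income:.1%} of expected income)")
```

Running the same simulation at different coverage levels gives the premium schedule for the different insurance strategies the abstract mentions.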
Introduction to the DecisionTools Suite 5.0
This session will show you how to use the elements of the new DecisionTools Suite 5.0 as a comprehensive risk analysis, optimisation, and statistical analysis toolkit. Each of the products in the suite – @RISK, RISKOptimizer, Evolver, PrecisionTree, TopRank, StatTools, and NeuralTools – will be presented, showing how they can be used to solve practical problems in the real world.
Introduction to NeuralTools and StatTools
In this session we will learn how to use Palisade's two data analysis tools: NeuralTools and StatTools. NeuralTools imitates brain functions in order to "learn" the structure of your data. Once NeuralTools understands the data, it can take new inputs and make intelligent predictions. The new predictions are based on the patterns in known data, and offer uncanny accuracy. NeuralTools can automatically update predictions when input data changes, and it can even be combined with Palisade's Evolver or Excel's Solver to optimize tough decisions and achieve desired goals. StatTools is a Microsoft Excel statistics add-in. This session will cover how to perform the most common statistical tests, and will include topics such as: Statistical Inference, Forecasting, Data Management, Summary Analyses, and Regression Analysis.
Selecting the Right Distribution
This session covers the choice of the appropriate distribution in @RISK. A variety of approaches are presented and compared, including pragmatic, theoretical and data-driven methods. The use of distributions to treat a variety of risk modelling situations is discussed.
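One data-driven method of the kind compared in this session is maximum-likelihood fitting with an information criterion; here is a minimal Python sketch (the candidate set and the sample data are invented for illustration):

```python
import math
import random
import statistics

random.seed(5)

# Choose between candidate distributions by fitting each with maximum
# likelihood and comparing AIC = 2k - 2 log L (lower is better).
# Illustrative data: positively skewed "loss" observations.
data = [random.lognormvariate(1.0, 0.8) for _ in range(500)]

def aic_normal(xs):
    mu, sd = statistics.fmean(xs), statistics.pstdev(xs)
    ll = sum(-0.5 * math.log(2 * math.pi * sd ** 2)
             - (x - mu) ** 2 / (2 * sd ** 2) for x in xs)
    return 2 * 2 - 2 * ll          # k = 2 parameters

def aic_lognormal(xs):
    logs = [math.log(x) for x in xs]
    mu, sd = statistics.fmean(logs), statistics.pstdev(logs)
    ll = sum(-math.log(x) - 0.5 * math.log(2 * math.pi * sd ** 2)
             - (math.log(x) - mu) ** 2 / (2 * sd ** 2) for x in xs)
    return 2 * 2 - 2 * ll          # k = 2 parameters

print(f"AIC normal:    {aic_normal(data):.1f}")
print(f"AIC lognormal: {aic_lognormal(data):.1f}")
```

For this skewed sample the lognormal fit wins decisively, which is the pattern the data-driven approach is designed to surface.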
Oil and Gas Applications using @RISK and PrecisionTree
This session discusses basic oil and gas modelling applications using @RISK and PrecisionTree. Models in the areas of reserves estimation, portfolio modelling, production-decline modelling and drill testing will be covered.
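Production-decline modelling of the kind covered here often starts from an exponential decline curve; a minimal deterministic sketch with assumed parameters (in a full @RISK model these inputs would be distributions):

```python
import math

# Exponential decline: q(t) = qi * exp(-D t). Cumulative recovery to the
# economic limit is (qi - q_limit) / D, with rates converted to per-year.

qi = 1_000.0        # initial rate, bbl/day (assumed)
D = 0.15            # nominal annual decline rate (assumed)
q_limit = 50.0      # economic limit, bbl/day (assumed)

# Time to reach the economic limit: q_limit = qi * exp(-D * t)
t_limit = math.log(qi / q_limit) / D

# Cumulative production (barrels) over that period.
eur = (qi - q_limit) * 365.0 / D
print(f"economic life: {t_limit:.1f} years")
print(f"EUR to economic limit: {eur:,.0f} bbl")
```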
Introduction to Risk Analysis with @RISK 5.0
This hands-on introduction will briefly recap the main benefits and uses of risk analysis before walking you through key new features in @RISK 5.0. You will experience the all-new interface as you define distributions, compare distributions using overlays, fit distributions to data, and correlate input distributions. Review and edit your entire model in the new @RISK Model window. Swap distributions out for non-@RISK users using the new Function Swap feature, edit your model, and swap them back in again. Simulate in the new Demo Mode and watch all charts, thumbnails, and reports update in real time. View results using the new graphing engine, Scatter Plots, and Tornado Regression – Mapped Value charts. There is so much to see that we'll cover as much as time permits.
PrecisionTree 5.0 as part of DecisionTools Suite 5.0
This presentation combines an introduction to the enhanced user interface, tighter Excel integration, and new features of PrecisionTree 5.0, with demonstrations of how PrecisionTree can be used to analyse various problems in decision analysis.
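The expected-value rollback that PrecisionTree automates can be shown in miniature (an invented drill / walk-away example; real trees add utility functions, multi-stage chance nodes and sensitivity analysis):

```python
# Minimal decision-tree rollback: chance nodes take probability-weighted
# expected values, decision nodes take the best branch.

def chance(branches):
    # branches: list of (probability, value) pairs
    return sum(p * v for p, v in branches)

def decision(options):
    # options: dict of name -> expected value; pick the maximum
    best = max(options, key=options.get)
    return best, options[best]

drill_ev = chance([(0.25, 500.0),    # success: NPV +500 (illustrative units)
                   (0.75, -120.0)])  # dry hole: lose the drilling cost
best, ev = decision({"drill": drill_ev, "walk away": 0.0})
print(f"EV(drill) = {drill_ev:.1f}, optimal policy: {best} (EV {ev:.1f})")
```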
Palisade's developers have seen @RISK and other Palisade tools applied in dozens of different industries. Join us for this exciting roundtable discussion, where you'll be able to provide feedback about Palisade tools and seek advice for your particular modeling issues. So bring your spreadsheet models and your software wish list while you get to know the people behind @RISK, the DecisionTools Suite, and more.
Getting More from @RISK for Free
Manuel Carmona
Problem: Many customers simply don't know about the many places where they can get answers to their most common @RISK questions, as well as good reference models and samples to look at.
Real Options Modelling with @RISK and PrecisionTree
Dr Michael Rees
This session introduces the topic of real options modelling as an extension of risk modelling. The link to general decision-making under uncertainty and to financial market options is also discussed. A variety of examples using @RISK and PrecisionTree are presented.
Ian Wallace
The aim of this seminar is to give people a basic understanding of how @RISK for Microsoft Project works, including hands-on experience of setting up and running simulations and interpreting the results. Attendees will learn about the key functionality within @RISK for Project in a step-by-step manner, enabling them to quickly become familiar with basic concepts and terminology. In addition to graphing and quantifying the risk in a business plan, you will learn what @RISK for Project, using Monte Carlo simulation, enables you to do.