» Download a zip file of the 2009 NYC presentations
Think Clearly, Act Decisively, Feel Confident Sven Roden Applying the principles and best practice of decision analysis allows the Decision Analysis Group at Unilever to analyze decisions that are complex in nature and carry a high degree of uncertainty. It is not just about being good at probabilistic analysis; it is also about making sure the company is addressing the right problem and gaining commitment to action. In this presentation, Sven Roden will discuss what Decision Making Under Uncertainty means to Unilever and how Decision Analysis techniques are at the forefront of making a cultural change to the way Unilever approaches and analyzes strategic decisions. Also discussed will be how Unilever’s relationship with Palisade has helped them in their journey, and how they constantly look internally and externally to identify future trends and applications and evolve their tools and models to stay at the forefront of applying Decision Analysis.
Palisade and Trends in Risk Management Sam McLafferty Palisade products @RISK and the DecisionTools Suite are used across a variety of industry sectors. As such, Palisade sees multiple perspectives on risk management. Trends are emerging that emphasize risk management as an enterprise discipline rather than just a localized fire-fighting technique. Sam will briefly review common threads in risk management that Palisade sees developing across various sectors such as finance, insurance, healthcare, and energy. He will also discuss the latest release of @RISK and the DecisionTools Suite for Excel, new version 5.5. @RISK 5.5 is available in five new languages and brings a number of important new features that improve usability, save time, and enhance Monte Carlo simulation analyses. Sam will discuss how @RISK is meeting the growing demand for risk management by making quantitative risk analysis more accessible than ever.
Accelerating Product Design with Andy Sleeper » Download the presentation The key to successful new product development is to anticipate and prevent problems before they happen. Design For Six Sigma (DFSS) is a system of risk-prevention tools used by world-class companies to launch new products of the highest quality, in the least time, and at the lowest cost. Simulation and optimization tools are among the most powerful DFSS tools, allowing engineers to prevent performance and capability problems before the first prototypes are built. What once required months or years to discover now takes only minutes to prevent.
Capitalizing Upon Market Inequities: Clayton Graham Sports wagering brings two separate “markets” together. The first is the production market, or the game itself. The second is the wagering or betting market. As a matter of practicality, the wagering market is itself in balance; that is, bet clearing is covered through the process of adjusting the cost-payout ratio (the line). Betting lines are translated into an expected probability of winning. This resultant probability is frequently inconsistent with the probability of the team actually winning. Hence, the opportunity to capitalize upon the dichotomy between the production and wagering markets will be detailed and quantified. The presentation will include:
Principal Palisade software utilized includes: StatTools, @RISK and Evolver. The presentation will have a heavy graphic and visualization emphasis. Theoretical statistics will be tightly tied with pragmatic realities of game modeling and economically based decision making. Specific quantification will consist of:
Examples of actual results for current and past seasons along with predictions will be provided. In short, it’s “Card Counting” for sports!
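As a simple illustration of the basic mechanics, not Graham's actual model, the sketch below converts an American-style betting line into the bookmaker's implied win probability and compares it with a model estimate to flag a positive-expected-value wager. The line value and model probability are hypothetical.

```python
# Minimal sketch: implied probability from an American moneyline vs. a model estimate.
# The moneyline and model_prob values below are hypothetical.

def implied_probability(moneyline: float) -> float:
    """Convert an American moneyline into the bookmaker's implied win probability."""
    if moneyline < 0:                       # favorite, e.g. -150
        return -moneyline / (-moneyline + 100)
    return 100 / (moneyline + 100)          # underdog, e.g. +130

def expected_value(moneyline: float, model_prob: float, stake: float = 100.0) -> float:
    """Expected profit of a bet given our model's win probability."""
    payout = stake * (100 / -moneyline) if moneyline < 0 else stake * (moneyline / 100)
    return model_prob * payout - (1 - model_prob) * stake

line = -150                 # bookmaker's line (hypothetical)
model_prob = 0.68           # our model's win probability (hypothetical)

print(f"implied prob: {implied_probability(line):.3f}")
print(f"model prob:   {model_prob:.3f}")
print(f"EV per $100:  {expected_value(line, model_prob):+.2f}")
```

When the model probability exceeds the implied probability by enough to overcome the bookmaker's margin, the expected value turns positive, which is the dichotomy the presentation sets out to quantify.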
The Collapse of an Unsustainable Paradigm: Prosperity by Debt (What Happens Now?) Roy Nersesian There has always been pressure to free mankind from the constraints of a hard currency: the “thou shalt not crucify mankind on a cross of gold” syndrome. Adding layers of debt to cash income expands the purchasing power of individuals and enhances economic activity. But debt reduces cash income via financing costs. As long as debt stays ahead of rising financing costs, individuals can live today by borrowing against tomorrow. But what happens when the day of reckoning cuts off further debt accumulation? The speaker maintains that the purchasing power of individuals is, for all intents and purposes, permanently impaired. This has a direct bearing on living standards, job opportunities, and economic activity. It does not necessarily mean an economic depression, but it does imply that any economic recovery will be shallow. The economic doldrums will not go away until the overhanging debt is liquidated one way or the other. @RISK is featured in this financial analysis.
Executive Pay for Performance Using @RISK Marwaan Karame Traditional executive incentive plans often encourage mediocrity and in many cases lead to the destruction of shareholder value. We don't have to look far to find evidence of a company with a long-term decline in stock price and increasing executive pay packages. The issue with executive compensation is not the amount being paid to CEOs, but rather the measure of performance and the terms of the compensation plan. Our Value Based Management (VBM) incentive compensation plan is designed to align management and shareholder interests, such that managers are rewarded for creating long-term shareholder value. The structure of our incentive plan simulates ownership, similar to a management buyout; however, it has added advantages for both the manager and the shareholder. Our VBM plan uses @RISK's Monte Carlo simulation as part of our analysis package. In this case study we will see why traditional compensation is a formula for mediocrity. We will also show how to create a pay-for-performance compensation plan that maximizes long-term shareholder wealth by aligning management and shareholder interests. Lastly, we will demonstrate how to calibrate a pay-for-performance compensation plan using @RISK.
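A minimal sketch of the general calibration idea, assuming a simple geometric-Brownian model of firm value and a hypothetical payout equal to a share of value created above a cost-of-capital hurdle. The parameters and the payout rule are illustrative, not the VBM plan itself.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims, years = 10_000, 5
v0, drift, vol, hurdle = 1_000.0, 0.06, 0.25, 0.08   # firm value ($M) and assumed parameters
share = 0.02                                          # manager's share of excess value created

# Simulate terminal firm value and pay the manager a share of value above the hurdle.
shocks = rng.normal((drift - 0.5 * vol**2) * years, vol * np.sqrt(years), n_sims)
v_end = v0 * np.exp(shocks)
required = v0 * (1 + hurdle) ** years
payout = share * np.maximum(v_end - required, 0.0)

print(f"P(any payout)    : {np.mean(payout > 0):.1%}")
print(f"mean payout ($M) : {payout.mean():.2f}")
print(f"95th pct payout  : {np.percentile(payout, 95):.2f}")
```

Varying the share and hurdle until the simulated payout distribution matches the intended incentive is the kind of calibration exercise the abstract refers to.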
Integrated Project Risk Analysis Jay O’Connor When conducting project risk analysis, it is not uncommon for the qualitative risk, quantitative schedule, and quantitative cost risk analyses to be conducted separately and kept independent of each other. While some software packages attempt to integrate all three into one analysis, these efforts tend to fall short in one area or another. Turner & Townsend’s approach is to integrate the residual risks and opportunities, along with the results from the schedule risk analysis, into the cost risk analysis to develop a more fully integrated project risk analysis. The presentation will discuss our approach to risk analysis.
Interpretive and Ethical Issues in using Dr. Robert Ameo Simulations are proliferating throughout the business community, powered by a troop of freshly minted MBAs armed with their requisite course on decision sciences and their student versions of Crystal Ball or @RISK. Finance organizations are asking their analysts to “do a Monte Carlo”. Dutifully, the analysts select a handful of “key” variables, assign triangular or Pert distributions, set iterations to 1,000, and push the simulate button. The laptop’s screen displays a colorful histogram and a sensitivity analysis to add to the PowerPoint. Lo and behold, the simulation analysis supports the original scenario model, showing the mean or median simulated output to be just about in the middle of the distribution. Mission accomplished. Senior leadership is assured that the model has been tested by simulating 1,000 potential outcomes. Management moves forward in their pre-decided direction with confidence bolstered by a state-of-the-art Monte Carlo analysis. This scenario happens every day, and for many reasons it is very wrong. Using simulations to support executive decision-making introduces ethical concerns that are not present in “most likely case” scenario modeling. In this presentation, Bob Ameo discusses the ethical responsibilities of using simulation models to inform executive decision-making. Specific recommendations are made on how to appropriately conduct and present outcomes from simulation models.
Simulating the U.S. Economy: Dr. William Strauss There is an assumption that drives all of our expectations for how our economy will be in the future. That assumption is one of endless economic growth. Clearly endless exponential growth is impossible, yet that is what we base all of our expectations upon. We all agree that zero or negative economic growth is bad (just look around now at the effects of the Great Recession). But we also know logically that 2% or 4% annual growth every year leads to an exponential growth outcome that is unsustainable. To see where this growth imperative will take us, we first have to see how we got to where we are today. This work first models the 20th century. The model is both complex and simple. The basic schematic of the model’s relationships is easy to understand. Furthermore, the core of the model is a simple production function that combines capital, labor, and the useful work derived from energy to generate the output of the economy. Complexity is contained in the solutions to the internal workings of the model. What is unique is that there are no exogenous economic variables. Once the equations’ parameters are calibrated, setting the key outputs to “one” in 1900 results in their time paths very closely predicting U.S. GDP and its key components from 1900 to 2006. The experiment in this work is about the future. If the model can very closely replicate the last 100 years, what does it have to say about the next 100 years? From 1900 to 2006 there were periods in which parameter switching occurred. (The optimal parameters and the years for the switching were found using a constrained optimization technique.) That suggests that in the future there will also be changes. The experiment uses @RISK’s features to generate new combinations of parameters for each of tens of thousands of runs of the simulation. Changes in the parameters represent potential exogenous policy choices. The “doing what you did gets you what you got” scenario leads to a surprising and unsettling outcome. The experiments using @RISK do find a path that works. Obviously, if it is not “business as usual” that leads to a stable outcome, it is some other way. The policy choices that lead to a stable outcome suggest that the future of capitalism is not going to be what we expect it to be.
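The abstract does not give the model's exact equations, but the flavor of the experiment can be sketched: a Cobb-Douglas-style production function driven by capital, labor, and useful work from energy, with random draws over the parameters standing in for alternative policy paths. All parameter ranges below are assumptions for illustration, not Strauss's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(0)
years, n_runs = 100, 10_000

def simulate_path(alpha, beta, g_k, g_l, g_e):
    """Toy output index from capital K, labor L, and useful work E (all start at 1.0)."""
    t = np.arange(years)
    K, L, E = (1 + g_k) ** t, (1 + g_l) ** t, (1 + g_e) ** t
    return K**alpha * L**beta * E**(1 - alpha - beta)   # output index, 1.0 in year 0

# Random parameter combinations stand in for exogenous policy choices.
final_output = np.array([
    simulate_path(alpha=rng.uniform(0.2, 0.4),
                  beta=rng.uniform(0.3, 0.5),
                  g_k=rng.uniform(0.00, 0.04),
                  g_l=rng.uniform(0.00, 0.02),
                  g_e=rng.uniform(-0.01, 0.03))[-1]
    for _ in range(n_runs)
])

print(f"median output after {years} yrs : {np.median(final_output):.1f}x")
print(f"5th-95th percentile range       : {np.percentile(final_output, [5, 95]).round(1)}")
```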
Strategic Portfolio Decision Analysis Ahmad R. Saadat Pharmaceutical drug development is characterized by massive capital investments and equally significant risks. Recent statistics indicate a commercialization success rate of about 10 percent, with estimated combined costs of $1 billion for each product. However, the rewards of success are often products with billion-dollar sales, patents that limit competition, and high profit margins. The potential for such successful outcomes has attracted much capital to start-ups in this industry over the past few decades. The combination of high capital requirements, high failure rates, and high but rare and uncertain returns leaves a low margin of error on strategic decisions and requires a more quantitative approach to the decision-making process. While older, more established, and well-capitalized pharmaceutical companies can withstand the risks of failure, there are many start-up companies with limited resources that aspire to become the next successful pharmaceutical firm. Given the industry’s low rate of development success, one could argue that the investors and executives of these firms plan to beat the odds. For these start-ups, the business model is to raise cash through venture capital firms to get products from concept into the clinic, followed by additional financing through either public stock offerings or non-dilutive out-licensing or partnerships. More recently, as a result of the financial crisis, funding through venture capital and public offerings has significantly decreased, resulting in widespread company restructurings to preserve capital. Moreover, the financial distress of potential licensors has resulted in lower valuations and less lucrative deals. In this unforgiving environment, it is imperative to develop and execute a robust business strategy focused on risk management. Diversification of risk through a portfolio of products is a widely used strategy for all pharmaceutical companies. The portfolio comprises projects with varying levels of investment, probability of success, and commercial opportunity. However, optimizing the portfolio to ensure that it meets strategic objectives requires an often-ignored analysis of its components and their contribution to overall risk and value. This presentation proposes a method to analyze portfolio decisions using industry statistics, a hypothetical product portfolio, and simulation techniques that aim to optimize management decisions and answer many questions, including but not limited to: How many and what type of projects should be included to meet the required diversification profile? What are the impacts of various decisions on the future value and risk of the portfolio? What are the most significant events and the impact of changes in assumptions? What are the probabilities of extreme scenarios, such as breakeven and very large returns? This method can be used to support strategic investment decisions in an existing portfolio of pharmaceutical products (with or without existing cash flows), investing in a start-up, licensing a product, collaborations, and M&A activities.
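A stripped-down sketch of the kind of portfolio simulation described, assuming a handful of hypothetical projects each defined by a development cost, a probability of technical success, and a value if successful. The numbers are illustrative placeholders, not industry statistics.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sims = 50_000

# Hypothetical portfolio: (development cost $M, prob. of success, NPV if successful $M)
portfolio = [
    (150, 0.10, 2_500),
    (300, 0.15, 1_800),
    ( 80, 0.25,   600),
    ( 60, 0.30,   400),
]

npv = np.zeros(n_sims)
for cost, p_success, value in portfolio:
    success = rng.random(n_sims) < p_success
    # Value realized only on success; lognormal noise reflects commercial uncertainty.
    realized = value * rng.lognormal(mean=0.0, sigma=0.4, size=n_sims)
    npv += success * realized - cost

print(f"mean portfolio NPV ($M)  : {npv.mean():,.0f}")
print(f"P(portfolio loses money) : {np.mean(npv < 0):.1%}")
print(f"5th / 95th percentiles   : {np.percentile(npv, [5, 95]).round(0)}")
```

Re-running the simulation with different project mixes is one way to compare diversification profiles and the probabilities of the extreme scenarios the abstract asks about.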
So You Think You Have the Right Data? Have you often wondered how good your data is? Collecting data about future uncertainties from experts has a number of hidden traps. In this interactive session we will make you aware of some of the most common sources of bias, and suggest ways to overcome them.
A Stochastic Simulation Model for Dairy Business Investment Decisions Dr. Jeffrey Bewley A dynamic, stochastic, mechanistic simulation model of a dairy enterprise was developed to evaluate the cost and benefit streams coinciding with investments in Precision Dairy Farming technologies. The model was constructed to embody the biological and economic complexities of a dairy farm system within a partial budgeting framework. A primary objective was to establish a flexible, user-friendly, farm-specific, decision-making tool for dairy producers or their advisers and technology manufacturers. The basic deterministic model was created in Microsoft Excel. @RISK was employed to account for the stochastic nature of key variables within a Monte Carlo simulation. Net present value (NPV) was the primary metric used to assess the economic profitability of investments. The model comprised a series of modules, which synergistically provided the necessary inputs for profitability analysis. Estimates of biological relationships within the model were obtained from the literature in an attempt to represent an average or typical U.S. dairy. Technology benefits were appraised from the resulting impact on disease incidence, disease impact, and reproductive performance. The economic feasibility of investment in an automated body condition scoring (BCS) system was explored to demonstrate the utility of this model. An expert opinion survey was conducted to obtain estimates of potential improvements from adoption of this technology. Benefits were estimated through assessment of the impact of BCS on the incidences of ketosis, milk fever, and metritis; conception rate at first service; and energy efficiency. Improvements in reproductive performance had the greatest influence on revenues, followed by energy efficiency and disease reduction, in that order. Stochastic variables that had the most influence on NPV were: variable cost increases after technology adoption; the odds ratios for ketosis and milk fever incidence and conception rates at first service associated with varying BCS ranges; uncertainty of the impact of ketosis, milk fever, and metritis on days open, unrealized milk, veterinary costs, labor, and discarded milk; and the change in the percentage of cows with BCS at calving ≤ 3.25 before and after technology adoption. The deterministic inputs impacting NPV were herd size, management level, and level of milk production. Investment in this technology may be profitable, but results were very herd-specific. Investment decisions for Precision Dairy Farming technologies can be analyzed with input of herd-specific values using this model.
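The NPV calculation at the core of such a model can be sketched in a highly simplified form, assuming triangular distributions for the annual net benefit of the technology and a fixed investment cost. The figures below are placeholders, not the survey-based estimates used in the study.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims, horizon, rate = 20_000, 10, 0.08
investment = 40_000.0                                   # up-front technology cost (assumed)

# Annual net benefit (revenue gains minus added variable cost), triangular min/mode/max.
annual_benefit = rng.triangular(2_000, 7_000, 15_000, size=(n_sims, horizon))

discount = 1.0 / (1 + rate) ** np.arange(1, horizon + 1)
npv = annual_benefit @ discount - investment

print(f"mean NPV        : {npv.mean():,.0f}")
print(f"P(NPV > 0)      : {np.mean(npv > 0):.1%}")
print(f"10th percentile : {np.percentile(npv, 10):,.0f}")
```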
Use of @RISK for Forecasting Dr. Jose A. Briones Profitability projections in a manufacturing environment are directly tied to how the sales forecast fits with the capability of the operation. When a company has a large portfolio of products with very different operational production rates, the manufacturing capacity of the plant will be significantly impacted by the product mix to be produced. This in turn has a radical effect on the output of the plant and the allocation of the fixed cost of production. In this case study we present an example in which a company is trying to decide how best to balance the sales of certain families of products to maximize revenue, maintain a diverse product line, and properly price each individual product based on its impact on the manufacturing schedule and fixed cost allocation.
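To make the mechanics concrete, the sketch below simulates uncertain demand for a few product families with different production rates, then computes plant utilization and the fixed-cost allocation implied by each simulated mix. All rates, demands, and costs are hypothetical, not the case-study data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sims = 10_000
plant_hours, fixed_cost = 6_000.0, 2_000_000.0          # annual capacity and fixed cost (assumed)

# Product families: (units/hour, demand min, demand mode, demand max) -- hypothetical
families = {
    "A": (50.0, 40_000, 60_000, 90_000),
    "B": (20.0, 15_000, 25_000, 40_000),
    "C": ( 8.0,  4_000,  8_000, 12_000),
}

hours_used = np.zeros(n_sims)
for rate, lo, mode, hi in families.values():
    demand = rng.triangular(lo, mode, hi, n_sims)
    hours_used += demand / rate                          # slow products consume more capacity

utilization = hours_used / plant_hours
fixed_cost_per_hour = fixed_cost / np.minimum(hours_used, plant_hours)

print(f"P(demand exceeds capacity)           : {np.mean(utilization > 1):.1%}")
print(f"median fixed cost per production hour: {np.median(fixed_cost_per_hour):,.0f}")
```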
Use of @RISK and RISKOptimizer to Mark N. Abramovich A variety of disease-spread models help quantify the potential effects of pharmaceutical and non-pharmaceutical interventions on pandemic influenza. But taken in isolation, such models are of limited utility when one considers risk, resource allocation, and the optimal selection of a portfolio of interventions, as well as the timing and implementation of such interventions in the face of uncertainty and random disease profile characteristics. The Panálysis™ system for pandemic management uses a spreadsheet-based model that incorporates operational modeling techniques such as Process Analysis, Monte Carlo Simulation, Optimization and Constraint-Based Resource Allocation to project the effects of pandemics on hospitals, regions and their associated populations. Metrics used to measure the effects of pandemics on a population include overall fatalities, case fatality rates, and key resource shortage levels. With Panálysis™, the effects of a wide variety of pharmaceutical and non-pharmaceutical interventions can be measured in isolation from one another and in tandem with each other. This makes Panálysis™ a valuable tool for national, regional and hospital decision makers who wish to craft an overall strategy or design specific tactics to plan for, mitigate or respond to a pandemic. It can also be used as a tool for command and control during an event. It is also useful for researchers and commercial biotech developers interested in quantifying the potential effects of a single therapy.
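As a rough illustration of the class of disease-spread models referred to, and not the Panálysis implementation, here is a discrete-time SIR simulation in which a non-pharmaceutical intervention reduces transmission once prevalence passes a trigger threshold. All epidemiological parameters are assumptions.

```python
def sir_fatalities(r0=1.8, infectious_days=5.0, cfr=0.005, population=1_000_000,
                   intervention_effect=0.4, trigger_prevalence=0.01, days=365):
    """Discrete-time SIR; the intervention cuts transmission once prevalence passes a trigger."""
    beta, gamma = r0 / infectious_days, 1.0 / infectious_days
    s, i, r = population - 1.0, 1.0, 0.0
    peak_infected, total_infected = i, i
    for _ in range(days):
        b = beta * (1 - intervention_effect) if i / population > trigger_prevalence else beta
        new_inf = b * s * i / population
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak_infected = max(peak_infected, i)
        total_infected += new_inf
    return total_infected * cfr, peak_infected

for effect in (0.0, 0.4):
    deaths, peak = sir_fatalities(intervention_effect=effect)
    print(f"intervention effect {effect:.0%}: ~{deaths:,.0f} deaths, peak infected {peak:,.0f}")
```

Wrapping a model like this in Monte Carlo draws over the disease parameters, and optimizing over the intervention portfolio, is the kind of analysis the abstract describes.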
Using the DecisionTools Suite for Risk Assessment in Competitive Electricity Markets Dr. Rahul Walawalkar The electricity industry has experienced a paradigm shift over the past decade with the emergence of competitive electricity markets. Traditionally, under the vertically integrated, regulated energy industry structure, all the risks of investment decisions were passed on to consumers, with a guaranteed rate of recovery assured for the utility companies. One of the main reasons for the move toward deregulation was to transfer these risks from consumers to market participants, who have better information and capability to mitigate potential risks. Competitive electricity markets now act as one of the most efficient ways for price discovery, and even bilateral contracts are settled against these markets in most regions. These markets provide mechanisms for valuing various attributes of electricity through a number of markets, such as energy, ancillary services, and capacity markets. They also provide various hedging mechanisms that market participants can use to mitigate their risk from price volatility. In recent years there has also been a focus on the environmental attributes of electricity, with various regions adopting Renewable Portfolio Standards (RPS) mandating that a certain percentage of electricity be produced from renewable sources. In competitive electricity markets, where the value of energy and ancillary services varies from hour to hour and depends on the location in the electricity grid, it is quite challenging to evaluate investment decisions for both conventional and emerging technologies. In this presentation we will discuss how the DecisionTools Suite can be used to understand the underlying risks and variability in the electricity markets in order to make sound investment decisions. We will look at examples using real market data for various investment decisions in areas such as energy storage, solar PV, wind, and conventional fossil fuel plants, as well as transmission improvements.
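As a toy illustration of how hour-to-hour price variability drives the value of a technology such as energy storage, the sketch below simulates daily peak and off-peak prices and values a simple charge-off-peak, discharge-on-peak strategy. Prices, efficiency, and sizes are assumptions, not market data.

```python
import numpy as np

rng = np.random.default_rng(11)
n_days, n_sims = 365, 5_000
capacity_mwh, efficiency = 10.0, 0.85        # storage size and round-trip efficiency (assumed)

# Lognormal daily prices ($/MWh): off-peak cheaper and less volatile than on-peak (assumed).
off_peak = rng.lognormal(np.log(30), 0.25, (n_sims, n_days))
on_peak  = rng.lognormal(np.log(70), 0.45, (n_sims, n_days))

# Charge off-peak, discharge on-peak only on days when the spread covers efficiency losses.
daily_margin = np.maximum(on_peak * efficiency - off_peak, 0.0) * capacity_mwh
annual_revenue = daily_margin.sum(axis=1)

print(f"mean annual arbitrage revenue : {annual_revenue.mean():,.0f}")
print(f"5th / 95th percentiles        : {np.percentile(annual_revenue, [5, 95]).round(0)}")
```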
Using the DecisionTools Suite to Perform Timothy J. Havranek The United Nations’ Brundtland Commission’s definition of sustainability is often cited: “meeting the needs of the present without compromising the ability of future generations to meet their needs” (Brundtland Commission, Our Common Future, Oxford University Press, 1987). Interest in sustainability is steadily increasing as individuals, corporations, and governments consider issues such as global warming, the collapse of financial markets, and urban sprawl. Sustainability concepts have now become part of managing cleanups (i.e., remediation) at hazardous waste sites. In April 2008, the United States Environmental Protection Agency (USEPA) published its Green Remediation Technology Primer, thereby launching its green remediation initiative. The primary goal of green remediation is to integrate sustainable practices into decision making, thereby increasing the environmental, social, and economic benefits of cleanup. This presentation demonstrates an approach for analyzing the overall sustainability of hazardous waste site cleanup alternatives using multi-criteria decision analysis (MCDA) supported by the Palisade DecisionTools Suite. MCDA provides a transparent, systematic process for evaluating alternative strategies that have multiple costs and benefits (environmental, economic, and social). The presentation includes the application of the MCDA process to a fictitious site that includes many of the issues and complexities associated with a hazardous waste cleanup project. The techniques presented can be used to evaluate the sustainability of any business decision.
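A minimal sketch of the weighted-sum scoring step at the heart of an MCDA of this kind, with hypothetical criteria weights and alternative scores. A real analysis would normalize the criteria, elicit weights from stakeholders, and test their sensitivity.

```python
# Minimal weighted-sum MCDA sketch; criteria, weights, and scores are hypothetical.
criteria_weights = {"cost": 0.30, "environmental": 0.30, "social": 0.20, "schedule": 0.20}

# Scores on a common 0-10 scale (10 = best) for each remediation alternative.
alternatives = {
    "Excavate and dispose":           {"cost": 3, "environmental": 6, "social": 5, "schedule": 8},
    "In-situ treatment":              {"cost": 7, "environmental": 8, "social": 6, "schedule": 5},
    "Monitored natural attenuation":  {"cost": 9, "environmental": 5, "social": 4, "schedule": 3},
}

def weighted_score(scores: dict) -> float:
    """Overall sustainability score as the weight-times-score sum over all criteria."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

for name, scores in sorted(alternatives.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name:32s} {weighted_score(scores):.2f}")
```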
Advanced Time-Series Forecasting Models and Methods Dr. William J. McKibbin Analysts are frequently confronted with scenarios where a forecast based on historical data is required. The presentation will introduce methods for building stochastic time-series forecasts in spreadsheets, ranging from simple moving averages to GARCH.
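To give a flavor of the range of methods mentioned, the sketch below produces a naive moving-average forecast and then simulates return paths from a hand-rolled GARCH(1,1) volatility process. The parameter values are illustrative, not fitted to data, and this is not the spreadsheet implementation shown in the session.

```python
import numpy as np

rng = np.random.default_rng(5)

# --- Simple moving-average forecast of a level series (window chosen arbitrarily) ---
history = np.array([102, 105, 101, 108, 110, 107, 112, 115, 113, 118], dtype=float)
ma_forecast = history[-4:].mean()
print(f"4-period moving-average forecast: {ma_forecast:.1f}")

# --- Hand-rolled GARCH(1,1) simulation of future returns (parameters assumed) ---
omega, alpha, beta = 1e-5, 0.08, 0.90       # long-run component, shock weight, persistence
horizon, n_paths = 60, 10_000

returns = np.zeros((n_paths, horizon))
var = np.full(n_paths, 0.02**2)             # starting conditional variance
prev = np.zeros(n_paths)                    # last observed return
for t in range(horizon):
    var = omega + alpha * prev**2 + beta * var      # conditional variance recursion
    prev = rng.normal(0.0, np.sqrt(var))
    returns[:, t] = prev

cum = (1 + returns).prod(axis=1) - 1
print(f"median cumulative return over {horizon} periods: {np.median(cum):+.1%}")
print(f"5th / 95th percentiles: {np.percentile(cum, [5, 95]).round(3)}")
```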
Integrated Quantitative Project Risk Analysis - Structuring the Model Effectively Jay O’Connor A project risk analysis is only as good as the model that was used to prepare it. It is critical that the model be constructed to reflect the risks specifically associated with the project. The model must be able to accurately reflect the risks associated with schedule, quantities, cost and the residual unmitigated risk items from the qualitative risk analysis. The model should also take into account the interrelationships and dependencies of these items. The presentation will address these issues and present examples of how results can vary based on the level of detail used in preparing the risk analysis.
Targeted Analyses and Compelling Communication: A Formula for Successful Michael A. Kubica The value of quantitative science projects too often remains unrealized for would-be consumers. Despite flawless analyses, sophisticated reports, and dazzling presentations, the message goes unheeded by those who could most benefit, if only they understood how to operationalize the results. The clarity with which quantitative scientists view the practical application of results is often paralleled only by their inability to generate that same clarity in their customers. The result is that good management science is at best ignored and, at worst, misunderstood (and misapplied). This workshop describes steps we as quantitative scientists can take to foster understanding, generate novel insights, and stimulate actionable results with our clients.
Dr. Chris Albright » Download the presentation I have written a book, VBA for Modelers (now in its third edition), where I teach ordinary Excel users how to automate Excel with VBA. I also teach a course at Indiana University for MBAs from this book, and it is quite popular. One possibility that very few people seem to be aware of is that VBA can be used to automate other software, including add-ins for Excel, provided that the developers of these software packages expose their object models to programmers. Many have done so, including Palisade with @RISK. I will demonstrate what it takes to automate @RISK with VBA using two examples. The first is a general template for any simulation model, and the second is a useful program for grading students’ @RISK models. If you know VBA for Excel, the battle is more than half over; the same basic language still works. All you have to learn is the object model of the add-in and how the software developer, in this case Palisade, expects you to interact with its software. If you envision using this programming functionality for an occasional project only, the learning curve will perhaps be too steep. But if you can see the benefits from using it often, the VBA approach should be well worth the time it takes to master.
Introduction to the DecisionTools Suite 5.5 Erik Westwig This session will show you how to use the elements of the new DecisionTools Suite 5.5 as a comprehensive risk analysis, optimization, and statistical analysis toolkit. Each of the products in the suite (@RISK, RISKOptimizer, Evolver, PrecisionTree, TopRank, StatTools, and NeuralTools) will be presented, showing how each can be used to solve practical problems in the real world.
Introduction to @RISK 5.5 Thompson Terry This introduction to @RISK 5.5 will walk you through a risk analysis using various example models. Key features of @RISK will be highlighted, and new enhancements in version 5.5 will be pointed out along the way. You will experience the intuitive interface of @RISK 5.5 as you define distributions, correlations, and other model components. During simulation you will be able to see all charts, thumbnails, and reports update in real time. View results with a variety of graphing options, including new cumulative-histogram overlays, scatter plots in scenario analysis, and more. There’s so much to see that we’ll cover as much as time permits.
Custom Software Applications using Dr. Javier Ordóñez @RISK and DecisionTools Suite software ship with full-featured development environments that allow you to create custom applications using Palisade technology directly in Excel. Using a customized VBA interface and simulation model in Excel, we will show how @RISK can be used to model cost uncertainty and risk events that affect total project cost. This way, users can run the model without learning how to use @RISK. Using a custom interface, we will show how to model cost ranges and risk registers through the use of probability distributions. We will also discuss how to measure correlation between variables, how to add a correlation matrix to a model, and the impact of correlation on results. Once the simulation model is run, we will learn how to assess the required contingency and develop mitigation strategies.
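A compact sketch of the underlying technique, not the @RISK/VBA implementation shown in the session: a cost model combining three-point ranges for base items with a probability-and-impact risk register, and correlation between two cost items introduced via a Gaussian copula. All figures and the 0.6 correlation are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n_sims = 20_000

# Correlated base cost items via a Gaussian copula (correlation assumed at 0.6).
corr = np.array([[1.0, 0.6], [0.6, 1.0]])
z = rng.multivariate_normal([0, 0], corr, n_sims)
u = stats.norm.cdf(z)                                   # uniforms carrying the dependence
civil = stats.triang(c=0.4, loc=800, scale=600).ppf(u[:, 0])   # min 800, mode 1040, max 1400
equip = stats.triang(c=0.5, loc=500, scale=400).ppf(u[:, 1])   # min 500, mode 700, max 900

# Risk register: (probability of occurring, impact min, mode, max) -- hypothetical events.
register = [(0.30, 50, 120, 300), (0.15, 100, 250, 600), (0.50, 20, 60, 150)]
risk_cost = np.zeros(n_sims)
for p, lo, mode, hi in register:
    occurs = rng.random(n_sims) < p
    risk_cost += occurs * rng.triangular(lo, mode, hi, n_sims)

total = civil + equip + risk_cost
p80 = np.percentile(total, 80)
print(f"mean total cost : {total.mean():,.0f}")
print(f"P80 cost        : {p80:,.0f}  (contingency vs. mean: {p80 - total.mean():,.0f})")
```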
Dr. Michael Rees This session covers some ideas in modeling best practices, in both Excel models and @RISK models. Topics include issues in model design, structure, formatting, error-checking and a variety of tools related to sensitivity analysis. We also mention some uses of Palisade’s TopRank for model auditing and checking.
Introduction to StatTools 5.5 and NeuralTools 5.5 Dr. Chris Albright, Thompson Terry In this session we will learn how to use Palisade’s two data analysis tools: StatTools and NeuralTools. StatTools is a Microsoft Excel statistics add-in. This session will cover how to perform the most common statistical tests, and will include topics such as Statistical Inference, Forecasting, Data Management, Summary Analyses, and Regression Analysis. NeuralTools imitates brain functions in order to “learn” the structure of your data. Once NeuralTools understands the data, it can take new inputs and make intelligent predictions. The new predictions are based on the patterns in known data, and offer uncanny accuracy. NeuralTools can automatically update predictions when input data changes, and it can even be combined with Palisade’s Evolver or Excel’s Solver to optimize tough decisions and achieve desired goals.
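NeuralTools itself is an Excel add-in, but the fit-then-predict workflow described above can be sketched with a generic neural-network library. Here scikit-learn is used on synthetic data purely as an illustration of the concept, not of NeuralTools' algorithm.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Synthetic "known data": two inputs and a noisy nonlinear response.
X = rng.uniform(-2, 2, size=(1_000, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, 1_000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Learn" the structure of the data, then predict new cases.
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2_000, random_state=0)
net.fit(X_train, y_train)

print(f"R^2 on held-out data: {net.score(X_test, y_test):.3f}")
print(f"prediction for new input [1.0, -0.5]: {net.predict([[1.0, -0.5]])[0]:.3f}")
```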
Advanced Features in @RISK 5.5: Sam McLafferty, Erik Westwig, Doug Stauffer, and Howard Duncan @RISK 5.5 includes new functions, interface features, graphs, and simulation archiving capabilities. Join us for this interactive discussion about what’s new in @RISK. See how the @RISK Library has been enhanced for simulation archiving and sampling. A portfolio optimization example combining the @RISK Library and RISKOptimizer will be shown. Additionally, we’ll cover other topics of interest as time permits.
Selected Applications of the DecisionTools Suite Dr. Michael Rees This session presents a variety of pharmaceutical-related applications of the DecisionTools Suite. Models shown relate to areas such as individual drug development decisions (including phasing, valuation, and real options), project portfolios (including basic aggregation, optimization, and development time uncertainty), optimization, and general business planning. The models shown use Palisade’s @RISK, PrecisionTree, and Evolver.
Selecting the Right Distribution in @RISK Dr. Michael Rees This session covers the choice of the appropriate distribution in @RISK. A variety of approaches are presented and compared, including pragmatic, theoretical and data-driven methods. The use of distributions to treat a variety of risk modeling situations is discussed, and some new distributions and features in v5.5 are shown.
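When historical data are available, the data-driven route mentioned above amounts to fitting candidate distributions and comparing goodness of fit. A minimal sketch with scipy, on synthetic data and with only two candidate distributions, purely for illustration of the idea rather than @RISK's own fitting procedure:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
data = rng.lognormal(mean=2.0, sigma=0.5, size=500)     # synthetic "historical" observations

# Fit candidate distributions by maximum likelihood and compare with a K-S statistic.
candidates = {"lognorm": stats.lognorm, "gamma": stats.gamma}
for name, dist in candidates.items():
    params = dist.fit(data)
    ks_stat, p_value = stats.kstest(data, name, args=params)
    print(f"{name:8s} K-S statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
```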
Introduction to PrecisionTree 5.5 Erik Westwig This presentation combines an introduction to the enhanced user interface, tighter Excel integration, and new features of PrecisionTree 5.5 with demonstrations of how PrecisionTree can be used to analyze various problems in decision analysis.
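The core calculation behind a decision tree, rolling expected values back from chance nodes to the decision node, can be sketched without the add-in. All payoffs and probabilities below are invented for illustration.

```python
# Minimal decision-tree rollback: choose the branch with the highest expected value.
# All payoffs and probabilities are hypothetical.

def expected_value(chance_branches):
    """chance_branches: list of (probability, payoff) pairs at a chance node."""
    return sum(p * payoff for p, payoff in chance_branches)

decision = {
    "Launch product": [(0.3, 600_000), (0.5, 100_000), (0.2, -400_000)],
    "License out":    [(0.7, 150_000), (0.3, 50_000)],
    "Abandon":        [(1.0, 0)],
}

best = max(decision, key=lambda option: expected_value(decision[option]))
for option, branches in decision.items():
    print(f"{option:15s} EV = {expected_value(branches):>10,.0f}")
print(f"optimal decision: {best}")
```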
Custom Software Applications using Dr. Javier Ordóñez @RISK and DecisionTools Suite software ship with full-featured development environments that allow you to create custom applications using Palisade technology directly in Excel. Using customized VBA interfaces and simulation models in Excel, we will show how @RISK can be used to plan investment strategies for retirement, manage a portfolio of assets, and assess the risks in prospecting for oil. These examples demonstrate how users can run a model tailored to their needs without learning how to use @RISK. Using a custom interface, we will show how to define uncertain elements in each model and how to interpret the simulation results.
Introduction to Project Risk Management Dr. Javier Ordóñez The aim of this seminar is to give people a basic understanding of how @RISK for Microsoft Project works, including hands-on experience setting up and running simulations and interpreting the results. Attendees will learn about the key functionality within @RISK for Project in a step-by-step manner, enabling them to quickly become familiar with basic concepts and terminology. In addition to graphing and quantifying the risk in a business plan, you will learn how @RISK for Project, using Monte Carlo simulation, enables you to:
Introduction to RISKOptimizer 5.5 and Evolver Thompson Terry RISKOptimizer and Evolver use powerful genetic algorithms to perform optimization in Microsoft Excel. RISKOptimizer builds on traditional optimization by adding Monte Carlo simulation to account for uncertain (stochastic), uncontrollable factors in your optimization problem. This session introduces you to these powerful tools, showing you how to set up a model, define constraints within the model, and ultimately arrive at the optimal outcome. Examples of resource allocation, budgeting, and scheduling will be included.
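A bare-bones sketch of the idea of optimization under uncertainty with a genetic algorithm: each candidate budget allocation is scored by a small Monte Carlo simulation, and the population evolves toward allocations with the best simulated outcome. The fitness model and parameters are invented for illustration; RISKOptimizer's own algorithm is considerably more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(6)
n_projects, budget = 4, 100.0

def simulated_fitness(allocation, n_mc=200):
    """Mean simulated return of a budget allocation with diminishing, uncertain payoffs."""
    base_return = np.array([1.8, 1.5, 1.3, 1.1])                  # assumed mean multipliers
    noise = rng.normal(1.0, 0.25, size=(n_mc, n_projects))        # uncertainty per project
    payoff = (base_return * noise) * np.sqrt(allocation)          # diminishing returns
    return payoff.sum(axis=1).mean()

def random_allocation():
    w = rng.random(n_projects)
    return budget * w / w.sum()

# Simple genetic algorithm: selection by fitness, blend crossover, Gaussian mutation.
population = [random_allocation() for _ in range(40)]
for _ in range(30):
    scored = sorted(population, key=simulated_fitness, reverse=True)
    parents = scored[:10]
    children = []
    for _ in range(30):
        a, b = rng.choice(len(parents), 2, replace=False)
        child = 0.5 * (parents[a] + parents[b]) + rng.normal(0, 2.0, n_projects)
        child = np.clip(child, 0, None)
        children.append(budget * child / child.sum())              # keep the budget constraint
    population = parents + children

best = max(population, key=simulated_fitness)
print("best allocation:", best.round(1), "fitness:", round(simulated_fitness(best), 1))
```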