Abstracts

Download individual models and presentations listed with their abstracts below.

» Download all models and presentations: 2007PalisadeUserConferenceNorthAmerica.zip (14Nov2007) - ~42MB

KEYNOTE: Business Applications of @RISK

Dr. Wayne Winston
Professor, Kelley School of Business
Indiana University

» Download Winston files

This keynote address will look at a wide variety of ways @RISK and its companion RISKOptimizer can be applied to a range of business applications.  Dr. Winston will cover examples such as:

  • Evaluating effectiveness of major league baseball players with @RISK
  • Optimizing corporate project selection with RISKOptimizer
  • Vendor selection with @RISK
  • Capital budgeting modeling with @RISK
  • Incorporating the Bass Model into a new product simulation
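The last of these, the Bass diffusion model, lends itself to a compact sketch. Below is a minimal discrete-time version in Python with textbook-typical coefficients; the values of p, q, and m are illustrative assumptions, not the keynote's actual inputs. In a simulation setting, p, q, and m would each become uncertain distributions.

```python
def bass_adopters(p=0.03, q=0.38, m=1_000_000, periods=20):
    """New adopters per period under the Bass diffusion model:
    n(t) = (p + q * N/m) * (m - N), where N is cumulative adoption,
    p the coefficient of innovation, and q the coefficient of imitation."""
    cumulative, new = 0.0, []
    for _ in range(periods):
        n = (p + q * cumulative / m) * (m - cumulative)
        new.append(n)
        cumulative += n
    return new

adoption = bass_adopters()
peak_period = adoption.index(max(adoption))  # adoption peaks mid-course, not at launch
```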

Overview of @RISK 5.0 and DecisionTools Suite 5.0

Sam McLafferty
CEO
Palisade Corporation

This major upgrade to @RISK and the DecisionTools Suite is driven by the latest innovations in the field and customer input, as well as by compatibility with Excel 2007. Sam will highlight the most significant of the new software innovations, including a tour of the stunning new @RISK user interface, which is completely integrated into the spreadsheet environment. The discussion will also address how @RISK 5.0 is designed to better meet the needs of corporate-wide sharing, and how it offers more robust analyses.

 

@RISK uses for Risk-Informed Regulatory Applications: A risk-informed in-service inspection program at a Nuclear Power Plant

Dr. Ching Ning Guey
Manager, Reliability & Risk Assessment Group Nuclear Engineering
Florida Power & Light

» Download model
» Download presentation

Industry Focus: Energy
Product Focus: @RISK

A risk-informed approach to inspecting piping segments has been used in nuclear power plants since the mid-1990s. In one of the two NRC-accepted methodologies, @RISK is used to determine the relative importance of piping segments. This approach has been used successfully to maintain the same level of safety while reducing the burden of performing inspections. The risk-informed in-service inspection (RI-ISI) program is periodically reviewed and is submitted to the NRC at 10-year intervals.

@RISK has been used in accounting for the uncertainties in two key aspects of the risk-informed inspection process, i.e., the likelihood of piping failures and the consequence of such failures. The likelihood of piping failures is estimated based on probabilistic fracture mechanics considering piping degradation mechanisms, piping stresses, configurations, and operating conditions. Consequence is estimated using the Probabilistic Risk Assessment (PRA) model considering the impact of piping failures on subsequent plant responses.
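The ranking step can be illustrated with a rough Python sketch: for each segment, sample an uncertain failure frequency and an uncertain consequence, and rank segments by mean simulated risk. The segment names, lognormal parameters, and two-factor risk product below are hypothetical stand-ins for the probabilistic fracture mechanics and PRA inputs described above.

```python
import random

def rank_segments(segments, trials=2000, seed=7):
    """Rank piping segments by mean simulated risk, where risk is the
    product of a sampled failure frequency and a sampled consequence.
    All segment parameters are hypothetical illustrations."""
    rng = random.Random(seed)
    scores = {}
    for name, (f_mu, f_sd, c_mu, c_sd) in segments.items():
        risks = []
        for _ in range(trials):
            freq = rng.lognormvariate(f_mu, f_sd)          # failures per year
            consequence = rng.lognormvariate(c_mu, c_sd)   # conditional impact
            risks.append(freq * consequence)
        scores[name] = sum(risks) / trials
    return sorted(scores, key=scores.get, reverse=True)

segments = {                 # hypothetical (mu, sd) pairs for frequency and consequence
    "A": (-9.0, 1.0, -3.0, 0.5),
    "B": (-7.0, 1.0, -6.0, 0.5),
    "C": (-8.0, 0.5, -5.0, 0.5),
}
ranking = rank_segments(segments)
```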

Point estimate and @RISK-based approaches have been compared to determine the sensitivity of the results. Insights gained from the engineering and @RISK analyses indicate areas for improving the effectiveness and efficiency of future periodic updates.

Baseball Enigma: The Optimal Batting Order

Clayton J. Graham
Chief Analytical Architect
Analytical Advantages, LLC

» Download presentation

Industry Focus: Sporting
Product Focus: RISKOptimizer, @RISK, BestFit, StatTools  

Baseball is a game of significant traditions, if not superstitions. Before turning the starting lineup over to the home plate umpire before each game, the field manager has made use of such advanced evaluation techniques as tarot cards, biorhythms, and sun spots. Since Branch Rickey hired a full-time statistician in 1947, the quest to make use of a player’s quantitative performance has been increasingly embraced by major league baseball. As an example, the importance of “on-base percentage” was documented in Michael Lewis’s book Moneyball. The efforts and contributions of the Society for American Baseball Research (SABR) are more than prolific. The magnitude of statistics on baseball ranks second only to the output of the Bureau of the Census. OK, so what do I do, coach? Accordingly, the presentation will cover:

• Identification of data relevancy
• Conversion of data to useful information
• Model development process (including a brief history of batting order algorithms)
• Examples: fact, fantasy and uncertainty
• Reaction and implementation by major league baseball

Palisade software used includes: RISKOptimizer, @RISK, BestFit and StatTools.

The basic model incorporates a reaction matrix (what happens for each at bat) with the selection of the appropriate performance vector predicated upon a unique family of discrete density functions for each combination of field, umpire, batter and pitcher. With over 360,000 potential batting orders (9!), 30 major league parks, 80+ umpires, and 200+ pitchers, this yields over 170 billion possibilities! Clearly, this is more than taxing for the most astute manager. Coupling these parameters with alternative objective functions, e.g., maximizing expected runs per game, minimizing scoring volatility or targeting skewness levels, results in much more than a "day at the park".
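The combinatorial scale quoted above is easy to verify with a back-of-envelope calculation (the park, umpire, and pitcher counts are the abstract's round figures):

```python
from math import factorial

# Scale of the batting-order problem described above.
orders = factorial(9)                     # possible batting orders: 362,880
parks, umpires, pitchers = 30, 80, 200    # round figures from the abstract
combinations = orders * parks * umpires * pitchers
print(f"{orders:,} orders -> {combinations:,} total combinations")
```

This confirms the "over 170 billion" figure (174,182,400,000), which is why exhaustive enumeration is hopeless and stochastic optimization is attractive.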

Can a Single Model Replace Simulation in Determining Optimal Portfolio Withdrawal Rates from a Retirement Portfolio?

Dr. Michael Tucker
Professor of Finance
Fairfield University, Dolan School of Business

» Download presentation
» Download model

Industry Focus: Finance
Product Focus: @RISK

Milevsky and Robinson (2005) have derived a formula using the gamma distribution to estimate the probability of running out of money during retirement. The initial assumption is a fixed withdrawal rate, in real dollars, from the retirement portfolio.

What is uncertain is the performance of the underlying assets. Typically a Monte Carlo simulation (Ameriks et al. 2001, Bengen 1994, Cooley 2003, Guyton & Klinger 2006, Stout & Mitchell 2006, Young 2004) would be used to gauge the probability of running out of money under uncertainty. The parameters for this problem are life expectancy, real rate of return, and standard deviation. M&R make use of different withdrawal rates in Table 3 of their article to estimate the probability of ruin for 50-80 year olds in 5-year increments withdrawing 2% to 10% per year in 1% increments. Actuarial estimates of life expectancy are entered into their model, and probabilities of ruin are derived.

Using the same data assumptions as M&R use and an assumed normal distribution of returns, a simulation is run with @RISK to estimate the same probability of ruin calculation as obtained by the formula. The simulation is run for 10,000 iterations at each withdrawal rate for each age and the results compare M&R (mil) to @RISK calculations. An Excel macro linked to @RISK calculates the probability of ruin at each withdrawal rate for each age and life expectancy.
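The simulation side of this comparison can be sketched in a few lines of Python. The return and withdrawal parameters below are illustrative, not M&R's Table 3 inputs; the structure (normal returns, fixed real withdrawal, count the paths that hit zero) mirrors the @RISK model described above.

```python
import random

def prob_of_ruin(start=1.0, withdraw=0.05, mu=0.07, sigma=0.20,
                 years=30, trials=10_000, seed=42):
    """Estimate the probability a retirement portfolio is exhausted
    before the planning horizon, assuming normally distributed real
    returns and a fixed real withdrawal each year (illustrative
    parameters, not those of Milevsky & Robinson)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        wealth = start
        for _ in range(years):
            wealth = wealth * (1 + rng.gauss(mu, sigma)) - withdraw
            if wealth <= 0:
                ruined += 1
                break
    return ruined / trials
```

As expected, ruin probability rises sharply with the withdrawal rate; sweeping `withdraw` from 2% to 10% reproduces the shape of the published comparison.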

Environmentally Impaired Property Transaction Analysis

Timothy J. Havranek, MBA, PMP
Senior Business Analyst
Triangle Economic Research

Poh-Boon Ung
Senior Economist
Triangle Economic Research

» Download presentation

Industry Focus: Real Estate, Environment
Product Focus: PrecisionTree and @RISK

Commercial property purchase is often a complex process given various regulatory requirements and negotiations between interested parties. This complexity is increased when the property is contaminated with hazardous materials involving uncertain remediation costs.  Triangle Economic Research (TER), an ARCADIS Company, has helped a number of clients assess potential remediation options and associated costs.  Drawing upon the experience and technical knowledge of engineers and scientists at ARCADIS, we have developed probabilistic models that account for different contaminated media, their probable impact on the property, and associated remedial costs of various technologies.

Our case study focuses on combining the strengths of decision trees (PrecisionTree) and Monte Carlo simulation (@RISK) to capture the full range of remedial options and associated costs for a contaminated property.  Decision trees are particularly useful for mapping and displaying various alternatives and their impacts.  Indeed, the visual representation provided by decision trees provides a valuable client/stakeholder communication tool.  This is especially true when a property has numerous contaminated media, remediation and redevelopment options.  Combined with Monte Carlo simulation, they provide a realistic estimate of likely remedial costs for the contaminated property.  We will discuss a recent case where a development company applied our model results to negotiate the purchase of a property.  The model also provided our client with an understanding of major uncertainties and associated impacts which were used to negotiate the settlement price of the property.    

Consumer Credit Data not Predictive in the Decision Process to Select Tenants for Apartment Rental

Dr. Michael Furick
Professor
Georgia Gwinnett State University

» Download presentation

Industry Focus: Real Estate
Product Focus: NeuralTools

Credit scoring is a mathematical means of summarizing a consumer’s credit and financial history into a three-digit number. This number provides an easy means of identifying and sorting consumer behavior into categories based on their financial history. To select applicants for loans and to set interest rates on loans, banks and financial institutions routinely use credit scoring. Auto insurance companies also use scoring to decide which consumers will be offered auto insurance and to set the price for auto insurance. Despite success in these two industries, scoring does not appear to be effective in the apartment rental industry in picking desirable applicants for apartment rental.

The first phase of this presentation will discuss recent research that analyzed the results of using six commercially available credit scores applied in one apartment complex to the task of selecting applicants. This part of the analysis answered the research question: How effective are commercially available credit scores in predicting applicant financial behavior when renting an apartment? This research determined that these six scores are not predictive and possible explanations will be given.

Phase two of this presentation will discuss recent research that used neural network software (specifically Palisade NeuralTools) to develop a new model using both credit data and other lifestyle data about the applicant. The hypothesis was that the addition of this lifestyle data would improve accuracy in selecting apartment rental applicants over currently available models based only on credit data. This part of the analysis answered the research question: How is the prediction accuracy of a new neural-network-based credit-scoring model improved by adding lifestyle data to the credit report data? This research indicates that accuracy is greatly improved. Three variables were found to be most predictive for the apartment rental decision: a) the percentage of satisfactory accounts in the applicant’s credit file, b) total applicant income, and c) the applicant’s driving record.

Determining the Optimal Level of Swaps

Roy Nersesian
Professor, School of Business
Monmouth University

» Download presentation

Industry Focus: Finance
Product Focus: @RISK

Swaps are a means of controlling risk. If a high or low price for a commodity, or a high or low rate for currency exchange or interest rates, represents risk, then a swap can “prevent” that risk from occurring. But there is a cost if the market swings in the opposite direction. For instance, a crude oil swap at $50 per barrel may save a company from having to sell oil below $50, but it must forego incremental profits if oil prices rise above $50, up to the swap volume. Thus swaps can be dangerous to one’s financial health in the sense of foregoing incremental profits!
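The two-sided nature of the $50 crude oil swap can be made concrete with a tiny payoff sketch (the function name and the producer-receives-fixed framing are illustrative assumptions):

```python
def swap_settlement(spot, swap_price=50.0, volume=1.0):
    """Settlement to a producer that receives fixed at `swap_price`
    and pays floating (spot) on `volume` barrels. Positive when spot
    falls below the swap price; negative (foregone profit) above it."""
    return (swap_price - spot) * volume

# Downside protected: selling at spot $40 plus the swap still nets $50.
assert 40 + swap_settlement(40) == 50
# Upside foregone: selling at spot $60 plus the swap also nets only $50.
assert 60 + swap_settlement(60) == 50
```

Simulating the spot price with a distribution then shows the trade-off the talk explores: the hedged volume always nets $50, whatever the market does.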

Financial Risk Quantification Of Environmental Remediation Costs Using Probabilistic Cost Estimating: Theory & Practice

John W. Lynch, P.E.
President
JWL Environmental

» Download presentation

Industry Focus: Environmental Management
Product Focus: The DecisionTools Suite

This presentation will address financial risk quantification (RQ) for environmental remediation and site cleanup efforts. The session will consist of three segments: RQ Drivers, RQ Techniques and Resources, RQ Examples and Lessons Learned.

Segment 1 will describe the various drivers for risk quantification, including financial accrual estimates, SEC FIN 47 compliance, environmental insurance placements, merger & acquisition support, PRP group negotiations, corporate and trust dissolution, and liability transfer mechanism origination. The ASTM guidance and FASB interpretation will be discussed.

Segment 2 will discuss alternative approaches to and resources available for RQ. Types of uncertainties including regulatory, site, contaminant and technological uncertainties will be described and discussed. Resources for defining limits on uncertainties including costing will be presented. The use of expert panels will be discussed. Decision trees as a RQ tool will be described and examples presented. Publicly available and proprietary commercial data bases of unit costs will be presented. The use of DecisionTools Suite and @RISK will be discussed.

Segment 3 will present three cases where RQ was applied to environmental cleanup projects and site portfolios. Example A will be a munitions site in California with impact on public water supplies. Example B will be the asset retirement obligation estimate for a transportation company. Example C will be the estimate for a large sediment remediation and the impact of a trust dissolution effort. Time will be allocated to Q&A to draw out the audience on applications and lessons learned.

Enterprise Risk Management

Lina S. Cheung, FSA, MAAA, FCA
Partner
CP Risk Solutions, LLC

» Download presentation

Industry Focus: Health Insurance
Product Focus: @RISK

A small insurance company was renewing a major medical stop-loss policy that was large relative to its capital size. The earlier policy period for this group had made a big dent in the insurance company's capital. The insurance company wanted both to employ its capital and earn a return on it (i.e., by renewing the group) and to preserve its capital so as not to fall onto the Insurance Department's watch list. CP Risk Solutions modeled the probability of risk exposure and ruin using @RISK, and developed a reinsurance strategy and structure that met our client's needs. Our process included setting parameters reflecting the known medical claims, the variability around those claims, the typical health care claims distribution, and the company's capital position. The decision-making point was simple: choose the reinsurance strategy that yields the best combination of certainty of return, absolute return, and capital position.

How to Measure Anything: Finding the Value of Intangibles in Business

Douglas W. Hubbard
President
Hubbard Decision Research

» Download presentation

Industry Focus: Information Management
Product Focus: @RISK

Doug Hubbard will talk about the approach described in his book, “How to Measure Anything: Finding the Value of Intangibles”. This is based on Hubbard’s method Applied Information Economics (AIE) and includes how to formulate intangibles as measurable and techniques for improving Monte Carlo models including:

  1. How “calibration training” improves on the subjective estimates from subject matter experts.
  2. How to compute the value of additional information and how this radically changes empirical measurement methods.
  3. How to use the output of a Monte Carlo simulation as input to portfolio optimization methods.
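Point 2, the value of additional information, is worth a small worked example. The go/no-go decision and its payoff distribution below are hypothetical, and this computes only the simplest case (expected value of *perfect* information), not Hubbard's full AIE calculations:

```python
import random

def evpi(trials=10_000, seed=11):
    """Expected Value of Perfect Information for a toy go/no-go
    decision with an uncertain payoff (hypothetical numbers)."""
    rng = random.Random(seed)
    payoffs = [rng.gauss(20.0, 50.0) for _ in range(trials)]
    ev_act = sum(payoffs) / trials                              # commit without information
    ev_with_info = sum(max(x, 0.0) for x in payoffs) / trials   # act only when profitable
    return ev_with_info - max(ev_act, 0.0)                      # value of knowing first
```

Even though acting blindly is profitable on average here, knowing the outcome in advance is worth a substantial premium, which is what makes measuring the uncertain variable economically justified.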

Insurance Loss Modeling Using Simulation

Dr. Domingo Castelo Joaquin
Associate Professor
Illinois State University

» Download presentation
» Full article

Industry Focus: Insurance
Product Focus: @RISK 

With the dramatic increase in the speed of personal computers and steep decline in the cost of computing, simulation has become one of the standard tools in the risk manager’s toolbox and should now become one of the standard tools in the risk management and insurance student’s toolbox. This teaching note aims to facilitate this process by showing how to create and run a simulation in a spreadsheet environment, and interpret simulation results to gain insight and understanding about a real-world problem. Specifically, this teaching note provides step-by-step instruction for simulating the present value of payments for losses occurring within a one-year policy period. Losses are covered by an aggregate excess of loss treaty. The uncertainty lies in the frequency and severity of losses, as well as in claim processing time; and also in the discount rate for calculating the present value of loss payments.
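A rough Python analogue of the model just described: Poisson claim counts, exponential severities, an aggregate excess-of-loss layer, and discounting over a random settlement lag. All parameter values are illustrative assumptions, not those of the article.

```python
import random

def simulate_pv_retained(trials=5000, seed=1,
                         freq_mean=3.0, sev_mean=100_000.0,
                         attachment=250_000.0, limit=500_000.0,
                         rate=0.05):
    """Mean present value of retained losses under an aggregate
    excess-of-loss treaty (all parameters are illustrative)."""
    rng = random.Random(seed)
    pvs = []
    for _ in range(trials):
        # Poisson frequency via exponential inter-arrival times in one year
        n, t = 0, rng.expovariate(freq_mean)
        while t < 1.0:
            n += 1
            t += rng.expovariate(freq_mean)
        agg = sum(rng.expovariate(1 / sev_mean) for _ in range(n))
        ceded = min(max(agg - attachment, 0.0), limit)  # treaty layer
        retained = agg - ceded
        lag = rng.uniform(0.5, 3.0)                     # years to settlement
        pvs.append(retained / (1 + rate) ** lag)
    return sum(pvs) / trials
```

Each piece (frequency, severity, lag, discount rate) maps to one uncertain input in the spreadsheet version the teaching note walks through.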

Integrating @RISK into the DMAIC Process

Ed Biernat
President
Consulting with Impact, Ltd.

» Download model and presentation

The Lean Six Sigma approach has been integrated into the fabric of many of the world’s top companies. This approach is based upon the DMAIC framework – Define, Measure, Analyze, Improve, Control. Within each step, discrete toolsets can be applied to the problem at hand. There are significant advantages to adding @RISK analyses at key points in the process to shorten the issue resolution lead-time, generate additional insights and enhance the final solution.

This presentation is divided into two parts. Part I is a brief overview of the Lean Six Sigma methodology and how it is applied to various industries and problem sets. Part II consists of mini-case studies involving the application of @RISK at various stages of the DMAIC process in real-world manufacturing applications. The framework for both appropriate and sub-optimal applications of the software will be discussed. From these initial studies, other applications in non-manufacturing processes will be extrapolated.

Monte Carlo Methods in Forecasting
the Demand for Electricity

Frank S. McGowan
Senior Economist
British Columbia Hydro and Power Authority (BC Hydro)

» Download model
» Download presentation

Industry Focus: Energy
Product Focus: @RISK

BC Hydro uses Monte Carlo Methods to produce a stochastic forecast of the demand for electricity. This presentation will explain how this is done and how @RISK is used to calculate the results.

It will be shown how uncertainty in key predictor variables including GDP, electricity rates, natural gas prices, and weather variables can be modeled by probability distributions. The Monte Carlo Model that transforms these input probability distributions into final probability distributions for the load forecast will then be explained.
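The transformation from predictor distributions to a load-forecast distribution can be sketched as follows. The base load, elasticities, and distribution parameters are illustrative assumptions, not BC Hydro figures, and the linear elasticity equation is a deliberate simplification of the actual model.

```python
import random

def load_forecast_samples(trials=5000, seed=3):
    """Draw uncertain predictors and push them through a simple
    elasticity-based load equation (all figures hypothetical)."""
    rng = random.Random(seed)
    base_load = 60_000.0                  # hypothetical annual GWh
    e_gdp, e_rate = 0.8, -0.1             # assumed elasticities
    samples = []
    for _ in range(trials):
        gdp_growth = rng.gauss(0.025, 0.010)   # real GDP growth
        rate_change = rng.gauss(0.020, 0.015)  # electricity rate change
        weather = rng.gauss(0.0, 0.010)        # weather-driven deviation
        samples.append(base_load * (1 + e_gdp * gdp_growth
                                      + e_rate * rate_change
                                      + weather))
    return samples
```

Percentiles of the resulting sample are exactly the kind of probabilistic output that feeds regulatory filings and resource planning.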

Emphasis will be placed on a critical examination of the methodology. Comparisons with alternate approaches will be made. The proper choice of elasticities will be considered. The use of probabilistic results in regulatory filings and resource planning will be discussed.

Monte Carlo Simulations and Sensitivity Analyses as a Means to Assess and Optimize the Design of Integrated Multi-Trophic Aquaculture Sites

Dr. Gregor K. Reid
Post–Doctoral Fellow
University of New Brunswick and Department of Fisheries and Oceans

» Download presentation

Industry Focus: Natural Resources
Product Focus: @RISK

An aquaculture sustainability project in the Bay of Fundy, Canada, is successfully making the transition from an experimental to a commercial scale. Mussels and kelps grown beside salmon cages have demonstrated accelerated growth rates due to the augmentation of their natural food sources by nutrients generated from the salmon. This aquaculture practice, where the by-products of one species become the nutrient inputs for another, is known as Integrated Multi-Trophic Aquaculture (IMTA). If properly implemented, the benefits of IMTA are twofold. Economic diversification is fostered by the culture of additional harvestable commodities within the same site licence area, and the overall nutrient load to the environment is reduced.

Several challenges, however, need to be overcome for open-water IMTA to optimize sustainability. It is the ratio of nutrient-releasing fed biomass (i.e. fish) to the nutrient-converting biomass of co-cultured extractive species (e.g. mussels, kelps) in their respective biomitigating niches that largely influences nutrient recovery efficiency, not necessarily the physical/spatial scale of any one component. Consequently, rearing nutrient extractive species at scales complementary to the fed species presents novel challenges. ‘Trial and error’ learning approaches are largely unavoidable, due mainly to new husbandry and site design. Each species within the system also has unique temporal and spatial culture requirements, adding further complexity. Continuous site evolution and unpredictable dynamics are typical of commercial operations and present unique challenges to modeling the system.

Nevertheless, modeling approaches, like Monte Carlo simulation, can generate a likelihood of outcomes based on ‘partial data’ thereby providing practical estimates until validation can occur at ‘fully evolved’ commercial sites. The use of @RISK software combined with nutritional mass-balance models in Excel has been ideal for this approach; simplifying otherwise complex processes and ‘reporting’ results in a manner understandable to all stakeholders. The resultant sensitivity analyses are providing strategic research and management direction by identifying variables that most affect the system.

Nonlinear Feedback Loops using @RISK: Adding Uncertainty to a Dynamic Model of Oil Prices

Dr. William Strauss
President
FutureMetrics, LLC

» Download presentation
» Download model

Industry Focus: Energy
Product Focus: @RISK, BestFit

Nonlinear feedback loops describe many of the processes that make up the real world, producing rising and falling patterns with overshoot (feast or famine).

The oil sector is best modeled using systems of differential equations due to positive and negative feedbacks and the slow responses of the supply side (exploration, production) to price signals and shocks. Long-term trends are accompanied by cyclic price fluctuations that also influence demand. Interacting feedback loops make the system complex.

We use a tool that is specific to developing complex nonlinear systems using visual maps to create the structure for the oil price forecasting model. However, this tool (STELLA from isee systems) does not provide a robust method of incorporating variable uncertainty into the model’s dynamics. STELLA will always output the same expected values for the forecasts for a given set of initial conditions.

But many of the driving variables in a model of the oil market are uncertain. For example: what are the expected proven reserves over time and how rapidly will supply respond to price signals (which includes well productivity, discovery rate, price elasticity of supply, etc.); how rapidly will demand respond to price signals (efficiency of transportation, power generation, price elasticity of demand, etc.); how rapidly will demand grow (the developing world, etc.); and other areas of uncertainty for key equation parameters.

Based on real expectations regarding the key elements of the model, distributions are developed that encompass a set of possible outcomes. Correlations between these data are also estimated. Using @RISK, these correlated distributions are then dynamically linked from Excel cells into the STELLA model in which multiple runs forecast a range of potential paths. We can then see a map of the many potential future oil price paths with the map’s density signaling more or less probable outcomes.

Project Portfolio Management at Novartis Pharma
A Case Study from London Business School

Industry Focus: Pharmaceuticals
Product Focus: DecisionTools Suite (@RISK, PrecisionTree, RISKOptimizer)
Presenter: Dr. Javier Ordóñez

This case describes the R&D project selection and prioritization problem at Novartis, a recurrent issue of strategic importance for the company. In the pharmaceutical industry, project portfolio decisions are crucial to the viability and success of a company, and require huge investment commitments. This case illustrates the usefulness of decision analysis, simulation and optimization in analyzing and optimizing project portfolio decisions. The London Business School used the DecisionTools Suite to demonstrate some of the analytical techniques used by Novartis.

The case starts with an overview of the pharmaceutical industry and the challenges in the drug development process, including the massive required R&D investments, possibility of failure and commercial uncertainty. Subsequently, the case discusses the work performed by the project portfolio group at Novartis. They collect the project data and requirements submitted by the individual therapy areas and collate them to analyze the global company portfolio. The case reports Novartis’s decision process, focusing on the role of the Innovation Management Board (IMB), which takes the portfolio decisions at Novartis Pharma.

This case study won the 2004 INFORMS Case Competition, a prestigious competition for the best case study in Operations Research/Management Science, organized by the Institute for Operations Research and Management Science.

Robust Dynamic Pricing of Perishable Products

Yu-jiun Tsai
Manufacturing Strategy Analyst
Corning Display Technologies

» Download model and presentation

Industry Focus: Hi Tech Manufacturing
Product Focus: @RISK, RISKOptimizer

We are going to study dynamic pricing for perishable products when demand is uncertain and the underlying probabilities are not known precisely.  In our study, we consider a linear price-response function with additive uncertainty.  With this additive uncertainty assumption, we assign distributions to both market size and price elasticity with some means and standard deviations, and explore the optimal pricing for the products under different levels of uncertainty embedded in demands with the above parameters.  To solve such a semi-infinite optimization problem, we use RISKOptimizer to deploy a genetic algorithm.  Furthermore, since the pricing in real practice usually is bound to some constraints regarding nominal values while the optima from RISKOptimizer may slightly differ from the nominal values, we also use @RISK to simulate the resulting objectives corresponding to the nominal pricing values and analyze the impact on decision making.  In addition to the discussion of the above robust optimization modeling, we also survey the control parameters of the genetic algorithm to be tuned for efficiency.
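The core of this model can be sketched in Python: a linear price-response function with additive uncertainty in both market size and elasticity, with expected profit estimated by simulation. All parameter values are illustrative, and a coarse grid search stands in here for RISKOptimizer's genetic algorithm.

```python
import random

def expected_profit(price, trials=4000, seed=5,
                    a_mean=100.0, a_sd=10.0,   # uncertain market size
                    b_mean=2.0, b_sd=0.2,      # uncertain price elasticity
                    unit_cost=10.0):
    """Simulated expected profit for linear demand d = a - b * price,
    truncated at zero, with uncertain a and b (illustrative values)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        a = rng.gauss(a_mean, a_sd)
        b = rng.gauss(b_mean, b_sd)
        demand = max(a - b * price, 0.0)
        total += (price - unit_cost) * demand
    return total / trials

# Grid search as a stand-in for the genetic algorithm.
best_price = max(range(11, 50), key=expected_profit)
```

Because the simulated optimum can drift slightly from the deterministic optimum (here, near the analytic value of 30), a second simulation at the nominal price quantifies the cost of rounding to practical price points, which is the @RISK step the abstract describes.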

Significance of Risk Quantification

John G. Zhao, MSc.
Manager, Project Risk Management Programme
Suncor Energy Services, Inc.

» Download white paper
» Download presentation

Industry Focus: Energy
Product Focus: DecisionTools Suite

The “significance of risk quantification in the decision-making process” reflects the emergence of modern risk management within the project management discipline. The integrated process, built on the DecisionTools Suite, forces decision makers to quantify risk impacts rather than, as is current practice, merely qualify risk effects on their decisions.

Risk Register, Monte Carlo Simulation, Decision Trees and Force Field Analysis have been integrated to facilitate the decision-making process. This system has been applied to the Suncor Bitumen Selection Strategy, and the case study proved successful. In addition, the result of this case study, along with further research work, may have potential commercial value. If the processes are properly generalized, theorized, and formalized, they will be valuable to Suncor and to any other energy company that desires a proven methodology for its future major capital project selection decisions, because “many organizations continue using decision practices that are decades out of date” (Schuyler, 2001, p. 29).

Using NeuralTools to Generate a
Pricing Model for Wool

Kimbal Curtis
Senior Research Officer
Department of Agriculture Western Australia

» Download presentation
» Download model

Industry Focus: Agriculture
Product Focus: NeuralTools

Most Australian wool is sold by open-cry auction, with approximately 3,500 sales per week per selling center. The parcels of wool are listed in a catalogue with a number of categorical and numeric descriptors (independent variables). For market reporting, the industry selects a set of ‘indicator’ types with specified values for the descriptors and quotes prices for those types. Unlike a share market index, there may be no sales of lots with exactly the same specifications as an indicator type, and so the market indicator value has to be estimated by interpolation with a numerical model.

For some abundant wool types, standard regression methods work well. However, for some of the less common types of wool, there are problems with regression. Not all types may be represented in each sale, independent variables are often correlated, and the relationships between variables may be nonlinear and dynamic over time. For these reasons we chose to test the usefulness of neural nets.

Our approach has been to trial neural nets using six months of data for each of the three main selling centers in Australia: Sydney, Melbourne, and Fremantle. Testing was undertaken against published values for the indicator types. A range of neural net configurations was explored. Variable Impact Analysis was used to confirm which variables are most important, and to identify candidate variables for deletion.

The approach is quick to use, appears to handle correlated variables, interactions and nonlinearity, and can be used to confirm the results of existing methods. Live prediction can be used to explore the price sensitivity of the independent variables, something of interest to both wool buyers and sellers.

 

Introduction to the DecisionTools Suite

Thompson Terry
Training Consultant
Palisade Corporation

» Download presentation and model

This session will show you how to integrate the elements of the DecisionTools Suite into a complete risk program. PrecisionTree will help identify the model and the decision analysis that needs to be done, as well as provide an efficient and effective way of presenting the results. We will use TopRank to set us on the right path to creating a probabilistic model out of a deterministic one. @RISK (including BestFit and RISKview) adds the uncertainty to the model and runs the Monte Carlo simulations required for effective risk analysis. RISKOptimizer produces the values for the decision variables in your model that optimize your required outcome.

Introduction to NeuralTools and StatTools

Dr. Javier Ordóñez
Training Consultant
Palisade Corporation

» Download presentation and model

In this session we will learn how to use Palisade’s two data analysis tools: NeuralTools and StatTools.

NeuralTools imitates brain functions in order to “learn” the structure of your data. Once NeuralTools understands the data, it can take new inputs and make intelligent predictions. The new predictions are based on the patterns in known data, and offer uncanny accuracy. NeuralTools can automatically update predictions when input data changes, and it can even be combined with Palisade’s Evolver or Excel’s Solver to optimize tough decisions and achieve desired goals.

StatTools is a Microsoft Excel statistics add-in. This session will cover how to perform the most common statistical tests, and will include topics such as: Statistical Inference, Forecasting, Data Management, Summary Analyses, and Regression Analysis.

Introduction to PrecisionTree 5.0

Eric Westwig
Software Engineer
Palisade Corporation

This presentation combines an introduction of the enhanced user interface, tighter Excel integration, and new features of PrecisionTree 5.0, with demonstrations of how PrecisionTree can be used to analyze various problems in decision analysis.

Introduction to Project Risk Management
using @RISK for Project

Dr. Javier Ordóñez
Training Consultant
Palisade Corporation

» Download presentation and model

The aim of this seminar is to give attendees a basic understanding of how @RISK for Microsoft Project works, including hands-on experience setting up and running simulations and interpreting the results.

Attendees will learn about the key functionality within @RISK for Project in a step-by-step manner, enabling them to quickly become familiar with basic concepts and terminology.

In addition to graphing and quantifying the risk in a business plan, you will learn how @RISK for Project, using Monte Carlo simulation, enables you to:

  • Calculate the probability of success
  • Graph the margin of error around the most likely outcome
  • Quantify and prioritize the risk drivers
  • Quantify the amount ‘@RISK’
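The simulation behind these results can be sketched in a few lines of Python; the three-task schedule and triangular duration parameters below are invented for illustration and are not taken from @RISK for Project.

```python
import random

random.seed(42)

# Hypothetical three-task schedule; each duration (in days) is uncertain and
# modeled with a triangular(min, most likely, max) distribution.
tasks = [(4, 5, 9), (8, 10, 15), (3, 4, 6)]
deadline = 22.0
trials = 20_000

totals = []
for _ in range(trials):
    totals.append(sum(random.triangular(lo, hi, mode)   # note: (low, high, mode)
                      for lo, mode, hi in tasks))

# Probability of finishing on time, and an 80th-percentile completion date.
p_on_time = sum(t <= deadline for t in totals) / trials
totals.sort()
p80 = totals[int(0.80 * trials)]
```

Repeating the schedule thousands of times with sampled durations is exactly the Monte Carlo idea: instead of a single deterministic finish date, you get a distribution of outcomes to graph and quantify.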

Introduction to RISKOptimizer and Evolver

Thompson Terry
Training Consultant
Palisade Corporation

» Download presentation and model

RISKOptimizer and Evolver use powerful genetic algorithms to perform optimization in Microsoft Excel.  RISKOptimizer builds on traditional optimization by adding Monte Carlo simulation to account for uncertain, uncontrollable factors in your optimization problem.  This session introduces you to these powerful tools, showing you how to set up a model, define constraints within the model, and ultimately arrive at the optimal outcome.  Examples will include resource allocation, budgeting, and scheduling.
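For intuition, here is a toy genetic algorithm in Python in the spirit of Evolver; the objective function, population size, and operators are illustrative assumptions, not Palisade's implementation.

```python
import math
import random

random.seed(1)

def fitness(x):
    # Objective to maximize on [0, 10]; an illustrative stand-in for a model output.
    return x * math.sin(x)

LO, HI = 0.0, 10.0
pop = [random.uniform(LO, HI) for _ in range(30)]
best_start = max(pop, key=fitness)

for _ in range(80):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:6]                               # elitism: keep the best solutions
    children = []
    while len(children) < len(pop) - len(elite):
        a, b = random.sample(elite, 2)
        child = (a + b) / 2                       # crossover: blend two parents
        child += random.gauss(0, 0.5)             # mutation: random perturbation
        children.append(min(HI, max(LO, child)))  # respect the bound constraint
    pop = elite + children

best = max(pop, key=fitness)
```

Because the elite individuals survive every generation, the best fitness can only improve; crossover and mutation keep exploring the search space, which is what lets genetic algorithms handle bumpy, non-smooth objectives that defeat classical optimizers. RISKOptimizer layers a full Monte Carlo simulation inside each fitness evaluation.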

What’s New in @RISK 5.0

Dr. Javier Ordóñez
Training Consultant
Palisade Corporation

» Download presentation

Try @RISK 5.0 yourself!  This hands-on introduction will briefly recap the main benefits and uses of risk analysis before walking you through key new features in @RISK 5.0.  You will experience the all-new interface as you define distributions, compare distributions using overlays, fit distributions to data, and correlate input distributions.  See firsthand how large models can be simplified by combining distributions in a new @RISK function.  Add distributions to and draw distributions from the new @RISK Library.  Define Six Sigma properties in your outputs.  Review and edit your entire model in the new @RISK Model window.  Swap distributions out for non-@RISK users using the new Function Swap feature, edit your model, and swap them back in again.  Simulate in the new Demo Mode and watch all charts, thumbnails, and reports update in real time.  View results using the new graphing engine, Scatter Plots, Box Whisker Plots, and Tornado Regression – Mapped Value charts.  Store simulation results in the @RISK Library or your workbook.  There is so much to see that we’ll cover as much as time permits.

What’s New in @RISK 5.0

Thompson Terry
Training Consultant
Palisade Corporation

» Download presentation and model

Try @RISK 5.0 yourself!  This hands-on introduction will briefly recap the main benefits and uses of risk analysis before walking you through key new features in @RISK 5.0.  You will experience the all-new interface as you define distributions, compare distributions using overlays, fit distributions to data, and correlate input distributions.  See firsthand how large models can be simplified by combining distributions in a new @RISK function.  Add distributions to and draw distributions from the new @RISK Library.  Define Six Sigma properties in your outputs.  Review and edit your entire model in the new @RISK Model window.  Swap distributions out for non-@RISK users using the new Function Swap feature, edit your model, and swap them back in again.  Simulate in the new Demo Mode and watch all charts, thumbnails, and reports update in real time.  View results using the new graphing engine, Scatter Plots, Box Whisker Plots, and Tornado Regression – Mapped Value charts.  Store simulation results in the @RISK Library or your workbook.  There is so much to see that we’ll cover as much as time permits.


Meet Palisade Developers

Palisade's developers have seen @RISK and other Palisade tools applied in dozens of different industries. Join us for this exciting roundtable discussion, where you’ll be able to provide feedback about Palisade tools and seek advice for your particular modeling issues. So bring your spreadsheet models and your software wish list, and get to know the people behind @RISK, the DecisionTools Suite, and more.

Model Sharing with the New @RISK Library

Sam McLafferty
CEO
Palisade Corporation

@RISK 5.0 is designed to better meet the needs of corporate-wide usage.  It features the @RISK Library, a new utility enabling customers to share specific probability distributions, parameters, and simulation results across workgroups.  Standardizing on modeling inputs has never been easier. Swap distributions with colleagues, share results graphs, reference simulation data from other users – all without leaving @RISK.  This session will demonstrate how this exciting new functionality can work for you.

@RISK 5.0 Advanced Topics Discussion

Sam McLafferty
CEO
Palisade Corporation

Have questions after seeing the overview of the all-new @RISK 5.0?  Want to know how @RISK 5.0 can apply to your specific modeling issues?  Or do you simply want to learn more about @RISK 5.0 than we had time to cover in the overview?  This session covers more advanced features of @RISK 5.0 and provides a forum for questions and answers.

Real Options with @RISK and RISKOptimizer

Dr. Wayne Winston
Professor, Kelley School of Business
Indiana University

» Download Winston files

In this presentation, Dr. Winston answers the question: how can corporate finance departments value project options such as expansion, contraction, and compound options?  He demonstrates, using real-world models, how to use @RISK and RISKOptimizer to perform risk-neutral valuation.
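The core of risk-neutral valuation can be illustrated with a minimal Monte Carlo sketch in Python. The parameters are hypothetical, and the model below (a European call under geometric Brownian motion) is the standard textbook case, not Dr. Winston's specific project-option models.

```python
import math
import random

random.seed(7)

# Risk-neutral Monte Carlo valuation of a European call: simulate the asset
# drifting at the risk-free rate, then discount the average payoff at r.
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.20, 1.0   # illustrative parameters
n_paths = 100_000

drift = (r - 0.5 * sigma ** 2) * T
vol = sigma * math.sqrt(T)
disc = math.exp(-r * T)

total_payoff = 0.0
for _ in range(n_paths):
    z = random.gauss(0.0, 1.0)
    ST = S0 * math.exp(drift + vol * z)   # terminal price under the risk-neutral measure
    total_payoff += max(ST - K, 0.0)      # call option payoff

mc_price = disc * total_payoff / n_paths
```

With these parameters the simulated price lands near the Black-Scholes value of about 10.45. The same replace-the-drift-and-discount recipe extends to project options (expand, contract, abandon) whose payoffs have no closed-form price, which is where @RISK and RISKOptimizer earn their keep.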

Selecting the Right Distribution

Thompson Terry
Software Engineer
Palisade Corporation

» Download presentation and model

How often have you looked at the palette of distributions in @RISK and related tools and wondered which one you should use?  A crucial aspect of risk modeling is selecting the appropriate distribution to represent each key uncertain variable.  @RISK offers a wealth of probability distributions: some, like the Uniform and Triangular, are very intuitive; others, like the Normal and Lognormal, are familiar to anyone with a scientific, engineering, or finance background.  Many of the other distributions offered in @RISK, however, give you access to sophisticated probability thinking that can greatly extend and simplify your risk models.  This session will explain, in simple terms and illustrated with example models, the thinking behind the most powerful distributions, what they model, and how they can be put to use in your risk analyses.
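One practical way to choose among candidate distributions is to compare their maximum-likelihood fits to the data. The Python sketch below is an illustration of that idea, not @RISK's fitting engine: it fits a Normal and a Lognormal to the same skewed sample and compares log-likelihoods.

```python
import math
import random

random.seed(3)

# Skewed, strictly positive sample; in a real model this would be observed data.
data = [random.lognormvariate(0.0, 0.5) for _ in range(500)]

def normal_loglik(xs):
    # Log-likelihood of a Normal at its MLE (sample mean and variance):
    # depends only on the fitted variance.
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return -0.5 * n * (math.log(2 * math.pi * var) + 1)

def lognormal_loglik(xs):
    # A Lognormal MLE is a Normal fit to log(x), minus the Jacobian term sum(log x).
    logs = [math.log(x) for x in xs]
    return normal_loglik(logs) - sum(logs)
```

For right-skewed, positive-only data like this sample, the Lognormal's log-likelihood comes out clearly higher, which is the quantitative version of "pick the distribution whose shape matches the phenomenon."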
