The new DecisionTools Suite 6 offers a wide range of improvements across its analyses. Below is a summary of new features by product. Click the video icons below to watch short videos of the new features.
All products in the DecisionTools Suite are now fully compatible with 32- and 64-bit versions of Excel 2013, Project 2013, and Windows 8, as well as previous versions of Office and Windows back to Office 2003/Windows XP.
@RISK 6.1 offers a new, faster calculation engine for the simulation of Microsoft Project schedules. Many Project simulations now run 5-15 times faster than before!
All products in DecisionTools Suite version 6.1 also let you change the language of the user interface from a single installer – perfect for global companies. All example files, tutorials, help documentation, and user manuals have been fully translated as well.
All DecisionTools products now include revised and updated example files, as well as all-new examples in a range of industries. All examples are clearly organized and easy to navigate. These examples have been designed and written by leading MBA professor and author Dr. Chris Albright of Indiana University. They provide step-by-step guidance on setting up and running models that address a wide range of industry applications. In addition, many examples offer short, built-in videos that walk you through the model.
Furthermore, new video tutorials have been developed by Dr. Albright to help new users get started quickly, and to help experienced users get more out of the software.
The @RISK ribbon has been reorganized to allow quicker access to common tasks and to make different analyses easier to find.
Like Excel, @RISK now provides a small toolbar for quick access to graphs and functions. It pops up right where you are working in your model, saving you mouse travel. To activate the mini toolbar, simply click and hold the left mouse button.
The @RISK mini toolbar provides quick access to common tasks.
@RISK now provides detailed statistics and simulation data in the same window as simulation results graphs. This simplifies your analysis by eliminating the need to open multiple windows. You also have the option to hide this information and view only summary statistics as before.
This new “double-sided” tornado graph shows an input’s positive and negative impact on actual output values – information that is very valuable to managers, and much easier to understand than statistical coefficients. It uses “input scenarios” to calculate the impact of each input on a specific output statistic, such as the mean or a percentile.
@RISK’s double-sided tornado is much easier to understand, making it ideal for managerial reporting.
Scatter plots have also been integrated into this “double-sided” tornado to help you understand and highlight different input scenarios.
By dragging a bar from a tornado graph, you can see a scatter plot of the impact of a given input on your output, and understand different scenarios.
New spider graphs display the change in a given output’s mean (or whatever statistic you specify) across a range of values for all the various inputs. It’s a very intuitive new view on sensitivity analysis – great for reports and presentations.
Spider graphs in @RISK show intuitively how an output changes as a given input changes.
@RISK now works across applications, enabling risk modeling of your Microsoft Project schedules with the same @RISK you use for modeling in Excel. You can now do your project risk modeling in Excel rather than Microsoft Project, opening up a new world of flexibility. A new interface layer reproduces your schedule in Excel, letting you use all Excel formulas and @RISK functions. When you make changes to your model in either Project or Excel, @RISK's Sync feature reflects those changes in the other application. (All @RISK modeling takes place in Excel, so @RISK functions do not appear in Microsoft Project.) Simulations of your Project schedules still run in Project itself, using Project's own scheduling and calculation engine.
The benefits of using Excel for your Project risk modeling are many. You can easily build risk registers in Excel for your Project model using the new “RiskProject” functions. You can integrate your cost and schedule analyses. You can standardize on a single tool, @RISK, to meet the needs of your project managers, cost estimators, and finance analysts: everyone who deals with risk in your company. Plus, a single interface means a shorter learning curve for everybody.
You can perform risk modeling on your Microsoft Project schedules directly in Excel. Here, we are defining a distribution to reflect the uncertain duration of a task.
@RISK can show you the likelihood of your project being completed on time or by a specific date.
@RISK now offers integrated bootstrapping to estimate confidence intervals for fitted parameters. This automated process saves you significant time and gives you more confidence in your fits.
There are new goodness-of-fit measures as well. You can also hold certain parameters fixed during fits, and fit data sets of up to 10 million values (increased from 100,000).
Furthermore, @RISK now offers batch fitting, which fits multiple data sets in a single operation. There’s even a correlation matrix feature built into batch fitting.
Finally, the live fit function, RiskFitDistribution, along with supporting functions, returns statistics on fit results in real time as new fits are run.
Fitting reports show which distributions were chosen and why, and include bootstrapping estimates of fitted parameters.
New goodness-of-fit tests, bootstrapping, and batch fitting are among the enhancements to @RISK’s distribution fitting.
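The automated bootstrapping in @RISK requires no setup, but the idea behind it is easy to see in code. The following Python sketch is a conceptual analogue, not @RISK's implementation, using made-up data: refit the distribution on resamples of the data, then take percentiles of the refitted parameters as the confidence interval.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=2.0, size=500)  # made-up data; stands in for your data set

# Fit a normal distribution to the original data (maximum likelihood).
mu_hat, sigma_hat = stats.norm.fit(data)

# Bootstrap: refit the distribution on resamples drawn with replacement.
n_boot = 1000
boot_params = np.empty((n_boot, 2))
for i in range(n_boot):
    resample = rng.choice(data, size=data.size, replace=True)
    boot_params[i] = stats.norm.fit(resample)

# 95% percentile confidence intervals for the fitted parameters.
mu_ci = np.percentile(boot_params[:, 0], [2.5, 97.5])
sigma_ci = np.percentile(boot_params[:, 1], [2.5, 97.5])
print(f"mu = {mu_hat:.3f}, 95% CI = {mu_ci}")
print(f"sigma = {sigma_hat:.3f}, 95% CI = {sigma_ci}")
```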
@RISK now offers a new set of functions for simulating time series processes, or values that change over time. Any future projection of time series values has inherent uncertainty, and @RISK now lets you account for that uncertainty by looking at the whole range of possible time series projections in your model. This is particularly useful in financial modeling and portfolio simulation.
There are functions available for 17 different statistical time series models, including ARMA, GBM, GARCH, and others. These functions are entered as array functions in Excel.
@RISK provides new windows for fitting historical time series data to these new functions. The results can be animated to show the behavior of your time series during simulation. All this is integrated into the existing @RISK interface.
@RISK has fitted a first-order moving average (MA1) stochastic time series process to this variable, the stock price of Apple.
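To see what one of these models does, here is a minimal Python sketch of geometric Brownian motion (GBM), one of the 17 processes named above. It is a conceptual illustration with assumed drift and volatility values, not @RISK's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed parameters for illustration: start price, annual drift, annual volatility.
s0, mu, sigma = 100.0, 0.08, 0.25
dt, steps, paths = 1 / 252, 252, 5  # daily steps over one year, five sample paths

# GBM: each step multiplies the price by a lognormal shock.
z = rng.standard_normal((paths, steps))
log_returns = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
prices = s0 * np.exp(np.cumsum(log_returns, axis=1))

print(prices[:, -1])  # one simulated year-end price per path
```

Each run of the snippet, like each @RISK iteration, produces a different possible future path; the simulation as a whole characterizes the range of outcomes.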
New distribution functions have been added to the more than 40 already in @RISK: DoubleTriang, Levy, Laplace, F, Extreme Value Min, and Bernoulli.
A variety of new distribution functions have been added to @RISK for even greater modeling flexibility.
An automatic converter has been added to @RISK that lets you open and run risk models created in Crystal Ball. @RISK converts Crystal Ball distributions and other model elements to native @RISK functions, so you can keep using all your old and current Crystal Ball models without rebuilding them.
OptQuest is a widely used, state-of-the-art optimizer, and it is now available in RISKOptimizer and Evolver. The OptQuest engine integrates Tabu Search, Neural Networks, Scatter Search, and Linear/Integer Programming into a single composite method. It delivers excellent results quickly on many types of models.
OptQuest supplements the existing Genetic Algorithm engine, which remains available. RISKOptimizer or Evolver can select the engine automatically based on your model, or you can choose one yourself.
RISKOptimizer now has the ability to share simulation settings with @RISK, saving redundant entry. All RISKOptimizer commands are now available directly from the @RISK ribbon. Plus, all @RISK reports, graphs and features are available to analyze RISKOptimizer’s best solution.
All RISKOptimizer commands are now available directly from the @RISK ribbon.
If an optimization problem is linear (that is, both the objective function and the constraints are linear), Evolver will now solve it using a linear programming algorithm. This makes the optimization very fast and guarantees that the solution found is the best possible one. Evolver's linear programming works with all types of variables (adjustable cells): continuous, integer, and discrete.
Report from a linear programming optimization. On the first trial, Evolver evaluates the solution defined by the original cell values; it then generates the optimal solution on the second trial.
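Evolver's linear programming engine is built into the product, but the class of problems it now detects can be illustrated with any off-the-shelf LP solver. The sketch below uses SciPy's linprog as a stand-in for Evolver, with made-up production-planning numbers.

```python
from scipy.optimize import linprog

# A linear problem: linear objective, linear constraints. Numbers are illustrative.
# Maximize profit 40*x + 30*y  ->  minimize -(40*x + 30*y).
c = [-40, -30]
A_ub = [[1, 1],   # labor hours:    x + y <= 100
        [2, 1]]   # raw material: 2*x + y <= 150
b_ub = [100, 150]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)  # optimal production levels (50, 50) and maximum profit 3500
```

Because the problem is linear, the solver reaches the provably optimal solution directly, which is why Evolver can report it on the second trial rather than searching iteratively.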
Improvements to constraint handling
With the addition of the OptQuest optimization engine, constraints are often handled more efficiently. If a constraint is linear, Evolver and RISKOptimizer will not even attempt solutions that fail it, making optimizations faster. The handling of non-linear constraints is also improved. For example, an optimization that starts with adjustable cell values that do not meet the specified constraints was difficult to handle in previous versions (it required the Constraint Solver utility). With OptQuest, this type of situation no longer requires special handling.
This is a log of solutions attempted during a RISKOptimizer optimization. The constraints in this problem are linear and therefore all the solutions generated by RISKOptimizer are valid (meet the constraints).
“Discrete” adjustable cells
During optimization, RISKOptimizer and Evolver try different values of the adjustable cells within the specified minimum and maximum values. However, not all values within that range may be realistic. For example, when deciding production levels, we may need to take into account that the product is manufactured in batches, so only multiples of 10 are realistic production levels. A situation like this can now be reflected in RISKOptimizer and Evolver using “discrete” adjustable cells. Discrete adjustable cells can also speed up optimizations: with cells defined as “discrete” there are fewer possible solutions, so the best solution will generally be found faster.
This is a log of solutions attempted during an Evolver optimization. Some cells have been defined as discrete with a “step size” of 10, so for these cells Evolver only attempts values that are multiples of 10.
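The speed-up from discrete cells comes from simple counting: a step size leaves far fewer candidate values per cell, and therefore far fewer candidate solutions overall. A tiny Python sketch with illustrative bounds:

```python
# With a range of 0..1000, an integer cell has 1001 candidate values;
# a "discrete" cell with step size 10 has only 101. Illustrative numbers only.
lo, hi, step = 0, 1000, 10

integer_candidates = hi - lo + 1             # 1001
discrete_candidates = (hi - lo) // step + 1  # 101

values = [lo + i * step for i in range(discrete_candidates)]
print(discrete_candidates, values[:5])  # 101 [0, 10, 20, 30, 40]
```

With several adjustable cells, the reduction multiplies across cells, so the search space shrinks dramatically.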
PrecisionTree now allows you to “flip” one or more chance nodes in a model in order to show probabilities calculated using Bayes’ Rule. This is valuable when the probabilities in a model are not available in a directly useful form. For example, you may need to know the probability of an outcome occurring given the results of a particular test. The test’s accuracy may be known, but the only way to determine the probability you seek is to “reverse” a traditional tree using Bayes’ Rule. Now, in PrecisionTree, this process is easy.
Bayesian Revision video
Using Bayesian Revision, PrecisionTree can “flip” a traditional tree to present the probabilistic results you need in a useful way.
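The probability reversal that Bayesian Revision performs can be checked by hand. Here is a short Python sketch of Bayes' Rule with assumed numbers: a condition with 2% prevalence and a test with 95% sensitivity and 95% specificity.

```python
# Bayes' Rule "flips" the known conditionals (test accuracy) into the one you
# need (probability of the condition given a positive test). Assumed numbers.
p_cond = 0.02
sensitivity = 0.95   # P(positive | condition)
specificity = 0.95   # P(negative | no condition)

p_positive = sensitivity * p_cond + (1 - specificity) * (1 - p_cond)
p_cond_given_positive = sensitivity * p_cond / p_positive

print(f"P(condition | positive test) = {p_cond_given_positive:.1%}")  # about 27.9%
```

Even with an accurate test, a rare condition yields a surprisingly low posterior probability, which is exactly the kind of insight a flipped tree makes visible.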
The new Append Subtree command allows you to quickly build large trees, saving a lot of time.
You can quickly add additional subtrees to existing trees with the Append Subtree feature.
Now you can easily place a new node between existing nodes, in fewer steps than before.
The Insert Node command saves a lot of time when you need to edit your tree.
PrecisionTree lets you copy and paste any part of a decision tree into Word, PowerPoint, or any application for reports and presentations. Simply right-click on any node in a tree and copy an image of the whole tree or the subtree from that node onward.
PrecisionTree lets you copy and paste any subtree into Word, PowerPoint, or any application for reports and presentations.
PrecisionTree now supports twice as many sensitivity analysis inputs as before. The Policy Suggestion graph now provides more information about the benefits of the correct decisions. And a number of interface enhancements have been added to make cell referencing, adding branches, and other common tasks easier.
The amount of data available for training a neural net is often limited, and obtaining additional data can be expensive. The new Testing Sensitivity Analysis in NeuralTools helps you make the most of small data sets.
When a neural net is trained on a small data set, the subset of data used to test the net is also small, which limits the reliability of the testing results. This new analysis helps determine whether the testing results are reliable given the amount of data set aside for testing. It can also tell you whether changing the size of the testing subset would increase the reliability of the results.
Testing Sensitivity Analysis output shows the stability of the testing results for different sizes of the subset of data that is set aside for testing.
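The question this analysis answers can also be sketched outside NeuralTools. The Python snippet below is a conceptual analogue with simulated classifications, not NeuralTools' own procedure: for several test-set sizes, repeatedly draw a test subset and see how much the measured accuracy varies.

```python
import numpy as np

rng = np.random.default_rng(2)
n_cases = 300
# Pretend the trained net classifies 85% of all cases correctly.
correct = rng.random(n_cases) < 0.85

for test_size in (15, 30, 60, 120):
    # Repeatedly hold out a random test subset and measure accuracy on it.
    accuracies = [rng.choice(correct, size=test_size, replace=False).mean()
                  for _ in range(1000)]
    print(f"test size {test_size:3d}: accuracy spread (std) = {np.std(accuracies):.3f}")
```

The smaller the test subset, the wider the spread of measured accuracies; that instability is what the Testing Sensitivity Analysis quantifies.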
StatTools now includes a scatter plot matrix that lets you view multiple scatter plots between variables in a single report. This consolidates your data views, saving a lot of time.
New scatter plot matrices in StatTools allow you to see, at a glance, relationships between many different variables.
You can also select a categorical variable to color your data points, identifying which category each point falls into.
StatTools lets you color different categories in scatter plots. For example, here you see the relationship between salary and previous spending for consumer data, with gender indicated by color.
StatTools also makes it easy to determine Spearman rank-order (non-linear) correlations from your data, for use in @RISK or other modeling.
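As a quick point of reference, the Spearman statistic is rank-based, so it captures monotone non-linear relationships. The sketch below computes the same kind of matrix with pandas on illustrative data.

```python
import pandas as pd

# Spearman rank-order correlations on illustrative consumer data.
df = pd.DataFrame({
    "salary":   [42, 55, 61, 48, 73, 39, 80],
    "spending": [5.1, 7.2, 6.8, 5.9, 9.5, 4.2, 10.1],
})

print(df.corr(method="spearman"))  # rank-based, so monotone non-linear relations show up
```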