@RISK (Version 3.5e)

Journal of Property Valuation and Investment

ISSN: 0960-2712

Article publication date: 1 August 1998

Citation

(1998), "@RISK (Version 3.5e)", Journal of Property Valuation and Investment, Vol. 16 No. 3, pp. 345-347. https://doi.org/10.1108/jpvi.1998.16.3.345.3

Publisher: Emerald Group Publishing Limited


Systems requirements: IBM PC 386 or higher; Microsoft Windows 95, NT or 3.x; Microsoft Excel 4.0, 5.0 or 7.0, or Lotus 1‐2‐3 for Windows release 4.0 or higher; 4Mb RAM free.

Recommended: 486 or higher CPU; Windows 95; 8Mb RAM installed.

Home Page: www.palisade.com

The copy that I used for review was a 16‐bit version designed for use with Excel and Windows 3.1. There is also a 16‐bit version for Lotus 1‐2‐3. I was able to run both the full program and tutorial without a problem in Windows 95, but there are also 32‐bit versions of the same program available for Excel and Lotus to handle substantial models in Windows 95 and NT.

In general, the techniques in an @RISK risk analysis involve four steps:

• developing the model, by defining the problem in Excel or 1‐2‐3;

• identifying “uncertainty” in relation to key variables in the model, by specifying their possible values with probability distributions;

• analysing the model using simulation to determine the range and probabilities of all possible outcomes for the results of the worksheet; and

• making the decision, based on the results and personal judgement.

To achieve this, @RISK extends the analytical capabilities of Lotus or Excel to include risk analysis and simulation, and adds two extra toolbars to the normal Windows display in Excel.

@RISK uses Monte Carlo simulation, which specifies uncertain (“stochastic”) input values in the spreadsheet as probability distribution functions (PDFs). The program adds some 37 more PDFs to those already provided by Excel and Lotus, each of which gives the range of values the variable could take (minimum to maximum) and the likelihood of occurrence within that range. These PDFs can then be placed in worksheet cells and formulae just like other spreadsheet functions. For example, a “triangular distribution” where 10 is the minimum value, 20 the most likely and 30 the maximum takes the form RiskTriang(10, 20, 30).
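
The behaviour of such a distribution function can be sketched outside the spreadsheet. The following Python fragment is a rough analogue of the RiskTriang example above (my own illustration, not Palisade's code):

```python
import random

def risk_triang(minimum, most_likely, maximum):
    # Rough analogue of @RISK's RiskTriang(min, most likely, max):
    # draw one value from a triangular distribution over that range.
    return random.triangular(minimum, maximum, most_likely)

# Sample RiskTriang(10, 20, 30) many times; the theoretical mean of a
# triangular distribution is (min + mode + max) / 3 = 20 here.
samples = [risk_triang(10, 20, 30) for _ in range(100_000)]
print(sum(samples) / len(samples))  # should be close to 20
```

Every sample falls between the minimum and maximum, with values near the most likely point drawn most often.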

As well as being certain or uncertain, variables can also be “independent” or “dependent”. @RISK allows the user to define which variables are of each type in the model and to specify the nature and direction of correlation between them.
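
One common way to induce such a dependency between two inputs is shown below. This is a generic textbook construction for correlated normal draws, not @RISK's own method (which works on rank-order correlations):

```python
import math
import random

def correlated_normals(n, rho):
    # Induce correlation rho between two standard normal inputs:
    # z2 = rho * z1 + sqrt(1 - rho^2) * independent noise.
    pairs = []
    for _ in range(n):
        z1 = random.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho * rho) * random.gauss(0, 1)
        pairs.append((z1, z2))
    return pairs

# Two inputs with a specified positive correlation of 0.8.
pairs = correlated_normals(20_000, 0.8)
```

The direction of correlation is set by the sign of rho; rho = 0 reduces to two independent inputs.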

Once distribution functions have been entered into your spreadsheet, output cells also need to be identified, prior to running the simulation.

A simulation in @RISK involves recalculating the worksheet that has been created using iteration. For each iteration:

• all distribution functions are sampled;

• sampled values are returned to the cells/formulae of the worksheet;

• the worksheet is recalculated; and

• values for the output cells are collected from the worksheet and stored.
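
The iteration loop above can be sketched in Python. The worksheet, cell names and distributions here are a hypothetical toy model, not @RISK's implementation:

```python
import random
import statistics

def iterate_model(n_iterations):
    # Toy "worksheet": rent and yield are uncertain input cells,
    # value is the output cell.
    outputs = []
    for _ in range(n_iterations):
        # 1. sample every distribution function
        rent = random.triangular(90, 130, 110)  # like RiskTriang(90, 110, 130)
        yld = random.uniform(0.05, 0.08)        # like RiskUniform(0.05, 0.08)
        # 2./3. return the samples to the "cells" and recalculate the worksheet
        value = rent / yld
        # 4. collect the output cell's value and store it
        outputs.append(value)
    return outputs

results = iterate_model(10_000)
print(statistics.mean(results), min(results), max(results))
```

After enough iterations the stored outputs approximate the full distribution of the result, which is what the statistics and graphs below summarise.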

You can also choose what type of simulation @RISK performs. For example, “multiple simulation” allows one simulation after another to be run on the same model, changing values between simulations. The sampling type can also be chosen: “Latin Hypercube” is the default, which uses stratified sampling, whereas “Monte Carlo” selects standard random sampling. Finally, you can specify the number of iterations and monitor convergence during a simulation to track the stability of the output figures being generated.
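
The difference between the two sampling types can be illustrated for a single uniform input. Latin Hypercube splits the probability range into equal bands and draws once from each, so the full range is covered with fewer iterations than plain random sampling (a sketch of the general technique, not Palisade's code):

```python
import random

def latin_hypercube_uniform(n, low, high):
    # Stratified sampling: split the 0-1 probability range into n equal
    # bands, draw one point from each band, then shuffle so that draws
    # pair up randomly across different input variables.
    # Plain Monte Carlo would instead call random.uniform(low, high) n times.
    points = [(i + random.random()) / n for i in range(n)]
    random.shuffle(points)
    return [low + p * (high - low) for p in points]

samples = latin_hypercube_uniform(100, 0.0, 1.0)
```

With 100 draws, exactly one sample lands in each band [i/100, (i+1)/100), whereas standard Monte Carlo sampling can leave parts of the range empty.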

The simulation results produced by @RISK include statistics and data reports for both input and output variables. The statistics include minimum and maximum calculated values, mean, standard deviation and percentiles. Additionally, histograms, cumulative curves and summary graphs for cell ranges are very easy to produce, either in @RISK or in Excel.

@RISK can also perform two additional advanced analyses: sensitivity analysis and scenario analysis.

Sensitivity analysis shows the sensitivity of each output variable to the inputs in the worksheet, and this identifies the most critical inputs in the model. Using either multivariate stepwise regression or rank order correlation, coefficients are produced and the results can be visually displayed in the shape of a tornado chart, with longer bars at the top representing the most significant input variables.
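
The rank order correlation route can be sketched as follows: replace each variable's samples by their ranks and compute a correlation coefficient per input, with the largest coefficients marking the most critical inputs. This is a generic Spearman rank correlation, shown for illustration rather than as @RISK's exact procedure:

```python
import random

def rank(values):
    # Replace each value by its rank (1 = smallest).
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = float(r)
    return ranks

def spearman(xs, ys):
    # Spearman rank correlation via the classic rank-difference formula
    # (assumes no ties, as with continuous simulation samples).
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rank(xs), rank(ys)))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Toy check: an output driven mainly by the first of two inputs.
x1 = [random.random() for _ in range(500)]
x2 = [random.random() for _ in range(500)]
out = [a * 10 + b for a, b in zip(x1, x2)]
print(spearman(x1, out), spearman(x2, out))  # x1's coefficient is far larger
```

Sorting the inputs by the absolute size of these coefficients gives exactly the ordering that a tornado chart displays, longest bar first.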

Scenario analysis, on the other hand, identifies combinations of inputs which lead to output target values. This allows testing of such statements as “when NPV is high, significant inputs are a high expectation of inflation and a high long‐dated gilt rate”.
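
The idea behind such a scenario test can be approximated by filtering the stored iterations where the output meets its target and comparing the inputs in that subset with the inputs overall. The model and numbers below are invented purely to echo the quoted statement, not a valuation model:

```python
import random
import statistics

random.seed(1)
# Toy simulation data: NPV here rises with both the inflation expectation
# and the long-dated gilt rate, matching the quoted scenario statement.
inflation = [random.uniform(0.01, 0.06) for _ in range(2000)]
gilt = [random.uniform(0.04, 0.09) for _ in range(2000)]
npv = [100 * i + 50 * g + random.gauss(0, 0.5)
       for i, g in zip(inflation, gilt)]

# "High NPV" scenario: iterations in the top quartile of the output.
threshold = sorted(npv)[int(0.75 * len(npv))]
high = [k for k, v in enumerate(npv) if v >= threshold]

# Compare each input's median in the scenario subset with its overall
# median: a large shift flags that input as significant for the scenario.
for name, xs in (("inflation", inflation), ("gilt", gilt)):
    shift = statistics.median(xs[k] for k in high) - statistics.median(xs)
    print(name, round(shift, 4))
```

In this toy data both inputs shift upwards in the high-NPV subset, which is the kind of pattern the quoted statement describes.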

In conclusion, @RISK offers a very powerful tool for analysing risk. The manual is well written and concise, and the disk‐based tutorial provides an excellent summary of the full program. Having run the program on both a low‐end and a high‐end PC, I found the difference in speed and ability to handle large models noticeable. Most users will therefore require Pentium‐based PCs with at least 16Mb of memory. As always, decision making comes down to personal judgement, and someone without a strong statistics background may not understand the rationale behind @RISK and could make the wrong judgement. Nevertheless, if the limits of probability and financial theory are well understood, @RISK is an excellent tool for decision making.

