OR/MS Today - April 2006
Probability Management Part 2
Small models linked through their input and output distributions create coherent networks of models that identify enterprise-wide risks and opportunities.
By Sam Savage, Stefan Scholtes and Daniel Zweidler
In the first article in this series, we presented the seven deadly sins of averaging. To counter them, we introduced the concept of Probability Management, which focuses on estimating, maintaining and communicating the distributions of the random variables driving a business. We described its three underpinnings as follows:
1. Interactive simulation: illuminates uncertainty and risk much as light bulbs illuminate darkness.
2. Centrally generated stochastic libraries of probability distributions: provide standardized probability distributions across the enterprise, much as power plants provide standardized sources of electricity to light bulbs.
3. Certification authority: analogous to the power authority that ensures that you get the expected voltage from your wall socket. We refer to the person or office with this authority as the Chief Probability Officer, or CPO.
In this article, we discuss each of these areas in more detail, and then finish with a short discussion of the potential for Probability Management in regulation and accounting.
In "Action in Perception" , the philosopher Alva Noë argues that without action on the part of the observer, there can be no perception. He describes an experiment in which two kittens are presented with the same visual environment, but only one of the two can interact with it by walking on a turntable. The other is suspended just above the turntable. By the end of the experiment, the suspended kitten has not learned how to process visual information and is effectively blind. No wonder managers have so much difficulty understanding and communicating uncertainty and risk. After all, how do you interact with a probability distribution?
Interactive simulation may be the answer. The "exploration cockpit" at Shell, described in our earlier paper, allowed managers to select or deselect projects with a mouse click. The resulting portfolio was then evaluated through repeated copies of Excel formulas, each driven by a separate row of pre-calculated Monte Carlo trials in the stochastic library. The statistical properties of the portfolio were immediately apparent through the graphical interface. If, however, a sensitivity analysis calls for changing underlying econometric parameters such as the future distribution of oil or gas prices, the stochastic library and the associated universe of portfolios have to be regenerated. This procedure is currently too slow computationally to qualify as interactive in a decision-making setting. It is hoped that the new simulation technology described below, coupled with ever-faster computers, will expand the envelope of interactive exploration.
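To make the mechanics concrete, here is a minimal sketch in Python (Shell's cockpit itself was built in Excel) of why a pre-computed trial library makes exploration interactive: toggling a project in or out merely re-aggregates rows that have already been simulated, so no new Monte Carlo run is required. All names and figures below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pre-computed stochastic library: one column of NPV trials
# per candidate project, one row per Monte Carlo trial. In practice the
# library would be generated centrally and read from disk.
n_trials, n_projects = 10_000, 8
npv_trials = rng.normal(loc=50, scale=30, size=(n_trials, n_projects))

def portfolio_stats(selected):
    """Re-aggregate the pre-computed trials for the chosen projects.

    Because no new simulation is run, this is fast enough to call on
    every mouse click, which is what makes the display interactive.
    """
    portfolio = npv_trials[:, selected].sum(axis=1)  # one value per trial
    return {
        "mean NPV": portfolio.mean(),
        "5th percentile": np.percentile(portfolio, 5),
        "P(loss)": (portfolio < 0).mean(),
    }

print(portfolio_stats([0, 2, 5]))   # toggle projects in and out at will
```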
Another firm using interactive portfolio simulation is Bessemer Trust of New York. Bessemer has a model for displaying the implications of various wealth management strategies for its clients. According to Andrew M. Parker, managing director and head of Quantitative Strategies at Bessemer, "one significant drawback with most simulation software is that it can be time consuming. This can overwhelm the potential to easily compare and contrast different scenarios. Having an interactive model dramatically solves this problem."
Although the interactive portfolio models at Shell and Bessemer have proven successful, they were complex to develop and maintain. Furthermore, it would not be easy to generalize the approach beyond the modeling of portfolios. This appears to be about to change.
One of the founding fathers of the spreadsheet revolution has developed technology that automates the process of interactive simulation. Dan Fylstra, CEO of Frontline Systems, whose earlier company published VisiCalc, has introduced technology (see story on page 62) that almost instantly runs thousands of Monte Carlo trials every time you modify an input to a normal spreadsheet model. This will allow a large managerial audience to start interacting with, and one hopes sharpening their perception of, probability distributions.
Coherent modeling: preserving relationships.
The multivariate distributions driving the firm are stored in a Stochastic Library Unit with Relationships Preserved, or SLURP. In its simplest form, this is a matrix of pre-generated Monte Carlo trials, with one column for each uncertain business driver and one row per trial.
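As a minimal illustration, a SLURP can be held in an ordinary numeric matrix; the essential point is that the columns are generated jointly, so the relationship between the drivers is baked into the rows. The drivers and parameters below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# A minimal SLURP: 10,000 joint Monte Carlo trials of two business
# drivers, stored with one column per driver and one row per trial.
# Generating the columns *jointly* preserves the relationship between them.
n_trials = 10_000
gdp_growth = rng.normal(0.03, 0.02, n_trials)
# Assume (for illustration only) that oil price rises with GDP growth.
oil_price = 40 + 800 * gdp_growth + rng.normal(0, 8, n_trials)

slurp = np.column_stack([oil_price, gdp_growth])  # shape (10000, 2)
print(np.corrcoef(slurp, rowvar=False))           # dependence is baked in
```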
Demographers use SLURPs as a matter of course. They call them "representative samples." A representative sample of, say, 10,000 U.S. citizens can be used to generate a SLURP for such quantities as income, education, family size and voting behavior, with all relationships preserved. One can think of a SLURP for business planning as a "representative sample" of the possible futures.
Modeling dependence: We come to bury correlation, not to praise it.
Because a SLURP consists of actual joint trials, the dependence among business drivers is carried in the data itself; nonlinear and tail relationships that a single correlation coefficient cannot express are preserved automatically.
Time, the third dimension.
There are relationships not only between oil price and GDP, but also between oil prices in successive time periods. Adding time as a third dimension turns the SLURP matrix into a cube of trials, variables and time periods. One trial is a rectangular "slice" of this three-dimensional cube, with oil price and GDP defining one side and time defining the other.
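A sketch of this three-dimensional arrangement, with invented dynamics: the library becomes a trials-by-variables-by-time array, and indexing out a single trial yields exactly the rectangular slice described above.

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials, n_vars, n_periods = 10_000, 2, 5   # variables: oil price, GDP growth

# Build the cube one period at a time so that each period depends on
# the previous one (a simple random-walk step, purely for illustration).
cube = np.empty((n_trials, n_vars, n_periods))
cube[:, :, 0] = rng.normal([60.0, 0.03], [8.0, 0.02], (n_trials, n_vars))
for t in range(1, n_periods):
    cube[:, :, t] = cube[:, :, t - 1] + rng.normal(0, [4.0, 0.01], (n_trials, n_vars))

one_trial = cube[0]          # a rectangular slice: variables x time
print(one_trial.shape)       # (2, 5)
```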
Coherence and the fundamental identity of SLURP algebra.
Stochastic models may thus be rolled up to higher levels. SLURPs can in theory be propagated horizontally across hierarchies of organizations, vertically through supply chains, and dynamically forward in time.
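The sketch below, with invented unit models, illustrates such a roll-up. Because trial i of every library refers to the same joint future, combining unit results row by row yields a corporate SLURP with all relationships intact.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials = 10_000

# Shared input SLURP: one oil-price draw per trial (figures invented).
oil = rng.lognormal(np.log(60), 0.25, n_trials)

# Each unit's model is applied per trial; unit A profits from high oil
# prices, unit B (a consumer of oil) suffers from them.
unit_a = 120 * oil - 5_000
unit_b = 9_000 - 40 * oil

# The rolled-up corporate SLURP: still one row per trial, and the
# natural hedge between the two units is preserved automatically.
corporate = unit_a + unit_b
print(corporate.mean(), np.percentile(corporate, 5))
```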
We summarize this in what we call the fundamental identity of SLURP algebra as follows.
Let X = (X1, ..., Xn) be a vector of uncertain inputs represented by the SLURP S(X), and let Y = (Y1, ..., Ym) = F(X) denote the outputs of a model F that depends on X.
The SLURP of the outputs of F is found by evaluating F for each of the trials in the SLURP of X, or symbolically, S(F(X)) = F(S(X)).
The crucial argument is simple: the output SLURP F(S(X)) inherits the sample property from the input SLURP S(X); that is, if all trials in S(X) have the same probability of occurring, then so do the corresponding trials in F(S(X)).
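In code, the identity amounts to nothing more than row-wise evaluation. A minimal sketch, with the model F and all parameters invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
n_trials = 10_000

# Input SLURP S(X): equally likely joint trials of X = (X1, X2).
s_x = rng.normal([10.0, 2.0], [3.0, 0.5], (n_trials, 2))

def F(x1, x2):
    """A nonlinear model of the business, chosen only for illustration."""
    return np.minimum(x1 * x2, 25.0)   # e.g. capacity-limited revenue

# S(F(X)) = F(S(X)): evaluate F once per trial (row) of the input SLURP.
s_y = F(s_x[:, 0], s_x[:, 1])

# Each output trial inherits the probability 1/n_trials of its input row.
print(s_y.mean(), np.percentile(s_y, [5, 50, 95]))
```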
This identity stands in stark contrast to the strong form of the flaw of averages (closely related to Jensen's inequality), which observes that in general E(F(X)) ≠ F(E(X)) when F is a nonlinear function, where E(X) denotes the expectation of X. It is this inequality that leads to many of the systematic errors embodied in the seven deadly sins of averaging when single numerical values are propagated through an organization. Thus, the use of SLURPs cures the flaw of averages.
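The contrast can be seen numerically in a few lines. With an invented capacity-constrained model, planning on the average input overstates the average outcome:

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(10.0, 3.0, 100_000)    # uncertain demand, say

def F(d):
    return np.minimum(d, 10.0)        # capacity of 10: nonlinear

print(F(x.mean()))    # plan built on the average input: 10.0
print(F(x).mean())    # average of the plan over all trials: about 8.8
```

The gap between these two numbers is exactly the kind of systematic error that arises when single average values, rather than distributions, are passed through an organization.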
At Shell, the stochastic library had to be assembled from vast amounts of data gathered worldwide. The first decision was the level of granularity at which to model projects. The level chosen was the "exploration venture," which includes a number of projects within a single geographical region. As the first step toward creating a stochastic library, the exploration engineers within each venture were responsible for providing initial estimates of the distribution of oil and gas volumes in that venture. When assembling distributions of possible hydrocarbon volumes and economic value of exploration, it is important to acknowledge the consequences of the "Prospector Myth" described by Pete Rose and Gary Citron. Explorers are by their very nature not only optimistic, but they also often fail to recognize the full range of possible outcomes of an exploration venture. Painting the numerical picture of an exploration venture and its various execution alternatives is a mélange of art and science underpinned by experience.
The distributions of hydrocarbon volumes were assumed to be independent across ventures. In contrast, the economic evaluations of the ventures have strong relationships resulting from global oil and regional gas prices. The volumetric distributions were converted to coherent distributions of economic output by using discrete distributions of oil and gas prices and the associated drilling and development cost assumptions. For the economic evaluations, the input parameters are distributed globally through a shared library updated on an annual basis.
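A sketch of this construction (all volumes, prices and costs invented): volumetric trials are drawn independently for each venture, but every venture's economics uses the same price draw on each trial, which is what makes the resulting economic outputs coherent.

```python
import numpy as np

rng = np.random.default_rng(6)
n_trials = 10_000

# Independent volumetric trials for two ventures (million barrels).
vol_a = rng.lognormal(np.log(80), 0.6, n_trials)
vol_b = rng.lognormal(np.log(40), 0.8, n_trials)

# A discrete, centrally maintained oil-price distribution ($/bbl).
prices, probs = np.array([30.0, 50.0, 80.0]), np.array([0.3, 0.5, 0.2])
price = rng.choice(prices, size=n_trials, p=probs)   # one draw per trial

# Crude per-venture economics (margins and costs invented). Using the
# SAME price draw in both ventures makes their economic outputs coherent
# even though their volumes are independent.
npv_a = vol_a * (price - 18.0) - 900.0
npv_b = vol_b * (price - 22.0) - 400.0

print(np.corrcoef(npv_a, npv_b)[0, 1])   # positive: induced by shared price
```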
To provide assurance that the ventures and their execution alternatives are not only feasible as described, but also portray the cost and value elements appropriately, seasoned explorers and economists review the input to the coherent simulation that generates the stochastic library of outcomes for exploration ventures and their alternative execution plans. They also engage in further dialog with the engineers and managers in the field to ensure consistency across ventures.
At Bessemer, the situation was quite different. First, with financial portfolios there is rich historical data and a number of accepted approaches to modeling asset growth. The second difference was that the ultimate consumers of the information derived from the simulations were Bessemer's individual clients.
"In the wealth management business, it's extremely important to assure that clients understand the risk in their investment portfolios," says Parker, "and the only way to do this effectively is to use probabilistic modeling. To this end, having a centrally managed process with a shared library of asset distributions assures uniformity across the organization." Parker periodically updates this library, and distributes it to others in the organization to use in the simulation models that he also oversees. "This allows our client account managers to give robust, consistent answers without requiring a deep knowledge of statistics," Parker adds.
Regulators of financial institutions and other organizations are concerned not only with the stability of individual firms within a given industry, but also with the stability of the industry as a whole. Establishing coherent benchmark distributions of global economic factors would provide a uniform basis against which firms could be stochastically compared.
After the Enron fiasco, the U.S. Congress moved to increase transparency into the risks faced by publicly traded firms. The resulting Sarbanes-Oxley legislation mandated tighter adherence to Generally Accepted Accounting Principles (GAAP). Unfortunately, GAAP itself is permeated with examples of the flaw of averages [7, 8]. Although the accounting industry by its nature does not change quickly, there may be opportunities in this area for those with training in accounting, law and stochastic modeling.
OR/MS Today copyright © 2006 by the Institute for Operations Research and the Management Sciences. All rights reserved.