ERM – Strategy @ Risk

Tag: ERM

  • You only live once


    This entry is part 4 of 4 in the series The fallacies of scenario analysis

    You only live once, but if you do it right, once is enough.
    — Mae West

Let’s say that you are considering new investment opportunities for your company and that the sales department has guesstimated that the market for one of your products will most likely grow by a little less than 5 % per year. You then observe that the product already has a substantial market and that in fifteen years’ time it will nearly have doubled:

Building a new plant to accommodate this market growth will be a large investment, so you find that more information about the probability distribution of the product’s future sales is needed. Your sales department then “estimates” the market’s yearly growth to have a mean close to zero, a lower quartile of minus 5 % and an upper quartile of plus 7 %.

Even with no market growth the investment is a tempting one, since the market already is substantial and there is always a possibility of increased market share.

As quartiles are given, you rightly calculate that there is a 25 % probability that the growth will be above 7 %, but also a 25 % probability that it will be below minus 5 %. On the face of it, and with you being not too risk averse, this looks like a gamble worth taking.

Then you are informed that the distribution will be heavily left skewed – opening up considerable downside risk. In fact it turns out that it looks like this:

A little alarmed, you order the sales department to come up with a Monte Carlo simulation giving a better view of the possible future paths of the market development.

They return with the graph below, giving the paths for the first ten runs in the simulation, with the blue line showing the average value and the green and red lines the 90 % and 10 % limits of the one thousand simulated outcomes:

The blue line is the yearly ensemble average ((A set of multiple predictions that are all valid at the same time. The term “ensemble” is often used in physics and physics-influenced literature; in the probability theory literature the term probability space is more prevalent. An ensemble provides reliable information on forecast uncertainties (e.g., probabilities) from the spread (diversity) amongst ensemble members. See also ensemble forecasting: a numerical prediction method used to generate a representative sample of the possible future states of a dynamic system. Ensemble forecasting is a form of Monte Carlo analysis: multiple numerical predictions are made using slightly different initial conditions that are all plausible given the past and current set of observations. It is often used in weather forecasting.)); that is, the time series of the averages of the outcomes. The series shows a small decline in market size, but not at an alarming rate. The sales department’s advice is to go for the investment and try to conquer market shares.

You then note that the ensemble average implies that you are able to jump from path to path, but since each path is a different realization of the future this will not be possible – you only live once!

You again call the sales department, asking them to calculate each path’s average growth rate over time – using the geometric mean – and to report the average of these averages to you. When you plot both the ensemble and the time averages you find quite a large difference between them:

    The time average shows a much larger market decline than the ensemble average.

It can be shown that the ensemble average will always overestimate the growth (Peters, 2010) and thus can lead to false conclusions about the market development.
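The mechanism can be illustrated with a small Python sketch (the growth-factor distribution below is purely illustrative and not the one behind the figures above):

```python
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_years = 1000, 15
start_value = 100.0

# Hypothetical yearly growth factors: +50 % or -40 % with equal probability.
# Arithmetic (ensemble) mean factor = 1.05, i.e. +5 % "growth" per year;
# geometric (time) mean factor = sqrt(1.5 * 0.6) ~ 0.95, i.e. about -5 % per year.
factors = rng.choice([1.5, 0.6], size=(n_paths, n_years))
paths = start_value * np.cumprod(factors, axis=1)

ensemble_avg = paths.mean(axis=0)                                # average across paths, per year
time_avg_growth = (paths[:, -1] / start_value) ** (1 / n_years)  # per-path geometric growth factor

print(f"Ensemble average after {n_years} years: {ensemble_avg[-1]:.0f}")
print(f"Median path after {n_years} years:      {np.median(paths[:, -1]):.0f}")
print(f"Mean per-path geometric growth factor:  {time_avg_growth.mean():.3f}")
```

Although the arithmetic mean of the yearly factors suggests growth, the typical (median) path shrinks; the ensemble average is pulled up by a few very lucky paths that no single company can count on living through.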

    If we look at the distribution of path end values we find that the lower quartile is 64 and the upper quartile is 118 with a median of 89:

It thus turns out that the process behind the market development is non-ergodic ((The term ergodic is used to describe dynamical systems which have the same behavior averaged over time as averaged over space.)) or non-stationary ((Stationarity is a necessary, but not sufficient, condition for ergodicity.)). In the ergodic case both the ensemble and time averages would have been equal and the problem above would not have appeared.

    The investment decision that at first glance looked a simple one is now more complicated and can (should) not be decided based on market development alone.

    Since uncertainty increases the further we look into the future, we should never assume that we have ergodic situations. The implication is that in valuation or M&A analysis we should never use an “ensemble average” in the calculations, but always do a full simulation following each time path!

    References

    Peters, O. (2010). Optimal leverage from non-ergodicity. Quantitative Finance, doi:10.1080/14697688.2010.513338


  • The tool that would improve everybody’s toolkit


Edge, which every year ((http://www.edge.org/questioncenter.html)) invites scientists, philosophers, writers, thinkers and artists to opine on a major question of the moment, asked this year: “What scientific concept would improve everybody’s cognitive toolkit?”

The questions are designed to provoke fascinating, yet inspiring answers, and are typically open-ended, such as: “What will change everything?” (2009), “What are you optimistic about?” (2007), and “How is the internet changing the way you think?” (last year’s question). These questions ((Since 1998)) are often turned into paperback books.

This year many of the 151 contributors pointed to Risk and Uncertainty in their answers. In the following we bring excerpts from some of the answers. We would, however, advise the interested reader to look up the complete answers:

    A Probability Distribution

    The notion of a probability distribution would, I think, be a most useful addition to the intellectual toolkit of most people.

    Most quantities of interest, most projections, most numerical assessments are not point estimates. Rather they are rough distributions — not always normal, sometimes bi-modal, sometimes exponential, sometimes something else.

    Related ideas of mean, median, and variance are also important, of course, but the simple notion of a distribution implicitly suggests these and weans people from the illusion that certainty and precise numerical answers are always attainable.

    JOHN ALLEN PAULOS, Professor of Mathematics, Temple University, Philadelphia.

    Randomness

    The First Law of Randomness: There is such a thing as randomness.
    The Second Law of Randomness: Some events are impossible to predict.
    The Third Law of Randomness: Random events behave predictably in aggregate even if they’re not predictable individually

    CHARLES SEIFE, Professor of Journalism, New York University; formerly journalist, Science magazine; Author, Proofiness: The Dark Arts of Mathematical Deception.

    The Uselessness of Certainty

Every knowledge, even the most solid, carries a margin of uncertainty. (I am very sure about my own name … but what if I just hit my head and got momentarily confused?) Knowledge itself is probabilistic in nature, a notion emphasized by some currents of philosophical pragmatism. Better understanding of the meaning of probability, and especially realizing that we never have, nor need, ‘scientifically proven’ facts, but only a sufficiently high degree of probability, in order to take decisions and act, would improve everybody’s conceptual toolkit.

    CARLO ROVELLI, Physicist, University of Aix-Marseille, France; Author, The First Scientist: Anaximander and the Nature of Science.

    Uncertainty

    Until we can quantify the uncertainty in our statements and our predictions, we have little idea of their power or significance. So too in the public sphere. Public policy performed in the absence of understanding quantitative uncertainties, or even understanding the difficulty of obtaining reliable estimates of uncertainties usually means bad public policy.

    LAWRENCE KRAUSS, Physicist, Foundation Professor & Director, Origins Project, Arizona State University; Author, A Universe from Nothing; Quantum Man: Richard Feynman’s Life in Science.

    Risk Literacy

    Literacy — the ability to read and write — is the precondition for an informed citizenship in a participatory democracy. But knowing how to read and write is no longer enough. The breakneck speed of technological innovation has made risk literacy as indispensable in the 21st century as reading and writing were in the 20th century. Risk literacy is the ability to deal with uncertainties in an informed way.

    GERD GIGERENZER, Psychologist; Director of the Center for Adaptive Behavior and Cognition at the Max Planck Institute for Human Development in Berlin; Author, Gut Feelings.

    Living is fatal

    The ability to reason clearly in the face of uncertainty. If everybody could learn to deal better with the unknown, then it would improve not only their individual cognitive toolkit (to be placed in a slot right next to the ability to operate a remote control, perhaps), but the chances for humanity as a whole.

    SETH LLOYD, Quantum Mechanical Engineer, MIT; Author, Programming the Universe.

    Uncalculated Risk

    We humans are terrible at dealing with probability. We are not merely bad at it, but seem hardwired to be incompetent, in spite of the fact that we encounter innumerable circumstances every day which depend on accurate probabilistic calculations for our wellbeing. This incompetence is reflected in our language, in which the common words used to convey likelihood are “probably” and “usually” — vaguely implying a 50% to 100% chance. Going beyond crude expression requires awkwardly geeky phrasing, such as “with 70% certainty,” likely only to raise the eyebrow of a casual listener bemused by the unexpected precision. This blind spot in our collective consciousness — the inability to deal with probability — may seem insignificant, but it has dire practical consequences. We are afraid of the wrong things, and we are making bad decisions.

    GARRETT LISI, Independent Theoretical Physicist

    And there is more … much more at the Edge site

  • Plans based on average assumptions ……


    This entry is part 3 of 4 in the series The fallacies of scenario analysis

     

    The Flaw of Averages states that: Plans based on the assumption that average conditions will occur are usually wrong. (Savage, 2002)

Many economists use what they believe to be most likely ((Most likely estimates are often made in-house based on experience and knowledge about their operations.)) or average values ((Forecasts for many types of variable can be bought from suppliers of ‘consensus forecasts’.)) (Timmermann, 2006; Gavin & Pande, 2008) as input for the exogenous variables in their spreadsheet calculations.

    We know however that:

1. the probability for any variable to have outcomes equal to any of these values is close to zero, and
2. the probability of having outcomes for all the (exogenous) variables in the spreadsheet model equal to their average is virtually zero.

    So why do they do it? They obviously lack the necessary tools to calculate with uncertainty!

    But if a small deviation from the most likely value is admissible, how often will the use of a single estimate like the most probable value be ‘correct’?

    We can try to answer that by looking at some probability distributions that may represent the ‘mechanism’ generating some of these variables.

Let’s assume that we are entering a market with a new product. We know of course the upper and lower limits of our possible future market share, but not the actual number, so we guess it to be the average value = 0.5. Since we have no prior knowledge we have to assume that the market share is uniformly distributed between zero and one:

If we then plan sales and production for a market share between 0.4 and 0.5, we would out of a hundred trials only have guessed the market share correctly 13 times. In fact we would have overestimated the market share 31 times and underestimated it 56 times.
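A small simulation sketch of this example (the counts will vary from run to run, just as in the hundred trials quoted above):

```python
import numpy as np

rng = np.random.default_rng(7)
shares = rng.uniform(0.0, 1.0, size=100)        # 100 trials of the unknown market share

planned_low, planned_high = 0.4, 0.5            # the band we planned sales and production for
correct = np.sum((shares >= planned_low) & (shares <= planned_high))
overestimated = np.sum(shares < planned_low)    # actual share below the planned band
underestimated = np.sum(shares > planned_high)  # actual share above the planned band

print(correct, overestimated, underestimated)   # long-run expectation: about 10 / 40 / 50
```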

Let’s assume a production process where the acceptable deviation from some fixed measurement is 0.5 mm and where the actual deviation has a normal distribution with expected deviation equal to zero, but with a standard deviation of one:

Using the average deviation to calculate the expected error rate will falsely lead us to believe it to be zero, while it in fact in the long run will be 64 %.

    Let’s assume that we have a contract for drilling a tunnel, and that the cost will depend on the hardness of the rock to be drilled. The contract states that we will have to pay a minimum of $ 0.5M and a maximum of $ 2M, with the most likely cost being $ 1M. The contract and our imperfect knowledge of the geology make us assume the cost distribution to be triangular:

Using the average ((The bin containing the average in the histogram.)) as an estimate for the expected cost will give a correct answer in only 14 out of a hundred trials, with the cost being lower in 45 and higher in 41.
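A hedged sketch of the same exercise (the histogram bin width of 0.1 is our assumption, and the counts will differ somewhat between runs):

```python
import numpy as np

rng = np.random.default_rng(11)
costs = rng.triangular(0.5, 1.0, 2.0, size=100)   # $M: minimum, most likely, maximum

mean_cost = (0.5 + 1.0 + 2.0) / 3                 # ~1.17 $M, the mean of the triangular
bin_width = 0.1                                   # assumed histogram bin width
mean_bin = np.floor(mean_cost / bin_width)        # index of the bin containing the average
in_mean_bin = np.sum(np.floor(costs / bin_width) == mean_bin)
below = np.sum(costs < mean_bin * bin_width)
above = len(costs) - in_mean_bin - below

print(in_mean_bin, below, above)   # only a small share of outcomes land in the 'average' bin
```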

Now, let’s assume that we are performing deep-sea drilling for oil and that we have a single estimate for the cost of $ 500M. However, we expect the cost deviation to be distributed as in the figure below, with a typical small negative cost deviation and on average a small positive deviation:

So, for all practical purposes this is considered a low economic risk operation. What they have failed to do is to look at the tails of the cost deviation distribution, which turns out to be Cauchy distributed with long tails, including the possibility of catastrophic events:

The event far out on the right tail might be considered a Black Swan (Taleb, 2007), but as we now know such events happen from time to time.
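A sketch of what the fat tails do to single-number estimates (the Cauchy scale below is an arbitrary assumption, chosen only to show the behaviour):

```python
import numpy as np

rng = np.random.default_rng(3)
# Cost deviations in $M: most draws are small, but the Cauchy tails are fat enough
# that extreme outcomes occur now and then and the sample mean never settles down.
deviations = 10.0 * rng.standard_cauchy(10_000)   # assumed scale of 10 $M

print("median deviation:      ", round(float(np.median(deviations)), 1))
print("5 % / 95 % percentiles:", np.round(np.percentile(deviations, [5, 95]), 1))
print("worst single outcome:  ", round(float(deviations.max()), 1))
print("running mean 1k vs 10k:", round(float(deviations[:1_000].mean()), 1),
      round(float(deviations.mean()), 1))
```

The median and the central percentiles look harmless, while the worst outcomes are orders of magnitude larger – exactly the information a single ‘expected cost’ hides.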

So, even more important than the fact that using a single estimate will prove you wrong most of the time: it will also obscure what you do not know – the risk of being wrong.

Don’t worry about the average, worry about how large the variations are, how frequently they occur and why they exist. (Fung, 2010)

    Rather than “Give me a number for my report,” what every executive should be saying is “Give me a distribution for my simulation.”(Savage, 2002)

    References

Fung, K. (2010). Numbers Rule Your World. New York: McGraw-Hill.

Gavin, W. T., & Pande, G. (2008). FOMC Consensus Forecasts. Federal Reserve Bank of St. Louis Review, May/June 2008, 90(3, Part 1), 149-163.

Savage, S. L. (2002). The Flaw of Averages. Harvard Business Review, November, 20-21.

Savage, S. L., & Danziger, J. (2009). The Flaw of Averages. New York: Wiley.

Taleb, N. (2007). The Black Swan. New York: Random House.

Timmermann, A. (2006). An Evaluation of the World Economic Outlook Forecasts. IMF Working Paper WP/06/59. www.imf.org/external/pubs/ft/wp/2006/wp0659.pdf


  • Planning under Uncertainty


    This entry is part 3 of 6 in the series Balance simulation

     

    ‘Would you tell me, please, which way I ought to go from here?’ (asked Alice)
    ‘That depends a good deal on where you want to get to,’ said the Cat.
‘I don’t much care where—’ said Alice.
‘Then it doesn’t matter which way you go,’ said the Cat.
– Lewis Carroll, Alice’s Adventures in Wonderland

Let’s say that the board has sketched a future desired state (value of equity) for the company and that you are left to find out whether it is possible to get there and, if so, the road to take. The first part means finding out whether the desired state belongs to the set of feasible future states for your company. If it does, you will need a road map to get there; if it does not, you will have to find out what additional means you will need and whether it is possible to acquire them.

The current state (equity value) of your company is in itself uncertain, since it depends on future sales, costs and profit – variables that usually are highly uncertain. The desired future state is even more so, since you need to find the strategies (roads) that can take you there and, among those, the one best suited to the situation. The ‘best strategies’ will be those that with the highest probability and the lowest cost will give you the desired state, that is, those that have the desired state or a better one as a very probable outcome:

Each of the ‘best strategies’ will have many different combinations of values for the variables that describe the company that can give the desired state(s). Using Monte Carlo simulation, this means that a few, some or many of the thousands of runs – or realizations of future states – will give equity value outcomes that fulfill the required state. What we need then is to find how each of these has come about – the transition – and select the most promising ones.
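In code the selection step could look roughly like this (a toy stand-in for the balance simulation; the growth assumptions and the target value are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
n_runs, n_years = 1000, 5
target_equity = 150.0                                   # the board's desired end state

# Stand-in for the balance simulation: each run is one realization of the future.
yearly_growth = rng.normal(0.05, 0.15, size=(n_runs, n_years))
equity_paths = 100.0 * np.cumprod(1.0 + yearly_growth, axis=1)

reached = equity_paths[:, -1] >= target_equity          # runs ending at or above the target
print(f"P(desired state reached): {reached.mean():.0%}")

# The transitional states behind the successful runs are the ones worth studying
# to understand how - and when - the target can be reached.
successful_paths = equity_paths[reached]
print("average successful path:", np.round(successful_paths.mean(axis=0), 1))
```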

The S@R balance simulation model has the ability to make intermediate stops when the desired state(s) has been reached, giving the opportunity to take out complete reports describing the state(s), how it was reached and by what path of transitional states.

    The flip side of this is that we can use the same model and the same assumptions to take out similar reports on how undesirable states were reached – and their path of transitional states. This set of reports will clearly describe the risks underlying the strategy and how and when they might occur.

The dominant strategy will then be the one that has the desired state or a better one as a very probable outcome and that at the same time has the least probability of highly undesirable outcomes (the stochastically dominant strategy):

Mulling over possible targets or scenario analyses – calculating backwards the value of each variable required to meet the target – is a waste of time, since the environment is stochastic and a number of different paths (time-lines) can lead to the desired state:

    And even if you could do the calculations, what would the probabilities be?

References

Carroll, L. (2010). Alice’s Adventures in Wonderland – Original Version. New York: Cosimo Classics.

  • Working Capital and the Balance Sheet


    This entry is part 2 of 3 in the series Working Capital

     

    The conservation-of-value principle says that it doesn’t matter how you slice the financial pie with financial engineering, share repurchases, or acquisitions; only improving cash flows will create value. (Dobbs, Huyett & Koller, 2010).

The above, taken from “The CEO’s guide to corporate finance”, will be our starting point, and Occam’s razor the tool to simplify the balance sheet using the concepts of working and operating capital.

To get a better grasp of the firm’s real activities we will also separate non-operating assets from operating assets – since it is the latter that define the firm’s operations.

The operating current assets are the sum of the minimum cash level, inventories and accounts receivable. The difference between total and operating current assets is assumed placed in excess marketable securities – and will not be included in the working capital.

Many firms have cash levels well beyond what is really needed as working capital, tying up capital that could have had better uses, generating a higher return than mere short-term placements.

The net working capital, found by deducting non-interest bearing current liabilities from operating current assets, will be the actual amount of working capital needed to safely run the firm’s operations – no more and no less.

By summing net property, plant and equipment and other operating fixed assets we find the total amount of fixed assets involved in the firm’s operations. This, together with net working capital, forms the firm’s operating assets – the assets that will generate the cash flow and return on equity that the owners are expecting.
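The bookkeeping above can be written out in a few lines (all figures are made up, purely to show the arithmetic):

```python
# Hypothetical balance sheet items, $M
total_current_assets = 120.0
minimum_cash, inventories, receivables = 10.0, 45.0, 35.0
non_interest_bearing_liabilities = 60.0             # accounts payable, accrued expenses etc.
net_ppe, other_operating_fixed_assets = 200.0, 20.0

operating_current_assets = minimum_cash + inventories + receivables              # 90
excess_marketable_securities = total_current_assets - operating_current_assets   # 30, non-operating

net_working_capital = operating_current_assets - non_interest_bearing_liabilities  # 30
operating_assets = net_working_capital + net_ppe + other_operating_fixed_assets    # 250

print(net_working_capital, excess_marketable_securities, operating_assets)
```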

The non-operating part – excess marketable securities and non-operating investments – should be kept as small as possible, since it will at best only give an average market return. The rest of the above calculations gives us the firm’s total liabilities and equity, which we will use to set up the firm’s ordinary balance sheet:

However, by introducing operating, non-operating and working capital we can get a clearer picture of the firm’s activities ((Used in yearly reports by Stora Enso, a large international pulp & paper company listed on NASDAQ OMX in Stockholm and Helsinki.)):

The balance sheet’s bottom line has been reduced by the smaller of operating current assets and non-interest bearing debt, and the difference between them – the working capital – will appear as an asset or a liability depending on which of them has the larger value:

The above calculations are an integral part of our balance simulation model, and the report that can be produced from the simulation for planning, strategy and risk assessment can be viewed here: report for the most likely outcome (Pdf, pp. 32). However, this report can be produced for every run in the simulation, giving the opportunity to look at tail events that might arise, distorting expectations.

    Simplicity is the ultimate sophistication. — Leonardo da Vinci

    References

Dobbs, R., Huyett, B., & Koller, T. (2010). The CEO’s guide to corporate finance. McKinsey Quarterly, 4. Retrieved from http://www.mckinseyquarterly.com/home.aspx


  • Uncertainty modeling


    This entry is part 2 of 3 in the series What We Do

    Prediction is very difficult, especially about the future.
    Niels Bohr. Danish physicist (1885 – 1962)

Strategy @ Risk’s models provide the possibility to study risk and uncertainties related to operational activities: costs, prices, suppliers, markets, sales channels etc.; financial issues like interest rate risk, exchange rate risk, translation risk, taxes etc.; strategic issues like investments in new or existing activities, valuation and M&As etc.; and a wide range of budgeting purposes.

All economic activities have an inherent volatility that is an integral part of their operations. This means that whatever you do, some uncertainty will always remain.

The aim is to estimate the economic impact that such critical uncertainty may have on corporate earnings at risk. This adds a third dimension – probability – to all forecasts and gives new insight: the ability to deal with uncertainties in an informed way, and thus benefits beyond ordinary spreadsheet exercises.

The results from these analyses can be presented in the form of B/S and P&L looking at the coming one to five years (short term) or five to fifteen years (long term), showing the impact on e.g. equity value, company value, operating income etc., with the purpose of:

• Improving predictability in operating earnings and their expected volatility
• Improving budgeting processes, predicting budget deviations and their probabilities
• Evaluating alternative strategic investment options at risk
• Identifying and benchmarking investment portfolios and their uncertainty
• Identifying and benchmarking individual business units’ risk profiles
• Evaluating equity values and enterprise values and their uncertainty in M&A processes, etc.

    Methods

    To be able to add uncertainty to financial models, we also have to add more complexity. This complexity is inevitable, but in our case, it is desirable and it will be well managed inside our models.

    People say they want models that are simple, but what they really want is models with the necessary features – that are easy to use. If something is complex but well designed, it will be easy to use – and this holds for our models.

Most companies have some sort of model describing the company’s operations. They are mostly used for budgeting, but in some cases also for forecasting cash flow and other important performance measures. Almost all are deterministic models based on expected or average values of input data: sales, costs, interest and currency rates etc.

We know, however, that forecasts based on average values are on average wrong. In addition, deterministic models miss the important uncertainty dimension that reveals both the different risks facing the company and the opportunities they bring forth.

S@R has set out to create models that can give answers to both deterministic and stochastic questions, by linking dedicated EBITDA models to a holistic balance simulation taking into account all the important factors describing the company. The basis is a real balance simulation model – not a simple cash flow forecasting model.

Both the deterministic and the stochastic balance simulation can be set up in two different ways:

1. by using an EBITDA model to describe the company’s operations, or
2. by using coefficients of fabrication (e.g. kg of flour per 1000 loaves of bread, etc.) as direct input to the balance model – the ‘short cut’ method.

The first approach implies setting up a dedicated EBITDA subroutine in the balance model. This will give detailed answers to a broad range of questions about markets, capacity-driven investments, operational performance and uncertainty, but entails a higher degree of effort from both the company and S@R. It is a tool for long-term planning and strategy development.

The second approach (the ‘short cut’) uses coefficients of fabrication and their variations, and is a low-effort (low-cost) alternative, usually using the internal accounts as its basis. This will in many cases give a ‘good enough’ description of the company – its risks and opportunities. It can be based on existing investment and market plans. The data needed to describe the company’s economic environment (taxes, interest rates etc.) will be the same in both alternatives:

The ‘short cut’ approach is especially suited for quick appraisals of M&A cases where time and data are limited and where one wishes to limit the effort in an initial stage. Later, the data and assumptions can be augmented for much more sophisticated analyses within the same ‘short cut’ framework. In this way the analysis can be built successively in the direction the previous studies suggest.

This also makes it a good tool for short-term (3-5 year) analyses and even for budget assessment. Since it uses a limited number of variables describing the operations – usually fewer than twenty – it is easy to maintain and operate. The variables describing financial strategy and the economic environment come in addition, but are easy to obtain.

Used in budgeting, it will give the opportunity to evaluate budget targets, their probable deviation from the expected result and the probable upside or downside given the budget target (the upside/downside ratio).
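One plausible reading of such an upside/downside ratio is sketched below (the definition – expected surplus over expected shortfall around the budget target – and all the numbers are our assumptions, not taken from the S@R model):

```python
import numpy as np

rng = np.random.default_rng(5)
budget_target = 100.0                               # hypothetical budgeted operating result
simulated_result = rng.normal(95.0, 20.0, 1000)     # stand-in for the simulated distribution

above = simulated_result[simulated_result > budget_target]
below = simulated_result[simulated_result <= budget_target]

print(f"P(target met or beaten): {np.mean(simulated_result >= budget_target):.0%}")
print(f"expected upside:         {np.mean(above - budget_target):.1f}")
print(f"expected downside:       {np.mean(budget_target - below):.1f}")
print(f"upside/downside ratio:   {np.mean(above - budget_target) / np.mean(budget_target - below):.2f}")
```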

Done this way, the analysis can be run for subsidiaries across countries, translating the P&L and balance sheet into any currency for benchmarking, investment appraisals, risk and opportunity assessments etc. The final uncertainty distributions can then be aggregated to show the global risk for the mother company.

An interesting feature is the model’s ability to start simulations with an empty opening balance. This can be used to assess divisions that do not have an independent balance sheet, since the model will call for equity/debt etc. based on a target ratio, according to the simulated production and sales and the necessary investments. Questions about further investments in divisions or product lines can be studied this way.

Since every run (500 to 1000) in the simulation produces a complete P&L and balance sheet, the uncertainty curve (distribution) for any financial metric – yearly result, free cash flow, economic profit, equity value, IRR, translation gain/loss etc. – can be produced.

In some cases we have used both approaches for the same client, using the ‘short cut’ for smaller daughter companies with production structures differing from the main company’s. The ‘short cut’ can also be considered an introduction and stepping stone to a more holistic EBITDA model.

    Time and effort

The workload for the client is usually limited to a small team of people (1 to 3 persons) acting as project leaders and principal contacts, ensuring that all necessary information describing value and risks for the client’s operations can be collected as a basis for modeling and calculations. However, the type of data will have to be agreed upon, depending on the scope of the analysis.

Very often key people from the controller group will be adequate for this work, and if they don’t have the direct knowledge they usually know who to ask. The work for this team, depending on the scope and choice of method (see above), can vary in effective time from a few days to a couple of weeks, but the elapsed time can stretch from three to four weeks up to the same number of months.

For S@R the time frame will depend on the availability of key personnel from the client and the availability of data. The second alternative can take from one to three weeks of normal work, while the first alternative can take three to six months for more complex models. The total time will also depend on the number of analyses that need to be run and the type of reports that have to be delivered.

    S@R_ValueSim

    Selecting strategy

Models like this are excellent for the selection and assessment of strategies. Since we can find the probability distribution for equity value, the changes in it brought about by different strategies form a basis for selecting or adjusting the current strategy. Models including real option strategies are a natural extension of these simulation models:

If there is a strategy whose curve lies to the right of and below all the other feasible strategies’ curves, it will be the stochastically dominant one. If the curves cross, further calculations need to be done before a stochastically dominant or preferable strategy can be found:
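A minimal check for first-order stochastic dominance between two simulated equity-value samples could look like this (the helper dominates() and the two distributions are illustrative only, not part of any S@R tooling):

```python
import numpy as np

def dominates(a, b, grid_points=200):
    """True if sample `a` first-order stochastically dominates `b`: its empirical
    CDF lies at or below `b`'s everywhere, i.e. its curve lies to the right."""
    grid = np.linspace(min(a.min(), b.min()), max(a.max(), b.max()), grid_points)
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return bool(np.all(cdf_a <= cdf_b) and np.any(cdf_a < cdf_b))

rng = np.random.default_rng(2)
strategy_1 = rng.normal(120.0, 30.0, 1000)               # simulated equity values, strategy 1
strategy_2 = strategy_1 - rng.uniform(5.0, 25.0, 1000)   # strategy 2: lower value in every run

print(dominates(strategy_1, strategy_2))   # True  - curve 1 lies to the right of curve 2
print(dominates(strategy_2, strategy_1))   # False - the reverse clearly does not hold
```

When neither call returns True the curves cross, and the choice requires further calculations or an explicit risk preference, as noted above.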

    Types of problems we aim to address:

The effects of uncertainties on the P&L and balance sheet, and the effects of the Board’s strategies (market, hedging etc.) on future P&Ls and balance sheets, evaluating:

    • Market position and potential for growth
    • Effects of tax and capital cost
    • Strategies
• Business units, country units or product lines – capital allocation – comparing risk, opportunity and expected profitability
• Valuations, capital cost and debt requirements, individually and their effect on the company
• The future cash-flow volatility of the company and the individual BUs
    • Investments, M&A actions, their individual value, necessary commitments and impact on company
    • Etc.

The aim, regardless of approach, is to quantify not only the company’s single and aggregated risks, but also its potential, thus making the company capable of detailed planning and of executing earlier and more apt actions against uncertain factors.

Used in budgeting, this will improve budget stability through better insight into cost-side risks and income-side potentials. This is achieved by an active budget-forecast process; the control-adjustment cycle will teach the company to target realistic budgets – with better stability and increased company value as a result.

This is most clearly seen when effort is put into correctly evaluating the effects of strategies, projects and investments on the enterprise. The best way to do this is to compare strategies by analyzing each one’s risks and potential – and to select the alternative that is (stochastically) dominant given the company’s chosen risk profile.

    A severe depression like that of 1920-1921 is outside the range of probability. –The Harvard Economic Society, 16 November 1929