What is the purpose of modelling, in any field? Clearly, it is divination, whether foretelling the future or controlling it. So my task here is to foretell the future of a field that itself tries to foretell the future. To do that, I must first locate the present: what works now, and why. My view is a parochial one; I wasn't trained as an economist, but as a natural scientist who, for the past 10 years or so, has made a living and had some fun building the models and systems used by people who trade complex, mostly derivative, securities for their living. It is interesting, though limited, work, but it is what I know about, from the bottom up.
So let me start by giving you one view of the derivatives trading environment today: vast struggles with dispersed data and information and record-keeping, all overlaid with ambitious, sometimes astonishingly successful, attempts to describe the underlying phenomena with the classical tools of the natural sciences. People worry about model risk, but I think the largest risks are procedural, administrative and operational.
Given this picture, you can understand why, at Goldman Sachs, despite the models we build, the papers we write and the clients we visit, only four or five of our group of 30 people in quantitative strategies in equity derivatives are directly involved in modelling: that is, in isolating financial variables, studying their dynamical relationships, formulating them as differential equations or statistical affinities, solving them and, finally, writing the programs that implement the solution.
How are the models used? In brief, to value listed or over-the-counter options for market making and proprietary trading; to calculate and hedge the exposure of portfolios across many different countries and currencies; to convert listed prices to the common currency of implied volatilities; to engineer structured derivatives; to run systems that look for mismatches between fair value and the market; to value and hedge corporate finance instruments for arbitrage purposes; and, finally, to estimate firm-wide value-at-risk. Less frequently, we also use models directly to examine non-derivative securities.
Models are important: they lie beneath most of our applications, yet they consume few pure modelling resources. Why are there so few modellers compared with programmers and system builders? And, interestingly, why are there fewer in equities than in fixed income?
Derivatives and non-linearity
According to Professor Stephen Ross in the Palgrave Dictionary of Economics, “… options pricing theory is the most successful theory not only in finance, but in all of economics”. This seems unquestionable, but why has it worked so well?
I think it is because the fundamental problem of options theory is the valuation of hybrid, non-linear securities, and options theory is an ingenious but glorified method of interpolation. I don’t mean that as an insult. Traders use options theory intuitively to understand complex, non-linear patterns of variation in price in terms of simpler, linear changes in volatility and probability. They do this by regarding a hybrid as a probability-weighted mixture of simpler securities, with the probability depending on the volatility. They think linearly in terms of perceived changes in volatility and probability, and use the model to transform their perceptions into non-linear changes in price.
In the real world of traded securities, few of the assumptions of Black, Scholes and Merton are strictly respected. But their view of the hybrid nature of a stock option as a probability-weighted mixture of stock and bond captures a core of truth that provides the foundation for the model’s robustness.
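To make the mixture picture concrete, here is the textbook one-period binomial sketch, a standard illustration with the usual idealised assumptions rather than anything specific to our desk. If the stock can move from $S$ to $uS$ or $dS$ over a period $T$, and the option then pays $C_u$ or $C_d$, its value is a probability-weighted mixture discounted at the riskless rate $r$:

$$
C = e^{-rT}\left[\,p\,C_u + (1-p)\,C_d\,\right], \qquad p = \frac{e^{rT}-d}{u-d},
$$

where the probability $p$ depends on the size of the up and down moves, and hence, ultimately, on the volatility. Equivalently, the option is a hybrid of $\Delta$ shares of stock and a bond position $B$:

$$
C = \Delta S + B, \qquad \Delta = \frac{C_u - C_d}{(u-d)\,S}, \qquad B = e^{-rT}\,\frac{u\,C_d - d\,C_u}{u-d}.
$$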
The same strategy – to think of something complex as a non-linear, probability-weighted mix of simpler things – underpins yield curve models, which let you regard swaptions as bond hybrids subject to interpolation. Similarly, implied tree models regard exotic options as interpolated mixtures of vanilla options of various strikes and expiries.
Options theory works because it aims at relative, rather than absolute, value. A necessary prerequisite is the notion, sometimes scorned by academics, of value calibration: the effort to ensure that the derivative value matches the value of each underlyer under conditions where mixtures become pure and certain. Without that, the relativity of value has no foundation.
Underlyers and linearity
Stock options can be likened to molecules made of indivisible atoms, where we understand the basic processes of chemistry and synthesis. The stocks themselves, in contrast, are the atoms: the supposedly irreducible primitives of which the derivatives are composed.
But this analogy is limited. In physics, we have a deep understanding of the fundamental laws of atomic physics that support chemistry, but in finance we understand the laws of options – the molecular chemistry – much better than we do the laws of stocks. This isn’t unprecedented; advances in nineteenth-century chemistry did precede advances in twentieth-century physics. At present, our stock model lacks deep structure or firm laws. So most traditional equity modelling resources focus on data.
Not so with bonds. Although they are the underlyers of the fixed-income world, with interest rates extracted from bond prices, people think of interest rates as the underlyer and bonds as the non-linear derivatives. So, in this case, even the simplest instruments are non-linear and need interpolation and mathematics. That is why there are so many more quantitative modellers and computer scientists in fixed-income areas than in equities.
Limits of traditional modelling
Where can traditional modelling work? “Theory”, in the natural sciences, has come to mean identifying the primitives and describing the rest of the world in terms of the postulated dynamical relations between them.
But theories of the natural world involve man playing against God, using ostensibly universal variables, such as position and momentum, and universal laws such as Newton’s, that we pretend to believe are independent of human existence, holding true forever. (I do not believe that this independence is as obvious as it seems, and furthermore, recent cosmological theories contemplate our universe consisting of many subuniverses, each pinched off from the others, each with different laws.)
In the financial world, in contrast, it is man playing against man. But mankind’s financial variables are clearly not universal: they are quantities – such as expected return and expected risk – that do not exist without humans; it is humans doing the expecting. Also, these variables are frequently hidden or unobservable – they are parts of the theory that are observed only as values implied by some other traded quantity. But human expectations and strategies are transient, unlike those of the God of the physicists. So financial modelling is never going to provide the eight-decimal place forecasting of some areas of physics.
Advances in engineering have often followed advances in scientific understanding. The industrial revolution exploited mechanics and thermodynamics. The computer revolution needed Boolean algebra and solid-state physics. The biotech revolution of genetic engineering and immunology, which is just starting up, requires the structure of DNA and the genetic code.
Ultimately, I do not think that physics and genetics are reliable role models for finance and economics. Physics has immutable laws and great predictive power, expressed through mathematics. You would expect its textbooks to look pure and rigorous. Finance has few dynamical laws and little predictive power, and you would expect its textbooks to look discursive.
So why is it that finance books look like pure mathematics, filled with axioms, whereas physics books look like applied mathematics? The degree of axiomatisation seems inversely proportional to applicability. This unnatural disequilibrium reminds me of an inverted yield curve, or of the put skew in equity markets: how long can it last without the crash it implies?
Black, Scholes and Merton were the Newtons of derivatives. They created and then almost completed the field, the only part of finance ripe for an industrial revolution based on principles. We are now living in the post-Newtonian world and it will take a long time for Einstein to appear. We will continue to see the extension of derivatives models and the relative-value approach. What more can we expect?
Extensions of ideas that work
Options theory uses the following principles: (1) the law of one price; (2) a dynamic strategy for options replication; (3) lognormal underlyer evolution; and (4) calibration of the model to known market values. What extensions of these principles can we expect?
Rationality rather than voodoo. Options theory is rational and causal, based on logic. It is mathematical but the mathematics is secondary. Mathematics is the language used to express dynamics. There are still many traders, even options traders, who have a taste for mathematics without reason – for voodoo number-juggling and patterns and curve fitting and forecasting. I think we will continue to see successful models based on ideas about the real world, expressed in mathematics, as opposed to mathematical-looking formulas alone.
Better adjustments of the theory to the real world. The real world violates most of the option pricing principles. Liquidity issues and transaction costs undermine the law of one price. Evolution isn’t lognormal. Volatility is stochastic. Replication is neither continuous nor costless. Consequently, simulation shows that the profit and loss of a “risklessly hedged” option has an astonishingly large variance when you rehedge intermittently and when you allow for the small, but inevitable, mismatch between realised and hedging volatility. How, you may wonder, do options desks make any money?
I think the truth is that many desks do not fully understand the source of their profit and loss. I expect to see more realistic analyses of the profit and loss of options books under practical circumstances. Leland’s 1985 paper on transactions costs was a good start. More recently, a Risk magazine article by Ajay Gupta (July 1997, page 37) started to probe the effects of mismatches between implied and realised volatility, similar in spirit to some analyses we have been doing at Goldman.
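To illustrate the size of that variance, here is a rough Monte Carlo sketch, in Python, of the profit and loss of selling and delta-hedging a single call when rehedging is only periodic and the hedging volatility differs slightly from the realised one. All the parameters are illustrative numbers of my own choosing, not figures from any actual book.

```python
# Rough sketch: P&L of a "risklessly hedged" call, rehedged only at discrete
# dates, hedged at an implied volatility that may differ from the realised one.
# All inputs below are illustrative assumptions.
import math
import random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, t, r, vol):
    if t <= 0:
        return max(s - k, 0.0)
    d1 = (math.log(s / k) + (r + 0.5 * vol * vol) * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

def bs_delta(s, k, t, r, vol):
    if t <= 0:
        return 1.0 if s > k else 0.0
    d1 = (math.log(s / k) + (r + 0.5 * vol * vol) * t) / (vol * math.sqrt(t))
    return norm_cdf(d1)

def hedged_pnl(s0=100.0, k=100.0, t=0.25, r=0.05,
               realised_vol=0.22, hedge_vol=0.20, steps=21):
    """P&L of selling one call at hedge_vol and delta-hedging at `steps` dates."""
    dt = t / steps
    s = s0
    cash = bs_call(s0, k, t, r, hedge_vol)         # premium received
    delta = bs_delta(s0, k, t, r, hedge_vol)
    cash -= delta * s                               # buy the initial hedge
    for i in range(1, steps + 1):
        cash *= math.exp(r * dt)                    # cash accrues interest
        z = random.gauss(0.0, 1.0)
        s *= math.exp((r - 0.5 * realised_vol ** 2) * dt
                      + realised_vol * math.sqrt(dt) * z)
        tau = t - i * dt
        new_delta = bs_delta(s, k, tau, r, hedge_vol) if tau > 1e-12 else (1.0 if s > k else 0.0)
        cash -= (new_delta - delta) * s             # rebalance the hedge
        delta = new_delta
    cash += delta * s - max(s - k, 0.0)             # unwind hedge, pay the option off
    return cash

pnls = [hedged_pnl() for _ in range(5000)]
mean = sum(pnls) / len(pnls)
var = sum((x - mean) ** 2 for x in pnls) / len(pnls)
print(f"mean P&L {mean:.3f}, std dev {math.sqrt(var):.3f}")
```

With these illustrative numbers, the standard deviation of the P&L comes out at a sizeable fraction of the option premium, which is the point of the exercise.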
Forwards as a basis. Many of the advances in modelling in the past 20 years have been connected with the efficacy of using forward, rather than spot, values as the appropriate mathematical basis of a model. This is the essence of the Heath, Jarrow & Morton (1992) approach to yield curve modelling, and similar ideas can also be applied to volatility. Recent work on market models of interest rates by Brace, Gatarek & Musiela (1997), Jamshidian (1996) and others is also closely connected to this concept.
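For concreteness, the one-factor HJM condition, a standard result quoted here rather than derived, pins down the risk-neutral drift of every forward rate $f(t,T)$ in terms of the forward rate volatilities alone:

$$
df(t,T) = \sigma(t,T)\left(\int_t^T \sigma(t,u)\,du\right)dt + \sigma(t,T)\,dW(t),
$$

so that specifying the volatility structure $\sigma(t,T)$ of the forwards fixes the whole arbitrage-free evolution of the curve.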
Calibration. A good trading model must both match the values of known liquid securities and realistically represent the range of future market variables. Very few models manage this. Academics tend to favour those with a realistic evolution, but practitioners who hedge cannot live without well-calibrated models; it is no good valuing an option on a bond with a model that misprices the underlying bond itself. If I were forced to choose, I would prefer to get the deterministic part right first – that is, to match the values of known securities – and hope for robustness if I get the stochastics a little wrong. Obviously, that’s not perfect. I hope to see progress in building models that are both market calibrated and evolutionarily realistic.
The wisdom of implied variables. There is little certain knowledge about future values in finance. Implied values are the rational expectations that make a model fit the market, and provide the best (and sometimes the only) insight into what people expect. During the recent stock market correction, the pre-crash implied volatilities of options with different strikes gave a good indication of the level and variation of post-crash, at-the-money implied volatilities. I expect to see modelling based on implied variables – implied forward rates, volatilities, correlations and credit spreads – continue to grow in applicability and sophistication.
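The basic computation behind all of this is simple inversion: find the parameter value at which the model reproduces the market. Here is a minimal sketch in Python, with illustrative inputs, of backing an implied volatility out of a quoted Black-Scholes call price by bisection.

```python
# Minimal sketch: back out an implied volatility from a quoted call price
# by bisection on the Black-Scholes formula. Inputs are illustrative.
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, t, r, vol):
    d1 = (math.log(s / k) + (r + 0.5 * vol * vol) * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

def implied_vol(price, s, k, t, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Bisection: find the volatility at which the model price matches the market."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(s, k, t, r, mid) > price:
            hi = mid            # model too expensive: true vol is lower
        else:
            lo = mid            # model too cheap: true vol is higher
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Example: a three-month at-the-money call quoted at 4.50
print(implied_vol(price=4.50, s=100.0, k=100.0, t=0.25, r=0.05))
```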
Traded variables as stochastic factors. A few years ago, there was a tendency to build stochastic models based on whatever principal components emerged from the data, no matter how arcane their groupings. The current fashion, factors that represent traded instruments, seems sensible. Market models of interest rates are an attractive step in this direction. They model directly the evolution of traded, discrete securities, and intuitively justify simple pricing formulas. I like models whose stochastic factors can be grasped viscerally. Finance is still too immature to rely on esoteric dynamical variables.
Changes of numeraire. This method, pioneered by Margrabe (1978), seems to keep re-emerging as a tactic for simplifying complex problems by reducing them to simpler, previously solved problems when viewed in a different currency.
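The classic example is worth restating. To value the option to exchange asset 2 for asset 1 at time $T$, take asset 2 as the numeraire; the problem collapses to a vanilla call on the ratio $S_1/S_2$ struck at one, and, for non-dividend-paying assets, Margrabe’s formula follows:

$$
V = S_1 N(d_1) - S_2 N(d_2), \qquad
d_{1,2} = \frac{\ln(S_1/S_2) \pm \tfrac12 \sigma^2 T}{\sigma\sqrt{T}}, \qquad
\sigma^2 = \sigma_1^2 - 2\rho\,\sigma_1\sigma_2 + \sigma_2^2.
$$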
Techniques of limited value
Optimisation. Optimisation sounds vital to people who do not work in the industry, but I don’t find it that useful in practical finance. I am a little embarrassed to admit that we rarely use optimisation programs in our equity derivatives options group at Goldman. In engineering, where the laws are exactly understood, or in travelling-salesman-style problems – where one optimises over many possible paths, each of whose length is exactly known – optimisation is sensible. One is simply trying to isolate the scenario that produces the best well-specified outcome.
In financial theory, in contrast, each scenario is inexact – there is a crude interest rate model, a crude prepayment model and other misspecifications. While averaging may cancel much of the misspecification, optimisation tends to accentuate it. So I am largely a sceptic about optimisation in finance, although that is not to say that it never makes sense, just that it should be used with caution.
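A toy calculation, with made-up numbers, shows what I mean. Give ten scenarios the same true value but independent valuation noise: averaging across them washes the noise out, while picking the “best” one systematically inflates it.

```python
# Toy illustration: optimisation accentuates model error, averaging cancels it.
# Ten scenarios all have the same true value of 100; each model estimate
# carries independent noise. Numbers are invented for illustration only.
import numpy as np

rng = np.random.default_rng(42)
true_value = 100.0
noise = rng.normal(scale=5.0, size=(10_000, 10))   # 10 noisy scenario estimates
estimates = true_value + noise

print("mean of averages:", estimates.mean(axis=1).mean())   # close to 100: noise cancels
print("mean of maxima  :", estimates.max(axis=1).mean())    # well above 100: noise accentuated
```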
The capital asset pricing model. This provided the original framework for the Black-Scholes equation, and its ideas about risk and return loosely permeate all thoughts about trading. In practice, we do not use it much.
Large dimension problems. Financial theory seems more solidly successful when applied to problems with a small number of dimensions.
New directions
Underlyer modelling. We need more sophisticated models of underlyers, but we lack any good general laws beyond lognormality. In the real world, there are fat tails, jumps, exchange rate bands and other so-called anomalies. Classical physics starts with the certainty of single-particle dynamics and proceeds to the statistics of ensembles. In finance, even a single stock suffers from uncertainty. The broadest theoretical advance would be some new theory of underlyers; perhaps there is some way to “derivitify” underlyers by regarding them as dependent on more primitive quantities. But I know of nothing, from behavioural finance to chaos theory, that is ready for real applicability.
Computing and electronic markets. Computing will continue to be the driving force behind financial markets. Fast computation will allow electronic market making and automated exchanges for options as well as stocks. Expect faster trading, fewer intermediaries and more direct access to capital. Trading systems will have to accommodate these changes. Fast access to relevant information is even more important in electronic markets. Limited artificial intelligence models will find their use in areas where information is vast and logic is limited. Rule-based systems might work well here. It is easier to see the advantages of computing power than of models. Furthermore, computers will have increasing value in displaying and examining multi-dimensional risk profiles.
Market microstructure. Most financial models assume an economic equilibrium. The approach to equilibrium in models of market microstructure is becoming a fertile area. I recently heard an interesting talk at the Society for Quantitative Analysts in New York by Charles Plott of the California Institute of Technology on trading experiments that observe the approach to price equilibrium. This type of work will ultimately help organise market making systems and tie them ever more closely to hardware and software.
Statistical arbitrage. I am unsure what to predict here. I am always struck by the difference between statistics in physics and in finance. First, in theory. In physics, the microscopic laws of mechanics and the macroscopic laws of thermodynamics were ultimately joined in statistical thermodynamics. In finance, both the macroscopic intuition and the microscopic laws are sometimes missing, yet modellers still like to apply statistics and optimisation.
Second, experiment. In the natural sciences, theory is compared with experiment via statistics. Kelvin is supposed to have said that if, as an experimenter, you actually need statistics, then you did the wrong experiment. In finance, researchers sometimes do statistical analysis first and then look for a theory. I am a great believer in thoughtfulness and causality. I would hope to see researchers think more about the causal dynamics between particular underlyers, propose models of cointegration, and then test them using the data.
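As a sketch of that model-first workflow, here is a simplified Engle-Granger-style check on simulated data, with no proper critical values, purely for illustration: posit that two related underlyers share a common trend, fit the spread between them, then ask whether that spread actually mean-reverts.

```python
# Simplified cointegration check (illustrative only): fit a hedge ratio by
# least squares, then see whether the residual spread mean-reverts.
# A real test would use proper Dickey-Fuller critical values.
import numpy as np

def cointegration_check(x, y):
    # Step 1: fit the hedge ratio y ~ a + b*x by least squares
    b, a = np.polyfit(x, y, 1)
    spread = y - (a + b * x)
    # Step 2: regress the change in the spread on its lagged level;
    # a clearly negative coefficient suggests mean reversion
    d_spread = np.diff(spread)
    lagged = spread[:-1]
    phi = np.dot(lagged, d_spread) / np.dot(lagged, lagged)
    return b, phi

# Simulated example: y is a multiple of x plus stationary noise, so cointegrated
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=1000))            # a random-walk "underlyer"
y = 1.5 * x + rng.normal(scale=0.5, size=1000)
hedge_ratio, phi = cointegration_check(x, y)
print(f"hedge ratio {hedge_ratio:.2f}, mean-reversion coefficient {phi:.3f}")
```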
Value-at-risk. The VAR problem is mostly operational: how do you get all of a firm’s positions and pricing parameters in one place at one time? With that in place, you can run a Monte Carlo simulation to calculate expected losses. This is useful, but is no substitute for the much more detailed scenario analysis and common sense and experience necessary to run a derivatives book. There is no short cut to understanding complexity. For a view of the practicalities of portfolio risk management in a trading environment, see Litterman (1996). From the theoretical point of view, Cornell University’s David Heath et al have written some interesting notes on the axiomatic requirements for consistent measures of value-at-risk (Risk November 1997, pages 68-71).
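The Monte Carlo step itself is straightforward once the operational problem is solved. Here is a bare-bones sketch with invented exposures and volatilities, nothing like a real firm-wide book: simulate joint factor moves from an assumed covariance matrix and read off a loss quantile.

```python
# Bare-bones Monte Carlo VAR sketch. Exposures, volatilities and correlations
# below are invented for illustration, not taken from any actual portfolio.
import numpy as np

rng = np.random.default_rng(1)

exposures = np.array([2.0e6, -1.5e6, 0.8e6])     # $ P&L per unit factor move
vols = np.array([0.012, 0.009, 0.015])            # daily factor volatilities
corr = np.array([[1.0, 0.3, 0.1],
                 [0.3, 1.0, 0.2],
                 [0.1, 0.2, 1.0]])
cov = np.outer(vols, vols) * corr                 # factor covariance matrix

# Simulate 100,000 joint daily factor moves and the resulting portfolio P&L
moves = rng.multivariate_normal(np.zeros(3), cov, size=100_000)
pnl = moves @ exposures

# 99% one-day value-at-risk is the loss at the 1st percentile of the P&L
var_99 = -np.percentile(pnl, 1)
print(f"99% one-day VAR: ${var_99:,.0f}")
```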
Some recent sociological changes in the modelling world. Being a geek is now officially cool. You don’t have to apologise about talking mathematics in the elevator any more.
Financial theory seems to be moving out of business schools in both directions, leftwards to the sciences and rightwards to real businesses. On the one hand, sophisticated financial research now thrives on Wall Street, perhaps even more than in universities. There has been a mass exodus of skilled financial theorists into the banking arena. Even textbooks refer to theories created by practitioners. On the other hand, financial theory is also becoming part of an applied mathematics curriculum. Mathematics departments give financial engineering degrees, and mathematicians write books on options valuation. Applied mathematicians get PhDs in options pricing with transactions costs.
Options valuation models are becoming commoditised and cheaply available. Companies that write risk management systems are going public. Risk consulting is lucrative and commonplace. Big firms still prefer to do it themselves, but smaller ones can buy or contract most of what they need.
Conclusion
From the viewpoint of someone who works with traders, I like to think of models the way quantum physicists used Gedanken experiments, as a sort of imaginary stress-testing of the physical world done in your head, or on paper, in order to force your picture of the world into a contradiction. Einstein, in thinking about special relativity, imagined what he would see sitting on the edge of a moving light beam, while Schrödinger’s contemplation of quantum mechanics famously led him to imagine a cat in a sealed box with a radioactive atom that would trigger a Geiger counter that would release poison.
I think that is the right way to use mathematical models in finance. In most cases, the world doesn’t really behave in exactly the way you have constructed it. You are trying to make a limited approximation of reality, with perceptual variables you can think about, so that you can say to yourself “What happens if volatility goes up, or if the slope of the yield curve changes?” Then you can come up with a value based on what you can understand and describe.
You have to have a sense of wonder, almost a suspension of disbelief, when you observe desks using quantitative models to value and hedge complex securities. Think of a collateralised mortgage obligation: you use an (at best) quasi-realistic model for interest rate evolution and a crude model for prepayments, and combine both to simulate thousands of future scenarios to value the curvature of the mortgage. Then you pay for it, based on that value. It’s not quite preposterous, but it is amazing. The strongest reasons for trusting it are that it is rational and thoughtful, and there is nothing better. It will probably continue to be that way. But I think that’s good news.
Emanuel Derman is a managing director in the quantitative strategies group at Goldman Sachs in New York. This article is based on a speech presented at the Risk Tenth Anniversary Global Summit in London on November 19 and is © Goldman Sachs, 1997. It reflects the personal views of the author and does not necessarily reflect the views of Goldman Sachs.