More than mere semantics

 

“Die Ordnung der Gesellschaft ist das Ergebnis menschlichen Handelns, nicht aber menschlichen Entwurfs.” (“The order of society is the result of human action, but not of human design.”)

 F.A. Hayek

Not knowing the future is a dreadful bother. It creates all kinds of difficulties for financial economists. It does, however, make watching cricket matches more interesting.

Lack of certainty about the future is, in fact, one of the primary reasons why we have distributed financial markets. If everyone knew, with complete certainty, what was going to happen, there would be no disagreements about the valuation of assets.

A distinctive feature of the Austrian School is its early emphasis on uncertainty and the limits of knowledge. In contrast to later neoclassical traditions, with their focus on rational agents, optimization, and aggregate equilibrium, the Austrians concentrated on how individuals with subjective and limited knowledge engage in exchange — and on how prices in decentralized markets communicate dispersed information.

Ludwig von Mises eloquently wrote that “The calculus of probability is inapplicable to problems of human action.” Economics, von Mises and his fellow Austrians understood, “is not a science of exact laws like physics; it is a science of human action.”

Ludwig Wittgenstein understood this too, and immortalized it so concisely in the Tractatus Logico-Philosophicus: “Wovon man nicht sprechen kann, darüber muß man schweigen.” (“Whereof one cannot speak, thereof one must be silent.”) William Shakespeare understood this basic fact as well, which is why, after Horatio sees a ghost, Hamlet calmly counsels him: “There are more things in heaven and earth, Horatio, / Than are dreamt of in your philosophy.”

While the nature of Austrian Economics allows a somewhat laissez-faire attitude toward risk, the branches of Economics and Finance built on the optimizing-rational-actor paradigm – the descendants of Antoine Cournot, the first economist to use calculus to derive a profit-maximizing condition, and of W.S. Jevons, who helped kickstart the Marginal Revolution – cannot afford to be quite so casual about uncertainty, especially in Financial Economics.

For the sake of both clarity and ease, herein I am going to refer to the practice of utilizing calculus and all other manner of quantitative methods – especially statistics and probability – as the methodology of die Optimierungsweltanschauung (“the optimization worldview”), and its adherents as die Optimierungsideologen (“the optimization ideologues”).

In a normally functioning marketplace, the exact value of a share of stock, or a bond, or a derivative contract is determined by the price at which it is bought or sold. However, given the stakes – that is, the amounts of money involved – and the fact that the “correct value” of an asset can only be known in hindsight, it is natural for intelligent people to seek new, better ways to forecast what the price of an asset will be at some point in the future. How serious a business is forecasting? Consider the following…

According to the Bank for International Settlements (BIS), the notional amount outstanding of over-the-counter (OTC) derivatives was approximately $846 trillion at end-June 2025 – a figure that, while overstating net exposure, illustrates the immense scale of that market alone.

Before we go on, please take a moment and contemplate that figure… $846 TRILLION. And that is only the market for derivatives.

The total value of publicly listed stocks is in the neighborhood of $125 trillion (although that figure fluctuates significantly, sometimes wildly); global GDP is approximately $110 trillion; global debt, all-in, is nearly impossible to calculate, but any remotely reliable estimate has the total over a couple hundred trillion dollars. And we haven’t considered other markets, such as insurance, or the values of private arrangements that aren’t reported.

Perhaps the most staggering feature of the above numbers is that, with the obvious exception of GDP, most of that money is changing hands – sometimes many times per day. Billions of trades take place every day in stocks, bonds, and derivatives, not to mention cryptocurrencies.

The core methods of determining the value (that is, “pricing”) of stocks, derivatives, bonds, and all other securities rely on a remarkably small group of techniques for predicting what any one of those financial assets will be worth in the future. Different assets are valued by different metrics, depending on their nature. The remarkable thing is that for any category of asset, there are fundamentally few methodologies.

One result of such widespread reliance on so few concepts is that coming up with a defensible valuation is relatively easy – significantly more straightforward, in fact, than investment bankers who charge clients for this type of work (me included) generally let on when we prepare proposals.

What is tricky is determining whether the price that the asset is currently selling for in the market is appropriate, or if – for reasons that are unknown – eventually the market price will change.

To address this question, die Optimierungsideologen have made (and continue to make) improvements to the existing models. The trouble is that practically all these methods have two common attributes.

First, virtually all of these models rely on past market prices — either of the asset itself or of comparable assets (“comps”) — and assume “approximate stationarity,” which means that they treat the underlying probabilistic structure as relatively stable over recent time horizons.

While tomorrow’s price will not be identical to today’s, it is expected to be generated by a broadly similar process. Structural breaks, regime shifts, policy changes, or major geopolitical shocks can certainly disrupt that stability. But as a working assumption, models proceed as if yesterday’s environment is a reasonable guide to tomorrow’s.

For example, if stock XYZ traded at $150.17 yesterday, it will likely trade somewhere in that neighborhood tomorrow — perhaps a bit higher, perhaps a bit lower — but not at $0.00 or $1 million.
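To make the stationarity assumption concrete, here is a minimal sketch that estimates a plausible range for tomorrow’s price from a trailing window of the stock’s own past returns – exactly the “yesterday is a reasonable guide to tomorrow” reasoning described above. The price series is simulated and every figure (drift, volatility, window length) is a hypothetical assumption, not a claim about any real security.

```python
import math
import random

random.seed(42)

# Simulate a year of daily closing prices for a hypothetical stock "XYZ",
# starting from the $150.17 figure used in the text.
prices = [150.17]
for _ in range(252):
    daily_return = random.gauss(0.0003, 0.012)   # assumed drift and volatility
    prices.append(prices[-1] * math.exp(daily_return))

# Under approximate stationarity, treat the last 60 daily log returns as
# draws from a roughly stable distribution and estimate its parameters.
window = 60
recent = [math.log(prices[i] / prices[i - 1])
          for i in range(len(prices) - window, len(prices))]
mu = sum(recent) / window
sigma = math.sqrt(sum((r - mu) ** 2 for r in recent) / (window - 1))

# A naive two-sigma "neighborhood" for tomorrow's price:
today = prices[-1]
low = today * math.exp(mu - 2 * sigma)
high = today * math.exp(mu + 2 * sigma)
print(f"today: {today:.2f}, expected range tomorrow: [{low:.2f}, {high:.2f}]")
```

The point is not the particular numbers but the logic: the model’s entire forecast is manufactured from past prices, and it is informative only for as long as the generating process stays roughly the same.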

The second attribute is that the models assume the risk of a change occurring – that is, of something different happening tomorrow – can be accurately quantified and encapsulated in a single number: variance.

Variance is both mathematically simple to calculate and conceptually straightforward. Mathematically, variance is the average squared distance from the mean; conceptually, variance is a measure of how widely outcomes are dispersed around their average.
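As a concrete illustration, the two definitions coincide in a few lines of Python; the return series below is hypothetical.

```python
# Variance as "the average squared distance from the mean" for a set of returns.
returns = [0.02, -0.01, 0.03, -0.02, 0.01, 0.00]

mean = sum(returns) / len(returns)
variance = sum((r - mean) ** 2 for r in returns) / len(returns)  # population variance
volatility = variance ** 0.5  # standard deviation, the usual "volatility" figure

print(f"mean={mean:.4f}, variance={variance:.6f}, volatility={volatility:.4f}")
```

That is the entire mathematical content of the concept: a single number summarizing how widely the observations scatter around their average.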

And that is all variance is. How it has been used by die Optimierungsideologen is something else entirely… In Finance, variance is THE proxy for risk.

In 1952, Professor Harry Markowitz changed the world of finance forever when he formally showed how the variance of a portfolio’s returns measures the portfolio’s volatility. In the same paper, Prof. Markowitz also explicitly aligned volatility with risk. From that moment forward, the mathematical identity variance = volatility = risk has been taken as near gospel – a simplified version of the gospel, to be sure, but effectively a holy identity.

The problem Prof. Markowitz’s paper addressed was, in fact, a remarkably practical one. He wanted to provide a method for optimally allocating a fixed amount of capital among alternative investments, given an investor’s risk profile. And Prof. Markowitz’s insight was powerful in large part because it was so practical.

What Prof. Markowitz demonstrated was that by modeling risk as the measurable quantity of variability of historical returns, he created a method that transformed portfolio selection into a solvable optimization problem. By representing risk as variance, a solution could be determined by employing calculus. Today, as then, this approach works admirably. However, problems – major problems – emerge when portfolio managers assume that volatility captures every facet of uncertainty.
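A minimal two-asset sketch shows why representing risk as variance turns allocation into a calculus problem: portfolio variance is a smooth quadratic in the weight, so setting its derivative to zero yields a closed-form answer. All figures here are hypothetical, and this is an illustration of the mean-variance idea, not a reproduction of Markowitz’s full model.

```python
# Two hypothetical assets: annualized volatilities and return correlation.
sigma_a, sigma_b = 0.20, 0.10
rho = 0.3

# Portfolio variance for weight w in A (and 1 - w in B):
#   var(w) = w^2*sa^2 + (1-w)^2*sb^2 + 2*w*(1-w)*rho*sa*sb
# Setting d var/dw = 0 gives the minimum-variance weight in closed form:
cov = rho * sigma_a * sigma_b
w_min = (sigma_b**2 - cov) / (sigma_a**2 + sigma_b**2 - 2 * cov)

def portfolio_vol(w):
    """Standard deviation of portfolio returns for weight w in asset A."""
    var = w**2 * sigma_a**2 + (1 - w)**2 * sigma_b**2 + 2 * w * (1 - w) * cov
    return var ** 0.5

print(f"minimum-variance weight in A: {w_min:.3f}")
print(f"portfolio volatility there:   {portfolio_vol(w_min):.4f}")
```

Note that the minimum-variance portfolio is less volatile than either asset on its own – the diversification benefit that made the framework so compelling. Everything here, however, depends on variance being an adequate stand-in for risk.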

Much has changed since 1952, but the use of variance as the single core quantitative proxy for risk – under the assumption that uncertainty has been reduced to risk or, if you prefer, encapsulated in the single metric of variance – has remained largely intact. The trouble is that thirty years earlier, a professor whom virtually nobody remembers, Frank Knight, had published a book titled Risk, Uncertainty, and Profit.

In his book, Prof. Knight rigorously demarcated risk (which is measurable) from uncertainty (which is not). More specifically, in Prof. Knight’s formulations, “risk” refers to situations in which probabilities can be meaningfully assigned to future outcomes. This is contrasted with “uncertainty” – or what we now call “Knightian uncertainty” – which occurs when not even the probabilities can be known. Uncertainty, simply put, is unknowable. And, more critically, it cannot be quantified or measured based on past information.

The trouble is that risk and uncertainty are used virtually interchangeably in the media, as they are in everyday conversation. Consider two things your friend might say to you about an upcoming blind date.

“I’m uncertain if you two will hit it off.”
vs.
“There is a risk you won’t hit it off.”

Other than recognizing the strong likelihood that you are going to have a lousy time on your date, is there any semantic difference? Of course not – both statements mean the same thing.

“Risk” and “uncertainty” are synonyms in common usage. Unfortunately, even in Financial Economics where there is a difference, few people recognize the distinction. And that is a major problem, because the difference is profound.

Mervyn King is the former governor of the Bank of England (BOE) and has been a professor at the London School of Economics as well as at least a half-dozen other institutions. In his book, The End of Alchemy, he defines “risk” very much the same way Knight did a century ago. “Risk concerns events, like your house catching fire, where it is possible to define precisely the value of the future outcome and to assign a probability to the occurrence of that event based on past experience.”

In contrast, Knightian uncertainty is neither related to past observable events, nor can it be precisely calculated. As King puts it, Knightian uncertainty “concerns events where it is not possible to define, or even imagine, all possible future outcomes, and to which probabilities cannot therefore be assigned.”

And here is where the seriousness of the situation becomes clear… Variance has become – for many reasons – the single number used in a stupefying quantity of financial models. Yet variance only tells part of the story. More accurately, variance only represents risk. As a result, the entire Knightian-uncertainty aspect has been left out of the equation – literally.

Few people – including many, many highly intelligent professionals, managing billions of dollars – understand that by focusing on variance, they are looking at an incomplete picture. Having an incomplete picture is bad enough. However, it is especially dangerous if, as is the case in recent years, those individuals have a false sense of security because they do not know they are using incomplete information and therefore end up taking on much greater risk than intended.

There were many causes underlying the 2007-2009 Great Financial Crisis (GFC), just as there were many reasons that Long Term Capital Management (LTCM) crashed and burned in such breathtaking fashion. Similarly, there wasn’t one cause underlying the 1987 financial crash, the Great Crash of 1929, or the Panic of 1907. But all of those events have one thing in common: they all involved dramatic errors in assessing the level of risk involved.

In fact, it is quite likely that many financial crises – both large and small, those above and others – might not have been as bad as they were, if not for the misguided sense of security provided by the fallacious notion that risk can be entirely quantified.

The key point is this: One of the major pillars on which modern Financial Economics is built is the patently fallacious notion that risk and uncertainty are the same, or at least that they can adequately be mathematically represented by a single variable – variance. This misrepresentation reaches well beyond esoteric debates in ivory towers and has played a role – to one extent or another – in every major financial crisis of the last one hundred fifty years.

Is it any wonder that Prof. King concludes, “The [Great Financial Crisis] was a failure of a system, and the ideas that underpinned it, not of individual policy-makers [sic] or bankers, incompetent and greedy though some of them undoubtedly were. There was a general misunderstanding of how the world economy worked…economics has encouraged ways of thinking that made crises more probable.”

If Prof. King is right – and I firmly believe he is, as the Austrian School has long held – then the approach of die Optimierungsweltanschauung is bound to fail: not because we have yet to properly refine our mathematical techniques, but because there is a systemic flaw in the approach itself. What, then, is the alternative?

Investors, analysts, economists, bankers (central or otherwise), policy makers, and the professors who instruct and advise all the aforementioned need to stop, take a breath, and ask themselves the basic question: “Does this make sense?” In other words, is it time to reintroduce the bland, basic, old-fashioned concept of common sense back into financial decisions?

Please bear with me on this… I promise, the idea of introducing common sense is neither as flippant nor pedestrian as it sounds.

The more formal, fancy name for employing common sense and rules of thumb in combination with empirical data and individual judgment is “heuristics.” And the world functioned reasonably well for a very long time while people employed and relied on heuristics to guide them on any number of matters, large and small.

Employing a heuristic approach is far from perfect. Then again, neither are people. Besides, unlike relying solely on die Optimierungsweltanschauung, employing heuristics doesn’t rule out the use of analytical methodologies. On the contrary: the more information an individual has and brings to bear on a decision, the better.

In The Use of Knowledge in Society, F.A. Hayek argues, among other things, that one benefit of markets (and prices) is that they aggregate disparate information from specialists, and, in doing so, aid in communication. Using a heuristic-centric approach is the natural extension of this idea.

There will never be a “perfect model” – a giant equation, or system of equations – that will be able to replicate the human process of making subjective decisions. Believing otherwise is folly, and an example of hubris in extremis.

The grand experiment of die Optimierungsweltanschauung has contributed much and revealed a great many insights: there is no denying this. Yet there are limits to how much insight can be gained in the social sciences by the application of quantitative analysis. The towering figures who define the Austrian School were able to find peace amidst a sea of uncertainty. It is high time we follow their example instead of chasing the impossible and putting our faith in the unattainable.

 
