Crystal Balls, Likely Outcomes and Portfolio Planning

May 6, 2015
by Robert Holton
of Cleary Gull
In presentations to fellow professionals, we often make self-deprecating comments similar to this –
“And then later we’ll tell you exactly when the Fed will raise interest rates. We’ll just grab our crystal
ball and take a look.” That simple joke, variations of which are repeated over and over among financial
professionals, draws on fairly deep-seated anxiety about the predictability of financial markets. It also
may understate the value of systematically studying financial markets — complex systems impacted by
behavioral issues and multiple variables. As I consider this conundrum in early spring, two illustrations come to mind that may help sort out this ambivalence about prediction: sports and the weather.
Sports, Data and Randomness

As March came to a close, a certain madness faded from the college
basketball scene. As I do each year, I selected the winners of each game in the big tournament,
round by round, based on my rather limited game-watching, a review of rosters, a cursory look at team
schedules and a bit of gut feel. Millions of other people do the same, with predictions based on greater
and lesser degrees of knowledge. Despite the sometimes considerable time, money and effort invested, there is no reputable evidence that anyone has ever predicted every winner.
So, should people stop making efforts to more accurately predict winners? “If you can never get the
bracket right,” the logic goes, “then why waste all the time and effort in trying?” Water cooler banter
suggests the least informed might actually have a better chance at accurately predicting the results,
leading to erroneous logic: If no one can be right, everyone’s predictions must be equally valid. And if
the chance of being accurate is low, any expression of confidence in our predictive capability smacks
of unwarranted hubris. So we downplay our chances; we joke about our predictions.
The tournament is worth considering from a predictive standpoint. Certainly there is randomness in any effort involving
18- to 22-year-olds thrust into the national limelight to throw an inflated orange ball through a hoop ten
feet above ground. Yet there has been an entire season of games from which to glean data about
relative strengths and weaknesses of teams and the behavioral tendencies of each player. The practice of
statistical analytics in basketball is at such a high level that the Massachusetts Institute of Technology’s Sloan School hosts a Sports Analytics Conference.1 Using similar analytics, Nate Silver and his friends over
at the blog FiveThirtyEight developed a model for each round using a “composite of power rankings,
pre-season rankings, the team’s placement on the NCAA’s S-curve, player injuries and geography.”2
That’s impressive predictive technology. And yet, teams with a 76% and 91% chance of winning lost in
the first round, and teams with a 72% and 88% chance went down in the second. Even with sophisticated modeling and robust statistical input, randomness persists, and the key issue remains the sheer number of variables.
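A back-of-the-envelope calculation shows why such upsets are unremarkable rather than evidence of a failed model. The sketch below is hypothetical (it assumes a flat 85% win probability for every favorite and is not FiveThirtyEight’s method), but it makes the point: across a full round of games, at least one upset is the expected result.

```python
# Hypothetical sketch: why upsets among heavy favorites are expected.
# The flat 85% win probability is an assumption for illustration only;
# it is not FiveThirtyEight's model.
import math

def prob_no_upsets(win_probs):
    """Chance that every favorite wins, assuming independent games."""
    return math.prod(win_probs)

favorites = [0.85] * 32  # 32 first-round games, each favorite at 85%

print(f"P(all favorites win): {prob_no_upsets(favorites):.2%}")  # ~0.55%
print(f"Expected upsets: {sum(1 - p for p in favorites):.1f}")   # ~4.8
```

In other words, even a well-calibrated model should expect to be “wrong” several times per round; its quality shows up over many games, not in a perfect bracket.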
Weather, Likely Outcomes and Planning

Each year, as March rains fall in the Great Lakes region
and spring takes hold, I begin to think about when I can plant my backyard garden. But my planning
pales in comparison to the financial importance of planting decisions for agricultural concerns across
the country. Agriculture is one industry among many that depend heavily on accurately predicting the
weather to maximize the economic utility of their resources. The rise of computer modeling has
increased overall accuracy of weather forecasts dramatically over the last few decades, from about
65% in 1985 to almost 90% by 2009.3 Yet we still have that nagging feeling that the weather forecast
on our evening news won’t be accurate. The weather app on my phone often differs from the one on my
wife’s phone. Any claim of predictive supremacy here seems silly.
An interesting review of the current state of weather prediction appeared in a recent issue of the Bulletin of the American Meteorological Society. It explores the many models explaining the “Earth system,” which includes
weather, and the varying success in accurately projecting future weather activities. In academic-speak,
here is how they explain the critical issue: “[The Earth system] is the canonical example of a complex
system, in which its dynamics, resulting from interacting multiscale and nonlinear processes, cannot
be predicted from understanding any of its isolated components.”4 In brief, there are many variables, but the key is understanding how they interact in order to project the future state of the weather. The authors go on to differentiate between “projecting the weather,” which yields a set of plausible outcomes, and “predicting the weather,” which yields a likely outcome. That is where it gets
interesting.
The predictive power of our weather models, not the projection, is what makes them useful for a
farmer in Iowa. Knowing with great certainty it will rain approximately one inch less this year than last
year may be mildly helpful. Knowing with a fairly high level of certainty the next week will have constant
rain storms followed by a week of cold weather is far more relevant. As the article details, most
computer modeling makes detailed projections based on specific parameters and processes, with little
compatibility between models, leading to scientific exploration of plausible outcomes. An alternative is
to build inter-compatible models that predict with clearly defined probability. This is the key to useful
weather modeling – specific predictions based on multiple models with an expression of probability.
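A toy sketch may make the distinction concrete. The three model outputs below are invented for illustration (this is not any real forecasting system); the point is that pooling several models yields a single likely outcome plus an explicit measure of confidence.

```python
# Toy ensemble sketch with invented model outputs (not a real forecast system).
# Pooling several models turns a set of plausible projections into one
# prediction with an explicit statement of confidence.
from statistics import mean, stdev

# Hypothetical rainfall projections for next week, in inches.
projections = {"model_a": 1.2, "model_b": 0.9, "model_c": 1.4}

values = list(projections.values())
prediction = mean(values)  # the likely outcome
spread = stdev(values)     # disagreement among models = lower confidence

print(f"Predicted rainfall: {prediction:.2f} in, +/- {spread:.2f}")
# Tight agreement -> high confidence in the prediction; wide spread ->
# better to report the range of plausible outcomes instead.
```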
When faithful Eeyore, from A.A. Milne’s classic Winnie-the-Pooh series, intones woefully, “Looks like
rain,” Pooh should ask, “How likely is it to rain? How confident are you? And based on what data?”
That would be far more useful to Pooh’s quest to find honey. But imagine the difficulty of assessing a
weather forecast that includes probability for each temperature prediction. The key issue may not be
improving our models, but our desire as an audience for simplified answers.
The Markets and Crystal Balls

So what might the unpredictability of sports and the spring weather
teach us about how we discuss financial market predictions? Financial markets also are “complex
systems” that include behavioral issues, like we see in sports, and an incredible number of variables,
like we see in weather systems. That complexity inherently makes outcomes unpredictable; hence our joke about the elusive crystal ball.
As we work to further develop our understanding of behaviors and relative values in the global financial
markets, we must remain clear on our objective. We are not attempting to reach 100% accuracy;
rather, we are seeking to improve our accuracy with each prediction. This goal properly places the
value on the experience, research and technical capabilities of our analytical staff. As we drill down
from a global economic view all the way to specific investment vehicles, we look at the numbers
(quantitative analysis) and then supplement that with how people are behaving (qualitative analysis).
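As a hypothetical illustration only (the fields, figures and inputs below are invented and do not represent Cleary Gull’s actual process), such a prediction can carry its quantitative and qualitative basis along with an explicit likelihood, answering Pooh’s three questions: how likely, how confident, and based on what data.

```python
# Hypothetical sketch of a prediction with an explicit confidence level.
# All figures and inputs are invented; this is not Cleary Gull's methodology.
from dataclasses import dataclass

@dataclass
class Prediction:
    statement: str      # the educated prediction itself
    probability: float  # how likely we judge the outcome to be
    basis: str          # the data and reasoning behind it

call = Prediction(
    statement="Equities outperform bonds over the next 12 months",
    probability=0.65,  # assumed figure for illustration
    basis="valuation screens (quantitative) + investor sentiment (qualitative)",
)

print(f"{call.statement}: {call.probability:.0%} likely, based on {call.basis}")
```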
We will never have the hubris to believe we can predict global financial markets with certainty, but that is not our
goal. Our goal is to make educated predictions based on the most complete data available to us, with a
clear indication of our confidence level in that prediction. Our mission is to help our clients build the
most cost-effective and risk-aware portfolio, planning carefully for the most likely outcomes in order to
meet their desired goal. That requires a lot of research, effort and a bit of humility. Thankfully, it does
not require a crystal ball.
1 See more here: http://www.sloansportsconference.com/

2 Reuben Fischer-Baum, Andrew Flowers, Allison McCann, Dhrumil Mehta and Nate Silver, 2015: 2015 March Madness Predictions, FiveThirtyEight. Accessed at http://fivethirtyeight.com/interactives/march-madness-predictions-2015/#mens on 4/2/2015.

3 Jesse Ferrell, The Secrets of Weather Forecast Models, Exposed. Accessed at http://www.accuweather.com/en/weather-blogs/weathermatrix/why-are-the-models-soinaccurate/18097 on 4/2/2015.

4 Matthew J. Smith, Paul I. Palmer, Drew W. Purves, Mark C. Vanderwel, Vassily Lyutsarev, Ben Calderhead, Lucas N. Joppa, Christopher M. Bishop, and Stephen Emmott, 2014: Changing How Earth System Modeling is Done to Provide More Useful Information for Decision Making, Science, and Society. Bull. Amer. Meteor. Soc., 95, 1453–1464. Accessed at http://journals.ametsoc.org/doi/full/10.1175/BAMS-D-13-00080.1 on 4/2/2015.
© Cleary Gull