The Creating Wealth Show #105:
Propelling investment success using econometrics and
competitive analytics
In the one hundred and fifth episode of Jason’s extremely popular Creating Wealth Show, he interviews David Savlowitz, head of Competitive Analytics (CA), a niche full-service market intelligence firm. David explains how his firm uses a wide variety of data to generate more robust information for its clients than the simplistic scorecards employed by most of the financial media can provide. CA’s methods analyze supply and demand using statistics, econometrics, predictive modeling, comprehensive research, and applied mathematics.
In this show, Jason talks with David about the ways that Competitive Analytics uses comprehensive data analysis to drive prediction models for its clients. One method they frequently employ is an economic composite score based on a set of weighted indicators. Applying this methodology to the general economy, David’s model predicts the bottom of the economic cycle in Q4 2009, with a return to equilibrium by Q4 2011. Furthermore, his models forecast a U-shaped recovery with an extended trough. This stands in sharp contrast to previous V-shaped recoveries, which experienced an immediate “bounce back” from market lows. The reason for the extended trough is that a significant adjustment must occur to work off the debt-financed over-consumption that fueled the recent asset bubbles.
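The weighted-composite idea above can be sketched in a few lines. This is a minimal illustration only, with hypothetical indicator names, readings, and weights; it is not CA’s actual model or weighting scheme.

```python
# Minimal sketch of a weighted economic composite score.
# Indicator names, readings, and weights are invented for illustration.

def composite_score(indicators, weights):
    """Weighted average of normalized indicator readings (0-100 scale)."""
    if set(indicators) != set(weights):
        raise ValueError("indicators and weights must cover the same keys")
    total_weight = sum(weights.values())
    return sum(indicators[k] * weights[k] for k in indicators) / total_weight

# Hypothetical readings, each already normalized to a 0-100 scale:
indicators = {"employment": 42.0, "housing_starts": 31.0, "consumer_credit": 55.0}
weights = {"employment": 0.5, "housing_starts": 0.3, "consumer_credit": 0.2}

score = composite_score(indicators, weights)
print(round(score, 1))  # -> 41.3
```

The weights encode an analyst’s judgment about which indicators matter most, which is exactly why two firms can read the same data and produce different composite scores.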
The unique part of David’s methodology is that his team uses a very wide variety of input variables in an attempt to capture factors that may become big swings in the future. He rightly understands the danger inherent in quantitative economics for people who do not fully understand the analysis: econometric analysis uses past trends to predict the future, and thus cannot anticipate the impact of events that have never happened before. Rare events like September 11th, 2001, the Russian Financial Crisis, and the collapse of credit default swaps were never incorporated into any prediction models because they had no precedent.
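The limitation described above is easy to demonstrate. In this minimal sketch (all numbers invented), an ordinary least-squares trend line is fitted to smooth historical values and then asked to forecast a period containing a shock it has never seen; the forecast simply extends the trend.

```python
# Minimal sketch of why trend extrapolation misses unprecedented events.
# Fits a least-squares line to smooth historical data, then forecasts a
# period that contains a shock absent from the history. Data is invented.

def ols_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line through (xs, ys)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

history = [100, 102, 104, 106, 108]   # steady past trend, +2 per period
slope, intercept = ols_fit(list(range(5)), history)

forecast = slope * 5 + intercept      # model's prediction for period 5
actual = 70                           # sudden shock, unlike anything in history

print(forecast)  # -> 110.0 (the model confidently extends the trend)
print(actual)    # the shock the model could never have anticipated
```

No amount of refitting on pre-shock data changes the outcome, which is why practitioners like David treat such models as one input among many rather than an oracle.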
Each of these events had a profound impact on the marketplace and left people who were blindly following the technical trends of the past absorbing unbelievable losses (or pushing those losses onto the taxpayer in the form of a government bailout). As a point of reference, the hedge fund Long-Term Capital Management, whose partners included Nobel laureates Robert Merton and Myron Scholes, made heavy use of econometrics to undertake highly leveraged arbitrage trades in the bond market, but nearly collapsed the financial markets after the Russian Financial Crisis in September of 1998. Its rescue became the first iteration of the “too big to fail” argument, and is being used as precedent for the government bailouts of financial institutions currently being pushed on the marketplace.
This is not to say that quantitative analysis and econometrics are inherently dangerous. They are simply tools . . . very powerful tools that need to be understood before they are used. When applied by knowledgeable professionals, they can generate valuable insights. When handed over to pseudo-intellectual or short-sighted agents, they can become instruments of mass financial destruction, as the algorithms become objects of blind faith that drive insane investment decisions. As with all tools, the result depends largely on how they are used. Thankfully, David keeps the scope and limits of his analytics in perspective. A strong dose of that perspective is highly advised for anybody who seeks to incorporate econometric analysis into investment decisions.