AgPa #2: What Moves Stock Prices?

What Moves Stock Prices? The Role of News, Noise, and Information (2022)
Jonathan Brogaard, Thanh Huong Nguyen, Talis J. Putnins, Eliza Wu
The Review of Financial Studies, Forthcoming, URL

This week’s AGNOSTIC Paper attempts to answer a very fundamental question: What drives the day-to-day volatility of stock prices? Of course, many things are at work at the same time: news about the underlying businesses, news about the economy, the market impact of large investors, and much more. The authors estimate the impact of these different types of information in the US stock market. This involves some heavy time-series econometrics, and the paper is primarily a methodological contribution to the academic literature. However, the authors also bring their model to the data and derive some empirical results that are (in my opinion) also relevant for practitioners.

Similar to the previous post, I divided this one into the following parts:

  • Setup and Idea
  • Data and Methodology
  • Important Results and Takeaways
  • Conclusions and Further Ideas

Everything that follows is only my summary of the original paper. So unless indicated otherwise, all tables and charts belong to the authors of the paper and I am just quoting them. The authors deserve full credit for creating this material, so please always cite the original source.

Setup and Idea

The story of information in stock markets goes like this. Greedy investors constantly monitor stock prices and the underlying companies. As soon as new information about future prospects arises, they adjust their expectations and trade accordingly. This process is active all the time and therefore, stock prices reflect all available information. This is of course the basic idea of Fama’s (1970) efficient market hypothesis. We can debate whether this is true or whether price movements are always correct. But we cannot debate that new information somehow affects stock prices and thereby causes volatility.1

To illustrate, the volatility of the average US stock between 1926 and 2018 was almost 40% per year.2 This is a pretty big number and exactly the point where the paper kicks in. The authors take this volatility and develop a model to examine what type of information is driving it.
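To make the 40% figure concrete, here is a minimal sketch of how annualized volatility is typically computed from daily returns. The simulated numbers and the 252-trading-day convention are my own illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Hypothetical daily returns for one stock over roughly one trading year.
rng = np.random.default_rng(42)
daily_returns = rng.normal(loc=0.0005, scale=0.025, size=252)

# Volatility = standard deviation of returns, scaled to an annual horizon
# under the usual i.i.d. assumption (multiply by sqrt(252)).
daily_vol = daily_returns.std(ddof=1)
annualized_vol = daily_vol * np.sqrt(252)
print(f"Annualized volatility ≈ {annualized_vol:.1%}")  # ~40% for a 2.5% daily std
```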

Think of it as sitting in front of a price chart when a company announces its earnings. The price will react somehow and you try to figure out why. Are the earnings bad? Is it a bad day for the market in general? Is there a large investor who wants to sell quickly and distorts the price? In most cases, it is a messy mix and we cannot see the impact of individual pieces of information. But with the authors’ model and the right data, it becomes visible. Specifically, the authors decompose the variation of stock prices into the following components.

Own illustration based on Figure 2 of Brogaard et al. (2022).

In the first step, they distinguish noise and information. What is noise? The term was coined by Fischer Black, who used it to describe the uncertainty around financial markets. Noise comes in various forms and captures everything that affects prices apart from information. These can be behavioral factors like uninformed traders, mechanical issues like illiquidity, or anything else that prevents the price from being at its theoretical value. Sounds bad, right? It is not. If we knew the exact price of a security, no rational person would trade with us. So we need some noise for the market to work.

Subsequently, the authors distinguish between Stock-Specific and Market-Wide information.3 I think this one is straightforward. Market-Wide information captures macro factors that affect all stocks simultaneously, for example interest rates. Stock-Specific information is the micro level and captures everything related to the underlying company, for example earnings announcements. Finally, the authors further divide Stock-Specific information into Public and Private. Public information is available to all market participants whereas Private information is not. Both are important for efficient markets. On the one hand, you want to have fair access to public information. But on the other hand, you also want to encourage smart people to do proprietary analyses and trade on them.

The focus of the paper is to estimate how much of the variation in stock returns is explained by each of the four categories. Why is this interesting? First, I think it is a cool measure of market efficiency. The larger the share of noise, the less efficient the market. All else equal, less efficient areas of the market are more attractive for investors. Second, I just find it interesting. I guess everyone knows the situation when you look at a chart and have no idea why the damn stock is falling again. One day later, the media will of course give you a story. But usually, this has nothing to do with rigorous research. Finally, the paper provides empirical guidance on which information matters most. This tells us something about where we as investors should spend our time.

Data and Methodology

How does the model work? The authors use a vector autoregressive (VAR) model, a common tool from time-series econometrics. As always, I don’t want to bore anybody with mathematical details but rather focus on the idea. The model assumes that stock prices are a function of their own past and the past of other explanatory variables, for example market returns or trading volume. If we have the right data, we can estimate this function and use it to analyze the impact of each variable.

Consequently, we need some empirical proxies for Market-Wide and Stock-Specific information. The authors suggest the following variables.

  • Market-Wide: Return of the stock market
  • Stock-Specific (Private): Signed trading volume in USD (Net Buying vs. Net Selling)
  • Stock-Specific (Public): Individual stock return

However, they also admit that this is just one specification and that their model is flexible enough to accommodate others. There are of course many other feasible ways to think about information in the stock market. But in my opinion, the specification in the paper is transparent and intuitive. In addition to that, return and volume data are readily available for large samples of stocks and long periods of time. This is certainly a problem for more sophisticated information proxies like news coverage.

The intuition behind each variable is straightforward. The aggregate stock market reacts to macro information and consequently, the market return reflects this information. Signed trading volume is an intuitive indicator for Private information. For example, a positive number suggests net buying, which may be driven by sophisticated investors who trade on their proprietary analyses. Finally, the individual stock return captures the remaining Public information, for example the reaction to company disclosures.
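To give a rough sense of how such a system can be estimated, here is a minimal sketch of a VAR with these three variables using statsmodels and simulated placeholder data. The column names, lag choice, and numbers are my own assumptions and not the paper’s exact specification:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulated placeholder series; in the paper, these would be daily observations
# per stock: market return, signed dollar volume, and the stock's own return.
rng = np.random.default_rng(0)
n = 1000
data = pd.DataFrame({
    "market_return": rng.normal(0, 0.010, n),   # Market-Wide information proxy
    "signed_volume": rng.normal(0, 1.000, n),   # Stock-Specific (Private) proxy
    "stock_return":  rng.normal(0, 0.020, n),   # Stock-Specific (Public) proxy
})

# Fit the VAR: each variable is regressed on lags of itself and the others.
model = VAR(data)
results = model.fit(maxlags=5, ic="aic")  # lag order selected by AIC (my choice)
print(results.summary())
```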

What about noise? To estimate the impact of noise, the authors use so-called impulse response functions. These functions are a feature of VAR models and allow us to analyze the impact of shocks over time. But before losing ourselves in more details, let’s look at the example of Figure 1 in the paper.

Panel A of Figure 1 in Brogaard et al. (2022).

The left chart shows the cumulative return of a stock after a shock at t=0. For example, this can be a large buy order that increases the price by 0.9% (peak of the red line). But over time, the market digests this shock and the cumulative return levels off at 0.45%. Based on this model, these 0.45% are the information content of the buy order. Everything else is noise. This is shown in the right chart. Without noise, the price would immediately reflect the true information (blue line). In reality, however, it fluctuates around it and needs some time to adjust (red line).
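In code, the same intuition looks roughly like this: compute cumulative impulse responses and read the long-run level as information and the temporary overshoot as noise. This is only a sketch using the hypothetical `results` object from the snippet above, not the paper’s exact estimator:

```python
# Cumulative orthogonalized impulse responses over 15 days after a shock.
irf = results.irf(periods=15)
cum = irf.orth_cum_effects  # shape (16, n_vars, n_vars); [t, i, j] = response of i to shock j

# Long-run cumulative response of the stock return (index 2) to its own shock:
# the level it settles at is read as information, the overshoot as noise.
permanent = cum[-1, 2, 2]
peak = cum[:, 2, 2].max()
print(f"Information content ≈ {permanent:.4f}, transitory overshoot ≈ {peak - permanent:.4f}")
```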

Of course, this is a stylized example with only one information shock to illustrate how the model works. In reality, there are many simultaneous shocks from all types of information and each creates its own noise. The result is the price charts we see when sitting in front of our screens. Using the model, however, we can also look beneath the surface and see the underlying drivers. This methodology is the key contribution of the paper.
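For readers who want to see what such a decomposition looks like in practice, statsmodels also offers a forecast error variance decomposition, which attributes the variance of each variable to the different shocks. It is related to, but not identical with, the paper’s long-horizon decomposition; I show it only to illustrate the kind of output this type of analysis produces:

```python
# Decompose the forecast error variance of each variable into the shares
# attributable to market-return, signed-volume, and stock-return shocks.
fevd = results.fevd(10)  # 10-day horizon (my choice)
fevd.summary()           # prints, per variable, the share explained by each shock
```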

The authors apply their model to the US stock market. Specifically, they look at all common stocks listed on the three major US stock exchanges (NYSE, AMEX, NASDAQ) between 1960 and 2015. Daily prices, returns, market capitalizations and volumes are from the Center for Research in Security Prices (CRSP). On average, the sample consists of 4,362 stocks per year (22,025 stocks in total) and thereby captures most of the investable US stock market. CRSP is a high-quality database that is widely used for empirical research. Given that the paper is also forthcoming in the Review of Financial Studies, one of the leading finance journals, I think the following results are based on solid data and methodology.

Important Results and Takeaways

As usual, the paper offers a lot of interesting statistics and discussion about econometric methodology. For real-world investing, however, I think the following three results are most relevant.

Stock-Specific information is most important

For the full sample (1960-2015), on average 60.7% of the variation in stock prices is driven by Stock-Specific information. Of this, 36.7 percentage points come from Public information and 24 from Private information. In my opinion, these are very nice results. As mentioned earlier, information processing and efficient pricing of securities are important features of stock markets. Therefore, it is great that almost 2/3 of the variation in prices is actually driven by that. In contrast, the role of Market-Wide information is almost negligible at 8.6%.

Own illustration based on Panel A of Table 2 in Brogaard et al. (2022).

While the high information shares are promising, almost 1/3 of the variation in stock prices is still driven by noise (30.7%).4 We all knew before that financial markets are messy, but I think it is interesting to see this in hard numbers. Ultimately, this chart is consistent with many fundamental investment philosophies. For example, Warren Buffett is famous for saying that one should focus on the underlying business (Stock-Specific) and ignore any stock market or macro forecasts (Market-Wide).5 Moreover, the high share of noise is also consistent with the fact that even the very best investors are only slightly more often right than wrong.

Finally, a technical but important detail on how the authors aggregate those numbers. It is a volatility-weighted average, meaning more volatile stocks receive a higher weight. The authors argue that this is logically consistent because the paper focuses on drivers of volatility. However, this may overweight very volatile small caps that are economically not very relevant. To control for this, the authors also present equal-weighted averages in an internet appendix, and the results are not materially different.
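For illustration, the difference between the two aggregation schemes boils down to the weights in an average. The numbers below are made up and only show the mechanics of why volatile stocks can tilt the headline figure:

```python
import numpy as np

# Hypothetical per-stock noise shares and annualized volatilities.
noise_share = np.array([0.45, 0.35, 0.20, 0.15])
volatility  = np.array([0.80, 0.50, 0.25, 0.20])

vol_weighted   = np.average(noise_share, weights=volatility)  # headline scheme in the paper
equal_weighted = noise_share.mean()                           # robustness check in the appendix
print(f"Volatility-weighted: {vol_weighted:.1%} vs. equal-weighted: {equal_weighted:.1%}")
```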

Markets became more efficient over time

To illustrate how markets changed over time, the authors also present the time series of variation shares from 1960 to 2015. One striking observation is the share of noise (solid black line). Around 1995, about 45% of the variation in stock prices was driven by noise. This is a substantial figure but fortunately, it decreased to about 20% by 2015. This decreasing share of noise suggests that the US stock market became more efficient over those years. According to the authors, this coincides with various liquidity improvements introduced during this period.

Panel B of Figure 3 in Brogaard et al. (2022).

Another interesting observation is the pattern of Market-Wide Information. For almost the entire sample period, it is the least important driver of variation in stock returns. However, it fluctuates considerably. For example, during the financial crisis from 2008 to 2010, the share of Market-Wide information almost doubled to more than 15% (versus 8.6% on average). Similar patterns are observable for other periods of increased market volatility, for example the burst of the internet bubble in the early 2000s. In absolute terms, however, Market-Wide information remains the least important type of information.

Smaller stocks are more noisy

In the second part of the paper, the authors take the variation shares and examine how they relate to certain characteristics of stocks. Most importantly, they look at the market capitalization of the underlying firms, a.k.a. size. Panel C of Table 2 below highlights the results. Q1 represents the 25% smallest firms, whereas Q4 represents the largest 25%. The values in brackets are 99% confidence intervals for the respective variation shares.

Panels A-C of Table 2 in Brogaard et al. (2022).

I want to highlight a few numbers that I find particularly interesting. First, Market-Wide information is most important for large firms (21.09%) and its share decreases as firms get smaller. For the smallest firms, only 5.58% of the variation in stock prices is driven by Market-Wide information. I think this result is reasonable and there are many intuitive explanations for it. For example, sophisticated global-macro investors usually work with stock indices that mainly consist of larger companies.

It is interesting, however, that the share of Stock-Specific information is not higher for small companies. Taken together, it amounts to about 59% (21.85% + 37.01%) for small companies and to 62% (30.33% + 31.67%) for large ones. Why is this the case? Because stock prices of smaller companies are much noisier than those of large ones (35.56% vs. 16.92%). Using the same argument as before, this means that smaller companies are less efficiently priced than large ones. All else equal, this makes them more attractive for active investors.

Conclusions and Further Ideas

I like the paper because it attempts to answer the question we all ask ourselves when looking at a price chart. Why did the price move that way? Which headline or which event was it? With pretty standard data (returns and volume), the authors derive some very interesting empirical results that are overwhelmingly consistent with economic theory. I think that’s a great step in the right direction! But as also mentioned by the authors themselves, this paper is more about methodology than application. Therefore, I hope that we will see some follow-ups that try to estimate different types of information with even better empirical proxies.

The framework is also interesting for regulators and market authorities. For example, liquidity improvements like the reduction of tick sizes reduced the share of noise and thus contributed to more efficient markets. Similarly, mandatory disclosures increased the share of Public information. These are important insights for regulators because they help them evaluate whether their policies are actually useful.

As also mentioned in the last post, my takeaways as an investor are less noble. I think the paper is great because it tells us where we should spend our time. The results are clear. Focus on Stock-Specific information and develop processes to cope with noise. These two are the main drivers of variation in stock returns. The paper also tells us that investing in large caps is difficult. They are more efficiently priced and more of their variation is driven by Market-Wide information. When looking for superior returns, we should therefore focus on small caps and mostly ignore macro forecasts. Once again, that is quite close to the early and highly successful strategy of Warren Buffett.




Endnotes
1 Volatility is the standard deviation of stock returns over a certain period of time. It is one of the most common measures for risk.
2 Israel et al. (2021) mention this statistic to illustrate the low signal-to-noise ratio of stock markets.
3 This is by no means the only way to decompose information and the authors mention this explicitly. Their model, however, works with any type of information decomposition as long as the empirical proxies for it are available.
4 According to a study quoted by the authors, the share of noise is even larger for intraday prices and reaches up to 82%.
5 For example, in this interview.