Relationship between Variability of Past Returns and Levels of Future Returns for Common Stocks, 1926–1960
Abstract

The relationship between risk and expected returns of common stocks has been an integral part of valuation theory since before Graham and Dodd authored their classic Security Analysis in 1934. In 1959, Harry Markowitz brought a quantitative framework to the discussion. He adopted the variance (or the square root of variance, the standard deviation) of annual returns as a usable measure of risk. The empirical study of the relationship of the standard deviation of annual returns, as a measure of total risk, to ex post annual returns is the subject of this essay. The essay reports on one of the first empirical studies using the databases of the Center for Research in Security Prices at the Graduate School of Business of the University of Chicago.
Contributor Notes
Dr. Shannon P. Pratt, CFA, FASA, MCBA, CM&AA, is Chairman and CEO of Shannon Pratt Valuations in Portland, Oregon. This essay is drawn from his dissertation submitted to fulfill the requirements for his Doctor of Business Administration in Finance at Indiana University. The essay was first published in Frontiers of Investment Analysis, revised edition, edited by E. Bruce Fredrickson (out of print). At the time the essay was published, Dr. Pratt was Associate Professor and Director, Investment Analysis Center, Portland State University. The author wishes to thank the following for their assistance with this research and writing: Harry Sauvain, James Lorie, Lawrence Fisher, Benoit Mandelbrot, A. James Heins, William Sharpe, Paul Cootner, and Alexander Robichek. Financial support for the research was provided in part by the Ford Foundation.