In a recent post, I briefly discussed the possibility that expansion contributed to the rise in home runs. The idea is based on the work of Stephen Jay Gould, who posited that as talent becomes less (more) dispersed, exceptional achievements become less (more) likely to occur. Since home run hitters tend to be baseball's best hitters, their improvement should be expected as pitching talent becomes more dispersed. Expansion is one cause of talent dispersion.
Reader Shek sent me this excellent graph (actually, he sent me his beautifully crisp Stata code to recreate it) of home run rates and the number of teams per season.
The rise in home runs does seem to move with expansion in the 1990s and possibly the late 1960s, but the relationship is hardly airtight, nor is it necessarily causal. And even in the 1990s, it's difficult to know whether the corresponding spikes are noise or real effects.
But expansion isn't the only cause of dispersion. In Chapter 8 of The Baseball Economist, I discuss how league dispersion has changed over time by measuring the variation in performance across players. With and without expansion, the gap between the best and worst pitchers and hitters has fluctuated quite a bit over time, as the graph below shows.
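A minimal sketch of the kind of dispersion measure described above: compute the spread of pitcher performance within each season, here as the coefficient of variation (standard deviation divided by the mean) of ERA. The sample numbers and the use of ERA as the performance stat are illustrative assumptions, not the book's actual data or exact method.

```python
import statistics

# Hypothetical sample: (year, ERA) pairs for a handful of pitchers.
# Real work would use a full season-level pitching table.
pitchers = [
    (1968, 2.4), (1968, 2.9), (1968, 3.1), (1968, 3.5),
    (1998, 2.8), (1998, 3.9), (1998, 4.7), (1998, 5.6),
]

def dispersion_by_year(rows):
    """Coefficient of variation of ERA per season.

    A higher value means pitching performance is more spread out,
    i.e. a bigger gap between the best and worst pitchers.
    """
    by_year = {}
    for year, era in rows:
        by_year.setdefault(year, []).append(era)
    return {
        year: statistics.stdev(eras) / statistics.mean(eras)
        for year, eras in by_year.items()
    }

print(dispersion_by_year(pitchers))
```

With these made-up numbers, the 1998 sample shows a higher coefficient of variation than the 1968 sample, which is the pattern the argument turns on: more dispersed pitching gives good hitters more bad pitchers to feast on.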
The 1990s and 2000s were two of the most dispersed decades for pitchers in baseball history: while many good pitchers excelled, there were also many bad pitchers for batters to feast on. During other past expansions, however, pitching quality was relatively compact. So it should be no surprise that recent expansions had a bigger effect than earlier ones. Still, this is not proof, just evidence consistent with a theory that is very difficult to test.
Thanks to Shek for passing the graph along.