As one anonymous observer put it, “Baseball is an island of activity amidst a sea of statistics.” There’s no doubt statistics drive the game. Here’s a general timeline of how that has played out over the years.
A pitcher’s Earned Run Average (or ERA) is the average number of earned runs that a pitcher gives up per nine innings pitched (as the typical game lasts nine innings).
An earned run is a run that is not scored as the result of a defensive error, such as a fielding error or a passed ball.
A pitcher’s ERA is calculated by dividing the number of earned runs he has allowed by the number of innings he has pitched, then multiplying by nine. For example, if a pitcher is charged with 21 earned runs over the course of 90 innings pitched, his ERA would be 2.10.
(21 / 90) x 9 = 2.10
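The ERA calculation above is easy to sketch in code. Here is a minimal Python version; the function name and rounding to two decimal places (the conventional display format) are my own choices, not anything official:

```python
def era(earned_runs, innings_pitched):
    """Earned Run Average: earned runs allowed per nine innings pitched."""
    return earned_runs / innings_pitched * 9

# The example from the text: 21 earned runs over 90 innings pitched.
print(round(era(21, 90), 2))  # 2.1
```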
An ERA under 3.00 is generally considered to be excellent. The lower a pitcher’s ERA, the better.
The lowest career ERA in baseball history was 1.82, by Ed Walsh, who pitched from 1904 to 1917. The lowest career ERA of the live-ball era (that is, post-1920) belongs to Mariano Rivera, who pitched from 1995 to 2013 and posted an ERA of 2.21.
A player’s batting average is determined by dividing the number of base hits a player has by the total number of at-bats. For example, if a player has 500 at-bats and collects 150 hits in those at-bats, his batting average would be .300 (150/500 = .300). Keep in mind that walks and sacrifice plays (i.e. sacrifice bunts and sacrifice flies) do not count as at-bats, and therefore, do not factor into a player’s batting average.
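The batting-average calculation works the same way. A quick Python sketch (again, the function name is mine; the caller is responsible for passing official at-bats, which already exclude walks and sacrifices):

```python
def batting_average(hits, at_bats):
    """Batting average: base hits divided by official at-bats.
    Walks and sacrifice plays are not at-bats, so they never enter this ratio."""
    return hits / at_bats

# The example from the text: 150 hits in 500 at-bats.
avg = batting_average(150, 500)
print(f"{avg:.3f}".lstrip("0"))  # .300, in the conventional display form
```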
A batting average of .300 or above is considered excellent, and an average of .400 for a season is deemed nearly impossible. The last player to hit .400 for a season was Ted Williams, who finished the 1941 season with a .406 batting average.