Above left: Plantsville, CT after the February 8, 2013 blizzard. Above right: Satellite image of Hurricane Sandy. Credit: NOAA
The summer of 2013 was a wet one, and meteorologists often explained how far "above normal" the precipitation was to illustrate the anomaly. Even last year, which was one of the warmest years on record, was compared to the "normal" annual temperature to put the extreme in perspective. But how exactly do meteorologists know what the “average” weather is for a given location?
First, the climate community uses data from a 30-year period to learn what typical temperature and precipitation values occur on daily, monthly, and even yearly time frames. Data from a variety of weather stations, including the National Weather Service’s Cooperative Observer Program (COOP) network, is used to compute these climate normals. The current climate normals are based on weather data from 1981 through 2010, but they are recalculated every ten years. New 30-year normals will be released in 2021, based on data from 1991 through 2020.
But how are these normals calculated? To find Boston's typical maximum temperature in January, for example, the monthly maximum temperatures for each of the 30 Januarys between 1981 and 2010 are averaged together. Similarly, averaging the annual snowfall for each of the 30 years at, say, New York City gives the Big Apple's typical yearly snowfall. Of course, numerous quality control measures filter out the less reliable stations and highlight those with more complete and accurate records. Check out the graph below to see how close to "average" the monthly maximum temperature in Boston and the monthly precipitation in NYC have been so far in 2013.
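For the curious, here is a minimal sketch of that averaging in Python. The CSV file and the column names ("date", "tmax") are hypothetical stand-ins for a real station record, and the actual procedure used for official normals involves far more quality control and adjustment than a simple mean.

```python
import pandas as pd

# Minimal sketch: estimate a "normal" January maximum temperature from a
# table of daily observations. File and column names are hypothetical.
obs = pd.read_csv("boston_daily_obs.csv", parse_dates=["date"])

# Keep only the 1981-2010 normals period.
period = obs[(obs["date"].dt.year >= 1981) & (obs["date"].dt.year <= 2010)]

# Average each January's daily maximum temperatures into one monthly value...
januaries = period[period["date"].dt.month == 1]
monthly_max = januaries.groupby(januaries["date"].dt.year)["tmax"].mean()

# ...then average the 30 January values together to get the normal.
normal_jan_tmax = monthly_max.mean()
print(f"Normal January maximum temperature: {normal_jan_tmax:.1f} °F")
```

The same pattern applies to the snowfall example: sum each year's snowfall at a station, then average the 30 yearly totals.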
So, the next time your local TV meteorologist uses the word “normal” to illustrate how unusual a weather event is, you'll have some perspective on what he or she means!