Experimental Long-Range Weather Predictions
by R. Clayton Brough, UCCW Climatologist, May, 2005
During the past several years, an increasing number of individuals and companies have inquired about the accuracy of various long-range weather predictions offered by private companies and non-government sources. Currently, UCCW officials are analyzing some of these long-range predictions and their claims of accuracy; the results of that analysis will be publicized in 2006.
For example, The Old Farmer's Almanac claims that it uses "a secret formula that was devised...in 1792" which "predict[s] weather trends and events by comparing solar patterns and historical weather conditions with current solar activity." This Almanac claims that "although neither we nor any other forecasters have as yet gained sufficient insight into the mysteries of the universe to predict the weather with total accuracy, our results are almost always very close to our traditional claim of 80 percent [accuracy]."
In comparison, WeatherPlanner claims that it uses a "proprietary scientific process" based on "historical weather information" that was "originally developed in the 1930s by Dr. Irving P. Krick." This company--which produces daily weather predictions "of expected precipitation and temperatures up to a year in advance" for specific locations--claims that some of its predictions "were correct 77 percent of the time in 1998 and 83 percent of the time in 1999."
Although I do not endorse or support the methods or predictions made by The Old Farmer's Almanac, WeatherPlanner, or other such companies or sources, I do feel that these predictions are "interesting" or "possibly useful" if and when they match known climatological precipitation probabilities--such as days or weeks which are statistically (and historically) the wettest during a particular month for a specific location.
Comparative Study of Long-Range Forecasts from Two Commercial Providers and Forecasts Using Climatology
and Persistence Against Actual Observations Taken at
Salt Lake City, Utah, over a Twelve-Month Period (2004-2005)
by Dan A. Risch, Meteorologist, CCM
Associate Chairman, Board of Consultants, UCCW
August, 2006
This study compares weather parameters forecast for Salt Lake City, Utah by two long-range commercial forecast providers to actual observed conditions. The parameters of temperature and precipitation were used in the study, and a determination was made as to how well the two providers did in forecasting a year in advance for Salt Lake City. These two providers are referred to as Provider A and Provider B. The time period covered is slightly different for each provider: Provider A covers November 2004 through October 2005, while Provider B covers December 2004 through November 2005, so there is a one-month shift between the study periods. Additionally, climatology and persistence forecasts were compared to the actual Salt Lake City conditions; the time period used for these was December 2004 through November 2005.
The results of the temperature forecast comparison show that Provider A was correct 39 percent of the time while Provider B was correct 47 percent of the time. Climatology was correct 55 percent of the time, and Persistence was correct 65 percent of the time.
Results for the combined yes-/no-precipitation day forecasts showed that Provider A was correct 54 percent of the time, Provider B was correct 59 percent of the time, climatology was correct 60 percent of the time, and persistence was correct 70 percent of the time. The forecasts for yes-precipitation days only show that Provider A was correct 46 percent of the time, Provider B was correct only 19 percent of the time, climatology was correct 30 percent of the time, and the persistence forecast was correct 57 percent of the time.
In general, there was little skill shown by Provider A or Provider B. Their highest scores, in the combined yes- and no-precipitation day forecasts, were only a little better than 50/50. It is interesting to note that if no precipitation had been forecast for the entire year, the resultant correct forecasts would have been near 60 percent. The climatology and persistence forecasts both outperformed the two commercial providers.
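(For reference: with 138 observed precipitation days in the roughly 365-day comparison period, as reported in the precipitation results below, a blanket no-precipitation forecast would verify on the remaining days, about 227 of 365, or roughly 62 percent.)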
DATA COLLECTED FROM PROVIDERS
Provider A
Forecast information covering the period from November 2004 through October 2005 was collected from this provider. This provider issued forecasts using blocks of days to which they applied word descriptors indicating what type of temperatures and precipitation they were forecasting.
For temperature forecasts, Provider A used descriptive terms such as "chilly," "cool," "mild," and "hot"; the full list, with the category assigned to each term, appears in the Comparing the Data section below.
For precipitation forecasts, Provider A likewise used descriptive terms indicating the type of precipitation expected.
Provider B
Forecast information covering December 2004 through November 30, 2005 was collected from this provider. Forecast data for December 1 and 2 were not available. This provider issued forecasts on a daily basis through the period of interest. These forecasts used temperature ranges for the daily maximums, and descriptors indicating what type of precipitation they were forecasting.
For temperature forecasts, Provider B used ranges for the expected daily maximum temperature.
For precipitation forecasts, Provider B used descriptive terms, including "chance" and "high elevation" (see the Comparing the Data section below).
DATA COLLECTED FROM SALT LAKE CITY
Although there are several locations in the Salt Lake Valley that collect temperature and precipitation data, the official site representing Salt Lake City is located at the Salt Lake City International Airport (KSLC) and is operated by the National Oceanic and Atmospheric Administration (NOAA) through the National Weather Service (NWS). It is this site that was chosen to provide the actual weather data for comparison to the forecast data from the two providers and to climatology. The NWS provides a summary of daily weather conditions recorded at the airport site, including maximum and minimum temperatures and precipitation in both snow and liquid forms. The summary is published monthly on WS Form F-6, which is available on the Salt Lake City NWS web site. It is from this form that the comparative data for KSLC were collected for the period of November 2004 through November 2005.
COMPARING THE DATA
Temperature
With Provider A, a scheme was developed to convert the terms used to describe temperature conditions so that they could be directly compared to the actual temperatures recorded. A simple above-normal, normal, and below-normal table was developed, and Provider A's temperature terms were put into this three-category table using the following criteria (a code sketch follows the list):
chilly = below normal
cool = below normal
cooler (coming off of hot) = above normal
cold = below normal
very cold = below normal
hot = above normal
mild = normal
milder (coming off of cold) = below normal
seasonable = normal
warm = normal
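For illustration, the mapping above can be expressed as a simple lookup table. This is a minimal sketch in Python; the names and the dictionary itself are mine, not the provider's or the study's:

    # Sketch of Provider A's descriptor-to-category mapping
    # (illustrative names, not the providers' or the study's code).
    TERM_TO_CATEGORY = {
        "chilly": "below normal",
        "cool": "below normal",
        "cooler (coming off of hot)": "above normal",
        "cold": "below normal",
        "very cold": "below normal",
        "hot": "above normal",
        "mild": "normal",
        "milder (coming off of cold)": "below normal",
        "seasonable": "normal",
        "warm": "normal",
    }

    def categorize_forecast(term: str) -> str:
        """Map a Provider A descriptor to one of the three categories."""
        return TERM_TO_CATEGORY[term.lower()]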
Provider B gave temperature ranges with their forecasts and these could be directly compared to actual records with no conversion necessary.
The actual temperature records from KSLC were converted to fit the three-category table developed for Provider A's temperature forecasts, to allow comparison to those forecasts. Each day's recorded maximum and minimum temperatures were placed into one of the three categories by looking at the standard deviation of the normal maximums and minimums for each day at KSLC. If either the daily maximum or daily minimum temperature (or both) was more than one standard deviation above the average maximum or minimum for the day, that day was added to the above-normal category. Similarly, if either the maximum or minimum temperature (or both) was more than one standard deviation below the average maximum or minimum for the day, that day was added to the below-normal category. All other days were added to the normal category.
The standard deviations used in this exercise were taken from the Western Regional Climate Center (WRCC) web site (http://www.wrcc.dri.edu/climsum.html). These deviations were based on KSLC data over the 30-year period from 1971 through 2000. The standard deviations were calculated separately for each day's maximum and minimum temperature.
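The categorization rule can be sketched as follows, assuming the daily normals and standard deviations are available as plain numbers (argument names are illustrative). Where a day would qualify as both above and below normal (an unusually warm maximum with an unusually cold minimum), the study does not say which category wins; this sketch arbitrarily checks above normal first:

    def categorize_day(tmax, tmin, norm_max, norm_min, sd_max, sd_min):
        """Place a day into the three-category scheme used for Provider A.

        Above normal if the max or min exceeds its normal by more than one
        standard deviation; below normal if the max or min falls more than
        one standard deviation under its normal; otherwise normal.
        (Ties between above and below are resolved arbitrarily here.)
        """
        if tmax > norm_max + sd_max or tmin > norm_min + sd_min:
            return "above normal"
        if tmax < norm_max - sd_max or tmin < norm_min - sd_min:
            return "below normal"
        return "normal"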
A "climatology" forecast for temperatures was also developed. Average temperatures for the period from 1971 through 2000 were used as the climatological forecast. Additionally a "persistence" forecast was developed, where the maximum and minimum temperatures that occurred on one day were used as the forecast for the next day.
Precipitation
In the case of both Providers, if a forecast of precipitation of either rain or snow was given for any particular day, then that day was considered a forecast precipitation-day. Forecasts using the term "chance" or the term "high elevation" (as used by Provider B) were considered non-precipitation-day forecasts. Each day was then compared directly to the KSLC precipitation records. If a trace or more of rain or snow (or both) fell on a day that a Provider had forecast as a precipitation-day, the forecast was considered correct (a hit). Likewise, if no precipitation was forecast and no precipitation occurred, it was also considered a hit.
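This yes/no scoring can be sketched as follows, assuming each day has already been reduced to a boolean (a trace or more counts as precipitation); the function name is mine:

    def score_precip(forecast_days, observed_days):
        """Score paired yes/no precipitation forecasts.

        forecast_days and observed_days are equal-length sequences of
        booleans: True means precipitation forecast/observed on that day.
        A hit is either a correct 'yes' or a correct 'no'.
        """
        hits = sum(f == o for f, o in zip(forecast_days, observed_days))
        return hits / len(forecast_days)  # fraction of days correct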
As with temperature forecasts, a "climatology" forecast for precipitation was developed. The method used for this forecast could be considered a modified climatology. In this method the average number of days that experience precipitation each month of the year was tabulated. This information was taken from NOAA Technical Memorandum NWS WR-152, Climate of Salt Lake City, Table 36. Table 36 is based on data from 1928 through 2001. The percentage chance for precipitation for each day of the year was then assessed. This data was also collected from NOAA Technical Memorandum NWS WR-152, Climate of Salt Lake City, Table 42a. Table 42a is based on data from 1928 through 2001.
Each month was looked at individually. The average number of days with precipitation in a particular month was used as the number of forecast precipitation-days for that month. Those forecast days were then assigned to the calendar days with the highest climatological chances of precipitation for that month. For example, Salt Lake City averages 10 days with precipitation in January, so the 10 days in January with the highest percentage chance of precipitation were chosen as the days on which precipitation fell in the climatology forecast.
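A sketch of this selection for a single month; the percentage values below are placeholders, not the actual Table 42a figures:

    def climatology_precip_days(daily_chance, avg_precip_days):
        """Pick the N calendar days with the highest climatological chance
        of precipitation, where N is the month's average number of
        precipitation days (Table 36) and daily_chance maps day-of-month
        to percent chance (Table 42a)."""
        ranked = sorted(daily_chance, key=daily_chance.get, reverse=True)
        return set(ranked[:avg_precip_days])

    # Illustrative only -- placeholder percentages, not Table 42a data:
    january_chance = {day: 30.0 if day <= 10 else 20.0 for day in range(1, 32)}
    forecast_days = climatology_precip_days(january_chance, 10)  # 10 avg days in January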
The persistence forecast was developed by forecasting precipitation for the day after a day which recorded precipitation, and forecasting no precipitation for the day following a day which recorded none.
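The corresponding persistence sketch, again assuming a boolean series ordered by date:

    def persistence_precip_forecast(observed_days):
        """Persistence: the forecast for day i+1 is whatever happened on day i."""
        return observed_days[:-1]  # pairs with observed_days[1:] for scoring

    # e.g., using the scoring sketch above:
    #   score_precip(persistence_precip_forecast(obs), obs[1:])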
RESULTS SUMMARY
Temperature
Provider A's word-descriptor temperature forecasts were tabled in the above-normal, normal, below-normal scheme and compared to actual temperatures from KSLC, which were processed into the same scheme using plus or minus one standard deviation from the average maximum and minimum temperatures. This produced results showing Provider A's temperature forecasts were correct 36 percent of the time.
Provider B's temperature range forecasts for the daily maximum, compared to the actual maximum temperatures, were correct 47 percent of the time.
Climatology, which forecasts the average daily maximum and minimum temperatures and therefore always falls in the "normal" category by definition, was correct 55 percent of the time when compared against the above-normal, normal, below-normal scheme developed for Provider A.
Persistence, with both the forecast and the actual maximum and minimum temperatures tabled in the same three-category scheme described above for Provider A, was correct 65 percent of the time.
Precipitation
Provider A, using descriptor forecasts for the type of precipitation compared to a yes/no on actual precipitation, was correct 46 percent of the time on a forecast of 162 precipitation days, and correct 54 percent of the time on the combined forecasts of precipitation or no precipitation (198 days). The number of actual precipitation days was 131*. Among the missed forecasts, there were 102 cases where precipitation was predicted but none occurred, and 65 cases where no precipitation was predicted but precipitation did occur.
Provider B, using descriptor forecasts for the type of precipitation compared to a yes/no on actual precipitation, was correct 19 percent of the time on a forecast of 67 precipitation days, and correct 59 percent of the time on the combined forecasts of precipitation or no precipitation (220 days). The number of actual precipitation days was 138*. Among the missed forecasts, there were 39 cases where precipitation was predicted but none occurred, and 109 cases where no precipitation was predicted but precipitation did occur.
Climatology, using a yes/no precipitation-day compared to a yes/no actual precipitation-day, was correct 30 percent of the time on a forecast of 91 precipitation days, and correct 60 percent of the time on the combined forecasts of precipitation or no precipitation (219 days). The number of actual precipitation days was 138. Among the missed forecasts, there were 50 cases where precipitation was predicted but none occurred, and 96 cases where no precipitation was predicted but precipitation did occur.
Persistence, using a yes/no precipitation-day compared to a yes/no actual precipitation-day, was correct 57 percent of the time on a forecast of 137 precipitation days, and correct 70 percent of the time on the combined forecasts of precipitation or no precipitation (256 days). The number of actual precipitation days was 138. Among the missed forecasts, there were 54 cases where precipitation was predicted but none occurred, and 54 cases where no precipitation was predicted but precipitation did occur.
* The number of actual precipitation days varies between the providers due to the one-month shift in the period over which each provider was compared to Salt Lake City.
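For readers who want to reproduce the tallies above, the bookkeeping amounts to a standard two-by-two verification table; this is a sketch with illustrative names, not the study's actual code:

    def contingency_counts(forecast_days, observed_days):
        """Tally the four cells of a yes/no verification table."""
        hits = false_alarms = misses = correct_negatives = 0
        for f, o in zip(forecast_days, observed_days):
            if f and o:
                hits += 1                  # precipitation forecast and observed
            elif f and not o:
                false_alarms += 1          # forecast, but none occurred
            elif not f and o:
                misses += 1                # not forecast, but precipitation occurred
            else:
                correct_negatives += 1     # correctly forecast dry day
        total = hits + false_alarms + misses + correct_negatives
        percent_correct = 100 * (hits + correct_negatives) / total
        return hits, false_alarms, misses, correct_negatives, percent_correct

Dividing the hits plus correct negatives by the total number of days, as in the last lines of the sketch, reproduces percent-correct figures of the kind quoted above.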