After digging around, I figured out that the GFS was initializing with bad data. As with any computer model, if you put bad data in... bad data comes out. The GFS was using the Air Force Snow Depth Analysis to initialize its model run. That data is bad in a heavy sleet and freezing rain event! Kate Howard at the National Centers for Environmental Prediction (NCEP) said:
"We currently use the AFWA (USAF) snow data in the GFS and NAM. Both model teams are aware of the current issues with the AFWA dataset and are looking into solutions for our respective forecast systems."
So, we found the problem! But why is the USAF data wrong? Why does it take a heavy sleet and freezing rain event and make it look like a huge snow storm? Well, remember that sleet accumulates much differently than snow. 1.00" of sleet can be like 8-10" of snow! Here's how AFWA comes up with its snow depth:
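To make that equivalence concrete, here is a minimal sketch of the rule of thumb above. The conversion factor is taken straight from the 8-10x estimate in this post (midpoint of 9); it is illustrative, not an AFWA constant.

```python
# Rule of thumb from the post: 1.00" of sleet holds roughly as much
# water as 8-10" of snow. The factor below is the midpoint of that
# estimate and is purely illustrative.
SLEET_TO_SNOW = 9.0

def snow_equivalent(sleet_inches):
    """Snow depth with roughly the same liquid content as the sleet."""
    return sleet_inches * SLEET_TO_SNOW

print(snow_equivalent(1.0))  # 1" of sleet ~ 9" of snow
```

So a dataset that logs sleet depth as snowfall isn't just a little off, it can be off by nearly an order of magnitude.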
"Snow depth is estimated daily by the Air Force Weather Agency (AFWA) by merging satellite-derived snow cover data with daily snow depth reports from ground stations. Snow depth reports are updated by additional snowfall data or decreased by calculated snow melt. If no new snow depth or accumulation information is available, surface temperature data is used to calculate a possible snow melt for temperatures above 32-degree Fahrenheit. The merged product estimates daily snow depth in centimeters."
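The daily update described in that quote can be sketched roughly as follows. This is a loose reading of the quoted procedure, not AFWA's actual code, and the melt rate is a made-up placeholder since the real melt calculation isn't published here.

```python
# Sketch of the AFWA-style daily snow depth update described above:
# add reported new snowfall when available; otherwise, if the surface
# temperature is above 32F, subtract an estimated melt. The melt rate
# is a hypothetical placeholder, not AFWA's real formula.
MELT_CM_PER_DEGF_DAY = 0.5  # assumed melt rate (cm per degree F per day)

def update_snow_depth(depth_cm, new_snow_cm=None, surface_temp_f=None):
    """One daily snow depth update step (illustrative only)."""
    if new_snow_cm is not None:
        return depth_cm + new_snow_cm           # ground station report
    if surface_temp_f is not None and surface_temp_f > 32.0:
        melt = (surface_temp_f - 32.0) * MELT_CM_PER_DEGF_DAY
        return max(depth_cm - melt, 0.0)        # melt, never below zero
    return depth_cm                             # no new information

depth = update_snow_depth(10.0, new_snow_cm=5.0)       # 15.0 cm
depth = update_snow_depth(depth, surface_temp_f=40.0)  # 15 - 4 = 11.0 cm
```

Notice there's no branch anywhere for sleet or freezing rain, which is exactly the problem: anything that accumulates gets treated as snow.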
In other words, it logs all of our sleet as snowfall. If you look at how the GFS initialized today, you can see that the snow depth it initialized on was dead wrong... This evening, it looks like NCEP tried to manually adjust it, but it still puts too much snowfall in our area:
If I'm right about this, it's a huge issue for areas like ours that get sleet and freezing rain a few times in the winter.
So how does this impact temperature data? Take a look at the projected overnight lows from the GFS for tonight. The data on the left is from the GFS model around lunchtime. The image on the right is from this evening, when it appears that NCEP may have made some adjustments. I'm not positive on that, but surely the model does not think that much snow melted today:
The data that really made me start looking at this again was when I looked at the temperatures for Thursday morning. The GFS had us at -10° and parts of Tennessee at -20°! That just does not happen around here much. I wish I could confirm that NCEP made some manual adjustments, but I don't know how to do that at 12:45 am... Here is that data for Thursday morning:
Many places that you can get a forecast from are going for overnight lows in the 0-5° range. I don't see that happening, but I may have to eat those words on Thursday morning. I'm going for lows in the low to mid teens on Wednesday night.
I've been forecasting this week by looking at temperatures aloft. It's the only way to go... for now.
Stay tuned as we iron out this temperature forecasting issue,
Ryan
EDIT: Read the comments section. Great information that I did not know from NOAA.
4 comments:
A good link for you to take a look at is:
http://www.emc.ncep.noaa.gov/jcsda/ggayno/snow/home.html
It shows how Snow Cover and Snow Depth are derived at model initialization.
The GFS and the NAM snow cover are updated from the human analysis of snow cover done on the Interactive Multi-sensor Snow and Ice mapping system. The GFS and NAM are updated at 00z each evening. Between 00z and 18z the model's snow cover is updated using the model's 6-hour forecast for each respective run (i.e. the 6-hour forecast for snow cover at 00z becomes the initialized snow cover on the 06z run, and so on and so forth). That is why you would see a big jump possibly between the 18z at 6 hours and the 00z at initialization.
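The cycling the commenter describes can be sketched like this. Function names and the string values are illustrative, not NCEP code; the point is just which source each cycle pulls from.

```python
# Sketch of the snow cover cycling described in the comment above:
# the 00z run initializes from the IMS human analysis, while the
# 06z, 12z, and 18z runs reuse the previous run's 6-hour snow cover
# forecast. Names and values are illustrative only.
def initialize_snow_cover(cycle_hour, ims_analysis, prior_6hr_forecast):
    """Pick the snow cover source for a model cycle (00, 06, 12, 18z)."""
    if cycle_hour == 0:
        return ims_analysis       # fresh human analysis, once a day
    return prior_6hr_forecast     # carry forward the model's own forecast

# The "big jump" the commenter mentions: 18z carries model snow cover,
# then the next 00z snaps back to the IMS analysis.
print(initialize_snow_cover(18, "IMS analysis", "model 6hr forecast"))
print(initialize_snow_cover(0, "IMS analysis", "model 6hr forecast"))
```

This also explains why an error can persist all day: a bad 00z analysis gets recycled through every run until the next human analysis comes in.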
If you would like more information about the IMS (Interactive Multi-sensor Snow and Ice Mapping system) contact Sean Helfrich (sean.helfrich@noaa.gov)
In our area south of the activity, how do we know what to expect?
Good info and link Sean. So, is the bigger problem the algorithm used for p-type or snow accumulation in earlier model runs?