I have been able to examine the NSSL-WRF and the 00 UTC and 12 UTC NAM this morning.
With regard to the trough that is forecast to be the major player, it is likely that significant severe weather will break out, collocated with the strong forcing, along the triple point and in the dry punch into MO, with another round even later in AR.
The main uncertainty lies in OK, where my hopecast suggests storms could try to break out along the dryline. The problem is that this occurs right around 22-23 UTC, when the wind profile might be considered terrible for tornadoes. The wind profiles generally become more favorable farther east and later on, putting the tornado threat into AR, but the window for discrete supercells appears to be small. Rather, a squall line of some type will form, probably with a chance for embedded supercell structures.
Farther south, however, the wind profiles are better, but the cap is somewhat stronger. The 00 UTC NSSL-WRF forms a squall line there, as indicated by the synthetic satellite imagery.
I don't have a good intuitive feel for what may occur given some of the wind profiles I have seen. I do think the overnight models will struggle, as they are typically too far east with any convection. They also struggle to produce individual storms, tending to produce storms only where the forcing is stronger. The resolution of the models, their tendency to produce weaker lapse rates, etc., all contribute to this storm bias. That said, the main threat appears to be to Norman's east (Tulsa area and north), northwest (along the triple point), and southeast (secondary dry punch). There is still a chance for central OK. It all depends on whether any storms attempt to go up along the dryline, which ultimately depends on the relative balance between the depth of moisture in the warm sector and the cap strength.
I am hoping that LMN and OUN will launch 21 UTC soundings, and maybe 18 UTC soundings, so we can really examine the wind profile evolution as well as the cap. This will be a good case for analysis either way, as it is a strong-forcing case that is highly dependent on mesoscale details that our models may not get correct.
A weather, education, and science blog run amok. Brought to you by James Correia, Jr., PhD. I have a BS from SUNYA in Atmospheric Sciences, MS from FSU in Meteorology, and a PhD from ISU in Agricultural Meteorology. I specialize in mesoscale numerical weather prediction on scales larger than 4km for both forecasting and regional climate. The views expressed here do not reflect those of NOAA, the NWS, or the University of Oklahoma.
Sunday, February 27, 2011
Saturday, February 26, 2011
Uncertainty foci
1. The trough in question is still only half over the upper-air network, so there is uncertainty in its exact structure. It currently appears to be moving more south than east, but that trend may be shifting to more eastward.
2. I found a website to scrutinize the 3-hourly soundings from the NAM. It turns out the cold air advection over OK around 00 UTC may actually be a reflection of a boundary layer deepening up to 550 hPa at KGAG! Thus, while it is getting cooler aloft, this does not necessarily imply a forcing mechanism for ascent. Note that 3 hours later, descent is implied by the inversion, yielding net warming in the same layer while the boundary layer cools significantly. The exact role this feature plays remains uncertain.
3. Soundings from central OK show a very shallow moist layer and a cap aloft which, although it weakens some, remains strong until after 03 UTC. This is saying a lot, since the previously mentioned odd double low-level jet is clearly playing a big role in the development of deep moisture. It's even difficult to get moisture into KLZK under this scenario.
Severe weather potential
Interesting forecast shaping up for tomorrow evening. A somewhat strong trough will move into OK tomorrow, bringing with it the chance for significant severe weather to central and eastern OK and continuing eastward overnight.
The more certain portion of the forecast is the later period, late Sunday night, when it appears probable that a severe MCS/squall line will tear up parts of MO, AR, and LA again. I think it's likely, as more time will allow more moisture to advect northward in what amounts, to me, to an odd linkage between the trough and a pre-existing low-level jet well east of the trough's effects. The apparent phasing of the two will occur later Sunday evening. These highly dynamic environments lead to some interesting MCSs.
The early part of the threat in central OK is less certain. To me, at the moment, the most interesting part of the forecast is that the features of interest appear to slow down before entering the Plains, then speed up once they do. This makes for a confusing scenario. The prominent features I see this morning are the dryline in western OK, the dryline/warm front intersection in NW OK, and the rather strong 700 hPa cold advection over the dryline, all at 00 UTC. Now, I have no idea what to expect in this type of slow-then-fast large-scale regime. That fact alone adds uncertainty, given my limited experience with significant severe weather in OK during February.
What is not clear is whether the cold advection aloft will be enough to remove the cap, and whether the instability will be large enough to be realized. We just had some rain followed by cold temperatures, and soil water fractions are apparently high. The last 10- and 30-day rainfall maps show that western OK is dry while eastern OK has seen around 2+ inches. So it is possible that strong sensible heating will take place on Sunday in the drier areas ahead of the dryline. Will this actually matter? I will tell you on Monday.
The other major issue is how the dryline and the shear align. At the moment the mean wind for Sunday evening appears to be from the SW, and the dryline is oriented more north-south. This angle difference will allow storms to move off the dryline. The shear vector should also orient itself across the dryline, but the magnitude of that angle will be very important to storm mode.
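A quick way to quantify that alignment is the crossing angle between the deep-layer shear vector and the dryline. This is a minimal sketch, not tied to any model output; the shear components and dryline azimuth in the example are made-up values:

```python
import math

def shear_crossing_angle(shear_u, shear_v, dryline_azimuth_deg=0.0):
    """Angle (deg, 0-90) between the shear vector and the dryline.
    0 = shear parallel to the line (storms stay on it, messier modes);
    near 90 = shear across the line (storms move off it, favoring
    discrete supercells). Azimuth is measured clockwise from north,
    so the default 0.0 is a north-south dryline."""
    # Unit vector pointing along the dryline
    line_u = math.sin(math.radians(dryline_azimuth_deg))
    line_v = math.cos(math.radians(dryline_azimuth_deg))
    mag = math.hypot(shear_u, shear_v)
    if mag == 0.0:
        return 0.0  # no shear; angle undefined, call it parallel
    cos_theta = abs(shear_u * line_u + shear_v * line_v) / mag
    return math.degrees(math.acos(min(1.0, cos_theta)))
```

A southwesterly shear vector (equal u and v components) against a north-south dryline gives 45 degrees, a substantial but not purely across-line component.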
I guess my inclination at this moment is that all the necessary ingredients can be found; it will be a matter of how they come together, when that will occur, and for how long. For now, at least, it is important to get that moisture screaming northward. I could use a good storm chase.
A lot will change. But I think there is a chance for supercells. I am uncertain where convective initiation will occur, where severe storms will get going, and what the initial mode will be. I will evaluate the Short Range Ensemble Forecast (SREF) when the 15z run becomes available later this afternoon. Really, this is just me getting my thoughts ramped up ... otherwise known as situational awareness.
Thursday, February 10, 2011
Trying not to lose
http://www.spacenews.com/policy/110208-house-earth-science-funds-manned-spaceflight.html
The point of the article is that a group of lawmakers wants to reorient NASA back to human spaceflight. They would do this by taking away climate monitoring funding, referred to in the article as global warming funding.
I believe a policy like this does three things: it gives lawmakers the power to control research directions (even if only toward this one particular goal), it hurts the satellite climate monitoring initiatives that we sorely need, and it presumes to send humans back to the moon via, I assume, the Constellation program.
I don't think the moon is a good goal, and I really like that commercial space transport and delivery is making significant accomplishments via SpaceX and Orbital Technologies. This is good but not great news, since I doubt these companies will profit much. I think we need to think big, like Mars. The challenges Mars poses are grand: materials science, engineering, biochemistry, chemistry, biology, psychology, psychiatry, nanotechnology, etc. will all need to be utilized in a major way. It could be our Sputnik moment. Of course, the obvious problem is that a Mars trip is one-way, right now. And that is why the moon is the next "best" thing.
The satellite issue is important, since both weather and climate rely on satellite monitoring. Satellite development is long and expensive, but it pays off in science even though it costs a ton. The most exciting prospect in my mind is soil moisture, which the US has not been able to do but the Europeans have. Why does NASA do satellites and climate monitoring? Because it's a natural fit: they build them, launch them, and monitor them. No other agency is qualified to do that.
The controlling of research dollars and directions by lawmakers ... well ... I don't care to comment on that at the moment. These are people trying to save jobs at home to guarantee continued employment for their constituents. But really it aligns with their re-election priorities, and that's why it makes more sense to them. The status quo is desirable for jobs, and who can blame them? Keep what you have so you don't have to risk asking for money for new job development in your region, especially in a floundering economy.
I don't really have a good sense that this budget stuff will help without a reorganization of our goals ... both public and private. I like slogans like "win the future" because really what we have been doing is trying not to lose. We need high-risk, high-reward activities, and they cost money. It will take money and the will to take big risks. But trying not to lose is not working.
Tuesday, February 8, 2011
Meteorological data
UPDATE: Remember when I said there were some missing data? Yeah. Only for the case I want to analyze: the exact 16 hours I most wanted. Funny that the operational data saved all over the place have more observations than the archived ASOS data at NCDC. Redundancy is an important part of data archival.
I am analyzing a particular case and have been looking at the unique observations that the Oklahoma Mesonet collects. I want to add to this huge data source, and therein lies the issue: merging data sets is quite a task. Of course, the fancier one gets, the more trouble there is.
Recently the NWS added many stations to its list of archived 1- and 5-minute ASOS data. It is a decent dataset, even if it is spatially sparse. The issue is the format. Regular hourly and special observations get transmitted in METAR format, and there is a nice decoder written for GEMPAK which processes those data. I wouldn't call it awesome, but it is suitable. What makes it better is that it retains the whole METAR data string for subsequent data mining. This ensures that some of the metadata (+TSGRFC, PRSFR, etc.) are not lost.
However, the potentially research-quality dataset currently being archived at NCDC undergoes no quality control, and the files can have transmission problems. It also suffers because if the data are not relayed in time, they are lost. Thus one must process the data and visually inspect for issues. I did this many years ago, as part of my PhD training, for the BAMEX field project, and it was a nightmare to write code to process the 1-minute data. I ended up with multiple Fortran codes to deal with the transmission problems, formatting problems, and missing data.
The 5-minute data are stored in METAR format, but not exactly METAR form. I would think someone could process the 1-minute data into a quality-controlled 5-minute dataset with a decent, readable format. (I will share my code which reads the data.) This could be an interesting dataset. As it is now, it is difficult to work with, but not impossible. I heard that the community was trying to organize a "network of networks" for surface data. I hope they succeed, and I hope they model whatever they do on the successful components of the Oklahoma Mesonet, MesoWest, and the Iowa Environmental Mesonet.
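As an illustration of the kind of defensive processing those files demand, here is a minimal Python sketch that reads a 1-minute file line by line and silently drops short, garbled, or unparseable records rather than crashing. The field layout assumed here (station id first, temperature and dew point as the last two whitespace-separated fields) is a hypothetical placeholder, not the actual fixed-width layout of the archived files, so the indices would need adjusting for real data:

```python
def read_one_minute_file(path):
    """Defensively read a noisy 1-minute surface obs file.

    Returns a list of (station_id, temp_f, dewpoint_f) tuples from
    records that survive the basic sanity checks; everything else is
    skipped. The column positions are illustrative only."""
    records = []
    with open(path) as f:
        for raw in f:
            line = raw.rstrip("\n")
            # Transmission garbage tends to show up as truncated or
            # non-ASCII lines; skip them outright.
            if len(line) < 30 or not line.isascii():
                continue
            fields = line.split()
            try:
                temp = float(fields[-2])  # hypothetical temperature column
                dewp = float(fields[-1])  # hypothetical dew point column
            except (IndexError, ValueError):
                continue  # malformed record: drop it, don't crash
            # A crude range check stands in for real quality control.
            if -60.0 < temp < 130.0 and dewp <= temp:
                records.append((fields[0], temp, dewp))
    return records
```

Real quality control would go much further (temporal consistency, neighbor checks), but even this much avoids the one-bad-line-kills-the-run failure mode of a naive fixed-format read.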
Monday, February 7, 2011
Brief Comment on Climate Change
I read this today:
http://www.nytimes.com/2011/02/07/opinion/07krugman.html?_r=1&emc=tnt&tntemail0=y
I think Krugman makes a good point. It is one I have written about before.
Let me say that I don't know if "crisis" is the correct word to use. Higher prices, sure, but not relative to 2007. Economies are more sensitive to weather, absolutely. And as I blogged a few days ago, under the type of climate change I believe we are in, the variability increases with respect to extremes (floods and droughts, heat waves and cold spells, etc). Whether this increased variability would occur under different climate states or different rates of change of climate states, I don't know.
The point is that this type of global weather impact is not only disruptive and expensive; it shows how vulnerable we are to climate change. This problem will only get worse as more people require more food. Human expansion has also resulted in the fading away of the family farm, and as the current global economic crisis continues, more farm subsidies will be on the chopping block as budgets get reined in. Thus it is the human-earth system that we need to understand. I know firsthand that many groups are working on these challenges, both academically and through the government.
It will be a while before we know if this is the "first taste" of our vulnerability, but I strongly doubt it. Debate will rage as economic, agricultural, and climate- and weather-related issues all conspire in various degrees. What is clear is that we remain vulnerable. Technology may have helped over the last 90 years, but it won't save us.
Sunday, February 6, 2011
NSSL WRF performance over OK
I grabbed some 2-meter temperature forecasts from NSSL's 4 km WRF simulation, the 24-hour forecast from 00 UTC 5 February, to compare with what happened on Saturday.
The model did not do terribly well, but it highlighted the issue with snowpack. Let's go to the pictures:
The images depict the 2 m temperature in 2-hour increments from 18 to 22 UTC, with the 22 UTC image being the warmest time of the day. What stands out is the warm air over the TX panhandle, which does not expand rapidly into OK. Just east of the OK panhandle it warms rapidly, but the warmth does not expand and penetrate eastward. Over OK, the cold patch, which aligns perfectly with the storm-total snowfall and thus the snowpack, does not appreciably change shape, though it does warm a bit. Clearly the model had a poor representation of the snow cover, in both areal extent and depth.
From the previous post, the high temperatures, even over the deep snowpack near Tulsa in the core of the model's cold patch, got to near 40 F ... a difference of 20 F!
The situation eases later as night arrives by 00 UTC. Below are the 24 hour forecast from the model and the initialization from the next cycle.
The differences between these two images are difficult to discern, but they are still large, because of the eastward shift of the OK cold patch and the eastward extent of the warmer air. This is an interesting case where I would expect this type of model to perform better. Diagnosing the evolution of the snowpack and the low-level temperature tendencies, both aloft and from within the model physics (boundary layer scheme), should shed some light on why the model performed poorly.
Saturday, February 5, 2011
Temperature forecasting over the snow cover
How high will the temperature get across OK over the snow pack?
The sounding above, at OUN, was taken at the morning low according to the Norman Mesonet site. It snowed another inch or two yesterday, making the compacted snow depth anywhere from 3 to 5 inches. The snow we had was of an icy consistency; I walked on it, and only in the drifts was it actually loose snow.
The forecast high for today from the NWS was 36 (as of this writing, upped to 38), and I thought that was pretty good. But it is always difficult to predict the high temperature when the ground is snow covered. The high in these situations depends on how far you are from bare ground relative to the low-level air mass that is moving in. The 1845 UTC visible satellite image shows where the snow is; the whiter the ground, the deeper the snow:
I can't mark where Norman is, but trust that we are not very far from the edge of the snowpack, and points to the west of us didn't get that much snow in the first storm and barely received any in yesterday's.
However, a forecasting technique I learned at UAlbany is the 850 hPa method. This method assumes that air from above the surface will serve as the upper limit as the boundary layer grows upward. Of course, the atmosphere aloft is not usually stagnant, so you need to account for that. This morning's OUN sounding had an 850 hPa temperature of 1.6 C, with even warmer temperatures of 6.4 C at 911 hPa. This suggests a surface temperature of 14.3 C from 850 hPa, or 13.6 C from 911 hPa ... again assuming these temperatures change very little during the day. So the mid to upper 50s F!
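The 850 hPa method is just a dry-adiabatic descent: conserve potential temperature from the level aloft down to the surface pressure. A minimal sketch in Python; the ~995 hPa surface pressure is my assumption, picked to roughly reproduce the numbers above:

```python
def mixed_down_temp(temp_c, p_hpa, p_sfc_hpa=995.0):
    """Bring air at (temp_c, p_hpa) dry-adiabatically down to the surface
    pressure via Poisson's equation. If the boundary layer mixes up to
    p_hpa and the air aloft doesn't change, this is the afternoon high."""
    RD_OVER_CP = 287.05 / 1004.0  # R_d / c_p for dry air, ~0.286
    t_k = temp_c + 273.15
    return t_k * (p_sfc_hpa / p_hpa) ** RD_OVER_CP - 273.15
```

With this morning's sounding, 1.6 C at 850 hPa mixes down to about 14.3 C, and 6.4 C at 911 hPa to about 13.5 C, consistent with the estimates above.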
So the forecast problem is a battle between surface effects: how the incoming radiation can heat the ground and help that mixing process along, versus how the mixing occurring elsewhere (in this case to the northwest and west) brings warmer air in over the snowpack. The latter is usually referred to as an internal boundary layer.
Now, the temperature is obviously controlled by the snow depth and snow cover, and there should be an obvious temperature contrast, or gradient, across the Plains:
Note the yellow shading, which shows the potential temperature gradient pretty much aligned along the edge of the snowpack as of this morning.
I will add the 18UTC and 21UTC maps when available below:
The potential temperature gradient has shifted somewhat east as the snow has melted and temperatures have warmed.
So far, at 4 pm local time, temperatures in western OK have risen to 62+ F and Norman is currently around 45 F! This is pretty much going to be the high for the day.
I will add the Mesonet's max T plot when it is available:
Now, where did this warm layer of air aloft come from? Downslope flow off the mountains as the large-scale trough passed through our region. Temperatures at 850 hPa warmed from -15 C to 2 C in 24 hours at Dodge City, and from -18 C to 2 C at Amarillo.
A comparison of soundings from Lamont indicated a slow warmup, primarily in the boundary layer, with just a minor change in the height of the warm layer. By 00 UTC, however, it was clear that the entire column was warming significantly downstream of the mountains; LMN and OUN were clearly situated in a warm advection regime. Given the sounding structures, and how much the entire temperature profile changed, you would have needed to apply the 850 method to forecast soundings, not upstream soundings.
Of course, it is very interesting that the sounding at AMA cannot be used to explain the large surface warming over western OK. AMA only reached 12 C, which is around 54 F, while western OK reached 62 F. The implication is that the warm pocket over Oklahoma this morning was bigger than the sounding network could reveal (or may even have developed aloft). Either way, these mesoscale details (snow cover or the lack thereof, warm pockets of air aloft from downsloping) made it difficult for forecasters to use observations for short-term temperature forecasts. This turned out to be a much more complex scenario than the seemingly straightforward case of temperature forecasting over snowpack it first appeared to be.
As a side note, the temperature warmed above freezing, so all the snow in the rain gauge is melting. A ton of it was blown around, but what was secured in the gauge from the last two snowfalls should tell us something about the sleet that fell, then the snow, and then yesterday's heavier snow. The liquid total from the melt, as of 9:40 pm, is 0.32".
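That melt also gives a rough bulk snow-to-liquid ratio. This is only a loose bound, since the 3-5 inch depth is a compacted mix of sleet and snow, but the arithmetic is simple:

```python
def snow_to_liquid_ratio(snow_depth_in, liquid_in):
    """Bulk snow-to-liquid ratio from a gauge melt. The textbook value
    is about 10:1; icy, compacted packs run denser (smaller ratio)."""
    return snow_depth_in / liquid_in
```

0.32 inches of liquid under 3 to 5 inches of compacted depth gives a ratio between roughly 9:1 and 16:1, depending on which depth you trust.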
The sounding above at OUN was taken at the morning low according to the Norman mesonet site. Now it snowed another inch or two yesterday to make the compacted snow depth anywhere from 3 to 5 inches. The snow we had was an icy consistency. I walked on it and only in the drifts was it actually loose snow.
The forecast high for today was 36 (as of this writing was upped to 38) from the NWS, and I thought that was pretty good. But it is always difficult to predict the high temperature when the ground is snow covered. The high for the day in these situations depends how far from the bare ground you are relative to the low level air mass that is moving in. The 1845 UTC visible satellite image shows where the snow is. The whiter the ground the deeper the snow:
I can't mark where Norman is but trust that we are not very far from the edge of the snowpack and points to the west of us didn't get that much snow in the first storm and barely received any in yesterdays.
However, a forecasting technique I learned at UAlbany was the 850 hPa method. This method assumes that air from above the surface will serve as the upper limit as the boundary layer grows upward. Of course the atmosphere aloft is not usually stagnant so you need to account for that. This mornings OUN sounding had an 850 T of 1.6 C with even warmer temperatures of 6.4C at 911 hPa. This suggests a temperature of 14.3 C from 850, or a temperature of 13.6 C from 911 hPa ... again assuming these temperatures change very little during the day. So the mid to upper 50's!
So the forecast problem is a battle between surface effects, how incoming radiation can heat the ground and help the mixing process along, versus how mixing occurring elsewhere (in this case to the northwest and west) brings warmer air in over the snowpack. The latter is usually referred to as an internal boundary layer.
Now, the temperature is obviously controlled by the snow depth and snow cover, and there should be an obvious temperature contrast, or gradient, across the plains:
Note the yellow shading, which shows the potential temperature gradient aligned pretty much along the edge of the snowpack from this morning.
I will add the 18 UTC and 21 UTC maps below when available:
The potential temperature gradient shifted somewhat east as the snow below has melted and temperatures have warmed.
So far at 4pm local time the temperatures in western OK have risen to 62+F and Norman is currently around 45F! This is pretty much going to be the high for the day.
I will add the Mesonet's max temperature plot when it is available:
Now where did this warm air layer aloft come from? Downslope off the mountains as the large scale trough passed through our region. Temperatures at 850 hPa warmed from -15C to 2C in 24 hours at Dodge City, and from -18C to 2C at Amarillo.
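That magnitude of warming is consistent with adiabatic compression alone. A hypothetical illustration (the 700 hPa starting level and the -15 C parcel are my choices for the sketch, not values taken from the soundings):

```python
# Dry-adiabatic descent: a parcel conserves potential temperature, so
# sinking from 700 hPa to 850 hPa warms it by compression alone.
# Starting values are hypothetical, chosen to mimic the lee-side warmup.
KAPPA = 287.04 / 1004.64  # R_d / c_p for dry air

t_start_k = -15.0 + 273.15                     # parcel at 700 hPa
t_end_k = t_start_k * (850.0 / 700.0) ** KAPPA # same parcel at 850 hPa
print(round(t_end_k - 273.15, 1))              # about -0.3 C: ~15 C warmer
```

So roughly 15 C of the observed 17 to 20 C warming at 850 hPa can come from descent alone, without any horizontal advection of warmer air.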
A comparison of soundings from Lamont indicated a slow warmup primarily in the boundary layer, with just a minor change in the height of the warm layer. However, by 00 UTC it was clear that the entire column was warming significantly downstream from the mountains. LMN and OUN were clearly situated in a warm advection regime at 00 UTC. Given the sounding structures, you would have wanted to apply the 850 method to forecast soundings rather than upstream soundings, given how much the entire temperature profile changed.
Of course, it is very interesting that the sounding at AMA cannot be used to show the large surface warming over western OK. AMA only reached 12 C, which is about 54 F, while western OK reached 62 F. The implication is that the warm pocket over Oklahoma this morning was larger than the sounding network could reveal (or may even have developed aloft after launch time). Either way, these mesoscale details (snow cover or lack thereof, warm pockets of air aloft from downsloping) made it difficult for forecasters to use observations for short-term temperature forecasts. This is a much more complex scenario than what seems like a straightforward case of temperature forecasting over snowpack.
As a side note, the temperature warmed above freezing, so all the snow in the rain gauge is melting. A ton of it was blown around, but what was secured in the gauge from the last two snowfalls should tell us something about the sleet that fell, then the snow, and then yesterday's heavier snow. The liquid total from the melt, as of 9:40 pm, is 0.32".
Thursday, February 3, 2011
I take exception
A recent article appeared on Foxnews.com that I found noteworthy.
I wish to challenge the notion that anyone can, with any certainty, use any recent weather event, no matter how large, as evidence for or against global climate change. Climate, as we all should know, is about statistics. It will take a decade or longer to know how epic this latest blizzard was.
What Al Gore got correct was the scientific evidence. He correctly stated that under global warming scenarios it has been shown that variability increases. And even during a warming trend, globally, there can be dips, significant dips (even negative anomalies) regionally. This evidence suggests that this CAN be part of global warming scenarios. Of course we don't know the reverse because I am not sure that anyone has done a global cooling experiment. Please enlighten me if such a study has been done.
The second issue I wish to be picky about is the notion of predictions associated with climate. We assume a CO2 increase in what can only be described as complex models, but models that do not represent the full coupled climate system. This is necessary because building complex models requires a solid foundation upon which to add complexity. And unfortunately, by building complex models we cannot say that a model error at this point is actually wrong, because not all the processes we observe in the real world are present in the model. It is an interesting problem to say the least, and one that is being tackled on the weather side as well.
What we do know is that as model resolution improves we get better, but not perfect, solutions. That is good news for weather and climate. But long-term climate prediction is still not an initial value problem, though some argue this point fervently. It is a true scientific issue, and the debate will continue in the scientific arena, not in the media.
Speaking of media:
"If it all seems confusing and contradictory, other experts say, the real blame lies not with the climate, or with science, or even scientists or former politicians, but with the incompetent media for failing to provide critical context for readers. "
Indeed. In this very article! Examine it closely. The first half of the article presents one side, then trails off into the other side, and only by the late middle does context begin to appear. And then, just as context settles in, they bring in the 1970s cooling argument. That argument was created and propagated by press reports which misrepresented the science, which makes it irrelevant in the current discussion.
And then they close by stating that science changes! Of course science changes. It does so because we update our theories based on new evidence, new data, and new analysis. And yes, even scientists can be wrong. They go where the science leads them, and not every avenue leads somewhere, or even to the correct somewhere. That's why we attempt to make results reproducible.
We call it climate change because we know way more about the weather and climate and can state with confidence that change is the best way to describe it. Some regions will warm, others may cool. Some will get more precipitation, others less. It is an important scientific distinction. It is not changing the message, however.
Climate change is not contentious because the science is weak. It is contentious because the science is young. It is further complicated by the economic impact any action on carbon emissions might have. Scientists still have a duty to warn about impending climate fluctuations or even climate change. And the climate scientists have spoken, in consensus, to warn us about the effects of increasing CO2. They do so with uncertainty: a range of possible warming scenarios. They do so with caution. What our policymakers should be doing is deciding how to act responsibly, not deciding which science is correct. The science updates all on its own.
What is not easily updated is how well scientific communication occurs between scientists and policymakers, scientists and the public, and policymakers and the public.
Education on Snow
It is late. I am on a snow day. And I feel like teaching.
Real snow days are not for play dates, driving around, or going to the movies. No offense, Freakonomics blogger. But snow days in my mind are dedicated to physics, science, engineering, and daredevil stunts. That is, when you have finished getting paid for shoveling driveways and sidewalks around the neighborhood. Let's not forget some good family time where you build a snowman family.*
Designing, building, and organizing an awesome sledding hill complete with jumps is an effort in fun, design, creativity, and organization. Furthermore, constructing snow walls or barriers in order to have epic snowball fights is also a must. And let's not forget finding a wall so you can pile up 3 to 6 feet of snow and jump into it from above. Going super fast down a hill (engineering, physics, design), jumping off a wall (testing gravity and snow compression), and snowball fights (principles of compression, strength of materials, chemistry) are all awesome feats of awesomeness (copyright jimmyc, because it's late and I said so)!
Kids don't go outside nearly as often as when I was a kid, but they should, and snow is the perfect excuse. "It's cold" is not a reason to stay inside. Your kids have Abercrombie and Fitch, Eddie Bauer (I am old) gloves, hats, scarves, Under Armour, Gore-Tex coats, etc. Use them dammit! I might be strange, but I would walk miles in the snow just to do it. And I know I am not alone. I saw it in a movie once, "O Captain, My Captain".
The blog post had the air of "old" people having a snow day. Snow days are about kids (movie: Snow Day). And if kids have snow days, they should be out being kids, learning about the awesomeness of snow. And yes, sometimes that means doing hard work to achieve your goals. Maybe it is shoveling the sidewalk, digging out the family car, or snow removal from the roof. And then it's sledding, but not before gathering all the snow on your block to make the awesome luge run you saw in the Olympics ... getting water to make it nice and icy (chemistry, engineering, physics, science!). Just to go faster.**
So stop being "old". You will have plenty of time to make your kids old. Give them the tools to be young, creative, and inventive ... away from the computer (unless autocad can do design work on snow). I guess I am saying we should always be thinking about investments not immediate, gratifying incentives.
* Not to be confused with collegiate activities which involve constructing *other things* with snow.
**Someone in OK was killed when she tied her sled to a truck and stood in it as they crossed a bridge. Not the kind of activity I endorse.
Tuesday, February 1, 2011
more storm communication
Did the forecast scenarios and/or forecaster methods work tonight? On one hand, the uncertainty in the depth of snow was well communicated. But both the TV and NWS focused in on a time frame when the precip would start ... it began 3 hours later, and they missed on the precipitation type. Midnight was in numerous forecasts until the radar indicated otherwise. Pesky thunderstorms again. The precipitation type in Norman was rain for about 5 minutes before massive amounts of sleet were dumped on us.
I don't think this was a failure at all. Just an example of how difficult it is to convey uncertainty information. In the public eye you have to convey a time. But when you do, it needs to be accurate and precise. And it just isn't possible to be precise and accurate in situations like this. I can let the sleet go because it was amazingly awesome that so much sleet fell ... better than freezing rain!
We live in very uncertain times, and we simply lack the communication skills, that practical knowledge of how to convey uncertainty, without falling back on hard and precise numbers. Humans have always had to live with uncertainty, but not necessarily convey it. We say things are uncertain, then make assumptions and decisions based on that uncertainty, probably without thinking too hard about conveying it. We acknowledge it, decide, and move on. What's in the past may be reflected upon, but situations are always unique, so it is hard to know what information, if any, might have changed your mind one way or another. Sometimes we make decisions on gut instinct, or informed guesses about "likely outcomes". At least we think we know what the outcome will be. More often than not we stumble upon the outcome while trying to make our way toward our anticipated one.
I find it interesting, and also compelling, that weather information, specifically concerning uncertainty, finds itself in the same situation as personal medical data and the terror alert system. There was a good TED talk about personal medical data: how it is displayed, what information to convey, giving the statistics of where you fit in. They are tailoring the information to the end user. In a similar way, the terror alert system promises specific information, how to act on it, and who and where needs to be alerted. It sounds like they are issuing a weather-service-style advisory, watch, warning type of system. I am eager to see what they bring to the table and how.
To wrap up, our society is dealing with uncertainty communication, and it is working its way down into all aspects of life. Clearly we have lots of work ahead and the lessons we learn now might help with regional climate change. A more educated society is required in these uncertain times to utilize this uncertainty information.
More storm discussion
The plot thickens. Not only are subsequent model forecasts increasing the snow for central OK, but the global model has backed way off. The SREF was going crazy, with anywhere from 1 inch to 22 inches of snow for Norman as of 2100 UTC. The operational NAM was going for 24 inches, and the GFS was going for 1 inch. Talk about uncertainty. The funny thing is that the NAM gets its boundary conditions from the previous GFS run.
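A spread like 1 to 22 inches is easier to digest as percentiles than as a raw min/max. A toy sketch of that kind of summary, with made-up member totals (purely illustrative, not actual SREF output):

```python
import numpy as np

# Made-up SREF-style member snowfall totals (inches) for one point --
# hypothetical values spanning the 1-22 inch spread noted above.
members = np.array([1, 3, 4, 6, 7, 8, 9, 10, 11, 12,
                    13, 14, 15, 16, 18, 20, 22], dtype=float)

mean = members.mean()
p10, p50, p90 = np.percentile(members, [10, 50, 90])
prob_ge_6 = (members >= 6).mean()   # fraction of members with >= 6 inches

print(f"mean={mean:.1f} in, 10/50/90th pct = {p10:.1f}/{p50:.1f}/{p90:.1f}, "
      f"P(>=6 in)={prob_ge_6:.0%}")
```

Percentiles and exceedance probabilities convey the shape of the ensemble, not just its extremes, which is exactly the uncertainty-communication problem discussed in the previous post.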
I speculate that the width and depth of the cold air gave the GFS a good initialization and, as a result, gave the NAM a good depiction. But maybe that cold air was projected onto too large a scale, so that the ingredients for this event were misaligned. Alternatively, the NAM ingredients may have been too well aligned, so that convective feedback played a role in creating too much precipitation even as the model simulated the banded features well.
As of 9:25 pm there is the lovely, heart-pumping sound of thunder, big lightning streaking across the sky, and ice pellets! Some of which could count as hail. I had to take a break so I could enter some observations for the W-PING project at NSSL. On a related note, station KPVJ in OK reported 0.25" of liquid equivalent in an hour (about 6.4 mm/hr). Impressive rate. If only we had data for the precipitation rates at all mesonet stations!
It looks like the main issues of the day were (in no particular order):
1. where would cold air be, its depth, and the depth and magnitude of the warm air layer aloft,
2. where would thunderstorms initiate and then travel over,
3. as a result of the position of the cold air, where would the surface low be,
4. and related to 3 where would the maximum divergence aloft be (horizontally coupled jets).
I looked at the SREF placement of the Upper Level Jet (ULJ) and at the diagnostics from SUNYA* (irrotational and non-divergent wind at 300 hPa; the 850 hPa Q-vectors with the along- and cross-front components of the vertical motion). This event looks very similar to east coast snow storms with coupled jets. In this case the irrotational wind center was intense over OK, nearly tripling in value over a 6 hour period (18 to 24 hr forecast) in this morning's NAM. The along-front forcing for ascent was forecast to come through in two waves, the leading one much weaker than the second. Aloft, the forcing for ascent was forecast to be more widespread and to last longer, into late tomorrow morning.
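For readers unfamiliar with the diagnostic: the irrotational wind is the divergent part of a Helmholtz partition of the flow, recovered by solving laplacian(chi) = divergence for the velocity potential chi. A toy doubly-periodic sketch of the idea (my own construction, not the SUNYA package):

```python
import numpy as np

# Toy Helmholtz partition on a doubly periodic grid: recover the
# irrotational wind (u_chi, v_chi) = grad(chi) by solving
# laplacian(chi) = divergence spectrally.
n = 64
L = 2 * np.pi
x = np.linspace(0, L, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi          # integer wavenumbers
KX, KY = np.meshgrid(k, k, indexing="ij")

# A wind field with known velocity potential chi = sin(x)cos(y), plus a
# purely rotational piece that should not contaminate the recovery.
u = np.cos(X) * np.cos(Y) - np.sin(Y)               # d(chi)/dx + rotational
v = -np.sin(X) * np.sin(Y) + np.sin(X)              # d(chi)/dy + rotational

# Spectral divergence, then invert the Laplacian for chi.
div = np.real(np.fft.ifft2(1j * KX * np.fft.fft2(u) +
                           1j * KY * np.fft.fft2(v)))
k2 = KX**2 + KY**2
k2[0, 0] = 1.0                                      # avoid divide-by-zero
chi_hat = -np.fft.fft2(div) / k2
chi_hat[0, 0] = 0.0                                 # chi fixed up to a constant

u_chi = np.real(np.fft.ifft2(1j * KX * chi_hat))    # irrotational wind
v_chi = np.real(np.fft.ifft2(1j * KY * chi_hat))
```

On real soundings or gridded analyses the domain is not periodic and the solver is more involved, but the partition itself is the same one behind the 300 hPa irrotational wind maps.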
There were hints that a coupled jet structure was developing at 0000 UTC this evening. But closer inspection of the soundings suggested that the true jet was near 200 hPa, as evidenced by 150 kt winds near 210 hPa at LZK, 120 kt at SGF, 135 kt at LMN, and about 145 kt at OUN. Only 0600 UTC balloons at TOP and LMN could help identify whether a coupled jet feature did develop.
*Thanks to Kevin Tyle for trying to keep SUNYPAK alive!