Slide 1: Toward Better Analysis Grids
We already have LAPS, MSAS, and ADAS.
All are pretty good, but each has drawbacks.
Slide 2: Drawbacks
Data latency issues
  Data comes in bursts, often with old data included
First-guess issues
  e.g., a 2-km analysis using a 20-km first guess
Analysis issues
  Timeliness
  Drawing to all the data points
Slide 3: Could we make an analysis system using something similar to serp?
Could it mitigate some of the other drawbacks?
  Fast enough to "re-run" and consider new data for old hours
  Draws to all data points
  Uses a GFE grid as the first guess - no resolution differences
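The idea of "drawing to all data points" from a matching-resolution first guess can be sketched as below. This is a simplified inverse-distance scheme, not serp's actual serpentine curve fitting, and all names here are illustrative:

```python
import numpy as np

def serp_like_analysis(first_guess, ob_xy, ob_values, power=2.0):
    """Adjust a first-guess grid toward point observations by spreading
    each ob's deviation from the first guess with inverse-distance
    weights.  A crude stand-in for serp's interpolation."""
    ny, nx = first_guess.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    num = np.zeros((ny, nx))
    den = np.zeros((ny, nx))
    for (x, y), value in zip(ob_xy, ob_values):
        dev = value - first_guess[y, x]        # ob minus first guess
        dist2 = (xx - x) ** 2 + (yy - y) ** 2
        w = 1.0 / np.maximum(dist2, 1e-6) ** (power / 2.0)
        num += w * dev
        den += w
    # The correction equals each ob's deviation at the ob's grid point,
    # so the analysis draws to every observation.
    return first_guess + num / den
```

Because the first guess is itself a GFE grid at analysis resolution, there is no 20-km-to-2-km mismatch to smooth over.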
Slide 4: Every hour
Look through the netCDF files of METAR and mesonet data and find recently received observations (looking back through the last 6 hours?).
Go back and re-run the analysis for 6 hours ago, based on all the data now available.  Then do 5 hours ago, 4 hours ago, etc.
Slide 5: Every hour, continued
Do one analysis using the previous hour's analysis as the first guess, and another using the GFE model forecast as the first guess.  Then average the two (this helps to quickly "get rid of" bad data).
Do this for T, Td, and Wind.  Then use normal GFE tools to calculate RH, MinT, MaxT, etc.
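The two-first-guess blend can be sketched as follows, where `analyze` is a hypothetical hook that fits a first-guess grid to the hour's observations:

```python
import numpy as np

def blended_analysis(prev_hour_grid, model_grid, analyze):
    """Run the analysis twice - once from each first guess - and
    average the results.  Anything bad that leaks in through one
    first guess is halved in the blend, so it washes out quickly
    over successive hours."""
    from_persistence = analyze(prev_hour_grid)   # previous hour's analysis
    from_model = analyze(model_grid)             # GFE model forecast
    return 0.5 * (from_persistence + from_model)
```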
Slide 6
After many improvements to serp's efficiency, we can run 7 hours of analyses (6 hours ago through now) for T, Td, and Wind at 2.5-km resolution in about 10 minutes on the GFEDEVEL boxes (typically around 300 T obs, 100 Td obs, and 100 wind obs after a couple of hours).
Can be run via cron - no user interaction needed.
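A cron setup might look like the fragment below; the script path and timing are illustrative, not the actual installation:

```
# Hypothetical crontab entry: run a few minutes past each hour,
# after the hour's first burst of obs has arrived.
5 * * * * /data/local/serp_analysis/run_hourly.sh >> /data/local/serp_analysis/cron.log 2>&1
```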
Slide 7: STATUS
Worked on the speed/efficiency of the serp tool.
Working on ease of configuration - doing the netCDF data retrieval in Python instead of Perl.
Need a GUI to manually get rid of the occasional bad data values.
Need a separate MinT/MaxT analysis - probably run manually, but getting as much data as we can automatically.
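Until a GUI exists, removing the occasional bad value could be as simple as a reject list applied before each run. A sketch, where the "station" field name and set-of-IDs scheme are assumptions for illustration:

```python
def drop_rejected(obs, reject_ids):
    """Filter out observations from stations a forecaster has
    flagged as bad; reject_ids is a set of station identifiers."""
    return [ob for ob in obs if ob["station"] not in reject_ids]
```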
Slide 8: The resulting grids - Temperature
Slide 9: The resulting grids - Dewpoint
Slide 10: The resulting grids - RH (derived)
Slide 11: The resulting grids - Wind
Slide 12: Typical Data Distribution - Temperature