1
MODIS Land Product Quality Assessment
  • Sadashiva Devadiga
  • 610.2/614.5 Branch Meeting, March 1, 2011

2
MODIS Land Product Quality Assessment
  • Introduction
  • Satellite Product Performance
  • Sources of Error
  • MODIS Land QA
  • MODIS Land Organization
  • MODIS Land Product Interdependency
  • MODIS Land QA Components and Role
  • LDOPE QA Activities
  • QA Sampling
  • Dissemination of QA results
  • Algorithm Testing and Evaluation
  • QA Tools
  • Summary

3
Introduction: Satellite Product Performance
  • The research and application uses of
    satellite-derived data products place a high
    priority on providing statements concerning
    product performance.
  • The correct interpretation of scientific
    information from global, long-term series of
    remote-sensing products requires the ability to
    discriminate between product artifacts and
    changes in the Earth's physical processes being
    monitored.
  • For example, is it global warming or sensor
    calibration decay?

4
Introduction: Satellite Product Performance
  • But the reality is that although every attempt is
    made to ensure that products are generated
    without error, it is generally neither desirable
    nor practical to delay distribution until
    products are proven error-free or until known
    errors have been removed by product reprocessing
  • errors may be introduced at any time during the
    life of the instrument and may not be identified
    for a considerable period
  • the user community plays an important, although
    informal, role in assessing product performance

5
IntroductionSatellite Product Performance
  • Product performance information is provided by
  • Validation: Quantify product accuracy by
    comparison with truth/reference data
    distributed over a range of representative
    conditions
  • Quality Assessment: Evaluate product scientific
    quality with respect to intended performance
  • Both are integral activities in the production of
    science-quality products.
  • Product performance information is required by
  • users to consider products in their appropriate
    scientific context
  • the science team to identify products that are
    performing poorly so that improvements may be
    implemented

6
Introduction: Product Performance - Quality Assessment
  • Evaluate product scientific quality with respect
    to intended performance.
  • Performed by examination of products, usually
    without inter-comparison with other data.
  • A routine near-operational activity.
  • Results are stored in the product as per-pixel
    flags and as metadata at the product file level
    (written by the production code and
    retrospectively).
  • The QA process is the first step in problem
    resolution; it may lead to updates of production
    codes and science algorithms to rectify issues
    identified through QA.
  • Users should check QA results when ordering and
    using products to ensure that the products have
    been generated without error or artifacts.

7
Introduction: Product Performance - Validation
  • Quantify product accuracy over a range of
    representative conditions
  • Performed by comparison of product samples with
    independently derived data that include field
    measurements and remote sensing products with
    established uncertainties
  • Typically a periodic/episodic activity, e.g.,
    field validation campaigns
  • Results published in the literature and on web
    sites years after product generation.
  • Results define the error bar for the entire
    product collection and are not intended to
    capture artifacts and issues that may reduce the
    accuracy of individual product granules.
  • Users should consider validation results with
    respect to the accuracy requirements of their
    applications.

8
Introduction: Sources of Error
  • Errors may be introduced by numerous, sometimes
    interrelated, causes that include
  • instrument errors
  • incomplete transmission of instrument and
    ephemeris data from the satellite to ground
    stations
  • incomplete instrument characterization and
    calibration knowledge
  • geolocation uncertainties
  • use of inaccurate ancillary data sets
  • software coding errors
  • software configuration failures (whereby
    interdependent products are made with mismatched
    data formats or scientific content)
  • algorithm sensitivity to surface, atmospheric and
    remote sensing variations
  • errors introduced by the production, archival and
    distribution processes

9
Introduction: Examples of Products with Errors
Figure captions from the example images:
  • Data loss in granule 2140 on day 2011035 due to
    an FOT contact error
  • Striping in the LSR product from the mirror-side
    polarization difference in band 3 of Terra MODIS
  • MODIS data affected by the partial solar eclipse
    on Jan 04, 2011
  • Gridded LSR from 2008213.h09v05 showing a
    geolocation error resulting from a maneuver that
    was waived too late
  • LST dependency on latitude, traced to the Cloud
    Mask, which is an input to the LST algorithm
  • Stripes of fire in granule 0830, day 2005068:
    band 21 was degraded by an error in the new
    emissive LUT used by the L1B
10
MODIS Land QA: Land Team Organization
  • The MODLAND Science Teams and Science Computing
    Facilities are distributed across the United
    States.
  • The Science Teams are responsible for developing
    the science algorithms and processing software
    used to produce one or more of the MODLAND
    products
  • The processing software is run in a dedicated
    production environment
  • the MODIS Adaptive Processing System (MODAPS)
    located at NASA Goddard Space Flight Center
    (GSFC)
  • The standard MODLAND products generated by the
    MODAPS are archived at MODAPS (LAADS) and sent to
    Distributed Active Archive Centers (DAACs) for
  • product archival
  • product distribution to the user community

11
MODIS Land QA: MODIS Land Products
  • Energy Balance Product Suite
  • Surface Reflectance
  • Land Surface Temperature, Emissivity
  • BRDF/Albedo
  • Snow/Sea-ice Cover
  • Vegetation Parameters Suite
  • Vegetation Indices
  • LAI/FPAR
  • GPP/NPP
  • Land Cover/Land Use Suite
  • Land Cover/Vegetation Dynamics
  • Vegetation Continuous Fields
  • Vegetation Cover Change
  • Fire and Burned Area

12
MODIS Land QA: Land Product Interdependency
13
MODIS Land QA: QA Roles
  • The Science Teams are responsible for performing
    QA of their products, but this is time-consuming,
    complex and difficult to manage due to
  • large number of products
  • large data volume
  • dependencies that exist between products
  • different QA procedures applicable to different
    products
  • communication overhead within science team
  • The Land Data Operational Product Evaluation
    (LDOPE) facility was formed to support the
    Science Team and to provide a coordination
    mechanism for MODLAND's QA activities
  • The MODAPS processing and DAAC archival staff are
    responsible for ensuring the non-scientific
    quality of the products; they ensure that
  • production codes are correctly configured
  • products are made using the correct input data
  • products are not corrupted in the production,
    transfer, archival, or retrieval processes.

14
MODIS Land QA: QA Components
  • Code
  • automatic QA documented as per pixel QA flags and
    as QA metadata
  • MODAPS/DAAC
  • non-science production, archive and distribution
    QA (by operators)
  • SCF
  • selective science QA (by science team),
    communicated to LDOPE
  • LDOPE
  • routine and coordinated science QA (by science
    team representatives)
  • testing dependencies
  • MODLAND QA services on LDOPE web site
  • Global and golden tile browses, animations, time
    series
  • Science Quality Flag and Science Quality Flag
    Explanation
  • Known issues
  • Competent User Feedback
  • DAAC User Services

15
MODIS Land QA: Data and QA Flow
  • All QA issues are reported to LDOPE for initial
    investigation
  • LDOPE performs QA of all products, mostly generic
    QA, and works with SCFs on science-specific QA
  • SCFs perform QA of selected products and are
    responsible for the scientific QA of their
    products.

16
LDOPE QA Activities (1/2)
  • Routine Operational QA of Land Products
  • Sample data granules by examination of global
    browses, golden tile browses, animations and time
    series for product quality problems.
  • Where inspections indicate low product quality or
    anomalous behavior, the relevant product granules
    are subjected to more detailed assessment
  • Ad hoc QA in response to anticipated or reported
    events or issues
  • Investigate issues reported by data users, DAACs,
    and Science Teams
  • Investigate possible product issues in response
    to satellite maneuvers, instrument problems, MCST
    actions (LUT updates) and other reported data
    problems such as changed or missing ancillary
    data

17
LDOPE QA Activities (2/2)
  • Disseminate QA Results
  • All results posted on the QA web page
  • Known product issues are posted on the QA known
    issues page. Issues are categorized as Pending,
    Closed, or Note and are updated to reflect the
    current production status.
  • Document the product quality, i.e., update the
    Science Quality Flag and Explanation
  • Work with Science Team to resolve the issues.
  • Test and evaluate algorithm updates
  • Suggest algorithm updates to resolve known issues
  • Understand algorithm updates and identify the
    science tests
  • Do an independent evaluation of the test results
    and report the evaluation to science teams.
  • Develop, distribute and maintain QA tools
  • Maintain QA tools required for data analysis
  • Identify and implement new QA tools for use at
    the QA facility and for use by the science team
  • Tools can be generic or product-specific

18
LDOPE QA Activities: QA Sampling - Global Browses
  • Land PGEs generate coarse spatial resolution
    versions of the products (5 km) using appropriate
    aggregation schemes.
  • Selected data sets from the coarse resolution
    products are projected into a global coordinate
    system and displayed on the MODIS Land QA web
    page
  • The browse images are generated in JPEG/GIF
    format with fixed contrast stretching and color
    LUTs to enable consistent temporal comparison
    (see the sketch after this list).
  • The web interface supports interactive selection
    of browse products and zooming and panning at 5km
    resolution.
  • MODIS Land Global Browse Web Page
  • http://landweb.nascom.nasa.gov/cgi-bin/browse/browse.cgi
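
To make the fixed contrast stretching and aggregation above concrete, here is a minimal sketch in C (the language the LDOPE tools are written in). The mean aggregation scheme, stretch limits and fill handling are illustrative assumptions, not the actual PGE browse code.

    #include <stdint.h>
    #include <stddef.h>

    /* Fixed contrast stretch: map a fixed physical range [lo, hi] to
     * 1..255 with the same limits every time, so browses from
     * different dates remain visually comparable.  Fill renders as 0
     * (black). */
    static uint8_t stretch_fixed(float v, float v_fill, float lo, float hi)
    {
        if (v == v_fill || hi <= lo)
            return 0;
        if (v <= lo) return 1;
        if (v >= hi) return 255;
        return (uint8_t)(1.0f + 254.0f * (v - lo) / (hi - lo));
    }

    /* Aggregate an n-by-n block of full-resolution pixels to one
     * coarse (e.g., 5 km) pixel by averaging the non-fill values. */
    static float aggregate_mean(const float *block, size_t n, float v_fill)
    {
        double sum = 0.0;
        size_t count = 0;
        for (size_t i = 0; i < n * n; i++)
            if (block[i] != v_fill) { sum += block[i]; count++; }
        return count ? (float)(sum / count) : v_fill;
    }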

19
LDOPE QA Activities: QA Sampling - Animation
  • Animations provide another effective way to
    illustrate how the MODIS Land products are
    functioning and to assess product quality.
  • LDOPE generates yearly animations of the n-day
    global browses at coarser resolution and regional
    animation of product browses for individual
    continents at higher resolution.
  • Animation of Global Browses
  • http://landweb.nascom.nasa.gov/animation/
  • Animation using Google Earth
  • http://landweb.nascom.nasa.gov/gEarth/

20
LDOPE QA Activities: QA Sampling - Golden Tiles
  • LDOPE monitors product quality by examining the
    full resolution browses of the gridded products
    at fixed geographical locations of size 10 deg x
    10 deg known as golden tiles.
  • Golden tile browses are posted for the most
    recent 32 days of production.
  • Animation of these browses enables quick review
    of the products from these locations.
  • Golden Tile Browses and animations on the web
  • http://landweb.nascom.nasa.gov/cgi-bin/goldt/goldtBrowse.cgi

21
LDOPE QA Activities: QA Sampling - Product Time Series
  • In many cases, issues that affect product
    performance are seen only through examination of
    long-term product series
  • Time series of summary statistics are derived
    from all the gridded (L2G, L3, L4) MODLAND
    products at the Golden Tiles.
  • Summary statistics include mean, standard
    deviation and number of observations.
  • Only good quality observations are used to
    compute the statistics (see the sketch after
    this list).
  • Statistics are computed separately for each
    biome/land cover class and for predetermined
    sites of 3 km x 3 km size.
  • Product time series analyses capture changes in
    the instrument characteristics and calibration,
    as well as algorithm sensitivity to surface
    (e.g., vegetation phenology), atmospheric (e.g.,
    aerosol loading) and remote sensing (e.g.,
    sun-surface-sensor geometry) conditions that
    change temporally. They also enable comparison
    between reprocessed products and between
    different years.
  • Golden Tile Time Series on the QA web page
  • http://landweb.nascom.nasa.gov/cgi-bin/tsplots/genOption.cgi
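
A minimal sketch of how such per-tile summary statistics could be computed, assuming a simplified QA convention in which 0 marks a good-quality observation (the real products encode quality in bit fields, as shown later):

    #include <math.h>
    #include <stddef.h>
    #include <stdint.h>

    struct summary { double mean, stddev; size_t n_obs; };

    /* Mean, standard deviation and number of observations over the
     * pixels whose QA marks them good and whose value is not fill. */
    static struct summary summarize(const float *sds, const uint8_t *qa,
                                    size_t n, float v_fill)
    {
        struct summary s = {0.0, 0.0, 0};
        double sum = 0.0, sumsq = 0.0;
        for (size_t i = 0; i < n; i++) {
            if (qa[i] != 0 || sds[i] == v_fill)
                continue;              /* skip poor quality and fill */
            sum += sds[i];
            sumsq += (double)sds[i] * sds[i];
            s.n_obs++;
        }
        if (s.n_obs > 0) {
            s.mean = sum / s.n_obs;
            double var = sumsq / s.n_obs - s.mean * s.mean;
            s.stddev = var > 0.0 ? sqrt(var) : 0.0;
        }
        return s;
    }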

22
LDOPE QA Activities: Dissemination of QA Results
  • Informal QA results
  • Product issues posted on a public web site with
    examples, algorithm version and occurrence
    information. The issues are labeled as either
    Pending, Closed or Note.
  • Known Product Issue on the web
  • http://landweb.nascom.nasa.gov/cgi-bin/QA_WWW/newPage.cgi?fileName=terra_issues
  • Science QA metadata are also posted on the web site
  • Product Quality Documentation on the web
  • http://landweb.nascom.nasa.gov/cgi-bin/QA_WWW/qaFlagPage.cgi?sat=terra&ver=C5

23
LDOPE QA Activities: Algorithm Testing and Evaluation
  • Product Collections and Collection Reprocessing
  • The MODLAND products have been reprocessed
    several times (C1, C3, C4, and C5).
  • Reprocessing involves applying the latest
    available version of the science algorithm to the
    MODIS instrument data and using the best
    available calibration and geolocation
    information.
  • A collection numbering scheme is used to
    differentiate between different reprocessing
    runs.
  • The collection number is apparent in the product
    filename, e.g.
    MCD12Q2.A2006001.h20v08.005.2009309204143.hdf
    (parsed in the sketch at the end of this list).
  • All major algorithm updates are proposed,
    implemented, tested and evaluated before the
    collection reprocessing.
  • Only minor algorithm updates are accepted within
    a collection reprocessing
  • Under rare circumstances another reprocessing of
    selected products within a collection is approved
    (e.g. C4.1 LST, C5.1 Atmosphere).
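
The fields packed into a standard gridded-product filename (short name, acquisition date AYYYYDDD, tile hXXvYY, collection number, production timestamp) can be pulled apart mechanically; a minimal sketch in C, assuming the naming convention shown above:

    #include <stdio.h>

    /* Parse a name such as
     * MCD12Q2.A2006001.h20v08.005.2009309204143.hdf; returns 0 on
     * success, -1 if the name does not match the pattern. */
    int parse_modis_name(const char *name)
    {
        char product[16];
        int year, doy, h, v, collection;
        long long prod_ts;

        if (sscanf(name, "%15[^.].A%4d%3d.h%2dv%2d.%3d.%13lld.hdf",
                   product, &year, &doy, &h, &v, &collection,
                   &prod_ts) != 7)
            return -1;
        printf("%s: %d/%03d, tile h%02dv%02d, collection %03d\n",
               product, year, doy, h, v, collection);
        return 0;
    }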

24
LDOPE QA Activities: Algorithm Testing and Evaluation
LDOPE works with the science teams in planning the
science tests and evaluating the test results. The
process flows as follows:
  • Science team code development and SCF testing
  • Integration
  • Integrate the code into the MODAPS production
    environment
  • Code unit test
  • Update file specification and production rules
  • Science Test
  • Run code in MODAPS globally over multiple days
  • Science team and QA group evaluate the extent of
    the algorithm change and its impact on downstream
    products
  • Product Change Approval
  • MODAPS Production
25
LDOPE QA Activities: Algorithm Testing and Evaluation
  • LDOPE maintains a Science Test web page
    containing the following information
  • Algorithm change
  • Test plan and status
  • Evaluation Result
  • LDOPE's C5 Land Science Test Web Page
  • http://landweb.nascom.nasa.gov/cgi-bin/QA_WWW/newPage.cgi?fileName=sciTestMenu_C5
  • LDOPE's C6 Land Science Test Web Page
  • http://landweb.nascom.nasa.gov/cgi-bin/QA_WWW/newPage.cgi?fileName=sciTestMenu_C6

26
LDOPE QA Activities: QA Tools
  • LDOPE develops and maintains a set of tools for
    use in QA of the MODIS land products and other
    related input products
  • Generic tools: applicable to most of the products
  • Product-specific tools: applicable to only a
    small subset of products; mostly developed by
    individual science teams
  • Tools are developed in C and compiled and tested
    for Linux/IRIX/PC
  • Tools have a uniform syntax/command-line format
  • Tools can be easily scripted for batch processing
  • Tools process HDF4 files and generate HDF4 output
  • Tools are transparent to MODIS/VIIRS/AVHRR data
  • An ENVI-based GUI has been developed to run most
    of the tools within ENVI
  • A subset of the generic LDOPE QA tools is
    distributed to the public by the LP DAAC

27
LDOPE QA Activities: QA Tools
  • QA-based masking of the MODIS 8-day LSR to filter
    cloud-clear land with low/average aerosol; uses
    pixel-level QA for masking (a sketch follows).

(Image: MOD09A1.A2000305.h20v10.005.2006330041025.hdf,
RGB composite of Surface Reflectance Bands 1, 3, and 4)
28
LDOPE QA Activities: QA Tools
  • Making coarse resolution products by subsampling
    or by majority class (a sketch of the
    majority-class approach follows).
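
For categorical data such as Land Cover, averaging is meaningless, so each coarse pixel can instead take the most frequent class in the underlying full-resolution block; a minimal sketch:

    #include <stdint.h>
    #include <stddef.h>

    /* Majority class of an n-by-n block of full-resolution class
     * values; returns fill if the whole block is fill. */
    static uint8_t majority_class(const uint8_t *block, size_t n,
                                  uint8_t fill)
    {
        size_t hist[256] = {0};
        uint8_t best = fill;
        size_t best_count = 0;

        for (size_t i = 0; i < n * n; i++) {
            if (block[i] == fill)
                continue;
            if (++hist[block[i]] > best_count) {
                best_count = hist[block[i]];
                best = block[i];
            }
        }
        return best;
    }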

(Images: MOD12Q1.A2001001.h20v11.004.2004358134406.hdf,
false color image of Land Cover, and
MOD09A1.A2000305.h20v10.005.2006330041025.hdf, RGB
composite of Surface Reflectance Bands 1, 3, and 4.
Land Cover legend: water, open shrublands, croplands,
broadleaf forest, mixed forest, savanna, deciduous
needleleaf forest, grassland)
29
LDOPE QA Activities: QA Tools
  • Simple SDS arithmetic: the tool internally reads
    and handles fill values (a sketch follows).
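
A minimal sketch of fill-aware SDS arithmetic: compute a per-pixel difference (e.g., LST_Day_1km - LST_Night_1km) and propagate fill whenever either input is fill, so a missing observation is never mistaken for a real difference of zero. The types and fill values are illustrative assumptions:

    #include <stdint.h>
    #include <stddef.h>

    static void sds_diff(const uint16_t *a, const uint16_t *b,
                         int32_t *out, size_t n,
                         uint16_t fill_in, int32_t fill_out)
    {
        for (size_t i = 0; i < n; i++) {
            if (a[i] == fill_in || b[i] == fill_in)
                out[i] = fill_out;            /* "not computed" */
            else
                out[i] = (int32_t)a[i] - (int32_t)b[i];
        }
    }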

(Images: LST_Day_1km, LST_Night_1km, and the
difference LST_Day_1km - LST_Night_1km from
MOD11A2.A2001065.h20v10.005.2007002071908.hdf.
Difference legend: < 0, 0 to 2 K, 2 to 6 K, > 6 K,
not computed)
30
LDOPE QA Activities: QA Tools
  • Unpack QA bits: transparent to all products in
    HDF4 (a sketch follows).
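
The core of unpacking bit-packed QA is a shift and a mask; a minimal sketch that extracts the nbits-wide field starting at a given bit of a QA word, so each QA parameter can be viewed or binned separately:

    #include <stdint.h>

    /* Extract a bit field from a packed QA word (nbits must be < 32).
     * For example, if aerosol quantity occupies bits 6-7 of a
     * state-QA word, unpack_bits(qa_word, 6, 2) yields 0..3. */
    static unsigned unpack_bits(uint32_t qa_word, unsigned start,
                                unsigned nbits)
    {
        return (unsigned)((qa_word >> start) & ((1u << nbits) - 1u));
    }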

31
Summary
  • LDOPE was introduced as a centralized QA facility
    supporting the MODIS Land Science Team in the
    assessment of the land data products.
  • Perform routine and coordinated QA of all MODIS
    land products.
  • Work with science teams to resolve quality
    related problems in products and suggest
    algorithm updates
  • Test and evaluate algorithm improvements.
  • Currently LDOPE also supports QA of land products
    from the VIIRS land algorithms and AVHRR
    reprocessing for generation of Long Term Data
    Records.
  • LDOPE works within the Land PEATE to evaluate the
    performance of the VIIRS Land algorithms and to
    suggest algorithm improvements
  • Production and evaluation of LTDR (Long Term Data
    Records) series by reprocessing of the AVHRR data.