Title: Comments from User Services
- C. Boquist/Code 423
- The HDF Group Meeting
- 1 April 2009
Question
- People from the HDF Group will be meeting with
ESDIS next week. Do any of you have any feedback
from your data centers that you would like me to
mention? I know this is a vague question, but I'm
looking for your input on any issues you or your
user community may be experiencing.
Responses
- Users find the two web sites confusing: http://hdfeos.org and http://www.hdfgroup.org.
- Many comments from users that they find the sites hard to navigate.
- Users have experienced broken links. Have directories changed?
- They need to update the information on the pages; for example, under the URLs to visit, the Earth Science Enterprise no longer exists and has not for quite some time.
- They should possibly remove the DAACs language to be more consistent with the ESDIS project.
Responses
- In the past couple of years, users have complained that there is no longer support for the HDF format; many miss the HDF Users Forum from before.
- They could use more documentation explaining the differences between the processing of HDF4 and HDF5, and how the HDF-EOS formats and products correlate.
- We receive requests for data in netCDF, ASCII, and GeoTIFF (or other GIS-friendly formats); tools to convert HDF data to these formats would be nice.
Responses
A good number of QuikSCAT users have reported
problems when working with HDF files on a Windows
platform. In almost every case it seemed to be an
issue of improperly linking or referencing the
HDF libraries when attempting to read the file.
Traditionally I would direct these users back
to the HDF Group, but I never had any users
report back to let me know if these issues had
been resolved. It would be nice to hear back from
HDF to see if they were able to resolve such
issues and if they could provide us with some
examples to help us troubleshoot these problems
in the future.
Responses
Regarding HDF:
1. The C API for HDF5 is OK, but it is of limited use because it's still pretty low-level. Higher-level C functions that figure out what data type is being handled and act accordingly, for example, would be most useful. (See the sketch after this list.)
2. HDF debug mode is not terrifically helpful. There are an awful lot of different debug flags floating around in there with no instructions on how to set them. Having to resort to hacking the makefiles is extreme, but it seems to be the only way to set some of the debug flags.
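As a rough illustration of the kind of higher-level behavior item 1 asks for, the sketch below inspects an HDF5 dataset's datatype class at run time and reads it accordingly, using only calls from the existing C API. The file name and dataset path are placeholders and error handling is omitted; this is one way a caller can do it today, not an existing HDF Group convenience routine.

#include <stdio.h>
#include <stdlib.h>
#include "hdf5.h"

/* Open a dataset, check its datatype class at run time, and read it
   with a matching native type.  "example.h5" and "/some/dataset" are
   placeholder names. */
int main(void)
{
    hid_t file  = H5Fopen("example.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
    hid_t dset  = H5Dopen2(file, "/some/dataset", H5P_DEFAULT);
    hid_t dtype = H5Dget_type(dset);
    hid_t space = H5Dget_space(dset);
    hssize_t n  = H5Sget_simple_extent_npoints(space);

    switch (H5Tget_class(dtype)) {
    case H5T_INTEGER: {
        int *buf = malloc((size_t)n * sizeof *buf);
        H5Dread(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);
        printf("read %lld integers, first = %d\n", (long long)n, buf[0]);
        free(buf);
        break;
    }
    case H5T_FLOAT: {
        double *buf = malloc((size_t)n * sizeof *buf);
        H5Dread(dset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);
        printf("read %lld floats, first = %g\n", (long long)n, buf[0]);
        free(buf);
        break;
    }
    default:
        printf("datatype class not handled in this sketch\n");
        break;
    }

    H5Sclose(space);
    H5Tclose(dtype);
    H5Dclose(dset);
    H5Fclose(file);
    return 0;
}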
Responses
3. SDgetchunkinfo() currently retrieves only information about the chunking of an SDS; it doesn't retrieve information about the compression, even though it writes the information it does provide to a structure that sets aside room for compression information as well. There is no other way to inquire about SDS compression, either. Are there any plans to give users access to that information?
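For context, a minimal sketch of the call in question is below: SDgetchunkinfo() fills an HDF_CHUNK_DEF with the chunk dimensions, but the compression-related members of that union are left unset, which is the gap described above. The file name and SDS index are placeholders.

#include <stdio.h>
#include "mfhdf.h"

/* Inspect the chunking of the first SDS in an HDF4 file.
   "example.hdf" and SDS index 0 are placeholders. */
int main(void)
{
    int32 sd_id  = SDstart("example.hdf", DFACC_READ);
    int32 sds_id = SDselect(sd_id, 0);

    HDF_CHUNK_DEF cdef;
    int32 flags = 0;

    if (SDgetchunkinfo(sds_id, &cdef, &flags) == SUCCEED) {
        if (flags == HDF_NONE) {
            printf("SDS is not chunked\n");
        } else {
            /* Only the chunk dimensions come back; the comp/cinfo
               members of HDF_CHUNK_DEF are not populated by this call. */
            printf("first chunk dimension: %ld\n",
                   (long)cdef.chunk_lengths[0]);
        }
    }

    SDendaccess(sds_id);
    SDend(sd_id);
    return 0;
}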
Responses
Regarding HDFEOS and the Toolkit:
1. The latest revision of the Toolkit includes HDF4.2r3, but in HDF4.2r3 both hdiff and hrepack crash on some of our output. When will the Toolkit be built to include HDF4.3r4, which doesn't have those problems?
2. When will HDF5 V1.8 be included as part of the Toolkit?
Responses
3. There is no good way to compile the Toolkit
with shared libraries enabled, which turned out
to be necessary for linking with some Perl scripts
on our machine. Hacking the Toolkit and HDFEOS
install scripts (with the aid of HDFEOS
personnel) solved only part of the problem
because the install script still builds static
libraries for the zip, szip, and jpeg utilities
and HDFEOS uses precompiled Gctp libraries.
(Another makefile hack did the Gctp library.) The
whole install method needs to be more versatile.
Responses
4. In the HDF-EOS Swath interface, it's possible
to define dimension maps connecting data
collected with different spatial resolution. The
mapping is described by two integers named offset
and increment. For instance, the geolocation
information in a MOD021KM file is a 5x5
sub-sampling of the 1km geolocation file. Row 0,
column 0 of geolocation information corresponds
exactly to row 2, column 2 of the image data, so
the offset is 2, and the increment is 5, in both
directions. However, for the MOD02HKM files, the
relationship between the 500 m image data and the
1 km geolocation data doesn't fit this model. In
the track direction, each 1 km pixel is centered
exactly half-way between two 500 m pixels, so the
offset should be 0.5 but an integer can't hold
that value. (cont.)
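To make the offset/increment arithmetic concrete, the sketch below applies the integer model just described (the same two values a data producer passes to SWdefdimmap()) to the MOD021KM case; the MOD02HKM case fails precisely because the needed track-direction offset of 0.5 cannot be stored in the integer offset.

#include <stdio.h>

/* HDF-EOS Swath dimension-map model: a geolocation index maps to a
   data-field index through two integers, offset and increment:
       data_index = offset + increment * geo_index
   For MOD021KM, offset = 2 and increment = 5 in both directions. */
static long geo_to_data_index(long geo_index, long offset, long increment)
{
    return offset + increment * geo_index;
}

int main(void)
{
    /* Geolocation row 0 lines up with image row 2, row 1 with row 7, ... */
    printf("MOD021KM: geo row 0 -> data row %ld\n", geo_to_data_index(0, 2, 5));
    printf("MOD021KM: geo row 1 -> data row %ld\n", geo_to_data_index(1, 2, 5));

    /* MOD02HKM (500 m data against 1 km geolocation) would need
       offset = 0.5 in the track direction, which an integer cannot hold. */
    return 0;
}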
Responses
4. (cont.) Can anything be done to address this problem? We currently add a "FractionalOffset" file attribute to document this, but only tools that know about this attribute can make any use of it.
5. While the HDF-EOS library has functions that can set Swath dimension map information, it doesn't have any which use that information. Shouldn't there be a SWinterpolate() which uses that information to interpolate data fields, similar to the way GDinterpolate() does for Grid data?
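To illustrate what such a routine might look like, the sketch below interpolates a one-dimensional geolocation array out to data-field resolution using a dimension map's offset and increment. The function name and behavior are hypothetical and are not part of the HDF-EOS library; they only sketch the kind of SWinterpolate() being requested.

#include <stdio.h>

/* Hypothetical helper in the spirit of the requested SWinterpolate():
   expand a geolocation array to data-field resolution by linear
   interpolation, using a dimension map's integer offset and increment.
   Indices outside the geolocation range are linearly extrapolated. */
static void interp_geo_to_data(const double *geo, long ngeo,
                               double *data, long ndata,
                               long offset, long increment)
{
    for (long i = 0; i < ndata; i++) {
        /* Fractional geolocation coordinate of data index i, from
           inverting data_index = offset + increment * geo_index. */
        double g  = (double)(i - offset) / (double)increment;
        long   g0 = (long)g;
        if (g0 < 0)        g0 = 0;
        if (g0 > ngeo - 2) g0 = ngeo - 2;
        double frac = g - (double)g0;
        data[i] = geo[g0] * (1.0 - frac) + geo[g0 + 1] * frac;
    }
}

int main(void)
{
    double lat_geo[3] = {10.0, 10.5, 11.0};  /* toy 1 km geolocation values */
    double lat_data[11];                     /* corresponding image rows */

    /* offset = 2, increment = 5, as in the MOD021KM example above */
    interp_geo_to_data(lat_geo, 3, lat_data, 11, 2, 5);
    for (int i = 0; i < 11; i++)
        printf("data row %d -> lat %.3f\n", i, lat_data[i]);
    return 0;
}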