modis.collect
ModisRawH5
Warning: only implemented for MOD/MYD 11/13 products!
Class representing an HDF5 file containing raw MODIS data.
This class will create an HDF5 file and collect data from MODIS HDF files into it. This file can then be used for smoothing in a subsequent step.
__init__(self, files, targetdir, vam_product_code=None, interleave=False)
Initialize an instance of the ModisRawH5 class.

This creates a ModisRawH5 object. If the corresponding HDF5 file already exists, the instance is automatically linked to it. If not, the file will be created on the first `update` run.
All HDF files in the `files` list will be collected. The user needs to make sure that the files are of the same product and spatial extent, and that temporal consistency is conserved!
To make sure the update workflow is functioning as intended, it's important that `targetdir` is set correctly. This way existing HDF5 files can be updated, and new ones created.
To select a specific subdataset, `vam_product_code` needs to be provided. If not, the defaults will be extracted (VIM / TDA / TDT).

For VIM, 16-day composite products can be interleaved to form a synthetic 8-day product if both satellites (MOD & MYD) are present in `files`. The resulting HDF5 file will be named with `MXD` as product code.
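The interleaving can be pictured with the composite start dates: MOD13 16-day composites begin at day-of-year 001 and MYD13 composites at day-of-year 009, so merging both date series yields an 8-day cadence. A minimal sketch (the helper name is hypothetical, not part of this library):

```python
# Hypothetical sketch of the MOD/MYD interleave described above, not the
# library's actual code. MOD13 16-day composites start at DOY 001 and MYD13
# at DOY 009, so merging both date series yields a synthetic 8-day cadence
# ("MXD" product code).

def interleave_dates(mod_doys, myd_doys):
    """Merge two sorted day-of-year series into one sorted 8-day series."""
    return sorted(mod_doys + myd_doys)

mod = [1, 17, 33, 49]   # MOD13 composite start DOYs (16-day steps)
myd = [9, 25, 41, 57]   # MYD13 composite start DOYs (16-day steps)
print(interleave_dates(mod, myd))  # [1, 9, 17, 25, 33, 41, 49, 57]
```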
Parameters:

Name | Type | Description | Default |
---|---|---|---|
files | List[str] | A list of absolute paths to MODIS raw HDF files to be processed | required |
vam_product_code | str | VAM product code to be processed (default VIM/LTD) | None |
targetdir | str | Target directory for raw MODIS HDF5 file | required |
interleave | bool | Boolean flag if MOD/MYD 13 products should be interleaved | False |
Exceptions:

Type | Description |
---|---|
ValueError | If a product other than MOD/MYD 11/13 is provided. |
AssertionError | If files from multiple products are provided (except interleave). |
AssertionError | If duplicates are detected that can't be handled. |
AssertionError | If |
create(self, compression='gzip', chunks=None)
Creates the HDF5 file.

If the corresponding HDF5 file is not found in the target directory, it's created.

If no chunking scheme is specified using `chunks`, a generic one of (number of rows // 25, 1) will be used, where the rows represent the spatial dimension and the columns the temporal dimension.
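The default chunk shape can be sketched as follows (the helper name and the flattened 1200 x 1200 tile size are assumptions for illustration only):

```python
# Sketch of the default chunking rule described above; the helper name and
# the flattened 1200 x 1200 tile size are illustrative assumptions.

def default_chunks(nrows: int) -> tuple:
    """Generic chunk shape: one timestep per chunk, rows split into 25 parts."""
    return (nrows // 25, 1)

# A MODIS tile of 1200 x 1200 pixels flattened to 1,440,000 spatial rows:
print(default_chunks(1200 * 1200))  # (57600, 1)
```

Each chunk then holds a single timestep for roughly 1/25th of the pixels, which favours appending new timesteps column by column during updates.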
Parameters:

Name | Type | Description | Default |
---|---|---|---|
compression | str | Compression for data (default = gzip). | 'gzip' |
chunks | Tuple[int] | Chunksize for data (tuple of 2 int; default = (rows//25, 1)). | None |
Exceptions:

Type | Description |
---|---|
AssertionError | If |
HDF5CreationError | If creation of HDF5 file fails. |
update(self, force=False)
Updates the MODIS raw HDF5 file with raw data.

The files specified in `__init__` get collected into the HDF5 file, which was either created beforehand or already existed from a previous initialization. If an HDF file can't be read and `force` is set, the affected datapoints are filled using the internal nodata value.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
force | bool | Force collect, using nodata for non-readable input files. | False |
Exceptions:

Type | Description |
---|---|
AssertionError | If dates of files to be collected are not after the ones already contained in the file. |
HDF5WriteError | If writing to the HDF5 file fails. |
IOError | If process fails to read an HDF input file and force = False. |
Returns:

collected (List): List of collected HDF files.
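The AssertionError on dates can be illustrated with a small sketch (a hypothetical helper, not this library's actual implementation; dates are shown as YYYYDDD strings, which sort chronologically):

```python
# Hypothetical sketch of the temporal-consistency check behind the
# AssertionError above: new dates must all fall after the dates already
# stored in the HDF5 file.

def check_dates(existing, incoming):
    """Require all incoming dates to fall after the dates already stored."""
    assert not existing or min(incoming) > max(existing), \
        "dates to be collected must be after the ones already in the file"
    return sorted(existing + incoming)

print(check_dates(["2020001", "2020009"], ["2020017"]))
# ['2020001', '2020009', '2020017']
```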