There is a small example in the `hlhdf/pyhl` directory called `rave_info_type`, which implements a small compound type definition. This module defines an object containing the variables xsize, ysize, xscale and yscale, together with a type class that describes the corresponding HDF5 compound type and should be used when writing the data.
import _pyhl
import _rave_info_type
# Create the rave info HDF5 type
typedef = _rave_info_type.type()
# Create the rave info HDF5 object
obj = _rave_info_type.object()
# Set the values
obj.xsize = 10
obj.ysize = 10
obj.xscale = 150.0
obj.yscale = 150.0
aList = _pyhl.nodelist()
# Create a datatype node
aNode = _pyhl.node(_pyhl.TYPE_ID, "/MyDatatype")
# Make the datatype named
aNode.commit(typedef.hid())
aList.addNode(aNode)
# Create an attribute containing the compound type
aNode = _pyhl.node(_pyhl.ATTRIBUTE_ID, "/myCompoundAttribute")
# Note that I use both itemSize and lhid
# Also note how I translate the compound object to a string
aNode.setScalarValue(typedef.size(), obj.tostring(), "compound", typedef.hid())
aList.addNode(aNode)
# Also create a dataset with the compound type
obj.xsize = 1
obj.ysize = 1
aNode = _pyhl.node(_pyhl.DATASET_ID, "/myCompoundDataset")
# I use setArrayValue instead
aNode.setArrayValue(typedef.size(), [1], obj.tostring(), "compound", typedef.hid())
aList.addNode(aNode)
# And finally write the HDF5 file.
aList.write("compound_test.hdf")
When checking this file with h5dump, the command would be:
prompt% h5dump compound_test.hdf
And the result would be:
HDF5 "compound_test.hdf" {
GROUP "/" {
   ATTRIBUTE "myCompoundAttribute" {
      DATATYPE {
         H5T_STD_I32LE "xsize";
         H5T_STD_I32LE "ysize";
         H5T_IEEE_F64LE "xscale";
         H5T_IEEE_F64LE "yscale";
      }
      DATASPACE { SCALAR }
      DATA {
         { [ 10 ], [ 10 ], [ 150 ], [ 150 ] }
      }
   }
   DATATYPE "MyDatatype" {
      H5T_STD_I32LE "xsize";
      H5T_STD_I32LE "ysize";
      H5T_IEEE_F64LE "xscale";
      H5T_IEEE_F64LE "yscale";
   }
   DATASET "myCompoundDataset" {
      DATATYPE { "/MyDatatype" }
      DATASPACE { SIMPLE ( 1 ) / ( 1 ) }
      DATA {
         { [ 1 ], [ 1 ], [ 150 ], [ 150 ] }
      }
   }
}
}
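The dump shows the on-disk layout of the compound: two little-endian 32-bit integers (xsize, ysize) followed by two little-endian 64-bit floats (xscale, yscale). As an illustration only (this is not part of `_rave_info_type`), the same packed layout can be sketched with Python's standard `struct` module; the format string `"<iidd"` assumes the fields are stored contiguously with no padding, which matches the byte size `typedef.size()` would report for this layout (24 bytes).

```python
import struct

# Packed little-endian layout matching the h5dump datatype above:
#   H5T_STD_I32LE "xsize", H5T_STD_I32LE "ysize",
#   H5T_IEEE_F64LE "xscale", H5T_IEEE_F64LE "yscale"
FMT = "<iidd"

def pack_rave_info(xsize, ysize, xscale, yscale):
    """Serialize the four fields into raw bytes, illustrating the kind
    of buffer obj.tostring() hands to setScalarValue() (hypothetical
    helper; assumes a packed layout with no inter-field padding)."""
    return struct.pack(FMT, xsize, ysize, xscale, yscale)

raw = pack_rave_info(10, 10, 150.0, 150.0)
print(struct.calcsize(FMT))   # 24 bytes: 4 + 4 + 8 + 8
print(struct.unpack(FMT, raw))  # (10, 10, 150.0, 150.0)
```

The `"<"` prefix is what makes the sketch both little-endian and unpadded; with native alignment (`"iidd"` without a prefix) most platforms would insert padding after the two ints and the size would no longer match the HDF5 layout.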