I hope somebody out there is able to help.
I am trying to use HARP to read S5P L2 HCHO files, but the import fails with an HDF5 error on certain files. An example file can be downloaded here: [Dropbox link]
The code and error message are below:
import harp
infile = 'S5P_RPRO_L2__HCHO___20180802T161044_20180802T175412_04161_01_010105_20190216T121336.nc'
hcho_product = harp.import_product(infile)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/ianashpole/anaconda2/envs/harp_test/lib/python3.7/site-packages/harp/_harppy.py", line 1134, in import_product
    raise CLibraryError()
harp._harppy.CLibraryError: [HDF5] H5Z_filter_deflate(): inflate() failed (major="Data filters", minor="Unable to initialize object") (H5Zdeflate.c:123)
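For now, the only workaround I have is to skip the orbits that fail when looping over a batch of files, roughly like this (I am assuming the exception can be caught as harp.CLibraryError, since that is what the traceback reports from harp._harppy):

import glob
import harp

products = []
for infile in sorted(glob.glob('S5P_RPRO_L2__HCHO___*.nc')):
    try:
        products.append(harp.import_product(infile))
    except harp.CLibraryError as err:
        # Skip orbits whose compressed HDF5 chunks cannot be read
        print('skipping', infile, '->', err)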
The above error only occurs with certain files, however. Some files of the same format can be read without a problem:
infile = 'S5P_RPRO_L2__HCHO___20181001T172123_20181001T190451_05013_01_010105_20190225T043227.nc'
hcho_product = harp.import_product(infile)
hcho_product
<Product variables=odict_keys(['scan_subindex', 'datetime_start', 'datetime_length', 'orbit_index', 'validity', 'latitude', 'longitude', 'latitude_bounds', 'longitude_bounds', 'sensor_latitude', 'sensor_longitude', 'sensor_altitude', 'solar_zenith_angle', 'solar_azimuth_angle', 'sensor_zenith_angle', 'sensor_azimuth_angle', 'pressure', 'tropospheric_HCHO_column_number_density', 'tropospheric_HCHO_column_number_density_uncertainty_random', 'tropospheric_HCHO_column_number_density_uncertainty_systematic', 'tropospheric_HCHO_column_number_density_validity', 'tropospheric_HCHO_column_number_density_avk', 'HCHO_volume_mixing_ratio_dry_air_apriori', 'tropospheric_HCHO_column_number_density_amf', 'tropospheric_HCHO_column_number_density_amf_uncertainty_random', 'tropospheric_HCHO_column_number_density_amf_uncertainty_systematic', 'HCHO_slant_column_number_density', 'HCHO_slant_column_number_density_uncertainty', 'absorbing_aerosol_index', 'cloud_albedo', 'cloud_albedo_uncertainty', 'cloud_fraction', 'cloud_fraction_uncertainty', 'cloud_altitude', 'cloud_altitude_uncertainty', 'cloud_pressure', 'cloud_pressure_uncertainty', 'surface_albedo', 'surface_altitude', 'surface_altitude_uncertainty', 'surface_pressure', 'index'])>
An example 'good' file can be downloaded here: [Dropbox link]
Both of these files were downloaded from the Sentinel-5P Pre-Operations Data Hub (https://s5phub.copernicus.eu/dhus/#/home) on 13 Nov 2019.
I am working on a MacBook Pro (OS X 10.13.6), running Python 3.7.3 in a clean Anaconda environment into which I installed HARP (and its dependencies) as instructed here: Installation — HARP 1.21 documentation
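In case the versions of the underlying C libraries are relevant to the HDF5 error, this is roughly how I check what that environment is linked against (I believe the netCDF4 package exposes these version attributes, but please correct me if not):

import netCDF4
print(netCDF4.__version__)            # version of the Python netCDF4 wrapper
print(netCDF4.__netcdf4libversion__)  # version of the linked netCDF-C library
print(netCDF4.__hdf5libversion__)     # version of the linked HDF5 library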
I've tried the classic fixes (uninstalling and reinstalling HARP, restarting the machine, etc.), but the problem persists. I can also confirm that the Python package netCDF4 has trouble with the first file:
import netCDF4
infile = 'S5P_RPRO_L2__HCHO___20180802T161044_20180802T175412_04161_01_010105_20190216T121336.nc'
fh = netCDF4.Dataset(infile, mode='r')
fh.groups['PRODUCT'].variables['formaldehyde_tropospheric_vertical_column'].shape
(1, 3245, 450)
hcho = fh.groups['PRODUCT'].variables['formaldehyde_tropospheric_vertical_column'][0, :, :]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "netCDF4/_netCDF4.pyx", line 3695, in netCDF4._netCDF4.Variable.__getitem__ (netCDF4/_netCDF4.c:37910)
  File "netCDF4/_netCDF4.pyx", line 4376, in netCDF4._netCDF4.Variable._get (netCDF4/_netCDF4.c:47130)
RuntimeError: NetCDF: HDF error
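In case it helps with diagnosis, here is a rough loop that could narrow down which scanlines trigger the HDF5 error (the 100-row step size is arbitrary):

import netCDF4

infile = 'S5P_RPRO_L2__HCHO___20180802T161044_20180802T175412_04161_01_010105_20190216T121336.nc'
fh = netCDF4.Dataset(infile, mode='r')
var = fh.groups['PRODUCT'].variables['formaldehyde_tropospheric_vertical_column']
n_scanlines = var.shape[1]
for start in range(0, n_scanlines, 100):
    stop = min(start + 100, n_scanlines)
    try:
        var[0, start:stop, :]
    except RuntimeError as err:
        # These blocks presumably contain the HDF5 chunks that fail to inflate
        print('scanlines', start, '-', stop, 'failed:', err)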
As with HARP, netCDF4 can read the second file without an issue. Is anyone able to help me get data from this file (and others that have the same issue)?!
Thanks and all the best,
Ian