Opening S-5p with harp.import_product()

Hi,

I am trying to read a Sentinel-5P Level-2 NO2 product using the HARP Python interface:

import harp

product = ".../S5P_OFFL_L2__NO2____20200401T110350_20200401T124521_12784_01_010302_20200403T040102"
test_harp = harp.import_product(product)

From time to time, I am getting an error claiming:

---------------------------------------------------------------------------
CLibraryError                             Traceback (most recent call last)
~/Training/Sentinel.py in <module>
----> 1 test_harp = harp.import_product(product)

~/anaconda3/envs/rus/lib/python3.7/site-packages/harp/_harppy.py in import_product(filename, operations, options, reduce_operations, post_operations)
   1185     if _lib.harp_import(_encode_path(filename), _encode_string(operations), _encode_string(options),
   1186                         c_product_ptr) != 0:
-> 1187         raise CLibraryError()
   1188 
   1189     try:

CLibraryError: [HDF5] H5F_open(): file close degree doesn't match (major="File accessibilty", minor="Unable to initialize object") (H5Fint.c:1738)

When accessing the file with xarray (xarray.open_dataset(product)), no issue happens in the process and I can process my data properly.

Any clue on this matter?

This might be because you already (or still) have the file open somewhere else (also using the HDF5 library underneath). Is this something you can confirm?


I think it may be linked to this issue.


That could explain the issue, as prior to running harp.import_product() I am opening that file with xarray. However, there must still be something wrong, since I use a with statement, which in theory closes the file automatically:

with xr.open_dataset(product, group='PRODUCT') as file:
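
(For reference, a minimal sketch of that pattern in full; the variable read inside the block is an illustrative assumption:)

import xarray as xr
import harp

# Read what is needed inside the with-block; the dataset is closed on exit...
with xr.open_dataset(product, group='PRODUCT') as file:
    no2 = file['nitrogendioxide_tropospheric_column'].load()

# ...so that a later HARP import opens the file with its own HDF5 settings.
test_harp = harp.import_product(product)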

Anyway, if I do not run xarray in advance, harp.import_product() does not generate that error anymore. However, now I am facing a new situation (error):

For some files (I am processing 25 that cover my study area), the following happens when I use harp.import_product():

1. If I use harp.import_product(product), it reads the file properly.
2. If I use harp.import_product(product, operations=convert_operations), then the following error takes place:

---------------------------------------------------------------------------
NoDataError                               Traceback (most recent call last)
~/Training/Sentinel.py in <module>
      8 
----> 9     test_harp = harp.import_product(product, operations=convert_operations)
     11 

~/anaconda3/envs/rus/lib/python3.7/site-packages/harp/_harppy.py in import_product(filename, operations, options, reduce_operations, post_operations)
   1190         # Raise an exception if the imported C product contains no variables, or variables without data.
   1191         if _lib.harp_product_is_empty(c_product_ptr[0]) == 1:
-> 1192             raise NoDataError()
   1193 
   1194         # Convert the C product into its Python representation.

NoDataError: product contains no variables, or variables without data

Here, convert_operations is a list containing derive, bin_spatial, and keep operations.
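
For illustration, a hedged sketch of such an operations string (the threshold, grid, and kept variables here are placeholders, not the actual values used):

import harp

# Hypothetical convert_operations for an S5P L2 NO2 product; values are illustrative.
convert_operations = ";".join([
    "tropospheric_NO2_column_number_density_validity > 50",
    "derive(tropospheric_NO2_column_number_density [Pmolec/m2])",
    "bin_spatial(181, -90, 1, 361, -180, 1)",  # lat/lon edge counts, offsets, steps
    "derive(latitude {latitude})",
    "derive(longitude {longitude})",
    "keep(latitude, longitude, tropospheric_NO2_column_number_density)",
])
test_harp = harp.import_product(product, operations=convert_operations)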

From previous posts on this forum, I have seen this may be caused by a corrupted file, and that a redownload should solve it. However, since opening the product without any operations works fine, I assume the issue is related to my operations. But then, why does it work for some of the 25 files I have and not for certain others?

Any previous experience on this?

This is probably because you are using a filter that, for some products, filters out all satellite ground pixels in the product. It just means that there are no good quality measurements in the product. This is something that can happen.
You can explicitly catch the harp.NoDataError exception in your python code to handle this case.
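
A minimal sketch of that, assuming a list of file paths and an operations string like the one above:

import harp

products = []
for filename in filenames:
    try:
        products.append(harp.import_product(filename, operations=convert_operations))
    except harp.NoDataError:
        # All pixels in this orbit were filtered out; skip it and move on.
        print("skipping", filename, "- no data left after filtering")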

I am also getting this error for about 20% of the files downloaded for January 2020:

CLibraryError: [HDF5] H5O__prefix_deserialize(): bad object header version number (major="Object header", minor="Wrong version number") (C:\ci\hdf5_1545244154871\work\src\H5Ocache.c:1231)
OR
CLibraryError: [HDF5] H5B__cache_deserialize(): wrong B-tree signature (major="B-Tree node", minor="Bad value") (C:\ci\hdf5_1545244154871\work\src\H5Bcache.c:181)

and sometimes other CLibraryErrors

I am not processing the files with xarray beforehand; I am simply using harp.import_product.

It is strange that most of the files import normally and only around 20% of them produce such errors. Could it be that the files are getting corrupted during download?

Thanks.

It is very likely that the products got corrupted during download indeed.

Please try to redownload to see if that resolves it.
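
If it helps, a rough sketch for flagging the affected files first (the redownload step itself is left out):

import harp

corrupted = []
for filename in filenames:
    try:
        product = harp.import_product(filename)
    except harp.CLibraryError as err:
        # Typically an HDF5 read error caused by a truncated/corrupted download.
        print(filename, "failed to import:", err)
        corrupted.append(filename)  # candidates for redownload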

Has this issue been resolved already?
I encountered a similar error:
CLibraryError: [HDF5] H5F_open(): file close degree doesn't match (major="File accessibilty", minor="Unable to initialize object") (C:\ci\hdf5_1545244154871\work\src\H5Fint.c:1738)

And my Level 3 files only get partially created.

The resolution is mentioned right above. Just redownload the files as suggested.

Dear Sander,

As a test, I have two S5P NO2 OFFL L2 products; both have been opened and processed properly using netCDF4. But now, with the following piece of script,

export_path = r"D:\TESTS\S-5P-SCRIPTS\DATA\process"

for i in input_files_OFFL:
    harp_L2_L3 = harp.import_product(i, operations=(
        "tropospheric_NO2_column_number_density_validity>75;"
        "derive(tropospheric_NO2_column_number_density [Pmolec/m2]);"
        "derive(datetime_stop {time});"
        "latitude > 28. [degree_north]; latitude < 31.6 [degree_north];"
        "longitude > 50. [degree_east]; longitude < 56.1 [degree_east];"
        "bin_spatial(360, 28., 0.01, 610, 50., 0.01);"
        "derive(latitude {latitude}); derive(longitude {longitude});"
        "keep(NO2_column_number_density, tropospheric_NO2_column_number_density,"
        " stratospheric_NO2_column_number_density, NO2_slant_column_number_density,"
        " tropopause_pressure, absorbing_aerosol_index, cloud_fraction, sensor_altitude,"
        " sensor_azimuth_angle, sensor_zenith_angle, solar_azimuth_angle, solar_zenith_angle)"))

    export_folder = "{export_path}/{name}".format(export_path=export_path,
                                                  name=os.path.basename(i).replace('L2', 'L3'))

    harp.export_product(harp_L2_L3, export_folder, file_format='netcdf')

print(colored('All L2 products converted to L3', 'green'))

I got the following error.

Would you please help!

This issue has already been discussed here.

Thanks a lot. So, is it an unsolved bug?

Or is my filter (>75) the problem?

Because I don't think it's a download issue; I tested each single product separately.

Please explain what else you are doing in your script. Are you also using other libraries besides HARP to access the products, e.g. h5py? Because if so, then that is your problem.

This is not a bug in HARP.

Dear Sander,

These are all the libraries I’m using to access the products; h5py is not among them:

from netCDF4 import Dataset
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LogNorm
from mpl_toolkits.basemap import Basemap
from glob import iglob
from os.path import join
from termcolor import colored
import sys
import os
import harp

Then it is your use of the netCDF4 library that is causing the problems. You cannot have the same file open with HDF5 (which both netCDF4 and HARP use underneath) at the same time when the two opens use different fclose_degrees.
You have to make sure that your file is closed properly with netCDF4 before performing the HARP import.
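
As a minimal sketch of that ordering (assuming product holds the file path):

from netCDF4 import Dataset
import harp

# Open with netCDF4, read what you need, and release the HDF5 handle...
nc = Dataset(product)
try:
    pass  # ...inspect groups/variables here...
finally:
    nc.close()

# ...and only then let HARP (which also opens the file through HDF5) import it.
l3 = harp.import_product(product)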


The error is right there at the bottom: you are not providing correct arguments. Please review error messages first before posting on this forum.

OK, I think I solved it, but now I have a new error:
[HDF5] H5F__super_read(): truncated file: eof = 25682432, sblock->base_addr = 0, stored_eof = 458939235 (major="File accessibilty", minor="File has been truncated") (C:\ci\hdf5_1545244154871\work\src\H5Fsuper.c:623)
Do you know what I should do?

This means that your file is corrupted. Please try to download it again.