Sentinel-5P CO NaN array after using HARP convert

Dear team,
I downloaded the data

S5P_OFFL_L2__CO_____20200610T210512_20200610T224642_13783_01_010302_20200612T104836.nc
and was trying to create a GeoTIFF image from it.
As mentioned, I used harpconvert to produce an intermediate .nc file using this:

harpconvert -a 'CO_column_number_density_validity>50;bin_spatial(920,-5,0.025,1160,86,0.025);'
S5P_OFFL_L2__CO_____20200610T210512_20200610T224642_13783_01_010302_20200612T104836.nc testing.nc

as well as this:
harpconvert -a 'CO_column_number_density_validity>50;keep(datetime_start,datetime_length,CO_column_number_density,latitude_bounds,longitude_bounds);bin_spatial(2001,-90,0.01,2001,-180,0.01);derive(CO_column_number_density)",derive(CO_column_number_density)",post_operations="bin();squash(time, (latitude_bounds,longitude_bounds));derive(latitude {latitude});derive(longitude {longitude})'
S5P_OFFL_L2__CO_____20200610T210512_20200610T224642_13783_01_010302_20200612T104836.nc ll3.nc

It does convert to the new .nc files; however, when I try to open the result
using GDAL and read the array, it shows a single unique value:

import numpy as np
from osgeo import gdal

sc = gdal.Open('/home/arjundebnath21/s5/ll3.nc', gdal.GA_ReadOnly)
band_ds = gdal.Open(sc.GetSubDatasets()[0][0], gdal.GA_ReadOnly)
band_array = band_ds.ReadAsArray()
np.unique(band_array)

The array contains the single value array([9.96920997e+36]), which I believe is
the nodata value.

Can you please help me understand why this is happening?

Also, after converting to an h5 file, if I want to convert it to a
tiff file, there are different bands for the lat and long bounds
(whose shapes differ from that of the CO column density). Can you also
please guide me on how to create a single GeoTIFF file from this?

Your help would be much appreciated.
Thanks and regards,
Arjun

There are several things wrong with what you are trying to do.

First of all, the harpconvert command that you posted could never have worked. You are mixing up the Python and command line interfaces. If you post something, please post what you actually did. That command could never have produced an ll3.nc file.

Second, you are using a very strange target grid: 2000x2000 cells starting at (-90,-180) with a cell size of 0.01 degrees. This means that the opposite corner of your grid is (-70,-160). Are you really looking for a South Pole area?
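To make the grid arithmetic concrete, here is a small sketch (assuming, as in the HARP operations documentation, that bin_spatial takes edge counts and offsets, so N edge values define N-1 cells per axis; the helper function name is mine):

```python
# bin_spatial(lat_edges, lat_offset, lat_step, lon_edges, lon_offset, lon_step)
# takes numbers of *edges*, so N edges define N-1 cells, and the covered
# extent runs from the offset to offset + (N-1) * step.
def grid_extent(num_edges, offset, step):
    return offset, offset + (num_edges - 1) * step

# The grid from the command above: 2001 edges starting at -90/-180, step 0.01.
print(grid_extent(2001, -90, 0.01))   # latitude axis:  (-90, -70.0)
print(grid_extent(2001, -180, 0.01))  # longitude axis: (-180, -160.0)
```

So the grid covers latitudes -90 to -70 and longitudes -180 to -160, which is indeed a patch next to the South Pole.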

The value 9.96920997e+36 is something that GDAL comes up with. The original data values are NaN, which makes sense, since there is essentially no CO detected around the South Pole.
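For illustration, 9.96920997e+36 is the netCDF default fill value for floating-point variables, which GDAL substitutes for the NaNs stored in the file. A minimal sketch (with a made-up band array standing in for the real read) of masking it back to NaN:

```python
import numpy as np

# netCDF's default _FillValue for floats; GDAL reports it as nodata.
FILL = 9.96920997e+36

# Hypothetical band read: everything is fill, as in the post above.
band_array = np.full((3, 4), FILL, dtype=np.float32)

# Mask the fill value back to NaN before any further processing.
masked = np.where(np.isclose(band_array, FILL, rtol=1e-6), np.nan, band_array)
print(np.isnan(masked).all())  # -> True
```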

Right, I see the error now. I'm using this command and it is working:

harpconvert -a 'CO_column_number_density_validity>50;latitude > -55 [degree_north];latitude < 80 [degree_north];keep(datetime_start,datetime_length,CO_column_number_density,latitude_bounds,longitude_bounds);bin_spatial(271,-55,0.5,721,-180,0.5);bin();squash(time, (latitude_bounds,longitude_bounds));derive(latitude {latitude});derive(longitude {longitude});exclude(latitude_bounds,longitude_bounds,latitude_bounds_weight,longitude_bounds_weight,count,weight)'.nc onefile.nc

Also, is it possible to do the HARP merge from the Python script itself, by using
harp.import_product?

Yes. You can just provide a list of product file paths as the first argument to harp.import_product(). It will then perform the merge as part of the import. You would have to use post_operations, as you quoted above, to properly split the operations between those performed on each file before the merge and those performed on the result after the merge.

Yeah, thanks, I just figured this out based on the docs:

import glob
import os

import harp

p = harp.import_product(
    filename=glob.glob(os.path.join(os.path.expanduser('~/s5/oneday/'), 'S*.nc')),  # glob does not expand '~' itself
    operations="CO_column_number_density_validity>50;latitude > -55 [degree_north];latitude < 80 [degree_north];bin_spatial(271,-55,0.5,721,-180,0.5);derive(longitude {longitude});derive(latitude {latitude})",
    post_operations="bin();squash(time, (latitude,longitude))")

harp.export_product(p, "h_script.nc", file_format='netcdf')

Also, any suggestion on how to convert the merged .nc files to a single raster? For example, how do we handle the nodata value when exporting to tiff?
Thanks

Personally, I am using gdal_translate for this, which works fine.
I am not sure what you mean by "dealing with nodata values". How you deal with NaN data is very domain-specific and depends on what you actually want to use the data for.

Hi,
I used this tool to merge all 14 orbit files into a single file (1 day):

import glob
import os

import harp

product = harp.import_product(
    filename=glob.glob(os.path.join(os.path.expanduser('~'), 'S*.zip')),  # glob does not expand '~' itself
    operations='CO_column_number_density_validity>50;derive(CO_column_volume_mixing_ratio {time} [ppbv]);keep(latitude_bounds,longitude_bounds,CO_column_volume_mixing_ratio);bin_spatial(1801,-90,0.1,3601,-180,0.1)',
    post_operations='bin();squash(time, (latitude_bounds,longitude_bounds));derive(latitude {latitude});derive(longitude {longitude});exclude(latitude_bounds,longitude_bounds,latitude_bounds_weight,longitude_bounds_weight,count,weight)')
harp.export_product(product, "all.nc", file_format='netcdf')

When I try to visualise it, this is what I get.
I just wanted to check if this looks normal, because it seems to have missed regions above Canada.

TROPOMI measures reflected sunlight, so you will not have measurements where the poles are (near) dark (i.e. if it is winter there).