S5P Total Ozone Column | Important parameters missing

Dear Sander
According to the S5P TOC ReadMe file (http://www.tropomi.eu/document/product-readme-file-total-ozone-combined-nrti-offl, page 7), users should not filter the TOC on the qa_value but rather on a combination of parameters, namely:
FOR THE NRTI PRODUCT:
• ozone_total_vertical_column out of [0 to 0.45]
• ozone_effective_temperature out of [180 to 280]
• fitted_root_mean_square larger than 0.01
FOR THE OFFL PRODUCT:
• ozone_total_vertical_column out of [0 to 0.45]
• ozone_effective_temperature out of [180 to 260]
• ring_scale_factor out of [0 to 0.15]
• effective_albedo out of [-0.5 to 1.5]

If I am reading http://stcorp.github.io/harp/doc/html/ingestions/S5P_L2_O3.html correctly, the ring scale factor and the fitted root mean square are not extracted.

Extracting them would be very useful, and indeed necessary, for anyone who wants to use HARP for S5P TOC studies.

Best wishes
MariLiza

Further to my email above, I would like to request that the following parameter also be included in the ingestion:
‘PRODUCT/SUPPORT_DATA/DETAILED_RESULTS/ozone_ghost_column’

I would also like to enquire as to the following:
How can I set the ingestion to read the CRB cloud parameters, such as the cloud fraction? Is the condition OFFL set automatically when HARP reads in an OFFL file, or do I need to state this explicitly?

Many thanks again,
MariLiza

If I am reading http://stcorp.github.io/harp/doc/html/ingestions/S5P_L2_O3.html correctly, the ring scale factor and the fitted root mean square are not extracted.

Extracting them would be very useful, and indeed necessary, for anyone who wants to use HARP for S5P TOC studies.

In HARP we don’t ingest the ring_scale_factor or fitted_root_mean_square since these variables do not fall into any of the naming conventions in HARP (they are very algorithm-specific and therefore cannot be ‘harmonised’).
For the ingestions that are built into HARP we have strict rules. This is partly because of special optimisations that we apply when you perform filtering on large datasets (the ingestions are internally quite complex in order to make them fast), but we also need to make sure that the ingestions are exemplary. So some types of quantities will never end up in there.
If this quality filtering approach for O3 were permanent, we might have solved this by hardcoding the filtering within the HARP import function (to be turned on as an ingestion option). But I know that there will be an update of the L2 product at some point that allows filtering based on just the qa_value again (although this may take a while), so we are dealing with a temporary situation.
HARP assumes that all filtering for S5P L2 can be done on the qa_value, which is the original intent of that variable. So we are dealing with an anomalous situation.
The role of HARP is not to solve the temporary problems that are in products. Those should be solved within the products themselves.

There are currently two solutions. You can create your own conversion of the S5P L2 product into the HARP format (performing the filtering yourself) and then continue with that HARP file, using the HARP library to perform whatever HARP operations you need. This is in general a valid approach to support any file format that is not supported as an ingestion by the HARP software itself.
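As a minimal sketch of what such a conversion could look like with the HARP Python interface: the variable names follow the HARP conventions, the dummy arrays stand in for the pixels you have read and filtered yourself (e.g. with netCDF4, applying the ReadMe criteria quoted above), and the output file name is hypothetical.

```python
import harp
import numpy as np

# Dummy values stand in for the pixels you read and filtered yourself.
time = np.array([6.0e8, 6.0e8])  # seconds since 2000-01-01 (HARP epoch)
lat = np.array([52.0, 52.1])
lon = np.array([4.0, 4.1])
toc = np.array([0.12, 0.13])     # total ozone column, mol/m^2

product = harp.Product()
product.datetime = harp.Variable(time, ["time"], unit="seconds since 2000-01-01")
product.latitude = harp.Variable(lat, ["time"], unit="degree_north")
product.longitude = harp.Variable(lon, ["time"], unit="degree_east")
product.O3_column_number_density = harp.Variable(toc, ["time"], unit="mol/m^2")

# Write a HARP-compliant netCDF file that HARP can read back directly.
harp.export_product(product, "s5p_o3_filtered_harp.nc")
```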

The second approach, which is currently more practical, is to patch the qa_value in your O3 products so that it becomes 100 (i.e. 1.0 after scaling) if a pixel passes this custom filter and 0 if it doesn’t (properly taking into account the scale_factor attribute that applies to this variable). You can then ingest the products with HARP again and apply a filter on the validity variable as normal.
It shouldn’t be too difficult to create a Python script that uses the HDF5 (or netCDF4) library to patch the O3 products.
I know that this approach is currently used by Google Earth Engine to work around this issue.
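As a sketch of such a patch script, assuming the netCDF4 Python library, the OFFL thresholds quoted above, and group paths that you should verify against your own files (e.g. with ncdump -h):

```python
import numpy as np
import netCDF4 as nc

def patch_qa_value(filename):
    """Overwrite qa_value with 100 (valid) or 0 (invalid) per the ReadMe filter."""
    with nc.Dataset(filename, "r+") as ds:
        product = ds["PRODUCT"]
        detailed = ds["PRODUCT/SUPPORT_DATA/DETAILED_RESULTS"]

        toc = product["ozone_total_vertical_column"][:]
        teff = detailed["ozone_effective_temperature"][:]
        ring = detailed["ring_scale_factor"][:]
        albedo = detailed["effective_albedo"][:]

        valid = ((toc >= 0.0) & (toc <= 0.45) &
                 (teff >= 180.0) & (teff <= 260.0) &
                 (ring >= 0.0) & (ring <= 0.15) &
                 (albedo >= -0.5) & (albedo <= 1.5))
        # Fill values come back masked; treat them as invalid.
        valid = np.ma.filled(valid, False)

        # netCDF4 applies the scale_factor attribute automatically on write,
        # so assigning 1.0/0.0 stores the raw byte values 100/0.
        product["qa_value"][:] = np.where(valid, 1.0, 0.0)
```

After patching you can ingest as usual and filter with, for example, the operation O3_column_number_density_validity > 50 (the HARP variable that the qa_value maps to according to the ingestion documentation).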

Good morning Sander,

We have used custom IDL scripts to read in the S5P TOCs since Phase E1, as part of the MPC, and we have been filtering as suggested by the algorithm/data providers. We can of course continue to do so. I am just seeing how much faster things can happen with HARP, and I thought we could revamp our TOC extraction routines as well as the routines for all the other species that I have been working on, as you know. We can of course wait for v02, which should hopefully solve the qa_value issue.

Thank you for explaining the difference between algorithm-specific and product-specific variables.

Best wishes,
MariLiza

I would like to request that the following parameter also be included in the ingestion:
‘PRODUCT/SUPPORT_DATA/DETAILED_RESULTS/ozone_ghost_column’

This ghost column is also a very algorithm-specific quantity. It is not integral to using the data (as is the case for, for instance, the a priori and the averaging kernel). We therefore don’t include it within the harmonised set of variables.

I would also like to enquire as to the following:
How can I set the ingestion to read the CRB cloud parameters, such as the cloud fraction? Is the condition OFFL set automatically when HARP reads in an OFFL file, or do I need to state this explicitly?

The OFFL condition just means that you need to read an L2 offline product (as opposed to an NRTI product). The OFFL and NRTI products don’t have the same content, so you may end up with different quantities when ingesting these products with HARP.
In this case you should always get a cloud fraction; only the mapping to the original variable in the product differs between OFFL and NRTI.
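For example, with the HARP Python interface (file names hypothetical; cloud_fraction is the harmonised name listed in the S5P_L2_O3 ingestion documentation):

```python
import harp

# The same call works for both product types; HARP detects NRTI vs OFFL
# from the file itself and maps cloud_fraction to the corresponding
# source variable internally.
offl = harp.import_product("S5P_OFFL_L2__O3_example.nc")
nrti = harp.import_product("S5P_NRTI_L2__O3_example.nc")

print(offl.cloud_fraction.unit, offl.cloud_fraction.data.shape)
print(nrti.cloud_fraction.unit, nrti.cloud_fraction.data.shape)
```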

The approach where you patch the O3 products (and then use HARP for the ingestion) will probably be the most future-proof solution in your case. You would just drop the patch step once the V2 data arrives.
Maybe someone else on the forum can provide/create such a script?

Thanks Sander for getting back to me so promptly, much appreciated. To be honest, this is the first time I am hearing of this patching option, and I am not familiar with who might use it. However, since we need the ozone ghost column for our validation, as well as the other parameters, we might resort to extracting new HARP-compliant netCDF files, a process I am now familiar with. We shall see.

Best wishes
MariLiza