Here are analysis files for one of our own Arvor floats from the upwelling area in the South Atlantic. It has been very stable since the beginning but has always shown a negative offset at the lower range of our expected mapping uncertainty. In the past I have always acted on the credo 'trust your float' and have not corrected offsets smaller than ±0.01. But since the discussion at the last AST I am worried about the unclosed SSH budget. At the same time, I am worried about overcorrecting and forcing all the float data towards climatology, knowing how imperfect the climatology is.
For this float a deployment CTD with calibrated data exists, and the comparison does not show a need for a negative correction of the float data; if anything, the float is too fresh. I also would not have a good explanation for why the lab calibration should have been so bad that the float measures wrongly from the start with a bias of ~-0.01. I have therefore applied no correction, and since I trust my float I have also left the QC at 1. But I would like to get your opinion on this. Maybe we also need to communicate with the other DM operators about how to deal with situations like this.
I share your concerns about the unclosed SSH budget and about overcorrection of floats. I think I would have done the same as you in such a case, for both the correction and the flags. The correction is uncertain but does not seem to exceed ±0.01, so it seems sensible not to correct the float's salinity and to leave the flag at 1.
The salinity measurements from this float agree with deployment CTD and are stable over its time series. Those are sufficient reasons to call these salinity values good ('1'), with no need for adjustment. The <0.01 offset from ref data is within the range of temporal variability - applying an adjustment here would mean removing that temporal signal, which is something that we've told delayed-mode folks NOT to do since the beginning of the Argo program. I'm sure all delayed-mode folks understand this principle. The non-closure of the SSH budget after 2018 is due to the analysis using real-time data, not delayed-mode data, and in no way implies that we should change any of our delayed-mode principles.
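The reasoning above amounts to a simple decision rule: when the offset to reference data stays within the expected mapping uncertainty, the float agrees with its deployment CTD, and the time series is stable, the data stay unadjusted with QC flag '1'. A minimal sketch of that rule, assuming an illustrative 0.01 threshold and hypothetical function and argument names (this is not part of any official DMQC toolbox):

```python
# Illustrative sketch of the "trust your float" decision rule discussed
# in this thread. The threshold and all names are assumptions for the
# example, not an official Argo DMQC implementation.

MAPPING_UNCERTAINTY = 0.01  # assumed mapping-uncertainty threshold (PSS-78)

def dmqc_decision(offset_to_ref, agrees_with_deployment_ctd, stable_series):
    """Return (salinity_adjustment, qc_flag) for a float's record."""
    if (abs(offset_to_ref) < MAPPING_UNCERTAINTY
            and agrees_with_deployment_ctd
            and stable_series):
        # Offset is within mapping uncertainty and corroborated by the
        # deployment CTD: apply no correction, keep data flagged good.
        return 0.0, '1'
    # Otherwise the record needs closer inspection before adjustment.
    return None, '2'

# The float in question: ~-0.01 offset, consistent with the deployment CTD
adjustment, qc = dmqc_decision(-0.009, True, True)
print(adjustment, qc)  # → 0.0 1
```

The point of keeping the adjustment at zero rather than nudging toward the reference is exactly the one made above: a sub-0.01 offset may be a real temporal signal, and "correcting" it would remove that signal.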
PedroVelez changed the title to "[expert] DMQC and enclosed SSH budget" on Jun 7, 2021.