Replies: 6 comments
-
Seems like this is effectively choosing a different error weighting. Perhaps things will fit better if we account for errors in general, if we are not already.
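For what it's worth, a minimal numpy sketch (the arrays are made up for illustration; this is not SasView code) of why the proposal is literally a re-weighting: multiplying the residual by qⁿ gives the same least-squares problem as dividing dI by qⁿ.

```python
# Illustrative sketch only: scaling residuals by q^n is the same fit problem
# as dividing the reported uncertainties dI by q^n.
import numpy as np

q = np.logspace(-2, 0, 50)                      # hypothetical Q grid
I_expt = 1.0 / (1.0 + (q / 0.05) ** 4) + 1e-3   # made-up "measured" curve
I_model = 1.0 / (1.0 + (q / 0.06) ** 4) + 1e-3  # made-up model curve
dI = 0.05 * np.sqrt(I_expt)                     # made-up uncertainties
n = 4

res_scaled = (I_expt - I_model) * q**n / dI          # residuals scaled by q^n
res_reweighted = (I_expt - I_model) / (dI / q**n)    # same thing, seen as a modified dI

assert np.allclose(res_scaled, res_reweighted)
```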
-
I think there are a couple of issues here, and in fact this was starting to be addressed by @Caddy-Jones from ILL, but we decided there was a lot more to it. In discussions at SAS2022 this might be a project that is continued by the next person? But the discussion, I think, is somewhere else in the tickets and/or a PR.
-
Any reweighting makes the parameter uncertainties suspect. It would be better to incorporate the systematics into the model than to diddle the weights on the measurement. Particularly with the high statistics and larger Q range on the X-ray sources, small variations in the resolution function or fine-grained details of the sample may distort the fit.

It may be that the Gaussian resolution model is completely inappropriate. For one of our reflectivity instruments we are combining Q values where the variation in Q within each bin is much larger than the δQ on the individual points; for that data a square resolution function worked well. Similarly, when integrating around a ring on a 2D detector for a SAXS machine, you may find that the variation in Q across the combined pixels is higher than the δQ of each pixel.

Before suppressing the low Q data with a Qⁿ scaling, it would be worth checking whether a better resolution model can fit the data. You may need to improve the resolution integral, including theory values between the nominal Q values, to properly compute the resolution. You may also have systematic effects such as detector tilt which will increase the variation in true Q values between the pixels, leading to a broader ΔQ.

Using bumps directly from sasmodels you should be able to test the effect of an improved resolution function pretty quickly by adding a new class similar to the existing resolution classes. Localized twisting and stretching will lead to small variations in the resulting (Q, ΔQ) for each bin, with only short-range correlation.
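As a rough, self-contained illustration of the square-resolution idea (this is not the sasmodels resolution API; the class name, arguments and test curve are all made up), something along these lines could be used to compare a box-car smearing against the default Gaussian:

```python
# Sketch of a square (uniform) resolution: each measured Q bin averages the
# theory over q ± half_width instead of convolving with a Gaussian of width dq.
import numpy as np

class SquareResolution1D:
    """Box-car resolution smearing on a fine theory grid (illustrative only)."""
    def __init__(self, q, half_width, q_calc):
        self.q = np.asarray(q, dtype=float)                # measured Q values
        self.half_width = np.asarray(half_width, dtype=float)
        self.q_calc = np.asarray(q_calc, dtype=float)      # fine grid for the theory
        # weight matrix: row i gives equal weight to theory points inside q_i ± w_i
        inside = np.abs(self.q_calc[None, :] - self.q[:, None]) <= self.half_width[:, None]
        counts = inside.sum(axis=1, keepdims=True)
        self.weight_matrix = inside / np.maximum(counts, 1)

    def apply(self, theory):
        # smear a theory curve evaluated on q_calc down to the measured q grid
        return self.weight_matrix @ np.asarray(theory, dtype=float)

# usage with a made-up theory curve standing in for a real model evaluation
q = np.linspace(0.01, 0.5, 20)
half_width = np.full_like(q, 0.02)
q_calc = np.linspace(0.005, 0.55, 500)
theory = np.sinc(q_calc / 0.1) ** 2
I_smeared = SquareResolution1D(q, half_width, q_calc).apply(theory)
```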
-
I disagree with this suggestion. It would effectively mean changing the errors in the data. We measure means and standard deviations when doing an experiment, and neither of these should be changed. If you have a model that only fits the high-q part, it is of course perfectly valid to fit only that part of the data, and to state so. But a reweighting as suggested here is, in my opinion, misleading and amounts to changing the data (namely the measured standard deviations).
-
It seems there is strong agreement that this is a bad idea.
-
I think it is more complicated than that, and it would be appropriate to bring defenders of the practice into the discussion in order to better understand it and get a more balanced view. To be clear, this is not an uncommon practice.

In the meantime, and despite my rather deep skepticism of such suggestions in the past, I've been trying to think more carefully about why this practice is not uncommon in the broader community. I'm guessing it must have originated before data routinely contained errors on I. In that situation only the low Q data will be fit, which can certainly be mitigated by this method. However, in view of our discussion on weighting to match SAXS and SANS, if the reported uncertainties are tiny compared to the true ones, then again I believe the low Q data will dominate the fit? In either case, I believe this will only be a significant problem if the model does not perfectly capture the true scatterer .. but that is generally the case with analytical models.

So I guess there may be some legitimate needs. In those cases, though, at the very least one should not claim an uncertainty on the parameters? But again, I think we need to bring some strong defenders of this approach into the discussion to get a fuller picture.
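To make the "low Q dominates" point concrete, here is a rough numpy sketch with made-up Porod-like data and no reported uncertainties; the numbers are illustrative only:

```python
# Illustration only: for a steeply decaying I(q) fit without weights, the squared
# residuals of the first few points dwarf everything else; a q^4 factor rebalances them.
import numpy as np

q = np.logspace(-2, 0, 100)
I_data = q ** -4                      # Porod-like decay, no uncertainties reported
I_model = 1.05 * q ** -4              # model off by 5% everywhere
res2 = (I_data - I_model) ** 2        # unweighted squared residuals
res2_q4 = ((I_data - I_model) * q**4) ** 2

low_q = q < 0.1
print("unweighted: fraction of chi^2 from q < 0.1:", res2[low_q].sum() / res2.sum())
print("q^4 scaled: fraction of chi^2 from q < 0.1:", res2_q4[low_q].sum() / res2_q4.sum())
```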
-
Enhancement suggestion.
Olivier Tache (of pySAXS) suggests that SasView should also permit minimisation of (expt-data)*q^n vs q where n=[2,3,4], and not just (expt-data) vs q. The idea is to improve fitting at high-Q for those data that need it. See attached figures.
Discussion with @butlerpd suggests this could perhaps be implemented as an alternative alongside the present dI weighting schemes, possibly making more physical sense than options like sqrt(I) or 1/I?
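As a very rough sketch of how the proposal sits next to dI-style weighting (the scheme names and their exact semantics here are illustrative, not SasView's actual options):

```python
# Hypothetical residual definitions, shown only to compare the q^n proposal
# with dI-style weighting schemes; this is not SasView code.
import numpy as np

def residuals(q, I_expt, I_model, dI=None, scheme="dI", n=4):
    diff = I_expt - I_model
    if scheme == "dI":            # weight by the reported uncertainties
        return diff / dI
    if scheme == "sqrt(I)":       # one reading of a sqrt(I) option: dI ~ sqrt(I)
        return diff / np.sqrt(np.abs(I_expt))
    if scheme == "1/I":           # one reading of a 1/I option: dI ~ I
        return diff / np.abs(I_expt)
    if scheme == "q^n":           # the proposal: emphasise high Q by a q^n factor
        return diff * q**n
    raise ValueError(f"unknown scheme {scheme!r}")
```

In this picture the qⁿ option is just another way of constructing the effective uncertainty, which is why the concerns above about parameter uncertainties apply to it in the same way as to the sqrt(I) and 1/I style options.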