I was wondering why the optical data requires standardisation, and therefore the additional computation of the mean and std. This step could be avoided if the optical data is radiometrically corrected: for Sentinel-2, dividing the digital numbers by 10000 converts them to reflectance, which by default lies in the range 0-1.
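For reference, this is the kind of conversion I mean (a minimal sketch; the function name is my own, and the optional offset handling is an assumption based on the newer Sentinel-2 processing baselines, not anything from the Clay codebase):

```python
import numpy as np

def s2_dn_to_reflectance(dn: np.ndarray, boa_add_offset: float = 0.0) -> np.ndarray:
    """Convert Sentinel-2 digital numbers (DNs) to reflectance.

    The standard quantification value is 10000, so reflectance = DN / 10000,
    which nominally falls in [0, 1]. Products with processing baseline
    >= 04.00 additionally carry a BOA_ADD_OFFSET of -1000 that should be
    applied before scaling (assumption: the caller passes it explicitly).
    """
    return (dn.astype(np.float32) + boa_add_offset) / 10000.0
```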
Moreover, if the data is standardised, one usually needs a running mean/std computation over the batches, which, if I understand correctly, is an approximation of the actual mean/std. Unless I am missing something, it makes more sense to use reflectance?
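To be concrete, by a running computation I mean something like this streaming per-band estimator (a sketch using Chan's batched variant of Welford's algorithm; the class name and array shapes are my own assumptions):

```python
import numpy as np

class RunningStats:
    """Accumulate per-band mean/std across batches without holding
    the whole dataset in memory (Chan's parallel/batched update)."""

    def __init__(self, n_bands: int):
        self.count = 0
        self.mean = np.zeros(n_bands, dtype=np.float64)
        self.m2 = np.zeros(n_bands, dtype=np.float64)  # sum of squared deviations

    def update(self, batch: np.ndarray) -> None:
        # batch shape: (n_samples, n_bands); flatten image tiles beforehand.
        n_b = batch.shape[0]
        mean_b = batch.mean(axis=0)
        m2_b = ((batch - mean_b) ** 2).sum(axis=0)
        delta = mean_b - self.mean
        total = self.count + n_b
        self.mean += delta * n_b / total
        self.m2 += m2_b + delta**2 * self.count * n_b / total
        self.count = total

    @property
    def std(self) -> np.ndarray:
        return np.sqrt(self.m2 / self.count)
```

(As far as I can tell, the update itself is exact over the batches it sees; the approximation I am referring to comes from computing the statistics on only a sample of the archive rather than on all the data.)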
Related to this, if one had to fine-tune Clay on reflectance data instead of standardised data, would you foresee any issues?