I am trying to do a convergence test by averaging 10 randomly selected waves from a set of Raman spectra and plotting the standard error, then repeating this with another set of 10 waves and plotting the standard error again, until enough waves have been included for the standard error to converge. Is there a built-in procedure, or an alternative way of doing this?
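In case it helps clarify what I mean, here is a rough sketch of the idea in Python/NumPy rather than Igor code. The data are simulated stand-ins (the real spectra would replace the random `spectra` array), and the function name and batch size are just my own choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the real data set: 200 spectra, 1024 points each.
spectra = rng.normal(loc=1.0, scale=0.2, size=(200, 1024))

def convergence_test(spectra, batch_size=10, rng=None):
    """Cumulatively add random batches of spectra to a running average and
    record the standard error of the mean after each batch."""
    rng = rng or np.random.default_rng()
    order = rng.permutation(len(spectra))  # random selection without replacement
    n_vals, se_vals = [], []
    for n in range(batch_size, len(spectra) + 1, batch_size):
        subset = spectra[order[:n]]
        # Standard error of the mean at each wavenumber, then averaged
        # over the spectrum to get a single scalar to plot against n.
        se = (subset.std(axis=0, ddof=1) / np.sqrt(n)).mean()
        n_vals.append(n)
        se_vals.append(se)
    return np.array(n_vals), np.array(se_vals)

n_vals, se_vals = convergence_test(spectra, batch_size=10, rng=rng)
# Plotting se_vals against n_vals should show the standard error
# shrinking roughly like 1/sqrt(n) as more spectra are averaged.
```

The convergence criterion itself (e.g. stopping once the standard error changes by less than some tolerance between batches) would still need to be chosen.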
Can you please explain a little more about what you are trying to do? If you already have the many Raman spectra loaded, then it is relatively straightforward to write some code to select 10 of them at random and average them. But I do not understand what you mean by plotting the standard error of this bunch of spectra.

Have you got some code already written that could act as a starting point?
Regards,
Kurt
January 25, 2018 at 03:54 am