Minimizing sum of differences
puglie12
I have two datasets, one of measured data and one of modeled data, both corresponding to the same time wave. I need to minimize the sum of their absolute differences, i.e.
min_λ Σ_i |λ_i · x(t_i) − y(t_i)|,
where x(t) holds the modeled values and y(t) the measured values. I need to find the values of lambda (λ) that minimize this expression. So far all I can do is generate random values for lambda, evaluate the summation, and find which lambda corresponds to the smallest sum, but I know this is not correct.
Function MinimizeSum(modelled, measured)
	Wave modelled, measured
	Variable i

	// One random trial value of lambda per point
	Make/O/N=(numpnts(modelled)) lambda
	lambda = gnoise(1)

	// Per-point absolute difference between the scaled model and the measurement
	Duplicate/O modelled, diff
	diff = NaN
	For (i = 0; i < numpnts(modelled); i += 1)
		diff[i] = abs(lambda[i]*modelled[i] - measured[i])
	EndFor

	// Total the absolute differences once the loop has filled diff
	Variable total = sum(diff)
	Print "sum of |diff| =", total
End
How do I properly calculate values of lambda to minimize this expression?
Thank you for any help,
Stephanie
To learn more, execute this Igor command:
DisplayHelpTopic "Finding Minima and Maxima of Functions"
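For illustration (this sketch is not from the reply above): that help topic documents Igor's Optimize operation, which could do the search here, assuming you want a single scalar λ rather than one per point. The names SumAbsDiff, FindLambda, and dummyParams below are made up for the example, the waves modelled and measured are assumed to exist in the current data folder, and the search bracket [-10, 10] is arbitrary:

Function SumAbsDiff(w, lam)
	Wave w			// parameter wave that Optimize requires; unused here
	Variable lam	// trial value of lambda

	Wave modelled, measured				// assumed global waves
	Duplicate/FREE modelled, d
	d = abs(lam*modelled - measured)	// per-point absolute difference
	return sum(d)						// objective: sum of |λ·x − y|
End

Function FindLambda()
	Make/O/N=1 dummyParams				// placeholder parameter wave
	Optimize/Q/L=-10/H=10 SumAbsDiff, dummyParams	// assumed bracket for λ
	Print "lambda =", V_minloc, "  sum of |diff| =", V_min
End

After Optimize returns, V_minloc holds the λ at the minimum and V_min the corresponding sum of absolute differences. For a single scalar λ the objective is piecewise linear and convex, so a bracketing search like this converges; widen the bracket if the minimum might lie outside it.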
John Weeks
WaveMetrics, Inc.
support@wavemetrics.com
October 31, 2014 at 10:25 am