New book about programming Igor
MSchmid
I got the feeling that many people actually had problems with programming Igor and so I decided to write a little book about it.
It is now available on Amazon.
I tried to write the book from a more general perspective (I used C/C++ and Python in the past) and discussed topics like code encapsulation, graphical user interfaces, and regular expressions at a beginner's level (some people are completely unaware that Igor even supports regular expressions and similar concepts).
And several new (and nice) features of Igor 7 are mentioned as well.
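For instance, here is a minimal sketch of what the regular expression support looks like (GrepString and SplitString are built in; the function name and the file name pattern are just made up for illustration):

Function DemoRegEx()
	String fileName = "scan_0042_sampleA.dat" // made-up example input
	String runStr, sampleStr
	// pull the run number and the sample name out with capture groups
	SplitString /E="^scan_([0-9]+)_([A-Za-z0-9]+)\\.dat$" fileName, runStr, sampleStr
	if (GrepString(fileName, "^scan_"))
		Print "run:", runStr, "sample:", sampleStr
	endif
End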
May 10, 2018 at 08:03 am - Permalink
--
J. J. Weimer
Chemistry / Chemical & Materials Engineering, UAH
May 10, 2018 at 10:41 am - Permalink
I too have ordered a copy and look forward to it.
Maybe we should form a book club ;)
Andy
May 10, 2018 at 01:32 pm - Permalink
May 11, 2018 at 02:29 am - Permalink
@MSchmid: I've sent my longer feedback to the email provided in the book.
May 11, 2018 at 07:27 am - Permalink
May 12, 2018 at 01:57 am - Permalink
I got the book and read it over the weekend. Very nice. I even learned/appreciated a thing or two.
I will provide some additional feedback directly.
I think there is an opportunity for a discussion about how to teach getting the most out of Igor Pro when the target users are most likely scientists/engineers who need a tool to get things done in their day jobs. This is in contrast to programmers, who are the primary audience for some other programs/languages.
The hybrid interface is a unique feature that I use often for ad hoc analysis. I also find myself creating a user interface, even just for myself, when exploring data, something Igor Pro again does very well. So I think there may be common workflows that could serve as a teaching method to bring new users up the learning curve faster.
Andy
May 13, 2018 at 07:52 pm - Permalink
I updated the book when Igor 8 was released and included another example that shows how to do a curve fit with a neural network. Because I think neural networks are awesome, I list this example below so that everyone with an older version of the book can see it as well.
Run this module with the calls below.
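A sketch of the invocation, assuming the #pragma ModuleName=NeuNet declaration shown at the top of the listing (the comments in run() use that name):

	NeuNet#teach(150) // 150 = number of training sets, see below
	NeuNet#run()
	NeuNet#show()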
The parameter 150 is the number of training sets for the NN; that value worked best in my tests. Note that getting the number of training sets and the learning rate right can be a bit finicky. Play around yourself.
#pragma rtGlobals=3
#pragma ModuleName=NeuNet

static constant N = 70 // points per curve
static constant low = 0
static constant high = 1
static constant width = 0.01
static function teach(M)
	variable M // number of parameter sets for training

	variable i
	make /o/d/n=(M,2) TrainingParameters
	wave par = TrainingParameters
	// for simplicity, use random parameters in a reasonable range
	// first column: amplitude
	// second column: position
	// then, each row contains a full parameter set
	par[][0] = 0.1 + 0.8*(0.5 + enoise(0.5)) // [0.1 ; 0.9]
	par[][1] = 0.5 + enoise(0.45)            // [0.05 ; 0.95]
	// generate the curves of the training parameters
	make /o/d/n=(M,N) TrainingCurves
	wave tc = TrainingCurves
	SetScale /I y, low, high, tc // note the normalization to [0,1]
	// store them in rows, not in columns
	for (i = 0; i < M; i += 1)
		tc[i][] = par[i][0]*exp(-(y - par[i][1])^2/width)
	endfor
	// now the neural network will learn the
	// connection between parameters and curve shape
	NeuralNetworkTrain nhidden=50, input=tc, output=par
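	// note: the learning rate mentioned in the text is left at its default
	// here; if training does not converge, NeuralNetworkTrain also accepts
	// tuning keywords (Iterations, LearningRate, and Momentum, to the best
	// of my knowledge; check the operation's documentation)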
	// the result of this learning process will be saved in two
	// waves, M_weights1 and M_weights2
	// these waves contain all the necessary information for running the network
end
static function run()
	// ------------
	// make an arbitrary test curve
	// (the number of points has to be the same as in the training set!)
	make /o/d/n=(N) sampleWave
	wave sW = sampleWave
	SetScale /I x, low, high, sW
	// neural networks are better at interpolating than extrapolating:
	// use smaller ranges than in the training set
	variable randomHeight = 0.2 + 0.6*(0.5 + enoise(0.5)) // [0.2 ; 0.8]
	variable randomLoc = 0.5 + enoise(0.25)               // [0.25 ; 0.75]
	sW = randomHeight*exp(-(x - randomLoc)^2/width)
	sW += gnoise(0.01)
	// ------------
	// make references to the output waves of the training session
	wave W1 = M_weights1
	wave W2 = M_weights2
	// run the neural network
	NeuralNetworkRun input=sW, weightsWave1=W1, weightsWave2=W2
	// ------------
	// draw the result
	// the wave W_NNResults is automatically created by the neural network
	wave NNRes = W_NNResults
	make /D/O/N=(N) NNCurve
	wave NNC = NNCurve
	SetScale /I x, low, high, NNC
	NNC = NNRes[0]*exp(-(x - NNRes[1])^2/width)
end
static function show()
	// call this function only after NeuNet#run() was active at least once
	// so all waves are actually there
	wave sW = sampleWave
	wave NNC = NNCurve
	Display sW
	AppendToGraph NNC
end
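As a quick sanity check (not part of the book example), the NN result can be compared against Igor's built-in Gaussian fit; note that the gauss coefficients y0, A, x0, width do not map one-to-one onto the two NN parameters above:

static function check()
	wave sW = sampleWave
	// conventional fit for comparison: y0 + A*exp(-((x-x0)/width)^2)
	CurveFit /Q gauss, sW
	wave W_coef // coefficient wave created by CurveFit
	print W_coef
end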
June 1, 2018 at 06:06 am - Permalink
Just bought one for the lab; that's going to help new users.
July 30, 2018 at 07:31 pm - Permalink