Memory allocation after deleting big waves
ChrLie
According to the Mac Activity Monitor, Igor still occupies 800-900 MB of memory after the waves are killed. Loading a new data set of this size into a 3D wave then results in an "out of memory" error, because the loading procedure contains a Duplicate command. Restarting Igor is the only option. This behaviour seems rather random; sometimes it works, sometimes it does not.
Is there any work-around for this? My goal is to do batch processing on such data sets, which requires killing and loading them sequentially.
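To illustrate, my batch loop looks roughly like this; the symbolic path, file names, and wave names here are placeholders rather than my actual code:

Function BatchProcess(numSets)
	Variable numSets

	Variable i
	for(i = 0; i < numSets; i += 1)
		// load one ~800 MB data set into a 3D wave (placeholder loader)
		LoadWave/O/P=dataPath "set" + num2str(i) + ".ibw"
		Wave/Z raw = $StringFromList(0, S_waveNames)
		// the loading procedure then copies the data; this Duplicate is what fails
		Duplicate/O raw, work
		// ... process work ...
		KillWaves/Z raw, work	// waves are killed, yet the memory is apparently not returned
	endfor
End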
Note that on Mac, a 32-bit application like Igor has access to 4 GB of memory.
We do have a 64-bit Igor, but it (currently) runs only on Windows.
Larry Hutchinson
WaveMetrics
support@WaveMetrics.com
December 13, 2013 at 01:26 pm - Permalink
Yes, true, the 4 GB limit is never exceeded. So far I have had no problems with batch processing of up to 40 data sets, but those waves were smaller than 800 MB.
December 18, 2013 at 07:46 am - Permalink
So, is there a work-around?
A way to force deallocation and reallocation of a certain big memory block?
I am running Mac OS 10.10.2 and Igor 6.36.
I iteratively create, delete, and re-create a 1D integer wave of the same maximal size (ca. 120 kB) using Concatenate.
Is there a way to avoid this excessive memory consumption and reuse the same memory block instead?
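For concreteness, this is the kind of reuse I have in mind, sketched as one function; the buffer name, chunk size, and loop count are placeholders of my own choosing:

Function FillBuffer()
	// one preallocated 32-bit integer buffer, grown in chunks and trimmed once,
	// instead of a Concatenate/KillWaves cycle on every pass
	Make/O/I/N=1024 workBuf
	Variable i, count = 0
	for(i = 0; i < 30000; i += 1)	// stand-in for the real data-producing loop
		if(count >= numpnts(workBuf))
			Redimension/N=(numpnts(workBuf) + 1024) workBuf	// grow occasionally, in chunks
		endif
		workBuf[count] = i
		count += 1
	endfor
	Redimension/N=(count) workBuf	// trim to the final size once at the end
End

That way most iterations write into the existing block, and Redimension touches the allocation only occasionally.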
Thanks.
May 5, 2015 at 05:04 am - Permalink
Larry Hutchinson
WaveMetrics
support@WaveMetrics.com
May 5, 2015 at 01:25 pm - Permalink