I'm currently working with a system in which I want to look at the
effect of an entire set of cuts by creating an output file with a
nested directory structure:
source/
    mode1/ ... modeX/
        cut1/ ... cutX/
            additional selection1/
            additional selection2/
and so on. For the most part this works very nicely, except that for
any given source I have to create the output directories and book the
histograms inside them in advance, and I'm winding up with memory
usage that scales with the number of directories I create. This
becomes a real performance problem once the memory footprint gets
large. My guess is that until the file/directory is closed, the
histograms inside each directory are kept in memory and are taking up
a substantial amount of space.
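Roughly, the booking pattern looks like the sketch below (directory
and histogram names, counts, and binning are just placeholders, not my
real ones). Every histogram booked this way attaches itself to the
current directory and lives in memory until the file is written and
closed, which is why the footprint grows with the number of
directories:

    #include "TFile.h"
    #include "TDirectory.h"
    #include "TH1F.h"
    #include "TString.h"

    void book_example() {
       TFile *out = TFile::Open("source.root", "RECREATE");

       for (int mode = 1; mode <= 5; ++mode) {
          TDirectory *modeDir = out->mkdir(Form("mode%d", mode));
          for (int cut = 1; cut <= 10; ++cut) {
             TDirectory *cutDir = modeDir->mkdir(Form("cut%d", cut));
             cutDir->cd();
             // booked in advance, before the event loop; the histograms
             // attach to the current directory and stay in memory
             new TH1F("h_mass", "invariant mass", 100, 0., 10.);
             new TH1F("h_pt",   "transverse momentum", 100, 0., 50.);
          }
       }

       // ... fill the histograms in the event loop ...

       out->Write();   // contents only go to disk here
       out->Close();
    }
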
Is this a known problem, and is there any solution other than limiting
the number of directories that are "open" at one time?
Thanks,
Amanda