[ROOT] RE:A C++ question...

From: Masaharu Goto (MXJ02154@nifty.ne.jp)
Date: Sat Jul 14 2001 - 07:39:22 MEST


Hello Orjan,

Thank you for your message.

The 200,000-element limit may come from the physical memory
and/or swap space available on your computer. Since it consumes
1.6 Gbytes, it will be very difficult to load everything into
memory at once unless you have a big machine.

Unfortunately, I do not have a direct solution to it now.

Rene,
Do you have any comments?


Thank you
Masaharu Goto



>Date: Wed, 11 Jul 2001 16:33:46 +0200
>From: Orjan Nordhage <nordhage@tsl.uu.se>
>To: MXJ02154@niftyserve.or.jp
>Subject: A C++ question...
>
>Hello!
>
>Short presentation: I'm working at TSL at University of Uppsala in the
>WASA-group, looking at reactions like pp->pp + more. The detector result
>is a so-called ntuple file, which I let a C++ program read. The program
>then produces an outfile, which contains 2 columns of numbers that I
>then plot with ROOT.
>
>However, this works just fine; ROOT is a great tool for this kind of
>thing. But the problem is that the ntuple file is very large, that is,
>it contains very many numbers, and I store them in a vector. That also
>works just fine until I want to read the whole file and store every
>important value, like 1 or 2 million. Unfortunately, C++ has this limit
>of about 200,000 elements for a double vector. Now I wonder, do you
>know how to get around this problem? Can I include some header
><supervectors.h> or something? Or do you have any other suggestion? Of
>course, I can divide the reading into parts and define several vectors,
>but I prefer not to, since the storing process is quite advanced as it
>is.
>
>Yours sincerely
>
>                           / Örjan Nordhage
>



This archive was generated by hypermail 2b29 : Tue Jan 01 2002 - 17:50:52 MET