Re: Manage a ROOT file from a streaming of the ROOT file content

From: <donal0412_at_gmail.com>
Date: Thu, 26 Apr 2012 17:33:36 +0800


Hi Philippe,
I'm trying to use DFS_Fuse to access data in HDFS from some ROOT-based applications.
I always get an error like
 'SysError in <TFile::WriteBuffer>: error writing to file /hdfs/... (-1) (Operation not supported)'

I guess that's because TFile does random writes, and HDFS does not support random writes.
Am I right?
Is that also the reason why the HDFS plugin is readonly?
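
To illustrate the access pattern I mean (a minimal sketch only; the paths are
placeholders): reading through the FUSE mount works fine, while writing a ROOT
file there fails, so any ROOT output has to go to a local filesystem instead.

   // Sketch: read an existing ROOT file through the HDFS FUSE mount (read-only)
   // and write all ROOT output to local disk, where TFile's random writes work.
   #include "TFile.h"

   void readFromHdfsWriteLocally()
   {
      // Placeholder input path on the FUSE mount.
      TFile *in = TFile::Open("/hdfs/path/to/input.root", "READ");
      if (!in || in->IsZombie()) return;

      // Output goes to local disk; writing this file under /hdfs/... is what fails.
      TFile *out = TFile::Open("/tmp/output.root", "RECREATE");
      // ... read trees/histograms from 'in' and write results to 'out' ...
      out->Write();
      out->Close();
      in->Close();
   }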

Thanks!

2012/4/23 Massimiliano Fasi <Massimiliano.Fasi_at_pg.infn.it>

> Thank you, that answered my question about the URL pattern exactly. I changed it
> (and moved the ROOT file I wanted to read into HDFS's root directory), but
> things still went wrong, and I got the same error:
>
> SysError in <THDFSFile::THDFSFile>: Unable to open file
> hdfs:///mySearchTreeFile_68_2_Ewz.root in HDFS (No such file or
> directory)
> SysError in <THDFSFile::THDFSFile>: file hdfs:///mySearchTreeFile_68_2_Ewz.root
> can not be opened for reading (No such file or directory)
>
>
> *** Break *** segmentation violation
> Generating stack trace...
> [ stack trace follows ]
>
> The cluster seems to be configured correctly, and I'm sure the file exists: I can
> verify its existence with Hadoop applications and open it via FUSE. Any guesses?
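>
> For the record, the two URL forms I have tried look like this (a sketch only,
> using the host, port and file names already mentioned in this thread):
>
>    // Form 1: fully qualified URL with the namenode host and port
>    // (the value of fs.default.name in the Hadoop configuration).
>    TFile *f1 = TFile::Open("hdfs://hydra1:54310/user/fasi/testROOT/mySearchTreeFile_68_2_Ewz.root");
>
>    // Form 2: default filesystem, with the file placed in HDFS' root directory.
>    TFile *f2 = TFile::Open("hdfs:///mySearchTreeFile_68_2_Ewz.root");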
>
>
> Cheers,
>
> Massimiliano
>
> Fons Rademakers <Fons.Rademakers_at_cern.ch> wrote:
>
>> [added Brian, the author of this plugin, in cc]
>>
>> Have a look at the doc here:
>>
>> http://root.cern.ch/lxr/source/io/hdfs/src/THDFSFile.cxx
>>
>> Cheers, Fons.
>>
>>
>> On 23/04/2012 10:15, Massimiliano Fasi wrote:
>>
>>> Hi Fons,
>>>
>>> you were right. I rebuilt the ROOT libraries with the HDFS plugin, and now ROOT is
>>> able to find the HDFS library, but the application still doesn't work. I get this
>>> error:
>>>
>>> SysError in <THDFSFile::THDFSFile>: Unable to open file
>>> hdfs://hydra1:54310/user/fasi/testROOT/mySearchTreeFile_68_2_Ewz.root in
>>> HDFS (No such file or directory)
>>> SysError in <THDFSFile::THDFSFile>: file
>>> hdfs://hydra1:54310/user/fasi/testROOT/mySearchTreeFile_68_2_Ewz.root can
>>> not be opened for reading (No such file or directory)
>>>
>>> *** Break *** segmentation violation
>>> Generating stack trace...
>>> [ stack trace follows ]
>>>
>>> But I'm sure the file /user/fasi/testROOT/mySearchTreeFile_68_2_Ewz.root
>>> exists in HDFS. In your opinion, is there a mistake in the path string or
>>> somewhere else?
>>>
>>> Thank you,
>>>
>>> Massimiliano
>>>
>>> Fons Rademakers <Fons.Rademakers_at_cern.ch> wrote:
>>>
>>>> Of course you have to make sure the HDFS plugin is built. It is not part of
>>>> the standard binary version. Get the source, make sure all the
>>>> prerequisite libraries for HDFS support are installed, and do
>>>>
>>>> ./configure
>>>> make
>>>>
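>>>> Once the plugin is built, one quick way to check that a handler for hdfs://
>>>> URLs is actually registered is to ask the plugin manager from a ROOT
>>>> session; a minimal sketch (the URL below is just a placeholder):
>>>>
>>>>    // Sketch: verify that a TFile plugin handler exists for hdfs:// URLs
>>>>    // and that its library can be loaded.
>>>>    #include "TROOT.h"
>>>>    #include "TPluginManager.h"
>>>>    #include "Riostream.h"
>>>>
>>>>    void checkHdfsHandler()
>>>>    {
>>>>       TPluginHandler *h =
>>>>          gROOT->GetPluginManager()->FindHandler("TFile", "hdfs://host:port/file.root");
>>>>       if (h && h->LoadPlugin() == 0)
>>>>          std::cout << "HDFS plugin is available" << std::endl;
>>>>       else
>>>>          std::cout << "HDFS plugin not found; rebuild ROOT with HDFS support" << std::endl;
>>>>    }
>>>>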
>>>> Cheers, Fons.
>>>>
>>>> On 22 Apr 2012, at 09:28, Massimiliano Fasi
>>>> <Massimiliano.Fasi_at_pg.infn.it> wrote:
>>>>
>>>>> Hi Fons,
>>>>>
>>>>> and thank you for your explanation. I tried to use TFile::Open(), and
>>>>> the application now behaves differently. Indeed, I now get this
>>>>> error:
>>>>>
>>>>> Error in <TUnixSystem::DynamicPathName>: HDFS[.so | .dll | .dylib | .sl
>>>>> | .dl | .a] does not exist in
>>>>> :/storage/root/root/lib/root:/storage/root/root/lib/root:
>>>>> /storage/root/root/lib/root:/storage/root/root/lib/root:
>>>>> /storage/root/root/lib/root:/storage/root/root/lib/root:
>>>>> /storage/root/root/lib/root:/storage/root/root/lib/root:
>>>>> /storage/root/root/lib/root:/storage/root/root/lib/root:
>>>>> /storage/root/root/lib/root:/storage/root/root/lib/root:
>>>>> /storage/root/root/lib/root:/storage/root/root/lib/root:
>>>>> /storage/root/root/lib/root:/storage/root/root/lib/root:
>>>>> /storage/root/root/lib/root:.:/storage/root/root/lib/root::
>>>>> /storage/root/root/lib/root/cint/cint/stl
>>>>>
>>>>>
>>>>> [where /storage/root/root is the ROOT installation directory]
>>>>>
>>>>> It seems that the program now knows it has to open a file stored in
>>>>> HDFS, even though it can't do so successfully. Any hint on how to fix
>>>>> this new issue?
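>>>>>
>>>>> For what it's worth, the same failure can be reproduced by trying to load the
>>>>> plugin library by hand; a minimal sketch (libHDFS is the library name implied
>>>>> by the HDFS[.so | ...] pattern in the error above):
>>>>>
>>>>>    // Sketch: check whether ROOT can locate and load the HDFS plugin library.
>>>>>    // gSystem->Load() returns 0 or 1 on success and -1 if the library is not
>>>>>    // found anywhere in the dynamic load path printed in the error message.
>>>>>    #include "TSystem.h"
>>>>>    #include "Riostream.h"
>>>>>
>>>>>    void checkLibHDFS()
>>>>>    {
>>>>>       if (gSystem->Load("libHDFS") < 0)
>>>>>          std::cout << "libHDFS not found: ROOT was built without HDFS support" << std::endl;
>>>>>       else
>>>>>          std::cout << "libHDFS loaded successfully" << std::endl;
>>>>>    }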
>>>>>
>>>>> Cheers,
>>>>> Massimiliano
>>>>>
>>>>> Fons Rademakers <Fons.Rademakers_at_cern.ch> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> to make any I/O plugin work you have to open the files via TFile::Open(),
>>>>>> like
>>>>>>
>>>>>> TFile* fileInput = TFile::Open(line.c_str());
>>>>>>
>>>>>> this static method will load the HDFS plugin, triggered by the hdfs:// prefix,
>>>>>> and will return a TFile-derived THDFSFile object. The way you were doing it,
>>>>>> you were getting a standard local TFile object that was trying to open
>>>>>> a local file.
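>>>>>>
>>>>>> To make that concrete, a minimal sketch (the URL is a placeholder; adapt the
>>>>>> host, port and path to your cluster): the returned pointer should be checked
>>>>>> before use, and its class name shows which plugin was actually selected.
>>>>>>
>>>>>>    // Sketch: open a file through ROOT's plugin mechanism; an hdfs:// URL
>>>>>>    // should yield a THDFSFile, while a plain path yields a local TFile.
>>>>>>    #include "TFile.h"
>>>>>>    #include "Riostream.h"
>>>>>>
>>>>>>    void openViaPlugin()
>>>>>>    {
>>>>>>       TFile *fileInput = TFile::Open("hdfs://namenode:54310/path/to/file.root");
>>>>>>       if (!fileInput || fileInput->IsZombie()) {
>>>>>>          std::cerr << "could not open file" << std::endl;
>>>>>>          return;
>>>>>>       }
>>>>>>       std::cout << fileInput->ClassName() << std::endl; // expected: "THDFSFile"
>>>>>>       fileInput->Close();
>>>>>>       delete fileInput;
>>>>>>    }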
>>>>>>
>>>>>> Let me know if you've more success with TFile::Open().
>>>>>>
>>>>>> Cheers, Fons.
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>>
>>>
>> --
>> Org: CERN, European Laboratory for Particle Physics.
>> Mail: 1211 Geneve 23, Switzerland
>> E-Mail: Fons.Rademakers_at_cern.ch Phone: +41 22 7679248
>> WWW: http://fons.rademakers.org Fax: +41 22 7669640
>>
>>
>>
>
>
>
>
>
Received on Thu Apr 26 2012 - 11:33:44 CEST
