Re: Manage a ROOT file from a streaming of the ROOT file content

From: Hassen Riahi <hassen.riahi@pg.infn.it>
Date: Thu, 3 May 2012 15:17:03 +0200


Hi Brian,

Sorry for the delay in answering.

> Hi Hassen,
>
> In case you have some time to contribute a patch this week...

Fine with me. I am working on it.

Hassen

>
> I would observe that many folks run into issues with CLASSPATH, but
> building it is mostly a function of determining the correct values
> for $HADOOP_HOME and $HADOOP_CONF_DIR, then bootstrapping $CLASSPATH
> by iterating through $HADOOP_HOME/lib.
> Each of these variables has a fairly sane default that could be used
> (as the default is not a 100% reliable value, it should be
> overridable via ROOT's GetEnv/SetEnv).
>
> The first time a THDFSFile or THDFSSystem object is initialized, we
> could have a helper method to set $CLASSPATH if none is already
> present.
>
> This would really help beginners get started.
>
> Brian
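
A minimal standalone sketch of the bootstrap Brian describes (a
hypothetical helper, not existing ROOT code; the helper name and the
fallback install path are assumptions taken from this thread):

   // Sketch: populate $CLASSPATH from $HADOOP_CONF_DIR and the jars in
   // $HADOOP_HOME/lib, only if no CLASSPATH is already present.
   #include <cstdlib>
   #include <cstring>
   #include <string>
   #include <dirent.h>

   static void BootstrapClasspath() {
      if (getenv("CLASSPATH")) return;            // respect an existing value
      const char *home = getenv("HADOOP_HOME");
      if (!home) home = "/usr/lib/hadoop-0.20";   // assumed default location
      const char *conf = getenv("HADOOP_CONF_DIR");
      std::string cp = conf ? std::string(conf) : std::string(home) + "/conf";
      std::string libdir = std::string(home) + "/lib";
      if (DIR *d = opendir(libdir.c_str())) {
         while (dirent *e = readdir(d)) {
            const char *ext = strrchr(e->d_name, '.');
            if (ext && !strcmp(ext, ".jar"))      // pick up every jar in lib/
               cp += ":" + libdir + "/" + e->d_name;
         }
         closedir(d);
      }
      setenv("CLASSPATH", cp.c_str(), 0);
   }
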
>
> On Apr 30, 2012, at 5:00 AM, Hassen Riahi wrote:
>
>> Hi all,
>>
>> The problem is fixed now and the plugin seems to work. The
>> behavior was caused by exporting a wrong CLASSPATH (as Brian
>> suspected).
>> As said before, we were exporting the following CLASSPATH (shown
>> here as one colon-separated entry per line):
>>
>> /usr/lib/hadoop-0.20/lib/ant-contrib-1.0b3.jar
>> /usr/lib/hadoop-0.20/lib/aspectjrt-1.6.5.jar
>> /usr/lib/hadoop-0.20/lib/aspectjtools-1.6.5.jar
>> /usr/lib/hadoop-0.20/lib/commons-cli-1.2.jar
>> /usr/lib/hadoop-0.20/lib/commons-codec-1.4.jar
>> /usr/lib/hadoop-0.20/lib/commons-daemon-1.0.1.jar
>> /usr/lib/hadoop-0.20/lib/commons-el-1.0.jar
>> /usr/lib/hadoop-0.20/lib/commons-httpclient-3.1.jar
>> /usr/lib/hadoop-0.20/lib/commons-lang-2.4.jar
>> /usr/lib/hadoop-0.20/lib/commons-logging-1.0.4.jar
>> /usr/lib/hadoop-0.20/lib/commons-logging-api-1.0.4.jar
>> /usr/lib/hadoop-0.20/lib/commons-net-1.4.1.jar
>> /usr/lib/hadoop-0.20/lib/core-3.1.1.jar
>> /usr/lib/hadoop-0.20/lib/guava-r09-jar
>> /usr/lib/hadoop-0.20/lib/hadoop-fairscheduler-0.20.2-cdh3u3.jar
>> /usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.jar
>> /usr/lib/hadoop-0.20/lib/jackson-core-asl-1.5.2.jar
>> /usr/lib/hadoop-0.20/lib/jackson-mapper-asl-1.5.2.jar
>> /usr/lib/hadoop-0.20/lib/jasper-compiler-5.5.12.jar
>> /usr/lib/hadoop-0.20/lib/jasper-runtime-5.5.12.jar
>> /usr/lib/hadoop-0.20/lib/jets3t-0.6.1.jar
>> /usr/lib/hadoop-0.20/lib/jetty-6.1.26.cloudera.1.jar
>> /usr/lib/hadoop-0.20/lib/jetty-servlet-tester-6.1.26.cloudera.1.jar
>> /usr/lib/hadoop-0.20/lib/jetty-util-6.1.26.cloudera.1.jar
>> /usr/lib/hadoop-0.20/lib/jsch-0.1.42.jar
>> /usr/lib/hadoop-0.20/lib/junit-4.5.jar
>> /usr/lib/hadoop-0.20/lib/kfs-0.2.2.jar
>> /usr/lib/hadoop-0.20/lib/log4j-1.2.15.jar
>> /usr/lib/hadoop-0.20/lib/mockito-all-1.8.2.jar
>> /usr/lib/hadoop-0.20/lib/oro-2.0.8.jar
>> /usr/lib/hadoop-0.20/lib/servlet-api-2.5-20081211.jar
>> /usr/lib/hadoop-0.20/lib/servlet-api-2.5-6.1.14.jar
>> /usr/lib/hadoop-0.20/lib/slf4j-api-1.4.3.jar
>> /usr/lib/hadoop-0.20/lib/slf4j-log4j12-1.4.3.jar
>> /usr/lib/hadoop-0.20/lib/xmlenc-0.52.jar
>> /usr/lib/hadoop-0.20/hadoop-0.20.2-cdh3u3-ant.jar
>> /usr/lib/hadoop-0.20/hadoop-0.20.2-cdh3u3-core.jar
>> /usr/lib/hadoop-0.20/hadoop-0.20.2-cdh3u3-examples.jar
>> /usr/lib/hadoop-0.20/hadoop-0.20.2-cdh3u3-test.jar
>> /usr/lib/hadoop-0.20/hadoop-0.20.2-cdh3u3-tools.jar
>> /usr/lib/hadoop-0.20/hadoop-ant-0.20.2-cdh3u3.jar
>> /usr/lib/hadoop-0.20/hadoop-ant.jar
>> /usr/lib/hadoop-0.20/hadoop-core-0.20.2-cdh3u3.jar
>> /usr/lib/hadoop-0.20/hadoop-core.jar
>> /usr/lib/hadoop-0.20/hadoop-examples-0.20.2-cdh3u3.jar
>> /usr/lib/hadoop-0.20/hadoop-examples.jar
>> /usr/lib/hadoop-0.20/hadoop-test-0.20.2-cdh3u3.jar
>> /usr/lib/hadoop-0.20/hadoop-test.jar
>> /usr/lib/hadoop-0.20/hadoop-tools-0.20.2-cdh3u3.jar
>> /usr/lib/hadoop-0.20/hadoop-tools.jar
>> /usr/lib/hadoop-0.20/conf/hdfs-site.xml
>> /usr/lib/hadoop-0.20/conf/core-site.xml
>>
>> Replacing /usr/lib/hadoop-0.20/conf/hdfs-site.xml:/usr/lib/hadoop-0.20/conf/core-site.xml
>> in the CLASSPATH with
>> /usr/lib/hadoop-0.20/conf:/usr/lib/hadoop-0.20/lib/guava-r09-jarjar.jar
>> fixed the issue.
>>
>> cheers
>> Hassen
>>
>>> Hi Brian,
>>>
>>> I think your hint helped me understand what is going on.
>>> Indeed, I ran this code:
>>>
>>> #include <THDFSFile.h>
>>> #include <iostream>
>>>
>>> using namespace std;
>>>
>>> int main() {
>>>    THDFSSystem* sys = new THDFSSystem();
>>>    void* dir = sys->OpenDirectory("/");
>>>    const char* elem;
>>>    elem = sys->GetDirEntry(dir); // note: this first entry is discarded
>>>    while ((elem = sys->GetDirEntry(dir)))
>>>       cout << elem << endl;
>>>    return 0;
>>> }
>>>
>>> and got this output:
>>>
>>> hdfs:///storage
>>> hdfs:///lib
>>> hdfs:///selinux
>>> hdfs:///vmlinuz
>>> hdfs:///srv
>>> hdfs:///tmp
>>> hdfs:///lib32
>>> hdfs:///proc
>>> hdfs:///var
>>> hdfs:///user
>>> hdfs:///sys
>>> hdfs:///opt
>>> hdfs:///sbin
>>> hdfs:///initrd.img.old
>>> hdfs:///boot
>>> hdfs:///usr
>>> hdfs:///dev
>>> hdfs:///lib64
>>> hdfs:///bin
>>> hdfs:///media
>>> hdfs:///initrd.img
>>> hdfs:///etc
>>> hdfs:///home
>>> hdfs:///.X0-lock
>>> hdfs:///hs_err_pid1971.log
>>> hdfs:///root
>>> hdfs:///mnt
>>> hdfs:///vmlinuz.old
>>> hdfs:///lost+found
>>>
>>> That is, despite the "hdfs:///" prefix, this is the list of files
>>> in the "/" directory of my local filesystem. Do you have any
>>> explanation for such behaviour?
>>>
>>> Thank you,
>>>
>>> Massimiliano
>>>
>>> Brian Bockelman <brian.bockelman@cern.ch> wrote:
>>>
>>>> Hi Hassen,
>>>>
>>>> Can you increase the logging for your Hadoop client? That might
>>>> reveal something useful.
>>>>
>>>> Maybe trying to use THDFSSystem to list the root directory "/"
>>>> would be enlightening? That should tell you whether it's at least
>>>> talking to the NameNode (NN).
>>>>
>>>> Brian
>>>>
>>>> On Apr 24, 2012, at 8:33 AM, Hassen Riahi wrote:
>>>>
>>>>> Hi Brian,
>>>>>
>>>>> The same error:
>>>>>
>>>>> SysError in <THDFSFile::THDFSFile>: Unable to open file hdfs:///mySearchTreeFile_68_2_Ewz.root in HDFS (No such file or directory)
>>>>> SysError in <THDFSFile::THDFSFile>: file hdfs:///mySearchTreeFile_68_2_Ewz.root can not be opened for reading (No such file or directory)
>>>>>
>>>>> Here is the content of core-site.xml:
>>>>>
>>>>> # cat /usr/lib/hadoop-0.20/conf/core-site.xml
>>>>> <?xml version="1.0"?>
>>>>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>>>>>
>>>>> <!-- Put site-specific property overrides in this file. -->
>>>>>
>>>>> <configuration>
>>>>>   <property>
>>>>>     <name>fs.default.name</name>
>>>>>     <value>hdfs://hydra1:54310</value>
>>>>>     <description>The name of the default file system. A URI whose
>>>>>     scheme and authority determine the FileSystem implementation.
>>>>>     The uri's scheme determines the config property (fs.SCHEME.impl)
>>>>>     naming the FileSystem implementation class. The uri's authority
>>>>>     is used to determine the host, port, etc. for a
>>>>>     filesystem.</description>
>>>>>   </property>
>>>>> </configuration>
>>>>>
>>>>> Thanks
>>>>> Hassen
>>>>>
>>>>>> Hi Hassen,
>>>>>>
>>>>>> Try changing things to:
>>>>>>
>>>>>>> hdfs:///mySearchTreeFile_68_2_Ewz.root
>>>>>>
>>>>>> and see if that helps. What's the contents of core-site.xml?
>>>>>>
>>>>>> Brian
>>>>>>
>>>>>> On Apr 24, 2012, at 7:08 AM, Hassen Riahi wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> Unfortunately the error persists. Here are the CLASSPATH (*),
>>>>>>> the error (**), and an ls of the file in HDFS (***):
>>>>>>>
>>>>>>> Thanks
>>>>>>> Hassen
>>>>>>>
>>>>>>> (*)
>>>>>>> hdfs@hydra1:~/max$ echo $CLASSPATH
>>>>>>> [CLASSPATH identical to the one quoted in full earlier in the
>>>>>>> thread, ending with /usr/lib/hadoop-0.20/conf/hdfs-site.xml:
>>>>>>> /usr/lib/hadoop-0.20/conf/core-site.xml]
>>>>>>>
>>>>>>> (**)
>>>>>>> Opening hdfs://hydra1:54310/mySearchTreeFile_68_2_Ewz.root
>>>>>>> SysError in <THDFSFile::THDFSFile>: Unable to open file hdfs://hydra1:54310/mySearchTreeFile_68_2_Ewz.root in HDFS (No such file or directory)
>>>>>>> SysError in <THDFSFile::THDFSFile>: file hdfs://hydra1:54310/mySearchTreeFile_68_2_Ewz.root can not be opened for reading (No such file or directory)
>>>>>>>
>>>>>>> (***)
>>>>>>> hdfs@hydra1:~/max$ /usr/lib/hadoop/bin/hadoop dfs -ls /mySearchTreeFile_68_2_Ewz.root
>>>>>>> 12/04/24 13:56:06 INFO security.UserGroupInformation: JAAS Configuration already set up for Hadoop, not re-installing.
>>>>>>> Found 1 items
>>>>>>> -rw-r--r--   3 hdfs supergroup   3793496 2012-04-23 10:32 /mySearchTreeFile_68_2_Ewz.root
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>> Whoops - very sorry, the fs.default.name is kept in core-
>>>>>>>> site.xml. You'll also need that.
>>>>>>>>
>>>>>>>> Brian
>>>>>>>>
>>>>>>>> On Apr 23, 2012, at 12:45 PM, Hassen Riahi wrote:
>>>>>>>>
>>>>>>>>> Hi Brian,
>>>>>>>>>
>>>>>>>>> Adding hdfs-site.xml to the CLASSPATH doesn't seem to fix
>>>>>>>>> the problem. The same error (*) is seen with the following
>>>>>>>>> CLASSPATH:
>>>>>>>>>
>>>>>>>>> # echo $CLASSPATH
>>>>>>>>> [Same jar list as the CLASSPATH quoted in full earlier in the
>>>>>>>>> thread, but ending with /usr/lib/hadoop-0.20/conf/hdfs-site.xml
>>>>>>>>> only; core-site.xml is not present]
>>>>>>>>>
>>>>>>>>> Thanks
>>>>>>>>> Hassen
>>>>>>>>>
>>>>>>>>> (*)
>>>>>>>>>
>>>>>>>>> SysError in <THDFSFile::THDFSFile>: Unable to open file hdfs:///mySearchTreeFile_68_2_Ewz.root in HDFS (No such file or directory)
>>>>>>>>> SysError in <THDFSFile::THDFSFile>: file hdfs:///mySearchTreeFile_68_2_Ewz.root can not be opened for reading (No such file or directory)
>>>>>>>>>
>>>>>>>>>> Hi Massimiliano,
>>>>>>>>>>
>>>>>>>>>> This typically indicates Hadoop's hdfs-site.xml is not in
>>>>>>>>>> your CLASSPATH environment variable. When this happens,
>>>>>>>>>> instead of using the NN, Hadoop will fall back to using the
>>>>>>>>>> local filesystem.
>>>>>>>>>>
>>>>>>>>>> What is the value of $CLASSPATH?
>>>>>>>>>>
>>>>>>>>>> Brian
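
In the spirit of Brian's diagnosis, a tiny standalone check one can run
before touching THDFSFile (a sketch only; the substring heuristic for
spotting configuration entries is an assumption, not a ROOT or Hadoop
API):

   // Warn if no Hadoop configuration appears in $CLASSPATH, since
   // libhdfs then silently falls back to the local filesystem.
   #include <cstdio>
   #include <cstdlib>
   #include <cstring>

   int main() {
      const char *cp = getenv("CLASSPATH");
      if (!cp || (!strstr(cp, "/conf") && !strstr(cp, "site.xml"))) {
         fprintf(stderr, "warning: no Hadoop configuration on CLASSPATH; "
                         "hdfs:/// may resolve against the local filesystem\n");
         return 1;
      }
      printf("CLASSPATH contains Hadoop configuration entries\n");
      return 0;
   }
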
>>>>>>>>>>
>>>>>>>>>> On Apr 23, 2012, at 4:06 AM, Massimiliano Fasi wrote:
>>>>>>>>>>
>>>>>>>>>>> Thank you, that exactly answered my question about the URL
>>>>>>>>>>> pattern. I changed it (and moved the ROOT file I wanted to
>>>>>>>>>>> read into HDFS's root directory), but things still went
>>>>>>>>>>> wrong, and I got the same error:
>>>>>>>>>>>
>>>>>>>>>>> SysError in <THDFSFile::THDFSFile>: Unable to open file hdfs:///mySearchTreeFile_68_2_Ewz.root in HDFS (No such file or directory)
>>>>>>>>>>> SysError in <THDFSFile::THDFSFile>: file hdfs:///mySearchTreeFile_68_2_Ewz.root can not be opened for reading (No such file or directory)
>>>>>>>>>>>
>>>>>>>>>>> *** Break *** segmentation violation
>>>>>>>>>>> Generating stack trace...
>>>>>>>>>>> [ stack trace follows ]
>>>>>>>>>>>
>>>>>>>>>>> The cluster seems to be well configured, and I'm sure the
>>>>>>>>>>> file exists: I can check its existence with Hadoop
>>>>>>>>>>> applications and open it via FUSE. Any guess?
>>>>>>>>>>>
>>>>>>>>>>> Cheers,
>>>>>>>>>>>
>>>>>>>>>>> Massimiliano
>>>>>>>>>>>
>>>>>>>>>>> Fons Rademakers <Fons.Rademakers@cern.ch> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> [added Brian the author of this plugin in cc]
>>>>>>>>>>>>
>>>>>>>>>>>> Have a look at the doc here:
>>>>>>>>>>>>
>>>>>>>>>>>> http://root.cern.ch/lxr/source/io/hdfs/src/THDFSFile.cxx
>>>>>>>>>>>>
>>>>>>>>>>>> Cheers, Fons.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On 23/04/2012 10:15, Massimiliano Fasi wrote:
>>>>>>>>>>>>> Hi Fons,
>>>>>>>>>>>>>
>>>>>>>>>>>>> you were right. I rebuilt the ROOT library with the HDFS
>>>>>>>>>>>>> plugin, and now ROOT is able to find the HDFS library,
>>>>>>>>>>>>> but the application still doesn't work. I get this
>>>>>>>>>>>>> error:
>>>>>>>>>>>>>
>>>>>>>>>>>>> SysError in <THDFSFile::THDFSFile>: Unable to open file hdfs://hydra1:54310/user/fasi/testROOT/mySearchTreeFile_68_2_Ewz.root in HDFS (No such file or directory)
>>>>>>>>>>>>> SysError in <THDFSFile::THDFSFile>: file hdfs://hydra1:54310/user/fasi/testROOT/mySearchTreeFile_68_2_Ewz.root can not be opened for reading (No such file or directory)
>>>>>>>>>>>>>
>>>>>>>>>>>>> *** Break *** segmentation violation
>>>>>>>>>>>>> Generating stack trace...
>>>>>>>>>>>>> [ stack trace follows ]
>>>>>>>>>>>>>
>>>>>>>>>>>>> But I'm sure the file /user/fasi/testROOT/mySearchTreeFile_68_2_Ewz.root
>>>>>>>>>>>>> exists in HDFS. In your opinion, is there a mistake in
>>>>>>>>>>>>> the path string or somewhere else?
>>>>>>>>>>>>>
>>>>>>>>>>>>> Thank you,
>>>>>>>>>>>>>
>>>>>>>>>>>>> Massimiliano
>>>>>>>>>>>>>
>>>>>>>>>>>>> Fons Rademakers <Fons.Rademakers@cern.ch> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Of course you have to make sure the HDFS plugin is
>>>>>>>>>>>>>> built. It is not part of the standard binary version.
>>>>>>>>>>>>>> Get the source, make sure all prerequisite libs for
>>>>>>>>>>>>>> HDFS support are installed, and do
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> ./configure
>>>>>>>>>>>>>> make
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Cheers, Fons.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On 22 Apr 2012, at 09:28, Massimiliano Fasi
>>>>>>>>>>>>>> <Massimiliano.Fasi@pg.infn.it> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Hi Fons,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> and thank you for your explanation. I tried using
>>>>>>>>>>>>>>> TFile::Open() and got a different behaviour from the
>>>>>>>>>>>>>>> application. Indeed, now I get this error:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Error in <TUnixSystem::DynamicPathName>: HDFS[.so | .dll | .dylib | .sl | .dl | .a] does not exist in
>>>>>>>>>>>>>>> :/storage/root/root/lib/root:/storage/root/root/lib/root:/storage/root/root/lib/root
>>>>>>>>>>>>>>> :/storage/root/root/lib/root:/storage/root/root/lib/root:/storage/root/root/lib/root
>>>>>>>>>>>>>>> :/storage/root/root/lib/root:/storage/root/root/lib/root:/storage/root/root/lib/root
>>>>>>>>>>>>>>> :/storage/root/root/lib/root:/storage/root/root/lib/root:/storage/root/root/lib/root
>>>>>>>>>>>>>>> :/storage/root/root/lib/root:/storage/root/root/lib/root:/storage/root/root/lib/root
>>>>>>>>>>>>>>> :/storage/root/root/lib/root:/storage/root/root/lib/root:/storage/root/root/lib/root
>>>>>>>>>>>>>>> :.:/storage/root/root/lib/root::/storage/root/root/lib/root/cint/cint/stl
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> [where /storage/root/root is the ROOT installation
>>>>>>>>>>>>>>> directory]
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> It seems that the program now knows it has to open a
>>>>>>>>>>>>>>> file stored in HDFS, even though it can't do so
>>>>>>>>>>>>>>> successfully. Any hint on how to fix this new issue?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Cheers,
>>>>>>>>>>>>>>> Massimiliano
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Fons Rademakers <Fons.Rademakers@cern.ch> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Hi,
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> For any I/O plugin to work, you have to open the files
>>>>>>>>>>>>>>>> via TFile::Open(), like:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> TFile* fileInput = TFile::Open(line.c_str());
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> This static method will load the HDFS plugin, triggered
>>>>>>>>>>>>>>>> by hdfs://, and will return a TFile-derived THDFSFile
>>>>>>>>>>>>>>>> object. The way you were doing it, you were getting a
>>>>>>>>>>>>>>>> standard local TFile object that was trying to open a
>>>>>>>>>>>>>>>> local file.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Let me know if you have more success with TFile::Open().
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Cheers, Fons.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> --
>>>>>>>>>>>> Org: CERN, European Laboratory for Particle Physics.
>>>>>>>>>>>> Mail: 1211 Geneve 23, Switzerland
>>>>>>>>>>>> E-Mail: Fons.Rademakers@cern.ch    Phone: +41 22 7679248
>>>>>>>>>>>> WWW: http://fons.rademakers.org    Fax: +41 22 7669640
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>>
>>>
>>>
>>
>
Received on Thu May 03 2012 - 15:17:31 CEST
