I have downloaded the Hadoop connectors and examples, and Hadoop is running. How can I connect to Hadoop and test the examples?
I submitted the job as follows:
hadoop jar hadoop-vertica-example.jar com.vertica.hadoop.VerticaExample
-libjars /usr/lib/hadoop/lib/hadoop-vertica.jar,/usr/lib/hadoop/lib/vertica-jdk5.jar
-Dmapred.vertica.hostnames=192.168.x.x -Dmapred.vertica.port=5433
-Dmapred.vertica.username=hdfs -Dmapred.vertica.database=HADOOP
-Dmapred.vertica.debug=1
It shows this error:
Exception in thread "main" java.lang.RuntimeException: java.sql.SQLException: [Vertica][VJDBC](100176) Failed to connect to host 192.168.x.x on port 5433. Reason: Connection timed out:connect
How can I resolve this?
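A "Connection timed out" from the JDBC driver happens before any Vertica authentication, so the Hadoop node simply cannot reach the Vertica host on port 5433 (wrong IP, a firewall in the way, or Vertica not listening on that interface). It is worth checking raw TCP reachability first; a minimal bash probe, keeping the thread's masked 192.168.x.x placeholder:

```shell
# Probe TCP reachability of the Vertica client port (default 5433)
# using bash's /dev/tcp pseudo-device, so no extra tools are needed.
# Note: a silently filtered port can take a while to time out here.
check_vertica_port() {
  local host="$1" port="${2:-5433}"
  if (echo > "/dev/tcp/${host}/${port}") 2>/dev/null; then
    echo "port ${port} reachable on ${host}"
  else
    echo "port ${port} NOT reachable on ${host} (firewall, wrong IP, or Vertica down)"
  fi
}

# 192.168.x.x is the masked address from the post -- substitute your real node.
check_vertica_port 192.168.x.x 5433
```

If the probe fails, fix the network path first (and confirm with `vsql -h <host>` from the Hadoop node) before debugging any of the connector options.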
How to connect Vertica to Hadoop.
Re: How to connect Vertica to Hadoop.
I have successfully connected Hadoop 0.20.2 and Pig with Vertica using a Pig script.
Re: How to connect Vertica to Hadoop.
Hey, can you please explain the whole procedure for how you connected Vertica with Hadoop?
Re: How to connect Vertica to Hadoop.
Hello,
I am trying to connect Hadoop with Vertica, but I am not able to do it.
As per the documentation, I created the VerticaExample.java file, but I am not able to compile it.
Can anybody help me with this?
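Compiling VerticaExample.java typically needs only the Hadoop core jar and the Vertica connector jar on the classpath. A minimal sketch follows; both jar paths are assumptions (the connector path mirrors the -libjars value used earlier in the thread, and the hadoop-core location varies by distribution), so adjust them to your install:

```shell
# Sketch: compile VerticaExample.java against the Hadoop and connector jars.
# The function first prints the exact javac command so it can be inspected,
# then runs it only if the source file is actually present.
compile_example() {
  local hadoop_jar="$1" connector_jar="$2" src="$3"
  echo "javac -classpath ${hadoop_jar}:${connector_jar} ${src}"
  if [ -f "$src" ]; then
    javac -classpath "${hadoop_jar}:${connector_jar}" "$src"
  else
    echo "skipped: ${src} not found in the current directory"
  fi
}

# Assumed locations -- adjust to where your jars actually live.
compile_example /usr/lib/hadoop/hadoop-core.jar \
                /usr/lib/hadoop/lib/hadoop-vertica.jar \
                VerticaExample.java
```

After a successful compile, `jar cf hadoop-vertica-example.jar *.class` packages the classes for `hadoop jar`.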
Re: How to connect Vertica to Hadoop.
Hi!
[DELETED]
[DELETED]
Re: How to connect Vertica to Hadoop.
Thanks!
I got the resolution; it was a compilation issue.
Re: How to connect Vertica to Hadoop.
Hello,
I am unable to load a text file from HDFS into HP Vertica using the Hadoop streaming method. I have followed the exact steps in the "HP Vertica Analytics Platform Version 6.1.x Documentation", but no success yet. Can anyone help?
Below are the log details:
Error: Could not find or load main class job_201311211158_0011
[hduser@centos ~]$ hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-streaming-*.jar -libjars $HADOOP_HOME/lib/hadoop-vertica.jar -Dmapred.reduce.tasks=0 -Dmapred.vertica.output.table.name=streaming -Dmapred.vertica.output.table.def="intcol integer, floatcol float, varcharcol varchar" -Dmapred.vertica.hostnames=localhost -Dmapred.vertica.port=5433 -Dmapred.vertica.username=hduser -Dmapred.vertica.password=ranjan14 -Dmapred.vertica.database=Vmart -Dmapred.vertica.output.delimiter="|" -Dmapred.vertica.output.terminator="~" -input /tmp/textdata.txt -output /tmp/output -mapper "python /home/hduser/work/mapper.py" -outputformat com.vertica.hadoop.deprecated.VerticaStreamingOutput
Warning: $HADOOP_HOME is deprecated.
packageJobJar: [/tmp/hadoop-hduser/hadoop-unjar4618000824638461660/] [] /tmp/streamjob5803503291157496379.jar tmpDir=null
13/11/22 02:22:17 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/11/22 02:22:17 WARN snappy.LoadSnappy: Snappy native library not loaded
13/11/22 02:22:17 INFO mapred.FileInputFormat: Total input paths to process : 1
13/11/22 02:22:17 INFO streaming.StreamJob: getLocalDirs(): [/tmp/hadoop-hduser/mapred/local]
13/11/22 02:22:17 INFO streaming.StreamJob: Running job: job_201311211158_0012
13/11/22 02:22:17 INFO streaming.StreamJob: To kill this job, run:
13/11/22 02:22:17 INFO streaming.StreamJob: /usr/local/hadoop/hadoop-1.2.1/libexec/../bin/hadoop job -Dmapred.job.tracker=localhost:9001 -kill job_201311211158_0012
13/11/22 02:22:17 INFO streaming.StreamJob: Tracking URL: http://localhost:50030/jobdetails.jsp?j ... 11158_0012
13/11/22 02:22:18 INFO streaming.StreamJob: map 0% reduce 0%
13/11/22 02:22:59 INFO streaming.StreamJob: map 100% reduce 100%
13/11/22 02:22:59 INFO streaming.StreamJob: To kill this job, run:
13/11/22 02:22:59 INFO streaming.StreamJob: /usr/local/hadoop/hadoop-1.2.1/libexec/../bin/hadoop job -Dmapred.job.tracker=localhost:9001 -kill job_201311211158_0012
13/11/22 02:22:59 INFO streaming.StreamJob: Tracking URL: http://localhost:50030/jobdetails.jsp?j ... 11158_0012
13/11/22 02:22:59 ERROR streaming.StreamJob: Job not successful. Error: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201311211158_0012_m_000001
13/11/22 02:22:59 INFO streaming.StreamJob: killJob...
Streaming Command Failed!
Thanks in advance!!!
Ravi Ranjan
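For reference, here is a minimal sketch of what mapper.py might look like for this job. Only the three-column "|"-delimited output shape is taken from the command above (matching -Dmapred.vertica.output.delimiter="|" and the intcol/floatcol/varcharcol table definition); the assumption that the input fields in textdata.txt are whitespace-separated is mine, so adapt the split to the real file format:

```python
#!/usr/bin/env python
"""Hadoop streaming mapper sketch: one '|'-delimited record per input line."""
import sys

def map_line(line):
    """Turn one whitespace-separated input line into an 'int|float|varchar' record.

    Returns None for lines that do not have exactly three fields, so a
    malformed line is skipped instead of failing the whole map task.
    """
    fields = line.strip().split()
    if len(fields) != 3:
        return None
    return "|".join(fields)

def main():
    for line in sys.stdin:
        record = map_line(line)
        if record is not None:
            sys.stdout.write(record + "\n")

if __name__ == "__main__":
    main()
```

As a sanity check, the mapper can be exercised locally before submitting the streaming job: `echo "1 2.5 hello" | python mapper.py` should print `1|2.5|hello`.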