hive-vertica integration via SQOOP - [Vertica][VJDBC]

Moderator: NorbertKrupa

ardee3949
Newbie
Posts: 1
Joined: Wed Aug 13, 2014 9:17 pm

hive-vertica integration via SQOOP - [Vertica][VJDBC]

Post by ardee3949 » Wed Aug 13, 2014 9:32 pm

Hello all :D
I am trying to move a table with a couple of thousand rows that I created within Hive into Vertica. Before getting to the issue, here are the steps I took:

1. Create a table in Hive:
create table test_load
(uid varchar(200),
record_id varchar(500),
api_id varchar(500),
receive_dt varchar(500),
receive_dt_epoch varchar(500),
insert_dt varchar(500),
insert_table_audit_log varchar(500),
author_age varchar(200),
author_avatar varchar(1000),
author_category varchar(1000),
author_contributions varchar(500),
author_email varchar(500),
author_gender varchar(100),
author_hash_id varchar(500),
author_id_source varchar(500),
author_language varchar(500),
author_link varchar(1000),
author_location varchar(500),
author_name varchar(200),
author_registered varchar(500),
author_talk varchar(500),
author_type varchar(500),
author_username varchar(500),
content varchar(10000),
content_type varchar(400),
content_hash varchar(500),
created_at_dt varchar(100),
created_at_date varchar(100),
created_at_dt_UTC varchar(100),
created_at_date_UTC varchar(100),
geo_latitude varchar(100),
geo_longitude varchar(100),
hashtags varchar(100),
interaction_id_source varchar(100),
link varchar(100),
mention_ids varchar(100),
mentions varchar(100),
publisher_name varchar(100),
received_at_source varchar(100),
schema_version varchar(200),
source varchar(100),
subtype varchar(100),
title varchar(100),
data_source_type varchar(100))
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|';
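Note that sqoop export does not create the target table, so test_load_sqoop must already exist in Vertica with a compatible column layout. A minimal sketch of the Vertica side (column list abbreviated; types assumed to mirror the Hive table above):

```sql
-- Run in vsql. Only the first and last columns are shown here;
-- the rest mirror the Hive definition above, in the same order.
CREATE TABLE test_load_sqoop (
    uid varchar(200),
    record_id varchar(500),
    -- ... remaining columns as in the Hive table ...
    data_source_type varchar(100)
);
```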

2. Load the data into the Hive table I just created:
load data inpath '/tmp/sample_load.txt' into table test_load;
which successfully puts the data into the Hive table, and everything looks neat!
I placed the Vertica JDBC connector (vertica-jdbc-7.0.1-0.jar) in /usr/lib/sqoop/lib on my Hadoop cluster and, as the hdfs user, ran the following command:
sqoop export --driver com.vertica.jdbc.Driver --connect jdbc:vertica://X.X.X.X/BDDEc2a --username hadoop_vertica --password vertica --table test_load_sqoop --export-dir /apps/hive/warehouse/test_load/sample_load.txt --input-fields-terminated-by '|' --lines-terminated-by '\n'

Here is the error I get:

14/08/13 15:05:14 INFO mapreduce.Job: Task Id : attempt_1407254734181_0039_m_000000_2, Status : FAILED
Error: java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.io.IOException: java.sql.SQLException: [Vertica][VJDBC](4856) ERROR: Syntax error at or near ","
at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:635)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:84)
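One known trigger for Vertica's (4856) syntax error during a Sqoop export is Sqoop batching several rows into a single multi-row INSERT ... VALUES (...), (...) statement, which some Vertica/JDBC combinations reject. A hedged variant of the export command that forces one row per statement (the -D properties are documented Sqoop export options and must appear immediately after the tool name; that this is the cause here is an assumption, not a confirmed fix):

```shell
sqoop export \
  -Dsqoop.export.records.per.statement=1 \
  -Dsqoop.export.statements.per.transaction=1 \
  --driver com.vertica.jdbc.Driver \
  --connect jdbc:vertica://X.X.X.X/BDDEc2a \
  --username hadoop_vertica --password vertica \
  --table test_load_sqoop \
  --export-dir /apps/hive/warehouse/test_load/sample_load.txt \
  --input-fields-terminated-by '|' \
  --lines-terminated-by '\n'
```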


What am I missing here?! :(

scutter
Master
Posts: 301
Joined: Tue Aug 07, 2012 2:15 am

Re: hive-vertica integration via SQOOP - [Vertica][VJDBC]

Post by scutter » Thu Aug 14, 2014 3:57 pm

Have you tried using the HCatalog Connector instead of using Sqoop?
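A minimal sketch of what that would look like (host, port, and user are placeholders; syntax as assumed from the Vertica 7.x HCatalog Connector):

```sql
-- Run as dbadmin in vsql; assumes the HCatalog Connector package is
-- installed and the HCatalog/Hive metastore is reachable from Vertica.
CREATE HCATALOG SCHEMA hcat
    WITH HOSTNAME='hcatalog-host' PORT=9083
         HCATALOG_SCHEMA='default' HCATALOG_USER='hive';

-- The Hive table is then queryable in place, no export step needed:
SELECT COUNT(*) FROM hcat.test_load;
```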

—Sharon
Sharon Cutter
Vertica Consultant, Zazz Technologies LLC

