There is a new version, the Cloudera Connector Powered by Teradata, that you may want to try out; it takes advantage of TDCH and is available for download on Cloudera's website.
Whether you use the Sqoop command line with the Cloudera Connector Powered by Teradata or the TDCH command line is a matter of preference. Since the Cloudera Connector Powered by Teradata uses TDCH underneath, performance is similar to the TDCH command line.
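For illustration, a Sqoop import through the Cloudera Connector Powered by Teradata might look like the following sketch; the host name, database, credentials, table, and paths are all placeholders, not values from this thread:

```shell
# Sqoop import via the Cloudera Connector Powered by Teradata.
# Host, database, credentials, table, and paths are placeholders.
sqoop import \
  --connect jdbc:teradata://td-host/DATABASE=sales_db \
  --username sqoop_user \
  --password-file /user/sqoop_user/td.password \
  --table ORDERS \
  --target-dir /data/orders \
  --num-mappers 4
```

Because the connector delegates the actual data movement to TDCH, throughput should be comparable to running TDCH directly.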
I would like to know who is using TDCH, and what stage people are in with respect to their deployment.
Please send me an email (email@example.com) and let me know. Please indicate customer name if you are a Teradata customer.
The Sqoop Integration Edition is intended for Hadoop distributions to integrate TDCH with Sqoop. For example, Hortonworks used it to create the "Hortonworks Connector for Teradata", and Cloudera used it to create the "Cloudera Connector Powered by Teradata". Both products use the Sqoop command line.
The Teradata Connector for Hadoop (Command Line Edition) doesn't use the Sqoop command line; it has its own command-line interface.
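As a sketch of that standalone interface (the tool class name matches the log output later in this thread; the jar version, host, credentials, table, and target path are placeholders), a direct TDCH import from a Teradata table into HDFS looks roughly like:

```shell
# Direct TDCH import, no Sqoop involved.
# Jar version, host, credentials, table, and path are placeholders.
hadoop jar teradata-connector-1.0.9.jar \
  com.teradata.hadoop.tool.TeradataImportTool \
  -url jdbc:teradata://td-host/database=sales_db \
  -username tdch_user \
  -password secret \
  -jobtype hdfs \
  -fileformat textfile \
  -sourcetable ORDERS \
  -targetpaths /data/orders
```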
Teradata Connector for Hadoop is currently certified with HDP 220.127.116.11 and HDP 1.3.
I tested it on Hadoop 2.1.0, and it failed with an incompatible-interface error.
Hi. I'm getting the following error message when running the command to import data from Teradata to Hadoop.
13/10/31 15:37:25 INFO tool.TeradataImportTool: TeradataImportTool starts at 1383248245608
Hi. Is there any workaround to run the command without the user having CREATE VIEW access? I tried a user ID that had CREATE VIEW access, but it throws the error "The user does not have CREATE TABLE access to database dbc". Does the user ID require both CREATE TABLE and CREATE VIEW access on the database to run this command?
Since you are using a sourcequery, you will need CREATE VIEW and CREATE TABLE access. I also think you should set the database to something other than DBC, probably tedw if you have CREATE TABLE and CREATE VIEW access there.
Also, if you are just trying out the tool, provide a source table instead of a sourcequery; that way you should not need CREATE TABLE or CREATE VIEW access.
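To illustrate the difference (all names and credentials below are placeholders, and the behavior described in the comments is my understanding of why the grants are needed):

```shell
# With -sourcequery, TDCH wraps the query in a temporary view, so the
# user needs CREATE VIEW (and, depending on the job, CREATE TABLE).
hadoop jar teradata-connector-1.0.9.jar \
  com.teradata.hadoop.tool.TeradataImportTool \
  -url jdbc:teradata://td-host/database=tedw \
  -username tdch_user \
  -password secret \
  -jobtype hdfs \
  -sourcequery "SELECT order_id, amount FROM ORDERS WHERE amount > 100" \
  -targetpaths /data/orders_filtered

# With -sourcetable instead, the table is read directly and no
# CREATE VIEW / CREATE TABLE grants are required:
#   -sourcetable ORDERS
```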
Finally, I think you should read this document: Teradata Connector for Hadoop Tutorial - Teradata Developer ...
If you are not familiar with how Teradata works, it would also be helpful to talk to your DBA and review the document together.
Thank you thedba.
I got the permissions for my user and I'm able to import the table to HDFS using the Teradata connector (version 1.0.9). Now I'm trying to import data from Teradata directly into a Hive table. I'm using the following exports and command.
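For reference, a TDCH Hive import is typically set up along these lines. This is a sketch only: the Hive jar names and paths vary by installation, and every host, credential, and table name below is a placeholder rather than the poster's actual setup:

```shell
# TDCH needs Hive's jars and config on the classpath for -jobtype hive.
# Exact jar names depend on your Hive version; these are placeholders.
export HIVE_HOME=/usr/lib/hive
export LIB_JARS=$HIVE_HOME/lib/hive-metastore.jar,$HIVE_HOME/lib/hive-exec.jar
export HADOOP_CLASSPATH=$HIVE_HOME/conf:$HIVE_HOME/lib/*

hadoop jar teradata-connector-1.0.9.jar \
  com.teradata.hadoop.tool.TeradataImportTool \
  -libjars $LIB_JARS \
  -url jdbc:teradata://td-host/database=sales_db \
  -username tdch_user \
  -password secret \
  -jobtype hive \
  -fileformat textfile \
  -sourcetable ORDERS \
  -targettable default.orders
```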
I am getting a similar error to the one @thedba discussed when I try to export data from HDFS to Teradata:
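For context, an HDFS-to-Teradata export with TDCH is invoked through the export tool class, which mirrors the import tool. Again a hedged sketch, with all names, credentials, and paths as placeholders:

```shell
# TDCH export from HDFS into an existing Teradata table.
# Jar version, host, credentials, paths, and table are placeholders.
hadoop jar teradata-connector-1.0.9.jar \
  com.teradata.hadoop.tool.TeradataExportTool \
  -url jdbc:teradata://td-host/database=sales_db \
  -username tdch_user \
  -password secret \
  -jobtype hdfs \
  -fileformat textfile \
  -sourcepaths /data/orders \
  -targettable ORDERS
```

Note that the target table must already exist in Teradata; the export tool loads into it rather than creating it.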