These are the files I have attached.
Recently we investigated the following Hadoop usage to decide whether our product should be improved to support these cases. Has anybody had a requirement to run a Hive job on a Hadoop slave node (not the Hive node) through other tools or applications, such as Templeton?
The Sqoop TD connector I had used was from Cloudera, version 1.0.5.
I have not tried the TDCH command line yet. Is that recommended over Sqoop with the TD connector? Is it expected to be faster? If yes, why?
I have two situations:
1. Teradata 13.10.6 and HDP 1.0.x
2. Teradata 13.10.6 and HDP 1.3
Which connectors should I be using for the above versions?
What is the difference between the Command Line Edition and the Sqoop Integration Edition?
Please provide your email address. The Cloudera Connector for Teradata doesn't use TDCH. The new Cloudera Connector Powered by Teradata does. It should be out on Cloudera's website soon.
Have you seen the Teradata Connector for Hadoop Tutorial that is attached? Sqoop is open source so you can get more information here: http://sqoop.apache.org/
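For reference, a Sqoop import from Teradata over plain JDBC typically looks like the sketch below. The host name, database, table, and target directory are placeholders, and the exact `--connect` string and connector class depend on which Teradata connector and version you install, so treat this as an illustrative shape rather than a tested invocation.

```shell
# Hypothetical Sqoop import from Teradata via the Teradata JDBC driver.
# tdhost, sales, ORDERS, and the HDFS path are placeholder names.
sqoop import \
  --connect jdbc:teradata://tdhost/DATABASE=sales \
  --driver com.teradata.jdbc.TeraDriver \
  --username dbc -P \
  --table ORDERS \
  --target-dir /user/etl/orders \
  --num-mappers 4
```

Passing `--driver` makes Sqoop fall back to its generic JDBC manager; a vendor connector (Cloudera's or Hortonworks') would normally supply its own connection manager instead and should be preferred when available.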
Have you tried using the Teradata Connector for Hadoop (Command Line Edition)? The Teradata Connector for Hadoop (Sqoop Integration Edition) is designed for Hadoop distribution vendors. You can download the product with manuals from the Hadoop distribution's website.
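The Command Line Edition is driven directly with `hadoop jar` rather than through Sqoop. A minimal import sketch is shown below; the jar file name, tool class, and all connection values are assumptions based on the TDCH 1.x tutorial and will vary by release, so check the manual that ships with your version.

```shell
# Hypothetical TDCH (Command Line Edition) import into HDFS.
# Jar name, host, credentials, table, and path are placeholders.
hadoop jar teradata-connector-1.x.jar \
  com.teradata.hadoop.tool.TeradataImportTool \
  -url jdbc:teradata://tdhost/database=sales \
  -username dbc -password dbc \
  -jobtype hdfs \
  -fileformat textfile \
  -sourcetable ORDERS \
  -targetpaths /user/etl/orders \
  -nummappers 4
```

The corresponding export tool follows the same pattern with a source path and target table reversed.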
Which Sqoop connector are you referring to, Hortonworks or Cloudera? Did you try to use TDCH (Command Line Edition)?
TDCH has not been certified with HDP 1.0.x. We have certified with a later HDP 1.x release. The next certification in the pipeline is HDP 1.3.