Teradata Connector for Hadoop Now Available


Re: Teradata Connector for Hadoop now available

Teradata Connector for Hadoop (Command Line Edition): Hadoop 2.x: 

Just curious whether this version (1.4.1) of TDCH supports Hadoop 2.7, as I intend to use it on MapR 5.0, which, as far as I know, runs on Hadoop 2.7.x.

Please advise. Thanks.


Re: Teradata Connector for Hadoop now available

Can someone point me to a link for the latest documentation of the Teradata Connector for Hadoop 1.4.1 (Command Line Edition)? The documentation above seems very old (Apr 2013).

I am looking to understand what support the new version of the connector provides for incremental imports from Teradata to Hadoop.

Thanks,

Anand


Re: Teradata Connector for Hadoop now available

Hi,

We have an issue importing data from a Teradata table whose description column, a VARCHAR(100), contains the field delimiter, \r, and \n in its text.

As a workaround we are using the -enclosedby '"' parameter to bring the data into HDFS, and then post-processing it to remove any \n characters present in the column text.
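A minimal sketch of that post-processing step (the function name and delimiter are assumptions, not part of TDCH): read the quoted output produced with -enclosedby '"', drop \n, \r, and \x01 from each field the way Sqoop's --hive-drop-import-delims does, and write clean records for Hive.

```python
import csv
import io

# Illustrative post-processing sketch (not a TDCH feature): strip the
# characters that break Hive's line/field parsing from every field of a
# quoted-CSV extract.
def drop_import_delims(raw_text, delimiter=","):
    out = io.StringIO()
    writer = csv.writer(out, delimiter=delimiter, lineterminator="\n")
    # csv.reader keeps \n inside quoted fields together, which is exactly
    # why the -enclosedby '"' workaround is needed in the first place.
    for row in csv.reader(io.StringIO(raw_text), delimiter=delimiter):
        writer.writerow(
            [f.replace("\n", "").replace("\r", "").replace("\x01", "") for f in row]
        )
    return out.getvalue()
```

This could run over the imported HDFS files (for example, streamed through the script with hadoop fs -cat) before the Hive table is pointed at them.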

Sqoop has an option, shown below, which drops the specified delimiter, \n, and \r from columns if present in the column text.

--hive-drop-import-delims Drops \n, \r, and \01 from string fields when importing to Hive

I'm not sure how to get Teradata's attention on this one; if it is included in the next release of TDCH it will save a lot of processing, and the data can be imported straight into a Hive table.

Thanks,

Anand


Re: Teradata Connector for Hadoop now available

Thanks, I can now see that the documentation has been updated and is relevant to v1.4.

Cheers!!

Re: Teradata Connector for Hadoop now available

Anand,

You mentioned that hive-drop-import-delims has been solved in TDCH v1.4, but I could not find it. Could you please point me in the right direction? Thanks.


Re: Teradata Connector for Hadoop now available

Hi,

I am also facing this issue with the delimiters \r and \n and haven't found a solution for it.

As far as I have tested, TDCH does not support the --hive-drop-import-delims feature yet.

I have TDCH v1.4 installed.

Even the Sqoop command doesn't support this feature for Teradata; it works for other databases, though. The only option I found is to generate a sequence file from TDCH.

However, I need to export this file again to Oracle, and Sqoop can export only text files.

Please let me know if you have any suggestions.

Thanks,

Sridhar


Re: Teradata Connector for Hadoop now available

Hello, I need help please.

The problem is the version of Teradata:

    InfoKey    InfoData

    LANGUAGE SUPPORT MODE    Standard

    RELEASE    12.00.03.33

    VERSION    12.00.03.38

From what I have read, the prerequisite is Teradata Database 13.0, and my Hadoop is Apache 2.7. What can I do to use Sqoop?

thanks

René


Re: Teradata Connector for Hadoop now available

Hi,

In the TDCH command, the database password is specified in the command itself.

Is there another way, so that we don't need to specify the password in the command? (I'm trying to hide the password.)

Can we set the password in an environment variable or in a file?
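One way to keep the password out of scripts and shell history is to inject it at run time; a minimal sketch, assuming the password has been exported as an environment variable beforehand (the jar name, URL, and variable name are placeholders):

```python
import os

# Illustrative sketch: build the TDCH import command with the password taken
# from an environment variable instead of hard-coding it in the command text.
def build_import_cmd(url, username):
    password = os.environ.get("TDCH_PASSWORD", "")  # export TDCH_PASSWORD=... first
    return [
        "hadoop", "jar", "teradata-connector.jar",
        "com.teradata.connector.common.tool.ConnectorImportTool",
        "-url", url,
        "-username", username,
        "-password", password,
    ]

cmd = build_import_cmd("jdbc:teradata://tdhost/database=mydb", "myuser")
```

Note that the arguments are still visible to other users via ps while the job submits, so a chmod-600 password file or the Hadoop credential provider mechanism may be worth investigating as a stronger option.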


Re: Teradata Connector for Hadoop now available

Hello,

As I can see from the TDCH (version 1.4) user guide, it creates multiple JDBC connections to export data from a Teradata database, and it does not use the Teradata FastExport utility to extract the data. I would like to know how FastExport extracts data and how it differs from TDCH's method.

  • It would be very helpful if someone could highlight the main architectural differences between the two methods (TDCH and FastExport) for extraction.
  • Which would be the better option (TDCH or the FastExport utility) for extracting a huge volume of data (in TBs) from Teradata tables?
  • Does TDCH come with a free license, or does it come along with the Teradata installation?

Re: Teradata Connector for Hadoop now available

Hi,

I would like to know if TDCH supports the following:

  1. HBase as a target?
  2. While loading into Hadoop, does it provide support for the various compression formats like bzip2 and Snappy?
  3. Does it provide any feature for streaming data migration between Teradata and Hadoop systems?

I would be very grateful if someone could throw some light on the queries posted above.