Reading and writing the FastExport file format in Hive has just become easier

https://cwiki.apache.org/confluence/display/Hive/TeradataBinarySerde

A new SerDe is now available in Hive that reads and writes Teradata's FastExport binary format, speeding up data exchange between Teradata and Hive.

Typical steps to move data from Teradata to Hive:

  1. export the table via TPT into multiple gzipped binary files
  2. upload the files to HDFS via WebHDFS
  3. register the files as a Hive external table using the Teradata SerDe (see the sketch after this list)
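
A minimal sketch of such a table definition is shown below. The SerDe and file-format class names and the TBLPROPERTIES should be taken from the wiki page linked above (the values here are illustrative), and the table name, columns, and LOCATION are placeholders.

    -- Register gzipped FastExport binary files that were uploaded to HDFS.
    -- Placeholder table name, columns and location; check class names and
    -- table properties against the TeradataBinarySerde wiki page.
    CREATE EXTERNAL TABLE customer_stg (
      customer_id   BIGINT,
      customer_name VARCHAR(100),
      created_at    TIMESTAMP
    )
    ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.teradata.TeradataBinarySerde'
    STORED AS
      INPUTFORMAT  'org.apache.hadoop.hive.ql.io.TeradataBinaryFileInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.TeradataBinaryFileOutputFormat'
    LOCATION '/data/customer_stg'
    TBLPROPERTIES (
      'teradata.timestamp.precision' = '6',
      'teradata.char.charset'        = 'UNICODE',
      'teradata.row.length'          = '64KB');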

Typical steps to move data from Hive to Teradata:

  1. insert into a Hive table that is stored using the Teradata SerDe (see the sketch after this list)
  2. download the files via WebHDFS to a server close to the Teradata system
  3. import the files via TPT into an empty staging table
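
Step 1 can be as simple as the sketch below, assuming a target table customer_stg defined with the Teradata SerDe as above and an existing Hive source table customer_hive (both names are placeholders). The files written under the target table's location are what you then download and load with TPT.

    -- Write FastExport-format files by inserting into the SerDe-backed table.
    -- customer_stg and customer_hive are placeholder names.
    INSERT OVERWRITE TABLE customer_stg
    SELECT customer_id, customer_name, created_at
    FROM customer_hive;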