Export to Hadoop table fails

Teradata Studio
Enthusiast


Exporting data from a Teradata table to a Hive table using Teradata Studio (v14.10.01.201310271220) fails silently: nothing happens, because an invalid parameter is passed on the TDCH command line that Studio launches.
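For illustration, the generated command is shaped roughly like this (the URL, credentials and table names below are placeholders; only the jar, the class name and the offending -mappernum option come from the actual failure):

hadoop jar teradata-hadoop-connector.jar \
  com.teradata.hadoop.tool.TeradataImportTool \
  -url jdbc:teradata://tdhost/database=mydb \
  -username myuser \
  -password mypass \
  -jobtype hive \
  -sourcetable source_table \
  -targettable target_table \
  -mappernum 2

Note that TeradataImportTool is the right class here: an export from Teradata is an import into Hadoop from TDCH's point of view.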

If you dig through the MapReduce job logs and open the logs of the job's single task (a sketch of how to retrieve them follows the output below), you'll see that the job is reported as successful, yet it logs this error message:

unrecognized parameter: -mappernum
hadoop jar teradata-hadoop-connector.jar
com.teradata.hadoop.tool.TeradataImportTool
[-conf <conf file>] (optional)
[-jobtype <job type>] (values: hdfs, hive, and hcat, default is hdfs)
[-fileformat <file format>] (values: sequencefile, textfile, avrofile, orcfile and rcfile, default is textfile)
[-classname <classname>] (optional)
[-url <url>] (optional)
[-username <username>] (optional)
[-password <password>] (optional)
[-batchsize <batchsize>] (optional, default value is 10000)
[-accesslock <true|false>] (optional, default value is false)
[-queryband <queryband>] (optional)
[-targetpaths <path>] (optional, applicable for hdfs and hive jobs)]
[-sourcetable <tablename>] (optional, use -sourcetable or -sourcequery but not both)
[-sourceconditions <conditions>] (optional, use with -sourcetable option)
[-sourcefieldnames <fieldnames>] (optional, comma delimited format)
[-sourcequery <query>] (optional, use either -sourcetable or -sourcequery but not both)
[-sourcecountquery <countquery>] (optional, use with -sourcequery option)
[-targetdatabase <database>] (optional)
[-targettable <table>] (optional)
[-targetfieldnames <fields>] (optional, comma separated format)
[-targettableschema <schema>] (optional, comma separated format)
[-targetpartitionschema <schema>] (optional, comma separated format, used with -targettableschema only)
[-separator <separator>] (optional, used to separate fields in text)
[-lineseparator <lineseparator>] (optional, used to separate different lines, only useful for hive and hcat job)
[-enclosedby <enclosed-by-character> (optional, used to enclose text, only useful for hdfs job)]
[-escapedby <escaped-by-character> (optional, used to escape special characters, only useful for hdfs job)]
[-nullstring <string>] (optional, a string to replace null value of string type)
[-nullnonstring <string>] (optional, a string to replace null value of non-string type, only useful for hdfs job)
[-method <method>] (optional import method, values: split.by.partition, split.by.hash, split.by.value and split.by.amp only for Teradata version 14.10.00.02, default is split.by.hash)
[-nummappers <num>] (optional, default is 2)
[-splitbycolumn <columnname>] (optional for split.by.hash and split.by.value methods)
[-forcestage <true>] (optional force to use stage table, default is false)
[-stagetablename <tablename>] (optional, stage table name should be 20 characters or less)
[-stagedatabase <database>] (optional)
[-numpartitionsinstaging <num>] (optional)
[-hiveconf <target path>] (optional, required for hive and hcat jobs launched on non-name nodes)]
[-usexview <true|false>] (optional, default is true)
[-avroschemafile <path>] (optional, a file path for Avro schema definition)
[-h|help] (optional)

Intercepting System.exit(16)

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [com.teradata.hadoop.tool.TeradataImportTool], exit code [16]

Oozie Launcher failed, finishing Hadoop job gracefully
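For anyone who wants to reproduce this: on a YARN cluster with log aggregation enabled, the task logs quoted above can usually be pulled with the yarn CLI (the application ID below is a placeholder for the one Oozie reports for the launcher job):

yarn logs -applicationId application_1384000000000_0001

On Hadoop 1.x installs the equivalent is the task attempt's stdout/stderr pages in the JobTracker web UI.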

The -nummappers parameter should be used instead.
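Applied to the sketch above, the fix is a one-option substitution: replace the last line with

-nummappers 2

and leave everything else unchanged. This matches the usage output, which lists -nummappers (default 2) as the valid option.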

I'm using Teradata Hadoop Connector 1.2.1, but this parameter is also not valid in versions 1.0.6 and 1.0.9.

1 REPLY
Teradata Employee

Re: Export to Hadoop table fails

We noticed that TDCH 1.2.1 fails with the bad parameter. Version 1.0.9 (the version we support; we don't yet support TDCH 1.2.1) ignored the parameter.

The configureOozie script for our upcoming release will not specify that parameter.