After using the Teradata Hadoop Connector (TDCH) to load some data into a table in Teradata, I noticed that the "sourcepaths" parameter assumes the folder is located on the Hadoop cluster associated with the machine where I'm running the command.
Do you know if it's possible to use a full path (including the server and port) to specify the location of the source data to be imported? I tried setting the full URI to the file (something like hdfs://server:port/path/to/file), but the tool ignores the scheme and host portion and assumes path/to/file is on the cluster assigned to the machine where I'm running the command.
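For reference, my invocation looks roughly like the sketch below. The jar name, JDBC URL, credentials, and table name are placeholders, not my actual values:

```shell
# Rough sketch of the TDCH export invocation (HDFS -> Teradata);
# jar filename, URL, credentials, and table are placeholders.
hadoop jar teradata-connector.jar \
  com.teradata.connector.common.tool.ConnectorExportTool \
  -url jdbc:teradata://tdserver/DATABASE=mydb \
  -username myuser -password mypass \
  -jobtype hdfs -fileformat textfile \
  -sourcepaths hdfs://server:port/path/to/file \
  -targettable mytable
```

The -sourcepaths value with the full hdfs:// URI is the part that gets treated as a plain local-cluster path.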
Because TDCH uses the MapReduce (MR) framework to interface with data in a given Hadoop cluster, and the TDCH job is submitted to the MR framework through the client node of that cluster, TDCH can only read/write data in the Hadoop cluster with which the client node is associated. It is possible to reconfigure the client node so that it submits MR jobs to a different Hadoop cluster (by modifying the local Hadoop config files), but specifying the details of the different Hadoop cluster via TDCH arguments is not sufficient.
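As a sketch of the config-file approach, the client node's Hadoop configuration would be pointed at the other cluster. The hostnames and ports below are placeholders, and the exact property names depend on your Hadoop version (e.g. fs.default.name on Hadoop 1.x):

```xml
<!-- core-site.xml on the client node: point the default filesystem
     at the other cluster's NameNode (hostname/port are placeholders) -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://other-cluster-nn:8020</value>
  </property>
</configuration>
```

On a YARN-based cluster you would similarly repoint the ResourceManager address (e.g. yarn.resourcemanager.address in yarn-site.xml) so that the MR job itself runs on the other cluster, not just the filesystem reads.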