TD Studio FastLoad Error

Teradata Studio
Enthusiast

TD Studio FastLoad Error

Hello - I have an external data file that needs to be imported into a table, and the load fails every time. I have tried loading into an existing table and importing into a volatile table. When I step through the load prompts, the data preview looks correct, but when I press Finish the data transfer fails. Below are sample data, the Load Data wizard selections, and the existing table DDL. The file I receive is a pipe-delimited .txt file with double quotes around the text values.

 

Any ideas? Could this issue have anything to do with the timestamp format of EventDate?

 

Sample data:

"ClientID"|"SendID"|"SubscriberKey"|"EmailAddress"|"SubscriberID"|"ListID"|"EventDate"|"EventType"|"BatchID"|"TriggeredSendExternalKey"
7280191|109838|"test@gmail.com"|"test@gmail.com"|69925664|839|1/7/2018 7:04:14 AM|"Unsubscribe"|310|""
7280191|123465|"test@hotmail.com"|"test@hotmail.com"|25603738|839|1/2/2018 10:56:25 AM|"Unsubscribe"|84|""
7280191|126293|"test@gmail.com"|"test@gmail.com"|25564752|839|1/3/2018 12:54:57 PM|"Unsubscribe"|2|""
7280191|126230|"test@gmail.com"|"test@gmail.com"|29089021|839|1/4/2018 8:20:52 PM|"Unsubscribe"|134|""
7280191|57739|"26847852_1_19806"|"test@gmail.com"|90008770|671|1/2/2018 10:56:15 AM|"Unsubscribe"|5924|"4894"

 

Load data wizard:

checked column labels in first row

column delimiter = |

character string delimiter = "

line separator = Windows OS (CR and LF)

file encoding = US-ASCII (also tried Cp1252)

 

ddl:

CREATE SET TABLE import_hold2 ,NO FALLBACK ,
NO BEFORE JOURNAL,
NO AFTER JOURNAL,
CHECKSUM = DEFAULT,
DEFAULT MERGEBLOCKRATIO
(
client_id INTEGER,
send_id INTEGER,
subscriber_key VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
email_address VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
subscriber_id INTEGER,
list_id INTEGER,
event_date VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
event_type VARCHAR(10) CHARACTER SET LATIN NOT CASESPECIFIC,
batch_id INTEGER,
triggered_send_external_key VARCHAR(4) CHARACTER SET LATIN NOT CASESPECIFIC)
PRIMARY INDEX ( client_id );
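For reference, a quick way to spot a mismatch between the file and the DDL before loading is to compute the widest value per column and compare it with the declared VARCHAR sizes. Below is a minimal Python sketch using the same delimiter (|) and string quote (") as the wizard settings; the inline `SAMPLE` string is a stand-in for the real file, and the `declared` widths are copied from the DDL above:

```python
import csv
import io

# Inline stand-in for the pipe-delimited .txt file; in practice,
# open the real file instead of this string.
SAMPLE = '''"ClientID"|"SendID"|"SubscriberKey"|"EmailAddress"|"SubscriberID"|"ListID"|"EventDate"|"EventType"|"BatchID"|"TriggeredSendExternalKey"
7280191|109838|"test@gmail.com"|"test@gmail.com"|69925664|839|1/7/2018 7:04:14 AM|"Unsubscribe"|310|""
7280191|57739|"26847852_1_19806"|"test@gmail.com"|90008770|671|1/2/2018 10:56:15 AM|"Unsubscribe"|5924|"4894"
'''

# Declared VARCHAR widths taken from the DDL (numeric columns omitted).
declared = {"SubscriberKey": 100, "EmailAddress": 100, "EventDate": 100,
            "EventType": 10, "TriggeredSendExternalKey": 4}

def max_widths(text):
    """Return the widest value seen per column, parsing with the
    same column delimiter and character string delimiter as the wizard."""
    reader = csv.reader(io.StringIO(text), delimiter="|", quotechar='"')
    header = next(reader)
    widths = {h: 0 for h in header}
    for row in reader:
        for h, v in zip(header, row):
            widths[h] = max(widths[h], len(v))
    return widths

widths = max_widths(SAMPLE)
for col, limit in declared.items():
    flag = "TOO WIDE" if widths[col] > limit else "ok"
    print(f"{col}: max {widths[col]} vs VARCHAR({limit}) -> {flag}")
```

Run against the sample rows, this flags EventType: "Unsubscribe" is 11 characters against VARCHAR(10).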

 

Thanks!


Accepted Solutions
Teradata Employee

Re: TD Studio FastLoad Error

The column event_type is defined as VARCHAR(10), but the value "Unsubscribe" is 11 characters, i.e. more than 10 characters, which causes the FastLoad failure.

 

Could you create the table with event_type as VARCHAR(20) and try to load the data again?

 

CREATE SET TABLE ravi.import_hold2 ,NO FALLBACK ,
NO BEFORE JOURNAL,
NO AFTER JOURNAL,
CHECKSUM = DEFAULT,
DEFAULT MERGEBLOCKRATIO
(
client_id INTEGER,
send_id INTEGER,
subscriber_key VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
email_address VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
subscriber_id INTEGER,
list_id INTEGER,
event_date VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
event_type VARCHAR(20) CHARACTER SET LATIN NOT CASESPECIFIC,
batch_id INTEGER,
triggered_send_external_key VARCHAR(4) CHARACTER SET LATIN NOT CASESPECIFIC)
PRIMARY INDEX ( client_id );
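As for the EventDate question: since event_date is declared VARCHAR(100), the timestamp format plays no part in the load failure; the value is stored as a plain string. If you later need to convert those strings to real timestamps, the file's format (single-digit month/day/hour with AM/PM) is parseable. A sketch in Python, not Teradata-specific, with a hypothetical helper name:

```python
from datetime import datetime

def parse_event_date(s):
    """Parse values like "1/7/2018 7:04:14 AM".
    %m, %d, and %I accept single digits; %p handles AM/PM."""
    return datetime.strptime(s, "%m/%d/%Y %I:%M:%S %p")

print(parse_event_date("1/7/2018 7:04:14 AM"))  # 2018-01-07 07:04:14
```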

 

Please let me know the result.

 

Thanks

 


Enthusiast

Re: TD Studio FastLoad Error

Yes, that worked. Thank you!