Keeping sessions alive between TPT steps


My script looks like the following. TPT disconnects between steps even though the target system is the same.

Is there a way to keep sessions alive so each step doesn't incur the overhead of disconnecting and reconnecting?


DEFINE JOB Import_DWH_Tables
Description 'Importing Various tables '
(
	
	STEP LOAD_TEAM (
		APPLY $INSERT
		TO OPERATOR ($LOAD ATTR(TargetTable = 'P_DOC_T.DW_DOC_TEAM_W'))
		SELECT * FROM OPERATOR ($FILE_READER ATTR(FileName='DW_DOC_TEAM.txt'));
	);
	STEP LOAD_INVENTORY (
		APPLY $INSERT
		TO OPERATOR ($LOAD ATTR(TargetTable = 'P_DOC_T.DW_DOC_INVENTORY_W'))
		SELECT * FROM OPERATOR ($FILE_READER ATTR(FileName='DW_DOC_INVENTORY.txt'));
	);
);


Re: Keeping sessions alive between TPT steps

Is that a no?

I suspect there's no way to get around this behavior, but it would be nice if a TPT expert could confirm.

I'm also curious why TPT by default closes all logon sessions between steps, even when the same connection will be needed in a subsequent step.

Teradata Employee

Re: Keeping sessions alive between TPT steps

No. The "special" sessions used for parallel data transfer with FastLoad, MultiLoad, and FastExport protocols (LOAD, UPDATE, EXPORT operators) cannot be reused.

And the logon overhead tends to be quite small in comparison to the rest of the utility step.

Re: Keeping sessions alive between TPT steps

Thanks Fred!

I'm drawn to using these templates because they cut way down on the amount of code I need to write to load multiple tables. But I'm finding that for smaller tables, the overhead of the LOAD or UPDATE operators is excessive. What would you suggest for loading a bunch of small lookup tables efficiently?

Teradata Employee

Re: Keeping sessions alive between TPT steps

I guess it depends on how small. Starting with the least initial overhead (but also the lowest throughput):

  • Use the SQL INSERTER operator (the number of operator instances controls the number of sessions; each session processes one row at a time)
  • Use the STREAM operator (the number of sessions is controlled by the TPT MaxSessions attribute, and the number of rows processed per request is controlled via the Pack attribute)
  • Reduce the number of parallel sessions for LOAD or UPDATE (the session count is controlled by workload management on the database side and is usually requested via a query band; e.g. QueryBandSessInfo='UtilityDataSize=Small;' is generally available for this purpose with the default configuration)
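As a rough, untested sketch of what those options look like in script form (the lookup table and file names are placeholders; $INSERTER and $STREAM are assumed to come from the standard TPT operator template set, alongside the $LOAD and $FILE_READER templates already used above):

DEFINE JOB Import_Small_Lookups
Description 'Loading small lookup tables'
(
	/* Option 1: SQL INSERTER - plain SQL sessions, no FastLoad-style utility slot */
	STEP LOAD_LOOKUP_A (
		APPLY $INSERT
		TO OPERATOR ($INSERTER ATTR(TargetTable = 'P_DOC_T.DW_LOOKUP_A'))
		SELECT * FROM OPERATOR ($FILE_READER ATTR(FileName='DW_LOOKUP_A.txt'));
	);
	/* Option 2: STREAM operator - Pack batches multiple rows per request */
	STEP LOAD_LOOKUP_B (
		APPLY $INSERT
		TO OPERATOR ($STREAM ATTR(TargetTable = 'P_DOC_T.DW_LOOKUP_B', Pack = 200))
		SELECT * FROM OPERATOR ($FILE_READER ATTR(FileName='DW_LOOKUP_B.txt'));
	);
	/* Option 3: keep $LOAD, but ask workload management for a small-utility slot */
	STEP LOAD_LOOKUP_C (
		APPLY $INSERT
		TO OPERATOR ($LOAD ATTR(TargetTable = 'P_DOC_T.DW_LOOKUP_C',
			QueryBandSessInfo = 'UtilityDataSize=Small;'))
		SELECT * FROM OPERATOR ($FILE_READER ATTR(FileName='DW_LOOKUP_C.txt'));
	);
);

Whether option 1 or 2 wins for a given table depends on row count; the query-band approach in option 3 keeps FastLoad's throughput while reducing the per-step session setup cost.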
Teradata Employee

Re: Keeping sessions alive between TPT steps

Each "Step" in a multi-step TPT script can be considered to be a separate job in its own right.

There are no relationships between any steps.

We provide steps in TPT scripts as a convenience (in a way) to allow the user to perform many distinct tasks within a single script.

For example, job setup, load data, cleanup.

 

We took the term "step" from IBM Mainframe JCL. Same concept. Each job step is unique and has no relation to any other job step.

 

-- SteveF