Importing data for dynamic table names is failing in Bteq

Enthusiast

 

Hi All

 

I am working on a Teradata requirement where I need to delete data from many tables. I am using the BTEQ IMPORT command with a text file containing the list of table names, together with the REPEAT option. Below is my code snippet:

 

.Import data file='table_names.txt';
.Repeat 5
Using (tabname VARCHAR(40))
SELECT * FROM :tabname;

 

My text file contains data in the format: 1M4_XZ_CCC_134OPPP_D340_H_try

 

I am getting the following error: 

.Import data file='table_names.txt';
+---------+---------+---------+---------+---------+---------+---------+----
.Repeat 5
+---------+---------+---------+---------+---------+---------+---------+----
Using (dot VARCHAR(40))
SELECT * FROM :dot;
*** Starting Row 0 at Tue Sep 26 09:04:26 2017

*** Error: The following occurred during an Access Module read:
Unexpected data format !ERROR! EOF encountered before expected EOR.
*** Warning: Out of data.
*** Finished at Tue Sep 26 09:04:26 2017

*** Total elapsed time was 1 second.

 

I found no relevant clue on this, as the same code runs fine if I use the variable in a WHERE clause with a fixed table name. Any help is appreciated; thanks in advance!

 


Senior Apprentice

Re: Importing data for dynamic table names is failing in Bteq

Hi,

 

Sorry but that is not allowed when using BTEQ.

 

The variable from the USING clause needs to be used as a data value in your query, not as an object name.

 

I also suspect that the IMPORT command should be IMPORT REPORT (not DATA), which would make the actual content of each row unreadable. Having said that, the IMPORT is not required here.

 

In order to run code like this your text file needs to contain entire SQL statements, e.g.:

SELECT * FROM table_name;

 

Having built that file you can then use:

.RUN FILE = table_names.txt;

 

...and BTEQ will run through each line of the file executing the sql statement.

 

But first you have to build the file. You could try something like:

 

.export report file = table_names.txt;

select 'select * from '||databasename||'.'||tablename||';' (title '')
from dbc.tablesv
where ???
;

.export reset;
.run file = table_names.txt;

The ??? in the above code is your selection criteria that identifies all tables that need to be processed.

 

 

You might need to change the BTEQ WIDTH setting if you have long table names.

 

Cheers,

Dave

Ward Analytics Ltd - information in motion
www: http://www.ward-analytics.com
Enthusiast

Re: Importing data for dynamic table names is failing in Bteq

Hi, thanks for your response. I am new to Teradata and am not able to understand how export will work for my scenario. My requirement is to delete a set of tables, where the list of tables will be available in an external text file. I need to read this text file and pass the table names to the BTEQ script one by one, like in a loop, to execute the DELETE command. Can you please explain in detail? Thank you.

Senior Apprentice

Re: Importing data for dynamic table names is failing in Bteq

Hi Bunny,

 

BTEQ does not have the capability for using a value read through an IMPORT command as an object name - in your case a table name. Values read using IMPORT can only be used as data values (e.g. a value in a column on a row in a table).

 

If you want to use BTEQ to delete all rows from a number of tables you have to have a couple of steps.

 

The first step is to build all of the "DELETE FROM table-name;" statements that you want to execute. This is the output from the 'export' processing that I showed earlier, although having looked back at it I understand why it might not make sense to you.

 

The second step is to read the 'delete' statements that you've built. This step actually does the work for you.

 

You still use the text file that you've been given, but in a different way.

 

The full processing is:

.export report file = delete_statements.txt;

.Import data file='table_names.txt';
.Repeat *
Using (tabname VARCHAR(40))
select 'DELETE FROM '||databasename||'.'||tablename||';' (title '')
from dbc.tablesv
where tablename = :tabname;

.export reset;
.run file = delete_statements.txt;

So the IMPORT command names your existing text file that contains a list of tablenames.

BTEQ loops through this file (using REPEAT *) and for each one returns a piece of text which contains (for example) "DELETE FROM mydb.mytable;"

This text is appended to another text file - identified by the EXPORT command (my example uses 'delete_statements.txt').

After the REPEAT loop is finished, the RUN FILE command is used to read in the content of 'delete_statements.txt' which BTEQ now treats as SQL commands (and not 'data').

 

The job gets done.

 

The above is using BTEQ features and capabilities. You could also do this using a stored procedure (SP).

Write the SP to read in a tablename.

It uses this tablename to create a "DELETE FROM" sql statement (delete_sql_statement).

Use the Teradata-supplied SP via "CALL DBC.SYSEXECSQL(delete_sql_statement);" to execute the SQL that you've built.

 

If you take this approach your BTEQ script now becomes:

.Import data file='table_names.txt';
.Repeat *
Using (tabname VARCHAR(40))
call mysp(:tabname);

Yes the BTEQ script is much simpler, but you have to create the SP first.
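As a rough sketch of such an SP (the database name, procedure name, and VARCHAR lengths are assumptions for illustration, not tested code):

```sql
REPLACE PROCEDURE mydb.mysp (IN p_tabname VARCHAR(40))
BEGIN
   DECLARE delete_sql_statement VARCHAR(200);

   /* Build the DELETE statement from the table name passed in */
   SET delete_sql_statement = 'DELETE FROM mydb.' || p_tabname || ';';

   /* Execute the dynamic SQL */
   CALL DBC.SysExecSQL(delete_sql_statement);
END;
```

The BTEQ REPEAT loop then simply calls this procedure once per record read from the text file.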

 

A couple of things that you also have to think about:

- are all your tables in a single database - which always has the same name? If not then your existing text file needs to have the databasename as well, i.e. two fields on each record.

- if you use the BTEQ/EXPORT method, note that when BTEQ opens a file for EXPORT, it always opens an existing file in 'append' mode, so you'll need to delete that file before each run.

- for syntax and examples of writing an SP check the Teradata manuals site (http://info.teradata.com/HTMLPubs/DB_TTU_15_10/index.html#page/SQL_Reference/B035_1148_151K/ch05.110...).
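On the first point, if the text file carries two fields per record (database name and table name), the earlier REPEAT block could be extended along these lines (the field lengths are assumptions):

```sql
.export report file = delete_statements.txt;

.Import data file='table_names.txt';
.Repeat *
Using (dbname VARCHAR(30), tabname VARCHAR(40))
select 'DELETE FROM '||databasename||'.'||tablename||';' (title '')
from dbc.tablesv
where databasename = :dbname
and   tablename    = :tabname;

.export reset;
.run file = delete_statements.txt;
```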

 

HTH

Dave

 

Ward Analytics Ltd - information in motion
www: http://www.ward-analytics.com
Enthusiast

Re: Importing data for dynamic table names is failing in Bteq

Hi Dave

 

It worked like a charm and solves my issue; thanks a lot for the solution. I tried the EXPORT REPORT FILE and IMPORT options and they work great. The only problem is that I am trying to capture the success/failure status of the delete statements, and also the table availability when creating the delete statements in the report file. I tried to echo the tables not available in dbc.tablesv using an IF activity_count condition after the select statement, but it's not working; it seems it is not possible to capture :tabname in an ECHO statement, as you said earlier. Also, is there any way of capturing only the failed table names while running .RUN FILE? Please suggest.

Senior Apprentice

Re: Importing data for dynamic table names is failing in Bteq

Hi Bunny,

 

I'm glad that worked for you.

 

You can't do what you're trying to do (in terms of recording success/failure etc.) using native BTEQ functionality; you'll need to write an SP for that. Within an SP you can trap the success/failure of the DELETE and, for example, write information to a 'log table'.
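A hedged sketch of what that SP might look like (the log table mydb.delete_log, its columns, and all other names are assumptions for illustration):

```sql
REPLACE PROCEDURE mydb.mysp (IN p_tabname VARCHAR(40))
BEGIN
   DECLARE delete_sql_statement VARCHAR(200);
   DECLARE v_failed SMALLINT DEFAULT 0;

   /* On any SQL error, set a flag and carry on, so one bad
      table name does not stop the whole run */
   DECLARE CONTINUE HANDLER FOR SQLEXCEPTION
      SET v_failed = 1;

   SET delete_sql_statement = 'DELETE FROM mydb.' || p_tabname || ';';
   CALL DBC.SysExecSQL(delete_sql_statement);

   /* Record the outcome in the log table */
   IF v_failed = 0 THEN
      INSERT INTO mydb.delete_log VALUES (:p_tabname, 'OK', CURRENT_TIMESTAMP);
   ELSE
      INSERT INTO mydb.delete_log VALUES (:p_tabname, 'FAILED', CURRENT_TIMESTAMP);
   END IF;
END;
```

Afterwards you can query the log table for the 'FAILED' rows, which also answers the question about capturing only the failed table names.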

 

Cheers,

Dave

Ward Analytics Ltd - information in motion
www: http://www.ward-analytics.com
Enthusiast

Re: Importing data for dynamic table names is failing in Bteq

Hi Dave

 

Thanks for correcting me. Is it possible to have two report files while using IMPORT with the REPEAT option? I am trying to write two SQLs inside REPEAT and redirect the query results into two separate text files; is that possible?

Senior Apprentice

Re: Importing data for dynamic table names is failing in Bteq

Hi Bunny,

 

Sorry but you can't do what I think you're trying to do - not with BTEQ.

 

I think your desired setup is something like:

 

.EXPORT FILE = output-data-file1;
.EXPORT FILE = output-data-file2;
.IMPORT FILE = input-data-file;
.REPEAT *
USING (input-record-field-definitions)
sql-statement#1 /* output from here to output-data-file1 */
;sql-statement#2; /* output from here to output-data-file2 */
.EXPORT RESET;

You can only have one 'export' file opened at a time; the example above will not write the output from different SQL statements to different files.

 

 

If I've misunderstood what you're trying to do please let me know.

 

Cheers,

Dave

 

Ward Analytics Ltd - information in motion
www: http://www.ward-analytics.com
Enthusiast

Re: Importing data for dynamic table names is failing in Bteq

@dave: No Dave, you got me right there. So we don't have any option except importing the data file twice with two different SQLs, or using a stored procedure?

Senior Apprentice

Re: Importing data for dynamic table names is failing in Bteq

Correct. And remember that with the SP approach your 'output' is to a database table and not an external file.
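For reference, that two-pass version might look roughly like this (using the same placeholder names as in the earlier example):

```sql
.EXPORT FILE = output-data-file1;
.IMPORT FILE = input-data-file;
.REPEAT *
USING (input-record-field-definitions)
sql-statement#1;
.EXPORT RESET;

.EXPORT FILE = output-data-file2;
.IMPORT FILE = input-data-file;
.REPEAT *
USING (input-record-field-definitions)
sql-statement#2;
.EXPORT RESET;
```

Each pass re-reads the same input file, so the cost is a second scan of the table-name list rather than a second copy of it.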

HTH

Dave

Ward Analytics Ltd - information in motion
www: http://www.ward-analytics.com