copying data from one database to another using BTEQ and parameterising the table name

Teradata Studio


I am new to Teradata and have a scenario where I need to copy some of my data from a higher region to a lower region based on two dates, and there are around 100 tables. Is there a way to automate this using BTEQ?

 

Until now I have been using the statement below to copy my data:

 

INSERT INTO <LOWER_REGION>.TB1 SELECT * FROM <UPPER_REGION>.TB1 WHERE DT BETWEEN <DT1> AND <DT2>;

 

Please provide your valuable suggestions

 

 

6 REPLIES
Enthusiast

Re: copying data from one database to another using BTEQ and parameterising the table name

Run the following SQL:

 

select 'Insert into ' || to_table || ' select * from ' || from_table || ' where change_this_to_suit_your_need ;'
from
(
select TableName, trim(DatabaseName) || '.' || trim(TableName) as from_table
from dbc.tables
where DatabaseName = 'your_from_database' and TableKind = 'T'
) f
,
(
select TableName, trim(DatabaseName) || '.' || trim(TableName) as to_table
from dbc.tables
where DatabaseName = 'your_to_database' and TableKind = 'T'
) t
where f.TableName = t.TableName;

 

 

Run the above SQL; it will give you a set of SQL statements to move the data from your source database to your target database.

This assumes that you have exactly the same table names in both the source and destination databases; the TableKind = 'T' filter keeps views and macros out of the generated list.

The WHERE clause is the tricky part: unless the date column has the same name in all your tables, you are a bit out of luck here.

If you know that there is only one date column in each table, you can modify this query to pull that date column into your WHERE condition, as in the sketch below.
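For instance, assuming each table has exactly one DATE column (ColumnType 'DA' in dbc.columns), and treating the database names and the <DT1>/<DT2> placeholders as values to substitute, a sketch of that modification could look like this:

select 'Insert into ' || to_table || ' select * from ' || from_table ||
       ' where ' || trim(c.ColumnName) ||
       ' between date ''<DT1>'' and date ''<DT2>'' ;'
from
( select TableName, trim(DatabaseName) || '.' || trim(TableName) as from_table
  from dbc.tables
  where DatabaseName = 'your_from_database' and TableKind = 'T' ) f
join
( select TableName, trim(DatabaseName) || '.' || trim(TableName) as to_table
  from dbc.tables
  where DatabaseName = 'your_to_database' and TableKind = 'T' ) t
  on f.TableName = t.TableName
join dbc.columns c  /* the single DATE column of each source table */
  on  c.DatabaseName = 'your_from_database'
  and c.TableName    = f.TableName
  and c.ColumnType   = 'DA';

A table with more than one DATE column would generate several INSERTs here, so check the output before running it.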

Good luck!

 

Re: copying data from one database to another using BTEQ and parameterising the table name

Thanks, but I have already tried this approach; it generates 100 SQL statements that I need to run manually, and my requirement is slightly different.

I want to put the list of all table names (100 tables) into a text file, e.g. <table.list>.

The TO and FROM dates should also be put in a text file, and

then I want to make the INSERT statement above parameterised or generic in nature in the BTEQ script,

 

so that my BTEQ script can take the table names and dates automatically from the list file (table.list) and all the statements get executed one after another without any manual intervention.

Please provide your suggestions

 

Thanks in advance!

 

Enthusiast

Re: copying data from one database to another using BTEQ and parameterising the table name

To do it that way, you will need to use BTEQ's .IMPORT option to read the table list,

then write the generated SQL out to a temporary file,

and then execute that temporary file from the same BTEQ session with .RUN.
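As a rough sketch of that flow (not a drop-in script: the logon, file paths, region database names, the DT column from the original post, and the hard-coded dates are all placeholders; the dates could just as well be spliced in from the second text file by a wrapper), the BTEQ could look like this:

.LOGON tdpid/your_user,your_password

.SET FORMAT OFF
.SET TITLEDASHES OFF
.SET WIDTH 500

.EXPORT REPORT FILE = /tmp/gen_inserts.sql

.IMPORT VARTEXT FILE = table.list
.REPEAT *
USING (tbl VARCHAR(128))
SELECT 'INSERT INTO LOWER_REGION.' || TRIM(:tbl) ||
       ' SELECT * FROM UPPER_REGION.' || TRIM(:tbl) ||
       ' WHERE DT BETWEEN DATE ''2016-05-06'' AND DATE ''2016-05-10'';'
       (TITLE '');

.EXPORT RESET

.RUN FILE = /tmp/gen_inserts.sql

.LOGOFF
.QUIT

.REPEAT * makes BTEQ run the generating SELECT once per line of table.list, so the exported file ends up with one INSERT/SELECT per table, which .RUN then executes in the same session. You may need to tweak the .SET commands so each generated statement comes out on a single clean line.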

 

If this is a one-time effort, I would simply generate the SQL statements from SQL Assistant, paste the generated SQL into another window, and run it.

SQL Assistant logs each SQL statement and its result message, so if any of the statements errors out you can revisit it, correct it, and rerun it.

Hope this helps

 

Re: copying data from one database to another using BTEQ and parameterising the table name

Yeah, I agree with you, but this task is not a one-time effort: the number of tables to be loaded varies with the requirement, and we receive the table names to be copied in a file

in the format below:

<SRC_TBL_NM> | <TGT_TBL_NM> | FROM_DT | TO_DT

e.g.: abd.tab1 | bbb.tab2 | 2016-05-06 | 2016-05-10

abd.tab1 | bbb.tab2 | 2016-04-06 | 2016-04-08

 

The loading dates for each table vary too, hence I am trying an approach other than a stored procedure and am exploring the BTEQ option. It would be great if you could provide some insight :)
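Building on the sketch in the earlier reply, a file in that four-field format could drive the generation directly; only the .IMPORT and the generating request change (again a sketch: the DT column name and the VARCHAR sizes are assumptions):

.IMPORT VARTEXT '|' FILE = table.list
.REPEAT *
USING (src_tbl VARCHAR(128), tgt_tbl VARCHAR(128),
       from_dt VARCHAR(30),  to_dt   VARCHAR(30))
SELECT 'INSERT INTO ' || TRIM(:tgt_tbl) ||
       ' SELECT * FROM ' || TRIM(:src_tbl) ||
       ' WHERE DT BETWEEN DATE ''' || TRIM(:from_dt) ||
       ''' AND DATE ''' || TRIM(:to_dt) || ''';'
       (TITLE '');

The TRIMs strip the blanks around the '|' separators, and each input line yields one INSERT/SELECT with its own date range, so per-table load dates fall out naturally.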

Teradata Employee

Re: copying data from one database to another using BTEQ and parameterising the table name

You can try this using Teradata Studio. Its Copy Table feature allows you to provide a filter on the column data, which is converted to a WHERE clause for the INSERT/SELECT.

Re: copying data from one database to another using BTEQ and parameterising the table name

Thanks, guys, for your suggestions, but I found my own solution: I wrote a small BTEQ script to do this task and a small shell script to trigger it automatically based on the user requirement :)