Hi folks!
I have an audit table defined in Teradata that stores the audit counts (number of records read, inserted, updated, and deleted, plus start and end times) of the jobs executing in production. The jobs are defined in Informatica and connect to Teradata through the Teradata connection.
The jobs are set up to run in sequence, with deadlocks cited as the reason; I have almost 50 jobs running sequentially. Is there any option in Teradata that could be set to avoid deadlocks, or any property that would make the interfacing client jobs wait until another job commits its records to the database?
Thanks in advance for your suggestions.
I don't remember how it is done in Informatica, but in Ab Initio you can put in phases.
How about locking? I'm thinking along these lines: if it is a script, you can issue a LOCKING TABLE ... FOR WRITE NOWAIT request before the DML; if it fails, you know the table is locked.
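For reference, a minimal sketch of that NOWAIT idea in Teradata SQL (the table and column names here are hypothetical): the NOWAIT option makes the request abort immediately with error 7423 instead of queuing when the lock cannot be granted, so a wrapper script can trap that error and retry later.

```sql
-- Sketch only: DEMO_DB.audit_tbl, job_status, and job_id are hypothetical.
-- If the WRITE lock cannot be granted immediately, the request aborts
-- with error 7423 ("Object already locked and NOWAIT. Transaction
-- Aborted.") instead of waiting in the lock queue.
LOCKING TABLE DEMO_DB.audit_tbl FOR WRITE NOWAIT
UPDATE DEMO_DB.audit_tbl
SET job_status = 'DONE'
WHERE job_id = 42;
```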
Thank you Dieter & Raja.
UPDATE AUDIT_D.audt_load_v
FROM PM_VK7OEKKZZV5FM5KMHHK5MIBOZLA
SET job_cmpl_dtm = PM_VK7OEKKZZV5FM5KMHHK5MIBOZLA.rpcn_cmpl_dtm
WHERE AUDIT_D.audt_load_v.process_strt_dtm = PM_VK7OEKKZZV5FM5KMHHK5MIBOZLA.process_strt_dtm
  AND AUDIT_D.audt_load_v.pwr_session_txt = PM_VK7OEKKZZV5FM5KMHHK5MIBOZLA.pwr_session_txt
The update is not based on the Primary Index, but on other criteria.
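Since the WHERE clause does not use the Primary Index, Teradata applies a table-level lock for this update anyway, so one common way to reduce deadlocks is to take that WRITE lock explicitly at the start of the request rather than letting the lock be escalated mid-transaction. A sketch against the statement above (names unchanged from the post; if audt_load_v is a view rather than a table, the modifier would be LOCKING VIEW ... FOR WRITE instead):

```sql
-- Acquire the table-level WRITE lock up front, so every job takes its
-- locks in the same order instead of upgrading a READ lock later.
LOCKING TABLE AUDIT_D.audt_load_v FOR WRITE
UPDATE AUDIT_D.audt_load_v
FROM PM_VK7OEKKZZV5FM5KMHHK5MIBOZLA
SET job_cmpl_dtm = PM_VK7OEKKZZV5FM5KMHHK5MIBOZLA.rpcn_cmpl_dtm
WHERE AUDIT_D.audt_load_v.process_strt_dtm = PM_VK7OEKKZZV5FM5KMHHK5MIBOZLA.process_strt_dtm
  AND AUDIT_D.audt_load_v.pwr_session_txt = PM_VK7OEKKZZV5FM5KMHHK5MIBOZLA.pwr_session_txt;
```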
Yes, there are SELECTs and UPDATEs on the same table. I am not sure of the transaction mode; the Informatica source qualifier uses a Teradata connection to read and update the table.
Yes, I thought of using the LOCKING ROW modifier in the override statement, but it seems that it is only effective when the statement accesses the table through the Primary Index.
Correct me if I am wrong.
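That understanding matches how LOCKING ROW works: it requests a row-hash lock, which can only be honored when the request touches a single row hash, i.e. when access is through the Primary Index (or a unique secondary index); otherwise it is upgraded to a table-level lock. A hypothetical example of the documented select-then-update pattern (DEMO_DB.audit_tbl and job_id are illustrative names, with job_id assumed to be the PI):

```sql
-- Select-then-update inside one explicit transaction (BTEQ, Teradata mode).
-- Taking the row-hash WRITE lock on the SELECT avoids the classic
-- deadlock caused by upgrading a READ lock to WRITE at update time.
BT;

LOCKING ROW FOR WRITE
SELECT job_cmpl_dtm
FROM DEMO_DB.audit_tbl
WHERE job_id = 42;  -- PI access, so the row-hash lock applies

UPDATE DEMO_DB.audit_tbl
SET job_cmpl_dtm = CURRENT_TIMESTAMP
WHERE job_id = 42;

ET;
```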