We are loading table 'B' from table 'A' through a Teradata trigger (whenever a row is inserted into table 'A', the processed record is inserted into 'B'; it's a real-time load running 24x7).
Now we are trying to DELETE old data (up to yesterday) from 'B', but we are hitting a deadlock issue because the trigger is loading the table in parallel.
Is there any way to delete data while the table is being loaded? We want to delete only old data, nothing the trigger is currently inserting.
I imagine you were trying a set DML statement like "DELETE FROM &lt;table&gt; WHERE txn_date &lt; CURRENT_DATE". That requires a table-level write lock. If there is any way to stop those triggers for just a few minutes, you could INSERT-SELECT only today's activity into a new table, drop the old table, rename the new table, and start the triggering processes again.
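The swap approach above might look like the following sketch. It assumes 'B' has a txn_date column and that you can disable the triggers during the window; table and column names are illustrative:

```sql
/* Step 0: stop or drop the triggers on table A before this window. */

CREATE TABLE B_new AS B WITH NO DATA;   -- copy B's definition, no rows

INSERT INTO B_new
SELECT *
FROM B
WHERE txn_date >= CURRENT_DATE;         -- keep only today's activity

DROP TABLE B;
RENAME TABLE B_new TO B;

/* Step 1: re-create/enable the triggers so the real-time load resumes. */
```

Since INSERT-SELECT into an empty table and DROP/RENAME are fast, the trigger outage stays short; the old data never has to be deleted row by row at all.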
The next best way would be to Export (via TPT) all the primary index values for the rows you want to delete, and then run another TPT script using the Stream operator to read those PI values and delete those rows. That would execute in parallel and use row hash-level locking.
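A simplified TPT job for the Stream-delete step could be sketched as below. The operator attributes, file name, and the pk_col column standing in for B's primary index are all assumptions; a real script also needs the Export step (not shown) that writes the PI values of the old rows to the flat file:

```sql
/* Hypothetical TPT sketch; adjust schema, credentials, and file format. */
DEFINE JOB delete_old_rows
(
  DEFINE SCHEMA pi_schema ( pk_col INTEGER );   /* B's primary-index column */

  DEFINE OPERATOR file_reader
  TYPE DATACONNECTOR PRODUCER
  SCHEMA pi_schema
  ATTRIBUTES ( VARCHAR FileName = 'pi_values.dat',
               VARCHAR Format   = 'Formatted' );

  DEFINE OPERATOR stream_op
  TYPE STREAM
  SCHEMA pi_schema
  ATTRIBUTES ( VARCHAR TdpId        = 'mytdp',
               VARCHAR UserName     = 'user',
               VARCHAR UserPassword = 'pwd' );

  APPLY ( 'DELETE FROM B WHERE pk_col = :pk_col;' )
  TO OPERATOR ( stream_op )
  SELECT * FROM OPERATOR ( file_reader );
);
```

Because each DELETE targets a single PI value, the Stream operator only takes row-hash locks, so it can coexist with the trigger's concurrent inserts.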
A similar but slower method would be to write a stored procedure with a cursor that reads the qualifying rows and deletes them one at a time. Using cursors this way is against my religion, and it would take a long time, but it would work. You could speed it up by instantiating dozens (or hundreds?) of such SPs, each reading and deleting a different subset of the rows, but it still wouldn't be as fast as the Stream operator, and it would be more likely to deadlock on hash collisions.
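A minimal sketch of such a cursor-based procedure, again assuming a pk_col primary-index column and a txn_date column on 'B':

```sql
/* Hypothetical sketch; each single-row DELETE takes only a row-hash lock. */
REPLACE PROCEDURE delete_old_rows()
BEGIN
  FOR rec AS cur CURSOR FOR
      SELECT pk_col
      FROM B
      WHERE txn_date < CURRENT_DATE
  DO
      DELETE FROM B WHERE pk_col = rec.pk_col;
  END FOR;
END;
```

To parallelize it, each SP instance would add a predicate carving out its own slice of the keys (for example, a MOD on pk_col), at the cost of more chances to collide with the trigger's inserts.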
Is the real-time load inserting one row per transaction, or multiple rows before committing? And what type of trigger is it (row or statement)?
What about the DELETE transaction? One statement for the entire table, one row per transaction, or multiple single-row deletes in one transaction?
You would have to use the Stream operator whenever you can't get a table-level write lock. Streamed transactions lock at the row-hash level.