I am running a query that creates a volatile multiset table, runs an INSERT INTO step to populate the table, and follows with a SELECT * from that volatile table. I have two issues for which I would love to find solutions.
First, in my test case, after I ran the INSERT INTO step, it reported 1,000 records written. When I ran the SELECT *, it said the table didn't exist. I reran the INSERT step a second time and again it reported 1,000 records written. When I tried the SELECT again, it found the table, but displayed 2,000 records. I would like to use the table as populated the first time around, but I haven't been able to do that. (This worked in a later test, but it doesn't seem to be dependable.)
Second, I would like to know if there is any way to turn on the "Import Data" option (under the File menu) from within the query. "Shift+Insert" is the keyboard shortcut for it, but I would like to do it programmatically instead of having to run three separate queries. Also desirable would be the ability to specify the path and name of the .txt file. Essentially, I would like to be able to submit the entire query without getting either the message that says "Input style parameters found in query" or the one that says "No Import parameters ... found in this query".
My query is:
/* CL_PROMO_COUPON_HIST_BCARD */
CREATE VOLATILE MULTISET TABLE mytbl7
  ( cl_tid  CHAR(12),
    dl_tid  CHAR(12),
    prodid  CHAR(3),
    promoid CHAR(2),
    cpnamt  CHAR(12),
    cpnstat CHAR(15),
    rdmamt  CHAR(12),
    rdmdte  CHAR(10),
    updte   CHAR(10) )
PRIMARY INDEX ( cl_tid )
ON COMMIT PRESERVE ROWS;
(received: CREATE TABLE completed. 0 rows processed.)

(turn on Import mode)
INSERT INTO mytbl7
  (cl_tid, dl_tid, prodid, promoid, cpnamt, cpnstat, rdmamt, rdmdte, updte)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?);
(supply the name of the .txt file; 1,000 records imported)

SELECT * FROM mytbl7;
(SELECT completed. 1000 rows returned)
Any suggestions would be much appreciated.
I'm not sure this can be done using SQLA; that tool is intended to be an interactive desktop tool.
Have you looked at using BTEQ to do this? You can run that from a Windows client and you can 'script' all sorts of stuff.
- this will work fine if your data file always has the same path/file name
- you then use an ".IMPORT" command to identify the file, a USING clause to describe each input field and a ".REPEAT" command to loop through all rows in the file.
I don't know what happened to generate your 'table not found' error. If you can recreate it dependably, we can have a look at it.
Thanks for the reply. Not the answer I was hoping for. I haven't used BTEQ in more than 10 years. I guess it's time to refresh.
As for the "table not found" issue, it's not predictable. Sometimes it works, sometimes it doesn't. The comments I inserted into the code listing are about the only things I have. Maybe I'll try SnagIt Video Capture and see if it will help. If I get anything useful out of it, I'll send another reply with an attachment.
You will get "table not found" when you submit a second query in another tab while the first is still running and hasn't finished yet.
SQL Assistant will open a new session, run the query and logoff again.
You can check dbc.LogonOffV for those short sessions.
dnoeth, thanks for the suggestion, but it's not a tab issue. The three pieces of the query are run in the same tab. The create step always works. Sometimes the import step works. But if I make it that far, the select step usually fails. It's always the same message - table doesn't exist.
As for checking the dbc.LogonOffV view, how do I do that? I don't have any admin permissions, if those are required.
(I come from the SAS world where logs are created automatically. I miss that in SQLA. Is there any way to get information about the query processes other than the History tab, or to see what tables/views including volatile tables are available to the query?)
select * from dbc.LogonOffV
where UserName = user and LogDate = DATE '2017-07-21'
order by LogTime;

returns one row per event, usually a Logon and a Logoff per session.
If you don't have access you can try dbc.LogonOffVX instead.
Regarding logs, there's the Database Query Log (DBQL) with lots of resource usage details about finished queries, but the DBAs need to enable it (which they usually do) and you need access rights.
Available tables (and other objects) are found in dbc.TablesVX; for volatile tables there's HELP VOLATILE TABLE, but you should know which tables you created within the current session.
You might add a SELECT SESSION between the different queries to see if the session number changes.
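Just as a sketch, the three-step run with those diagnostics interleaved would look like this (run in the same SQLA tab; the comments mark what to check at each point):

    /* after the CREATE VOLATILE TABLE step */
    SELECT SESSION;        -- note the session number

    /* after the INSERT/import step */
    SELECT SESSION;        -- if this number changed, SQLA opened a new session
    HELP VOLATILE TABLE;   -- the volatile table should be listed here

    SELECT * FROM mytbl7;

If the session number differs between steps, the volatile table created in the earlier session is gone, which would explain the "table doesn't exist" error.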