Hello, I just upgraded to TTU15 on a 64-bit Windows OS and installed all the goodies. Now I'm running a large monthly process where there are a lot of FASTEXPORTS And FASTLOADS to be done, all as BTEQ scripts run in a DOS window. FASTEXPORT: no issues. Upon running the FASTLOAD, I get "Empty file on read open" and it finishes with zero rows read, etc... yet the file is a 5GB pipe-delimited data file.
Can someone please share with me a simple way to convert a straightforward BTEQ FASTLOAD run in a Windows DOS window to a TPT FASTLOAD that can be run the same basic way, so the new TTU15 can process this data load? I'm hoping there is little change, just syntax tweaks to the existing scripts. Many thanks in advance. Please let me know what additional detail you may need.
-- Begin sample script
-- BTEQ Old School FASTLOAD
SESSIONS 10 ;
ERRLIMIT 50 ;
RUN C:\some_logins_dir\xxxx_LOGON.TXT ;
SHOW VERSIONS ;
.SET RECORD TEXT
DROP TABLE database.target_table ;
DROP TABLE database.target_table_err1 ;
DROP TABLE database.target_table_err2 ;
CREATE SET TABLE database.target_table
, NO FALLBACK
, NO BEFORE JOURNAL
, NO AFTER JOURNAL
, CHECKSUM = DEFAULT
, DEFAULT MERGEBLOCKRATIO
(
column_name_1 VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
column_name_2 VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
column_name_3 VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC
)
PRIMARY INDEX ( column_name_1 );
RECORD 1 ;
SET RECORD UNFORMATTED ;
SET RECORD VARTEXT "|" NOSTOP;
DEFINE column_name_1 (VARCHAR(255))
, column_name_2 (VARCHAR(255))
, column_name_3 (VARCHAR(255))
FILE=C:\some_dir\filename_here.TXT ;
BEGIN LOADING database.target_table
ERRORFILES database.target_table_err1 ,
database.target_table_err2 ;
INSERT INTO database.target_table
VALUES ( :column_name_1, :column_name_2, :column_name_3 ) ;
END LOADING ;
-- end of sample script
The "Empty file on read open" error can happen for the following reasons:
1) It cannot find the source data file. Make sure the file exists and that the user has read permissions on the file.
2) The BAD file directory either does not exist, or the user executing the script does not have permission to write to it, or a file with the same name already exists there (written by someone else's process) and you do not have write permission on that file.
3) Check the checkpoint and log directories to make sure you have permission to write both to the directories and to the file names if they already exist.
Hope this helps.
I checked this (the access and privileges) several times, and it still gives the same "empty file on read open". So I sampled off 100 rows from the FEXP extract and then loaded those 100 rows with the FLOAD setup, and it read the file fine and loaded the data. When I set it up to run the whole thing and produce a ~5GB file, as I have done over a dozen times in the last year, it fails with this strange error. The only change to running this regular set of FEXP and FLOAD jobs is that I upgraded to TTU15. I get the impression there may be a file load size limit in place now?? I'm stumped.
What version of FastLoad were you running when you get the error?
And just to confirm, when you upgraded to TTU15, FastLoad 15.0 works fine with the large file?
I'm running the latest FASTLOAD version 15 when the error occurs. I used TTU 13 and TTU 14 when I was running the same BTEQ/FLOAD with no error. To upgrade to TTU15, I had to remove every trace of previous TD products. So now when I run, it's all TTU15, and I get the "empty file on read open" error on the 5GB file.
I've renamed the file to a shorter name, moved it to another directory, made sure of permissions, and recreated it a few times, all to no avail: the large file gives the error. A 100-row file using TTU15 and the same existing BTEQ FEXP and FLOAD jobs loads fine, no errors.
BTW, this is on a PC with Windows 7, a 64-bit OS, and the 64-bit TTU15 installed. I understood the TTU15 x64 package to install all 64-bit tools. When I check the FASTLOAD version, it says 15.00.00, but I also see it say PLATFORM WIN32. I see lots of 64-bit tools in that package, including TPTbase and TPTstream. Did the older utilities not get a 64-bit facelift? Is this ultimately a 64-bit incompatibility I'm facing?
FastLoad is only delivered as a 32-bit app. All of the older utilities remain 32-bit.
TPT has both but defaults to 64-bit.
I am curious about this statement:
"a lot of FASTEXPORTS And FASTLOADS to be done, all as BTEQ scripts"
BTEQ cannot run FastLoad scripts.
However, back to the matter at hand.
I think you were asking about a TPT script that can do the same thing as FastLoad.
I can help you out with that but need a few things explained.
In the FastLoad script, I see that you have multiple SET RECORD commands.
Please explain why you did this.
You can actually do this with TPT rather simply.
DEFINE JOB my_job
(
  APPLY
  ('INSERT INTO database.target_table
    VALUES (:column_name_1, :column_name_2, :column_name_3);')
  TO OPERATOR ($LOAD)
  SELECT * FROM OPERATOR ($FILE_READER);
);
And then a job variable file to pass in the metadata for the job:
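A sketch of what that job variable file might contain, using the TTU template-operator naming convention; the system name, credentials, and file names below are placeholders you would replace with your own values:

```
/* jobvars.txt -- placeholder values; adjust for your environment */
TargetTdpId             = 'my_td_system'
TargetUserName          = 'my_user'
TargetUserPassword      = 'my_password'
TargetTable             = 'database.target_table'
LoadErrorTable1         = 'database.target_table_err1'
LoadErrorTable2         = 'database.target_table_err2'
FileReaderDirectoryPath = 'C:\some_dir'
FileReaderFileName      = 'filename_here.TXT'
FileReaderFormat        = 'Delimited'
FileReaderTextDelimiter = '|'
FileReaderOpenMode      = 'Read'
```

The "FileReader" and "Target"/"Load" prefixes correspond to the $FILE_READER and $LOAD template operators referenced in the script, which pick these values up automatically.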
You would then run the job via:
tbuild -f <script-name> -v <job-variable-file-name>
The TPT Load operator will not create the table for you.
You can do that in a TPT script as well, but that should be lesson #2. :)
I would like to see you get the TPT job running first.
(Or check out the TPT manual for the job-step syntax and use the DDL operator to do the DROPs and CREATEs.)
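As a preview of that lesson #2, here is a hedged sketch of the multi-step form, using the $DDL template operator for the DROPs and CREATE ahead of the load (in practice you would also tell the DDL operator to ignore the "table does not exist" error on the DROPs; see the TPT manual for the ErrorList attribute):

```
DEFINE JOB my_job
(
  STEP setup_tables
  (
    APPLY
    ('DROP TABLE database.target_table;'),
    ('DROP TABLE database.target_table_err1;'),
    ('DROP TABLE database.target_table_err2;'),
    ('CREATE SET TABLE database.target_table , NO FALLBACK
      (
      column_name_1 VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      column_name_2 VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      column_name_3 VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC
      )
      PRIMARY INDEX (column_name_1);')
    TO OPERATOR ($DDL);
  );
  STEP load_data
  (
    APPLY
    ('INSERT INTO database.target_table
      VALUES (:column_name_1, :column_name_2, :column_name_3);')
    TO OPERATOR ($LOAD)
    SELECT * FROM OPERATOR ($FILE_READER);
  );
);
```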
Thanks for this feedback and help. What I believe at this time is that for my configuration (64-bit hardware, 64-bit OS, and 64-bit TTU15), using the older 32-bit utilities to run the FastLoad scripts will error on large file sizes, and that it is a 32/64-bit incompatibility.
This means that if I insist on running them my old-school way, I have to back out the 64-bit TTU15 and install the 32-bit TTU15. I have checked within my environment, and others successfully run TTU15 and FastLoad scripts the old-school way (plus TPT), BUT they are all 32-bit and have no issues with large files.
For my particular application, I'm pulling data from one TD platform and loading to another TD platform, going back and forth several times as code executes and changes the data that gets passed back and forth. I think that using TPT to 'mainline' it, cutting out the creation of a physical file and the repeated loading back and forth from the 'outside' (upload/fastload from the PC), will be best. However, that will change some of my established processes.
I have started to go through the process of converting the FEXP/FLOAD to TPT; I have it running now, with issues that I'm hammering out. I trust that going this route will keep it 64-bit end-to-end and eliminate any file size issue, as there will be no 'file' in the traditional sense.
I meant that I was kicking off the FLOAD and FEXP scripts from a DOS window, and I was calling that BTEQ in error.
Well, 32-bit FastLoad should be able to read a 5GB file. I will have someone look into that.
On the flip-side, I am glad you will move to TPT.
It is the best solution.
To move data from one database or system to another without landing the data is one of TPT's strongest assets.
If you need help, let me know.
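As a starting point for that database-to-database pattern, here is a hedged sketch using the $EXPORT and $LOAD template operators, with the source and target systems, credentials, and table names supplied as job variables; the rows stream directly from the export operator to the load operator with no intermediate file:

```
DEFINE JOB move_between_systems
(
  APPLY
  ('INSERT INTO database.target_table
    VALUES (:column_name_1, :column_name_2, :column_name_3);')
  TO OPERATOR ($LOAD)
  SELECT * FROM OPERATOR ($EXPORT);
);
```

The $EXPORT operator would be pointed at the source system (e.g. SourceTdpId and a SELECT in the ExportSelectStmt job variable), while $LOAD points at the target, so a single tbuild invocation replaces the FEXP-to-file-to-FLOAD round trip.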
Please install DataConnector 15.00.00.02 Efix or later to resolve "empty file on read open" error for large files on Windows platform.