Date: Tue, 11 Jan 2000 11:59:32 -0800
Hi, Kim
I agree with your idea.
1. Add a dummy key field with a flag such as Post/Unpost, and a date-and-time stamp as the sort item, so the records fall in sorted order on the detail dataset.
2. Next, do a backward chained read on this path, selecting on the unpost flag, until you reach the end of the chain (the last posted date).
3. During or after the process, change the unpost flag to the post flag (a critical-item update).
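The steps above can be sketched in Python. This is only a simulation of the approach (the field names `stamp`, `flag`, and `data` are assumptions, and in the real application this would be chained DBGETs from a COBOL program); the sorted chain is modeled here with an explicit sort:

```python
# Minimal sketch of the post/unpost flag approach (assumed record layout).
# Each record carries a timestamp and a flag; walking the sorted chain
# yields unposted records oldest-to-newest, and the flag is flipped once
# a record has been moved (the critical-item update in step 3).

UNPOST, POST = "U", "P"

def move_unposted(records, send_to_oracle):
    # The dummy-key path with the timestamp as sort item would return the
    # chain in timestamp order; sorting here simulates that behavior.
    chain = sorted((r for r in records if r["flag"] == UNPOST),
                   key=lambda r: r["stamp"])
    for rec in chain:
        send_to_oracle(rec)    # step 2: read down the chain, oldest first
        rec["flag"] = POST     # step 3: mark the record as posted

records = [
    {"stamp": "2000-01-11 11:05", "flag": UNPOST, "data": "B"},
    {"stamp": "2000-01-11 11:01", "flag": UNPOST, "data": "A"},
    {"stamp": "2000-01-11 10:50", "flag": POST,   "data": "already moved"},
]
moved = []
move_unposted(records, moved.append)
print([r["data"] for r in moved])  # unposted records, oldest first
```

Because already-posted records are filtered out by the flag, the batch run can safely overlap with on-line inserts: anything added while the chain is being walked still carries the unpost flag and is picked up on the next pass.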
Hope this helps.
Peter Chong
Sr. ERP/MRP Analyst.
714.956.9200 x 363
http://www.powerparagon.com
>>> Kim Heckler <[log in to unmask]> 01/11/00 11:33AM >>>
I have an application that must move transactions from an Image database
(a detail dataset) over to an Oracle database. The transactions must be
moved in chronological order (sorted by date and time). We decided to use
a database, rather than a KSAM/XL file or a sorted flat file, so we don't
run the risk of losing data in the case of a system crash.
Currently I have a COBOL program that reads through the detail dataset
serially; once a record is successfully added to the Oracle table, the
record is deleted from the Image dataset (this is a batch process). When
the COBOL program reaches the end-of-file mark, it starts over at the
top of the dataset with a new serial read. While the program is moving
these transactions, more records are being added interactively (an on-line
process).
My question: to ensure the correct order of the transactions
(chronological order, oldest to newest), would it be wise to add a dummy
key field, which always has the same value, and have the date-and-time
stamp be a sort field on that dummy key path? The batch COBOL program
could then read down the chain, deleting records as it reads them.
Does the manner in which I'm doing this now, without the dummy key field,
ensure I will get the records in chronological order? I do not think it
will.
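Kim's doubt can be illustrated with a small sketch: a serial read returns records in physical slot order, and since a detail dataset reuses the space freed by deleted records, a newly added record can land in an earlier slot than older records. The slot-reuse model below is a simplification for illustration, not the actual IMAGE space-management algorithm:

```python
# Sketch: why a serial read need not return chronological order.
# Simplified model: deleted slots are reused for new records, so a newer
# record can occupy an earlier physical position than older ones.

def serial_read(slots):
    # Return records in physical slot order, skipping empty slots.
    return [r for r in slots if r is not None]

slots = ["09:00", "09:05", "09:10"]   # timestamps, in entry order
slots[0] = None                       # 09:00 moved to Oracle and deleted
slots[0] = "09:15"                    # a new record reuses the freed slot

print(serial_read(slots))             # ['09:15', '09:05', '09:10'] -- out of order
print(sorted(serial_read(slots)))     # chronological, as a sorted path would give
```

This is the gap the dummy-key sorted path closes: the chain is ordered by the timestamp sort item regardless of where each record physically sits.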
Thank you in advance for your help.
Kim.