Date: Fri, 28 Apr 1995 18:37:09 +1000
Content-Type: text/plain
At 11:17 PM 27/4/95 -0400, Denys Beauchemin wrote:
>I respectfully disagree that your solution is efficient. The other solutions
>provided for this problem all involved re-reading the master entry location
>after the DBDELETE. If a migrating secondary situation is involved, the
>secondary chain will be deleted in its entirety at that location. Secondary
>chains are usually not very long unless you are dealing with an Integer key
>type and you have sampled the data to get a capacity instead of calculating
>the exact capacity which will yield no secondaries.
>
>However, in the re-read technique, you only go through the dataset once,
>catching everything. You advocate going through the dataset twice. I find
>this to be inefficient.
I think Gilles was correct...almost. In fact I misread it the first time
and assumed the best solution was there, so I didn't consider it further.
I thought the solution was:
----------------------------------------------------------------------
dbget(mode 2) << get first >>
while (status.element(1) = 0) do
begin
while (status.element(5-6) <> 0) do
begin << synonyms exist >>
dbdelete(..) << delete primary >>
dbget(mode 1) << reread new primary >>
end
dbdelete(..) << synonym or primary with no synonyms >>
dbget(mode 2) << get next >>
end
if status.element(1) <> 11 then error(..)
-----------------------------------------------------------------------
The only improvement this offers over some of the previous solutions is
that it skips the dbget(mode 1) reread when it is already known the call
would fail, i.e. when no synonyms remain at that location.
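For anyone who wants to see the loop shape outside of IMAGE, here is a toy
Python model of the same idea. Everything here is invented for illustration
(MasterDataset, put, delete, synonym_count, delete_all are not IMAGE
intrinsics), and the model keeps synonym chains in a side table rather than
in real record slots, so it captures only the reread-after-delete logic,
not IMAGE's actual storage layout.

```python
class MasterDataset:
    """Deliberately simplified model of a master dataset whose primaries
    have migrating secondaries (synonyms)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.slots = [None] * capacity   # one record location per slot
        self.chains = {}                 # slot -> list of synonym keys

    def put(self, key):
        home = key % self.capacity       # toy hashing scheme
        if self.slots[home] is None:
            self.slots[home] = key       # first arrival becomes the primary
        else:
            # later arrivals with the same home become secondaries
            self.chains.setdefault(home, []).append(key)

    def synonym_count(self, slot):
        # stands in for checking status words 5-6 after a dbget here
        return len(self.chains.get(slot, []))

    def delete(self, slot):
        # deleting a primary that has synonyms migrates the first
        # secondary into the primary's location
        chain = self.chains.get(slot)
        if chain:
            self.slots[slot] = chain.pop(0)
        else:
            self.slots[slot] = None


def delete_all(ds):
    """Single serial pass, like dbget(mode 2) + dbdelete in the post."""
    for slot in range(ds.capacity):          # serial walk (dbget mode 2)
        if ds.slots[slot] is None:
            continue
        while ds.synonym_count(slot) != 0:   # synonyms exist
            ds.delete(slot)                  # delete primary
            # the "reread" is implicit: slots[slot] now holds the migrant
        ds.delete(slot)                      # synonym-free entry
```

One pass over the slots suffices because the reread (here, simply looking
at the same slot again) catches each migrated secondary before the scan
moves on.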
----
Jim "seMPEr" Wowchuk Internet: [log in to unmask]
Vanguard Computer Services Compu$erve: 100036,106
_--_|\ Post: PO Box 18, North Ryde, NSW 2113
/ \ Phone: +61 (2) 888-9688
\.--.__/ <---Sydney NSW Fax: +61 (2) 888-3056
v Australia