Date: Thu, 13 Jan 2005 08:46:48 -0500
Content-Type: text/plain
Bill,
Not a bad idea. We don't have anything like that now, but it shouldn't
be *too* hard to put something together. After all the responses I've
received, I'm starting to think we should reconsider our approach.
I'd like to thank all those who responded. I have some very good
discussion points to bring up with the project managers.
/pat
>>> "William L. Brandt" <[log in to unmask]> 1/13/2005 2:20:18 AM >>>
Pat - I have had a very small HP shop (read: me), so I have had to forgo
a lot of the nice bells and whistles larger shops take for granted.
As you know, doing mass deletes through IMAGE calls is **very** time
consuming - if you are trying to make a test data base, why not take
your schema, make the capacities much smaller, then write a Q & D
program just extracting certain keys? IOW, rather than trying to delete
everything, just build your test data base with a small portion of the
live data - then just do a backup of that test data base and keep
reusing it -
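[Editor's note: the extract-a-subset idea above could be sketched roughly as follows. This is purely an illustration in Python - the record layout and the `extract_subset` name are hypothetical, not IMAGE intrinsics or any actual HP 3000 API.]

```python
# Sketch of the "Q & D" extract approach: rather than mass-deleting
# from a full copy, pull only the records whose key is in a chosen
# subset, and load just those into the small-capacity test base.

def extract_subset(records, wanted_keys):
    """Keep only records whose key field is in wanted_keys."""
    return [rec for rec in records if rec["key"] in wanted_keys]

live = [
    {"key": "A001", "data": "order 1"},
    {"key": "B002", "data": "order 2"},
    {"key": "A003", "data": "order 3"},
]

# e.g. keep only a couple of known keys for the test data base
test_base = extract_subset(live, {"A001", "A003"})
```

The same serial read-and-filter pass would be one DBGET loop per data set in an actual IMAGE program, writing the kept entries to the test base.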
I'm wondering if you could just do a DBUNLOAD/DBLOAD with the newer
schema and smaller capacities - when DBLOAD fills up the smaller data
set, would it abort or just issue a warning and go on to the next data
set? Might be worth trying.
Bill
* To join/leave the list, search archives, change list settings, *
* etc., please visit http://raven.utc.edu/archives/hp3000-l.html *