Stu,
Something you may wish to consider... Instead of creating a job that
performs a PURGE on each file, write each fully qualified file name to a
flat file. You can then STORE the files to $NULL, with the ;PURGE option,
using the file you created as an indirect file reference.
:Build ZapEm;Rec=-80,,F,ASCII;Disc=80000
<Use some program to find and write filenames to this file>
:File BitBuckt=$Null
:Store ^ZapEm;*BitBuckt;Purge
(You may want to double-check those commands.)
Come to think of it, you may want to make sure STORE allows the indirect
file to be that large.
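Putting the pieces together, the whole job might look something like the
sketch below. This is an assumption-laden outline, not a tested job: the
job/user names are placeholders, the selection step is whatever program you
already use, and you should verify the STORE indirect-file size limit on
your MPE/iX release before relying on it.

```
!JOB ZAPJOB,MGR.SYS
!COMMENT Build the indirect file: one fully qualified name per record.
!BUILD ZAPEM;REC=-80,,F,ASCII;DISC=80000
!COMMENT <run your selection program here to write names into ZAPEM>
!COMMENT One STORE pass to $NULL purges every file listed,
!COMMENT instead of launching ~70,000 separate PURGE commands.
!FILE BITBUCKT=$NULL
!STORE ^ZAPEM;*BITBUCKT;PURGE
!EOJ
```

The win is that STORE opens and deletes the files in a single process, so
you avoid the per-command process-creation overhead of 70,000 PURGEs.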
David
I am on a 989/650 system. Every weekend, a program using MPEX and
access-date parameters identifies about 70 thousand files to delete from
the system.
I build a jobstream that uses an XEQ to execute a file of about 70 thousand
lines, each of which says 'PURGE file.group.account'.
My issue is that this job has become a real hog. It kicks off at 6 AM on
Sunday morning, and by 7 PM on Sunday night it has purged only about 20
thousand files. It also generates so much overhead that logging in takes
about 30 seconds.
Anyone out there have a better approach? One that will cause less
overhead?
Stu Diamond
HP Systems Manager
Hours: Tues-Thurs 10:15 AM - 9:00 PM
Friday 7:00 AM - 4:00 PM
Phone: 1 631 738-5521
Beeper: 1 877 409-1239
* To join/leave the list, search archives, change list settings, *
* etc., please visit http://raven.utc.edu/archives/hp3000-l.html *