Date: Tue, 10 Apr 2001 15:31:33 -0500
Content-Type: text/plain
Okay, so I talked to Bradmark. They suggested that I make the maximum
really big, near the limits. I don't know what the limits are, so I just
chose 6,000,000 and 5,000,000. One set out of the first 30 (the first
database of many) didn't work; it seems to be a limit on file size or
maximum blocks. So I trimmed that one set a little bit, and the retry
worked. Now on to the next database . . .
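The trial-and-trim approach above can be sketched roughly as follows. This is a hypothetical illustration, not real DBGeneral or IMAGE code: resize_dataset is a stand-in for whatever tool actually changes the capacity, and the limit and trim step are made-up numbers.

```python
# Hypothetical sketch: ask for a very large maximum capacity, and when the
# resize fails against a file-size/block limit, trim a little and retry.

FILE_LIMIT = 5_250_000  # pretend limit where the resize starts failing


def resize_dataset(name, new_max, limit=FILE_LIMIT):
    """Stand-in for the real resize call: succeeds only under the limit."""
    return new_max <= limit


def set_capacity_with_retry(name, target=6_000_000, trim=250_000):
    """Try the big target; trim the maximum and retry until it succeeds."""
    cap = target
    while cap > 0:
        if resize_dataset(name, cap):
            return cap
        cap -= trim  # hit the limit; trim the maximum a bit and retry
    raise RuntimeError(f"could not set a capacity for {name}")
```

With the pretend limit above, a 6,000,000 target backs off to 5,250,000 after a few failed attempts.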
Gibson Nichols <[log in to unmask]> wrote in message
news:9av6ut$bke6@eccws12.dearborn.ford.com...
> We have been using DBGeneral (Bradmark) Automatic Capacity Management for
> years. We recently upgraded the backup software (Orbit) to a new version.
> With the backup update we improved our JCL to use the "zerodown" feature.
> So far so good. Happy users don't have to log on again after the backup.
>
> We noticed some messages in the backup listing. It seems the backup now
> needs logging running to back up the Image databases without creating
> the messages. This was an unexpected problem. When we talked to Orbit,
> they seemed to suggest that this is okay. We can still restore any
> database; we just have to remember to use a parameter that I can't
> remember (and would also need to change documentation in places I have
> no control over). So I changed the JCL to keep logging running.
>
> Now, however, the DBGeneral Automatic Capacity Management option won't
> work because it needs exclusive access to the databases. I started
> looking into Dynamic Dataset Expansion. Cool stuff. I got a new version
> of the Bradmark software which helps set this up. Started testing.
>
> DBGeneral Automatic Capacity Management won't work on a dataset which
> has been converted to use Dynamic Dataset Expansion. Also, running the
> Dynamic Expansion options in DBGeneral will update the capacity without
> checking to see if the expansion is really needed (Automatic Capacity
> Management increased the capacity only if a configured criterion was
> met).
>
> So I'm faced with a conflict. I can use DBGeneral Automatic Capacity
> Management, which means I must get exclusive access, or I can use
> Dynamic Dataset Expansion, which means that I will fill the maximum
> capacity at times, thus requiring a smart (non-automated, or customized
> to the specific dataset) approach.
>
> Yeah, I know, make the maximum capacity real big. But I can't guess how
> big we will need the files to be. Prior to this we didn't need to worry
> because the Automatic Capacity Management would handle it.
>
>
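The tradeoff Gibson describes, expanding only when a configured criterion is met versus raising the capacity unconditionally, can be sketched as below. The threshold, increment, and function names are hypothetical, not actual DBGeneral settings.

```python
# Hypothetical sketch of a criterion-based capacity check, in the spirit
# of Automatic Capacity Management: expand only when the dataset is
# nearly full, rather than updating the capacity unconditionally.


def should_expand(entries_used, capacity, threshold_pct=80):
    """Expand only if the dataset's fill percentage meets the criterion."""
    return entries_used / capacity * 100 >= threshold_pct


def next_capacity(capacity, increment_pct=25):
    """Grow the maximum by a configured percentage when expanding."""
    return int(capacity * (1 + increment_pct / 100))
```

A dataset at 85% full would qualify for expansion under this check, while one at 10% would be left alone, which is the behavior the unconditional Dynamic Expansion update loses.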
* To join/leave the list, search archives, change list settings, etc *
* please visit http://raven.utc.edu/archives/hp3000-l.html *