HP3000-L Archives

February 2000, Week 4

HP3000-L@RAVEN.UTC.EDU

Subject:
From: Jerry Fochtman <[log in to unmask]>
Reply To: Jerry Fochtman <[log in to unmask]>
Date: Wed, 23 Feb 2000 14:23:45 -0600
Content-Type: text/plain

At 10:54 AM 2/23/2000 -0500, Dave Geis wrote:
>I have a question about the DDX maximum capacity value.  Since this value
>does not affect space usage, I would like to set it to a much higher
>value.  We recently hit the max value on a DDX dataset, and to prevent
>this from ever happening again, I want to make the max value the same for
>every dataset, and much higher.  I would like to set every dataset to
>around 50,000,000, which is around 5 times the size (in entries) of our
>largest dataset.  Do you see any pitfalls in doing this?  Also, out of
>curiosity, what is the absolute maximum value that IMAGE will accept for
>the DDX max value?

There are a couple of factors that come into play.  First, jumbo datasets
do not support DDX, so the dataset involved has to be a standard,
single-file set.  That limits the dataset size to 4GB.

The next issue is the number of blocks that can fit into a 4GB object
space.  You can estimate this by dividing 4GB by the block size of the
set.  Be sure to round down, as some overhead space is used within the
4GB file limit (128KB, as I recall).  This gives you the maximum block
count.  Then simply multiply that number by your blocking factor for a
good estimate of the maximum capacity of the set.
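
As a rough worked example in C (the block size and blocking factor here
are illustrative assumptions, and the 128KB overhead is just the figure
I recalled above):

#include <stdio.h>

int main(void)
{
    const long long FILE_LIMIT = 4LL * 1024 * 1024 * 1024; /* 4GB file limit */
    const long long OVERHEAD   = 128 * 1024;               /* ~128KB, as recalled */
    long long block_size       = 2048;  /* assumed: bytes per block for this set */
    long long blocking_factor  = 10;    /* assumed: entries per block */

    /* Integer division rounds down, which is what we want here. */
    long long max_blocks  = (FILE_LIMIT - OVERHEAD) / block_size;
    long long max_entries = max_blocks * blocking_factor;

    printf("Estimated maximum capacity: %lld entries\n", max_entries);
    return 0;
}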

I've seen some sites indeed set their capacity to the maximum possible
value.  This has a minor impact on the size of the extents that are
allocated when an expansion occurs.  Also, choose the expansion value
carefully, as you don't want to be constantly expanding the capacity.
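
To see why the increment matters, here's a back-of-the-envelope sketch
in C; all of the figures are made up for illustration:

#include <stdio.h>

int main(void)
{
    long long initial   = 10000000LL;        /* assumed initial capacity (entries) */
    long long maximum   = 50000000LL;        /* assumed DDX maximum capacity */
    long long increment = initial * 5 / 100; /* assumed increment: 5% of initial */

    /* Each expansion adds 'increment' entries; round the count up. */
    long long expansions = (maximum - initial + increment - 1) / increment;

    printf("Expansions (and roughly as many new extents): %lld\n", expansions);
    return 0;
}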

One drawback to doing this is that you'll end up with a dataset that has
many, many extents.  For performance reasons, you may want to use a tool
such as <plug> DSM </unplug> or DEFRAG/X, which will manage these extents,
combining them into larger 'chunks' and distributing them in an organized
fashion across all the volumes in the volume set.

In some instances, applications may choose to use DBINFO(202) to obtain
the current capacity of a dataset and determine whether the set is full.
If your applications/tools do this, they'll need to use DBINFO(205)
instead and test the max_capacity value in any logic that checks whether
space in the set has been exhausted.
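
Here's a sketch in C of that logic change.  The helper functions are
hypothetical stand-ins for the actual DBINFO calls, as are the stubbed
values they return; the real mode 202/205 buffer layouts are in the
TurboIMAGE intrinsics manual:

#include <stdio.h>

/* Hypothetical wrapper for DBINFO mode 202: current capacity of the set. */
static long long current_capacity(int dset) { (void)dset; return 10000000LL; }

/* Hypothetical wrapper for DBINFO mode 205: the DDX max_capacity value. */
static long long max_capacity(int dset)     { (void)dset; return 50000000LL; }

/* Hypothetical wrapper: number of entries currently in the set. */
static long long entry_count(int dset)      { (void)dset; return 9999999LL; }

int main(void)
{
    int dset = 1;  /* hypothetical dataset number */

    /* Misleading for a DDX set: it can still expand past its current
       capacity, so this reports "full" before space is truly exhausted. */
    printf("Against current capacity (mode 202): %s\n",
           entry_count(dset) >= current_capacity(dset) ? "full" : "not full");

    /* The test to use instead: compare against the maximum capacity. */
    printf("Against max capacity (mode 205):     %s\n",
           entry_count(dset) >= max_capacity(dset) ? "full" : "not full");
    return 0;
}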

A final word of caution: don't use DDX if you're still on MPE/iX 5.5
PowerPatch 5.


/jf
                               _\\///_
                              (' o-o ')
___________________________ooOo_( )_OOoo____________________________________

                          Tuesday, February 22nd
                    George Washington was born in 1732

           Today in 1819 - Florida ceded to the United States by Spain.

___________________________________Oooo_____________________________________
                             oooO  (    )
                            (    )  )  /
                             \  (   (_/
                              \_)
