HP3000-L Archives

May 2000, Week 4

HP3000-L@RAVEN.UTC.EDU

From: Tom Renz <[log in to unmask]>
Date: Tue, 23 May 2000 09:59:58 -0600

Leonard,

>What determines the number of chunks in a jumbo dataset?

What is the current blocking factor for the database and/or data set?  I have
seen, with databases containing jumbo data sets, that the blocking factor
contributes to the number of jumbo chunks allocated for a jumbo data set.
If the data set has a small blocking factor, like 512 or 1024 words, the
number of jumbo chunks becomes large.  If the blocking factor is 2048, the
number of jumbo chunks stays small.  To decrease the number of jumbo chunks
I would do a repack and/or reblock.  For the reblock value I would highly
recommend using 2048 words (4096 bytes).  I would not go to the limit of
2560 unless you need to squeeze those last few records into the allocated
disk space.  I feel that by going to the 2560 limit you are working against
the hardware's built-in features: the disks read in 4096-byte units, memory
pages are allocated at 4096 bytes, etc.  Use your favorite DB maintenance
tool to determine the blocking factor that saves the most space and uses
the disc space most efficiently.
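To illustrate why small blocks inflate the chunk count, here is a
back-of-the-envelope sketch (my own simplified model, not TurboIMAGE's exact
allocation math -- real blocks also carry overhead words, and the record and
capacity figures below are made up):

```python
import math

def estimate_jumbo_chunks(capacity, record_words, block_words,
                          chunk_limit_bytes=4 * 1024**3):
    """Rough estimate of jumbo chunks needed for a data set.

    Simplified model: whole records packed per block, each chunk
    capped at 4GB.  Ignores TurboIMAGE per-block overhead.
    """
    entries_per_block = block_words // record_words   # whole records per block
    blocks = math.ceil(capacity / entries_per_block)
    total_bytes = blocks * block_words * 2            # one word = 2 bytes
    return math.ceil(total_bytes / chunk_limit_bytes)

# Made-up data set: 100 million entries of 300-word records.
# A 512-word block holds only one record (212 words wasted per block);
# a 2048-word block holds six (248 words wasted), so far less slack overall.
print(estimate_jumbo_chunks(100_000_000, 300, 512))    # -> 24 chunks
print(estimate_jumbo_chunks(100_000_000, 300, 2048))   # -> 16 chunks
```

Same data, same 4GB chunk limit -- the larger block simply wastes less space
per block, so the data set spans fewer chunks.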


>Below are listings for two datasets, one has 32 chunks and one has only six
>(from two different data bases -- I've telescoped the listings)?

You didn't include a LISTFILE /ACCT/GROUP/DBNAME@,2 listing to show us the
record lengths and disk space, only a Query/Suprtool listing.  That would
have helped.

>Is this documented anywhere?

It should be in the TurboIMAGE manuals and in the documentation of your
favorite DB Maintenance tool.


>Does this unusually large quantity of chunks indicate some problem?

No.  As Stan said earlier, you can go up to 99 chunks or 999 depending on
the TurboIMAGE version.  Also, depending on your version, you can have up
to 40GB or 80GB or more.  If you exceed these limits then you could have
problems.  Each jumbo chunk is limited to 4GB until the large files
project is complete, at which point there won't be a limit and the jumbos
can be reduced or eliminated from your database structure.


HTH,

Tom Renz
