1. Create a database containing one Manual Master dataset with a single
270-byte field and a capacity of about 1.2 million entries (preferably a
prime number).
2. Load the ASCII file into this database. Each record will be
pseudo-randomly "hashed" into a specific location.
3. Read the dataset serially back out to a flat ASCII file.
4. Voila! Randomized file.
5. Using slightly different dataset capacities will produce completely
different randomizations.
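The steps above can be sketched outside of MPE/IMAGE as well. Here is a minimal Python illustration of the same idea: hash each record to a pseudo-random "home" address in a prime-sized table, then read the table serially. The function name `hash_shuffle`, the use of MD5 as the address function, and linear probing standing in for IMAGE's synonym chains are all illustrative assumptions, not part of the original method.

```python
import hashlib

def hash_shuffle(records, capacity=1_000_003):
    # capacity plays the role of the master dataset's capacity;
    # a prime value spreads the hash addresses more evenly.
    slots = [None] * capacity
    for rec in records:
        # hash the record to a pseudo-random home address
        addr = int.from_bytes(hashlib.md5(rec.encode()).digest(), "big") % capacity
        # linear probing stands in for IMAGE's synonym chains on collision
        while slots[addr] is not None:
            addr = (addr + 1) % capacity
        slots[addr] = rec
    # a serial read of the table yields the records in hashed order
    return [r for r in slots if r is not None]
```

As in step 5, changing the capacity (say from one prime to a nearby prime) changes every home address, so the same input produces an entirely different ordering.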
Porter, Allen H wrote:
>
> I have a very large ASCII file (270 bytes per record with approx. 1,000,000
> records) that needs to be randomized. The current program we are using
> takes a couple of hours to run. Does anyone have a faster method? I was
> hoping TurboSort could randomize but I can't find anything in the
> documentation.
> Allen Porter
> Programmer II
> Maritz Marketing Research INC
>
> 636.827.6833
> [log in to unmask]