HP3000-L Archives

March 1998, Week 4

HP3000-L@RAVEN.UTC.EDU

Subject: Re: 3k to 9k via tape
From: "Trudeau, James Lhrl" <[log in to unmask]>
Reply-To: Trudeau, James Lhrl
Date: Tue, 24 Mar 1998 09:43:33 -0700
Content-Type: multipart/mixed
Parts/Attachments: text/plain (1678 bytes), application/ms-tnef (2388 bytes)


> ----------
> From:         Lars Appel
> Sent:         Tuesday, March 24, 1998 1:01 AM
> To:   [log in to unmask]
> Cc:   Trudeau, James Lhrl
> Subject:      Re: 3k to 9k via tape
>
> At 14:08 18.03.1998 -0700, James T (not Kirk) wrote:
> >So we have a TurboIMAGE database on the 3000 which needs one
> >(several actually, but let's keep it simple) of its datasets on the
> >9000.  The output file created by the extract program on the 3000
> >is 23 GB.  The program normally outputs a disc file...
>
> Well, I'm not sure what I am missing here... but what trick are you using
> to make the extraction program create a 23 gigabyte disc file? My current
> understanding is that MPE/iX has a 4 GB file limit... Do you already have
> an early version of release 7.0 or the like?
>

----------------------------------------------------------------------------

        Lars,

        Guess my message got scrambled a bit.  No, we are not running
        an alpha version of MPE/iX 7.0.  The entire system has only 20 GB
        of disc, so I have to redirect the extract's output to tape
        (roughly as sketched below) - which is what created the problem.
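
        In case the mechanics are of interest: the redirection was just an
        MPE file equation pointing the program's output file at a tape
        device, something like the lines below.  The formal file designator
        EXTROUT and the program name EXTRACT.PUB.PROD are made up here for
        illustration; the real names depend on the extract program.

            :FILE EXTROUT;DEV=TAPE
            :RUN EXTRACT.PUB.PROD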

        For anyone interested in how this was solved:

        The problem was that the extract program was writing 134-byte
        records into a 1024-byte record file, so roughly 3 GB of real data
        was padded out to 23 GB (23 GB x 134/1024 is about 3 GB).  After
        determining that there was no "processing" of the data, just
        read-write-read-write, I used Suprtool to copy the records from
        the dataset to a disc file (first sketch below).  The 3.nnGB disc
        file was then ftp'd to the 9000.  Once there, I used SQLLOAD to
        load it into the Oracle database (second sketch below).  OK for
        now, but I'm quite sure this "big file" issue will arise again.
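
        For the curious, the Suprtool step was nothing fancy - a plain
        serial copy of the dataset to a disc file, holding each record at
        its real length.  A minimal sketch follows; the database name
        ORDDB, the dataset ORDERS and the output file ORDFILE are
        stand-ins, not the real names:

            :RUN SUPRTOOL.PUB.ROBELLE
            >BASE ORDDB
            >GET ORDERS
            >OUTPUT ORDFILE
            >XEQ
            >EXIT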
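
        Moving and loading the file was equally plain.  Roughly, with a
        made-up host name, path, table and column layout (and the usual
        demo userid standing in for the real one), the transfer from the
        3000 looked like:

            :RUN FTP.ARPA.SYS
            ftp> open hp9000
            ftp> ascii
            ftp> put ORDFILE /data/ordfile.dat
            ftp> quit

        and on the 9000 side SQL*Loader was run with

            $ sqlldr userid=scott/tiger control=ordfile.ctl log=ordfile.log

        using a control file along these lines:

            -- column names and positions are illustrative only
            LOAD DATA
            INFILE '/data/ordfile.dat'
            INTO TABLE orders
            ( order_no  POSITION(1:8)    CHAR,
              cust_no   POSITION(9:16)   CHAR,
              amount    POSITION(17:26)  DECIMAL EXTERNAL )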

        James (wish I *was* Kirk) Trudeau

