Date: Sat, 9 Aug 1997 17:20:50 +0100
Content-Type: text/plain
At 07:37 08.08.1997 -0700, Ken wrote:
>Maarten Hoolboom writes:
>> Every time we load a web page, the browser just doesn't finish loading
>> the page! The page is basically completely loaded, but the browser just
>> doesn't indicate that and seems to want to go on loading and loading
>> and...not a single webpage can be loaded completely, if we believe the
>> Netscape browser's upper corner loading indicator.
>
>I experienced this as well, but only for Netscape < version 3.0.
>Anyways, to make it work, I seem to recall that your web page files have to be
>variable-length. If they are in fixed format, the 3000 has extra blanks after
>the </html> tag (and possibly line numbers!) which Netscape didn't seem to
>like.
Hmmm. This sounds somewhat familiar to me as Samba/iX also has problems with
fixed record size text files. The web server might suffer the same problem...
IF it uses the POSIX call stat() or fstat() to determine the file size in
bytes and then reports that size back to the client in the "Content-Length"
header, THEN it will be surprised to find fewer bytes than expected when it
sequentially reads the text file afterwards (and the client will keep waiting...)
The reason: stat() and fstat() return a size calculated from the EOF (the
record count) and the record size, but the sequential read goes through
"bytestream emulation", which strips the trailing blanks from each record and
appends an LF line delimiter to make the file "look like" a regular bytestream
file. Unfortunately this changes the number of characters "seen" by the reader.
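To make the mismatch concrete, here is a small sketch (in Python, purely
illustrative; the 80-byte record size and the sample HTML are made-up values,
and the real conversion happens inside the MPE/iX file system, not in user
code). It simulates what stat()/fstat() would report for a fixed-record file
versus what a sequential read through bytestream emulation actually delivers:

```python
# Simulate a fixed-record MPE/iX file: each record is padded with
# trailing blanks to a fixed record size (80 bytes assumed here).
REC_SIZE = 80
lines = ["<html>", "<body>Hello</body>", "</html>"]
records = [line.ljust(REC_SIZE) for line in lines]

# What stat()/fstat() would report: record count * record size.
stat_size = len(records) * REC_SIZE

# What bytestream emulation delivers on a sequential read:
# trailing blanks stripped, one LF appended per record.
emulated = "".join(rec.rstrip(" ") + "\n" for rec in records)
read_size = len(emulated)

print(stat_size, read_size)  # stat_size (240) exceeds read_size (34)
```

If a web server sent stat_size in "Content-Length" but could only deliver
read_size bytes, the browser would sit waiting for the missing bytes, which
matches the never-finishing loading indicator described above.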
Lars.