Date   : Sun, 20 Jan 1985 03:00:36 GMT
From   : oacb2 <oacb2%ut-ngp.uucp@BRL-TGR.ARPA>
Subject: Re: possible problems with large numbers of open files simultaneously?

> Just the opposite.  You should take pains NOT to do a CLOSE of a file
> that was used only for reading.  This is because all a CLOSE does is
> copy the FCB back out to the directory.  If you haven't modified the
> file, this is an unnecessary disk access, and will prevent your program
> from running when the file or the disk is read-only.

The BDOS (CP/M 2.2 and, I assume, CP/M Plus) is smart enough not to copy
the FCB back to the directory if it hasn't been changed, so the CLOSE is
not the wasted disk access the quoted article suggests.  Not closing input
files is just asking for trouble if you ever upgrade to a multiuser or
multiprocessor system, where the close may be what tells the system you
are done with the file.
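For concreteness, the whole life of an input file looks something like
this.  This is only a sketch in C (in the style of the C compilers
available under CP/M), not code from a real program: bdos() is assumed to
be a function-number/address helper like the one BDS C provides, and
setfcb(), process(), and error() are hypothetical.  The function numbers
(15 = open, 26 = set DMA, 20 = read sequential, 16 = close) are the
standard CP/M 2.2 ones.

        char fcb[36];           /* a standard 36-byte CP/M FCB */
        char buf[128];          /* one 128-byte record */

        main()
        {
                setfcb(fcb, "DATA    TXT");  /* hypothetical: fill in FCB */
                if (bdos(15, fcb) == 255)    /* open */
                        error("can't open");
                bdos(26, buf);               /* set DMA address */
                while (bdos(20, fcb) == 0)   /* read sequential until EOF */
                        process(buf);        /* hypothetical */
                bdos(16, fcb);               /* close the input file anyway;
                                                negligible cost if the FCB
                                                hasn't been changed */
        }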

> If you're absolutely sure that you're not going to write (or otherwise
> modify) the file while it's temporarily closed, it suffices to do a
> CLOSE and keep the FCB, then resume WRITING with the FCB later.  This
> is because CLOSE doesn't cause the file to no longer be OPEN in the
> usual sense; all CLOSE really does is update the directory.  In fact,
> if you have a transaction processing program which adds records to an
> open file, it should CLOSE the file whenever it knows that it will be
> idle for awhile (waiting for another line of terminal input), to make
> sure that the entire file will be there if the system crashes or
> someone removes the floppy.

Again, this may cause trouble if you upgrade to a multiuser or multiprocessor
system.

I strongly recommend that all files be closed after processing and that
I/O never be done to a "closed" FCB.  Closing an input file causes
negligible overhead.  Reopening a closed file before writing again does
require some overhead, but I think it's worth it.
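
To make that concrete, the transaction-processing loop from the quoted
article would look something like this under my recommendation: close
while idle, and reopen before the next write instead of writing through
the closed FCB.  Same assumptions as the sketch above (a bdos() helper,
hypothetical getline() and error(), and 21 = write sequential); note
that resuming at the right spot also depends on the extent and
current-record bytes of the FCB being left alone between the close and
the reopen.

        char fcb[36];           /* already set up and opened elsewhere */
        char record[128];       /* next record to append */

        transact()
        {
                char line[80];

                for (;;) {
                        bdos(16, fcb);       /* close while idle, so the
                                                directory matches the file */
                        getline(line);       /* hypothetical: wait for input */
                        if (bdos(15, fcb) == 255)
                                error("can't reopen");  /* reopen BEFORE
                                                           writing again */
                        bdos(26, record);    /* DMA -> record to append */
                        bdos(21, fcb);       /* write sequential */
                }
        }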
-- 

       Mike Rubenstein, OACB, UT Medical Branch, Galveston TX 77550