[Python-Dev] Why does IOBase.__del__ call .close?

Nikolaus Rath Nikolaus at rath.org
Fri Jun 13 03:06:20 CEST 2014


Benjamin Peterson <benjamin at python.org> writes:

On Wed, Jun 11, 2014, at 17:11, Nikolaus Rath wrote:

MRAB <python at mrabarnett.plus.com> writes:
> On 2014-06-11 02:30, Nikolaus Rath wrote:
>> Hello,
>>
>> I recently noticed (after some rather protracted debugging) that the
>> io.IOBase class comes with a destructor that calls self.close():
>>
>> [0] nikratio at vostro:~/tmp$ cat test.py
>> import io
>> class Foo(io.IOBase):
>>     def close(self):
>>         print('close called')
>> r = Foo()
>> del r
>> [0] nikratio at vostro:~/tmp$ python3 test.py
>> close called
>>
>> To me, this came as quite a surprise, and the best "documentation" of
>> this feature seems to be the following note (from the io library
>> reference):
>>
>> "The abstract base classes also provide default implementations of some
>> methods in order to help implementation of concrete stream classes. For
>> example, BufferedIOBase provides unoptimized implementations of
>> readinto() and readline()."
>>
>> For me, having __del__ call close() does not qualify as a reasonable
>> default implementation unless close() is required to be idempotent
>> (which one could deduce from the documentation if one tries to, but it's
>> far from clear).
>>
>> Is this behavior an accident, or was that a deliberate decision?
>>
> To me, it makes sense. You want to make sure that it's closed, releasing
> any resources it might be holding, even if you haven't done so
> explicitly.

I agree with your intentions, but I come to the opposite conclusion:
automatically calling close() in the destructor will hide that there's a
problem in the code. Without that automatic cleanup, there's at least a
good chance that a ResourceWarning will be emitted so the problem gets
noticed. "Silently work around bugs in caller's code" doesn't seem like a
very useful default to me...

Things which actually hold system resources (like FileIO) give
ResourceWarning if they close in __del__, so I don't understand your point.
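Benjamin's claim can be checked directly. The sketch below is not from the thread; it is a minimal illustration, assuming CPython's behaviour of emitting ResourceWarning when an open file object is garbage-collected without an explicit close() (the use of os.devnull as a throwaway target is just for illustration):

```python
# Minimal check: an open file dropped without close() emits ResourceWarning.
# ResourceWarning is ignored by the default filters, so we enable it here.
import gc
import os
import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')   # record everything, incl. ResourceWarning
    f = open(os.devnull, 'rb')
    del f                             # dropped without an explicit close()
    gc.collect()                      # make sure the object is collected now

assert any(issubclass(w.category, ResourceWarning) for w in caught)
```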

Consider this simple example:

$ cat test.py
import io
import warnings

class StridedStream(io.IOBase):
    def __init__(self, name, stride=2):
        super().__init__()
        self.fh = open(name, 'rb')
        self.stride = stride

    def read(self, len_):
        return self.fh.read(self.stride*len_)[::self.stride]

    def close(self):
        self.fh.close()

class FixedStridedStream(StridedStream):
    def __del__(self):
        # Prevent IOBase.__del__ from being called.
        pass

warnings.resetwarnings()
warnings.simplefilter('error')

print('Creating & losing StridedStream..')
r = StridedStream('/dev/zero')
del r

print('Creating & losing FixedStridedStream..')
r = FixedStridedStream('/dev/zero')
del r

$ python3 test.py
Creating & losing StridedStream..
Creating & losing FixedStridedStream..
Exception ignored in: <_io.FileIO name='/dev/zero' mode='rb'>
ResourceWarning: unclosed file <_io.BufferedReader name='/dev/zero'>

In the first case, the destructor inherited from IOBase actually prevents the ResourceWarning from being emitted.
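The thread itself hints at a way out: since IOBase.__del__ is going to call close() anyway, a subclass can make its close() idempotent and chain to super().close(), which lets IOBase's closed-state tracking make the repeated call harmless. The sketch below is not from the thread; the class name and the temp-file setup are made up for illustration:

```python
import io
import tempfile

class SafeStridedStream(io.IOBase):
    """Like StridedStream above, but with an idempotent close()."""

    def __init__(self, name, stride=2):
        super().__init__()
        self.fh = open(name, 'rb')
        self.stride = stride

    def read(self, len_):
        return self.fh.read(self.stride * len_)[::self.stride]

    def close(self):
        if not self.closed:        # IOBase tracks the closed flag for us
            self.fh.close()
        super().close()            # marks the stream closed; no-op if repeated

# Usage: with an idempotent close(), the extra call made by IOBase.__del__
# is harmless, and the underlying file is reliably released.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b'abcdefgh')

s = SafeStridedStream(tmp.name)
data = s.read(2)    # reads 4 bytes, keeps every 2nd one -> b'ac'
s.close()
s.close()           # second call is a no-op, no error
```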

Best, -Nikolaus

-- GPG encrypted emails preferred. Key id: 0xD113FCAC3C4E599F Fingerprint: ED31 791B 2C5C 1613 AF38 8B8A D113 FCAC 3C4E 599F

         »Time flies like an arrow, fruit flies like a Banana.«

