Python Open & Memory Leaks
Solution 1:
CPython closes a file object automatically when the object is deleted; it is deleted when its reference count drops to zero (no more variables refer to it). So if you assign mergeData = open("myinput.txt", "r") inside a function, the local variable is cleaned up as soon as the function returns, and the file is closed.
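For example, a minimal sketch (assuming a file named myinput.txt exists in the working directory):

def read_merge_data():
    # mergeData is a local variable: when the function returns, its
    # reference count drops to zero and CPython closes the file.
    mergeData = open("myinput.txt", "r")
    return mergeData.read()

allData = read_merge_data()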
If you use

allData = open("myinput.txt", "r").read()

the reference count drops to 0 the moment .read() returns, and on CPython that means the file is closed there and then.
On other implementations such as Jython or IronPython, where object lifetime is managed differently, the moment an object is actually deleted could be much later.
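If you need the file closed promptly on every implementation and cannot use a with statement, one portable sketch is to close it explicitly in a try/finally:

f = open("myinput.txt", "r")
try:
    allData = f.read()
finally:
    # Runs whether or not read() raised, on any Python implementation.
    f.close()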
The best way to use a file, though, is as a context manager:

with open("myinput.txt", "r") as mergeData:
    allData = mergeData.read()

which calls .close() on mergeData automatically when the block is exited. See the documentation for the open() built-in and for the with statement.
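You can see the automatic close by checking the file object's closed attribute after the block, for instance:

with open("myinput.txt", "r") as mergeData:
    allData = mergeData.read()
print(mergeData.closed)  # True: the with statement already closed the file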
Solution 2:
Yes. Yes you can. There is no memory leak or anything of the sort.
The file handle will be closed soon after the file object returned by open() goes out of scope and is garbage collected.
Though if you prefer, you can do something like:

with open('myinput.txt') as f:
    data = f.read()
This will ensure that the file is closed as soon as you're done with it.