[Python-Dev] Proposal: Run GC less often
Neil Schemenauer nas at arctrix.com
Sun Jun 22 21:27:27 CEST 2008
- Previous message: [Python-Dev] Proposal: Run GC less often
- Next message: [Python-Dev] Proposal: Run GC less often
- Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Martin v. Löwis wrote:
>> Under my proposal, 10 middle collections must have passed, PLUS the
>> number of survivor objects from the middle generation must exceed 10%
>> of the number of objects in the oldest generation.
>
> What happens if the program enters a phase where it's not producing any
> new cyclic garbage, but is breaking references among the old objects in
> such a way that cycles of them are being left behind? Under this rule,
> the oldest generation would never be scanned, so those cycles would
> never be collected.
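For concreteness, the trigger condition being discussed can be sketched as a tiny function; this is an illustrative paraphrase of the proposal, not actual CPython code, and the names here are invented:

```python
def should_collect_oldest(middle_collections, middle_survivors, oldest_count):
    # Hypothetical form of the proposed rule: a full collection runs only
    # when at least 10 middle-generation collections have passed AND the
    # survivors promoted from the middle generation exceed 10% of the
    # objects already in the oldest generation.
    return middle_collections >= 10 and middle_survivors > 0.10 * oldest_count
```

Greg's scenario is the case where old objects keep forming new cycles while few objects survive the middle generation, so the second condition stays false indefinitely and the oldest generation is never scanned.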
Another problem is that the program could be slowly leaking and a full collection will never happen.
>> As a consequence, garbage collection becomes less frequent as the
>> number of objects on the heap grows.
>
> Wouldn't it be simpler just to base the collection frequency directly
> on the total number of objects in the heap? From what another poster
> said, this seems to be what emacs does.
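Greg's alternative could look roughly like the following; a minimal sketch only, in which the function name, the module-level state, and the 25% growth ratio are all assumptions made up for illustration:

```python
import gc

_last_total = 0

def maybe_full_collect(growth_ratio=0.25):
    """Illustrative sketch of triggering on total heap size: run a full
    collection once the number of GC-tracked objects has grown by
    growth_ratio since the last full collection."""
    global _last_total
    total = len(gc.get_objects())
    if total > _last_total * (1 + growth_ratio):
        gc.collect()
        _last_total = len(gc.get_objects())
        return True
    return False
```

Because the trigger scales with heap size, a program that merely holds many live objects pays for full collections less often, while a slowly growing (or leaking) heap still gets scanned eventually.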
I like simple. The whole generational collection scheme was dreamed up by me early in the GC's life. Not a lot of thought or benchmarking went into it, since at that time I was more focused on getting the basic GC working. Some tuning of the collection frequencies was done later, but the "10 middle collections" scheme was never deeply investigated, AFAIK.
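For reference, those collection frequencies are exposed through the `gc` module, so they can be tuned from Python code without touching the collector itself:

```python
import gc

# Each generation is collected when its allocation counter exceeds its
# threshold; raising the third value makes full collections rarer.
old = gc.get_threshold()           # typically (700, 10, 10) by default
gc.set_threshold(old[0], old[1], 100)
assert gc.get_threshold()[2] == 100
gc.set_threshold(*old)             # restore the previous settings
```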
BTW, I suspect the documentation needs updating, since as I understand it the GC is no longer optional (the stdlib and/or the Python internals create reference cycles themselves).
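A quick stdlib-only demonstration of why the cyclic GC can't simply be left off: with the collector disabled, a reference cycle outlives all external references, because reference counting alone never reclaims it.

```python
import gc
import weakref

class Node:
    pass

gc.disable()
a, b = Node(), Node()
a.partner, b.partner = b, a     # a two-object reference cycle
probe = weakref.ref(a)
del a, b
# Each object still holds a reference to the other, so refcounting
# alone cannot free the pair while the collector is disabled.
assert probe() is not None
gc.enable()
gc.collect()                    # the cyclic collector breaks the cycle
assert probe() is None
```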
Neil