[Python-Dev] extremely slow exit for program having huge (45G) dict (python 2.5.2)
Antoine Pitrou solipsis at pitrou.net
Mon Dec 22 23:35:30 CET 2008
- Previous message: [Python-Dev] extremely slow exit for program having huge (45G) dict (python 2.5.2)
- Next message: [Python-Dev] extremely slow exit for program having huge (45G) dict (python 2.5.2)
- Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
Martin v. Löwis <martin at v.loewis.de> writes:
It then occurred to me that there are only 64 different values for nfreepools, since ARENA_SIZE is 256 KiB and POOL_SIZE is 4 KiB. So rather than keeping the list sorted, I now propose to maintain 64 lists, accessible through an array of doubly-linked lists indexed by nfreepools. Whenever nfreepools changes, the arena object is unlinked from its current list and linked into the new list. This should reduce the overhead for keeping the lists sorted from O(n) to O(1), with a moderate memory overhead of 64 pointers (512 bytes in your case). Allocation of a new pool would have to do a linear search over these pointers (finding the arena with the least number of pools);
You mean the least number of free pools, right? IIUC, the heuristic is to favour a small number of busy arenas rather than a lot of sparse ones. And, by linear search in these pointers, do you mean just probe the 64 lists for the first non-NULL list head? If so, then it's likely fast enough for a rather infrequent operation.
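The scheme being discussed can be sketched as follows. This is a hypothetical Python simulation, not CPython's actual code (the real implementation would live in C in Objects/obmalloc.c, and names like ArenaLists are made up here): arenas are bucketed by their nfreepools value, so moving an arena between buckets on each pool allocation or release is O(1), and finding the busiest arena probes at most 64 list heads.

```python
POOLS_PER_ARENA = 64  # ARENA_SIZE (256 KiB) / POOL_SIZE (4 KiB)

class Arena:
    def __init__(self):
        self.nfreepools = POOLS_PER_ARENA  # a fresh arena is entirely free
        self.prev = self.next = None       # intrusive doubly-linked list links

class ArenaLists:
    """One doubly-linked list head per possible nfreepools value (1..64).
    Index 0 is unused: a fully-allocated arena is kept out of the lists."""
    def __init__(self):
        self.heads = [None] * (POOLS_PER_ARENA + 1)

    def _unlink(self, arena):
        if arena.prev is not None:
            arena.prev.next = arena.next
        else:
            self.heads[arena.nfreepools] = arena.next
        if arena.next is not None:
            arena.next.prev = arena.prev
        arena.prev = arena.next = None

    def _link(self, arena):
        head = self.heads[arena.nfreepools]
        arena.next = head
        if head is not None:
            head.prev = arena
        self.heads[arena.nfreepools] = arena

    def add(self, arena):
        self._link(arena)

    def take_pool(self, arena):
        # nfreepools changes by one: unlink from the old bucket and,
        # unless the arena is now full, relink into the new one. O(1).
        self._unlink(arena)
        arena.nfreepools -= 1
        if arena.nfreepools > 0:
            self._link(arena)

    def return_pool(self, arena):
        if arena.nfreepools > 0:
            self._unlink(arena)
        arena.nfreepools += 1
        self._link(arena)

    def busiest_arena(self):
        # Linear probe over at most 64 heads: the first non-empty bucket
        # holds the arenas with the fewest free pools, matching the
        # heuristic of favouring a small number of busy arenas.
        for head in self.heads[1:]:
            if head is not None:
                return head
        return None
```

Under these assumptions the "linear search" really is just scanning for the first non-NULL head, exactly as suggested above.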
Now, we should find a way to benchmark this without having to steal Mike's machine and wait 30 minutes every time.
Regards
Antoine.