[Python-3000] Two proposals for a new list-like type: one modest, one radical

Adam Olsen rhamph at gmail.com
Tue Apr 24 01:32:45 CEST 2007


On 4/23/07, Daniel Stutzbach <daniel at stutzbachenterprises.com> wrote:

> If a user creates a BList from whole cloth (e.g., BList(iterable)), they will get a best-case BList. This would probably be the common case for users who don't plan to modify their list much.

> If a user is doing lots of inserts and deletes, the BList's nodes will typically be three-quarters full, translating into 33% extra memory above a regular Python list. However (and this is a big however!), these users will greatly appreciate the BList's additional speed for these operations.
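
(For reference, the arithmetic behind those figures: at three-quarters full, allocated/used is 4/3, i.e. about 33% extra; in the half-full worst case it is 2, i.e. 100% extra.)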

For contrast, the existing list seems to over-allocate by 12.5%, for an average of 6.25% wasted. However, when deleting, it only resizes once the list drops below half full, leaving it with a worst case of 100% overhead, the same as a BList.
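
For anyone who wants to watch that over-allocation happen, here's a rough sketch (CPython-specific; the exact growth factor varies by version, and sys.getsizeof only counts the list header plus the pointer array, not the elements):

    import sys

    xs = []
    last = sys.getsizeof(xs)
    for i in range(1000):
        xs.append(i)
        size = sys.getsizeof(xs)
        if size != last:
            # a resize just happened; the jump shows how much slack was added
            print("len=%d allocated bytes=%d" % (len(xs), size))
            last = size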

The usage patterns can make a big difference there, though. A list's worst case is only reached if you have a spike and then drop to just above half full. A BList's worst case is reached through random slicing.
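
And a quick sketch of that list worst case, contrasting a delete that stops just above half full with one that goes just below it (again CPython-specific; the byte counts in the comments assume a 64-bit build):

    import sys

    a = [0] * 1000000
    b = [0] * 1000000
    del a[500001:]   # keeps just over half of the allocation: no shrink occurs
    del b[499999:]   # drops just below half: CPython shrinks the allocation
    print(len(a), sys.getsizeof(a))   # roughly 8 MB of pointer slots still held
    print(len(b), sys.getsizeof(b))   # roughly 4.5 MB after the shrink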

Daniel, does repeated getitem/setitem (as in random.shuffle()) increase memory usage, or does it have no effect?

A random.shuffle() benchmark might be a better demonstration of the overall costs.
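
Something along these lines would cover the timing half; the commented-out import is only a placeholder for however the prototype's BList ends up being pulled in, and memory would have to be measured separately (e.g. via process RSS), since sys.getsizeof won't see a BList's internal nodes:

    import random
    import timeit

    def bench(factory, n=100000, repeats=3):
        seq = factory(range(n))   # build once, outside the timed region
        timer = timeit.Timer(lambda: random.shuffle(seq))
        return min(timer.repeat(repeat=repeats, number=1))

    print("list :", bench(list))
    # from blist_prototype import BList   # placeholder import for the prototype
    # print("BList:", bench(BList))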

-- Adam Olsen, aka Rhamphoryncus


