[Python-Dev] bigmemtests for really big memory too slow

martin at v.loewis.de
Tue Sep 6 15:03:32 CEST 2011


I benchmarked some of the bigmemtests when run with -M 80G. They run really slowly because they try to use all available memory and then spend a lot of time processing it. Here are some runtimes:

test_capitalize (test.test_bigmem.StrTest) ... ok (420.490846s)
test_center (test.test_bigmem.StrTest) ... ok (149.431523s)
test_compare (test.test_bigmem.StrTest) ... ok (200.181986s)
test_concat (test.test_bigmem.StrTest) ... ok (154.282903s)
test_contains (test.test_bigmem.StrTest) ... ok (173.960073s)
test_count (test.test_bigmem.StrTest) ... ok (186.799731s)
test_encode (test.test_bigmem.StrTest) ... ok (53.752823s)
test_encode_ascii (test.test_bigmem.StrTest) ... ok (8.421414s)
test_encode_raw_unicode_escape (test.test_bigmem.StrTest) ... ok (3.752774s)
test_encode_utf32 (test.test_bigmem.StrTest) ... ok (9.732829s)
test_encode_utf7 (test.test_bigmem.StrTest) ... ok (4.998805s)
test_endswith (test.test_bigmem.StrTest) ... ok (208.022452s)
test_expandtabs (test.test_bigmem.StrTest) ... ok (614.490436s)
test_find (test.test_bigmem.StrTest) ... ok (230.722848s)
test_format (test.test_bigmem.StrTest) ... ok (407.471929s)
test_hash (test.test_bigmem.StrTest) ... ok (325.906271s)

In the test suite, we have the bigmemtest and precisionbigmemtest decorators. I think the bigmemtest cases should all be changed to precisionbigmemtest, with sizes just above 2**31. With that change, the runtime for test_capitalize would go down to 42s. A rough sketch of the change is below.
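To make the proposal concrete, here is a rough sketch of what the change could look like for a single test. The decorator names, the _2G constant, and the size argument passed to the test method follow Lib/test/support.py and Lib/test/test_bigmem.py; the memuse value and the test body are only illustrative and are not the actual test_capitalize code.

    import unittest
    from test.support import bigmemtest, precisionbigmemtest, _2G

    class StrTest(unittest.TestCase):

        # Current style: minsize is only a lower bound, so with -M 80G
        # the framework scales the test up to nearly all available memory.
        @bigmemtest(minsize=_2G, memuse=2)
        def test_capitalize_current(self, size):
            s = '-' * size          # illustrative body only
            self.assertEqual(s.capitalize(), s)

        # Proposed style: the size is pinned just above 2**31, so the
        # runtime no longer grows with the memory limit.
        @precisionbigmemtest(size=_2G + 10, memuse=2)
        def test_capitalize_proposed(self, size):
            s = '-' * size          # illustrative body only
            self.assertEqual(s.capitalize(), s)

    if __name__ == '__main__':
        unittest.main()

With precisionbigmemtest the requested size stays fixed no matter how large -M is, which is exactly why the runtime stops growing with the memory limit.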

What do you think?

Regards, Martin


