[Tutor] Why does counting to 20 million stress my computer?

Dick Moores rdm at rcblue.com
Fri Jul 16 12:19:20 CEST 2004


I was just fooling around writing a script (below) that would show how fast my computer can count. I was pleased with the speed, but dismayed that counting to a mere 15 or 20 million reduced available memory from about 300 MB to close to zero, forcing a reboot; and sometimes upon rebooting I would find some important Win XP settings changed. Do I have to get a CS degree to understand what's going on? Or can someone point me to an explanation?

Some results:
0 to 9999999 in 7.641 seconds!
0 to 1000000 in 0.766 seconds!
0 to 100000 in 0.078 seconds!
0 to 50000 in 0.031 seconds!
0 to 10000 in 0.000 seconds!

Thanks, tutors.

Dick Moores

====================================
# spin.py

import time

print """
Enter an integer to count to from zero.
To quit, enter x or q.
"""

while True:  # for exiting via ^C or ^D
    try:
        max = raw_input("integer: ")
    except (TypeError, EOFError):
        print "Bye."
        break
    if len(max) == 0:
        print "Hey, don't just hit Enter, type an integer first!"
        continue
    if max in ["q", "x"]:
        print "Bye."
        break
    try:
        max = int(max) + 1
    except:
        print "That's not an integer!"
        continue
    if max > 10000000:
        print "For the health of this computer,"
        print "better keep integer under 10 million."
        continue
    print "Counting.."
    t0 = time.time()
    for k in range(max):
        k += 1
    print "0 to %d in %.3f seconds!" % ((k - 1), (time.time() - t0))
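[Editor's note: the memory pressure almost certainly comes from `range()`. In Python 2, `range(max)` eagerly builds a list of `max` integer objects before the loop starts, which for 15-20 million integers is hundreds of megabytes; Python 2's `xrange()` (and `range()` in Python 3, used in this sketch) instead produces numbers lazily in constant memory. A small illustration, written in Python 3 syntax so the size comparison is easy to demonstrate:]

```python
import sys

N = 1_000_000

# Python 3's range (like Python 2's xrange) is a lazy sequence object:
# its size is a few dozen bytes no matter how large N is.
lazy = range(N)

# Materializing it into a list is what Python 2's range() did eagerly --
# one pointer slot per element, plus the int objects themselves.
eager = list(range(N))

print("lazy range object: %d bytes" % sys.getsizeof(lazy))
print("materialized list: %d bytes (list overhead only)" % sys.getsizeof(eager))

# The counting loop itself needs no list at all; iterating the lazy
# range keeps memory flat even for tens of millions of iterations.
for k in lazy:
    pass
```

[So the one-line fix for the original script would be `for k in xrange(max):` — the loop runs just as fast but no longer allocates a giant list.]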


