[Python-Dev] Investigating time for import requests

Koos Zevenhoven k7hoven at gmail.com
Sun Oct 8 11:24:13 EDT 2017


On Sun, Oct 8, 2017 at 2:44 PM, Chris Angelico <rosuav at gmail.com> wrote:

> On Sun, Oct 8, 2017 at 7:02 PM, David Cournapeau <cournape at gmail.com> wrote:
>> It is certainly true that for a CLI tool that actually makes any network
>> I/O, especially SSL, import times will quickly be negligible. It becomes
>> tricky for complex tools, because of error management. For example, a
>> common pattern I have used in the past is to have a high-level "catch all
>> exceptions" function that dispatches the CLI command:
>>
>> try:
>>     mainfunction(...)
>> except ErrorKind1:
>>     ...
>> except requests.exceptions.SSLError:
>>     # gives a complete message about options when receiving SSL errors,
>>     # e.g. an invalid certificate
>>
>> This pattern requires importing requests every time the command is run,
>> even if no network I/O is actually done. For complex CLI tools, maybe most
>> commands don't use network I/O (the tool in question was a complete
>> package manager), but you pay ~100 ms for the requests import on every
>> command. It is particularly visible because command latency starts to be
>> felt around 100-150 ms, and while you can do a lot in Python in
>> 100-150 ms, you can't do much in 0-50 ms.

> This would be a perfect use-case for lazy importing, then. You'd pay the
> price of the import only if you get an error that isn't caught by one of
> the preceding except blocks.
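
A minimal sketch of that idea (not from the original messages; ErrorKind1 and
main_function are hypothetical stand-ins, and requests must be installed): the
requests import is deferred to the catch-all handler, so the common,
error-free path never pays for it.

class ErrorKind1(Exception):
    """Hypothetical tool-specific error."""

def main_function():
    ...  # hypothetical CLI dispatch

try:
    main_function()
except ErrorKind1:
    print("known error, no network involved")
except Exception as exc:
    # Only an unexpected error pays the ~100 ms cost of importing requests.
    from requests.exceptions import SSLError
    if isinstance(exc, SSLError):
        print("SSL error: check your certificate options")
    else:
        raise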

I suppose it might be convenient to be able to do something like:

with autoimport:
    try:
        main_function(...)
    except ErrorKind1:
        ...
    except requests.exceptions.SSLError:
        ...
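
Something in that spirit can already be approximated today with
importlib.util.LazyLoader, which defers executing a module until its first
attribute access. A rough sketch (standard-library recipe, not the autoimport
construct above; main_function is a hypothetical stand-in and requests must be
installed):

import importlib.util
import sys

def lazy_import(name):
    # Register the module so the real import runs only on first attribute
    # access (per the importlib documentation recipe).
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)
    return module

requests = lazy_import("requests")  # cheap: nothing is executed yet

def main_function():
    ...  # hypothetical CLI dispatch

try:
    main_function()
except requests.exceptions.SSLError:  # first attribute access triggers the import
    ...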

The easiest workaround at the moment is still pretty clumsy:

def import_SSLError():
    from requests.exceptions import SSLError
    return SSLError

...

except import_SSLError():
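
(For completeness, not from the original message: a self-contained version of
this workaround, with main_function as a hypothetical stand-in. The except
expression is evaluated only when main_function() actually raises, so the
import is deferred until then.)

def import_SSLError():
    from requests.exceptions import SSLError
    return SSLError

def main_function():
    ...  # hypothetical CLI dispatch

try:
    main_function()
except import_SSLError():
    print("SSL error: check your certificate options")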

But what happens if that gives you an ImportError?

––Koos

--


