[Python-Dev] PEP 550 leak-in vs leak-out, why not just a ChainMap
Jim J. Jewett jimjjewett at gmail.com
Fri Aug 25 22:26:29 EDT 2017
- Previous message (by thread): [Python-Dev] PEP 550 leak-in vs leak-out, why not just a ChainMap
- Next message (by thread): [Python-Dev] PEP 550 leak-in vs leak-out, why not just a ChainMap
- Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
On Aug 24, 2017 11:02 AM, "Yury Selivanov" <yselivanov.ml at gmail.com> wrote:
On Thu, Aug 24, 2017 at 10:05 AM, Jim J. Jewett <jimjjewett at gmail.com> wrote:
On Thu, Aug 24, 2017 at 1:12 AM, Yury Selivanov wrote:
> On Thu, Aug 24, 2017 at 12:32 AM, Jim J. Jewett <jimjjewett at gmail.com> wrote:
If you look at this small example:

    foo = new_context_key()

    async def nested():
        await asyncio.sleep(1)
        print(foo.get())

    async def outer():
        foo.set(1)
        await nested()
        foo.set(1000)

    l = asyncio.get_event_loop()
    l.create_task(outer())
    l.run_forever()
It will print "1", as the "nested()" coroutine will see the "foo" key when it's awaited.
Now let's say we want to refactor this snippet and run the "nested()" coroutine with a timeout:
    foo = new_context_key()

    async def nested():
        await asyncio.sleep(1)
        print(foo.get())

    async def outer():
        foo.set(1)
        await asyncio.wait_for(nested(), 10)  # !!!
        foo.set(1000)

    l = asyncio.get_event_loop()
    l.create_task(outer())
    l.run_forever()
So we wrap our nested() in a wait_for(), which creates a new asynchronous task to run nested(). That task will now execute on its own, separately from the task that runs outer(). So we need to somehow capture the full EC at the moment wait_for() was called, and use that EC to run nested() within it. If we don't do this, the refactored code would print "1000", instead of "1".
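For readers coming to this thread later: PEP 550's new_context_key() API was never shipped as drafted, but the behavior Yury describes here is what the eventually accepted contextvars module (PEP 567) does — a Task copies the current context when it is created, so a later set() in the caller does not leak into it. A minimal sketch of that semantics, assuming modern asyncio:

```python
# Sketch using contextvars (PEP 567), as an analogue of the PEP 550
# new_context_key() API discussed above -- not the draft API itself.
import asyncio
import contextvars

foo = contextvars.ContextVar("foo")

async def nested():
    await asyncio.sleep(0.01)
    return foo.get()

async def outer():
    foo.set(1)
    # wait_for() wraps nested() in a Task; the Task copies the current
    # context at creation time, so nested() sees foo == 1 ...
    result = await asyncio.wait_for(nested(), 10)
    foo.set(1000)  # ... and this later set does not leak into that Task.
    return result

print(asyncio.run(outer()))  # prints 1
```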
I would expect 1000 to be the right answer! By the time it runs, 1000 (or mask_errors=false, to use a less toy example) is what its own controlling scope requested.
If you are sure that you want the value frozen earlier, please make this desire very explicit ... this example is the first place I noticed it. And please explain what this means for things like signal or warning masking.
ContextKey is declared once for the code that uses it. Nobody else will use that key. Keys have names only for introspection purposes; the implementation doesn't use them, iow:
    var = new_context_key('aaaaaa')
    var.set(1)
    # EC = [..., {var: 1}]
    # Note that the EC has the "var" object itself as the key in the
    # mapping, not "aaaaaa".
This I had also not realized. So effectively, the keys are based on object identity, with some safeguards to ensure that even starting with the same (interned) name will not produce the same object unless you passed it around explicitly, or are in the same code unit (file, typically).
This strikes me as reasonable, but still surprising. (I think of variables as typically named, rather than identified by address.) So please make this more explicit as well.
-jJ