How about:
async def wait_to_run(async_fn, *args):
    await wait_for_something()
    return await async_fn(*args)

task = loop.create_task(wait_to_run(myfunc, ...))
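For concreteness, here's a rough sketch of one way to wire this up, with wait_for_something() standing in for waiting on an asyncio.Event that the caller sets whenever it decides the work should start (the "gate" name and the myfunc body are just placeholders):

import asyncio

async def myfunc(x):
    return x * 2

async def wait_to_run(gate, async_fn, *args):
    await gate.wait()              # stands in for wait_for_something()
    return await async_fn(*args)

async def main(loop):
    gate = asyncio.Event()
    task = loop.create_task(wait_to_run(gate, myfunc, 21))
    # The Task object exists right away: you can add done callbacks,
    # cancel it, inspect it, etc., even though myfunc hasn't started.
    task.add_done_callback(lambda t: print("done:", t.result()))
    gate.set()                     # decide later that it should actually run
    print(await task)              # 42

loop = asyncio.get_event_loop()
loop.run_until_complete(main(loop))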
-----
Whatever strategy you use, you should also think about what semantics you want if one of these delayed tasks is cancelled before it starts.
For regular, non-delayed tasks, Trio makes sure that even if a task gets cancelled before it starts, it still gets scheduled and runs until its first cancellation point. This is necessary for correct resource hand-off between tasks:
async def some_task(handle):
    with handle:
        await ...
If we skipped running this task entirely, then the handle wouldn't be closed properly; scheduling it once allows the with block to run, and then get cleaned up by the cancellation exception. I'm not sure but I think asyncio handles pre-cancellation in a similar way. (Yury, do you know?)
Now, in the delayed-task case, there's a similar issue. If you want to keep the same solution, then you might want to instead write:
# asyncio
async def wait_to_run(async_fn, *args):
    try:
        await wait_for_something()
    except asyncio.CancelledError:
        # have to create a subtask to make it cancellable
        subtask = loop.create_task(async_fn(*args))
        # then cancel it immediately
        subtask.cancel()
        # and wait for the cancellation to be processed
        return await subtask
    else:
        return await async_fn(*args)
In trio, this could be simplified to
# trio
async def wait_to_run(async_fn, *args):
    try:
        await wait_for_something()
    except trio.Cancelled:
        pass
    return await async_fn(*args)
(This works because of trio's "stateful cancellation" – if the whole thing is cancelled, then as soon as async_fn hits a cancellation point the exception will be re-delivered.)
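(As a rough illustration of how the trio version might be driven; the trio.Event gate and myfunc are placeholder names, not part of the proposal:)

import trio

async def myfunc(x):
    print("running with", x)

async def wait_to_run(gate, async_fn, *args):
    try:
        await gate.wait()          # stands in for wait_for_something()
    except trio.Cancelled:
        pass
    return await async_fn(*args)

async def main():
    gate = trio.Event()
    async with trio.open_nursery() as nursery:
        nursery.start_soon(wait_to_run, gate, myfunc, 21)
        await trio.sleep(1)        # the child just sits waiting on the gate
        gate.set()                 # now myfunc actually runs

trio.run(main)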
-n
On Wed, Jun 13, 2018, 13:47 Michel Desmoulin <desmoulinmichel@gmail.com> wrote:
I was working on concurrency-limiting code for asyncio, so that the user
may submit as many tasks as they want, but only a maximum number of tasks
will be submitted to the event loop at the same time.
However, I wanted passing an awaitable to always return a task, whether
the task is currently scheduled or not. The goal is that you could add
done callbacks to it, decide to force-schedule it, etc.
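(For context, the simplest gating approach, wrapping each coroutine behind an asyncio.Semaphore, does hand back a real Task right away, but it schedules the wrapper immediately, which is what the rest of this mail is trying to avoid. The names below are mine:)

import asyncio

def submit_limited(loop, sem, async_fn, *args):
    async def runner():
        async with sem:                 # at most N coroutine bodies run at once
            return await async_fn(*args)
    # A real Task is returned immediately, so callers can attach done
    # callbacks or cancel it, even though the work waits on the semaphore.
    return loop.create_task(runner())

async def main(loop):
    sem = asyncio.Semaphore(2)          # max 2 tasks actually executing
    tasks = [submit_limited(loop, sem, asyncio.sleep, 0.1) for _ in range(10)]
    await asyncio.gather(*tasks)

loop = asyncio.get_event_loop()
loop.run_until_complete(main(loop))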
I dug in the asyncio.Task code, and encountered:
def __init__(self, coro, *, loop=None):
    ...
    self._loop.call_soon(self._step)
    self.__class__._all_tasks.add(self)
I was surprised to see that instantiating a Task has any side effects at
all, let alone two, one of them being that it is immediately scheduled
for execution.
I couldn't find a clean way to do what I wanted: either you call
loop.create_task() and you get a task, but it runs; or you don't run
anything, but you don't get a nice task object to hold on to.
I tried several alternatives, like returning a future and binding the
awaiting of the future to the submission of a task, but that was
complicated code that duplicated a lot of things.
I tried creating a custom Task, but that was even harder: I had to set a
custom event loop policy to provide a custom event loop with my own
create_task() accepting extra parameters. That's a lot to do just to pass
a parameter to Task, especially if you already use a custom event loop
(e.g. uvloop). I was expecting to only have to create a task factory, but
task factories can't get any additional parameters from create_task().
Additionally, I can't use ensure_future(), as it doesn't allow passing any
parameters to the underlying Task, so if I want to accept any awaitable in
my signature, I need to provide my own custom ensure_future().
All those implementations access a lot of _private_api, and do other
shady things that linters hate; plus they are fragile at best. What's
more, Task being rewritten in C prevents things like setting self._coro,
so we can only inherit from the pure Python slow version.
In the end, I can't even await the lazy task, because it blocks the
entire program.
Hence I have 2 distinct but related proposals, which are independent of each other:
- Allow a Task to be created but not scheduled for execution, and add a
parameter to ensure_future() and create_task() to control this. Awaiting
such a task would just behave like asyncio.sleep(0) until it is scheduled
for execution.
- Add a parameter to ensure_future() and create_task() named "kwargs"
that accepts a mapping and will be passed as **kwargs to the underlying
created Task (a sketch of both call sites follows below).
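(Purely to make the proposed call sites concrete; none of these parameters exist in asyncio today, and the names are invented:)

import asyncio

async def fetch():
    ...

loop = asyncio.get_event_loop()

# Proposal 1 (hypothetical): create the task without scheduling it.
task = loop.create_task(fetch(), lazy=True)    # "lazy" is an invented name
# ... later, force-schedule it, or simply await it.

# Proposal 2 (hypothetical): forward a mapping as **kwargs to the Task class.
task2 = asyncio.ensure_future(fetch(), kwargs={"my_option": True})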
I insist on the fact that the 2 proposals are independent, so please
don't reject both if you don't like one or the other. Passing a
parameter to the underlying custom Task is still of value even without
the unscheduled instantiation, and vice versa.
Also, if somebody has any idea on how to make a LazyTask that we can
await on without blocking everything, I'll take it.
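(One workaround along the lines of the wait_to_run wrapper at the top of this thread; the class and attribute names are mine, it assumes it is constructed with the loop it will run on, and cleanup of wrappers that are never started is ignored. The idea is to hide the gate inside a small wrapper object whose __await__ sets the gate before delegating to the real Task, so awaiting it doesn't block forever:)

import asyncio

class LazyTask:
    def __init__(self, coro, *, loop=None):
        self._loop = loop or asyncio.get_event_loop()
        self._started = asyncio.Event()
        # A real Task is created immediately, but it only waits on the gate.
        self._task = self._loop.create_task(self._wrap(coro))

    async def _wrap(self, coro):
        await self._started.wait()
        return await coro

    def start(self):
        # Explicitly schedule the wrapped coroutine for execution.
        self._started.set()

    def add_done_callback(self, cb):
        self._task.add_done_callback(cb)

    def cancel(self):
        return self._task.cancel()

    def __await__(self):
        # Awaiting a LazyTask forces it to start instead of blocking forever.
        self.start()
        return self._task.__await__()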