How to tell typecheckers that a function will run with an augmented namespace
Hello. I have a question here that I couldn't find an answer to on this forum or via Google.
Is it possible to somehow declare that a function's global namespace is extended with bound methods from some instance? Here's an example:
class Machinery:
    def __init__(self, a: int):
        self.a = a

    def do_thing(self, b: int):
        print(self.a + b)

def do_thing(b: int): ...

def script():
    do_thing(3)

if __name__ == "__main__":
    machinery = Machinery(2)
    exec(script.__code__, globals=globals() | {"do_thing": machinery.do_thing})
The stub def do_thing(b: int): ... is the best I've come up with, but I'd like to automate it somehow for a whole list of methods from instances like machinery.
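Not from the thread, but one way to automate the runtime half of this (a minimal sketch, assuming the Machinery, script, and stub above) is to collect an instance's public bound methods with inspect and splice them into the globals passed to exec; the static stubs for the type checker still have to be written (or generated) separately:

import inspect

def bound_methods(obj) -> dict:
    # Collect every public bound method of the instance, keyed by name.
    return {
        name: member
        for name, member in inspect.getmembers(obj, inspect.ismethod)
        if not name.startswith("_")
    }

if __name__ == "__main__":
    machinery = Machinery(2)
    # Passing globals positionally works on older Pythons as well.
    exec(script.__code__, globals() | bound_methods(machinery))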
MegaIng (Cornelius Krupp) May 4, 2025, 1:03pm 2
No, this is not currently possible. Type checkers assume a static execution environment, and many of Python's dynamic features cannot currently be represented.
WizardUli (WizardUli) May 4, 2025, 1:11pm 3
Thank you. I'll go the route of defining dummy functions in the "normal" global namespace then, and perhaps write a test that their signatures match the signatures of the specific methods (excluding their self param).
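Such a signature-matching test could use inspect.signature to compare the module-level stub against the method with self stripped; a minimal sketch, assuming the do_thing stub and Machinery from the first post are importable in the test module:

import inspect

def test_do_thing_signature_matches():
    stub_sig = inspect.signature(do_thing)
    method_sig = inspect.signature(Machinery.do_thing)
    # Drop the leading `self` parameter before comparing.
    params = list(method_sig.parameters.values())[1:]
    assert stub_sig == method_sig.replace(parameters=params)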
JamesParrott (James Parrott) May 4, 2025, 2:06pm 4
Is there some other specific requirement to use exec, .__code__ and globals? In day-to-day coding with no need for heavy wizardry and introspection, I treat the latter two as code smells (and in my opinion exec should be deprecated).
Functions are first-class citizens. Dependency injection, i.e. a passed-in callable with a default, is much cleaner:
def script(do_thing=do_thing):
    do_thing(3)

if __name__ == "__main__":
    machinery = Machinery(2)
    script(do_thing=machinery.do_thing)
There are monkey patching, mocking, and DI frameworks too.
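For example, with the standard library's unittest.mock you could swap in the bound method for the duration of a single call; a minimal sketch, assuming the example code lives in a hypothetical module my_script:

from unittest.mock import patch

from my_script import Machinery, script  # hypothetical module holding the example code

machinery = Machinery(2)

# Temporarily rebind my_script.do_thing to the bound method for this one run.
with patch("my_script.do_thing", machinery.do_thing):
    script()  # prints 5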
Andrej730 (Andrej) May 4, 2025, 4:13pm 5
You can hide it under if TYPE_CHECKING to ensure the dummy function won't show up unexpectedly at runtime, or even define it in local scope. You can also tie it to the original method, so you won't need to sync the signature manually.
from functools import partial
from typing import TYPE_CHECKING

def script():
    if TYPE_CHECKING:
        do_thing = partial(Machinery.do_thing, ...)
    do_thing(3)

if __name__ == "__main__":
    machinery = Machinery(2)
    exec(script.__code__, globals=globals() | {"do_thing": machinery.do_thing})
WizardUli (WizardUli) May 4, 2025, 5:23pm 6
I'm creating a small 3D modeling library/app built around symbolic mathematics and systems of equations using SymPy.
Users are going to be able to provide functions representing modeling operations (as well as a whole model) → the script function from my earlier example.
Inside these functions, user code should be able to call a couple dozen methods on some context: Context (perhaps I should call it a builder), i.e. the machinery from my earlier example. Such operations can of course be executed many times over different contexts.
From what I wrote, it seems the operations a user provides should be of type Callable[[Context], None].
But a friend told me that he would find it a PITA having to write
def my_op(context: Context):
    context.doThing()
    context.doOtherThing()
    ...
and repeat the context. prefix like a parrot when it's completely clear that those methods execute over some context, and I kind of concur.
Having the user declare each context method they use as a parameter of their function is also cumbersome, so I thought about injecting those methods into globals for each run of such a function. What other mechanism in Python allows me to inject something into a function's scope for a specific run of the function?
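(One such mechanism, not discussed elsewhere in the thread and closely related to the exec approach above, is rebuilding the function object around the same code with augmented globals via types.FunctionType; a minimal sketch using the example names:)

import types

machinery = Machinery(2)

# Build a new function around the same code object, but with augmented globals.
injected = types.FunctionType(
    script.__code__,
    globals() | {"do_thing": machinery.do_thing},
    script.__name__,
)
injected()  # prints 5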
MegaIng (Cornelius Krupp) May 4, 2025, 5:31pm 7
There are a few options:
- Make the users subclass some base class that provides (just the signatures of) these functions. This still requires all of them to use self, but that is far more established in the tooling (see the sketch after this list).
- Use c: Context. That at least reduces the amount of typing required (I kind of wish Python had something like Nim's using statement for this).
- Use a from ??? import * at the top of these script modules. The actual functions can then get the required context object from a global variable, a thread-local variable, or a ContextVar, as required by your application.
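A minimal sketch of the first option, with hypothetical class and method names: the library ships a base class whose methods carry the real signatures and forward to the context, and users write their operations as methods on a subclass:

class Context:
    def do_thing(self, b: int) -> None:
        print("doing", b)

class ScriptBase:
    # Thin wrappers expose the context's methods with their real signatures,
    # so users write self.do_thing(...) and type checkers see the right types.
    def __init__(self, context: Context) -> None:
        self._context = context

    def do_thing(self, b: int) -> None:
        self._context.do_thing(b)

class MyScript(ScriptBase):
    def build(self) -> None:
        self.do_thing(3)

MyScript(Context()).build()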
JamesParrott (James Parrott) May 4, 2025, 5:31pm 8
The app sounds great. But I think you're doing your users (and yourself) a disservice by requiring them to write Python while hiding from them how Python's scopes work, and using your own auto-magic instead.
Suffice to say I disagree entirely with your friend. Tell them to stop expecting computers to read their mind, and that when writing code it is necessary to specify exactly what they mean, including, but not limited to, passing arguments to functions.
What other mechanism in python allows me to inject something into a function’s scope for a specific run of the function?
Put context, with its two dozen methods, in a module and require your users to import it, e.g.:
You write:
my_lib.py
class Machinery:
    def __init__(self, a: int):
        self.a = a

    def doThing(self):
        ...

    def doOtherThing(self):
        ...
Your users write:
from my_lib import Machinery

context = Machinery(2)

def script():
    context.doThing()
    context.doOtherThing()

script()
They could then even do everything in global scope, without script at all.
tjreedy (Terry Jan Reedy) May 5, 2025, 9:55pm 9
I agree that WizardUli's use of exec is smelly. But exec() is needed to write Python code that will run Python code in the same process as the top-level code. The code module, IDLE's Shell, and the new REPL are examples. exec just exposes to Python code some of the functionality of the interpreter.
Eneg (Eneg) May 6, 2025, 7:29pm 10
contextvars.ContextVar could be of use here. A user would import doThing/doOtherThing into global scope and call them like normal functions. Those functions would get the appropriate Machinery from an internal ContextVar.
from my_lib import doThing, doOtherThing, set_machinery

set_machinery(2)

def script():
    doThing()
    doOtherThing()

script()
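The library side of that could look roughly like this; a minimal sketch, not from the post, with the Machinery body and helper signatures assumed:

# my_lib.py (sketch)
from contextvars import ContextVar

class Machinery:
    def __init__(self, a: int):
        self.a = a

    def do_thing(self) -> None:
        print("doing with", self.a)

    def do_other_thing(self) -> None:
        print("other thing with", self.a)

# Holds the Machinery the module-level wrappers should act on.
_machinery: ContextVar[Machinery] = ContextVar("machinery")

def set_machinery(a: int) -> None:
    _machinery.set(Machinery(a))

def doThing() -> None:
    _machinery.get().do_thing()

def doOtherThing() -> None:
    _machinery.get().do_other_thing()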