Issue 31058: FileFinder fails to find modules for import if modules are created at runtime and don't result in a directory mtime update

If modules are added to a package namespace at runtime, there is a chance that they will not be properly detected if adding the module does not also result in an update to the parent directory's st_mtime attribute.

This manifests as the module in question being unimportable, even though it clearly exists on disk and can be imported by a second Python interpreter opened in parallel.

Attached is an SSCCE that reproduces the issue. On my Windows dev machine it works flawlessly on both Python 2.7 and 3.6; on a Linux VM it works on Python 2.7 but fails fairly consistently on Python 3.6.
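The attached SSCCE is not reproduced here, but the failure mode is roughly the following sequence (a hedged sketch; demo_pkg, first.py and late.py are hypothetical names, not from the attachment):

    import os
    import sys
    import tempfile

    # Create a throwaway package on sys.path with one submodule.
    root = tempfile.mkdtemp()
    pkg_dir = os.path.join(root, "demo_pkg")
    os.mkdir(pkg_dir)
    for name, body in (("__init__.py", ""), ("first.py", "A = 1\n")):
        with open(os.path.join(pkg_dir, name), "w") as f:
            f.write(body)
    sys.path.insert(0, root)

    from demo_pkg import first  # primes the FileFinder cache for pkg_dir

    # Add a second submodule immediately afterwards. If the directory's
    # st_mtime is unchanged (e.g. coarse filesystem timestamp granularity),
    # the cached directory listing is reused and the new file goes unnoticed.
    with open(os.path.join(pkg_dir, "late.py"), "w") as f:
        f.write("B = 2\n")

    try:
        from demo_pkg import late
    except ImportError:
        print("late.py exists on disk but the stale FileFinder cache hides it")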

I'm working around the issue by walking sys.path_importer_cache and resetting each FileFinder's _path_mtime to 0, forcing the cache to be rebuilt on the next call to FileFinder.find_spec().
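A minimal sketch of that workaround, assuming the private _path_mtime attribute keeps its current behaviour (it is an implementation detail and may change between Python versions):

    import sys

    def invalidate_file_finder_caches():
        # Walk the cached path entry finders and mark each FileFinder's
        # directory listing as stale, so find_spec() re-reads the directory.
        for finder in sys.path_importer_cache.values():
            if finder is not None and hasattr(finder, "_path_mtime"):
                finder._path_mtime = 0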

This bug is admittedly a bit of a corner case, but I did end up spending many hours trying to figure out what was going on, so whether or not it gets fixed, I hope this is useful information for someone. Thanks for your attention.

This is a known limitation of the import caching mechanism, and is the key reason https://docs.python.org/3/library/importlib.html#importlib.invalidate_caches is offered as a supported public API.

Ideally, the code that dynamically creates packages at runtime will call that itself; if it doesn't, the integrating application needs to call it at appropriate points in program execution.
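Continuing the hypothetical sketch above (demo_pkg and pkg_dir are the illustrative names introduced there), the supported fix is a single call after the new module is written:

    import importlib
    import os

    # Write the new submodule, then tell the import machinery that path
    # entries may have changed so cached directory listings are refreshed.
    with open(os.path.join(pkg_dir, "late.py"), "w") as f:
        f.write("B = 2\n")
    importlib.invalidate_caches()

    from demo_pkg import late  # found even if the directory mtime did not change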