[Python-Dev] Usefulness of binary compatibility across Python versions?
Nathaniel Smith njs at pobox.com
Sun Dec 17 15:05:39 EST 2017
On Dec 16, 2017 11:44 AM, "Guido van Rossum" <guido at python.org> wrote:
On Sat, Dec 16, 2017 at 11:14 AM, Antoine Pitrou <solipsis at pitrou.net> wrote:
On Sat, 16 Dec 2017 19:37:54 +0100 Antoine Pitrou <solipsis at pitrou.net> wrote:
> Currently, you can pass a module_api_version to PyModule_Create2(),
> but that function is for specialists only :-)
>
> ("""Most uses of this function should be using PyModule_Create()
> instead; only use this if you are sure you need it.""")

Ah, it turns out I misunderstood that piece of documentation and also what PEP 3121 really did w.r.t. the module API check.

PyModule_Create() is actually a macro calling PyModule_Create2() with the version number it was compiled against!

#ifdef Py_LIMITED_API
#define PyModule_Create(module) \
        PyModule_Create2(module, PYTHON_ABI_VERSION)
#else
#define PyModule_Create(module) \
        PyModule_Create2(module, PYTHON_API_VERSION)
#endif

And there's already a check for that version number in moduleobject.c:
https://github.com/python/cpython/blob/master/Objects/moduleobject.c#L114

That check is always invoked when calling PyModule_Create() and PyModule_Create2(). Currently it merely emits a warning, but we can easily turn that into an error.

(with apologies to Martin von Löwis for not fully understanding what he did at the time :-))
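(As a practical aside, that warning can already be promoted to an error from the Python side with a standard warnings filter, without touching CPython. A rough sketch, assuming the "Python C API version mismatch" RuntimeWarning emitted by the moduleobject.c check linked above; "some_ext" is just a placeholder for an extension built against an older API version:)

import warnings

# Escalate the C API version mismatch warning into an error, so a
# stale extension refuses to import instead of limping along.
# The message pattern is an assumption based on the moduleobject.c
# check; "some_ext" is a hypothetical extension module name.
warnings.filterwarnings(
    "error",
    message=r"Python C API version mismatch",
    category=RuntimeWarning,
)

try:
    import some_ext  # hypothetical module built against an older Python
except RuntimeWarning as exc:
    print("refusing to load stale extension:", exc)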
If it's only a warning, I worry that if we stop checking the flag bits it can cause wild pointer following. This sounds like it would be a potential security issue (load a module, ignore the warning, try to use a certain API on a class it defines, boom). Also, could there still be 3rd party modules out there that haven't been recompiled in a really long time and use some older backwards compatible module initialization API? (I guess we could stop supporting that and let them fail hard.)
I think there's a pretty simple way to avoid this kind of problem.
Since PEP 3149 (Python 3.2), the import system has (IIUC) checked for:
foo.cpython-XYm.so
foo.abi3.so
foo.so
If we drop foo.so from this list, then we're pretty much guaranteed not to load anything into a python that it wasn't intended for.
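(To make that concrete, the list the import machinery consults is exposed as importlib.machinery.EXTENSION_SUFFIXES. A quick sketch, with the caveat that the exact tags vary by platform and Python version:)

import importlib.machinery

# Suffixes tried for extension modules, most specific first.  On a
# typical CPython 3.x Linux build this is something like
# ['.cpython-37m-x86_64-linux-gnu.so', '.abi3.so', '.so'];
# exact tags vary by version and platform.
for suffix in importlib.machinery.EXTENSION_SUFFIXES:
    print(suffix)

# The proposal here is to drop the bare '.so' entry, which is the only
# one that carries no version or ABI information at all.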
How disruptive would this be? AFAICT there hasn't been any standard way to build python extensions named like 'foo.so' since 3.2 was released, so we're talking about modules from 3.1 and earlier (or else people who are manually hacking around the compatibility checking system, who can presumably take care of themselves). We've at a minimum been issuing warnings about these modules for 5 versions now (based on Antoine's analysis above), and I'd be really surprised if a module built for 3.1 works on 3.7 anyway. So this change seems pretty reasonable to me.
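(And a rough way to gauge the disruption on a given machine is to scan sys.path for extensions that would only match the bare '.so' suffix. A sketch, assuming a Unix-style build where '.so' appears in the suffix list:)

import importlib.machinery
import os
import sys

# List extension modules on sys.path that rely on the bare '.so'
# suffix, i.e. carry no version or ABI tag; those are the only files
# the proposed change would stop loading.
tagged = [s for s in importlib.machinery.EXTENSION_SUFFIXES if s != ".so"]

for entry in sys.path:
    if not os.path.isdir(entry):
        continue
    for name in os.listdir(entry):
        if name.endswith(".so") and not any(name.endswith(t) for t in tagged):
            print(os.path.join(entry, name))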
-n