[Python-Dev] future-proofing vector tables for python APIs: binary-module interoperability
Luke Kenneth Casson Leighton lkcl at lkcl.net
Sat Jan 24 12:15:45 CET 2009
On Fri, Jan 23, 2009 at 10:48 PM, Roumen Petrov <bugtrack at roumenpetrov.info> wrote:
> python.exe (say, the official one) loads python25.dll. Then, an import is made of a mingw extension, say foo.pyd, which is linked with libpython2.5.dll, which then gets loaded. Voila, you have two interpreters in memory, with different type objects, memory heaps, and so on.
> ok, there's a solution for that - the gist of the solution is already implemented in things like Apache Runtime and Apache2 (modules), and is an extremely common standard technique implemented in OS kernels. the "old school" name for it is "vector tables".
> [SNIP]
> Did you think that this will escape python MSVC from "Assembly hell" ?
let me think about that.... write some things down, i might have an answer at the end :)
but it would certainly mean that there would be a future-proof path for binary modules built against either an msvc-compiled or a mingw-compiled 2.5, 2.6, 2.7 etc. to work with 2.5, 2.6, 2.7, 2.8 etc. without a recompile. [forwards compatibility is possible too, but it's a bit more complicated; backwards compatibility is easy].
what you do is you make sure that the vector-table is always and only "extended" - added to - never "removed from" or altered. if one function turns out to be a screw-up (inadequate, not enough parameters), you do NOT change its function parameters, you add an "Ex" version - or an "Ex1" version.
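to make that concrete, here's a rough sketch in C of what an append-only vector-table might look like - every struct and function name here is made up purely for illustration, it's not an actual or proposed python API:

    #include <stddef.h>

    /* hypothetical sketch only - none of these names are real python APIs. */

    /* revision 1 of the table.  alloc_object turns out to be inadequate
     * (no flags parameter), but its slot is NEVER altered or removed -
     * already-compiled binary modules still find it at the same offset. */
    typedef struct {
        void *(*alloc_object)(size_t size);
        int   (*release_object)(void *obj);
    } py_vtable_v1;

    /* revision 2: the table is only ever EXTENDED at the end, and the
     * fixed-up call gets an "Ex" name - exactly as microsoft does. */
    typedef struct {
        void *(*alloc_object)(size_t size);            /* unchanged slot */
        int   (*release_object)(void *obj);            /* unchanged slot */
        void *(*alloc_object_ex)(size_t size,
                                 unsigned int flags);  /* appended */
    } py_vtable_v2;

because v2 is a strict prefix-extension of v1, a pointer to a v2 table can be handed to a module that only knows about v1 and nothing breaks.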
just like microsoft does. [.... now you know why they do that "ridiculous" thing of adding FunctionEx1, FunctionEx2 - and if you look at the MSHTML specification, i think they go up to six revisions of the same function in one case!]
to detect revisions of the vector-table you use a "negotiation" tactic. you add a bit-field at the beginning of the struct, and each bit expresses a "new revision", indicating that the vector table has been extended (and so needs to be typecast to a different struct - exactly as is done when a PyObject is typecast to different structs). the first function in the vector-table is one which the module must call (in its initXXXX()) to pass the module's "version number" to the python runtime - just in case someone needs to know.
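continuing the made-up sketch from above (again: illustrative names only, not real python APIs), the negotiation part might look something like this - the bit-field goes at the very front and the registration call is the very first slot:

    #include <stddef.h>

    /* hypothetical negotiation sketch - uses the illustrative names from
     * the sketch above; none of them are real python APIs. */

    #define PY_VTABLE_REV_2  0x0001   /* bit set: alloc_object_ex appended */
    #define PY_VTABLE_REV_3  0x0002   /* bit set: some later extension     */

    /* the base table every module can rely on.  the bit-field sits at the
     * beginning, and the first function slot is the one a module calls
     * from its initXXXX() to hand its own version number over to the
     * python runtime. */
    typedef struct {
        unsigned int revisions;   /* each set bit = "table extended again" */
        int (*register_module)(const char *module_name, int module_version);
        void *(*alloc_object)(size_t size);
        int   (*release_object)(void *obj);
    } py_vtable;

    /* when the relevant bit is set, the table may be typecast to the
     * longer struct - the same trick as typecasting PyObject to a
     * larger struct. */
    typedef struct {
        py_vtable base;
        void *(*alloc_object_ex)(size_t size, unsigned int flags);
    } py_vtable_rev2;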
but for the most part the initiation of function call-outs is done from the modules, so a module will never try to call something beyond what it understands.
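and on the module side, the check might look something like this (still using the made-up names from the sketches above):

    /* hypothetical module init - uses py_vtable, py_vtable_rev2 and
     * PY_VTABLE_REV_2 from the sketch above.  the module only calls
     * entries it can prove the runtime provides, so it never reaches
     * beyond what it understands. */

    #define MY_MODULE_VERSION  1

    void initfoo(py_vtable *vt)
    {
        /* tell the runtime which version this module was built for. */
        vt->register_module("foo", MY_MODULE_VERSION);

        if (vt->revisions & PY_VTABLE_REV_2) {
            /* the runtime is new enough: safe to typecast and use the
             * extended entry. */
            py_vtable_rev2 *vt2 = (py_vtable_rev2 *)vt;
            void *obj = vt2->alloc_object_ex(128, 0);
            vt->release_object(obj);
        } else {
            /* older runtime: fall back to the original entry. */
            void *obj = vt->alloc_object(128);
            vt->release_object(obj);
        }
    }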
but basically, not only is this technique nothing new - it's in use in Apache RunTime, FreeDCE, the NT Kernel and the Linux Kernel - it's actually already in use, in one form, in the way that python objects are typecast from PyObject to other struct types. the difference is that a bit-field makes detection of revisions a bit easier, but to be honest you could just as easily make it an int and increase the revision number.
.... ok, i've thought about your question, and i think it might [save us from assembly hell].
what you would likely have to do is compile individual modules with assemblies, should they need them. for example, the msvcrt module would obviously have to be.... hey, that'd be interesting: how about having differently-linked versions of the msvcrt module?
coool :)
in the mingw builds it's not necessary to link PC/msvcrtmodule.o into the python dll - so (and this confused the hell out of me for a minute, until i did a find . -name "msvcrt*") you end up with a Modules/msvcrt.pyd.
surely that should be the only dll which gets specifically linked against msvcr71.dll (or 90, or... whatever), and it would be even better if it then got named msvcr71.pyd, msvcr90.pyd etc.
i'll do an experiment, later, to confirm that this actually does work - i.e. creating an msvcr80.pyd with "mingw gcc -specs=msvcr80".
the neat thing is that, if it works, you wouldn't need to force people to link the python dll or the python exe against msvcr90 or any other particular version.
and the mingw-built python.exe or python dll would be interchangeable, as it would be the specific modules that required specific versions of the msvc runtime.
l.