[Python-Dev] Usefulness of binary compatibility across Python versions?
Antoine Pitrou solipsis at pitrou.net
Sat Dec 16 08:22:57 EST 2017
- Previous message (by thread): [Python-Dev] New crash in test_embed on macOS 10.12
- Next message (by thread): [Python-Dev] Usefulness of binary compatibility across Python versions?
- Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
Hello,
Nowadays we have an official mechanism for third-party C extensions to be binary-compatible across feature releases of Python: the stable ABI.
But for C extensions that do not use the stable ABI, there are also mechanisms in place to try to ensure binary compatibility. One of them is the way in which we add tp_ slots to the PyTypeObject structure.
Typically, when adding a tp_XXX slot, you also need to add a Py_TPFLAGS_HAVE_XXX type flag to mark those static type structures that have been compiled against a recent enough PyTypeObject definition. This way, extensions compiled against Python N-1 are supposed to "still work": since they don't have Py_TPFLAGS_HAVE_XXX set, the core Python runtime won't try to access the (non-existent) tp_XXX member.
However, besides the internal code complication, it means we need to add a new Py_TPFLAGS_HAVE_XXX each time we add a slot. Since we have only 32 such bits available (many of them already taken), it is a very limited resource. Is it worth it? (*) Can an extension compiled against Python N-1 really claim to be compatible with Python N, despite other possible differences?
(*) we can't extend the tp_flags field to 64 bits, precisely because of the binary compatibility problem...
Regards
Antoine.