[Python-Dev] PEP 384: Defining a Stable ABI

M.-A. Lemburg mal at egenix.com
Tue May 26 18:28:59 CEST 2009


Nick Coghlan wrote:

> M.-A. Lemburg wrote:

>> Now, with the PEP, I have a feeling that the Python C-API will in effect
>> be limited to what's in the PEP's idea of a usable ABI and open up the
>> non-included public C-APIs to the same rate of change as the private APIs.

> Not really - before this PEP it was already fairly easy to write an
> extension that was source-level compatible with multiple versions of
> Python (depending on exactly what you wanted to do, of course).

Right and I hope that things stay that way.

> However, it is essentially impossible to make an extension that is binary
> level compatible with multiple versions.

On Windows, yes. On Unix, this often worked, even though it wasn't always safe to do.

In practice it's usually better to recompile extensions for every single release.

> With the defined stable ABI in place, each extension module author will
> be able to make a choice:
>
> - choose binary compatibility by limiting themselves to the stable ABI
>   and be able to provide a single binary that will still work with later
>   versions of Py3k
> - stick with source compatibility and continue to provide new binaries
>   for each version of Python

Great !

An optional cross-version ABI would certainly be a good thing.

>> Limiting the Python C-API would be counterproductive.

> I don't think anyone would disagree with that. A discussion on the C-API
> SIG would certainly be a good idea.

>>> During the compilation of applications, the preprocessor macro
>>> Py_LIMITED_API must be defined. Doing so will hide all definitions
>>> that are not part of the ABI.

>> So extensions wanting to use the full Python C-API as documented in the
>> C-API docs will still be able to do this, right ?

> Yep - they just wouldn't define the new macro.

Good !
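To make the mechanics concrete, here is a minimal sketch of an extension opting in to the stable ABI as the PEP describes (the module and function names are made up for illustration): define Py_LIMITED_API before including Python.h, and everything outside the ABI stays hidden.

    /* example.c -- hypothetical minimal extension built against the stable ABI */
    #define Py_LIMITED_API 1        /* must come before Python.h */
    #include <Python.h>

    static PyMethodDef example_methods[] = {
        {NULL, NULL, 0, NULL}       /* sentinel: no functions yet */
    };

    static struct PyModuleDef example_module = {
        PyModuleDef_HEAD_INIT,
        "example",                  /* module name */
        NULL,                       /* no docstring */
        -1,                         /* no per-module state */
        example_methods
    };

    PyMODINIT_FUNC
    PyInit_example(void)
    {
        return PyModule_Create(&example_module);
    }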

>>> Type Objects
>>> ------------

>>> The structure of type objects is not available to applications;
>>> declaration of "static" type objects is not possible anymore (for
>>> applications using this ABI).

>> Hmm, that's going to create big problems for extensions that want to
>> expose a C-API for their types: type checks are normally done by pointer
>> comparison using those static type objects.

> They would just have to expose "MyExtensionPrefix_MyType_Check" and
> "MyExtensionPrefix_MyType_CheckExact" functions the same way that types
> in the C API do.

Hmm, that's a function call per type check and will slow things down a lot, esp. when working with APIs that deal a lot with these objects.

The typical way to implement these type checks is via a simple pointer comparison (falling back to a function for sub-types). That's cheap and fast.
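As a sketch of the two approaches under discussion (the type and function names below are hypothetical): the classic macro does a plain pointer comparison against a static type object, while a stable-ABI extension would create its type at runtime (e.g. with PyType_FromSpec) and export a check function instead.

    #include <Python.h>

    /* Hypothetical extension type "MyType", used only for illustration. */
    extern PyTypeObject MyType_Type;    /* classic static type object          */
    extern PyObject *MyType_TypePtr;    /* heap type created via PyType_FromSpec
                                           when building against the stable ABI */

    /* Classic approach: cheap pointer comparison, but it needs direct access
       to the static type object, which the limited ABI no longer exposes.    */
    #define MyType_CheckExact(op)  (Py_TYPE(op) == &MyType_Type)
    #define MyType_Check(op)       PyObject_TypeCheck(op, &MyType_Type)

    /* Stable-ABI friendly approach: export the check as a real function, so
       consumers never need the type struct layout or the static object.      */
    int
    MyExtension_MyType_CheckExact(PyObject *op)
    {
        return Py_TYPE(op) == (PyTypeObject *)MyType_TypePtr;
    }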

>>> Functions and function-like Macros
>>> ----------------------------------

>>> Function-like macros (in particular, field access macros) remain
>>> available to applications, but get replaced by function calls (unless
>>> their definition only refers to features of the ABI, such as the
>>> various Check macros).

>> Including Py_INCREF() / Py_DECREF() ?

> I believe so - MvL deliberately left the fields that the ref counting
> relies on as part of the ABI.

Hmm, another slow-down. This one has even more impact if you're writing extensions that have to deal with lots of objects.
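For illustration, roughly what is at stake (the exact macro expansion varies with build options): because ob_refcnt stays part of the ABI, Py_INCREF/Py_DECREF can remain inline field updates rather than per-object function calls.

    #include <Python.h>

    /* In a release build the macro form is essentially an inline field
       update, something like ((PyObject *)(op))->ob_refcnt++ -- no call
       overhead. If refcounting had to go through the ABI as real function
       calls, code like this would pay that call cost once or twice per
       object it touches.                                                  */
    static void
    keep_reference(PyObject *item, PyObject **slot)
    {
        Py_INCREF(item);       /* cheap inline increment                   */
        Py_XDECREF(*slot);     /* drop the previous value, if any          */
        *slot = item;
    }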

>>> Excluded Functions
>>> ------------------

>>> Functions declared in the following header files are not part of the ABI:
>>>
>>> - cellobject.h
>>> - classobject.h
>>> - code.h
>>> - frameobject.h
>>> - funcobject.h
>>> - genobject.h
>>> - pyarena.h
>>> - pydebug.h
>>> - symtable.h
>>> - token.h
>>> - traceback.h

>> I don't think that's feasible: you basically remove all introspection
>> functions that way. This will need a more fine-grained approach.

> I don't think it is reasonable to expect the introspection interfaces to
> remain stable at a binary level across versions. Having "I want deep
> introspection support from C" and "I want to use a single binary for
> multiple Python versions" be mutually exclusive choices sounds like a
> perfectly sensible position to me.
>
> Also, keep in mind that even an extension module that restricts itself to
> Py_LIMITED_API would still be able to call in to the Python equivalents
> via PyObject_Call and friends (e.g. by importing and using the inspect
> and traceback modules).
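A sketch of what that could look like (assuming only stable-ABI calls such as PyImport_ImportModule and PyObject_CallMethod): the extension reaches the introspection machinery through the Python-level traceback module instead of traceback.h.

    #define Py_LIMITED_API 1
    #include <Python.h>

    /* Returns a new reference to a list of stack-frame strings, or NULL with
       an exception set -- without touching frameobject.h or traceback.h.    */
    static PyObject *
    format_current_stack(void)
    {
        PyObject *mod = PyImport_ImportModule("traceback");
        if (mod == NULL)
            return NULL;
        /* Equivalent to calling traceback.format_stack() from Python. */
        PyObject *frames = PyObject_CallMethod(mod, "format_stack", NULL);
        Py_DECREF(mod);
        return frames;
    }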

Sure, but they'd also want to print tracebacks or raise fatal errors if necessary.

What if you mix extensions that use the full C-API with ones that restrict themselves to the limited version ?

>> Would creating a Python object in a full-API extension and freeing it in
>> a limited-API extension cause problems ?

> Possibly, if you end up mixing C runtimes in the process. Specifically:
>
> 1. Python linked with MSVCRT X
> 2. Full extension module linked with MSVCRT Y
> 3. Limited extension module linked with MSVCRT Z
>
> The PyMem/PyObject APIs in the limited extension module will use the heap
> in MSVCRT X, since they will be redirected through the Python stable ABI
> as function calls. However, if the full extension module uses the macro
> forms and links with the wrong MSVCRT version, then you have the usual
> opportunities for conflicts between the two C runtimes.
>
> This isn't a problem created by defining a stable ABI though - it's the
> main reason mixing C runtimes is a bad idea. (The two others we have
> noted so far being IO issues, especially attempting to share FILE*
> instances, and the fact that changing the locale will only affect
> whichever runtime the extension module linked against.)
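To make the heap point concrete, a minimal sketch: the function forms of the allocation APIs always go through the interpreter, so the memory comes from, and returns to, the C runtime Python itself was linked against.

    #define Py_LIMITED_API 1
    #include <Python.h>

    /* Allocate and release through the same PyMem_* family: both calls are
       routed into the Python DLL, so the block lives on Python's own heap
       no matter which MSVCRT this extension happens to be compiled with.  */
    static int
    fill_buffer(Py_ssize_t n)
    {
        double *buf = PyMem_Malloc((size_t)n * sizeof(double));
        if (buf == NULL) {
            PyErr_NoMemory();
            return -1;
        }
        for (Py_ssize_t i = 0; i < n; i++)
            buf[i] = 0.0;
        PyMem_Free(buf);   /* never plain free(), and never the macro forms */
        return 0;
    }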

Of course, but the stable ABI encourages mixing extensions regardless of what runtime they were compiled with.

This is not much of an issue as long as the C runtime DLL doesn't change between releases, but it becomes a problem when it does, e.g. due to an upgrade to a new MSVC++ compiler version, or in case the extension was downloaded pre-compiled from PyPI or some other site.

I think the module import API should check for possible incompatibilities here and issue a warning (much like it does now for differences in the Python API version).
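Purely as a hypothetical sketch of the kind of check being suggested (the marker name and comparison below are made up; nothing like this is specified in the PEP): the import machinery, or the module itself, could compare a value recorded at build time against the running interpreter and warn on mismatch, much like the existing API-version warning.

    #include <Python.h>

    /* Hypothetical: 'built_against' would be stamped into the module at
       build time; the major/minor part of PY_VERSION_HEX stands in here for
       whatever runtime/ABI marker such a check would really use.            */
    static int
    warn_if_runtime_mismatch(unsigned long built_against)
    {
        if ((built_against >> 16) != (PY_VERSION_HEX >> 16)) {
            return PyErr_WarnEx(PyExc_RuntimeWarning,
                                "extension module was built against a "
                                "different Python/C runtime", 1);
        }
        return 0;   /* 0 on success, -1 if the warning became an error */
    }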

>>> Good point. As a separate issue, I would actually like to deprecate,
>>> then remove these APIs. I had originally hoped that this would happen
>>> for 3.0 already, alas, nobody worked on it.

>>> In any case, I have removed them from the ABI now.

>> How do you expect Python extensions to allocate memory and objects in a
>> platform independent way without those APIs ? And as an aside: which API
>> families are you referring to ? PyMem_Malloc, PyObject_Malloc, or
>> PyObject_New ?

> The ones with a FILE* parameter in the signature. There's no problem with
> the PyMem/PyObject functions, since those will be redirected to
> consistently use the version of the C runtime that Python was originally
> linked against (their macro counterparts are obviously off limits for the
> stable ABI).

Ah, ok.
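For reference, a quick sketch of the three allocation families mentioned above, using their function forms (which is what the stable ABI keeps):

    #include <Python.h>

    static void
    allocator_families_demo(void)
    {
        /* PyMem_*: raw memory from Python's allocator (like malloc/free).  */
        void *raw = PyMem_Malloc(128);
        if (raw != NULL)
            PyMem_Free(raw);

        /* PyObject_Malloc/PyObject_Free: the object allocator that sits
           underneath object construction.                                   */
        void *obj_mem = PyObject_Malloc(64);
        if (obj_mem != NULL)
            PyObject_Free(obj_mem);

        /* Object construction proper -- PyObject_New is a macro that needs
           the concrete struct, so a ready-made constructor stands in here.  */
        PyObject *list = PyList_New(0);
        Py_XDECREF(list);
    }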

Thanks,

Marc-Andre Lemburg eGenix.com

Professional Python Services directly from the Source (#1, May 26 2009)

Python/Zope Consulting and Support ...        http://www.egenix.com/
mxODBC.Zope.Database.Adapter ...              http://zope.egenix.com/
mxODBC, mxDateTime, mxTextTools ...           http://python.egenix.com/


2009-06-29: EuroPython 2009, Birmingham, UK 33 days to go

::: Try our new mxODBC.Connect Python Database Interface for free ! ::::

eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/


