Message 189066 - Python tracker
Version 2 of my patch:
Mark> I would much prefer PyLong_AsIntMax_t not to use nb_int; it should work only for instances of 'int' (just as PyLong_AsSsize_t and PyLong_AsSize_t currently do).
I copied the code from PyLong_AsLongLong(), but the doc from PyLong_AsLong() :-/
So some PyLong_As*() functions call __int__(), but not all of them? It is a little surprising to have different behaviours, but Mark has more experience with these APIs, so I trust him :-)
I changed my code to only accept PyLongObject.
Mark> There's a missing 'versionadded' for PyLong_AsIntMax_t in the docs.
fixed
Mark> Will AC_CHECK_SIZEOF(intmax_t) work on platforms that don't define intmax_t? I don't know whether the #define created by the earlier AC_TYPE_INTMAX_T is available at that point. We'll probably find out from the buildbots.
I tested with a typo in configure.ac:
AC_CHECK_SIZEOF(uintmax32_t)
configure result:
checking size of uintmax32_t... 0
pyconfig.h:
#define SIZEOF_UINTMAX32_T 0
Should we undefine SIZEOF_UINTMAX32_T (in pyport.h) if its value is zero?
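If we do, a minimal guard could look like this (a sketch for pyport.h, assuming a zero value only ever comes from a failed configure check; I use the real macro names here, not the uintmax32_t typo above):

```c
/* Hypothetical pyport.h guard: AC_CHECK_SIZEOF() reports 0 when the
   type is unknown, so treat a zero size as "type not available". */
#if defined(SIZEOF_INTMAX_T) && SIZEOF_INTMAX_T == 0
#  undef SIZEOF_INTMAX_T
#endif
#if defined(SIZEOF_UINTMAX_T) && SIZEOF_UINTMAX_T == 0
#  undef SIZEOF_UINTMAX_T
#endif
```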
Mark> Do we also need an addition to PC/pyconfig.h to define (u)intmax_t and SIZEOF_(U)INTMAX_T on Windows?
Ah yes, I forgot Windows, but I don't have access to a Windows box right now. I modified PC/pyconfig.h, but I cannot test my patch.
I suppose that intmax_t and uintmax_t don't need to be defined (using typedef) with Visual Studio 2010 or later, since stdint.h is available.
For the SIZEOF, I chose 64 bits and added a new test in _testcapi (for all platforms). It looks like no platform has (hardware) 128-bit integers, and 64 bits should be correct on Windows.
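For reference, the PC/pyconfig.h addition could be along these lines (untested, since I have no Windows box; the _MSC_VER cutoff for Visual Studio 2010 is an assumption on my part):

```c
/* Hypothetical PC/pyconfig.h addition: VS 2010 and later ship
   <stdint.h>, so no typedef is needed; only the sizes (in bytes)
   must be spelled out for the configure-less Windows build. */
#if defined(_MSC_VER) && _MSC_VER >= 1600   /* Visual Studio 2010 */
#  include <stdint.h>
#endif
#define SIZEOF_INTMAX_T 8
#define SIZEOF_UINTMAX_T 8
```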
On 64-bit Linux, __int128 is available, but intmax_t is still only 64 bits.
Mark> For the PyLong_As* functions, it may be more efficient to code the conversion directly instead of using _PyLong_AsByteArray.
I copied the code from PyLong_AsLongLong() and PyLong_AsUnsignedLongLong(). If that code is changed, I would prefer to change all 4 PyLong_As*() functions at the same time. Don't you think so?
Mark> The PyLong_As* functions assume that intmax_t and uintmax_t have no padding bits, no trap representation, and (in the case of intmax_t) use two's complement. I think it's fine to assume all these things, but we should also either document or test those assumptions.
What is a "trap representation"?
I only know "two's complement". What are the other kinds?
How should we test those assumptions?
Mark> The patch lacks tests.
Which kind of tests do you have in mind?
Would you like to help me implement this new feature?