[Python-3000] PEP 31XX: A Type Hierarchy for Numbers (and other algebraic entities)

Guido van Rossum guido at python.org
Mon Apr 30 05:02:49 CEST 2007


On 4/29/07, Jim Jewett <jimjjewett at gmail.com> wrote:

On 4/29/07, Guido van Rossum <guido at python.org> wrote:
> Hmm... Maybe the conclusion to draw from this is that we shouldn't
> make Ring a class? Maybe it ought to be a metaclass, so we could ask
> isinstance(Complex, Ring)?

Yes; all the ABCs are assertions about the class.

I don't think so. Many are quite useful for introspection of instances as well, e.g. Hashable/Iterable (the whole "One Trick Ponies" section) as well as the distinction between Sequence and Mapping. It's the binary operations where the class comes into play.
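For example (a quick sketch using the collections.abc spellings of these ABCs, which may not match the PEP's exact names): plain isinstance() checks on instances already do the right thing here.

    from collections.abc import Hashable, Iterable, Mapping, Sequence

    # Instance-level introspection works fine for the "One Trick Ponies"...
    print(isinstance([1, 2, 3], Iterable))    # True
    print(isinstance([1, 2, 3], Hashable))    # False -- lists aren't hashable
    print(isinstance((1, 2, 3), Hashable))    # True

    # ...and for telling Sequences apart from Mappings.
    print(isinstance([1, 2, 3], Sequence), isinstance([1, 2, 3], Mapping))   # True False
    print(isinstance({"a": 1}, Sequence), isinstance({"a": 1}, Mapping))     # False True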

(Zope interfaces do support instance-specific interfaces, which has been brought up as a relative weakness of ABCs.)

The only thing two subclasses of an Abstract class need to have in common is that they both (independently) meet the requirements of the ABC. If not for complexity of implementation, that would be better described as a common metaclass.

Again, not so fast; it depends. The way the Set section of the PEP is currently written, all sets are comparable (in the subset/superset sense) to all other sets, and for ComposableSet instances the union, intersection and both types of differences are also computable across class boundaries.
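A quick sketch of that cross-class behaviour, using collections.abc.Set (whose comparison and composition mixins only need __contains__, __iter__ and __len__ from the concrete class; ListBackedSet is made up for illustration):

    from collections.abc import Set

    class ListBackedSet(Set):
        """Made-up Set implementation backed by a plain list."""
        def __init__(self, items):
            self._items = []
            for x in items:
                if x not in self._items:
                    self._items.append(x)
        def __contains__(self, x):
            return x in self._items
        def __iter__(self):
            return iter(self._items)
        def __len__(self):
            return len(self._items)

    a = ListBackedSet([1, 2, 3])
    b = frozenset([1, 2, 3, 4])
    print(a <= b)           # True: subset test works across class boundaries
    print(sorted(a & b))    # [1, 2, 3]: so does intersection, via the mixin __and__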

Using a metaclass would also solve the "when to gripe" issue; the metaclass would gripe if it couldn't make every method concrete. If this just used the standard metaclass machinery, then it would mean a much deeper metaclass hierarchy than we're used to; MutableSet would have a highly derived metaclass.

I think you're going way too fast here.

> The more I think about it, it sounds like the right thing to do. To
> take PartiallyOrdered (let's say PO for brevity) as an example, the
> Set class should specify PO as a metaclass. The PO metaclass could
> require that the class implement __lt__ and __le__. If it found a
> class that didn't implement them, it could make the class abstract by
> adding the missing methods to its __abstractmethods__ attribute.

Or by making it a sub(meta)class, instead of a (regular instance) class.

That makes no sense. Deciding on the fly whether something should be a class or a metaclass sounds like a fine recipe for end-user confusion.

> if it found that the class implemented one but not the other, it could
> inject a default implementation of the other in terms of the one and
> __eq__.

This also allows greater freedom in specifying which subsets of methods must be defined.

> Now, you could argue that Complex should also be a metaclass. While
> that may be mathematically meaningful (for all I know there are people
> doing complex number theory using Complex[Z/n]), for Python's numeric
> classes I think it's better to make Complex a regular class
> representing all the usual complex numbers (i.e. a pair of Real
> numbers).

complex already meets that need. Complex would be the metaclass representing the restrictions on the class, so that independent implementations wouldn't have to fake-inherit from complex.
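A rough sketch of the PartiallyOrdered metaclass mechanism quoted earlier (require __lt__ and __le__, derive one from the other and __eq__ when only one is given, otherwise leave the class abstract). Every name and detail below is illustrative, not from any PEP:

    from abc import ABCMeta

    class PartiallyOrderedMeta(ABCMeta):
        """Illustrative only: require __lt__/__le__, derive one from the
        other, or mark the class abstract if neither is supplied."""

        def __new__(mcls, name, bases, namespace):
            cls = super().__new__(mcls, name, bases, namespace)
            has_lt = "__lt__" in namespace
            has_le = "__le__" in namespace
            if has_lt and not has_le:
                # inject __le__ in terms of __lt__ and __eq__
                cls.__le__ = lambda self, other: self < other or self == other
            elif has_le and not has_lt:
                # inject __lt__ in terms of __le__ and __eq__
                cls.__lt__ = lambda self, other: self <= other and not self == other
            elif not (has_lt or has_le):
                # neither supplied: record them as abstract so the class
                # cannot be instantiated
                cls.__abstractmethods__ = frozenset(
                    cls.__abstractmethods__ | {"__lt__", "__le__"})
            return cls

    class MyNumber(metaclass=PartiallyOrderedMeta):
        def __init__(self, value): self.value = value
        def __eq__(self, other): return self.value == other.value
        def __lt__(self, other): return self.value < other.value
        # __le__ is injected by the metaclass

    print(MyNumber(1) <= MyNumber(2))   # True, via the injected __le__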

I was thinking of other representations of complex numbers as found e.g. in numpy. These vary mostly by using fewer (or more?) bits for the real and imag parts. They can't realistically subclass complex, as their implementation is independent; they should subclass Complex, to indicate that they implement the Complex API. I really think you're going too far with the metaclass idea.
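A sketch of what that looks like, using the Complex ABC as it exists in the numbers module and a made-up Complex32 stand-in for a numpy-style reduced-precision type; registration as a virtual subclass is shown here, though real subclassing of Complex would express the same intent:

    import numbers

    class Complex32:
        """Made-up stand-in for a reduced-precision complex type implemented
        independently of the built-in complex."""
        def __init__(self, real, imag=0.0):
            self._real = float(real)   # pretend these are narrower floats
            self._imag = float(imag)
        @property
        def real(self): return self._real
        @property
        def imag(self): return self._imag
        def __complex__(self):
            return complex(self._real, self._imag)
        # arithmetic methods omitted for brevity

    # Declare "this implements the Complex API" without inheriting from complex:
    numbers.Complex.register(Complex32)

    z = Complex32(1.0, 2.0)
    print(isinstance(z, numbers.Complex))   # True
    print(isinstance(z, complex))           # False -- no fake inheritance needed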

Now, if we had parameterizable types (for which I've proposed a notation, e.g. list[int] would be a list of integers, and Mapping[String, Real] would be a mapping from strings to real numbers; but I don't expect this to be in py3k, as it needs more experimentation), Complex might be a parameterizable type, and e.g. the current concrete complex type could be equated to Complex[float]; but without that, I think it's fine to see Complex as the Abstract Base Class and complex as one concrete representation.

> I expect that the complex subclasses used in practice are
> all happy under mixed arithmetic using the usual definition of mixed
> arithmetic: convert both arguments to a common base class and compute
> the operation in that domain.

It is reasonable to insist that all Complex classes have a way to transform their instances into (built-in) complex instances, if only as a final fallback. There is no need for complex to be a base class.

I agree complex shouldn't be a base class (apologies if I implied that by using lowercase) but I still think Complex should be a base class.

To be honest, I'm not sure what should happen with mixed operations between classes that only have an abstract common base class. The normal approach for binary operations is that each side gets a try. For pairs like int+float this is easy; int.__add__ returns NotImplemented in this case, and then float.__radd__ is called which converts the first argument to float and returns a float. For pairs like numpy's complex 32-bit float and numpy's complex 64-bit float it should also be easy (numpy is aware of both types and hence always gets to choose); and for numpy's complex combined with the built-in complex it's easy enough too (again, numpy always gets to choose, this time because the built-in complex doesn't know about numpy).
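For a concrete reminder of how that dance works, here is a made-up Meters type (purely illustrative):

    class Meters:
        """Made-up type illustrating the binary-operator dispatch protocol."""
        def __init__(self, value):
            self.value = value
        def __add__(self, other):
            if isinstance(other, Meters):
                return Meters(self.value + other.value)
            return NotImplemented            # decline; the other operand gets a try
        def __radd__(self, other):
            # only reached after type(other).__add__ returned NotImplemented
            if isinstance(other, (int, float)):
                return Meters(other + self.value)
            return NotImplemented

    print((Meters(2) + Meters(3)).value)   # 5, via Meters.__add__
    print((1 + Meters(3)).value)           # 4: int.__add__ declines, Meters.__radd__ runs
    # Meters(2) + "x" would raise TypeError: both sides return NotImplemented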

But what if I wrote my own complex type based on decimal.Decimal, and I encountered a numpy complex? numpy doesn't know about my type and hence passes the ball to me; but perhaps I don't know about numpy either, and then a TypeError will be raised. So, to be a good citizen, I could check if the other arg was a Complex of unknown provenance, and then I could convert to the built-in complex and pass the ball to that type. But if all good citizens lived by that rule (and it appears to be a reasonable rule), then numpy, being the ultimate good citizen, would also convert my type to the built-in complex. But that would mean that if my type did know about numpy (but numpy didn't know about mine), I wouldn't get to choose what to do in half the cases (if the numpy instance was on the left, it would get the first opportunity and take it).

Perhaps we need to extend the built-in operation processing so that if both sides return NotImplemented, before raising TypeError, we look for some common base type implementing the same operation. The abstract Complex type could provide abstract implementations of the binary operators that would convert their arguments to the concrete complex type and compute the result that way. (Hah! Another use case for abstract methods with a useful implementation! :-)
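Here is a sketch of that last idea -- an abstract Complex whose inherited binary operators fall back to the built-in complex -- done with ordinary inheritance rather than any change to the interpreter's dispatch, so it only kicks in for subclasses that don't define __add__ themselves. All names below are illustrative:

    from abc import ABCMeta, abstractmethod

    class Complex(metaclass=ABCMeta):
        """Sketch: the ABC supplies operators of last resort that do the
        arithmetic in the built-in complex domain."""

        @abstractmethod
        def __complex__(self):
            """Convert to the built-in complex type."""

        def __add__(self, other):
            if isinstance(other, Complex):
                return complex(self) + complex(other)
            return NotImplemented
        __radd__ = __add__      # addition commutes, so the same fallback works

    Complex.register(complex)   # treat the built-in type as a Complex too

    class DecimalComplex(Complex):
        """Made-up Complex implementation that only knows how to convert itself."""
        def __init__(self, re, im):
            self.re, self.im = re, im
        def __complex__(self):
            return complex(self.re, self.im)

    print(DecimalComplex(1, 2) + complex(3, 4))   # (4+6j), via the inherited __add__
    print(complex(3, 4) + DecimalComplex(1, 2))   # (4+6j), via __radd__ on the right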

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


