Variants/case classes/algebraic data types/sums/oh my!
org.openjdk at io7m.com
Sat Jun 11 19:34:49 UTC 2016
On 2016-06-11T20:40:06 +0200 forax at univ-mlv.fr wrote:
> You should take a look at Scala; it's a JVM language, i.e. it compiles
> to bytecode and runs on the JVM. The matching part is done by invoking
> a method unapply() on the constructed pattern [1]. One goal of Scala
> (at least in the early days) was to try to implement most of the
> features of the language as library calls, and the unapply() trick fits
> that model well. The ugly side effect :) of that trick is that you can
> perform side effects while matching.
I'm reasonably familiar with Scala as it was about two years ago. I'd forgotten that matching was implemented like that for patterns. I agree it's ugly; it's not something that's desirable in an implementation of this stuff.
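For readers who haven't seen the trick, here is a minimal sketch (the Even extractor is hypothetical, not from the thread): matching against an extractor compiles down to a call to its unapply() method, which is an ordinary method and can therefore run arbitrary code, including side effects, in the middle of a match.

    // A hypothetical extractor: matching against Even(...) is compiled
    // to a call to Even.unapply, an ordinary method call that may
    // perform side effects during the match.
    object Even {
      def unapply(n: Int): Option[Int] = {
        println(s"matching against $n") // the "ugly" part: I/O inside a match
        if (n % 2 == 0) Some(n / 2) else None
      }
    }

    object Demo extends App {
      42 match {
        case Even(half) => println(s"even; half is $half")
        case _          => println("odd")
      }
    }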
> > Add another subclass to the class being visited, add another case to
> > the visitor type, and now the compiler tells you everywhere in your
> > code where the anonymous visitor instances are now incomplete.
> You cannot change the Visitor interface without breaking backward
> compatibility; basically, with the Visitor of the GoF book, you cannot
> add a new subclass. That's why I've created this formulation: it lets
> anyone create a new subclass, as you said at the expense of losing the
> precise type checking of the compiler. BTW, ADT pattern matching and
> the Visitor design pattern are dual; this was called the "expression
> problem" by Philip Wadler (see [2] and [3] for the full story).
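To make that concrete, here is a rough sketch of the GoF-style visitor in Scala (Shape and its cases are hypothetical names, not from the thread): the visitor interface enumerates the cases, so adding a case is a source-incompatible change to every existing visitor implementation.

    // GoF-style visitor: the set of cases is frozen in the visitor interface.
    trait ShapeVisitor[A] {
      def visitCircle(radius: Double): A
      def visitRect(width: Double, height: Double): A
      // Adding visitTriangle(...) here breaks every existing implementation,
      // which is exactly the backward-compatibility problem described above.
    }

    trait Shape {
      def accept[A](v: ShapeVisitor[A]): A
    }

    final class Circle(radius: Double) extends Shape {
      def accept[A](v: ShapeVisitor[A]): A = v.visitCircle(radius)
    }

    final class Rect(width: Double, height: Double) extends Shape {
      def accept[A](v: ShapeVisitor[A]): A = v.visitRect(width, height)
    }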
I'd argue that that's a feature. For these types, we specifically want code to break when their definition changes in any way: the compiler is telling us "here are all the places in your code that now need to be updated". It's one of the greatest refactoring tools. Code that uses the dynamic approach may retain some form of backwards compatibility, in the sense that it still compiles, but it's now incorrect (it can't handle the new cases that are going to be given to it), and the compiler can't help.
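A sketch of that behaviour using Scala's sealed hierarchies (again with hypothetical type names): once a new case is added to the closed type, the compiler flags every match that no longer covers all cases.

    // A closed sum type: all cases must live in this compilation unit.
    sealed trait Shape
    final case class Circle(radius: Double)              extends Shape
    final case class Rect(width: Double, height: Double) extends Shape
    // Uncommenting this new case makes the match below non-exhaustive:
    // final case class Triangle(a: Double, b: Double, c: Double) extends Shape

    def area(s: Shape): Double = s match {
      case Circle(r)  => math.Pi * r * r
      case Rect(w, h) => w * h
      // With Triangle added, the compiler warns here:
      //   match may not be exhaustive.
      //   It would fail on the following input: Triangle(_, _, _)
    }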
I'm not actually convinced by Wadler. We're getting somewhat off topic, but I think he turned something into a problem that isn't actually a problem, and it stems from the idea that code must keep compiling in the face of changes. For the types I define that are conceptually closed, I want code to stop compiling when I change them, because it means my program is now incorrect. For the types I define that are conceptually open, I obviously want it to be very difficult to make a change that could cause compilation errors or demand recompilation of existing code. They're two opposing requirements, but both have satisfactory solutions, so I don't think there's actually a problem.
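The other half of the trade-off, sketched the same way (Event and Click are hypothetical names): an open hierarchy lets any compilation unit add a case without disturbing existing code, but the compiler can no longer check exhaustiveness, so a catch-all is mandatory.

    // An open hierarchy: any module may add a new Event without breaking
    // this one.
    trait Event
    final case class Click(x: Int, y: Int) extends Event

    def describe(e: Event): String = e match {
      case Click(x, y) => s"click at ($x, $y)"
      case _           => "unknown event" // required: the compiler cannot
                                          // enumerate an open type's cases
    }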
I'm sure we could go on all night!
> > I'm intrigued by this. Is the per-value specialization something
> > that's currently implemented? That looks a bit dependently-typed
> > to me...
> It's not currently implemented, but I see no reason not to try (see my
> talk at the next JVM Summit, if the talk is accepted). For example, I
> think we need something like that if we want to re-implement method
> handle combiners as a kind of specialization.
I was curious about, and slightly terrified by, the idea that this would allow declaring the classic dependently-typed length-indexed lists in Java, but I see that it doesn't quite work at that level (or at least, I assume it doesn't).
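For the curious, a rough Scala sketch of the length-indexed lists alluded to (all names hypothetical): the length is tracked with type-level Peano naturals rather than runtime values, which is precisely why per-value specialization alone wouldn't get you there.

    // Type-level Peano naturals.
    sealed trait Nat
    sealed trait Zero extends Nat
    sealed trait Succ[N <: Nat] extends Nat

    // A vector whose length is part of its type.
    sealed trait Vec[+A, N <: Nat]
    case object VNil extends Vec[Nothing, Zero]
    final case class VCons[A, N <: Nat](h: A, t: Vec[A, N])
      extends Vec[A, Succ[N]]

    // head only accepts provably non-empty vectors: Vec[Nothing, Zero]
    // never conforms to Vec[A, Succ[N]], so head(VNil) is a type error.
    def head[A, N <: Nat](v: Vec[A, Succ[N]]): A =
      v match { case VCons(h, _) => h }

Here head(VCons(1, VNil)) compiles, while head(VNil) is rejected at compile time.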
M