danieldietrich opened this issue 6 years ago
At the root of the type hierarchy there was Value, which had all the conversion methods (not only for collections). Starting with Vavr 1.0, each module instead has intra-module conversion methods. This fixes issues like:
// = Some(1)
List(1, 2, 3).toOption()
Having one conversion method for each collection does not scale very well. What if user-defined collections are added? The solution to this problem is a generic conversion method:
public interface Iterable<T> extends java.lang.Iterable<T> {
@Override
Iterator<T> iterator();
default <U, C extends Iterable<U>> C to(Function<Iterable<T>, C> fromIterable) {
return fromIterable.apply(this);
}
}
Before:
After:
with
interface Set<T> extends Iterable<T> {
static <T> Set<T> ofAll(Iterable<? extends T> iterable) { ... }
}
interface Map<K, V> extends Iterable<V> {
static <T, K, V> Function<Iterable<T>, Map<K, V>> ofAll(Function<T, K> keyMapper, Function<T, V> valueMapper) { ... }
}
Types...
void test(List<String> list) {
Iterable<CharSequence> ccc1 = list.to(Set::ofAll);
Iterable<String> ccc2 = list.to(Set::ofAll);
List<CharSequence> ccc3 = list.to(Set::ofAll);
List<String> ccc4 = list.to(Set::ofAll);
// = Map.ofAll(s -> s, String::length).apply(list);
Map<String, Integer> map = list.to(Map.ofAll(s -> s, String::length));
}
Bite the bullet, go for simulated higher-kinded types and unite *.transform(...) with Iterable.to(...)? Because they are basically the same, except for the lack of HKTs, right?
@nfekete I know you don't like a generic to(...) conversion method. But explicitly enumerating all possible conversion methods has drawbacks and does not scale well. The solution described above does not simulate HKTs. It is just a compositional approach using functions. It aligns with the Scala 2.13 collection rework; Vavr's .ofAll factory method is called fromIterable in Scala. If only we could express on the type level that a type Class<C> has a specific static method...
I want to separate the frequently used collections (Sets, Maps, Seqs) and the more esoteric and less used collections (BitSet, PriorityQueue, MultiMap, MultiSet) into different Jigsaw modules. However, we still want to be able to convert collections across Jigsaw modules. That will only be possible with a generic conversion method.
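For illustration, a minimal sketch of what such a cross-module conversion could look like (the module split and the PriorityQueue.ofAll(Iterable) factory used here are assumptions, not final API):
// core module
List<Integer> list = List.of(3, 1, 2);
// 'extended collections' module; the generic to(...) on Iterable is the only coupling between the modules
PriorityQueue<Integer> queue = list.to(PriorityQueue::ofAll);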
(Scala's collections do not have a transform method; Vavr 1.0 also won't have it. Btw, Vavr's {Option, Either, Try}.transform(...) is aligned to Scala in Vavr 1.0.)
We need the split and the simplifications because Vavr has suffered from the second-system effect over the last 4 years...
I'm not for enumerating all the conversion methods at all. I think a single transform(...) would be enough, if only we could express it in Java without simulated HKTs and other workarounds. @jbgi's work on derive4j/hkt seems very interesting. There is also KindedJ, which seems to be adopted by Arrow for Kotlin and AOL's Cyclops. But I also recognize your aim to simplify things as much as possible and to stay away from external dependencies.
Yeah, I know. I was involved when the KindedJ GitHub org was founded. (In fact, I'm an admin of that org.)
However, that's not the right direction for Vavr. HKTs are needed by the projects you mentioned in order to lift functions to a level that operates on Monads. HKTs are a dirty workaround in Java and have limitations (e.g. only one inheritance level is configurable). Such algebraic abstractions are completely out of scope for Vavr. We strive for a dead-simple (but powerful) library for the 'Average Java Joe' that hides away explicit abstractions like Functor, Monad, Monoid, Applicative, Lens, etc.
I'm a little curious, though, about to versus transform. Am I missing something, or could every call to to be replaced by a call to transform?
If that's the case, maybe Vavr could have only the latter, to reduce the API surface area?
@emmanueltouzery to is shorter and also present in Scala. transform will disappear. We align to Scala in the first place (and when in doubt). We could introduce the 0.9.2 methods for backcompat and deprecate them, but I would prefer to remove them.
@emmanueltouzery you (and @nfekete) are right! We can widen the codomain of to:
default <C> C to(Function<Iterable<T>, C> fromIterable) {
return fromIterable.apply(this);
}
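With the wider signature the result is no longer forced to be an Iterable, so to(...) can also produce scalar results. A small sketch (my ad-hoc example, not final API):
List<String> list = List.of("a", "b", "c");
String joined = list.to(it -> String.join(", ", it)); // "a, b, c"
Set<String> set = list.to(Set::ofAll); // still works as before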
But they are not the same. Vector:
<U> U transform(Function<? super Vector<T>, ? extends U> f)
Map:
<U> U transform(Function<? super Map<K, V>, ? extends U> f)
Iterable:
<C> C to(Function<Iterable<T>, C> fromIterable)
There is value in the specificity of the transform method as opposed to the genericity of the to method. transform takes a function whose input is the concrete type on which transform is defined. Hence, you can efficiently transform a Vector by directly indexing one of the values in it, or by directly taking a value out of a Map. You can't do that with the current version of to because it's too generic: it takes a function whose input is an Iterable of values. You can't take advantage of the concrete type when doing the transformation.
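For example (a sketch that assumes the 0.9-style Vector.transform signature quoted above):
Vector<Integer> vector = Vector.of(10, 20, 30);
// transform sees the concrete Vector, so indexed access is available
Integer first = vector.transform(v -> v.get(0));
// to only sees an Iterable<Integer>, so there is no random access
// vector.to(it -> it.get(0)); // does not compile
Integer head = vector.to(it -> it.iterator().next());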
The two methods could be united, but only if you could somehow declare at the top level (at Iterable) that you're going to take a function whose input is the current self type. But you cannot express this without HKTs, I think.
<U> U transform(Function<? super *SELF*, ? extends U> f)
Because we can't express the above, I think it's better to keep both: transform methods at the concrete implementation level, to on Iterable.
@nfekete That's right. A transformation is just a function. When removing transform we lose the fluent API.
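A quick illustration of the difference (my own sketch, using the 0.9-style Vector.transform from above):
Function<Vector<Integer>, Integer> sumOfEnds = v -> v.get(0) + v.get(v.length() - 1);
int a = Vector.of(1, 2, 3).transform(sumOfEnds); // fluent, reads left to right
int b = sumOfEnds.apply(Vector.of(1, 2, 3));     // same result, but inside-out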
As a first step I will take a look at the existing Scala collections API; it is pretty mature. Afterwards I will think about what to do additionally...
Just a data point: I like the fluency of transform and would regret seeing it go. On the other hand, I can see why just removing the to* methods without adding a to() would leave people confused about how to port these calls.
In the end, if it were me: in today's Vavr the to* calls and the transform calls serve different purposes, and I would keep both options.
@nfekete, @emmanueltouzery I also think both, to() and transform(), are helpful. We will implement both in Vavr 1.0.
Also I've thought much about modularity. The first module was a test-balloon. I care about you and existing code-bases. The more I think about modularization and the benefits of the current Vavr 0.9.x monolith, the more I think I need to keep the main functionality of Vavr in one module.
I will prepare a suggestion that preserves most of the existing functionality.
We will have the following implementations:
All other collections will go to an 'extended collections' module
We need to take care of casting Comparators of sorted collections to (Comparator & Serializable).
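A minimal sketch of the cast in question (the TreeSet.of(Comparator, values...) factory shown here is the 0.9-style one and may change):
// the intersection cast makes the method reference itself Serializable,
// so a sorted collection that stores the comparator stays serializable
Comparator<String> cmp = (Comparator<String> & Serializable) String::compareToIgnoreCase;
TreeSet<String> set = TreeSet.of(cmp, "b", "A");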
Hi, I haven't read the entire conversation in detail, but I've noticed the discussion about the generalization of the transform method from
<R> R transform(Function<? super Vector<T>, ? extends R> f)
<R> R transform(Function<? super Map<K, V>, ? extends R> f)
into
<R> R transform(Function<? super *SELF*, ? extends R> f)
@nfekete said that "you cannot express this without HKT's I think", and it's true that you can't currently express a single method like that in Java.
However, I've come up with a workaround for this. It doesn't use higher-kinded types, and instead uses a helper object with a covariant return type (two method calls):
interface Value<T> {
Transformer<? extends Value<T>> transformed();
}
interface Traversable<T> extends Value<T> {
@Override
Transformer<? extends Traversable<T>> transformed();
}
where:
@FunctionalInterface
interface Transformer<T> {
<R> R by(Function<? super T, ? extends R> f);
}
Usage:
String string = value.transformed().by(Object::toString);
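(A sketch of my own, not from the blog post: a hypothetical concrete type can narrow the covariant return so that by(...) receives the concrete self type, which is what transform gave us before.)
final class MyVector<T> implements Traversable<T> {
    @Override
    public Transformer<MyVector<T>> transformed() {
        // by(...) is a generic method, so an anonymous class is used instead of a lambda
        return new Transformer<MyVector<T>>() {
            @Override
            public <R> R by(Function<? super MyVector<T>, ? extends R> f) {
                return f.apply(MyVector.this);
            }
        };
    }
}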
If you find it interesting, see my blog post for more details: Transformer Pattern.
Hope it helps! :smiley:
Thank you @tlinkowski, that is really helpful. I tweeted your lovely blog post link: https://twitter.com/danieldietrich/status/1083362477147127810
I will experiment with it!