Use a universal abstraction for constrained collection types #39
Conversation
This could be used for different types of constraints to implement types such as `Map`, `BitSet`, `SortedSet`. Computing the result type works similarly to `CanBuild`, but the eligible implicit scopes and the relationship of different implicits are much simpler. In particular, there is never a `From` type and there are only two choices for the `To` type: the desired constrained type (which does not need to be a type constructor) and an unconstrained fallback type constructor. The current implementation shows that inheritance works (with plenty of `@uncheckedVariance` tricks) if the constraint is the same. We add an `Ordering` constraint in `SortedSet` and extend that to `TreeSet`. If the element type has an `Ordering`, poly-transforms produce another `TreeSet` (or `SortedSet`, depending on the original type), otherwise a `Set`. Unfortunately, it does not look like it will be possible to compose multiple constraints, which would be needed for types like `TreeMap` (combining the `Tuple2` constraint of `Map` with the `Ordering` constraint of `Sorted`).
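To make the shape of this concrete, here is a minimal, hypothetical sketch of such a two-candidate result-type computation (the names `Build`, `buildSet`, `buildTreeSet`, and `mapped` are illustrative and not the PR's actual API): the constrained target is chosen when its evidence is available, otherwise the unconstrained fallback applies.

```scala
import scala.collection.immutable.TreeSet

// The "To" type is computed by implicit search with only two candidates:
// the constrained target (TreeSet, requiring an Ordering) and an
// unconstrained fallback (Set). There is no `From` type as in `CanBuild`.
trait Build[A, To] {
  def fromIterable(it: Iterable[A]): To
}

trait LowPriorityBuild {
  // Fallback: any element type can be collected into a plain Set.
  implicit def buildSet[A]: Build[A, Set[A]] = new Build[A, Set[A]] {
    def fromIterable(it: Iterable[A]): Set[A] = it.toSet
  }
}

object Build extends LowPriorityBuild {
  // Preferred: with an Ordering in scope the constrained TreeSet is built.
  implicit def buildTreeSet[A](implicit ord: Ordering[A]): Build[A, TreeSet[A]] =
    new Build[A, TreeSet[A]] {
      def fromIterable(it: Iterable[A]): TreeSet[A] = TreeSet(it.toSeq: _*)
    }
}

object BuildDemo {
  // A poly-transform whose result type depends only on the new element type.
  def mapped[A, B, To](xs: Iterable[A])(f: A => B)(implicit b: Build[B, To]): To =
    b.fromIterable(xs.map(f))

  final case class Person(name: String) // no Ordering[Person] defined

  val sorted = mapped(List(3, 1, 2))(_ * 2)                 // TreeSet(2, 4, 6)
  val plain  = mapped(List(1, 2))(i => Person(i.toString))  // falls back to Set[Person]
}
```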
scalaVersion in ThisBuild := "2.12.2-ebe1180-SNAPSHOT" // "2.12.1"
resolvers in ThisBuild += "scala-pr" at "https://scala-ci.typesafe.com/artifactory/scala-pr-validation-snapshots"
Scala build from scala/scala#5742
Adding an upper bound to element types allows `FromIterable` and `IterableFactory` to be used for building constrained collection types. In cases where the constraint is only a type proof (e.g. `BitSet`, `String`, `Map`) they are on equal footing with unconstrained collections. As a proof of concept, I added a `BitSet` class. If you have an `Iterable[Int]` you can now call `.to(BitSet)` with no overloading or runtime overhead (like passing an additional implicit parameter). Sort-of-problem: We need pretend unary type constructors for all collection types, so instead of a `BitSet` you really get a `BitSet with Iterable[T]`. This still doesn’t help with the actual implementation of constrained collection types (only with their companion objects). Next step: Implicit lookup of companion objects and extending it to constraints with a runtime component.
- Requires special syntax for `to` when building collections with a constraint value.
- No fallback to unconstrained parent type in poly-transforms.
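As a rough illustration of such a bounded factory (hypothetical names; the strawman's actual `FromIterable`/`IterableFactory` differ in the details, and the extension method is called `buildTo` here only to avoid clashing with the existing `to` in 2.12):

```scala
import scala.collection.immutable.BitSet

object BoundedFactoryDemo {
  // A factory with an upper bound on the element types it can build from.
  // For unconstrained collections the bound would simply be Any.
  trait BoundedFromIterable[+C, Bound] {
    def fromIterable[E <: Bound](it: Iterable[E]): C
  }

  object BitSetFactory extends BoundedFromIterable[BitSet, Int] {
    def fromIterable[E <: Int](it: Iterable[E]): BitSet = BitSet(it.toSeq: _*)
  }

  // The conversion accepts any factory whose bound the elements satisfy,
  // without overloading and without an extra implicit parameter.
  implicit class ToOps[A](it: Iterable[A]) {
    def buildTo[C, B >: A](factory: BoundedFromIterable[C, B]): C =
      factory.fromIterable(it)
  }

  val bs: BitSet = List(1, 2, 3).buildTo(BitSetFactory)
  // List("a").buildTo(BitSetFactory)  // does not compile: the Int bound is not met
}
```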
I'm done with my experiments here, so I'll summarize what I learned in order to make an informed decision on how we want to integrate constrained collection types. Rather than just unconstrained and constrained, we can make further distinctions which may prove useful: …
In the first commit I implement a …

One downside of this design is that constraints do not compose. My attempts to solve this problem inevitably led back to the full …

Another downside is the runtime overhead, which should be the same as for …

The final state of this PR after commit 3 shows the alternative: Pass implicit evidence values directly to the overloaded methods in …

The big problem of this version is that there is no fallback. Assuming an implementation that doesn't rely on generalized type constraints, …

This problem is independent of having a generic abstraction for constraints. All implementations of constrained collections that require an implicit evidence will run into this issue, so the biggest question is, are we OK with this limitation or do we need to go back to a …

For collections that only have type constraints the actual collection type needs to implement specific overloads for the type constraints. We cannot have generic supertraits because methods with a generic constraint would have the same erasure as the unconstrained overloads. We can provide such an abstraction for the collection factories though, which is what I did in the 2nd commit. This requires an additional upper bound type parameter in …

To summarize my recommendations and open questions: …
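A small sketch of the "specific overload in the concrete collection type" pattern described above (the names `MyIterableOps` and `MyBits` are made up for illustration; the strawman's real traits look different). With this encoding the inherited unconstrained `map` remains reachable as an overload when the element type does not satisfy the constraint:

```scala
import scala.collection.immutable.BitSet

// The unconstrained poly-transform lives in a generic parent trait.
trait MyIterableOps[A] {
  def elems: Iterable[A]
  def map[B](f: A => B): Set[B] = elems.map(f).toSet
}

// The constrained variant is a specific overload added in the concrete type.
// It cannot be pulled up into a generic supertrait (e.g. as
// `def map[B <: Bound](f: A => B): CC`) because after erasure such a method
// would have the same signature as the inherited unconstrained `map`.
final class MyBits(val bits: BitSet) extends MyIterableOps[Int] {
  def elems: Iterable[Int] = bits
  def map(f: Int => Int): MyBits = new MyBits(BitSet(bits.toSeq.map(f): _*))
}

object MyBitsDemo {
  val b = new MyBits(BitSet(1, 2, 3))
  val stillBits: MyBits    = b.map(_ + 1)       // specific overload is preferred
  val widened: Set[String] = b.map(_.toString)  // generic version applies otherwise
}
```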
I am not only okay with operations on …

However, being unable to map to your target at all is terribly inconvenient, so we should always provide a method that will upcast the type to something that has lost the constraint.

Regarding the …

I guess with mutable collections we deal as usual with element constraints by making collections with constrained element types invariant in the parameter with the constraint. For other constraints, since the in-place stuff will automatically not change the collection type and thus previous evidence applies (e.g. an Ordering), and the not-in-place stuff will follow the same logic as here, I think there's nothing extra.

So I vote for the final strategy as the best alternative. I would like to hear from @julienrf as well, however.
I agree that issuing type errors when constraints are lost is probably desirable. At least it's worthwhile checking out in depth.
Isn't that just a widening? I.e. …
If it's just that, I am OK with it!
Yes, just a widening.
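For illustration, roughly what the evidence-passing variant and the explicit widening look like (hypothetical `ConstrainedSet` and `widen` names, not the strawman's API):

```scala
import scala.collection.immutable.TreeSet

// The constrained poly-transform takes its evidence directly, so there is no
// silent fallback: mapping to an element type without an Ordering is a type error.
final class ConstrainedSet[A](val elems: TreeSet[A]) {
  def map[B](f: A => B)(implicit ord: Ordering[B]): ConstrainedSet[B] =
    new ConstrainedSet(TreeSet(elems.toSeq.map(f): _*))

  // Explicit widening: drop the constraint and recover a plain Set.
  def widen: Set[A] = elems
}

object WideningDemo {
  final case class Person(name: String) // no Ordering[Person] in scope

  val cs = new ConstrainedSet(TreeSet(1, 2, 3))
  val ok = cs.map(_ + 1)                                // fine, Ordering[Int] exists
  // cs.map(i => Person(i.toString))                    // type error: no fallback
  val viaWiden = cs.widen.map(i => Person(i.toString))  // widen first, then map
}
```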
Definitely.
I don’t think so.
I’m not sure I want all the collection factories to extend this factory. I think it will be marginally useful (e.g. for …
What's the difference? An unconstrained …
The difference is only visible in tools (autocompletion, scaladoc). But if you all think this is not a problem I’m happy to go with this solution.
It looks to me like the tooling should be improved for this case rather than the API. Even for a direct definition such as `override def empty[E <: Any]: List[E] = ???`, both IntelliJ and scaladoc show it without an explicit type bound, even though one was provided in the definition. I can't think of any reason why an inherited method should be treated differently.
Still, if you “go to definition” you will see the actual source code that does have this useless upper bound. But, as I previously said, this is just a minor inconvenience in my opinion.
New implementation of these ideas in #45