@@ -3,15 +3,16 @@ layout: blog-page
title: Announcing Dotty 0.13.0-RC1 with Spark support, top level definitions and redesigned implicits
author: Aggelos Biboudis
authorImg: /images/aggelos.jpg
- date: 2019-03-04
+ date: 2019-03-05
---

- Hello hello! This is the second release for 2019, let's call it the _Contextual_
- release and you will understand why we are super excited in a bit! ✨🎊🎉
+ Hello hello! This is the second release for 2019. Spark, top level definitions
+ and redesigned implicits ✨🎊🎉 are the most important inclusions in this release
+ and you will understand why we are super excited in a bit!

- Without further ado, today we release the version 0.13.0-RC1 of the Dotty compiler.
- This release serves as a technology preview that demonstrates new language features and the
- compiler supporting them.
+ Without further ado, today we release the version 0.13.0-RC1 of the Dotty
+ compiler. This release serves as a technology preview that demonstrates new
+ language features and the compiler supporting them.

Dotty is the project name for technologies that are being considered for
inclusion in Scala 3. Scala has pioneered the fusion of object-oriented and
@@ -38,8 +39,8 @@ This is our 13th scheduled release according to our
Dotty projects have always been able to [depend on Scala 2
libraries](https://github.com/lampepfl/dotty-example-project#getting-your-project-to-compile-with-dotty),
- and this usually works fine as long as the Dotty code does not call a Scala 2
- macro directly. However, [Spark](http://spark.apache.org/) was known to not work
+ and this usually works fine (as long as the Dotty code does not call a Scala 2
+ macro directly). However, [Spark](http://spark.apache.org/) was known to not work
correctly as it heavily relies on Java serialization which we were not fully
supporting.
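For reference, declaring such a Scala 2 dependency with the sbt-dotty plugin typically looks roughly like the sketch below (the `scala-xml` coordinates are only an illustrative example, not something from this post):

```scala
// build.sbt (sketch): pull in a Scala 2 artifact from a Dotty project.
// withDottyCompat rewrites the cross-version suffix so the Scala 2.x artifact resolves.
libraryDependencies += ("org.scala-lang.modules" %% "scala-xml" % "1.1.0")
  .withDottyCompat(scalaVersion.value)
```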
@@ -55,6 +56,26 @@ Scala 2, and that was enough to make our Spark assignments run correctly! This
doesn't mean that our support is perfect, however, so don't hesitate to [open an
issue](http://github.com/lampepfl/dotty/issues) if something is amiss.

+ ## Introducing top level definitions
+
+ _Top level_ definitions are now supported: all kinds of definitions can now be
+ written at the top level. This means that package objects are now redundant,
+ and will be phased out.
+
+ ```scala
+ package p
+
+ type Labelled[T] = (String, T)
+
+ val a: Labelled[Int] = ("count", 1)
+ def b = a._2
+ ```
+
+ You can read about [dropping package
+ objects](https://dotty.epfl.ch/docs/reference/dropped-features/package-objects.html)
+ in the linked documentation or in the relevant PR
+ [#5754](https://github.com/lampepfl/dotty/pull/5754).
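As a small, hypothetical usage sketch (not part of the post), the definitions above are then consumed like ordinary members of package `p`:

```scala
import p._

val labelled: Labelled[Int] = a   // the top level type alias and val are in scope
val count: Int = b                // so is the top level def; no package object involved
```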
+
## All things impl... implied

Scala's implicits are its most distinguished feature. They are _the_ fundamental
@@ -63,32 +84,39 @@ varied number of use cases, among them: implementing type classes, establishing
context, dependency injection, expressing capabilities, computing new types and
proving relationships between them.

- However, we identify a few consequences that implicits gave rise to, as a
- programming style. Firstly, users used implicit conversions between types, in an
- unprincipled matter. This overuse of implicit conversions decluttered code for
- sure, but it made it harder for people to reason about.
+ However, with great power comes great responsibility. The current design of
+ implicits has shown some limitations, which we have been trying to identify and
+ address to make Scala a clearer and more pleasant language. First of all, we
+ found that the syntactic similarity was too great between implicit _conversions_
+ and implicit _values_ that depend on other implicit values. Both of them appear
+ in the snippet below:

```scala
implicit def i1(implicit x: T): C[T] = ... // 1: conditional implicit value
implicit def i2(x: T): C[T] = ... // 2: implicit conversion
```

+ Some users used implicit conversions in an unprincipled manner. This overuse of
+ implicit conversions certainly removed clutter from code. However, while implicit
+ conversions can be useful to remove clutter, their abuse makes it harder for
+ people to reason about the code.
+
The `implicit` keyword is used for both implicit conversions and conditional
- implicit values and we identify that their semantic differences must be
- communicated more clearly syntactically. Secondly, implicits pose challenges for
- tooling such as error reporting for failed implicit searches. Furthermore, the
- `implicit` keyword is way too overloaded (implicit vals, defs, objects,
- parameters). For instance, a newcomer can easily confuse the two
- examples above while they demonstrate completely different things, a typeclass
- instance is an implicit object or val if unconditional and an implicit def with
- implicit parameters if conditional; arguably all of them are surprisingly
- similar (syntactically). Another consideration is that the `implicit` keyword
- annotates a whole parameter section instead of a single parameter, and passing
- an argument to an implicit parameter looks like a regular application. This is
- problematic because it can create confusion regarding what parameter gets passed
- in a call. Last but not least, sometimes implicit parameters are merely
- propagated in nested function calls and not used at all, so names of implicit
- parameters are not always necessary.
+ implicit values and we identified that their semantic differences must be
+ communicated more clearly syntactically. Furthermore, the `implicit` keyword is
+ ascribed too many overloaded meanings in the language (implicit vals, defs,
+ objects, parameters). For instance, a newcomer can easily confuse the two
+ examples above, although they demonstrate completely different things: a
+ typeclass instance is an implicit object or val if unconditional and an implicit
+ def with implicit parameters if conditional; arguably all of them are
+ surprisingly similar (syntactically). Another consideration is that the
+ `implicit` keyword annotates a whole parameter section instead of a single
+ parameter, and passing an argument to an implicit parameter looks like a regular
+ application. This is problematic because it can create confusion regarding what
+ parameter gets passed in a call. Last but not least, sometimes implicit
+ parameters are merely propagated in nested function calls and not used at all,
+ so giving names to implicit parameters is often redundant and only adds noise to
+ a function signature.
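To make those last points concrete, here is a small, hypothetical Scala 2 style sketch (names invented for illustration) of an implicit parameter that is only propagated, and of an explicit argument that reads like a regular application:

```scala
case class Config(verbose: Boolean)

// `implicit` marks the whole parameter section; `cfg` exists only to be passed along.
def render(x: Int)(implicit cfg: Config): String =
  describe(x)                       // cfg is propagated silently

def describe(x: Int)(implicit cfg: Config): String =
  if (cfg.verbose) s"value: $x" else x.toString

// Supplying the argument explicitly looks just like a regular application:
val rendered = render(42)(Config(verbose = true))
```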

Consequently, we introduce two new language features:
@@ -125,7 +153,7 @@ implied ListOrd[T] given (ord: Ord[T]) for Ord[List[T]] {
}
```

- A `given` clause can also designate an inferable parameter for functions:
+ A `given` clause can also designate an inferable parameter for functions:

```scala
def max[T](x: T, y: T) given (ord: Ord[T]): T =
@@ -155,10 +183,10 @@ instance of `ExecutionContext` is demanded the right-hand side is returned.
implied ctx for ExecutionContext = currentThreadPool().context
```

- For symmetry, we define our well-known `implicitly` from `Predef` in terms of
- `given` and for simplicity we rename it to `the`. Functions like `the` that have
- only _inferable parameters_ are also called _context queries_ from now on.
- Consequently, to summon an implied instance of `Ord[List[Int]]` we write:
+ We have also added `the`, a synonym for `implicitly`, which is often more natural
+ to spell out in user code. Functions like `the` that have only _inferable
+ parameters_ are also called _context queries_ from now on. Consequently, to
+ summon an implied instance of `Ord[List[Int]]` we write:

```scala
the[Ord[List[Int]]]
@@ -180,7 +208,7 @@ object B {
}
```

- You can read more about [implied
+ **You can read more about** [implied
imports](https://dotty.epfl.ch/docs/reference/contextual/import-implied.html)
from the docs or the relevant PR
[#5868](https://github.com/lampepfl/dotty/pull/5868).
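To illustrate the distinction, here is a minimal sketch along the lines of the linked docs (`A`, `B` and `TC` are illustrative names):

```scala
object A {
  class TC
  implied tc for TC          // an implied instance defined inside A
}

object B {
  import A._                 // imports the members of A, but no implied instances
  import implied A._         // imports only the implied instance tc
}
```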
@@ -196,7 +224,7 @@ Context queries--previously named implicit function types (IFTs)--are now also
expressed with `given`, providing types for first-class context queries. This is
merely an alignment of IFTs into the new scheme.
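As a rough, hypothetical sketch of what such a context query type looks like under the new syntax (the names below are invented for illustration):

```scala
case class Context(user: String)

type Contextual[T] = given Context => T        // previously: implicit Context => T

def greeting: Contextual[String] = s"hello, ${the[Context].user}"

implied ctx for Context = Context("admin")

val msg = greeting   // the Context argument is synthesized from the implied instance in scope
```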

- You can read about the alternative to implicits through the *Contextual
+ **You can read more about** the alternative to implicits through the *Contextual
Abstractions* section of our documentation or, for a deep dive, from the relevant
PR chain that originated from
[#5458](https://github.com/lampepfl/dotty/pull/5458). The syntax changes for new
@@ -222,7 +250,7 @@ enum Tree[T] derives Eql, Ordering, Pickling {

where the generated implied instances are the ones below:
```scala
- implied [T: Eq] for Eq[Tree[T]] = Eq.derived
+ implied [T: Eql] for Eql[Tree[T]] = Eql.derived
implied [T: Ordering] for Ordering[Tree[T]] = Ordering.derived
implied [T: Pickling] for Pickling[Tree[T]] = Pickling.derived
```
@@ -248,9 +276,9 @@ it has a definition like this:
def derived[T] given Generic[T] = ...
```
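For a rough idea of how a typeclass opts in, here is a hypothetical sketch (the `Show` typeclass and its body are invented for illustration; a real `derived` would inspect the `Generic` representation described above):

```scala
trait Show[T] {
  def show(x: T): String
}

object Show {
  // Making Show derivable: the companion exposes a `derived` method that
  // builds an instance from the Generic representation of T (body elided here).
  def derived[T] given (g: Generic[T]): Show[T] = ???
}

enum Color derives Show {        // generates: implied for Show[Color] = Show.derived
  case Red, Green, Blue
}
```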

- You can read more about [Typeclass
+ **You can read more about** [Typeclass
Derivation](https://dotty.epfl.ch/docs/reference/contextual/derivation.html) or
- for a deep dive at the relevant PRs:
+ take a deep dive into the relevant PRs:
[#5540](https://github.com/lampepfl/dotty/pull/5540) and
[#5839](https://github.com/lampepfl/dotty/pull/5839).
@@ -263,7 +291,7 @@ provide a derived implicit instance:
implied for Eql[Int, String] = Eql.derived
```

- You can read how we based multiversal equality on typeclass derivation through
+ **You can read more about** how we based multiversal equality on typeclass derivation through
the relevant PR [#5843](https://github.com/lampepfl/dotty/pull/5843).
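As a hypothetical illustration of what such an instance buys you (assuming the `strictEquality` language feature is enabled; the values are invented):

```scala
implied for Eql[Int, String] = Eql.derived

// Under -language:strictEquality this comparison type-checks only because an
// Eql[Int, String] instance is available; it still evaluates to false at runtime.
val check = 1 == "one"
```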

_Implicit conversions_ are now defined by implied instances of the
@@ -275,32 +303,11 @@ implied for Conversion[String, Token] {
}
```
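For a self-contained picture, here is a hypothetical sketch of the full pattern (the `Token` class is invented for illustration):

```scala
case class Token(str: String)

implied for Conversion[String, Token] {
  def apply(str: String): Token = Token(str)
}

val keyword: Token = "if"   // converted through the implied Conversion instance in scope
```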

- _Top level_ definitions are now supported. This means that package objects are
- now redundant, and will be phased out. This means that all kinds of definitions
- can be written at the top level.
-
- ```scala
- package p
-
- type Labelled[T] = (String, T)
- val a: Labelled[Int] = ("count", 1)
- def b = a._2
-
- case class C()
-
- implicit object Cops {
-   def (x: C) pair (y: C) = (x, y)
- }
- ```
-
- You can read about [dropping package
- objects](https://dotty.epfl.ch/docs/reference/dropped-features/package-objects.html)
- at the documentation linked or at the relevant PR
- [#5754](https://github.com/lampepfl/dotty/pull/5754).
-
- **This blogpost offers only a brief summary of the new features, for more details
- please read our documentation page under the new section named [*Contextual
- Abstractions*](https://dotty.epfl.ch/docs/).**
+ **Note:** these release notes contain only a brief summary of the new features;
+ for more details, please read our documentation page under the new section named
+ [*Contextual Abstractions*](https://dotty.epfl.ch/docs/). Just as important as the
+ documentation of each feature is the
+ [Relationship with Scala 2 Implicits](https://dotty.epfl.ch/docs/reference/contextual/relationship-implicits.html) section, so please consult it as well.

## Implicit resolution rule changes