From 8372014d4c581c0622c4c47d356114ddf0a659de Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Tue, 18 Oct 2022 10:51:03 +0200 Subject: [PATCH 01/12] update reference docs and replacements for 3.2.0 --- .../changed-features/pattern-bindings.md | 10 +- .../changed-features/pattern-matching.md | 27 +- .../reference/contextual/context-bounds.md | 11 +- .../reference/contextual/derivation-macro.md | 25 +- docs/_docs/reference/contextual/derivation.md | 42 +- .../dropped-features/nonlocal-returns.md | 2 +- .../dropped-features/this-qualifier.md | 2 - docs/_docs/reference/enums/enums-index.md | 2 +- .../reference/experimental/explicit-nulls.md | 2 +- .../reference/experimental/fewer-braces.md | 78 +--- .../experimental/numeric-literals.md | 2 +- docs/_docs/reference/experimental/overview.md | 1 - .../language-versions/binary-compatibility.md | 25 +- .../language-versions/language-versions.md | 1 + .../language-versions/source-compatibility.md | 19 +- .../metaprogramming/compiletime-ops.md | 8 +- docs/_docs/reference/new-types/new-types.md | 2 +- .../other-new-features/experimental-defs.md | 8 +- .../reference/other-new-features/export.md | 52 ++- .../other-new-features/indentation.md | 84 ++-- .../other-new-features/targetName.md | 2 +- docs/_docs/reference/overview.md | 4 +- docs/_docs/reference/syntax.md | 428 +++++++++--------- .../referenceReplacements/sidebar.yml | 1 + 24 files changed, 430 insertions(+), 408 deletions(-) diff --git a/docs/_docs/reference/changed-features/pattern-bindings.md b/docs/_docs/reference/changed-features/pattern-bindings.md index 2c8d1c10ceae..b7b7432e7817 100644 --- a/docs/_docs/reference/changed-features/pattern-bindings.md +++ b/docs/_docs/reference/changed-features/pattern-bindings.md @@ -7,7 +7,7 @@ movedTo: https://docs.scala-lang.org/scala3/reference/changed-features/pattern-b In Scala 2, pattern bindings in `val` definitions and `for` expressions are loosely typed. Potentially failing matches are still accepted at compile-time, but may influence the program's runtime behavior. -From Scala 3.1 on, type checking rules will be tightened so that warnings are reported at compile-time instead. +From Scala 3.2 on, type checking rules will be tightened so that warnings are reported at compile-time instead. ## Bindings in Pattern Definitions @@ -16,7 +16,7 @@ val xs: List[Any] = List(1, 2, 3) val (x: String) :: _ = xs // error: pattern's type String is more specialized // than the right-hand side expression's type Any ``` -This code gives a compile-time warning in Scala 3.1 (and also in Scala 3.0 under the `-source future` setting) whereas it will fail at runtime with a `ClassCastException` in Scala 2. In Scala 3.1, a pattern binding is only allowed if the pattern is _irrefutable_, that is, if the right-hand side's type conforms to the pattern's type. For instance, the following is OK: +This code gives a compile-time warning in Scala 3.2 (and also earlier Scala 3.x under the `-source future` setting) whereas it will fail at runtime with a `ClassCastException` in Scala 2. In Scala 3.2, a pattern binding is only allowed if the pattern is _irrefutable_, that is, if the right-hand side's type conforms to the pattern's type. For instance, the following is OK: ```scala val pair = (1, true) val (x, y) = pair @@ -25,7 +25,7 @@ Sometimes one wants to decompose data anyway, even though the pattern is refutab ```scala val first :: rest = elems // error ``` -This works in Scala 2. In fact it is a typical use case for Scala 2's rules. But in Scala 3.1 it will give a warning. 
One can avoid the warning by marking the right-hand side with an [`@unchecked`](https://scala-lang.org/api/3.x/scala/unchecked.html) annotation: +This works in Scala 2. In fact it is a typical use case for Scala 2's rules. But in Scala 3.2 it will give a warning. One can avoid the warning by marking the right-hand side with an [`@unchecked`](https://scala-lang.org/api/3.x/scala/unchecked.html) annotation: ```scala val first :: rest = elems: @unchecked // OK ``` @@ -40,7 +40,7 @@ val elems: List[Any] = List((1, 2), "hello", (3, 4)) for (x, y) <- elems yield (y, x) // error: pattern's type (Any, Any) is more specialized // than the right-hand side expression's type Any ``` -This code gives a compile-time warning in Scala 3.1 whereas in Scala 2 the list `elems` +This code gives a compile-time warning in Scala 3.2 whereas in Scala 2 the list `elems` is filtered to retain only the elements of tuple type that match the pattern `(x, y)`. The filtering functionality can be obtained in Scala 3 by prefixing the pattern with `case`: ```scala @@ -56,4 +56,4 @@ Generator ::= [‘case’] Pattern1 ‘<-’ Expr ## Migration -The new syntax is supported in Scala 3.0. However, to enable smooth cross compilation between Scala 2 and Scala 3, the changed behavior and additional type checks are only enabled under the `-source future` setting. They will be enabled by default in version 3.1 of the language. +The new syntax is supported in Scala 3.0. However, to enable smooth cross compilation between Scala 2 and Scala 3, the changed behavior and additional type checks are only enabled under the `-source future` setting. They will be enabled by default in version 3.2 of the language. diff --git a/docs/_docs/reference/changed-features/pattern-matching.md b/docs/_docs/reference/changed-features/pattern-matching.md index b4660f893141..a067d19f8ccd 100644 --- a/docs/_docs/reference/changed-features/pattern-matching.md +++ b/docs/_docs/reference/changed-features/pattern-matching.md @@ -4,7 +4,7 @@ title: "Option-less pattern matching" movedTo: https://docs.scala-lang.org/scala3/reference/changed-features/pattern-matching.html --- -The implementation of pattern matching in Scala 3 was greatly simplified compared to Scala 2. From a user perspective, this means that Scala 3 generated patterns are a *lot* easier to debug, as variables all show up in debug modes and positions are correctly preserved. +The implementation of pattern matching in Scala 3 was greatly simplified compared to Scala 2. From a user perspective, this means that Scala 3 generated patterns are a _lot_ easier to debug, as variables all show up in debug modes and positions are correctly preserved. Scala 3 supports a superset of Scala 2 [extractors](https://www.scala-lang.org/files/archive/spec/2.13/08-pattern-matching.html#extractor-patterns). @@ -12,7 +12,7 @@ Scala 3 supports a superset of Scala 2 [extractors](https://www.scala-lang.org/f Extractors are objects that expose a method `unapply` or `unapplySeq`: -```Scala +```scala def unapply[A](x: T)(implicit x: B): U def unapplySeq[A](x: T)(implicit x: B): U ``` @@ -25,7 +25,7 @@ called variadic extractors, which enables variadic patterns. 
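For a quick illustration of the difference, here is a minimal sketch with one extractor of each kind; the `Email` and `Words` objects below are hypothetical examples, not taken from this page:

```scala
// Hypothetical extractors, for illustration only.
object Email:                 // fixed-arity: matches exactly two sub-patterns
  def unapply(s: String): Option[(String, String)] =
    s.split("@") match
      case Array(user, host) => Some((user, host))
      case _                 => None

object Words:                 // variadic: matches any number of sub-patterns
  def unapplySeq(s: String): Option[Seq[String]] =
    Some(s.split(" ").toSeq)

@main def extractorDemo(): Unit =
  "user@host" match
    case Email(user, host) => println(s"$user at $host")
    case _                 => println("not an email")
  "one two three" match
    case Words(first, rest*) => println(s"$first, then ${rest.size} more")
    case _                   => ()
```
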
Fixed-arity extractors expose the following signature: -```Scala +```scala def unapply[A](x: T)(implicit x: B): U ``` @@ -36,7 +36,7 @@ The type `U` conforms to one of the following matches: Or `U` conforms to the type `R`: -```Scala +```scala type R = { def isEmpty: Boolean def get: S @@ -62,7 +62,7 @@ A usage of a fixed-arity extractor is irrefutable if one of the following condit Variadic extractors expose the following signature: -```Scala +```scala def unapplySeq[A](x: T)(implicit x: B): U ``` @@ -73,7 +73,7 @@ The type `U` conforms to one of the following matches: Or `U` conforms to the type `R`: -```Scala +```scala type R = { def isEmpty: Boolean def get: S @@ -167,7 +167,7 @@ object Nat: - `N > 1` is the maximum number of consecutive (parameterless `def` or `val`) `_1: P1 ... _N: PN` members in `U` - Pattern-matching on exactly `N` patterns with types `P1, P2, ..., PN` -```Scala +```scala object ProdEmpty: def _1: Int = ??? def _2: String = ??? @@ -180,12 +180,11 @@ object ProdEmpty: case _ => () ``` - ## Sequence Match - `U <: X`, `T2` and `T3` conform to `T1` -```Scala +```scala type X = { def lengthCompare(len: Int): Int // or, `def length: Int` def apply(i: Int): T1 @@ -221,18 +220,18 @@ object CharList: the type of the remaining patterns are determined as in Seq Pattern. ```Scala -class Foo(val name: String, val children: Int *) +class Foo(val name: String, val children: Int*) object Foo: def unapplySeq(f: Foo): Option[(String, Seq[Int])] = Some((f.name, f.children)) def foo(f: Foo) = f match - case Foo(name, ns : _*) => - case Foo(name, x, y, ns : _*) => + case Foo(name, x, y, ns*) => ">= two children." + case Foo(name, ns*) => => "< two children." ``` -There are plans for further simplification, in particular to factor out *product -match* and *name-based match* into a single type of extractor. +There are plans for further simplification, in particular to factor out _product match_ +and _name-based match_ into a single type of extractor. ## Type testing diff --git a/docs/_docs/reference/contextual/context-bounds.md b/docs/_docs/reference/contextual/context-bounds.md index e336f00cc463..e0be7bfd31a6 100644 --- a/docs/_docs/reference/contextual/context-bounds.md +++ b/docs/_docs/reference/contextual/context-bounds.md @@ -10,7 +10,14 @@ A context bound is a shorthand for expressing the common pattern of a context pa def maximum[T: Ord](xs: List[T]): T = xs.reduceLeft(max) ``` -A bound like `: Ord` on a type parameter `T` of a method or class indicates a context parameter `using Ord[T]`. The context parameter(s) generated from context bounds come last in the definition of the containing method or class. For instance, +A bound like `: Ord` on a type parameter `T` of a method or class indicates a context parameter `using Ord[T]`. The context parameter(s) generated from context bounds +are added as follows: + + - If the method parameters end in an implicit parameter list or using clause, + context parameters are added in front of that list. + - Otherwise they are added as a separate parameter clause at the end. + +Example: ```scala def f[T: C1 : C2, U: C3](x: T)(using y: U, z: V): R @@ -19,7 +26,7 @@ def f[T: C1 : C2, U: C3](x: T)(using y: U, z: V): R would expand to ```scala -def f[T, U](x: T)(using y: U, z: V)(using C1[T], C2[T], C3[U]): R +def f[T, U](x: T)(using _: C1[T], _: C2[T], _: C3[U], y: U, z: V): R ``` Context bounds can be combined with subtype bounds. If both are present, subtype bounds come first, e.g. 
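A minimal sketch of the combination (the `Show` type class and the method `describe` are illustrative, not taken from the file): the subtype bound is written first, followed by the context bound.

```scala
trait Show[T]:
  def show(t: T): String

given Show[String] with
  def show(t: String): String = t

// The subtype bound `<: AnyRef` comes first, then the context bound `: Show`.
def describe[T <: AnyRef : Show](x: T): String =
  summon[Show[T]].show(x)

@main def contextBoundDemo(): Unit =
  println(describe("hello"))   // prints: hello
```
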
diff --git a/docs/_docs/reference/contextual/derivation-macro.md b/docs/_docs/reference/contextual/derivation-macro.md index 5ff0007268dd..060d04424132 100644 --- a/docs/_docs/reference/contextual/derivation-macro.md +++ b/docs/_docs/reference/contextual/derivation-macro.md @@ -31,12 +31,12 @@ given derived[T: Type](using Quotes): Expr[Eq[T]] and for comparison reasons we give the same signature we had with `inline`: ```scala -inline given derived[T]: (m: Mirror.Of[T]) => Eq[T] = ??? +inline given derived[T](using Mirror.Of[T]): Eq[T] = ??? ``` Note, that since a type is used in a subsequent stage it will need to be lifted -to a `Type` by using the corresponding context bound. Also, not that we can -summon the quoted `Mirror` inside the body of the `derived` this we can omit it +to a `Type` by using the corresponding context bound. Also, note that we can +summon the quoted `Mirror` inside the body of the `derived` thus we can omit it from the signature. The body of the `derived` method is shown below: @@ -49,15 +49,16 @@ given derived[T: Type](using Quotes): Expr[Eq[T]] = ev match case '{ $m: Mirror.ProductOf[T] { type MirroredElemTypes = elementTypes }} => val elemInstances = summonAll[elementTypes] - val eqProductBody: (Expr[T], Expr[T]) => Expr[Boolean] = (x, y) => - elemInstances.zipWithIndex.foldLeft(Expr(true: Boolean)) { - case (acc, (elem, index)) => - val e1 = '{$x.asInstanceOf[Product].productElement(${Expr(index)})} - val e2 = '{$y.asInstanceOf[Product].productElement(${Expr(index)})} - '{ $acc && $elem.asInstanceOf[Eq[Any]].eqv($e1, $e2) } - } - - '{ eqProduct((x: T, y: T) => ${eqProductBody('x, 'y)}) } + def eqProductBody(x: Expr[Product], y: Expr[Product])(using Quotes): Expr[Boolean] = { + elemInstances.zipWithIndex.foldLeft(Expr(true)) { + case (acc, ('{ $elem: Eq[t] }, index)) => + val indexExpr = Expr(index) + val e1 = '{ $x.productElement($indexExpr).asInstanceOf[t] } + val e2 = '{ $y.productElement($indexExpr).asInstanceOf[t] } + '{ $acc && $elem.eqv($e1, $e2) } + } + } + '{ eqProduct((x: T, y: T) => ${eqProductBody('x.asExprOf[Product], 'y.asExprOf[Product])}) } // case for Mirror.ProductOf[T] // ... diff --git a/docs/_docs/reference/contextual/derivation.md b/docs/_docs/reference/contextual/derivation.md index 972ac945a22d..87ae8a3a9a7e 100644 --- a/docs/_docs/reference/contextual/derivation.md +++ b/docs/_docs/reference/contextual/derivation.md @@ -19,9 +19,9 @@ The `derives` clause generates the following given instances for the `Eq`, `Orde companion object of `Tree`, ```scala -given [T: Eq] : Eq[Tree[T]] = Eq.derived -given [T: Ordering] : Ordering[Tree] = Ordering.derived -given [T: Show] : Show[Tree] = Show.derived +given [T: Eq] : Eq[Tree[T]] = Eq.derived +given [T: Ordering] : Ordering[Tree[T]] = Ordering.derived +given [T: Show] : Show[Tree[T]] = Show.derived ``` We say that `Tree` is the _deriving type_ and that the `Eq`, `Ordering` and `Show` instances are _derived instances_. @@ -29,17 +29,28 @@ We say that `Tree` is the _deriving type_ and that the `Eq`, `Ordering` and `Sho ### Types supporting `derives` clauses All data types can have a `derives` clause. This document focuses primarily on data types which also have a given instance -of the `Mirror` type class available. Instances of the `Mirror` type class are generated automatically by the compiler -for, - -+ enums and enum cases -+ case classes and case objects -+ sealed classes or traits that have only case classes and case objects as children +of the `Mirror` type class available. 
`Mirror` type class instances provide information at the type level about the components and labelling of the type. They also provide minimal term level infrastructure to allow higher level libraries to provide comprehensive derivation support. +Instances of the `Mirror` type class are generated automatically by the compiler +unconditionally for: +- enums and enum cases, +- case objects. + +Instances for `Mirror` are also generated conditionally for: +- case classes where the constructor is visible at the callsite (always true if the companion is not a case object) +- sealed classes and sealed traits where: + - there exists at least one child case, + - each child case is reachable from the parent's definition, + - if the sealed trait/class has no companion, then each child case is reachable from the callsite through the prefix of the type being mirrored, + - and where the compiler can generate a `Mirror` type class instance for each child case. + + +The `Mirror` type class definition is as follows: + ```scala sealed trait Mirror: @@ -119,10 +130,21 @@ new Mirror.Product: new Leaf(...) ``` +If a Mirror cannot be generated automatically for a given type, an error will appear explaining why it is neither a supported +sum type nor a product type. For example, if `A` is a trait that is not sealed, + +``` +No given instance of type deriving.Mirror.Of[A] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[A]: + * trait A is not a generic product because it is not a case class + * trait A is not a generic sum because it is not a sealed trait +``` + + Note the following properties of `Mirror` types, + Properties are encoded using types rather than terms. This means that they have no runtime footprint unless used and also that they are a compile time feature for use with Scala 3's metaprogramming facilities. ++ There is no restriction against the mirrored type being a local or inner class. + The kinds of `MirroredType` and `MirroredElemTypes` match the kind of the data type the mirror is an instance for. This allows `Mirror`s to support ADTs of all kinds. + There is no distinct representation type for sums or products (ie. there is no `HList` or `Coproduct` type as in @@ -145,7 +167,7 @@ following form, ```scala import scala.deriving.Mirror -def derived[T](using Mirror.Of[T]): TC[T] = ... +inline def derived[T](using Mirror.Of[T]): TC[T] = ... ``` That is, the `derived` method takes a context parameter of (some subtype of) type `Mirror` which defines the shape of diff --git a/docs/_docs/reference/dropped-features/nonlocal-returns.md b/docs/_docs/reference/dropped-features/nonlocal-returns.md index f7a78637c848..b7c707ba904b 100644 --- a/docs/_docs/reference/dropped-features/nonlocal-returns.md +++ b/docs/_docs/reference/dropped-features/nonlocal-returns.md @@ -5,7 +5,7 @@ title: "Deprecated: Nonlocal Returns" movedTo: https://docs.scala-lang.org/scala3/reference/dropped-features/nonlocal-returns.html --- -Returning from nested anonymous functions has been deprecated. +Returning from nested anonymous functions has been deprecated, and will produce a warning from version `3.2`. Nonlocal returns are implemented by throwing and catching `scala.runtime.NonLocalReturnException`-s. This is rarely what is intended by the programmer. It can be problematic because of the hidden performance cost of throwing and catching exceptions. 
Furthermore, it is a leaky implementation: a catch-all exception handler can intercept a `NonLocalReturnException`. diff --git a/docs/_docs/reference/dropped-features/this-qualifier.md b/docs/_docs/reference/dropped-features/this-qualifier.md index e1814e1d194e..d7de1b051da1 100644 --- a/docs/_docs/reference/dropped-features/this-qualifier.md +++ b/docs/_docs/reference/dropped-features/this-qualifier.md @@ -29,5 +29,3 @@ This can cause problems if a program tries to access the missing private field v // [C] needed if `field` is to be accessed through reflection val retained = field * field ``` - - diff --git a/docs/_docs/reference/enums/enums-index.md b/docs/_docs/reference/enums/enums-index.md index c49afaffea0d..fb46b3e3ed6b 100644 --- a/docs/_docs/reference/enums/enums-index.md +++ b/docs/_docs/reference/enums/enums-index.md @@ -1,7 +1,7 @@ --- layout: index title: "Enums" -movedTo: https://docs.scala-lang.org/scala3/reference/enums.html +movedTo: https://docs.scala-lang.org/scala3/reference/enums/index.html --- This chapter documents enums in Scala 3. diff --git a/docs/_docs/reference/experimental/explicit-nulls.md b/docs/_docs/reference/experimental/explicit-nulls.md index ec2298e3795a..2b5ffe3559c6 100644 --- a/docs/_docs/reference/experimental/explicit-nulls.md +++ b/docs/_docs/reference/experimental/explicit-nulls.md @@ -480,7 +480,7 @@ The program in [`unsafeNulls`](https://scala-lang.org/api/3.x/scala/runtime/stdL For example, the following code cannot be compiled even using unsafe nulls. Because of the Java interoperation, the type of the get method becomes `T | Null`. -```Scala +```scala def head[T](xs: java.util.List[T]): T = xs.get(0) // error ``` diff --git a/docs/_docs/reference/experimental/fewer-braces.md b/docs/_docs/reference/experimental/fewer-braces.md index f9a856dec39e..9a4a97198e23 100644 --- a/docs/_docs/reference/experimental/fewer-braces.md +++ b/docs/_docs/reference/experimental/fewer-braces.md @@ -4,80 +4,4 @@ title: "Fewer Braces" movedTo: https://docs.scala-lang.org/scala3/reference/experimental/fewer-braces.html --- -By and large, the possible indentation regions coincide with those regions where braces `{...}` are also legal, no matter whether the braces enclose an expression or a set of definitions. There is one exception, though: Arguments to function can be enclosed in braces but they cannot be simply indented instead. Making indentation always significant for function arguments would be too restrictive and fragile. - -To allow such arguments to be written without braces, a variant of the indentation scheme is implemented under language import -```scala -import language.experimental.fewerBraces -``` -Alternatively, it can be enabled with command line option `-language:experimental.fewerBraces`. - -This variant is more contentious and less stable than the rest of the significant indentation scheme. It allows to replace a function argument in braces by a `:` at the end of a line and indented code, similar to the convention for class bodies. It also allows to leave out braces around arguments that are multi-line function values. - -## Using `:` At End Of Line - - -Similar to what is done for classes and objects, a `:` that follows a function reference at the end of a line means braces can be omitted for function arguments. 
Example: -```scala -times(10): - println("ah") - println("ha") -``` - -The colon can also follow an infix operator: - -```scala -credentials ++ : - val file = Path.userHome / ".credentials" - if file.exists - then Seq(Credentials(file)) - else Seq() -``` - -Function calls that take multiple argument lists can also be handled this way: - -```scala -val firstLine = files.get(fileName).fold: - val fileNames = files.values - s"""no file named $fileName found among - |${values.mkString(\n)}""".stripMargin - : - f => - val lines = f.iterator.map(_.readLine) - lines.mkString("\n) -``` - - -## Lambda Arguments Without Braces - -Braces can also be omitted around multiple line function value arguments: -```scala -val xs = List.range(1, 10).map: x => - val y = x - 1 - y * y -xs.foldLeft(0): (x, y) => - x + y * 8 -``` -Braces can be omitted if the lambda starts with a parameter list and `=>` or `=>?` at the end of one line and it has an indented body on the following lines. - -## Syntax Changes - -``` -SimpleExpr ::= ... - | SimpleExpr `:` IndentedArgument - | SimpleExpr FunParams (‘=>’ | ‘?=>’) IndentedArgument -InfixExpr ::= ... - | InfixExpr id `:` IndentedArgument -IndentedArgument ::= indent (CaseClauses | Block) outdent -``` - -Note that a lambda argument must have the `=>` at the end of a line for braces -to be optional. For instance, the following would also be incorrect: - -```scala - xs.map x => x + 1 // error: braces or parentheses are required -``` -The lambda has to be enclosed in braces or parentheses: -```scala - xs.map(x => x + 1) // ok -``` +The documentation contained in this file is now part of [./indentation.html]. diff --git a/docs/_docs/reference/experimental/numeric-literals.md b/docs/_docs/reference/experimental/numeric-literals.md index 56684d2722d5..e8d4f5309c1e 100644 --- a/docs/_docs/reference/experimental/numeric-literals.md +++ b/docs/_docs/reference/experimental/numeric-literals.md @@ -73,7 +73,7 @@ trait FromDigits[T]: def fromDigits(digits: String): T ``` -Implementations of the `fromDigits` convert strings of digits to the values of the +Implementations of `fromDigits` convert strings of digits to the values of the implementation type `T`. The `digits` string consists of digits between `0` and `9`, possibly preceded by a sign ("+" or "-"). Number separator characters `_` are filtered out before diff --git a/docs/_docs/reference/experimental/overview.md b/docs/_docs/reference/experimental/overview.md index ecc253703df6..b4cb6575cf98 100644 --- a/docs/_docs/reference/experimental/overview.md +++ b/docs/_docs/reference/experimental/overview.md @@ -26,4 +26,3 @@ They can be imported at the top-level if all top-level definitions are `@experim Some experimental language features that are still in research and development can be enabled with special compiler options. These include * [`-Yexplicit-nulls`](./explicit-nulls.md). Enable support for tracking null references in the type system. 
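For instance, with explicit nulls enabled, reference types stop accepting `null` unless the type says so explicitly. A minimal sketch of the intended effect (illustrative; exact error wording may differ):

```scala
// Compiled with -Yexplicit-nulls
val a: String = null          // error: String is no longer nullable
val b: String | Null = null   // ok: nullability is part of the type
```
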
- diff --git a/docs/_docs/reference/language-versions/binary-compatibility.md b/docs/_docs/reference/language-versions/binary-compatibility.md index 3e48090ba8c5..d0409d32e6b7 100644 --- a/docs/_docs/reference/language-versions/binary-compatibility.md +++ b/docs/_docs/reference/language-versions/binary-compatibility.md @@ -1,6 +1,7 @@ --- layout: doc-page title: "Binary Compatibility" +movedTo: https://docs.scala-lang.org/scala3/reference/language-versions/binary-compatibility.html --- In Scala 2 different minor versions of the compiler were free to change the way how they encode different language features in JVM bytecode so each bump of the compiler's minor version resulted in breaking binary compatibility and if a project had any Scala dependencies they all needed to be (cross-)compiled to the same minor Scala version that was used in that project itself. On the contrary, Scala 3 has a stable encoding into JVM bytecode. @@ -10,27 +11,3 @@ In addition to classfiles the compilation process in Scala 3 also produces files TASTy format is extensible but it preserves backward compatibility and the evolution happens between minor releases of the language. This means a Scala compiler in version `3.x1.y1` is able to read TASTy files produced by another compiler in version `3.x2.y2` if `x1 >= x2` (assuming two stable versions of the compiler are considered - `SNAPSHOT` or `NIGHTLY` compiler versions can read TASTy in an older stable format but their TASTY versions are not compatible between each other even if the compilers have the same minor version; also compilers in stable versions cannot read TASTy generated by an unstable version). TASTy version number has the format of `.-` and the numbering changes in parallel to language releases in such a way that a bump in language minor version corresponds to a bump in TASTy minor version (e.g. for Scala `3.0.0` the TASTy version is `28.0-0`). Experimental version set to 0 signifies a stable version while others are considered unstable/experimental. TASTy version is not strictly bound to the data format itself - any changes to the API of the standard library also require a change in TASTy minor version. - -Being able to bump the compiler version in a project without having to wait for all of its dependencies to do the same is already a big leap forward when compared to Scala 2. However, we might still try to do better, especially from the perspective of authors of libraries. -If you maintain a library and you would like it to be usable as a dependency for all Scala 3 projects, you would have to always emit TASTy in a version that would be readble by everyone, which would normally mean getting stuck at 3.0.x forever. - -To solve this problem a new experimental compiler flag `-scala-output-version ` (available since 3.1.2) has been added. Setting this flag makes the compiler produce TASTy files that should be possible to use by all Scala 3 compilers in version `` or newer. This flag was inspired by how `-java-output-version` (formerly `-release`) works for specifying the target version of JDK. 
More specifically this enforces emitting TASTy files in an older format ensuring that: -* the code contains no references to parts of the standard library which were added to the API after `` and would crash at runtime when a program is executed with the older version of the standard library on the classpath -* no dependency found on the classpath during compilation (except for the standard library itself) contains TASTy files produced by a compiler newer than `` (otherwise they could potentially leak such disallowed references to the standard library). - -If any of the checks above is not fulfilled or for any other reason older TASTy cannot be emitted (e.g. the code uses some new language features which cannot be expressed the the older format) the entire compilation fails (with errors reported for each of such issues). - -As this feature is experimental it does not have any special support in build tools yet (at least not in sbt 1.6.1 or lower). -E.g. when a project gets compiled with Scala compiler `3.x1.y1` and `-scala-output-version 3.x2` option and then published using sbt -then the standard library in version `3.x1.y1` gets added to the project's dependencies instead of `3.x2.y2`. -When the dependencies are added to the classpath during compilation with Scala `3.x2.y2` the compiler will crash while trying to read TASTy files in the newer format. -A currently known workaround is to modify the build definition of the dependent project by explicitly overriding the version of Scala standard library in dependencies, e.g. - -```scala -dependencyOverrides ++= Seq( - scalaOrganization.value %% "scala3-library" % scalaVersion.value, - scalaOrganization.value %% "scala3-library_sjs1" % scalaVersion.value // for Scala.js projects -) -``` - -The behaviour of `-scala-output-version` flag might still change in the future, especially it's not guaranteed that every new version of the compiler would be able to generate TASTy in all older formats going back to the one produced by `3.0.x` compiler. diff --git a/docs/_docs/reference/language-versions/language-versions.md b/docs/_docs/reference/language-versions/language-versions.md index c38538d3a82a..1bc8d939a7e9 100644 --- a/docs/_docs/reference/language-versions/language-versions.md +++ b/docs/_docs/reference/language-versions/language-versions.md @@ -1,6 +1,7 @@ --- layout: index title: "Language Versions" +movedTo: https://docs.scala-lang.org/scala3/reference/language-versions/index.html --- Additional information on interoperability and migration between Scala 2 and 3 can be found [here](https://docs.scala-lang.org/scala3/guides/migration/compatibility-intro.html). diff --git a/docs/_docs/reference/language-versions/source-compatibility.md b/docs/_docs/reference/language-versions/source-compatibility.md index 029a3674ba73..57bc15d11d88 100644 --- a/docs/_docs/reference/language-versions/source-compatibility.md +++ b/docs/_docs/reference/language-versions/source-compatibility.md @@ -1,25 +1,30 @@ --- layout: doc-page title: "Source Compatibility" -movedTo: https://docs.scala-lang.org/scala3/reference/language-versions.html +movedTo: https://docs.scala-lang.org/scala3/reference/language-versions/source-compatibility.html --- Scala 3 does NOT guarantee source compatibility between different minor language versions (e.g. some syntax valid in 3.x might get deprecated and then phased out in 3.y for y > x). There are also some syntax structures that were valid in Scala 2 but are not anymore in Scala 3. 
However the compiler provides a possibility to specify the desired version of syntax used in a particular file or globally for a run of the compiler to make migration between versions easier. -The default Scala language syntax version currently supported by the Dotty compiler is [`3.0`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/0$.html). There are also other language versions that can be specified instead: +The default Scala language syntax version currently supported by the Dotty compiler is [`3.2`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/2$.html). There are also other language versions that can be specified instead: -- [`3.0-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/0-migration$.html): Same as `3.0` but with a Scala 2 compatibility mode that helps moving Scala 2.13 sources over to Scala 3. In particular, it +- [`3.0-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/0-migration$.html): Same as +`3.0` and `3.1`, but with a Scala 2 compatibility mode that helps moving Scala 2.13 sources over to Scala 3. In particular, it - flags some Scala 2 constructs that are disallowed in Scala 3 as migration warnings instead of hard errors, - changes some rules to be more lenient and backwards compatible with Scala 2.13 - gives some additional warnings where the semantics has changed between Scala 2.13 and 3.0 - in conjunction with `-rewrite`, offer code rewrites from Scala 2.13 to 3.0. -- [`future`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future$.html): A preview of changes introduced in the next versions after 3.0. In the doc pages here we refer to the language version with these changes as `3.1`, but it might be that some of these changes will be rolled out in later `3.x` versions. +- [`3.0`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/0$.html), [`3.1`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/1$.html): the default set of features included in scala versions `3.0.0` to `3.1.3`. +- [`3.2`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/2$.html): the same as `3.0` and `3.1`, but in addition: + - [stricter pattern bindings](https://docs.scala-lang.org/scala3/reference/changed-features/pattern-bindings.html) are now enabled (part of `future` in earlier `3.x` releases), producing warnings for refutable patterns. These warnings can be silenced to achieve the same runtime behavior, but in `future` they become errors and refutable patterns will not compile. + - [Nonlocal returns](https://docs.scala-lang.org/scala3/reference/dropped-features/nonlocal-returns.html) now produce a warning upon usage (they are still an error under `future`). +- [`3.2-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/2-migration$.html): the same as `3.2`, but in conjunction with `-rewrite`, offer code rewrites from Scala `3.0/3.1` to `3.2`. +- [`future`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future$.html): A preview of changes that will be introduced in `3.x` versions after `3.2`. +Some Scala 2 specific idioms are dropped in this version. The feature set supported by this version may grow over time as features become stabilised for preview. -Some Scala 2 specific idioms will be dropped in this version. The feature set supported by this version will be refined over time as we approach its release. 
- -- [`future-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future-migration$.html): Same as `future` but with additional helpers to migrate from `3.0`. Similarly to the helpers available under `3.0-migration`, these include migration warnings and optional rewrites. +- [`future-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future-migration$.html): Same as `future` but with additional helpers to migrate from `3.2`. Similarly to the helpers available under `3.0-migration`, these include migration warnings and optional rewrites. There are two ways to specify a language version : diff --git a/docs/_docs/reference/metaprogramming/compiletime-ops.md b/docs/_docs/reference/metaprogramming/compiletime-ops.md index 944cdac5389a..d101ae0c5c00 100644 --- a/docs/_docs/reference/metaprogramming/compiletime-ops.md +++ b/docs/_docs/reference/metaprogramming/compiletime-ops.md @@ -107,7 +107,7 @@ If an inline expansion results in a call `error(msgStr)` the compiler produces an error message containing the given `msgStr`. ```scala -import scala.compiletime.{error, code} +import scala.compiletime.{error, codeOf} inline def fail() = error("failed for a reason") @@ -118,10 +118,10 @@ fail() // error: failed for a reason or ```scala -inline def fail(p1: => Any) = - error(code"failed on: $p1") +inline def fail(inline p1: Any) = + error("failed on: " + codeOf(p1)) -fail(identity("foo")) // error: failed on: identity("foo") +fail(identity("foo")) // error: failed on: identity[String]("foo") ``` ### The `scala.compiletime.ops` package diff --git a/docs/_docs/reference/new-types/new-types.md b/docs/_docs/reference/new-types/new-types.md index be98f42ec70d..8eb1d7b3bd1b 100644 --- a/docs/_docs/reference/new-types/new-types.md +++ b/docs/_docs/reference/new-types/new-types.md @@ -1,7 +1,7 @@ --- layout: index title: "New Types" -movedTo: https://docs.scala-lang.org/scala3/reference/new-types.html +movedTo: https://docs.scala-lang.org/scala3/reference/new-types/index.html --- This chapter documents the new types introduced in Scala 3. diff --git a/docs/_docs/reference/other-new-features/experimental-defs.md b/docs/_docs/reference/other-new-features/experimental-defs.md index ef9eca1ea7f5..760be63440f8 100644 --- a/docs/_docs/reference/other-new-features/experimental-defs.md +++ b/docs/_docs/reference/other-new-features/experimental-defs.md @@ -38,7 +38,7 @@ Experimental definitions can only be referenced in an experimental scope. Experi
Example 2 - + ```scala import scala.annotation.experimental @@ -72,7 +72,7 @@ Experimental definitions can only be referenced in an experimental scope. Experi
Example 3 - + ```scala import scala.annotation.experimental @@ -85,7 +85,7 @@ Experimental definitions can only be referenced in an experimental scope. Experi
Example 4 - + ```scala import scala.annotation.experimental @@ -106,7 +106,7 @@ Experimental definitions can only be referenced in an experimental scope. Experi
Example 5 - + ```scala @experimental trait ExpSAM { diff --git a/docs/_docs/reference/other-new-features/export.md b/docs/_docs/reference/other-new-features/export.md index 85f03de4104e..ded55738b919 100644 --- a/docs/_docs/reference/other-new-features/export.md +++ b/docs/_docs/reference/other-new-features/export.md @@ -24,7 +24,7 @@ class Copier: private val scanUnit = new Scanner export scanUnit.scan - export printUnit.{status => _, *} + export printUnit.{status as _, *} def status: List[String] = printUnit.status ++ scanUnit.status ``` @@ -55,8 +55,8 @@ one or more selectors `sel_i` that identify what gets an alias. Selectors can be of one of the following forms: - A _simple selector_ `x` creates aliases for all eligible members of `path` that are named `x`. - - A _renaming selector_ `x => y` creates aliases for all eligible members of `path` that are named `x`, but the alias is named `y` instead of `x`. - - An _omitting selector_ `x => _` prevents `x` from being aliased by a subsequent + - A _renaming selector_ `x as y` creates aliases for all eligible members of `path` that are named `x`, but the alias is named `y` instead of `x`. + - An _omitting selector_ `x as _` prevents `x` from being aliased by a subsequent wildcard selector. - A _given selector_ `given x` has an optional type bound `x`. It creates aliases for all eligible given instances that conform to either `x`, or `Any` if `x` is omitted, except for members that are named by a previous simple, renaming, or omitting selector. - A _wildcard selector_ `*` creates aliases for all eligible members of `path` except for given instances, @@ -75,9 +75,23 @@ A member is _eligible_ if all of the following holds: - it is not a constructor, nor the (synthetic) class part of an object, - it is a given instance (declared with `given`) if and only if the export is from a _given selector_. +It is a compile-time error if a simple or renaming selector does not identify +any eligible members. + It is a compile-time error if a simple or renaming selector does not identify any eligible members. -Type members are aliased by type definitions, and term members are aliased by method definitions. Export aliases copy the type and value parameters of the members they refer to. +Type members are aliased by type definitions, and term members are aliased by method definitions. For instance: +```scala +object O: + class C(val x: Int) + def m(c: C): Int = c.x + 1 +export O.* + // generates + // type C = O.C + // def m(c: O.C): Int = O.m(c) +``` + +Export aliases copy the type and value parameters of the members they refer to. Export aliases are always `final`. Aliases of given instances are again defined as givens (and aliases of old-style implicits are `implicit`). Aliases of extensions are again defined as extensions. Aliases of inline methods or values are again defined `inline`. There are no other modifiers that can be given to an alias. This has the following consequences for overriding: - Export aliases cannot be overridden, since they are final. @@ -132,6 +146,34 @@ Export clauses also fill a gap opened by the shift from package objects to top-l of internal compositions available to users of a package. Top-level definitions are not wrapped in a user-defined object, so they can't inherit anything. However, top-level definitions can be export clauses, which supports the facade design pattern in a safer and more flexible way. +## Export Clauses in Extensions + +An export clause may also appear in an extension. 
+ +Example: +```scala +class StringOps(x: String): + def *(n: Int): String = ... + def capitalize: String = ... + +extension (x: String) + def take(n: Int): String = x.substring(0, n) + def drop(n: Int): String = x.substring(n) + private def moreOps = new StringOps(x) + export moreOps.* +``` +In this case the qualifier expression must be an identifier that refers to a unique parameterless extension method in the same extension clause. The export will create +extension methods for all accessible term members +in the result of the qualifier path. For instance, the extension above would be expanded to +```scala +extension (x: String) + def take(n: Int): String = x.substring(0, n) + def drop(n: Int): String = x.substring(n) + private def moreOps = StringOps(x) + def *(n: Int): String = moreOps.*(n) + def capitalize: String = moreOps.capitalize +``` + ## Syntax changes: ``` @@ -139,6 +181,8 @@ TemplateStat ::= ... | Export TopStat ::= ... | Export +ExtMethod ::= ... + | Export Export ::= ‘export’ ImportExpr {‘,’ ImportExpr} ImportExpr ::= SimpleRef {‘.’ id} ‘.’ ImportSpec ImportSpec ::= NamedSelector diff --git a/docs/_docs/reference/other-new-features/indentation.md b/docs/_docs/reference/other-new-features/indentation.md index 46bc21e2597d..8894a310b974 100644 --- a/docs/_docs/reference/other-new-features/indentation.md +++ b/docs/_docs/reference/other-new-features/indentation.md @@ -61,7 +61,7 @@ There are two rules: - after the leading parameters of an `extension`, or - after a `with` in a given instance, or - - after a ": at end of line" token (see below) + - after a `:` at the start of a template body (see discussion of `` below), or - after one of the following tokens: ``` @@ -98,7 +98,7 @@ There are two rules: - An `` is also inserted if the next token following a statement sequence starting with an `` closes an indentation region, i.e. is one of `then`, `else`, `do`, `catch`, `finally`, `yield`, `}`, `)`, `]` or `case`. - An `` is finally inserted in front of a comma that follows a statement sequence starting with an `` if the indented region is itself enclosed in parentheses + - An `` is finally inserted in front of a comma that follows a statement sequence starting with an `` if the indented region is itself enclosed in parentheses. It is an error if the indentation width of the token following an `` does not match the indentation of some previous line in the enclosing indentation region. For instance, the following would be rejected. @@ -134,12 +134,14 @@ is parsed as `if x then a + b + c else d`. The Scala grammar uses the term _template body_ for the definitions of a class, trait, or object that are normally enclosed in braces. The braces around a template body can also be omitted by means of the following rule. -If at the point where a template body can start there is a `:` that occurs at the end -of a line, and that is followed by at least one indented statement, the recognized -token is changed from ":" to ": at end of line". The latter token is one of the tokens -that can start an indentation region. The Scala grammar is changed so an optional ": at end of line" is allowed in front of a template body. +A template body can alternatively consist of a colon followed by one or more indented statements. 
To this purpose we introduce a new `` token that reads as +the standard colon "`:`" but is generated instead of it where `` +is legal according to the context free syntax, but only if the previous token +is an alphanumeric identifier, a backticked identifier, or one of the tokens `this`, `super`, "`)`", and "`]`". -Analogous rules apply for enum bodies and local packages containing nested definitions. +An indentation region can start after a ``. A template body may be either enclosed in braces, or it may start with +` ` and end with ``. +Analogous rules apply for enum bodies, type refinements, and local packages containing nested definitions. With these new rules, the following constructs are all valid: @@ -170,17 +172,19 @@ In each case, the `:` at the end of line can be replaced without change of meani The syntax changes allowing this are as follows: +Define for an arbitrary sequence of tokens or non-terminals `TS`: + ``` -Template ::= InheritClauses [colonEol] [TemplateBody] -EnumDef ::= id ClassConstr InheritClauses [colonEol] EnumBody -Packaging ::= ‘package’ QualId [nl | colonEol] ‘{’ TopStatSeq ‘}’ -SimpleExpr ::= ‘new’ ConstrApp {‘with’ ConstrApp} [[colonEol] TemplateBody] +:<<< TS >>> ::= ‘{’ TS ‘}’ + | +``` +Then the grammar changes as follows: +``` +TemplateBody ::= :<<< [SelfType] TemplateStat {semi TemplateStat} >>> +EnumBody ::= :<<< [SelfType] EnumStat {semi EnumStat} >>> +Refinement ::= :<<< [RefineDcl] {semi [RefineDcl]} >>> +Packaging ::= ‘package’ QualId :<<< TopStats >>> ``` - -Here, `colonEol` stands for ": at end of line", as described above. -The lexical analyzer is modified so that a `:` at the end of a line -is reported as `colonEol` if the parser is at a point where a `colonEol` is -valid as next token. ### Spaces vs Tabs @@ -444,15 +448,15 @@ indented regions where possible. When invoked with options `-rewrite -no-indent` The `-indent` option only works on [new-style syntax](./control-syntax.md). So to go from old-style syntax to new-style indented code one has to invoke the compiler twice, first with options `-rewrite -new-syntax`, then again with options `-rewrite -indent`. To go in the opposite direction, from indented code to old-style syntax, it's `-rewrite -no-indent`, followed by `-rewrite -old-syntax`. -### Variant: Indentation Marker `:` +### Variant: Indentation Marker `:` for Arguments -Generally, the possible indentation regions coincide with those regions where braces `{...}` are also legal, no matter whether the braces enclose an expression or a set of definitions. There is one exception, though: Arguments to function can be enclosed in braces but they cannot be simply indented instead. Making indentation always significant for function arguments would be too restrictive and fragile. +Generally, the possible indentation regions coincide with those regions where braces `{...}` are also legal, no matter whether the braces enclose an expression or a set of definitions. There is one exception, though: Arguments to functions can be enclosed in braces but they cannot be simply indented instead. Making indentation always significant for function arguments would be too restrictive and fragile. To allow such arguments to be written without braces, a variant of the indentation scheme is implemented under language import ```scala import language.experimental.fewerBraces ``` -This variant is more contentious and less stable than the rest of the significant indentation scheme. 
In this variant, a colon `:` at the end of a line is also one of the possible tokens that opens an indentation region. Examples: +In this variant, a `` token is also recognized where function argument would be expected. Examples: ```scala times(10): @@ -462,24 +466,44 @@ times(10): or +```scala +credentials `++`: + val file = Path.userHome / ".credentials" + if file.exists + then Seq(Credentials(file)) + else Seq() +``` + +or + ```scala xs.map: x => val y = x - 1 y * y ``` - -The colon is usable not only for lambdas and by-name parameters, but -also even for ordinary parameters: +What's more, a `:` in these settings can also be followed on the same line by the parameter part and arrow of a lambda. So the last example could be compressed to this: ```scala -credentials ++ : - val file = Path.userHome / ".credentials" - if file.exists - then Seq(Credentials(file)) - else Seq() +xs.map: x => + val y = x - 1 + y * y ``` +and the following would also be legal: +```scala +xs.foldLeft(0): (x, y) => + x + y +``` + +The grammar changes for this variant are as follows. -How does this syntax variant work? Colons at the end of lines are their own token, distinct from normal `:`. -The Scala grammar is changed so that colons at end of lines are accepted at all points -where an opening brace enclosing an argument is legal. Special provisions are taken so that method result types can still use a colon on the end of a line, followed by the actual type on the next. +``` +SimpleExpr ::= ... + | SimpleExpr ColonArgument +InfixExpr ::= ... + | InfixExpr id ColonArgument +ColonArgument ::= colon [LambdaStart] + indent (CaseClauses | Block) outdent +LambdaStart ::= FunParams (‘=>’ | ‘?=>’) + | HkTypeParamClause ‘=>’ +``` diff --git a/docs/_docs/reference/other-new-features/targetName.md b/docs/_docs/reference/other-new-features/targetName.md index d2a654697d15..09886968a232 100644 --- a/docs/_docs/reference/other-new-features/targetName.md +++ b/docs/_docs/reference/other-new-features/targetName.md @@ -29,7 +29,7 @@ The [`@targetName`](https://scala-lang.org/api/3.x/scala/annotation/targetName.h of type `String`. That string is called the _external name_ of the definition that's annotated. - 2. A `@targetName` annotation can be given for all kinds of definitions. + 2. A `@targetName` annotation can be given for all kinds of definitions except a top-level `class`, `trait`, or `object`. 3. The name given in a [`@targetName`](https://scala-lang.org/api/3.x/scala/annotation/targetName.html) annotation must be a legal name for the defined entities on the host platform. diff --git a/docs/_docs/reference/overview.md b/docs/_docs/reference/overview.md index 9b184a7408ba..d982cbeecff8 100644 --- a/docs/_docs/reference/overview.md +++ b/docs/_docs/reference/overview.md @@ -45,7 +45,7 @@ These constructs replace existing constructs with the aim of making the language - [Extension methods](contextual/extension-methods.md) replace implicit classes with a clearer and simpler mechanism. - [Opaque type aliases](other-new-features/opaques.md) - replace most uses of value classes while guaranteeing absence of boxing. + replace most uses of value classes while guaranteeing the absence of boxing. - [Top-level definitions](dropped-features/package-objects.md) replace package objects, dropping syntactic boilerplate. 
- [Export clauses](other-new-features/export.md) @@ -131,7 +131,7 @@ These are additions to the language that make it more powerful or pleasant to us ## Metaprogramming -The following constructs together aim to put metaprogramming in Scala on a new basis. So far, metaprogramming was achieved by a combination of macros and libraries such as [Shapeless](https://github.com/milessabin/shapeless) that were in turn based on some key macros. Current Scala 2 macro mechanisms are a thin veneer on top the current Scala 2 compiler, which makes them fragile and in many cases impossible to port to Scala 3. +The following constructs together aim to put metaprogramming in Scala on a new basis. So far, metaprogramming was achieved by a combination of macros and libraries such as [Shapeless](https://github.com/milessabin/shapeless) that were in turn based on some key macros. Current Scala 2 macro mechanisms are a thin veneer on top of the current Scala 2 compiler, which makes them fragile and in many cases impossible to port to Scala 3. It's worth noting that macros were never included in the [Scala 2 language specification](https://scala-lang.org/files/archive/spec/2.13/) and were so far made available only under an `-experimental` flag. This has not prevented their widespread usage. diff --git a/docs/_docs/reference/syntax.md b/docs/_docs/reference/syntax.md index 57b21659b7c4..015609a450a9 100644 --- a/docs/_docs/reference/syntax.md +++ b/docs/_docs/reference/syntax.md @@ -4,15 +4,29 @@ title: "Scala 3 Syntax Summary" movedTo: https://docs.scala-lang.org/scala3/reference/syntax.html --- + + The following description of Scala tokens uses literal characters `‘c’` when referring to the ASCII fragment `\u0000` – `\u007F`. _Unicode escapes_ are used to represent the [Unicode character](https://www.w3.org/International/articles/definitions-characters/) with the given hexadecimal code: -```ebnf -UnicodeEscape ::= ‘\’ ‘u’ {‘u’} hexDigit hexDigit hexDigit hexDigit ; -hexDigit ::= ‘0’ | … | ‘9’ | ‘A’ | … | ‘F’ | ‘a’ | … | ‘f’ ; +``` +UnicodeEscape ::= ‘\’ ‘u’ {‘u’} hexDigit hexDigit hexDigit hexDigit +hexDigit ::= ‘0’ | … | ‘9’ | ‘A’ | … | ‘F’ | ‘a’ | … | ‘f’ ``` Informal descriptions are typeset as `“some comment”`. @@ -22,70 +36,71 @@ Informal descriptions are typeset as `“some comment”`. The lexical syntax of Scala is given by the following grammar in EBNF form. 
-```ebnf -whiteSpace ::= ‘\u0020’ | ‘\u0009’ | ‘\u000D’ | ‘\u000A’ ; -upper ::= ‘A’ | … | ‘Z’ | ‘\$’ | ‘_’ “… and Unicode category Lu” ; -lower ::= ‘a’ | … | ‘z’ “… and Unicode category Ll” ; -letter ::= upper | lower “… and Unicode categories Lo, Lt, Nl” ; -digit ::= ‘0’ | … | ‘9’ ; -paren ::= ‘(’ | ‘)’ | ‘[’ | ‘]’ | ‘{’ | ‘}’ | ‘'(’ | ‘'[’ | ‘'{’ ; -delim ::= ‘`’ | ‘'’ | ‘"’ | ‘.’ | ‘;’ | ‘,’ ; -opchar ::= “printableChar not matched by (whiteSpace | upper | - lower | letter | digit | paren | delim | opchar | - Unicode_Sm | Unicode_So)” ; -printableChar ::= “all characters in [\u0020, \u007F] inclusive” ; -charEscapeSeq ::= ‘\’ (‘b’ | ‘t’ | ‘n’ | ‘f’ | ‘r’ | ‘"’ | ‘'’ | ‘\’) ; - -op ::= opchar {opchar} ; -varid ::= lower idrest ; +``` +whiteSpace ::= ‘\u0020’ | ‘\u0009’ | ‘\u000D’ | ‘\u000A’ +upper ::= ‘A’ | … | ‘Z’ | ‘\$’ | ‘_’ “… and Unicode category Lu” +lower ::= ‘a’ | … | ‘z’ “… and Unicode category Ll” +letter ::= upper | lower “… and Unicode categories Lo, Lt, Nl” +digit ::= ‘0’ | … | ‘9’ +paren ::= ‘(’ | ‘)’ | ‘[’ | ‘]’ | ‘{’ | ‘}’ +delim ::= ‘`’ | ‘'’ | ‘"’ | ‘.’ | ‘;’ | ‘,’ +opchar ::= ‘!’ | ‘#’ | ‘%’ | ‘&’ | ‘*’ | ‘+’ | ‘-’ | ‘/’ | ‘:’ | + ‘<’ | ‘=’ | ‘>’ | ‘?’ | ‘@’ | ‘\’ | ‘^’ | ‘|’ | ‘~’ + “… and Unicode categories Sm, So” +printableChar ::= “all characters in [\u0020, \u007E] inclusive” +charEscapeSeq ::= ‘\’ (‘b’ | ‘t’ | ‘n’ | ‘f’ | ‘r’ | ‘"’ | ‘'’ | ‘\’) + +op ::= opchar {opchar} +varid ::= lower idrest alphaid ::= upper idrest - | varid ; + | varid plainid ::= alphaid - | op ; + | op id ::= plainid - | ‘`’ { charNoBackQuoteOrNewline | UnicodeEscape | charEscapeSeq } ‘`’ ; -idrest ::= {letter | digit} [‘_’ op] ; -quoteId ::= ‘'’ alphaid ; + | ‘`’ { charNoBackQuoteOrNewline | UnicodeEscape | charEscapeSeq } ‘`’ +idrest ::= {letter | digit} [‘_’ op] +quoteId ::= ‘'’ alphaid +spliceId ::= ‘$’ alphaid ; -integerLiteral ::= (decimalNumeral | hexNumeral) [‘L’ | ‘l’] ; -decimalNumeral ::= ‘0’ | nonZeroDigit [{digit | ‘_’} digit] ; -hexNumeral ::= ‘0’ (‘x’ | ‘X’) hexDigit [{hexDigit | ‘_’} hexDigit] ; -nonZeroDigit ::= ‘1’ | … | ‘9’ ; +integerLiteral ::= (decimalNumeral | hexNumeral) [‘L’ | ‘l’] +decimalNumeral ::= ‘0’ | nonZeroDigit [{digit | ‘_’} digit] +hexNumeral ::= ‘0’ (‘x’ | ‘X’) hexDigit [{hexDigit | ‘_’} hexDigit] +nonZeroDigit ::= ‘1’ | … | ‘9’ floatingPointLiteral ::= [decimalNumeral] ‘.’ digit [{digit | ‘_’} digit] [exponentPart] [floatType] | decimalNumeral exponentPart [floatType] - | decimalNumeral floatType ; -exponentPart ::= (‘E’ | ‘e’) [‘+’ | ‘-’] digit [{digit | ‘_’} digit] ; -floatType ::= ‘F’ | ‘f’ | ‘D’ | ‘d’ ; + | decimalNumeral floatType +exponentPart ::= (‘E’ | ‘e’) [‘+’ | ‘-’] digit [{digit | ‘_’} digit] +floatType ::= ‘F’ | ‘f’ | ‘D’ | ‘d’ -booleanLiteral ::= ‘true’ | ‘false’ ; +booleanLiteral ::= ‘true’ | ‘false’ -characterLiteral ::= ‘'’ (printableChar | charEscapeSeq) ‘'’ ; +characterLiteral ::= ‘'’ (printableChar | charEscapeSeq) ‘'’ stringLiteral ::= ‘"’ {stringElement} ‘"’ - | ‘"""’ multiLineChars ‘"""’ ; + | ‘"""’ multiLineChars ‘"""’ stringElement ::= printableChar \ (‘"’ | ‘\’) | UnicodeEscape - | charEscapeSeq ; -multiLineChars ::= {[‘"’] [‘"’] char \ ‘"’} {‘"’} ; + | charEscapeSeq +multiLineChars ::= {[‘"’] [‘"’] char \ ‘"’} {‘"’} processedStringLiteral ::= alphaid ‘"’ {[‘\’] processedStringPart | ‘\\’ | ‘\"’} ‘"’ - | alphaid ‘"""’ {[‘"’] [‘"’] char \ (‘"’ | ‘$’) | escape} {‘"’} ‘"""’ ; + | alphaid ‘"""’ {[‘"’] [‘"’] char \ (‘"’ | ‘$’) | escape} {‘"’} ‘"""’ processedStringPart - ::= printableChar \ (‘"’ | ‘$’ | ‘\’) | escape ; + ::= 
printableChar \ (‘"’ | ‘$’ | ‘\’) | escape escape ::= ‘$$’ | ‘$’ letter { letter | digit } - | ‘{’ Block [‘;’ whiteSpace stringFormat whiteSpace] ‘}’ ; -stringFormat ::= {printableChar \ (‘"’ | ‘}’ | ‘ ’ | ‘\t’ | ‘\n’)} ; + | ‘{’ Block [‘;’ whiteSpace stringFormat whiteSpace] ‘}’ +stringFormat ::= {printableChar \ (‘"’ | ‘}’ | ‘ ’ | ‘\t’ | ‘\n’)} -symbolLiteral ::= ‘'’ plainid // until 2.13 ; +symbolLiteral ::= ‘'’ plainid // until 2.13 comment ::= ‘/*’ “any sequence of characters; nested comments are allowed” ‘*/’ - | ‘//’ “any sequence of characters up to end of line” ; + | ‘//’ “any sequence of characters up to end of line” -nl ::= “new line character” ; -semi ::= ‘;’ | nl {nl} ; +nl ::= “new line character” +semi ::= ‘;’ | nl {nl} ``` ## Optional Braces @@ -95,20 +110,24 @@ The lexical analyzer also inserts `indent` and `outdent` tokens that represent r In the context-free productions below we use the notation `<<< ts >>>` to indicate a token sequence `ts` that is either enclosed in a pair of braces `{ ts }` or that constitutes an indented region `indent ts outdent`. Analogously, the notation `:<<< ts >>>` indicates a token sequence `ts` that is either enclosed in a pair of braces `{ ts }` or that constitutes an indented region `indent ts outdent` that follows -a `:` at the end of a line. +a `colon` token. -```ebnf +A `colon` token reads as the standard colon "`:`" but is generated instead of it where `colon` is legal according to the context free syntax, but only if the previous token +is an alphanumeric identifier, a backticked identifier, or one of the tokens `this`, `super`, `new`, "`)`", and "`]`". + +``` +colon ::= ':' -- with side conditions explained above <<< ts >>> ::= ‘{’ ts ‘}’ - | indent ts outdent ; + | indent ts outdent :<<< ts >>> ::= [nl] ‘{’ ts ‘}’ - | `:` indent ts outdent ; + | colon indent ts outdent ``` ## Keywords ### Regular keywords -```ebnf +``` abstract case catch class def do else enum export extends false final finally for given if implicit import lazy match new @@ -121,9 +140,8 @@ type val var while with yield ### Soft keywords -```ebnf -as derives end extension infix inline opaque open throws -transparent using | * + - +``` +as derives end extension infix inline opaque open transparent using | * + - ``` See the [separate section on soft keywords](./soft-modifier.md) for additional @@ -135,45 +153,45 @@ The context-free syntax of Scala is given by the following EBNF grammar: ### Literals and Paths -```ebnf +``` SimpleLiteral ::= [‘-’] integerLiteral | [‘-’] floatingPointLiteral | booleanLiteral | characterLiteral - | stringLiteral ; + | stringLiteral Literal ::= SimpleLiteral | processedStringLiteral | symbolLiteral - | ‘null’ ; + | ‘null’ -QualId ::= id {‘.’ id} ; -ids ::= id {‘,’ id} ; +QualId ::= id {‘.’ id} +ids ::= id {‘,’ id} SimpleRef ::= id | [id ‘.’] ‘this’ - | [id ‘.’] ‘super’ [ClassQualifier] ‘.’ id ; + | [id ‘.’] ‘super’ [ClassQualifier] ‘.’ id -ClassQualifier ::= ‘[’ id ‘]’ ; +ClassQualifier ::= ‘[’ id ‘]’ ``` ### Types -```ebnf +``` Type ::= FunType | HkTypeParamClause ‘=>>’ Type | FunParamClause ‘=>>’ Type | MatchType - | InfixType ; + | InfixType FunType ::= FunTypeArgs (‘=>’ | ‘?=>’) Type - | HKTypeParamClause '=>' Type ; + | HKTypeParamClause '=>' Type FunTypeArgs ::= InfixType | ‘(’ [ FunArgTypes ] ‘)’ - | FunParamClause ; -FunParamClause ::= ‘(’ TypedFunParam {‘,’ TypedFunParam } ‘)’ ; -TypedFunParam ::= id ‘:’ Type ; -MatchType ::= InfixType `match` <<< TypeCaseClauses >>> ; -InfixType ::= RefinedType {id [nl] RefinedType} ; 
-RefinedType ::= AnnotType {[nl] Refinement} ; -AnnotType ::= SimpleType {Annotation} ; + | FunParamClause +FunParamClause ::= ‘(’ TypedFunParam {‘,’ TypedFunParam } ‘)’ +TypedFunParam ::= id ‘:’ Type +MatchType ::= InfixType `match` <<< TypeCaseClauses >>> +InfixType ::= RefinedType {id [nl] RefinedType} +RefinedType ::= AnnotType {[nl] Refinement} +AnnotType ::= SimpleType {Annotation} SimpleType ::= SimpleLiteral | ‘?’ TypeBounds @@ -182,37 +200,35 @@ SimpleType ::= SimpleLiteral | Singleton ‘.’ ‘type’ | ‘(’ Types ‘)’ | Refinement - | ‘$’ ‘{’ Block ‘}’ -- unless inside quoted pattern - | ‘$’ ‘{’ Pattern ‘}’ -- only inside quoted pattern | SimpleType1 TypeArgs - | SimpleType1 ‘#’ id ; + | SimpleType1 ‘#’ id Singleton ::= SimpleRef | SimpleLiteral - | Singleton ‘.’ id ; + | Singleton ‘.’ id FunArgType ::= Type - | ‘=>’ Type ; -FunArgTypes ::= FunArgType { ‘,’ FunArgType } ; -ParamType ::= [‘=>’] ParamValueType ; -ParamValueType ::= Type [‘*’] ; -TypeArgs ::= ‘[’ Types ‘]’ ; -Refinement ::= ‘{’ [RefineDcl] {semi [RefineDcl]} ‘}’ ; -TypeBounds ::= [‘>:’ Type] [‘<:’ Type] ; -TypeParamBounds ::= TypeBounds {‘:’ Type} ; -Types ::= Type {‘,’ Type} ; + | ‘=>’ Type +FunArgTypes ::= FunArgType { ‘,’ FunArgType } +ParamType ::= [‘=>’] ParamValueType +ParamValueType ::= Type [‘*’] +TypeArgs ::= ‘[’ Types ‘]’ +Refinement ::= :<<< [RefineDcl] {semi [RefineDcl]} >>> +TypeBounds ::= [‘>:’ Type] [‘<:’ Type] +TypeParamBounds ::= TypeBounds {‘:’ Type} +Types ::= Type {‘,’ Type} ``` ### Expressions -```ebnf +``` Expr ::= FunParams (‘=>’ | ‘?=>’) Expr | HkTypeParamClause ‘=>’ Expr - | Expr1 ; + | Expr1 BlockResult ::= FunParams (‘=>’ | ‘?=>’) Block | HkTypeParamClause ‘=>’ Block - | Expr1 ; + | Expr1 FunParams ::= Bindings | id - | ‘_’ ; + | ‘_’ Expr1 ::= [‘inline’] ‘if’ ‘(’ Expr ‘)’ {nl} Expr [[semi] ‘else’ Expr] | [‘inline’] ‘if’ Expr ‘then’ Expr [[semi] ‘else’ Expr] | ‘while’ ‘(’ Expr ‘)’ {nl} Expr @@ -226,23 +242,22 @@ Expr1 ::= [‘inline’] ‘if’ ‘(’ Expr ‘)’ {nl} Expr [[ | PrefixOperator SimpleExpr ‘=’ Expr | SimpleExpr ArgumentExprs ‘=’ Expr | PostfixExpr [Ascription] - | ‘inline’ InfixExpr MatchClause ; + | ‘inline’ InfixExpr MatchClause Ascription ::= ‘:’ InfixType - | ‘:’ Annotation {Annotation} ; -Catches ::= ‘catch’ (Expr | ExprCaseClause) ; -PostfixExpr ::= InfixExpr [id] -- only if language.postfixOperators is enabled ; + | ‘:’ Annotation {Annotation} +Catches ::= ‘catch’ (Expr | ExprCaseClause) +PostfixExpr ::= InfixExpr [id] -- only if language.postfixOperators is enabled InfixExpr ::= PrefixExpr | InfixExpr id [nl] InfixExpr - | InfixExpr MatchClause ; -MatchClause ::= ‘match’ <<< CaseClauses >>> ; -PrefixExpr ::= [PrefixOperator] SimpleExpr ; -PrefixOperator ::= ‘-’ | ‘+’ | ‘~’ | ‘!’ ; + | InfixExpr MatchClause +MatchClause ::= ‘match’ <<< CaseClauses >>> +PrefixExpr ::= [PrefixOperator] SimpleExpr +PrefixOperator ::= ‘-’ | ‘+’ | ‘~’ | ‘!’ -- unless backquoted SimpleExpr ::= SimpleRef | Literal | ‘_’ | BlockExpr - | ‘$’ ‘{’ Block ‘}’ -- unless inside quoted pattern - | ‘$’ ‘{’ Pattern ‘}’ -- only inside quoted pattern + | ExprSplice | Quoted | quoteId -- only inside splices | ‘new’ ConstrApp {‘with’ ConstrApp} [TemplateBody] @@ -251,174 +266,179 @@ SimpleExpr ::= SimpleRef | SimpleExpr ‘.’ id | SimpleExpr ‘.’ MatchClause | SimpleExpr TypeArgs - | SimpleExpr ArgumentExprs ; + | SimpleExpr ArgumentExprs Quoted ::= ‘'’ ‘{’ Block ‘}’ - | ‘'’ ‘[’ Type ‘]’ ; -ExprsInParens ::= ExprInParens {‘,’ ExprInParens} ; + | ‘'’ ‘[’ Type ‘]’ +ExprSplice ::= spliceId -- if inside quoted block + | ‘$’ ‘{’ Block 
‘}’ -- unless inside quoted pattern + | ‘$’ ‘{’ Pattern ‘}’ -- when inside quoted pattern +ExprsInParens ::= ExprInParens {‘,’ ExprInParens} ExprInParens ::= PostfixExpr ‘:’ Type - | Expr ; + | Expr ParArgumentExprs ::= ‘(’ [‘using’] ExprsInParens ‘)’ - | ‘(’ [ExprsInParens ‘,’] PostfixExpr ‘*’ ‘)’ ; + | ‘(’ [ExprsInParens ‘,’] PostfixExpr ‘*’ ‘)’ ArgumentExprs ::= ParArgumentExprs - | BlockExpr ; -BlockExpr ::= <<< (CaseClauses | Block) >>> ; -Block ::= {BlockStat semi} [BlockResult] ; + | BlockExpr +BlockExpr ::= <<< (CaseClauses | Block) >>> +Block ::= {BlockStat semi} [BlockResult] BlockStat ::= Import | {Annotation {nl}} {LocalModifier} Def | Extension | Expr1 - | EndMarker ; + | EndMarker ForExpr ::= ‘for’ ‘(’ Enumerators0 ‘)’ {nl} [‘do‘ | ‘yield’] Expr | ‘for’ ‘{’ Enumerators0 ‘}’ {nl} [‘do‘ | ‘yield’] Expr - | ‘for’ Enumerators0 (‘do‘ | ‘yield’) Expr ; -Enumerators0 ::= {nl} Enumerators [semi] ; -Enumerators ::= Generator {semi Enumerator | Guard} ; + | ‘for’ Enumerators0 (‘do‘ | ‘yield’) Expr +Enumerators0 ::= {nl} Enumerators [semi] +Enumerators ::= Generator {semi Enumerator | Guard} Enumerator ::= Generator | Guard {Guard} - | Pattern1 ‘=’ Expr ; -Generator ::= [‘case’] Pattern1 ‘<-’ Expr ; -Guard ::= ‘if’ PostfixExpr ; - -CaseClauses ::= CaseClause { CaseClause } ; -CaseClause ::= ‘case’ Pattern [Guard] ‘=>’ Block ; -ExprCaseClause ::= ‘case’ Pattern [Guard] ‘=>’ Expr ; -TypeCaseClauses ::= TypeCaseClause { TypeCaseClause } ; -TypeCaseClause ::= ‘case’ InfixType ‘=>’ Type [semi] ; - -Pattern ::= Pattern1 { ‘|’ Pattern1 } ; -Pattern1 ::= Pattern2 [‘:’ RefinedType] ; -Pattern2 ::= [id ‘@’] InfixPattern [‘*’] ; -InfixPattern ::= SimplePattern { id [nl] SimplePattern } ; + | Pattern1 ‘=’ Expr +Generator ::= [‘case’] Pattern1 ‘<-’ Expr +Guard ::= ‘if’ PostfixExpr + +CaseClauses ::= CaseClause { CaseClause } +CaseClause ::= ‘case’ Pattern [Guard] ‘=>’ Block +ExprCaseClause ::= ‘case’ Pattern [Guard] ‘=>’ Expr +TypeCaseClauses ::= TypeCaseClause { TypeCaseClause } +TypeCaseClause ::= ‘case’ (InfixType | ‘_’) ‘=>’ Type [semi] + +Pattern ::= Pattern1 { ‘|’ Pattern1 } +Pattern1 ::= Pattern2 [‘:’ RefinedType] +Pattern2 ::= [id ‘@’] InfixPattern [‘*’] +InfixPattern ::= SimplePattern { id [nl] SimplePattern } SimplePattern ::= PatVar | Literal | ‘(’ [Patterns] ‘)’ | Quoted | SimplePattern1 [TypeArgs] [ArgumentPatterns] - | ‘given’ RefinedType ; + | ‘given’ RefinedType SimplePattern1 ::= SimpleRef - | SimplePattern1 ‘.’ id ; + | SimplePattern1 ‘.’ id PatVar ::= varid - | ‘_’ ; -Patterns ::= Pattern {‘,’ Pattern} ; + | ‘_’ +Patterns ::= Pattern {‘,’ Pattern} ArgumentPatterns ::= ‘(’ [Patterns] ‘)’ - | ‘(’ [Patterns ‘,’] PatVar ‘*’ ‘)’ ; + | ‘(’ [Patterns ‘,’] PatVar ‘*’ ‘)’ ``` ### Type and Value Parameters -```ebnf -ClsTypeParamClause::= ‘[’ ClsTypeParam {‘,’ ClsTypeParam} ‘]’ ; -ClsTypeParam ::= {Annotation} [‘+’ | ‘-’] id [HkTypeParamClause] TypeParamBounds ; +``` +ClsTypeParamClause::= ‘[’ ClsTypeParam {‘,’ ClsTypeParam} ‘]’ +ClsTypeParam ::= {Annotation} [‘+’ | ‘-’] id [HkTypeParamClause] TypeParamBounds -DefTypeParamClause::= ‘[’ DefTypeParam {‘,’ DefTypeParam} ‘]’ ; -DefTypeParam ::= {Annotation} id [HkTypeParamClause] TypeParamBounds ; +DefTypeParamClause::= ‘[’ DefTypeParam {‘,’ DefTypeParam} ‘]’ +DefTypeParam ::= {Annotation} id [HkTypeParamClause] TypeParamBounds -TypTypeParamClause::= ‘[’ TypTypeParam {‘,’ TypTypeParam} ‘]’ ; -TypTypeParam ::= {Annotation} id [HkTypeParamClause] TypeBounds ; +TypTypeParamClause::= ‘[’ TypTypeParam {‘,’ TypTypeParam} ‘]’ +TypTypeParam ::= 
{Annotation} id [HkTypeParamClause] TypeBounds -HkTypeParamClause ::= ‘[’ HkTypeParam {‘,’ HkTypeParam} ‘]’ ; -HkTypeParam ::= {Annotation} [‘+’ | ‘-’] (id [HkTypeParamClause] | ‘_’) TypeBounds ; +HkTypeParamClause ::= ‘[’ HkTypeParam {‘,’ HkTypeParam} ‘]’ +HkTypeParam ::= {Annotation} [‘+’ | ‘-’] (id [HkTypeParamClause] | ‘_’) TypeBounds -ClsParamClauses ::= {ClsParamClause} [[nl] ‘(’ [‘implicit’] ClsParams ‘)’] ; +ClsParamClauses ::= {ClsParamClause} [[nl] ‘(’ [‘implicit’] ClsParams ‘)’] ClsParamClause ::= [nl] ‘(’ ClsParams ‘)’ - | [nl] ‘(’ ‘using’ (ClsParams | FunArgTypes) ‘)’ ; -ClsParams ::= ClsParam {‘,’ ClsParam} ; -ClsParam ::= {Annotation} [{Modifier} (‘val’ | ‘var’) | ‘inline’] Param ; -Param ::= id ‘:’ ParamType [‘=’ Expr] ; - -DefParamClauses ::= {DefParamClause} [[nl] ‘(’ [‘implicit’] DefParams ‘)’] ; -DefParamClause ::= [nl] ‘(’ DefParams ‘)’ | UsingParamClause ; -UsingParamClause ::= [nl] ‘(’ ‘using’ (DefParams | FunArgTypes) ‘)’ ; -DefParams ::= DefParam {‘,’ DefParam} ; -DefParam ::= {Annotation} [‘inline’] Param ; + | [nl] ‘(’ ‘using’ (ClsParams | FunArgTypes) ‘)’ +ClsParams ::= ClsParam {‘,’ ClsParam} +ClsParam ::= {Annotation} [{Modifier} (‘val’ | ‘var’) | ‘inline’] Param +Param ::= id ‘:’ ParamType [‘=’ Expr] + +DefParamClauses ::= {DefParamClause} [[nl] ‘(’ [‘implicit’] DefParams ‘)’] +DefParamClause ::= [nl] ‘(’ DefParams ‘)’ | UsingParamClause +UsingParamClause ::= [nl] ‘(’ ‘using’ (DefParams | FunArgTypes) ‘)’ +DefParams ::= DefParam {‘,’ DefParam} +DefParam ::= {Annotation} [‘inline’] Param ``` ### Bindings and Imports -```ebnf -Bindings ::= ‘(’ [Binding {‘,’ Binding}] ‘)’ ; -Binding ::= (id | ‘_’) [‘:’ Type] ; +``` +Bindings ::= ‘(’ [Binding {‘,’ Binding}] ‘)’ +Binding ::= (id | ‘_’) [‘:’ Type] Modifier ::= LocalModifier | AccessModifier | ‘override’ - | ‘opaque’ ; + | ‘opaque’ LocalModifier ::= ‘abstract’ | ‘final’ | ‘sealed’ | ‘open’ | ‘implicit’ | ‘lazy’ - | ‘inline’ ; -AccessModifier ::= (‘private’ | ‘protected’) [AccessQualifier] ; -AccessQualifier ::= ‘[’ id ‘]’ ; + | ‘inline’ +AccessModifier ::= (‘private’ | ‘protected’) [AccessQualifier] +AccessQualifier ::= ‘[’ id ‘]’ -Annotation ::= ‘@’ SimpleType1 {ParArgumentExprs} ; +Annotation ::= ‘@’ SimpleType1 {ParArgumentExprs} -Import ::= ‘import’ ImportExpr {‘,’ ImportExpr} ; -Export ::= ‘export’ ImportExpr {‘,’ ImportExpr} ; +Import ::= ‘import’ ImportExpr {‘,’ ImportExpr} +Export ::= ‘export’ ImportExpr {‘,’ ImportExpr} ImportExpr ::= SimpleRef {‘.’ id} ‘.’ ImportSpec - | SimpleRef ‘as’ id ; + | SimpleRef ‘as’ id ImportSpec ::= NamedSelector | WildcardSelector - | ‘{’ ImportSelectors) ‘}’ ; -NamedSelector ::= id [‘as’ (id | ‘_’)] ; -WildCardSelector ::= ‘*' | ‘given’ [InfixType] ; + | ‘{’ ImportSelectors) ‘}’ +NamedSelector ::= id [‘as’ (id | ‘_’)] +WildCardSelector ::= ‘*' | ‘given’ [InfixType] ImportSelectors ::= NamedSelector [‘,’ ImportSelectors] - | WildCardSelector {‘,’ WildCardSelector} ; + | WildCardSelector {‘,’ WildCardSelector} -EndMarker ::= ‘end’ EndMarkerTag -- when followed by EOL ; +EndMarker ::= ‘end’ EndMarkerTag -- when followed by EOL EndMarkerTag ::= id | ‘if’ | ‘while’ | ‘for’ | ‘match’ | ‘try’ - | ‘new’ | ‘this’ | ‘given’ | ‘extension’ | ‘val’ ; + | ‘new’ | ‘this’ | ‘given’ | ‘extension’ | ‘val’ ``` ### Declarations and Definitions -```ebnf +``` RefineDcl ::= ‘val’ ValDcl | ‘def’ DefDcl - | ‘type’ {nl} TypeDcl ; + | ‘type’ {nl} TypeDcl Dcl ::= RefineDcl - | ‘var’ VarDcl ; -ValDcl ::= ids ‘:’ Type ; -VarDcl ::= ids ‘:’ Type ; -DefDcl ::= DefSig ‘:’ Type ; -DefSig ::= id 
[DefTypeParamClause] DefParamClauses ; -TypeDcl ::= id [TypeParamClause] {FunParamClause} TypeBounds [‘=’ Type] ; + | ‘var’ VarDcl +ValDcl ::= ids ‘:’ Type +VarDcl ::= ids ‘:’ Type +DefDcl ::= DefSig ‘:’ Type +DefSig ::= id [DefTypeParamClause] DefParamClauses +TypeDcl ::= id [TypeParamClause] {FunParamClause} TypeBounds [‘=’ Type] Def ::= ‘val’ PatDef | ‘var’ PatDef | ‘def’ DefDef | ‘type’ {nl} TypeDcl - | TmplDef ; + | TmplDef PatDef ::= ids [‘:’ Type] ‘=’ Expr - | Pattern2 [‘:’ Type] ‘=’ Expr ; + | Pattern2 [‘:’ Type] ‘=’ Expr DefDef ::= DefSig [‘:’ Type] ‘=’ Expr - | ‘this’ DefParamClause DefParamClauses ‘=’ ConstrExpr ; + | ‘this’ DefParamClause DefParamClauses ‘=’ ConstrExpr TmplDef ::= ([‘case’] ‘class’ | ‘trait’) ClassDef | [‘case’] ‘object’ ObjectDef | ‘enum’ EnumDef - | ‘given’ GivenDef ; -ClassDef ::= id ClassConstr [Template] ; -ClassConstr ::= [ClsTypeParamClause] [ConstrMods] ClsParamClauses ; -ConstrMods ::= {Annotation} [AccessModifier] ; -ObjectDef ::= id [Template] ; -EnumDef ::= id ClassConstr InheritClauses EnumBody ; -GivenDef ::= [GivenSig] (AnnotType [‘=’ Expr] | StructuralInstance) ; -GivenSig ::= [id] [DefTypeParamClause] {UsingParamClause} ‘:’ -- one of `id`, `DefParamClause`, `UsingParamClause` must be present ; -StructuralInstance ::= ConstrApp {‘with’ ConstrApp} [‘with’ TemplateBody] ; + | ‘given’ GivenDef +ClassDef ::= id ClassConstr [Template] +ClassConstr ::= [ClsTypeParamClause] [ConstrMods] ClsParamClauses +ConstrMods ::= {Annotation} [AccessModifier] +ObjectDef ::= id [Template] +EnumDef ::= id ClassConstr InheritClauses EnumBody +GivenDef ::= [GivenSig] (AnnotType [‘=’ Expr] | StructuralInstance) +GivenSig ::= [id] [DefTypeParamClause] {UsingParamClause} ‘:’ -- one of `id`, `DefParamClause`, `UsingParamClause` must be present +StructuralInstance ::= ConstrApp {‘with’ ConstrApp} [‘with’ WithTemplateBody] Extension ::= ‘extension’ [DefTypeParamClause] {UsingParamClause} - ‘(’ DefParam ‘)’ {UsingParamClause} ExtMethods ; -ExtMethods ::= ExtMethod | [nl] <<< ExtMethod {semi ExtMethod} >>> ; -ExtMethod ::= {Annotation [nl]} {Modifier} ‘def’ DefDef ; -Template ::= InheritClauses [TemplateBody] ; -InheritClauses ::= [‘extends’ ConstrApps] [‘derives’ QualId {‘,’ QualId}] ; -ConstrApps ::= ConstrApp ({‘,’ ConstrApp} | {‘with’ ConstrApp}) ; -ConstrApp ::= SimpleType1 {Annotation} {ParArgumentExprs} ; + ‘(’ DefParam ‘)’ {UsingParamClause} ExtMethods +ExtMethods ::= ExtMethod | [nl] <<< ExtMethod {semi ExtMethod} >>> +ExtMethod ::= {Annotation [nl]} {Modifier} ‘def’ DefDef + | Export +Template ::= InheritClauses [TemplateBody] +InheritClauses ::= [‘extends’ ConstrApps] [‘derives’ QualId {‘,’ QualId}] +ConstrApps ::= ConstrApp ({‘,’ ConstrApp} | {‘with’ ConstrApp}) +ConstrApp ::= SimpleType1 {Annotation} {ParArgumentExprs} ConstrExpr ::= SelfInvocation - | <<< SelfInvocation {semi BlockStat} >>> ; -SelfInvocation ::= ‘this’ ArgumentExprs {ArgumentExprs} ; + | <<< SelfInvocation {semi BlockStat} >>> +SelfInvocation ::= ‘this’ ArgumentExprs {ArgumentExprs} -TemplateBody ::= :<<< [SelfType] TemplateStat {semi TemplateStat} >>> ; +WithTemplateBody ::= <<< [SelfType] TemplateStat {semi TemplateStat} >>> +TemplateBody ::= :<<< [SelfType] TemplateStat {semi TemplateStat} >>> TemplateStat ::= Import | Export | {Annotation [nl]} {Modifier} Def @@ -426,16 +446,16 @@ TemplateStat ::= Import | Extension | Expr1 | EndMarker - | ; + | SelfType ::= id [‘:’ InfixType] ‘=>’ - | ‘this’ ‘:’ InfixType ‘=>’ ; + | ‘this’ ‘:’ InfixType ‘=>’ -EnumBody ::= :<<< [SelfType] EnumStat {semi 
EnumStat} >>> ; +EnumBody ::= :<<< [SelfType] EnumStat {semi EnumStat} >>> EnumStat ::= TemplateStat - | {Annotation [nl]} {Modifier} EnumCase ; -EnumCase ::= ‘case’ (id ClassConstr [‘extends’ ConstrApps]] | ids) ; + | {Annotation [nl]} {Modifier} EnumCase +EnumCase ::= ‘case’ (id ClassConstr [‘extends’ ConstrApps]] | ids) -TopStats ::= TopStat {semi TopStat} ; +TopStats ::= TopStat {semi TopStat} TopStat ::= Import | Export | {Annotation [nl]} {Modifier} Def @@ -443,9 +463,9 @@ TopStat ::= Import | Packaging | PackageObject | EndMarker - | ; -Packaging ::= ‘package’ QualId :<<< TopStats >>> ; -PackageObject ::= ‘package’ ‘object’ ObjectDef ; + | +Packaging ::= ‘package’ QualId :<<< TopStats >>> +PackageObject ::= ‘package’ ‘object’ ObjectDef -CompilationUnit ::= {‘package’ QualId semi} TopStats ; +CompilationUnit ::= {‘package’ QualId semi} TopStats ``` diff --git a/project/resources/referenceReplacements/sidebar.yml b/project/resources/referenceReplacements/sidebar.yml index 06507028c723..be67a0d6da99 100644 --- a/project/resources/referenceReplacements/sidebar.yml +++ b/project/resources/referenceReplacements/sidebar.yml @@ -135,6 +135,7 @@ subsection: index: reference/experimental/overview.md subsection: - page: reference/experimental/fewer-braces.md + hidden: true - page: reference/experimental/canthrow.md - page: reference/experimental/erased-defs.md - page: reference/experimental/erased-defs-spec.md From 383d8da5152ddc85b62565e2a91a339824d5a1f4 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Fri, 7 Oct 2022 17:46:17 +0200 Subject: [PATCH 02/12] Add small improvements to the Reference page --- .../changed-features/compiler-plugins.md | 12 ++++---- .../changed-features/eta-expansion-spec.md | 4 +-- .../changed-features/main-functions.md | 2 +- .../contextual/by-name-context-parameters.md | 2 +- docs/_docs/reference/contextual/givens.md | 9 +++--- .../contextual/multiversal-equality.md | 7 +++-- .../reference/contextual/type-classes.md | 4 +-- .../dropped-features/delayed-init.md | 2 +- docs/_docs/reference/experimental/canthrow.md | 2 +- docs/_docs/reference/experimental/cc.md | 2 +- .../reference/experimental/explicit-nulls.md | 2 +- .../experimental/numeric-literals.md | 2 +- .../language-versions/source-compatibility.md | 2 +- .../metaprogramming/compiletime-ops.md | 22 +++++++------- .../metaprogramming/metaprogramming.md | 2 +- .../reference/new-types/type-lambdas-spec.md | 4 +-- .../reference/other-new-features/export.md | 8 ++--- .../other-new-features/kind-polymorphism.md | 2 +- .../other-new-features/opaques-details.md | 2 +- .../parameter-untupling-spec.md | 30 ------------------- .../other-new-features/parameter-untupling.md | 7 +++-- .../other-new-features/targetName.md | 2 +- 22 files changed, 53 insertions(+), 78 deletions(-) diff --git a/docs/_docs/reference/changed-features/compiler-plugins.md b/docs/_docs/reference/changed-features/compiler-plugins.md index 2a446e9cfb84..c2c171dbb2fe 100644 --- a/docs/_docs/reference/changed-features/compiler-plugins.md +++ b/docs/_docs/reference/changed-features/compiler-plugins.md @@ -4,18 +4,18 @@ title: "Changes in Compiler Plugins" movedTo: https://docs.scala-lang.org/scala3/reference/changed-features/compiler-plugins.html --- -Compiler plugins are supported by Dotty (and Scala 3) since 0.9. There are two notable changes -compared to `scalac`: +Compiler plugins are supported in Scala 3 since Dotty 0.9. 
There are two notable changes +compared to Scala 2: - No support for analyzer plugins - Added support for research plugins -[Analyzer plugins][1] in `scalac` run during type checking and may influence +[Analyzer plugins][1] run in Scala 2 during type checking and may influence normal type checking. This is a very powerful feature but for production usages, a predictable and consistent type checker is more important. For experimentation and research, Scala 3 introduces _research plugin_. Research plugins -are more powerful than `scalac` analyzer plugins as they let plugin authors customize +are more powerful than Scala 2 analyzer plugins as they let plugin authors customize the whole compiler pipeline. One can easily replace the standard typer by a custom one or create a parser for a domain-specific language. However, research plugins are only enabled for nightly or snaphot releases of Scala 3. @@ -26,7 +26,7 @@ _standard plugins_ in Scala 3. In terms of features, they are similar to ## Using Compiler Plugins -Both standard and research plugins can be used with `scalac` by adding the `-Xplugin:` option: +In Scala 3, both standard and research plugins can be used with `scalac` by adding the `-Xplugin:` option: ```shell scalac -Xplugin:pluginA.jar -Xplugin:pluginB.jar Test.scala @@ -40,7 +40,7 @@ the fully qualified plugin class name. The format of a property file is as follo pluginClass=dividezero.DivideZero ``` -This is different from `scalac` plugins that required a `scalac-plugin.xml` file. +This is different from Scala 2 plugins that require a `scalac-plugin.xml` file. Starting from 1.1.5, `sbt` also supports Scala 3 compiler plugins. Please refer to the [`sbt` documentation][2] for more information. diff --git a/docs/_docs/reference/changed-features/eta-expansion-spec.md b/docs/_docs/reference/changed-features/eta-expansion-spec.md index 67f7606d5b15..07f3c8b7bcc0 100644 --- a/docs/_docs/reference/changed-features/eta-expansion-spec.md +++ b/docs/_docs/reference/changed-features/eta-expansion-spec.md @@ -51,7 +51,7 @@ implicit val bla: Double = 1.0 val bar = foo // val bar: Int => Float = ... ``` -## Automatic Eta-Expansion and query types +## Automatic Eta-Expansion and context types A method with context parameters can be expanded to a value of a context type by writing the expected context type explicitly. @@ -66,7 +66,7 @@ val bar: Double ?=> Float = foo(3) - If `m` is has an empty argument list (i.e. has type `()R`): 1. If the expected type is of the form `() => T`, we eta expand. 2. If m is defined by Java, or overrides a Java defined method, we insert `()`. - 3. Otherwise we issue an error of the form: + 3. Otherwise we issue an error of the form: `method must be called with () argument` Thus, an unapplied method with an empty argument list is only converted to a function when a function type is expected. It is considered best practice to either explicitly apply the method to `()`, or convert it to a function with `() => m()`. 
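For illustration, a minimal sketch of how these rules for empty-argument-list methods play out (the method name `m` and the values are made up):

```scala
def m(): Int = 42

val f: () => Int = m   // rule 1: the expected type () => Int triggers eta-expansion
val n: Int = m()       // best practice: apply the method explicitly
val g = () => m()      // best practice: convert explicitly to a function
// val bad: Int = m    // rule 3: rejected ("method must be called with () argument")
```
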
diff --git a/docs/_docs/reference/changed-features/main-functions.md b/docs/_docs/reference/changed-features/main-functions.md index 2e82e70cff3d..5d366239acae 100644 --- a/docs/_docs/reference/changed-features/main-functions.md +++ b/docs/_docs/reference/changed-features/main-functions.md @@ -72,7 +72,7 @@ final class happyBirthday: case error: CLP.ParseError => CLP.showError(error) ``` -**Note**: The `` modifier above expresses that the `main` method is generated +**Note:** The `` modifier above expresses that the `main` method is generated as a static method of class `happyBirthday`. It is not available for user programs in Scala. Regular "static" members are generated in Scala using objects instead. [`@main`](https://scala-lang.org/api/3.x/scala/main.html) methods are the recommended scheme to generate programs that can be invoked from the command line in Scala 3. They replace the previous scheme to write program as objects with a special `App` parent class. In Scala 2, `happyBirthday` could be written also like this: diff --git a/docs/_docs/reference/contextual/by-name-context-parameters.md b/docs/_docs/reference/contextual/by-name-context-parameters.md index 8e8427f7457f..f92414678c27 100644 --- a/docs/_docs/reference/contextual/by-name-context-parameters.md +++ b/docs/_docs/reference/contextual/by-name-context-parameters.md @@ -53,7 +53,7 @@ In the example above, the definition of `s` would be expanded as follows. ```scala val s = summon[Test.Codec[Option[Int]]]( - optionCodec[Int](using intCodec) + using optionCodec[Int](using intCodec) ) ``` diff --git a/docs/_docs/reference/contextual/givens.md b/docs/_docs/reference/contextual/givens.md index 1d0baae9e257..d6273bae0e60 100644 --- a/docs/_docs/reference/contextual/givens.md +++ b/docs/_docs/reference/contextual/givens.md @@ -10,8 +10,9 @@ that serve for synthesizing arguments to [context parameters](./using-clauses.md ```scala trait Ord[T]: def compare(x: T, y: T): Int - extension (x: T) def < (y: T) = compare(x, y) < 0 - extension (x: T) def > (y: T) = compare(x, y) > 0 + extension (x: T) + def < (y: T) = compare(x, y) < 0 + def > (y: T) = compare(x, y) > 0 given intOrd: Ord[Int] with def compare(x: Int, y: Int) = @@ -51,7 +52,7 @@ given [T](using Ord[T]): Ord[List[T]] with If the name of a given is missing, the compiler will synthesize a name from the implemented type(s). -**Note** The name synthesized by the compiler is chosen to be readable and reasonably concise. For instance, the two instances above would get the names: +**Note:** The name synthesized by the compiler is chosen to be readable and reasonably concise. For instance, the two instances above would get the names: ```scala given_Ord_Int @@ -62,7 +63,7 @@ The precise rules for synthesizing names are found [here](./relationship-implici given instances of types that are "too similar". To avoid conflicts one can use named instances. -**Note** To ensure robust binary compatibility, publicly available libraries should prefer named instances. +**Note:** To ensure robust binary compatibility, publicly available libraries should prefer named instances. ## Alias Givens diff --git a/docs/_docs/reference/contextual/multiversal-equality.md b/docs/_docs/reference/contextual/multiversal-equality.md index f01b64d2e444..25cf57b6dc04 100644 --- a/docs/_docs/reference/contextual/multiversal-equality.md +++ b/docs/_docs/reference/contextual/multiversal-equality.md @@ -33,6 +33,7 @@ that derives `CanEqual`, e.g. 
```scala class T derives CanEqual ``` +> Normally a [derives clause](./derivation.md) accepts only type classes with one parameter, however there is a special case for `CanEqual`. Alternatively, one can also provide a `CanEqual` given instance directly, like this: @@ -82,7 +83,7 @@ def canEqualAny[L, R]: CanEqual[L, R] = CanEqual.derived ``` Even though `canEqualAny` is not declared as `given`, the compiler will still -construct an `canEqualAny` instance as answer to an implicit search for the +construct a `canEqualAny` instance as answer to an implicit search for the type `CanEqual[L, R]`, unless `L` or `R` have `CanEqual` instances defined on them, or the language feature `strictEquality` is enabled. @@ -156,10 +157,10 @@ Instances are defined so that every one of these types has a _reflexive_ `CanEqu - Primitive numeric types can be compared with subtypes of `java.lang.Number` (and _vice versa_). - `Boolean` can be compared with `java.lang.Boolean` (and _vice versa_). - `Char` can be compared with `java.lang.Character` (and _vice versa_). - - Two sequences (of arbitrary subtypes of `scala.collection.Seq`) can be compared + - Two sequences (arbitrary subtypes of `scala.collection.Seq`) can be compared with each other if their element types can be compared. The two sequence types need not be the same. - - Two sets (of arbitrary subtypes of `scala.collection.Set`) can be compared + - Two sets (arbitrary subtypes of `scala.collection.Set`) can be compared with each other if their element types can be compared. The two set types need not be the same. - Any subtype of `AnyRef` can be compared with `Null` (and _vice versa_). diff --git a/docs/_docs/reference/contextual/type-classes.md b/docs/_docs/reference/contextual/type-classes.md index c4648b3baf28..d1ff02a0bf70 100644 --- a/docs/_docs/reference/contextual/type-classes.md +++ b/docs/_docs/reference/contextual/type-classes.md @@ -82,7 +82,7 @@ given Functor[List] with x.map(f) // List already has a `map` method ``` -With this `given` instance in scope, everywhere a `Functor` is expected, the compiler will accept a `List` to be used. +With this `given` instance in scope, everywhere a type with a `Functor` context bound is expected, the compiler will accept a `List` to be used. For instance, we may write such a testing method: @@ -214,7 +214,7 @@ instead of show(compute(i)(config))(config) ``` -Let's define this m then. First, we are going to define a type named `ConfigDependent` representing a function that when passed a `Config` produces a `Result`. +Let's define this `flatMap` then. First, we are going to define a type named `ConfigDependent` representing a function that when passed a `Config` produces a `Result`. ```scala type ConfigDependent[Result] = Config => Result diff --git a/docs/_docs/reference/dropped-features/delayed-init.md b/docs/_docs/reference/dropped-features/delayed-init.md index ab48de388569..8112c4d76c4f 100644 --- a/docs/_docs/reference/dropped-features/delayed-init.md +++ b/docs/_docs/reference/dropped-features/delayed-init.md @@ -18,7 +18,7 @@ object HelloWorld extends App { ``` However, the code is now run in the initializer of the object, which on -some JVM's means that it will only be interpreted. So, better not use it +some JVMs means that it will only be interpreted. So, better not use it for benchmarking! Also, if you want to access the command line arguments, you need to use an explicit `main` method for that. 
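As a sketch of the alternatives alluded to above (the names are illustrative only), one can write either an explicit `main` method, which also gives access to the command-line arguments, or a Scala 3 `@main` method:

```scala
// Explicit main method instead of `extends App`:
object HelloWorld:
  def main(args: Array[String]): Unit =
    println(s"Hello, ${args.headOption.getOrElse("world")}!")

// Or, in Scala 3, an @main method:
@main def hello(args: String*): Unit =
  println(s"Hello, ${args.headOption.getOrElse("world")}!")
```
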
diff --git a/docs/_docs/reference/experimental/canthrow.md b/docs/_docs/reference/experimental/canthrow.md index 222bc63b6739..1eabfb22ff44 100644 --- a/docs/_docs/reference/experimental/canthrow.md +++ b/docs/_docs/reference/experimental/canthrow.md @@ -124,7 +124,7 @@ try body catch ... ``` -Note that the right-hand side of the synthesized given is `???` (undefined). This is OK since +Note that the right-hand side of the synthesized given is `compiletime.erasedValue`. This is OK since this given is erased; it will not be executed at runtime. **Note 1:** The [`saferExceptions`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$experimental$$saferExceptions$.html) feature is designed to work only with checked exceptions. An exception type is _checked_ if it is a subtype of diff --git a/docs/_docs/reference/experimental/cc.md b/docs/_docs/reference/experimental/cc.md index 592d410a4502..81cff3d2ab2d 100644 --- a/docs/_docs/reference/experimental/cc.md +++ b/docs/_docs/reference/experimental/cc.md @@ -177,7 +177,7 @@ def f(x: {c}-> Int): Int ``` Here, the actual argument to `f` is allowed to use the `c` capability but no others. -**Note**: It is strongly recommended to write the capability set and the arrow `->` without intervening spaces, +**Note:** It is strongly recommended to write the capability set and the arrow `->` without intervening spaces, as otherwise the notation would look confusingly like a function type. ## Subtyping and Subcapturing diff --git a/docs/_docs/reference/experimental/explicit-nulls.md b/docs/_docs/reference/experimental/explicit-nulls.md index 2b5ffe3559c6..e5d5d94b38cc 100644 --- a/docs/_docs/reference/experimental/explicit-nulls.md +++ b/docs/_docs/reference/experimental/explicit-nulls.md @@ -540,4 +540,4 @@ Our strategy for binary compatibility with Scala binaries that predate explicit and new libraries compiled without `-Yexplicit-nulls` is to leave the types unchanged and be compatible but unsound. -[More details](https://dotty.epfl.ch/docs/internals/explicit-nulls.html) +[Implementation details](https://dotty.epfl.ch/docs/internals/explicit-nulls.html) diff --git a/docs/_docs/reference/experimental/numeric-literals.md b/docs/_docs/reference/experimental/numeric-literals.md index e8d4f5309c1e..170e53408990 100644 --- a/docs/_docs/reference/experimental/numeric-literals.md +++ b/docs/_docs/reference/experimental/numeric-literals.md @@ -4,7 +4,7 @@ title: "Numeric Literals" movedTo: https://docs.scala-lang.org/scala3/reference/experimental/numeric-literals.html --- -**Note**: This feature is not yet part of the Scala 3 language definition. It can be made available by a language import: +This feature is not yet part of the Scala 3 language definition. It can be made available by a language import: ```scala import scala.language.experimental.genericNumberLiterals diff --git a/docs/_docs/reference/language-versions/source-compatibility.md b/docs/_docs/reference/language-versions/source-compatibility.md index 57bc15d11d88..077f06b2b4db 100644 --- a/docs/_docs/reference/language-versions/source-compatibility.md +++ b/docs/_docs/reference/language-versions/source-compatibility.md @@ -40,4 +40,4 @@ class C { ... } Language imports supersede command-line settings in the source files where they are specified. Only one language import specifying a source version is allowed in a source file, and it must come before any definitions in that file. 
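As an illustrative sketch (the exact spelling of the `-source` command-line flag is an assumption; the in-file import is the form shown above), a language import supersedes the command-line setting for that file only:

```scala
// Compiled with, e.g., `scalac -source:3.0 A.scala B.scala` (flag spelling assumed).

// A.scala — checked under the command-line setting (3.0):
class A

// B.scala — the import supersedes the command-line setting for this file:
import scala.language.`3.0-migration`
class B
```
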
-**Note**: The [Scala 3 Migration Guide](https://docs.scala-lang.org/scala3/guides/migration/compatibility-intro.html) gives further information to help the Scala programmer moving from Scala 2.13 to Scala 3. +**Note:** The [Scala 3 Migration Guide](https://docs.scala-lang.org/scala3/guides/migration/compatibility-intro.html) gives further information to help the Scala programmer moving from Scala 2.13 to Scala 3. diff --git a/docs/_docs/reference/metaprogramming/compiletime-ops.md b/docs/_docs/reference/metaprogramming/compiletime-ops.md index d101ae0c5c00..cb0874d5f8e4 100644 --- a/docs/_docs/reference/metaprogramming/compiletime-ops.md +++ b/docs/_docs/reference/metaprogramming/compiletime-ops.md @@ -11,7 +11,7 @@ The [`scala.compiletime`](https://scala-lang.org/api/3.x/scala/compiletime.html) ### `constValue` and `constValueOpt` `constValue` is a function that produces the constant value represented by a -type. +type, or a compile time error if the type is not a constant type. ```scala import scala.compiletime.constValue @@ -30,6 +30,8 @@ enabling us to handle situations where a value is not present. Note that `S` is the type of the successor of some singleton type. For example the type `S[1]` is the singleton type `2`. +Since tuples are not constant types, even if their constituants are, there is `constValueTuple`, which given a tuple type `(X1, ..., Xn)`, returns a tuple value `(constValue[X1], ..., constValue[Xn])`. + ### `erasedValue` So far we have seen inline methods that take terms (tuples and integers) as @@ -170,7 +172,7 @@ val concat: "a" + "b" = "ab" val addition: 1 + 1 = 2 ``` -## Summoning Implicits Selectively +## Summoning Givens Selectively It is foreseen that many areas of typelevel programming can be done with rewrite methods instead of implicits. But sometimes implicits are unavoidable. The @@ -178,16 +180,16 @@ problem so far was that the Prolog-like programming style of implicit search becomes viral: Once some construct depends on implicit search it has to be written as a logic program itself. Consider for instance the problem of creating a `TreeSet[T]` or a `HashSet[T]` depending on whether `T` has an `Ordering` or -not. We can create a set of implicit definitions like this: +not. We can create a set of given instances like this: ```scala trait SetFor[T, S <: Set[T]] class LowPriority: - implicit def hashSetFor[T]: SetFor[T, HashSet[T]] = ... + given hashSetFor[T]: SetFor[T, HashSet[T]] = ... object SetsFor extends LowPriority: - implicit def treeSetFor[T: Ordering]: SetFor[T, TreeSet[T]] = ... + given treeSetFor[T: Ordering]: SetFor[T, TreeSet[T]] = ... ``` Clearly, this is not pretty. Besides all the usual indirection of implicit @@ -236,18 +238,18 @@ inline def setFor[T]: Set[T] = summonFrom { `summonFrom` applications must be reduced at compile time. -Consequently, if we summon an `Ordering[String]` the code above will return a -new instance of `TreeSet[String]`. +Consequently, if a given instance of `Ordering[String]` is in the implicit scope, the code above will return a +new instance of `TreeSet[String]`. Such an instance is defined in `Ordering`'s companion object, so there will always be one. ```scala -summon[Ordering[String]] +summon[Ordering[String]] // Proves that an Ordering[String] is in scope println(setFor[String].getClass) // prints class scala.collection.immutable.TreeSet ``` -**Note** `summonFrom` applications can raise ambiguity errors. Consider the following +**Note:** `summonFrom` applications can raise ambiguity errors. 
Consider the following code with two givens in scope of type `A`. The pattern match in `f` will raise -an ambiguity error of `f` is applied. +an ambiguity error if `f` is applied. ```scala class A diff --git a/docs/_docs/reference/metaprogramming/metaprogramming.md b/docs/_docs/reference/metaprogramming/metaprogramming.md index 6b682d3ce237..e213b5fcbbe5 100644 --- a/docs/_docs/reference/metaprogramming/metaprogramming.md +++ b/docs/_docs/reference/metaprogramming/metaprogramming.md @@ -39,7 +39,7 @@ introduce the following fundamental facilities: representation of code. They can be parameterized and composed using splices, but their structure cannot be analyzed from the outside. TASTy reflection gives a way to analyze code structure by partly revealing the representation type of a piece of code in a standard API. The representation - type is a form of typed abstract syntax tree, which gives rise to the `TASTy` + type is a form of **t**yped **a**bstract **s**yntax **t**ree, which gives rise to the `TASTy` moniker. 6. [TASTy Inspection](./tasty-inspect.md) Typed abstract syntax trees are serialized diff --git a/docs/_docs/reference/new-types/type-lambdas-spec.md b/docs/_docs/reference/new-types/type-lambdas-spec.md index 5c791ba40272..d15d2d4fab25 100644 --- a/docs/_docs/reference/new-types/type-lambdas-spec.md +++ b/docs/_docs/reference/new-types/type-lambdas-spec.md @@ -103,9 +103,9 @@ type O2[X] = List[X] ``` would be treated as covariant, `X` is used covariantly on its right-hand side. -**Note**: The decision to treat `Nothing` as universal bottom type is provisional, and might be changed after further discussion. +**Note:** The decision to treat `Nothing` as universal bottom type is provisional, and might be changed after further discussion. -**Note**: Scala 2 and 3 differ in that Scala 2 also treats `Any` as universal top-type. This is not done in Scala 3. See also the discussion on [kind polymorphism](../other-new-features/kind-polymorphism.md) +**Note:** Scala 2 and 3 differ in that Scala 2 also treats `Any` as universal top-type. This is not done in Scala 3. See also the discussion on [kind polymorphism](../other-new-features/kind-polymorphism.md) ## Curried Type Parameters diff --git a/docs/_docs/reference/other-new-features/export.md b/docs/_docs/reference/other-new-features/export.md index ded55738b919..bdbdc42dd0b6 100644 --- a/docs/_docs/reference/other-new-features/export.md +++ b/docs/_docs/reference/other-new-features/export.md @@ -201,16 +201,16 @@ Consider the following example: ```scala class B { val c: Int } -object a { val b = new B } -export a.* +object A { val b = new B } +export A.* export b.* ``` -Is the `export b.*` clause legal? If yes, what does it export? Is it equivalent to `export a.b.*`? What about if we swap the last two clauses? +Is the `export b.*` clause legal? If yes, what does it export? Is it equivalent to `export A.b.*`? What about if we swap the last two clauses? ``` export b.* -export a.* +export A.* ``` To avoid tricky questions like these, we fix the elaboration order of exports as follows. 
diff --git a/docs/_docs/reference/other-new-features/kind-polymorphism.md b/docs/_docs/reference/other-new-features/kind-polymorphism.md index 057e9de9d55d..d4ff198c1ec7 100644 --- a/docs/_docs/reference/other-new-features/kind-polymorphism.md +++ b/docs/_docs/reference/other-new-features/kind-polymorphism.md @@ -43,5 +43,5 @@ It is declared `abstract` and `final`, so it can be neither instantiated nor ext `AnyKind` plays a special role in Scala's subtype system: It is a supertype of all other types no matter what their kind is. It is also assumed to be kind-compatible with all other types. Furthermore, `AnyKind` is treated as a higher-kinded type (so it cannot be used as a type of values), but at the same time it has no type parameters (so it cannot be instantiated). -**Note**: This feature is considered experimental but stable and it can be disabled under compiler flag +**Note:** This feature is considered experimental but stable and it can be disabled under compiler flag (i.e. `-Yno-kind-polymorphism`). diff --git a/docs/_docs/reference/other-new-features/opaques-details.md b/docs/_docs/reference/other-new-features/opaques-details.md index 83608ca78dd3..e58eae74811a 100644 --- a/docs/_docs/reference/other-new-features/opaques-details.md +++ b/docs/_docs/reference/other-new-features/opaques-details.md @@ -64,7 +64,7 @@ opaque type G = [T] =>> List[T] but the following are not: ```scala opaque type BadF[T] = [U] =>> (T, U) -opaque type BadG = [T] =>> [U] => (T, U) +opaque type BadG = [T] =>> [U] =>> (T, U) ``` ## Translation of Equality diff --git a/docs/_docs/reference/other-new-features/parameter-untupling-spec.md b/docs/_docs/reference/other-new-features/parameter-untupling-spec.md index e01e91059a27..0d0a58dccf02 100644 --- a/docs/_docs/reference/other-new-features/parameter-untupling-spec.md +++ b/docs/_docs/reference/other-new-features/parameter-untupling-spec.md @@ -4,37 +4,7 @@ title: "Parameter Untupling - More Details" movedTo: https://docs.scala-lang.org/scala3/reference/other-new-features/parameter-untupling-spec.html --- -## Motivation -Say you have a list of pairs - -```scala -val xs: List[(Int, Int)] -``` - -and you want to map `xs` to a list of `Int`s so that each pair of numbers is mapped to their sum. -Previously, the best way to do this was with a pattern-matching decomposition: - -```scala -xs.map { - case (x, y) => x + y -} -``` -While correct, this is inconvenient. Instead, we propose to write it the following way: - -```scala -xs.map { - (x, y) => x + y -} -``` - -or, equivalently: - -```scala -xs.map(_ + _) -``` - -Generally, a function value with `n > 1` parameters can be converted to a function with tupled arguments if the expected type is a unary function type of the form `((T_1, ..., T_n)) => U`. 
## Type Checking diff --git a/docs/_docs/reference/other-new-features/parameter-untupling.md b/docs/_docs/reference/other-new-features/parameter-untupling.md index 4c0fdb2765e2..e00d2f5df7e8 100644 --- a/docs/_docs/reference/other-new-features/parameter-untupling.md +++ b/docs/_docs/reference/other-new-features/parameter-untupling.md @@ -57,12 +57,13 @@ The function value must be explicitly tupled, rather than the parameters untuple xs.map(combiner.tupled) ``` -A conversion may be provided in user code: +Though strongly discouraged, to have the same effect, an implicit conversion may be provided in user code: ```scala import scala.language.implicitConversions -transparent inline implicit def `fallback untupling`(f: (Int, Int) => Int): ((Int, Int)) => Int = - p => f(p._1, p._2) // use specialized apply instead of unspecialized `tupled` + +transparent inline given `fallback untupling`: Conversion[(Int, Int) => Int, ((Int, Int)) => Int] = _.tupled + xs.map(combiner) ``` diff --git a/docs/_docs/reference/other-new-features/targetName.md b/docs/_docs/reference/other-new-features/targetName.md index 09886968a232..3caec421da92 100644 --- a/docs/_docs/reference/other-new-features/targetName.md +++ b/docs/_docs/reference/other-new-features/targetName.md @@ -93,7 +93,7 @@ The relevant overriding rules can be summarized as follows: - If two members override, then both their erased names and their types must be the same. As usual, any overriding relationship in the generated code must also -be present in the original code. So the following example would also be in error: +be present in the original code. So the following example would also be an error: ```scala import annotation.targetName From 7a458712a3c12f9c877b324819cb145a59bee61e Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 13:26:35 +0200 Subject: [PATCH 03/12] Update pattern-matching.md --- .../changed-features/pattern-matching.md | 156 +++++++++++------- 1 file changed, 95 insertions(+), 61 deletions(-) diff --git a/docs/_docs/reference/changed-features/pattern-matching.md b/docs/_docs/reference/changed-features/pattern-matching.md index a067d19f8ccd..46787f0a6e24 100644 --- a/docs/_docs/reference/changed-features/pattern-matching.md +++ b/docs/_docs/reference/changed-features/pattern-matching.md @@ -13,26 +13,38 @@ Scala 3 supports a superset of Scala 2 [extractors](https://www.scala-lang.org/f Extractors are objects that expose a method `unapply` or `unapplySeq`: ```scala -def unapply[A](x: T)(implicit x: B): U -def unapplySeq[A](x: T)(implicit x: B): U +def unapply(x: T): U +def unapplySeq(x: T): U +``` + +Where `T` is an arbitrary type, if it is a subtype of the scrutinee's type `Scrut`, a [type test](../other-new-features/type-test.md) is performed before calling the method. +`U` follows rules described in [Fixed Arity Extractors](#fixed-arity-extractors) and [Variadic Extractors](#variadic-extractors). + +**Note:** `U` can be the type of the extractor object. + +`unapply` and `unapplySeq` can actually have a more general signature, allowing for a leading type clause, as well as arbitrarily many using clauses, both before and after the regular term clause, and at most one implicit clause at the end, for example: + +```scala +def unapply[A, B](using C)(using D)(x: T)(using E)(using F)(implicit y: G): U = ??? ``` Extractors that expose the method `unapply` are called fixed-arity extractors, which work with patterns of fixed arity. 
Extractors that expose the method `unapplySeq` are called variadic extractors, which enables variadic patterns. -### Fixed-Arity Extractors +## Fixed-Arity Extractors + +Fixed-arity extractors expose the following signature (with potential type, using and implicit clauses): -Fixed-arity extractors expose the following signature: ```scala -def unapply[A](x: T)(implicit x: B): U +def unapply(x: T): U ``` The type `U` conforms to one of the following matches: -- Boolean match -- Product match +- [Boolean match](#boolean-match) +- [Product match](#product-match) Or `U` conforms to the type `R`: @@ -45,53 +57,24 @@ type R = { and `S` conforms to one of the following matches: -- single match -- name-based match +- [single match](#single-match) +- [name-based match](#name-based-match) The former form of `unapply` has higher precedence, and _single match_ has higher precedence over _name-based match_. +**Note:** the `S` in `R` can be `U`. + A usage of a fixed-arity extractor is irrefutable if one of the following condition holds: - `U = true` - the extractor is used as a product match -- `U = Some[T]` (for Scala 2 compatibility) - `U <: R` and `U <: { def isEmpty: false }` +- `U = Some[T]` -### Variadic Extractors - -Variadic extractors expose the following signature: - -```scala -def unapplySeq[A](x: T)(implicit x: B): U -``` - -The type `U` conforms to one of the following matches: - -- sequence match -- product-sequence match - -Or `U` conforms to the type `R`: - -```scala -type R = { - def isEmpty: Boolean - def get: S -} -``` - -and `S` conforms to one of the two matches above. - -The former form of `unapplySeq` has higher priority, and _sequence match_ has higher -precedence over _product-sequence match_. - -A usage of a variadic extractor is irrefutable if one of the following conditions holds: - -- the extractor is used directly as a sequence match or product-sequence match -- `U = Some[T]` (for Scala 2 compatibility) -- `U <: R` and `U <: { def isEmpty: false }` +**Note:** The last rule is necessary because, for compatibility reasons, `isEmpty` on `Some` has return type `Boolean` rather than `false`, even though it always returns `false`. -## Boolean Match +### Boolean Match - `U =:= Boolean` - Pattern-matching on exactly `0` patterns @@ -111,10 +94,10 @@ object Even: // even has an even number of characters ``` -## Product Match +### Product Match - `U <: Product` -- `N > 0` is the maximum number of consecutive (parameterless `def` or `val`) `_1: P1` ... `_N: PN` members in `U` +- `N > 0` is the maximum number of consecutive (`val` or parameterless `def`) `_1: P1` ... `_N: PN` members in `U` - Pattern-matching on exactly `N` patterns with types `P1, P2, ..., PN` For example: @@ -141,9 +124,11 @@ object FirstChars: // First: H; Second: i ``` -## Single Match +### Single Match -- If there is exactly `1` pattern, pattern-matching on `1` pattern with type `U` +- Pattern-matching on `1` pattern with type `S` + +For example, where `Nat <: R`, `S = Int`: @@ -162,27 +147,72 @@ object Nat: // 5 is a natural number ``` -## Name-based Match +### Name-based Match -- `N > 1` is the maximum number of consecutive (parameterless `def` or `val`) `_1: P1 ... _N: PN` members in `U` +- `S` has `N > 1` members such that they are each `val`s or parameterless `def`s, and named from `_1` with type `P1` to `_N` with type `PN` +- `S` doesn't have `N+1` members satisfying the previous point, i.e. 
`N` is maximal - Pattern-matching on exactly `N` patterns with types `P1, P2, ..., PN` +For example, where `U = AlwaysEmpty.type <: R`, `S = NameBased`: ```scala -object ProdEmpty: +object MyPatternMatcher: + def unapply(s: String) = AlwaysEmpty + +object AlwaysEmpty: + def isEmpty = true + def get = NameBased + +object NameBased: def _1: Int = ??? def _2: String = ??? - def isEmpty = true - def unapply(s: String): this.type = this - def get = this "" match - case ProdEmpty(_, _) => ??? + case MyPatternMatcher(_, _) => ??? case _ => () ``` -## Sequence Match +## Variadic Extractors + +Variadic extractors expose the following signature (with potential type, using and implicit clauses): + +```scala +def unapplySeq(x: T): U +``` + +Where `U` has to fullfill the following: + +1. Set `V := U` +2. `V` is valid if `V` conforms to one of the following matches: +- [sequence match](#sequence-match) +- [product-sequence match](#product-sequence-match) +3. Otherwise `U` has to conform to the type `R`: +```scala +type R = { + def isEmpty: Boolean + def get: S +} +``` +4. Set `V := S`, and reattempt 2., if it fails `U` is not valid. + +The `V := U` form of `unapplySeq` has higher priority, and _sequence match_ has higher +precedence over _product-sequence match_. + +**Note:** This means `isEmpty` is disregarded if the `V := U` form is valid + +A usage of a variadic extractor is irrefutable if one of the following conditions holds: + +- the extractor is used directly as a sequence match or product-sequence match +- `U <: R` and `U <: { def isEmpty: false }` +- `U = Some[T]` + +**Note:** The last rule is necessary because, for compatibility reasons, `isEmpty` on `Some` has return type `Boolean` rather than `false`, even though it always returns `false`. + +**Note:** Be careful, by the first condition and the note above, it is possible to define an irrefutable extractor with a `def isEmpty: true`. +This is strongly discouraged and, if found in the wild, is almost certainly a bug. + +### Sequence Match -- `U <: X`, `T2` and `T3` conform to `T1` +- `V <: X` ```scala type X = { @@ -192,10 +222,12 @@ type X = { def toSeq: scala.Seq[T3] } ``` - +- `T2` and `T3` conform to `T1` - Pattern-matching on _exactly_ `N` simple patterns with types `T1, T1, ..., T1`, where `N` is the runtime size of the sequence, or - Pattern-matching on `>= N` simple patterns and _a vararg pattern_ (e.g., `xs: _*`) with types `T1, T1, ..., T1, Seq[T1]`, where `N` is the minimum size of the sequence. +For example, where `V = S`, `U = Option[S] <: R`, `S = Seq[Char]` + ```scala @@ -211,14 +243,16 @@ object CharList: // e,x,a,m ``` -## Product-Sequence Match +### Product-Sequence Match -- `U <: Product` -- `N > 0` is the maximum number of consecutive (parameterless `def` or `val`) `_1: P1` ... `_N: PN` members in `U` +- `V <: Product` +- `N > 0` is the maximum number of consecutive (`val` or parameterless `def`) `_1: P1` ... `_N: PN` members in `V` - `PN` conforms to the signature `X` defined in Seq Pattern - Pattern-matching on exactly `>= N` patterns, the first `N - 1` patterns have types `P1, P2, ... P(N-1)`, the type of the remaining patterns are determined as in Seq Pattern. +For example, where `V = S`, `U = Option[S] <: R`, `S = (String, PN) <: Product`, `PN = Seq[Int]` + ```Scala class Foo(val name: String, val children: Int*) object Foo: @@ -227,7 +261,7 @@ object Foo: def foo(f: Foo) = f match case Foo(name, x, y, ns*) => ">= two children." - case Foo(name, ns*) => => "< two children." + case Foo(name, ns*) => "< two children." 
``` There are plans for further simplification, in particular to factor out _product match_ From b0560f741bd7e3746efb10d1ec0b10b81892ce4e Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 13:27:12 +0200 Subject: [PATCH 04/12] Update structural-types.md --- .../changed-features/structural-types.md | 98 ++++++++++--------- 1 file changed, 54 insertions(+), 44 deletions(-) diff --git a/docs/_docs/reference/changed-features/structural-types.md b/docs/_docs/reference/changed-features/structural-types.md index cc07487feb4d..b56f55c23551 100644 --- a/docs/_docs/reference/changed-features/structural-types.md +++ b/docs/_docs/reference/changed-features/structural-types.md @@ -35,19 +35,41 @@ configure how fields and methods should be resolved. Here's an example of a structural type `Person`: ```scala - class Record(elems: (String, Any)*) extends Selectable: - private val fields = elems.toMap - def selectDynamic(name: String): Any = fields(name) +type Person = Record { val name: String; val age: Int } +``` + +The type `Person` adds a _refinement_ to its parent type `Record` that defines the two fields `name` and `age`. We say the refinement is _structural_ since `name` and `age` are not defined in the parent type. But they exist nevertheless as members of type `Person`. - type Person = Record { val name: String; val age: Int } - ``` +This allows us to check at compiletime if accesses are valid: -The type `Person` adds a _refinement_ to its parent type `Record` that defines the two fields `name` and `age`. We say the refinement is _structural_ since `name` and `age` are not defined in the parent type. But they exist nevertheless as members of class `Person`. For instance, the following -program would print "Emma is 42 years old.": +```scala +val person: Person = ??? +println(s"${person.name} is ${person.age} years old.") // works +println(person.email) // error: value email is not a member of Person +``` +How is `Record` defined, and how does `person.name` resolve ? + +`Record` is a class that extends the marker trait [`scala.Selectable`](https://scala-lang.org/api/3.x/scala/Selectable.html) and defines +a method `selectDynamic`, which maps a field name to its value. +Selecting a member of a structural type is syntactic sugar for a call to this method. +The selections `person.name` and `person.age` are translated by +the Scala compiler to: ```scala - val person = Record("name" -> "Emma", "age" -> 42).asInstanceOf[Person] - println(s"${person.name} is ${person.age} years old.") +person.selectDynamic("name").asInstanceOf[String] +person.selectDynamic("age").asInstanceOf[Int] +``` + +For example, `Record` could be defined as follows: + +```scala +class Record(elems: (String, Any)*) extends Selectable: + private val fields = elems.toMap + def selectDynamic(name: String): Any = fields(name) +``` +Which allows us to create instances of `Person` like so: +```scala +val person = Record("name" -> "Emma", "age" -> 42).asInstanceOf[Person] ``` The parent type `Record` in this example is a generic class that can represent arbitrary records in its `elems` argument. This argument is a @@ -59,52 +81,45 @@ help from the user. In practice, the connection between a structural type and its underlying generic representation would most likely be done by a database layer, and therefore would not be a concern of the end user. 
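To make that last point a bit more concrete, here is a hypothetical sketch of such a layer (`personFromRow` is an invented name): the unchecked cast is confined to a single helper, so user code only ever sees well-typed `Person` values:

```scala
// Hypothetical "database layer": the only place performing the unchecked cast.
def personFromRow(row: Map[String, Any]): Person =
  Record(row.toSeq*).asInstanceOf[Person]

val emma = personFromRow(Map("name" -> "Emma", "age" -> 42))
println(s"${emma.name} is ${emma.age} years old.")
```
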
-`Record` extends the marker trait [`scala.Selectable`](https://scala-lang.org/api/3.x/scala/Selectable.html) and defines -a method `selectDynamic`, which maps a field name to its value. -Selecting a structural type member is done by calling this method. -The `person.name` and `person.age` selections are translated by -the Scala compiler to: - -```scala - person.selectDynamic("name").asInstanceOf[String] - person.selectDynamic("age").asInstanceOf[Int] -``` - Besides `selectDynamic`, a `Selectable` class sometimes also defines a method `applyDynamic`. This can then be used to translate function calls of structural members. So, if `a` is an instance of `Selectable`, a structural call like `a.f(b, c)` would translate to ```scala - a.applyDynamic("f")(b, c) +a.applyDynamic("f")(b, c) ``` ## Using Java Reflection -Structural types can also be accessed using [Java reflection](https://www.oracle.com/technical-resources/articles/java/javareflection.html). Example: +Using `Selectable` and [Java reflection](https://www.oracle.com/technical-resources/articles/java/javareflection.html), we can select a member from unrelated classes. + +> Before resorting to structural calls with Java reflection one should consider alternatives. For instance, sometimes a more a modular _and_ efficient architecture can be obtained using [type classes](../contextual/type-classes.md). + +For example, we would like to provide behavior for both [`FileInputStream`](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/io/FileInputStream.html#%3Cinit%3E(java.io.File)) and [`Channel`](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/nio/channels/Channel.html) classes by calling their `close` method, however, these classes are unrelated, i.e. have no common supertype with a `close` method. Therefore, below we define a structural type `Closeable` that defines a `close` method. ```scala - type Closeable = { def close(): Unit } +type Closeable = { def close(): Unit } - class FileInputStream: - def close(): Unit +class FileInputStream: + def close(): Unit - class Channel: - def close(): Unit +class Channel: + def close(): Unit ``` -Here, we define a structural type `Closeable` that defines a `close` method. There are various classes that have `close` methods, we just list [`FileInputStream`](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/io/FileInputStream.html#%3Cinit%3E(java.io.File)) and [`Channel`](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/nio/channels/Channel.html) as two examples. It would be easiest if the two classes shared a common interface that factors out the `close` method. But such factorings are often not possible if different libraries are combined in one application. Yet, we can still have methods that work on -all classes with a `close` method by using the `Closeable` type. For instance, +Ideally we would add a common interface to both these classes to define the `close` method, however they are defined in libraries outside of our control. 
As a compromise we can use the structural type to define a single implementation for an `autoClose` method: + + ```scala - import scala.reflect.Selectable.reflectiveSelectable +import scala.reflect.Selectable.reflectiveSelectable - def autoClose(f: Closeable)(op: Closeable => Unit): Unit = - try op(f) finally f.close() +def autoClose(f: Closeable)(op: Closeable => Unit): Unit = + try op(f) finally f.close() ``` -The call `f.close()` has to use Java reflection to identify and call the `close` method in the receiver `f`. This needs to be enabled by an import -of `reflectiveSelectable` shown above. What happens "under the hood" is then the following: +The call `f.close()` requires `Closeable` to extend `Selectable` to identify and call the `close` method in the receiver `f`. A universal implicit conversion to `Selectable` is enabled by an import +of `reflectiveSelectable` shown above, based on [Java reflection](https://www.oracle.com/technical-resources/articles/java/javareflection.html). What happens "under the hood" is then the following: - - The import makes available an implicit conversion that turns any type into a - `Selectable`. `f` is wrapped in this conversion. + - The implicit conversion wraps `f` in an instance of `scala.reflect.Selectable` (which is a subtype of `Selectable`). - The compiler then transforms the `close` call on the wrapped `f` to an `applyDynamic` call. The end result is: @@ -113,7 +128,7 @@ of `reflectiveSelectable` shown above. What happens "under the hood" is then the reflectiveSelectable(f).applyDynamic("close")() ``` - The implementation of `applyDynamic` in `reflectiveSelectable`'s result -uses Java reflection to find and call a method `close` with zero parameters in the value referenced by `f` at runtime. +uses [Java reflection](https://www.oracle.com/technical-resources/articles/java/javareflection.html) to find and call a method `close` with zero parameters in the value referenced by `f` at runtime. Structural calls like this tend to be much slower than normal method calls. The mandatory import of `reflectiveSelectable` serves as a signpost that something inefficient is going on. @@ -121,8 +136,6 @@ Structural calls like this tend to be much slower than normal method calls. The `reflectiveSelectable` conversion. However, to warn against inefficient dispatch, Scala 2 requires a language import `import scala.language.reflectiveCalls`. -Before resorting to structural calls with Java reflection one should consider alternatives. For instance, sometimes a more a modular _and_ efficient architecture can be obtained using type classes. - ## Extensibility New instances of `Selectable` can be defined to support means of @@ -179,13 +192,10 @@ differences. is, as long as the correspondence of the structural type with the underlying value is as stated. -- [`Dynamic`](https://scala-lang.org/api/3.x/scala/Dynamic.html) is just a marker trait, which gives more leeway where and - how to define reflective access operations. By contrast - `Selectable` is a trait which declares the access operations. - - Two access operations, `selectDynamic` and `applyDynamic` are shared between both approaches. In `Selectable`, `applyDynamic` also may also take [`java.lang.Class`](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/lang/Class.html) arguments indicating the method's formal parameter types. - [`Dynamic`](https://scala-lang.org/api/3.x/scala/Dynamic.html) comes with `updateDynamic`. 
+ +- `updateDynamic` is unique to [`Dynamic`](https://scala-lang.org/api/3.x/scala/Dynamic.html) but as mentionned before, this fact is subject to change, and shouldn't be used as an assumption. [More details](structural-types-spec.md) From ae7ad2a1c6dafeba440412b5b6ac774706c736c4 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 13:28:27 +0200 Subject: [PATCH 05/12] Update #class-context-parameters in using-clauses.md --- .../reference/contextual/using-clauses.md | 29 ++++++++++++------- 1 file changed, 18 insertions(+), 11 deletions(-) diff --git a/docs/_docs/reference/contextual/using-clauses.md b/docs/_docs/reference/contextual/using-clauses.md index 8c522d82c402..649404318c42 100644 --- a/docs/_docs/reference/contextual/using-clauses.md +++ b/docs/_docs/reference/contextual/using-clauses.md @@ -50,29 +50,36 @@ Generally, context parameters may be defined either as a full parameter list `(p ## Class Context Parameters -If a class context parameter is made a member by adding a `val` or `var` modifier, -then that member is available as a given instance. +To make a class context parameter visible from outside the class body, it can be made into a member by adding a `val` or `var` modifier. +```scala +class GivenIntBox(using val usingParameter: Int): + def myInt = summon[Int] -Compare the following examples, where the attempt to supply an explicit `given` member induces an ambiguity: +val b = GivenIntBox(using 23) +import b.usingParameter +summon[Int] // 23 +``` +This is preferable to creating an explicit `given` member, as the latter creates ambiguity inside the class body: ```scala -class GivenIntBox(using val givenInt: Int): - def n = summon[Int] - -class GivenIntBox2(using givenInt: Int): - given Int = givenInt - //def n = summon[Int] // ambiguous +class GivenIntBox2(using usingParameter: Int): + given givenMember: Int = usingParameter + def n = summon[Int] // ambiguous given instances: both usingParameter and givenMember match type Int ``` -The `given` member is importable as explained in the section on [importing `given`s](./given-imports.md): +From the outside of `GivenIntBox`, `usingParameter` appears as if it were defined in the class as `given usingParameter: Int`, in particular it must be imported as described in the section on [importing `given`s](./given-imports.md). ```scala val b = GivenIntBox(using 23) +// Works: import b.given summon[Int] // 23 +usingParameter // 23 +// Fails: import b.* -//givenInt // Not found +summon[Int] // No given instance found +usingParameter // Not found ``` ## Inferring Complex Arguments From 70dc24eafd4a867afdd39a8056ed0ef561b6fbb1 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 13:35:12 +0200 Subject: [PATCH 06/12] Update macros.md --- .../_docs/reference/metaprogramming/macros.md | 81 ++++++++++++------- 1 file changed, 53 insertions(+), 28 deletions(-) diff --git a/docs/_docs/reference/metaprogramming/macros.md b/docs/_docs/reference/metaprogramming/macros.md index e1267ea82a59..53d06771f640 100644 --- a/docs/_docs/reference/metaprogramming/macros.md +++ b/docs/_docs/reference/metaprogramming/macros.md @@ -170,17 +170,10 @@ describing a function into a function mapping trees to trees. ```scala object Expr: ... - def betaReduce[...](...)(...): ... = ... 
+  def betaReduce[T](expr: Expr[T])(using Quotes): Expr[T]
 ```
 
-The definition of `Expr.betaReduce(f)(x)` is assumed to be functionally the same as
-`'{($f)($x)}`, however it should optimize this call by returning the
-result of beta-reducing `f(x)` if `f` is a known lambda expression.
-`Expr.betaReduce` distributes applications of `Expr` over function arrows:
-
-```scala
-Expr.betaReduce(_): Expr[(T1, ..., Tn) => R] => ((Expr[T1], ..., Expr[Tn]) => Expr[R])
-```
+`Expr.betaReduce` returns an expression that is functionally equivalent to `expr`; if `expr` is of the form `((y1, ..., yn) => e2)(e1, ..., en)`, it optimizes the topmost call by returning the result of beta-reducing the application, otherwise it returns `expr` unchanged.
 
 ## Lifting Types
 
@@ -192,7 +185,7 @@ quote but no splice between the parameter binding of `T` and its usage. But the
 code can be rewritten by adding an explicit binding of a `Type[T]`:
 
 ```scala
-def to[T, R](f: Expr[T] => Expr[R])(using t: Type[T])(using Type[R], Quotes): Expr[T => R] =
+def to[T, R](f: Expr[T] => Expr[R])(using t: Type[T], r: Type[R])(using Quotes): Expr[T => R] =
   '{ (x: t.Underlying) => ${ f('x) } }
 ```
 
@@ -217,14 +210,13 @@ would be rewritten to
 
 ```scala
 def to[T, R](f: Expr[T] => Expr[R])(using t: Type[T], r: Type[R])(using Quotes): Expr[T => R] =
   '{
-    type T = t.Underlying
+    type T = summon[Type[T]].Underlying
     (x: T) => ${ f('x) }
   }
 ```
 
-The `summon` query succeeds because there is a given instance of
-type `Type[T]` available (namely the given parameter corresponding
-to the context bound `: Type`), and the reference to that value is
+The `summon` query succeeds because there is a using parameter of
+type `Type[T]`, and the reference to that value is
 phase-correct. If that was not the case, the phase inconsistency for
 `T` would be reported as an error.
 
@@ -526,8 +518,8 @@ the code it runs produces one.
 
 ## Example Expansion
 
-Assume we have two methods, one `map` that takes an `Expr[Array[T]]` and a
-function `f` and one `sum` that performs a sum by delegating to `map`.
+Assume we have two methods, `map` that takes an `Expr[Array[T]]` and a
+function `f`, and `sum` that performs a sum by delegating to `map`.
```scala object Macros: @@ -552,38 +544,66 @@ object Macros: end Macros ``` -A call to `sum_m(Array(1,2,3))` will first inline `sum_m`: +A call to `sum_m(Array(1, 2, 3))` will first inline `sum_m`: ```scala -val arr: Array[Int] = Array.apply(1, [2,3 : Int]:Int*) -${_root_.Macros.sum('arr)} +val arr: Array[Int] = Array.apply(1, 2, 3) +${ _root_.Macros.sum('arr) } ``` -then it will splice `sum`: +then it will call `sum`: ```scala -val arr: Array[Int] = Array.apply(1, [2,3 : Int]:Int*) +val arr: Array[Int] = Array.apply(1, 2, 3) +${ '{ + var sum = 0 + ${ map('arr, x => '{sum += $x}) } + sum +} } +``` + +and cancel the `${'{...}}`: + +```scala +val arr: Array[Int] = Array.apply(1, 2, 3) var sum = 0 ${ map('arr, x => '{sum += $x}) } sum ``` -then it will inline `map`: +then it will extract `x => '{sum += $x}` into `f`, to have a value: ```scala -val arr: Array[Int] = Array.apply(1, [2,3 : Int]:Int*) +val arr: Array[Int] = Array.apply(1, 2, 3) var sum = 0 val f = x => '{sum += $x} -${ _root_.Macros.map('arr, 'f)(Type.of[Int])} +${ _root_.Macros.map('arr, 'f)(Type.of[Int]) } sum ``` -then it will expand and splice inside quotes `map`: +and then call `map`: ```scala -val arr: Array[Int] = Array.apply(1, [2,3 : Int]:Int*) +val arr: Array[Int] = Array.apply(1, 2, 3) + +var sum = 0 +val f = x => '{sum += $x} +${ '{ + var i: Int = 0 + while i < arr.length do + val element: Int = (arr)(i) + sum += element + i += 1 + sum +} } +``` + +and cancel the `${'{...}}` again: + +```scala +val arr: Array[Int] = Array.apply(1, 2, 3) var sum = 0 val f = x => '{sum += $x} @@ -598,7 +618,7 @@ sum Finally cleanups and dead code elimination: ```scala -val arr: Array[Int] = Array.apply(1, [2,3 : Int]:Int*) +val arr: Array[Int] = Array.apply(1, 2, 3) var sum = 0 var i: Int = 0 while i < arr.length do @@ -662,7 +682,7 @@ It is possible to deconstruct or extract values out of `Expr` using pattern matc `scala.quoted` contains objects that can help extracting values from `Expr`. -- `scala.quoted.Expr`/`scala.quoted.Exprs`: matches an expression of a value (or list of values) and returns the value (or list of values). +- `scala.quoted.Expr`/`scala.quoted.Exprs`: matches an expression of a value (resp. list of values) and returns the value (resp. list of values). - `scala.quoted.Const`/`scala.quoted.Consts`: Same as `Expr`/`Exprs` but only works on primitive values. - `scala.quoted.Varargs`: matches an explicit sequence of expressions and returns them. These sequences are useful to get individual `Expr[T]` out of a varargs expression of type `Expr[Seq[T]]`. 
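As a small illustration of the first of these extractors, matching with `Expr` recovers the static value of an expression when one is known. The `describe` helper below is hypothetical and exists only for this sketch; it assumes it is called from inside a macro implementation where a `Quotes` instance is given:

```scala
import scala.quoted.*

def describe(x: Expr[Int])(using Quotes): String =
  x match
    case Expr(n) => s"statically known value: $n" // n: Int, extracted at compile time
    case _       => "value not statically known"
```

The `sumExpr` code in the next hunk applies the plural `Exprs` and `Varargs` extractors in the same way to a whole argument list.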
@@ -682,6 +702,11 @@ private def sumExpr(argsExpr: Expr[Seq[Int]])(using Quotes): Expr[Int] = dynamicSum.foldLeft(Expr(staticSum))((acc, arg) => '{ $acc + $arg }) case _ => '{ $argsExpr.sum } + +sum(1, 2, 3) // gets matched by Varargs + +val xs = List(1, 2, 3) +sum(xs*) // doesn't get matched by Varargs ``` ### Quoted patterns From ec3f4e4a5235d61199c537f12e6827e4f31cc015 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 15:30:35 +0200 Subject: [PATCH 07/12] Update reflection.md --- .../reference/metaprogramming/reflection.md | 18 +++++++++++------- 1 file changed, 11 insertions(+), 7 deletions(-) diff --git a/docs/_docs/reference/metaprogramming/reflection.md b/docs/_docs/reference/metaprogramming/reflection.md index 74bb4f693e1b..4764704b7e9e 100644 --- a/docs/_docs/reference/metaprogramming/reflection.md +++ b/docs/_docs/reference/metaprogramming/reflection.md @@ -98,10 +98,11 @@ def macroImpl()(quotes: Quotes): Expr[Unit] = `quotes.reflect` contains three facilities for tree traversal and transformation. -`TreeAccumulator` ties the knot of a traversal. By calling `foldOver(x, tree)(owner)` -we can dive into the `tree` node and start accumulating values of type `X` (e.g., -of type `List[Symbol]` if we want to collect symbols). The code below, for -example, collects the `val` definitions in the tree. +`TreeAccumulator[X]` allows you to traverse the tree and aggregate data of type `X` along the way, by overriding its method `foldTree(x: X, tree: Tree)(owner: Symbol): X`. + +`foldOverTree(x: X, tree: Tree)(owner: Symbol): X` calls `foldTree` on each children of `tree` (using `fold` to give each call the value of the previous one). + +The code below, for example, collects the `val` definitions in the tree. ```scala def collectPatternVariables(tree: Tree)(using ctx: Context): List[Symbol] = @@ -115,12 +116,15 @@ def collectPatternVariables(tree: Tree)(using ctx: Context): List[Symbol] = acc(Nil, tree) ``` -A `TreeTraverser` extends a `TreeAccumulator` and performs the same traversal -but without returning any value. Finally, a `TreeMap` performs a transformation. +A `TreeTraverser` extends a `TreeAccumulator[Unit]` and performs the same traversal +but without returning any value. + +`TreeMap` transforms trees along the traversal, through overloading its methods it is possible to transform only trees of specific types, for example `transformStatement` only transforms `Statement`s. + #### ValDef.let -`quotes.reflect.ValDef` also offers a method `let` that allows us to bind the `rhs` (right-hand side) to a `val` and use it in `body`. +The object `quotes.reflect.ValDef` also offers a method `let` that allows us to bind the `rhs` (right-hand side) to a `val` and use it in `body`. Additionally, `lets` binds the given `terms` to names and allows to use them in the `body`. 
Their type definitions are shown below: From dacf08d6dcffab0142e3b7d823cc6d25cd068267 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 15:31:04 +0200 Subject: [PATCH 08/12] Hide simple-smp page and Move it after macros-spec --- docs/sidebar.yml | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/docs/sidebar.yml b/docs/sidebar.yml index e28e0ae620d1..821f204c9da0 100644 --- a/docs/sidebar.yml +++ b/docs/sidebar.yml @@ -71,10 +71,11 @@ subsection: - page: reference/metaprogramming/macros.md - page: reference/metaprogramming/macros-spec.md hidden: true + - page: reference/metaprogramming/simple-smp.md # description of a simplified metaprogramming language, this might not be the best place for it + hidden: true - page: reference/metaprogramming/staging.md - page: reference/metaprogramming/reflection.md - page: reference/metaprogramming/tasty-inspect.md - - page: reference/metaprogramming/simple-smp.md - title: Other New Features index: reference/other-new-features/other-new-features.md subsection: From 9c556cf3f20d37b4bb5eb2074c97047ed304991f Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 15:32:05 +0200 Subject: [PATCH 09/12] Update derivation.md --- docs/_docs/reference/contextual/derivation.md | 38 ++++++++----------- 1 file changed, 15 insertions(+), 23 deletions(-) diff --git a/docs/_docs/reference/contextual/derivation.md b/docs/_docs/reference/contextual/derivation.md index 87ae8a3a9a7e..b47bc77ecef9 100644 --- a/docs/_docs/reference/contextual/derivation.md +++ b/docs/_docs/reference/contextual/derivation.md @@ -7,7 +7,7 @@ movedTo: https://docs.scala-lang.org/scala3/reference/contextual/derivation.html Type class derivation is a way to automatically generate given instances for type classes which satisfy some simple conditions. A type class in this sense is any trait or class with a type parameter determining the type being operated on. Common examples are `Eq`, `Ordering`, or `Show`. For example, given the following `Tree` algebraic data type -(ADT), +(ADT): ```scala enum Tree[T] derives Eq, Ordering, Show: @@ -16,7 +16,7 @@ enum Tree[T] derives Eq, Ordering, Show: ``` The `derives` clause generates the following given instances for the `Eq`, `Ordering` and `Show` type classes in the -companion object of `Tree`, +companion object of `Tree`: ```scala given [T: Eq] : Eq[Tree[T]] = Eq.derived @@ -26,11 +26,21 @@ given [T: Show] : Show[Tree[T]] = Show.derived We say that `Tree` is the _deriving type_ and that the `Eq`, `Ordering` and `Show` instances are _derived instances_. -### Types supporting `derives` clauses +**Note:** The access to `derived` above is a normal access, therefore if there are multiple definitions of `derived` in the type class, overloading resolution applies. + +**Note:** `derived` can be used manually, this is useful when you do not have control over the definition. For example we can implement an `Ordering` for `Option`s like so: + +```scala +given [T: Ordering]: Ordering[Option[T]] = Ordering.derived +``` + +It is discouraged to directly refer to the `derived` member if you can use a `derives` clause instead. All data types can have a `derives` clause. This document focuses primarily on data types which also have a given instance of the `Mirror` type class available. +## `Mirror` + `Mirror` type class instances provide information at the type level about the components and labelling of the type. 
They also provide minimal term level infrastructure to allow higher level libraries to provide comprehensive derivation support. @@ -158,15 +168,11 @@ Note the following properties of `Mirror` types, + The methods `ordinal` and `fromProduct` are defined in terms of `MirroredMonoType` which is the type of kind-`*` which is obtained from `MirroredType` by wildcarding its type parameters. -### Type classes supporting automatic deriving +### Implementing `derived` with `Mirror` -A trait or class can appear in a `derives` clause if its companion object defines a method named `derived`. The -signature and implementation of a `derived` method for a type class `TC[_]` are arbitrary but it is typically of the -following form, +As seen before, the signature and implementation of a `derived` method for a type class `TC[_]` are arbitrary, but we expect it to typically be of the following form: ```scala -import scala.deriving.Mirror - inline def derived[T](using Mirror.Of[T]): TC[T] = ... ``` @@ -360,20 +366,6 @@ The framework described here enables all three of these approaches without manda For a brief discussion on how to use macros to write a type class `derived` method please read more at [How to write a type class `derived` method using macros](./derivation-macro.md). -### Deriving instances elsewhere - -Sometimes one would like to derive a type class instance for an ADT after the ADT is defined, without being able to -change the code of the ADT itself. To do this, simply define an instance using the `derived` method of the type class -as right-hand side. E.g, to implement `Ordering` for `Option` define, - -```scala -given [T: Ordering]: Ordering[Option[T]] = Ordering.derived -``` - -Assuming the `Ordering.derived` method has a context parameter of type `Mirror[T]` it will be satisfied by the -compiler generated `Mirror` instance for `Option` and the derivation of the instance will be expanded on the right -hand side of this definition in the same way as an instance defined in ADT companion objects. - ### Syntax ``` From 00b22b6ebf4b79eacbb2cff47fde709339743d24 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Wed, 19 Oct 2022 13:19:18 +0200 Subject: [PATCH 10/12] Remove motivation from description of `summonFrom` --- .../metaprogramming/compiletime-ops.md | 37 +------------------ 1 file changed, 2 insertions(+), 35 deletions(-) diff --git a/docs/_docs/reference/metaprogramming/compiletime-ops.md b/docs/_docs/reference/metaprogramming/compiletime-ops.md index cb0874d5f8e4..19acf2abb99d 100644 --- a/docs/_docs/reference/metaprogramming/compiletime-ops.md +++ b/docs/_docs/reference/metaprogramming/compiletime-ops.md @@ -174,40 +174,7 @@ val addition: 1 + 1 = 2 ## Summoning Givens Selectively -It is foreseen that many areas of typelevel programming can be done with rewrite -methods instead of implicits. But sometimes implicits are unavoidable. The -problem so far was that the Prolog-like programming style of implicit search -becomes viral: Once some construct depends on implicit search it has to be -written as a logic program itself. Consider for instance the problem of creating -a `TreeSet[T]` or a `HashSet[T]` depending on whether `T` has an `Ordering` or -not. We can create a set of given instances like this: - -```scala -trait SetFor[T, S <: Set[T]] - -class LowPriority: - given hashSetFor[T]: SetFor[T, HashSet[T]] = ... - -object SetsFor extends LowPriority: - given treeSetFor[T: Ordering]: SetFor[T, TreeSet[T]] = ... -``` - -Clearly, this is not pretty. 
Besides all the usual indirection of implicit -search, we face the problem of rule prioritization where we have to ensure that -`treeSetFor` takes priority over `hashSetFor` if the element type has an -ordering. This is solved (clumsily) by putting `hashSetFor` in a superclass -`LowPriority` of the object `SetsFor` where `treeSetFor` is defined. Maybe the -boilerplate would still be acceptable if the crufty code could be contained. -However, this is not the case. Every user of the abstraction has to be -parameterized itself with a `SetFor` implicit. Considering the simple task _"I -want a `TreeSet[T]` if `T` has an ordering and a `HashSet[T]` otherwise"_, this -seems like a lot of ceremony. - -There are some proposals to improve the situation in specific areas, for -instance by allowing more elaborate schemes to specify priorities. But they all -keep the viral nature of implicit search programs based on logic programming. - -By contrast, the new `summonFrom` construct makes implicit search available +The new `summonFrom` construct makes implicit search available in a functional context. To solve the problem of creating the right set, one would use it as follows: @@ -223,7 +190,7 @@ inline def setFor[T]: Set[T] = summonFrom { A `summonFrom` call takes a pattern matching closure as argument. All patterns in the closure are type ascriptions of the form `identifier : Type`. -Patterns are tried in sequence. The first case with a pattern `x: T` such that an implicit value of type `T` can be summoned is chosen. +Patterns are tried in sequence. The first case with a pattern `x: T` such that a contextual value of type `T` can be summoned is chosen. Alternatively, one can also use a pattern-bound given instance, which avoids the explicit using clause. For instance, `setFor` could also be formulated as follows: From 7c47ac9e71f5d646d3a5db115e8e8504489ce5b7 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Thu, 20 Oct 2022 11:16:54 +0200 Subject: [PATCH 11/12] Add "Exact mechanism" section to derivation.md --- docs/_docs/reference/contextual/derivation.md | 125 +++++++++++++++++- 1 file changed, 119 insertions(+), 6 deletions(-) diff --git a/docs/_docs/reference/contextual/derivation.md b/docs/_docs/reference/contextual/derivation.md index b47bc77ecef9..ff2bce9467e8 100644 --- a/docs/_docs/reference/contextual/derivation.md +++ b/docs/_docs/reference/contextual/derivation.md @@ -5,8 +5,8 @@ movedTo: https://docs.scala-lang.org/scala3/reference/contextual/derivation.html --- Type class derivation is a way to automatically generate given instances for type classes which satisfy some simple -conditions. A type class in this sense is any trait or class with a type parameter determining the type being operated -on. Common examples are `Eq`, `Ordering`, or `Show`. For example, given the following `Tree` algebraic data type +conditions. A type class in this sense is any trait or class with a single type parameter determining the type being operated +on, and the special case `CanEqual`. Common examples are `Eq`, `Ordering`, or `Show`. For example, given the following `Tree` algebraic data type (ADT): ```scala @@ -26,9 +26,7 @@ given [T: Show] : Show[Tree[T]] = Show.derived We say that `Tree` is the _deriving type_ and that the `Eq`, `Ordering` and `Show` instances are _derived instances_. -**Note:** The access to `derived` above is a normal access, therefore if there are multiple definitions of `derived` in the type class, overloading resolution applies. 
- -**Note:** `derived` can be used manually, this is useful when you do not have control over the definition. For example we can implement an `Ordering` for `Option`s like so: +**Note:** `derived` can be used manually, this is useful when you do not have control over the definition. For example we can implement `Ordering` for `Option`s like so: ```scala given [T: Ordering]: Ordering[Option[T]] = Ordering.derived @@ -36,7 +34,122 @@ given [T: Ordering]: Ordering[Option[T]] = Ordering.derived It is discouraged to directly refer to the `derived` member if you can use a `derives` clause instead. -All data types can have a `derives` clause. This document focuses primarily on data types which also have a given instance +## Exact mechanism +In the following, when type arguments are enumerated and the first index evaluates to a larger value than the last, then there are actually no arguments, for example: `A[T_2, ..., T_1]` means `A`. + +For a class/trait/object/enum `DerivingType[T_1, ..., T_N] derives TC`, a derived instance is created in `DerivingType`'s companion object (or `DerivingType` itself if it is an object). + +The general "shape" of the derived instance is as follows: +```scala +given [...](using ...): TC[ ... DerivingType[...] ... ] = TC.derived +``` +`TC.derived` should be an expression that conforms to the expected type on the left, potentially elaborated using term and/or type inference. + +**Note:** `TC.derived` is a normal access, therefore if there are multiple definitions of `TC.derived`, overloading resolution applies. + +What the derived instance precisely looks like depends on the specifics of `DerivingType` and `TC`, we first examine `TC`: + +### `TC` takes 1 parameter `F` + +Therefore `TC` is defined as `TC[F[A_1, ..., A_K]]` (`TC[F]` if `K == 0`) for some `F`. +There are two further cases depending on the kinds of arguments: + +#### `F` and all arguments of `DerivingType` have kind `*` +**Note:** `K == 0` in this case. + +The generated instance is then: +```scala +given [T_1: TC, ..., T_N: TC]: TC[DerivingType[T_1, ..., T_N]] = TC.derived +``` + +This is the most common case, and is the one that was highlighted in the introduction. + +**Note:** The `[T_i: TC, ...]` introduces a `(using TC[T_i], ...)`, more information in [Context Bounds](./context-bounds.md). +This allows the `derived` member to access these evidences. + +**Note:** If `N == 0` the above means: +```scala +given TC[DerivingType] = TC.derived +``` +For example, the class +```scala +case class Point(x: Int, y: Int) derives Ordering +``` +generates the instance +```scala +object Point: + ... + given Ordering[Point] = Ordering.derived +``` + + +#### `F` and `DerivingType` have parameters of matching kind on the right +This section covers cases where you can pair arguments of `F` and `DerivingType` starting from the right such that they have the same kinds pairwise, and all arguments of `F` or `DerivingType` (or both) are used up. +`F` must also have at least one parameter. + +The general shape will then be: +```scala +given [...]: TC[ [...] =>> DerivingType[...] ] = TC.derived +``` +Where of course `TC` and `DerivingType` are applied to types of the correct kind. 
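As a hedged illustration of this shape, consider a hypothetical `Functor` type class, which is not part of the standard library; its derivation logic is elided with `???`, and `BoxSketch` merely stands in for the companion object the compiler would actually use:

```scala
// Illustrative only: a type class whose parameter F takes one type argument (K == 1).
trait Functor[F[_]]:
  extension [A](fa: F[A]) def map[B](f: A => B): F[B]

object Functor:
  def derived[F[_]]: Functor[F] = ??? // a real implementation would inspect a Mirror

// For `case class Box[A](a: A) derives Functor`, DerivingType and F both take
// one type parameter, so the companion of Box would simply gain the instance below.
case class Box[A](a: A)
object BoxSketch:
  given Functor[Box] = Functor.derived
```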
To make this work, we split it into 3 cases:

If `F` and `DerivingType` take the same number of arguments (`N == K`):
```scala
given TC[DerivingType] = TC.derived
// simplified form of:
given TC[ [A_1, ..., A_K] =>> DerivingType[A_1, ..., A_K] ] = TC.derived
```
If `DerivingType` takes fewer arguments than `F` (`N < K`), we use only the rightmost parameters from the type lambda:
```scala
given TC[ [A_1, ..., A_K] =>> DerivingType[A_(K-N+1), ..., A_K] ] = TC.derived

// if DerivingType takes no arguments (N == 0), the above simplifies to:
given TC[ [A_1, ..., A_K] =>> DerivingType ] = TC.derived
```

If `F` takes fewer arguments than `DerivingType` (`K < N`), we fill in the remaining leftmost slots with type parameters of the given:
```scala
given [T_1, ..., T_(N-K)]: TC[[A_1, ..., A_K] =>> DerivingType[T_1, ..., T_(N-K), A_1, ..., A_K]] = TC.derived
```

### `TC` is the `CanEqual` type class

We therefore have `DerivingType[T_1, ..., T_N] derives CanEqual`.

Let `U_1`, ..., `U_M` be the parameters of `DerivingType` of kind `*`.
(These are a subset of the `T_i`s.)

The generated instance is then:
```scala
given [T_1L, T_1R, ..., T_NL, T_NR] // every parameter of DerivingType twice
  (using CanEqual[U_1L, U_1R], ..., CanEqual[U_ML, U_MR]): // only parameters of DerivingType with kind *
  CanEqual[DerivingType[T_1L, ..., T_NL], DerivingType[T_1R, ..., T_NR]] = // again, every parameter
  CanEqual.derived
```

The bounds of the `T_i`s are handled correctly; for example, `T_2 <: T_1` becomes `T_2L <: T_1L`.

For example, the class
```scala
class MyClass[A, G[_]](a: A, b: G[A]) derives CanEqual
```
generates the following given instance:
```scala
object MyClass:
  ...
  given [A_L, A_R, G_L[_], G_R[_]](using CanEqual[A_L, A_R]): CanEqual[MyClass[A_L, G_L], MyClass[A_R, G_R]] = CanEqual.derived
```

### `TC` is not valid for automatic derivation

Throw an error.

The exact error depends on which of the above conditions failed.
As an example, if `TC` takes more than 1 parameter and is not `CanEqual`, the error is `DerivingType cannot be unified with the type argument of TC`.

All data types can have a `derives` clause. The rest of this document focuses primarily on data types which also have a given instance
@@ -172,7 +172,7 @@ Instances for `Mirror` are also generated conditionally for: - and where the compiler can generate a `Mirror` type class instance for each child case. -The `Mirror` type class definition is as follows: +The `scala.deriving.Mirror` type class definition is as follows: ```scala sealed trait Mirror: