From 8372014d4c581c0622c4c47d356114ddf0a659de Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Tue, 18 Oct 2022 10:51:03 +0200 Subject: [PATCH] update reference docs and replacements for 3.2.0 --- .../changed-features/pattern-bindings.md | 10 +- .../changed-features/pattern-matching.md | 27 +- .../reference/contextual/context-bounds.md | 11 +- .../reference/contextual/derivation-macro.md | 25 +- docs/_docs/reference/contextual/derivation.md | 42 +- .../dropped-features/nonlocal-returns.md | 2 +- .../dropped-features/this-qualifier.md | 2 - docs/_docs/reference/enums/enums-index.md | 2 +- .../reference/experimental/explicit-nulls.md | 2 +- .../reference/experimental/fewer-braces.md | 78 +--- .../experimental/numeric-literals.md | 2 +- docs/_docs/reference/experimental/overview.md | 1 - .../language-versions/binary-compatibility.md | 25 +- .../language-versions/language-versions.md | 1 + .../language-versions/source-compatibility.md | 19 +- .../metaprogramming/compiletime-ops.md | 8 +- docs/_docs/reference/new-types/new-types.md | 2 +- .../other-new-features/experimental-defs.md | 8 +- .../reference/other-new-features/export.md | 52 ++- .../other-new-features/indentation.md | 84 ++-- .../other-new-features/targetName.md | 2 +- docs/_docs/reference/overview.md | 4 +- docs/_docs/reference/syntax.md | 428 +++++++++--------- .../referenceReplacements/sidebar.yml | 1 + 24 files changed, 430 insertions(+), 408 deletions(-) diff --git a/docs/_docs/reference/changed-features/pattern-bindings.md b/docs/_docs/reference/changed-features/pattern-bindings.md index 2c8d1c10ceae..b7b7432e7817 100644 --- a/docs/_docs/reference/changed-features/pattern-bindings.md +++ b/docs/_docs/reference/changed-features/pattern-bindings.md @@ -7,7 +7,7 @@ movedTo: https://docs.scala-lang.org/scala3/reference/changed-features/pattern-b In Scala 2, pattern bindings in `val` definitions and `for` expressions are loosely typed. Potentially failing matches are still accepted at compile-time, but may influence the program's runtime behavior. -From Scala 3.1 on, type checking rules will be tightened so that warnings are reported at compile-time instead. +From Scala 3.2 on, type checking rules will be tightened so that warnings are reported at compile-time instead. ## Bindings in Pattern Definitions @@ -16,7 +16,7 @@ val xs: List[Any] = List(1, 2, 3) val (x: String) :: _ = xs // error: pattern's type String is more specialized // than the right-hand side expression's type Any ``` -This code gives a compile-time warning in Scala 3.1 (and also in Scala 3.0 under the `-source future` setting) whereas it will fail at runtime with a `ClassCastException` in Scala 2. In Scala 3.1, a pattern binding is only allowed if the pattern is _irrefutable_, that is, if the right-hand side's type conforms to the pattern's type. For instance, the following is OK: +This code gives a compile-time warning in Scala 3.2 (and also earlier Scala 3.x under the `-source future` setting) whereas it will fail at runtime with a `ClassCastException` in Scala 2. In Scala 3.2, a pattern binding is only allowed if the pattern is _irrefutable_, that is, if the right-hand side's type conforms to the pattern's type. For instance, the following is OK: ```scala val pair = (1, true) val (x, y) = pair @@ -25,7 +25,7 @@ Sometimes one wants to decompose data anyway, even though the pattern is refutab ```scala val first :: rest = elems // error ``` -This works in Scala 2. In fact it is a typical use case for Scala 2's rules. But in Scala 3.1 it will give a warning. 
One can avoid the warning by marking the right-hand side with an [`@unchecked`](https://scala-lang.org/api/3.x/scala/unchecked.html) annotation: +This works in Scala 2. In fact it is a typical use case for Scala 2's rules. But in Scala 3.2 it will give a warning. One can avoid the warning by marking the right-hand side with an [`@unchecked`](https://scala-lang.org/api/3.x/scala/unchecked.html) annotation: ```scala val first :: rest = elems: @unchecked // OK ``` @@ -40,7 +40,7 @@ val elems: List[Any] = List((1, 2), "hello", (3, 4)) for (x, y) <- elems yield (y, x) // error: pattern's type (Any, Any) is more specialized // than the right-hand side expression's type Any ``` -This code gives a compile-time warning in Scala 3.1 whereas in Scala 2 the list `elems` +This code gives a compile-time warning in Scala 3.2 whereas in Scala 2 the list `elems` is filtered to retain only the elements of tuple type that match the pattern `(x, y)`. The filtering functionality can be obtained in Scala 3 by prefixing the pattern with `case`: ```scala @@ -56,4 +56,4 @@ Generator ::= [‘case’] Pattern1 ‘<-’ Expr ## Migration -The new syntax is supported in Scala 3.0. However, to enable smooth cross compilation between Scala 2 and Scala 3, the changed behavior and additional type checks are only enabled under the `-source future` setting. They will be enabled by default in version 3.1 of the language. +The new syntax is supported in Scala 3.0. However, to enable smooth cross compilation between Scala 2 and Scala 3, the changed behavior and additional type checks are only enabled under the `-source future` setting. They will be enabled by default in version 3.2 of the language. diff --git a/docs/_docs/reference/changed-features/pattern-matching.md b/docs/_docs/reference/changed-features/pattern-matching.md index b4660f893141..a067d19f8ccd 100644 --- a/docs/_docs/reference/changed-features/pattern-matching.md +++ b/docs/_docs/reference/changed-features/pattern-matching.md @@ -4,7 +4,7 @@ title: "Option-less pattern matching" movedTo: https://docs.scala-lang.org/scala3/reference/changed-features/pattern-matching.html --- -The implementation of pattern matching in Scala 3 was greatly simplified compared to Scala 2. From a user perspective, this means that Scala 3 generated patterns are a *lot* easier to debug, as variables all show up in debug modes and positions are correctly preserved. +The implementation of pattern matching in Scala 3 was greatly simplified compared to Scala 2. From a user perspective, this means that Scala 3 generated patterns are a _lot_ easier to debug, as variables all show up in debug modes and positions are correctly preserved. Scala 3 supports a superset of Scala 2 [extractors](https://www.scala-lang.org/files/archive/spec/2.13/08-pattern-matching.html#extractor-patterns). @@ -12,7 +12,7 @@ Scala 3 supports a superset of Scala 2 [extractors](https://www.scala-lang.org/f Extractors are objects that expose a method `unapply` or `unapplySeq`: -```Scala +```scala def unapply[A](x: T)(implicit x: B): U def unapplySeq[A](x: T)(implicit x: B): U ``` @@ -25,7 +25,7 @@ called variadic extractors, which enables variadic patterns. 
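Before the detailed requirements below, here is a minimal sketch of the two kinds of extractor in use. The `Email` and `Words` objects are illustrative names, not part of this page:

```scala
// fixed-arity: unapply returns an Option of a fixed shape
object Email:
  def unapply(s: String): Option[(String, String)] =
    s.split("@") match
      case Array(user, domain) => Some((user, domain))
      case _                   => None

// variadic: unapplySeq returns an Option of a sequence
object Words:
  def unapplySeq(s: String): Option[Seq[String]] =
    Some(s.split(" ").toSeq)

def describe(s: String): String = s match
  case Email(user, domain) => s"mail for $user at $domain"
  case Words(first, rest*) => s"starts with $first, then ${rest.length} more word(s)"
```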
Fixed-arity extractors expose the following signature: -```Scala +```scala def unapply[A](x: T)(implicit x: B): U ``` @@ -36,7 +36,7 @@ The type `U` conforms to one of the following matches: Or `U` conforms to the type `R`: -```Scala +```scala type R = { def isEmpty: Boolean def get: S @@ -62,7 +62,7 @@ A usage of a fixed-arity extractor is irrefutable if one of the following condit Variadic extractors expose the following signature: -```Scala +```scala def unapplySeq[A](x: T)(implicit x: B): U ``` @@ -73,7 +73,7 @@ The type `U` conforms to one of the following matches: Or `U` conforms to the type `R`: -```Scala +```scala type R = { def isEmpty: Boolean def get: S @@ -167,7 +167,7 @@ object Nat: - `N > 1` is the maximum number of consecutive (parameterless `def` or `val`) `_1: P1 ... _N: PN` members in `U` - Pattern-matching on exactly `N` patterns with types `P1, P2, ..., PN` -```Scala +```scala object ProdEmpty: def _1: Int = ??? def _2: String = ??? @@ -180,12 +180,11 @@ object ProdEmpty: case _ => () ``` - ## Sequence Match - `U <: X`, `T2` and `T3` conform to `T1` -```Scala +```scala type X = { def lengthCompare(len: Int): Int // or, `def length: Int` def apply(i: Int): T1 @@ -221,18 +220,18 @@ object CharList: the type of the remaining patterns are determined as in Seq Pattern. ```Scala -class Foo(val name: String, val children: Int *) +class Foo(val name: String, val children: Int*) object Foo: def unapplySeq(f: Foo): Option[(String, Seq[Int])] = Some((f.name, f.children)) def foo(f: Foo) = f match - case Foo(name, ns : _*) => - case Foo(name, x, y, ns : _*) => + case Foo(name, x, y, ns*) => ">= two children." + case Foo(name, ns*) => => "< two children." ``` -There are plans for further simplification, in particular to factor out *product -match* and *name-based match* into a single type of extractor. +There are plans for further simplification, in particular to factor out _product match_ +and _name-based match_ into a single type of extractor. ## Type testing diff --git a/docs/_docs/reference/contextual/context-bounds.md b/docs/_docs/reference/contextual/context-bounds.md index e336f00cc463..e0be7bfd31a6 100644 --- a/docs/_docs/reference/contextual/context-bounds.md +++ b/docs/_docs/reference/contextual/context-bounds.md @@ -10,7 +10,14 @@ A context bound is a shorthand for expressing the common pattern of a context pa def maximum[T: Ord](xs: List[T]): T = xs.reduceLeft(max) ``` -A bound like `: Ord` on a type parameter `T` of a method or class indicates a context parameter `using Ord[T]`. The context parameter(s) generated from context bounds come last in the definition of the containing method or class. For instance, +A bound like `: Ord` on a type parameter `T` of a method or class indicates a context parameter `using Ord[T]`. The context parameter(s) generated from context bounds +are added as follows: + + - If the method parameters end in an implicit parameter list or using clause, + context parameters are added in front of that list. + - Otherwise they are added as a separate parameter clause at the end. + +Example: ```scala def f[T: C1 : C2, U: C3](x: T)(using y: U, z: V): R @@ -19,7 +26,7 @@ def f[T: C1 : C2, U: C3](x: T)(using y: U, z: V): R would expand to ```scala -def f[T, U](x: T)(using y: U, z: V)(using C1[T], C2[T], C3[U]): R +def f[T, U](x: T)(using _: C1[T], _: C2[T], _: C3[U], y: U, z: V): R ``` Context bounds can be combined with subtype bounds. If both are present, subtype bounds come first, e.g. 
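For instance, a sketch using `AnyRef` as the subtype bound and the standard library's `Ordering` as the context bound:

```scala
// subtype bound first, then the context bound
def smaller[T <: AnyRef : Ordering](x: T, y: T): T =
  if summon[Ordering[T]].lt(x, y) then x else y
```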
diff --git a/docs/_docs/reference/contextual/derivation-macro.md b/docs/_docs/reference/contextual/derivation-macro.md index 5ff0007268dd..060d04424132 100644 --- a/docs/_docs/reference/contextual/derivation-macro.md +++ b/docs/_docs/reference/contextual/derivation-macro.md @@ -31,12 +31,12 @@ given derived[T: Type](using Quotes): Expr[Eq[T]] and for comparison reasons we give the same signature we had with `inline`: ```scala -inline given derived[T]: (m: Mirror.Of[T]) => Eq[T] = ??? +inline given derived[T](using Mirror.Of[T]): Eq[T] = ??? ``` Note, that since a type is used in a subsequent stage it will need to be lifted -to a `Type` by using the corresponding context bound. Also, not that we can -summon the quoted `Mirror` inside the body of the `derived` this we can omit it +to a `Type` by using the corresponding context bound. Also, note that we can +summon the quoted `Mirror` inside the body of the `derived` thus we can omit it from the signature. The body of the `derived` method is shown below: @@ -49,15 +49,16 @@ given derived[T: Type](using Quotes): Expr[Eq[T]] = ev match case '{ $m: Mirror.ProductOf[T] { type MirroredElemTypes = elementTypes }} => val elemInstances = summonAll[elementTypes] - val eqProductBody: (Expr[T], Expr[T]) => Expr[Boolean] = (x, y) => - elemInstances.zipWithIndex.foldLeft(Expr(true: Boolean)) { - case (acc, (elem, index)) => - val e1 = '{$x.asInstanceOf[Product].productElement(${Expr(index)})} - val e2 = '{$y.asInstanceOf[Product].productElement(${Expr(index)})} - '{ $acc && $elem.asInstanceOf[Eq[Any]].eqv($e1, $e2) } - } - - '{ eqProduct((x: T, y: T) => ${eqProductBody('x, 'y)}) } + def eqProductBody(x: Expr[Product], y: Expr[Product])(using Quotes): Expr[Boolean] = { + elemInstances.zipWithIndex.foldLeft(Expr(true)) { + case (acc, ('{ $elem: Eq[t] }, index)) => + val indexExpr = Expr(index) + val e1 = '{ $x.productElement($indexExpr).asInstanceOf[t] } + val e2 = '{ $y.productElement($indexExpr).asInstanceOf[t] } + '{ $acc && $elem.eqv($e1, $e2) } + } + } + '{ eqProduct((x: T, y: T) => ${eqProductBody('x.asExprOf[Product], 'y.asExprOf[Product])}) } // case for Mirror.ProductOf[T] // ... diff --git a/docs/_docs/reference/contextual/derivation.md b/docs/_docs/reference/contextual/derivation.md index 972ac945a22d..87ae8a3a9a7e 100644 --- a/docs/_docs/reference/contextual/derivation.md +++ b/docs/_docs/reference/contextual/derivation.md @@ -19,9 +19,9 @@ The `derives` clause generates the following given instances for the `Eq`, `Orde companion object of `Tree`, ```scala -given [T: Eq] : Eq[Tree[T]] = Eq.derived -given [T: Ordering] : Ordering[Tree] = Ordering.derived -given [T: Show] : Show[Tree] = Show.derived +given [T: Eq] : Eq[Tree[T]] = Eq.derived +given [T: Ordering] : Ordering[Tree[T]] = Ordering.derived +given [T: Show] : Show[Tree[T]] = Show.derived ``` We say that `Tree` is the _deriving type_ and that the `Eq`, `Ordering` and `Show` instances are _derived instances_. @@ -29,17 +29,28 @@ We say that `Tree` is the _deriving type_ and that the `Eq`, `Ordering` and `Sho ### Types supporting `derives` clauses All data types can have a `derives` clause. This document focuses primarily on data types which also have a given instance -of the `Mirror` type class available. Instances of the `Mirror` type class are generated automatically by the compiler -for, - -+ enums and enum cases -+ case classes and case objects -+ sealed classes or traits that have only case classes and case objects as children +of the `Mirror` type class available. 
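For instance, such a `Mirror` instance can be summoned directly. This is a small sketch with `Point` as an illustrative case class:

```scala
import scala.deriving.Mirror

case class Point(x: Int, y: Int)

// the compiler synthesizes this instance; its type members describe Point
val m = summon[Mirror.ProductOf[Point]]
// m.MirroredElemTypes  is (Int, Int)
// m.MirroredElemLabels is ("x", "y")
```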
`Mirror` type class instances provide information at the type level about the components and labelling of the type. They also provide minimal term level infrastructure to allow higher level libraries to provide comprehensive derivation support. +Instances of the `Mirror` type class are generated automatically by the compiler +unconditionally for: +- enums and enum cases, +- case objects. + +Instances for `Mirror` are also generated conditionally for: +- case classes where the constructor is visible at the callsite (always true if the companion is not a case object) +- sealed classes and sealed traits where: + - there exists at least one child case, + - each child case is reachable from the parent's definition, + - if the sealed trait/class has no companion, then each child case is reachable from the callsite through the prefix of the type being mirrored, + - and where the compiler can generate a `Mirror` type class instance for each child case. + + +The `Mirror` type class definition is as follows: + ```scala sealed trait Mirror: @@ -119,10 +130,21 @@ new Mirror.Product: new Leaf(...) ``` +If a Mirror cannot be generated automatically for a given type, an error will appear explaining why it is neither a supported +sum type nor a product type. For example, if `A` is a trait that is not sealed, + +``` +No given instance of type deriving.Mirror.Of[A] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[A]: + * trait A is not a generic product because it is not a case class + * trait A is not a generic sum because it is not a sealed trait +``` + + Note the following properties of `Mirror` types, + Properties are encoded using types rather than terms. This means that they have no runtime footprint unless used and also that they are a compile time feature for use with Scala 3's metaprogramming facilities. ++ There is no restriction against the mirrored type being a local or inner class. + The kinds of `MirroredType` and `MirroredElemTypes` match the kind of the data type the mirror is an instance for. This allows `Mirror`s to support ADTs of all kinds. + There is no distinct representation type for sums or products (ie. there is no `HList` or `Coproduct` type as in @@ -145,7 +167,7 @@ following form, ```scala import scala.deriving.Mirror -def derived[T](using Mirror.Of[T]): TC[T] = ... +inline def derived[T](using Mirror.Of[T]): TC[T] = ... ``` That is, the `derived` method takes a context parameter of (some subtype of) type `Mirror` which defines the shape of diff --git a/docs/_docs/reference/dropped-features/nonlocal-returns.md b/docs/_docs/reference/dropped-features/nonlocal-returns.md index f7a78637c848..b7c707ba904b 100644 --- a/docs/_docs/reference/dropped-features/nonlocal-returns.md +++ b/docs/_docs/reference/dropped-features/nonlocal-returns.md @@ -5,7 +5,7 @@ title: "Deprecated: Nonlocal Returns" movedTo: https://docs.scala-lang.org/scala3/reference/dropped-features/nonlocal-returns.html --- -Returning from nested anonymous functions has been deprecated. +Returning from nested anonymous functions has been deprecated, and will produce a warning from version `3.2`. Nonlocal returns are implemented by throwing and catching `scala.runtime.NonLocalReturnException`-s. This is rarely what is intended by the programmer. It can be problematic because of the hidden performance cost of throwing and catching exceptions. 
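For example, the deprecated pattern and one explicit alternative using the standard library's `scala.util.control.NonLocalReturns` helpers might look like this (a sketch; `hasZero` is an illustrative name):

```scala
import scala.util.control.NonLocalReturns.*

// deprecated: `return` inside the function value passed to `foreach`
// is compiled to a throw of NonLocalReturnException (warns from Scala 3.2)
def hasZero(xs: List[Int]): Boolean =
  xs.foreach(x => if x == 0 then return true)
  false

// explicit alternative using the standard library helpers
def hasZeroExplicit(xs: List[Int]): Boolean =
  returning {
    xs.foreach(x => if x == 0 then throwReturn(true))
    false
  }
```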
Furthermore, it is a leaky implementation: a catch-all exception handler can intercept a `NonLocalReturnException`. diff --git a/docs/_docs/reference/dropped-features/this-qualifier.md b/docs/_docs/reference/dropped-features/this-qualifier.md index e1814e1d194e..d7de1b051da1 100644 --- a/docs/_docs/reference/dropped-features/this-qualifier.md +++ b/docs/_docs/reference/dropped-features/this-qualifier.md @@ -29,5 +29,3 @@ This can cause problems if a program tries to access the missing private field v // [C] needed if `field` is to be accessed through reflection val retained = field * field ``` - - diff --git a/docs/_docs/reference/enums/enums-index.md b/docs/_docs/reference/enums/enums-index.md index c49afaffea0d..fb46b3e3ed6b 100644 --- a/docs/_docs/reference/enums/enums-index.md +++ b/docs/_docs/reference/enums/enums-index.md @@ -1,7 +1,7 @@ --- layout: index title: "Enums" -movedTo: https://docs.scala-lang.org/scala3/reference/enums.html +movedTo: https://docs.scala-lang.org/scala3/reference/enums/index.html --- This chapter documents enums in Scala 3. diff --git a/docs/_docs/reference/experimental/explicit-nulls.md b/docs/_docs/reference/experimental/explicit-nulls.md index ec2298e3795a..2b5ffe3559c6 100644 --- a/docs/_docs/reference/experimental/explicit-nulls.md +++ b/docs/_docs/reference/experimental/explicit-nulls.md @@ -480,7 +480,7 @@ The program in [`unsafeNulls`](https://scala-lang.org/api/3.x/scala/runtime/stdL For example, the following code cannot be compiled even using unsafe nulls. Because of the Java interoperation, the type of the get method becomes `T | Null`. -```Scala +```scala def head[T](xs: java.util.List[T]): T = xs.get(0) // error ``` diff --git a/docs/_docs/reference/experimental/fewer-braces.md b/docs/_docs/reference/experimental/fewer-braces.md index f9a856dec39e..9a4a97198e23 100644 --- a/docs/_docs/reference/experimental/fewer-braces.md +++ b/docs/_docs/reference/experimental/fewer-braces.md @@ -4,80 +4,4 @@ title: "Fewer Braces" movedTo: https://docs.scala-lang.org/scala3/reference/experimental/fewer-braces.html --- -By and large, the possible indentation regions coincide with those regions where braces `{...}` are also legal, no matter whether the braces enclose an expression or a set of definitions. There is one exception, though: Arguments to function can be enclosed in braces but they cannot be simply indented instead. Making indentation always significant for function arguments would be too restrictive and fragile. - -To allow such arguments to be written without braces, a variant of the indentation scheme is implemented under language import -```scala -import language.experimental.fewerBraces -``` -Alternatively, it can be enabled with command line option `-language:experimental.fewerBraces`. - -This variant is more contentious and less stable than the rest of the significant indentation scheme. It allows to replace a function argument in braces by a `:` at the end of a line and indented code, similar to the convention for class bodies. It also allows to leave out braces around arguments that are multi-line function values. - -## Using `:` At End Of Line - - -Similar to what is done for classes and objects, a `:` that follows a function reference at the end of a line means braces can be omitted for function arguments. 
Example: -```scala -times(10): - println("ah") - println("ha") -``` - -The colon can also follow an infix operator: - -```scala -credentials ++ : - val file = Path.userHome / ".credentials" - if file.exists - then Seq(Credentials(file)) - else Seq() -``` - -Function calls that take multiple argument lists can also be handled this way: - -```scala -val firstLine = files.get(fileName).fold: - val fileNames = files.values - s"""no file named $fileName found among - |${values.mkString(\n)}""".stripMargin - : - f => - val lines = f.iterator.map(_.readLine) - lines.mkString("\n) -``` - - -## Lambda Arguments Without Braces - -Braces can also be omitted around multiple line function value arguments: -```scala -val xs = List.range(1, 10).map: x => - val y = x - 1 - y * y -xs.foldLeft(0): (x, y) => - x + y * 8 -``` -Braces can be omitted if the lambda starts with a parameter list and `=>` or `=>?` at the end of one line and it has an indented body on the following lines. - -## Syntax Changes - -``` -SimpleExpr ::= ... - | SimpleExpr `:` IndentedArgument - | SimpleExpr FunParams (‘=>’ | ‘?=>’) IndentedArgument -InfixExpr ::= ... - | InfixExpr id `:` IndentedArgument -IndentedArgument ::= indent (CaseClauses | Block) outdent -``` - -Note that a lambda argument must have the `=>` at the end of a line for braces -to be optional. For instance, the following would also be incorrect: - -```scala - xs.map x => x + 1 // error: braces or parentheses are required -``` -The lambda has to be enclosed in braces or parentheses: -```scala - xs.map(x => x + 1) // ok -``` +The documentation contained in this file is now part of [./indentation.html]. diff --git a/docs/_docs/reference/experimental/numeric-literals.md b/docs/_docs/reference/experimental/numeric-literals.md index 56684d2722d5..e8d4f5309c1e 100644 --- a/docs/_docs/reference/experimental/numeric-literals.md +++ b/docs/_docs/reference/experimental/numeric-literals.md @@ -73,7 +73,7 @@ trait FromDigits[T]: def fromDigits(digits: String): T ``` -Implementations of the `fromDigits` convert strings of digits to the values of the +Implementations of `fromDigits` convert strings of digits to the values of the implementation type `T`. The `digits` string consists of digits between `0` and `9`, possibly preceded by a sign ("+" or "-"). Number separator characters `_` are filtered out before diff --git a/docs/_docs/reference/experimental/overview.md b/docs/_docs/reference/experimental/overview.md index ecc253703df6..b4cb6575cf98 100644 --- a/docs/_docs/reference/experimental/overview.md +++ b/docs/_docs/reference/experimental/overview.md @@ -26,4 +26,3 @@ They can be imported at the top-level if all top-level definitions are `@experim Some experimental language features that are still in research and development can be enabled with special compiler options. These include * [`-Yexplicit-nulls`](./explicit-nulls.md). Enable support for tracking null references in the type system. 
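For instance, a brief sketch of the flag's effect (the error line is intentional and would be reported at compile time):

```scala
// with -Yexplicit-nulls enabled, Null is no longer a subtype of reference types
val s: String = null                           // error: found Null, required String
val t: String | Null = null                    // ok: nullability is tracked in the type
val len = if t != null then t.length else 0    // flow typing narrows t to String
```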
- diff --git a/docs/_docs/reference/language-versions/binary-compatibility.md b/docs/_docs/reference/language-versions/binary-compatibility.md index 3e48090ba8c5..d0409d32e6b7 100644 --- a/docs/_docs/reference/language-versions/binary-compatibility.md +++ b/docs/_docs/reference/language-versions/binary-compatibility.md @@ -1,6 +1,7 @@ --- layout: doc-page title: "Binary Compatibility" +movedTo: https://docs.scala-lang.org/scala3/reference/language-versions/binary-compatibility.html --- In Scala 2 different minor versions of the compiler were free to change the way how they encode different language features in JVM bytecode so each bump of the compiler's minor version resulted in breaking binary compatibility and if a project had any Scala dependencies they all needed to be (cross-)compiled to the same minor Scala version that was used in that project itself. On the contrary, Scala 3 has a stable encoding into JVM bytecode. @@ -10,27 +11,3 @@ In addition to classfiles the compilation process in Scala 3 also produces files TASTy format is extensible but it preserves backward compatibility and the evolution happens between minor releases of the language. This means a Scala compiler in version `3.x1.y1` is able to read TASTy files produced by another compiler in version `3.x2.y2` if `x1 >= x2` (assuming two stable versions of the compiler are considered - `SNAPSHOT` or `NIGHTLY` compiler versions can read TASTy in an older stable format but their TASTY versions are not compatible between each other even if the compilers have the same minor version; also compilers in stable versions cannot read TASTy generated by an unstable version). TASTy version number has the format of `.-` and the numbering changes in parallel to language releases in such a way that a bump in language minor version corresponds to a bump in TASTy minor version (e.g. for Scala `3.0.0` the TASTy version is `28.0-0`). Experimental version set to 0 signifies a stable version while others are considered unstable/experimental. TASTy version is not strictly bound to the data format itself - any changes to the API of the standard library also require a change in TASTy minor version. - -Being able to bump the compiler version in a project without having to wait for all of its dependencies to do the same is already a big leap forward when compared to Scala 2. However, we might still try to do better, especially from the perspective of authors of libraries. -If you maintain a library and you would like it to be usable as a dependency for all Scala 3 projects, you would have to always emit TASTy in a version that would be readble by everyone, which would normally mean getting stuck at 3.0.x forever. - -To solve this problem a new experimental compiler flag `-scala-output-version ` (available since 3.1.2) has been added. Setting this flag makes the compiler produce TASTy files that should be possible to use by all Scala 3 compilers in version `` or newer. This flag was inspired by how `-java-output-version` (formerly `-release`) works for specifying the target version of JDK. 
More specifically this enforces emitting TASTy files in an older format ensuring that: -* the code contains no references to parts of the standard library which were added to the API after `` and would crash at runtime when a program is executed with the older version of the standard library on the classpath -* no dependency found on the classpath during compilation (except for the standard library itself) contains TASTy files produced by a compiler newer than `` (otherwise they could potentially leak such disallowed references to the standard library). - -If any of the checks above is not fulfilled or for any other reason older TASTy cannot be emitted (e.g. the code uses some new language features which cannot be expressed the the older format) the entire compilation fails (with errors reported for each of such issues). - -As this feature is experimental it does not have any special support in build tools yet (at least not in sbt 1.6.1 or lower). -E.g. when a project gets compiled with Scala compiler `3.x1.y1` and `-scala-output-version 3.x2` option and then published using sbt -then the standard library in version `3.x1.y1` gets added to the project's dependencies instead of `3.x2.y2`. -When the dependencies are added to the classpath during compilation with Scala `3.x2.y2` the compiler will crash while trying to read TASTy files in the newer format. -A currently known workaround is to modify the build definition of the dependent project by explicitly overriding the version of Scala standard library in dependencies, e.g. - -```scala -dependencyOverrides ++= Seq( - scalaOrganization.value %% "scala3-library" % scalaVersion.value, - scalaOrganization.value %% "scala3-library_sjs1" % scalaVersion.value // for Scala.js projects -) -``` - -The behaviour of `-scala-output-version` flag might still change in the future, especially it's not guaranteed that every new version of the compiler would be able to generate TASTy in all older formats going back to the one produced by `3.0.x` compiler. diff --git a/docs/_docs/reference/language-versions/language-versions.md b/docs/_docs/reference/language-versions/language-versions.md index c38538d3a82a..1bc8d939a7e9 100644 --- a/docs/_docs/reference/language-versions/language-versions.md +++ b/docs/_docs/reference/language-versions/language-versions.md @@ -1,6 +1,7 @@ --- layout: index title: "Language Versions" +movedTo: https://docs.scala-lang.org/scala3/reference/language-versions/index.html --- Additional information on interoperability and migration between Scala 2 and 3 can be found [here](https://docs.scala-lang.org/scala3/guides/migration/compatibility-intro.html). diff --git a/docs/_docs/reference/language-versions/source-compatibility.md b/docs/_docs/reference/language-versions/source-compatibility.md index 029a3674ba73..57bc15d11d88 100644 --- a/docs/_docs/reference/language-versions/source-compatibility.md +++ b/docs/_docs/reference/language-versions/source-compatibility.md @@ -1,25 +1,30 @@ --- layout: doc-page title: "Source Compatibility" -movedTo: https://docs.scala-lang.org/scala3/reference/language-versions.html +movedTo: https://docs.scala-lang.org/scala3/reference/language-versions/source-compatibility.html --- Scala 3 does NOT guarantee source compatibility between different minor language versions (e.g. some syntax valid in 3.x might get deprecated and then phased out in 3.y for y > x). There are also some syntax structures that were valid in Scala 2 but are not anymore in Scala 3. 
However the compiler provides a possibility to specify the desired version of syntax used in a particular file or globally for a run of the compiler to make migration between versions easier. -The default Scala language syntax version currently supported by the Dotty compiler is [`3.0`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/0$.html). There are also other language versions that can be specified instead: +The default Scala language syntax version currently supported by the Dotty compiler is [`3.2`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/2$.html). There are also other language versions that can be specified instead: -- [`3.0-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/0-migration$.html): Same as `3.0` but with a Scala 2 compatibility mode that helps moving Scala 2.13 sources over to Scala 3. In particular, it +- [`3.0-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/0-migration$.html): Same as +`3.0` and `3.1`, but with a Scala 2 compatibility mode that helps moving Scala 2.13 sources over to Scala 3. In particular, it - flags some Scala 2 constructs that are disallowed in Scala 3 as migration warnings instead of hard errors, - changes some rules to be more lenient and backwards compatible with Scala 2.13 - gives some additional warnings where the semantics has changed between Scala 2.13 and 3.0 - in conjunction with `-rewrite`, offer code rewrites from Scala 2.13 to 3.0. -- [`future`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future$.html): A preview of changes introduced in the next versions after 3.0. In the doc pages here we refer to the language version with these changes as `3.1`, but it might be that some of these changes will be rolled out in later `3.x` versions. +- [`3.0`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/0$.html), [`3.1`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/1$.html): the default set of features included in scala versions `3.0.0` to `3.1.3`. +- [`3.2`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/2$.html): the same as `3.0` and `3.1`, but in addition: + - [stricter pattern bindings](https://docs.scala-lang.org/scala3/reference/changed-features/pattern-bindings.html) are now enabled (part of `future` in earlier `3.x` releases), producing warnings for refutable patterns. These warnings can be silenced to achieve the same runtime behavior, but in `future` they become errors and refutable patterns will not compile. + - [Nonlocal returns](https://docs.scala-lang.org/scala3/reference/dropped-features/nonlocal-returns.html) now produce a warning upon usage (they are still an error under `future`). +- [`3.2-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/2-migration$.html): the same as `3.2`, but in conjunction with `-rewrite`, offer code rewrites from Scala `3.0/3.1` to `3.2`. +- [`future`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future$.html): A preview of changes that will be introduced in `3.x` versions after `3.2`. +Some Scala 2 specific idioms are dropped in this version. The feature set supported by this version may grow over time as features become stabilised for preview. -Some Scala 2 specific idioms will be dropped in this version. The feature set supported by this version will be refined over time as we approach its release. 
- -- [`future-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future-migration$.html): Same as `future` but with additional helpers to migrate from `3.0`. Similarly to the helpers available under `3.0-migration`, these include migration warnings and optional rewrites. +- [`future-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future-migration$.html): Same as `future` but with additional helpers to migrate from `3.2`. Similarly to the helpers available under `3.0-migration`, these include migration warnings and optional rewrites. There are two ways to specify a language version : diff --git a/docs/_docs/reference/metaprogramming/compiletime-ops.md b/docs/_docs/reference/metaprogramming/compiletime-ops.md index 944cdac5389a..d101ae0c5c00 100644 --- a/docs/_docs/reference/metaprogramming/compiletime-ops.md +++ b/docs/_docs/reference/metaprogramming/compiletime-ops.md @@ -107,7 +107,7 @@ If an inline expansion results in a call `error(msgStr)` the compiler produces an error message containing the given `msgStr`. ```scala -import scala.compiletime.{error, code} +import scala.compiletime.{error, codeOf} inline def fail() = error("failed for a reason") @@ -118,10 +118,10 @@ fail() // error: failed for a reason or ```scala -inline def fail(p1: => Any) = - error(code"failed on: $p1") +inline def fail(inline p1: Any) = + error("failed on: " + codeOf(p1)) -fail(identity("foo")) // error: failed on: identity("foo") +fail(identity("foo")) // error: failed on: identity[String]("foo") ``` ### The `scala.compiletime.ops` package diff --git a/docs/_docs/reference/new-types/new-types.md b/docs/_docs/reference/new-types/new-types.md index be98f42ec70d..8eb1d7b3bd1b 100644 --- a/docs/_docs/reference/new-types/new-types.md +++ b/docs/_docs/reference/new-types/new-types.md @@ -1,7 +1,7 @@ --- layout: index title: "New Types" -movedTo: https://docs.scala-lang.org/scala3/reference/new-types.html +movedTo: https://docs.scala-lang.org/scala3/reference/new-types/index.html --- This chapter documents the new types introduced in Scala 3. diff --git a/docs/_docs/reference/other-new-features/experimental-defs.md b/docs/_docs/reference/other-new-features/experimental-defs.md index ef9eca1ea7f5..760be63440f8 100644 --- a/docs/_docs/reference/other-new-features/experimental-defs.md +++ b/docs/_docs/reference/other-new-features/experimental-defs.md @@ -38,7 +38,7 @@ Experimental definitions can only be referenced in an experimental scope. Experi
Example 2 - + ```scala import scala.annotation.experimental @@ -72,7 +72,7 @@ Experimental definitions can only be referenced in an experimental scope. Experi
Example 3 - + ```scala import scala.annotation.experimental @@ -85,7 +85,7 @@ Experimental definitions can only be referenced in an experimental scope. Experi
Example 4 - + ```scala import scala.annotation.experimental @@ -106,7 +106,7 @@ Experimental definitions can only be referenced in an experimental scope. Experi
Example 5 - + ```scala @experimental trait ExpSAM { diff --git a/docs/_docs/reference/other-new-features/export.md b/docs/_docs/reference/other-new-features/export.md index 85f03de4104e..ded55738b919 100644 --- a/docs/_docs/reference/other-new-features/export.md +++ b/docs/_docs/reference/other-new-features/export.md @@ -24,7 +24,7 @@ class Copier: private val scanUnit = new Scanner export scanUnit.scan - export printUnit.{status => _, *} + export printUnit.{status as _, *} def status: List[String] = printUnit.status ++ scanUnit.status ``` @@ -55,8 +55,8 @@ one or more selectors `sel_i` that identify what gets an alias. Selectors can be of one of the following forms: - A _simple selector_ `x` creates aliases for all eligible members of `path` that are named `x`. - - A _renaming selector_ `x => y` creates aliases for all eligible members of `path` that are named `x`, but the alias is named `y` instead of `x`. - - An _omitting selector_ `x => _` prevents `x` from being aliased by a subsequent + - A _renaming selector_ `x as y` creates aliases for all eligible members of `path` that are named `x`, but the alias is named `y` instead of `x`. + - An _omitting selector_ `x as _` prevents `x` from being aliased by a subsequent wildcard selector. - A _given selector_ `given x` has an optional type bound `x`. It creates aliases for all eligible given instances that conform to either `x`, or `Any` if `x` is omitted, except for members that are named by a previous simple, renaming, or omitting selector. - A _wildcard selector_ `*` creates aliases for all eligible members of `path` except for given instances, @@ -75,9 +75,23 @@ A member is _eligible_ if all of the following holds: - it is not a constructor, nor the (synthetic) class part of an object, - it is a given instance (declared with `given`) if and only if the export is from a _given selector_. +It is a compile-time error if a simple or renaming selector does not identify +any eligible members. + It is a compile-time error if a simple or renaming selector does not identify any eligible members. -Type members are aliased by type definitions, and term members are aliased by method definitions. Export aliases copy the type and value parameters of the members they refer to. +Type members are aliased by type definitions, and term members are aliased by method definitions. For instance: +```scala +object O: + class C(val x: Int) + def m(c: C): Int = c.x + 1 +export O.* + // generates + // type C = O.C + // def m(c: O.C): Int = O.m(c) +``` + +Export aliases copy the type and value parameters of the members they refer to. Export aliases are always `final`. Aliases of given instances are again defined as givens (and aliases of old-style implicits are `implicit`). Aliases of extensions are again defined as extensions. Aliases of inline methods or values are again defined `inline`. There are no other modifiers that can be given to an alias. This has the following consequences for overriding: - Export aliases cannot be overridden, since they are final. @@ -132,6 +146,34 @@ Export clauses also fill a gap opened by the shift from package objects to top-l of internal compositions available to users of a package. Top-level definitions are not wrapped in a user-defined object, so they can't inherit anything. However, top-level definitions can be export clauses, which supports the facade design pattern in a safer and more flexible way. +## Export Clauses in Extensions + +An export clause may also appear in an extension. 
+ +Example: +```scala +class StringOps(x: String): + def *(n: Int): String = ... + def capitalize: String = ... + +extension (x: String) + def take(n: Int): String = x.substring(0, n) + def drop(n: Int): String = x.substring(n) + private def moreOps = new StringOps(x) + export moreOps.* +``` +In this case the qualifier expression must be an identifier that refers to a unique parameterless extension method in the same extension clause. The export will create +extension methods for all accessible term members +in the result of the qualifier path. For instance, the extension above would be expanded to +```scala +extension (x: String) + def take(n: Int): String = x.substring(0, n) + def drop(n: Int): String = x.substring(n) + private def moreOps = StringOps(x) + def *(n: Int): String = moreOps.*(n) + def capitalize: String = moreOps.capitalize +``` + ## Syntax changes: ``` @@ -139,6 +181,8 @@ TemplateStat ::= ... | Export TopStat ::= ... | Export +ExtMethod ::= ... + | Export Export ::= ‘export’ ImportExpr {‘,’ ImportExpr} ImportExpr ::= SimpleRef {‘.’ id} ‘.’ ImportSpec ImportSpec ::= NamedSelector diff --git a/docs/_docs/reference/other-new-features/indentation.md b/docs/_docs/reference/other-new-features/indentation.md index 46bc21e2597d..8894a310b974 100644 --- a/docs/_docs/reference/other-new-features/indentation.md +++ b/docs/_docs/reference/other-new-features/indentation.md @@ -61,7 +61,7 @@ There are two rules: - after the leading parameters of an `extension`, or - after a `with` in a given instance, or - - after a ": at end of line" token (see below) + - after a `:` at the start of a template body (see discussion of `` below), or - after one of the following tokens: ``` @@ -98,7 +98,7 @@ There are two rules: - An `` is also inserted if the next token following a statement sequence starting with an `` closes an indentation region, i.e. is one of `then`, `else`, `do`, `catch`, `finally`, `yield`, `}`, `)`, `]` or `case`. - An `` is finally inserted in front of a comma that follows a statement sequence starting with an `` if the indented region is itself enclosed in parentheses + - An `` is finally inserted in front of a comma that follows a statement sequence starting with an `` if the indented region is itself enclosed in parentheses. It is an error if the indentation width of the token following an `` does not match the indentation of some previous line in the enclosing indentation region. For instance, the following would be rejected. @@ -134,12 +134,14 @@ is parsed as `if x then a + b + c else d`. The Scala grammar uses the term _template body_ for the definitions of a class, trait, or object that are normally enclosed in braces. The braces around a template body can also be omitted by means of the following rule. -If at the point where a template body can start there is a `:` that occurs at the end -of a line, and that is followed by at least one indented statement, the recognized -token is changed from ":" to ": at end of line". The latter token is one of the tokens -that can start an indentation region. The Scala grammar is changed so an optional ": at end of line" is allowed in front of a template body. +A template body can alternatively consist of a colon followed by one or more indented statements. 
To this purpose we introduce a new `` token that reads as +the standard colon "`:`" but is generated instead of it where `` +is legal according to the context free syntax, but only if the previous token +is an alphanumeric identifier, a backticked identifier, or one of the tokens `this`, `super`, "`)`", and "`]`". -Analogous rules apply for enum bodies and local packages containing nested definitions. +An indentation region can start after a ``. A template body may be either enclosed in braces, or it may start with +` ` and end with ``. +Analogous rules apply for enum bodies, type refinements, and local packages containing nested definitions. With these new rules, the following constructs are all valid: @@ -170,17 +172,19 @@ In each case, the `:` at the end of line can be replaced without change of meani The syntax changes allowing this are as follows: +Define for an arbitrary sequence of tokens or non-terminals `TS`: + ``` -Template ::= InheritClauses [colonEol] [TemplateBody] -EnumDef ::= id ClassConstr InheritClauses [colonEol] EnumBody -Packaging ::= ‘package’ QualId [nl | colonEol] ‘{’ TopStatSeq ‘}’ -SimpleExpr ::= ‘new’ ConstrApp {‘with’ ConstrApp} [[colonEol] TemplateBody] +:<<< TS >>> ::= ‘{’ TS ‘}’ + | +``` +Then the grammar changes as follows: +``` +TemplateBody ::= :<<< [SelfType] TemplateStat {semi TemplateStat} >>> +EnumBody ::= :<<< [SelfType] EnumStat {semi EnumStat} >>> +Refinement ::= :<<< [RefineDcl] {semi [RefineDcl]} >>> +Packaging ::= ‘package’ QualId :<<< TopStats >>> ``` - -Here, `colonEol` stands for ": at end of line", as described above. -The lexical analyzer is modified so that a `:` at the end of a line -is reported as `colonEol` if the parser is at a point where a `colonEol` is -valid as next token. ### Spaces vs Tabs @@ -444,15 +448,15 @@ indented regions where possible. When invoked with options `-rewrite -no-indent` The `-indent` option only works on [new-style syntax](./control-syntax.md). So to go from old-style syntax to new-style indented code one has to invoke the compiler twice, first with options `-rewrite -new-syntax`, then again with options `-rewrite -indent`. To go in the opposite direction, from indented code to old-style syntax, it's `-rewrite -no-indent`, followed by `-rewrite -old-syntax`. -### Variant: Indentation Marker `:` +### Variant: Indentation Marker `:` for Arguments -Generally, the possible indentation regions coincide with those regions where braces `{...}` are also legal, no matter whether the braces enclose an expression or a set of definitions. There is one exception, though: Arguments to function can be enclosed in braces but they cannot be simply indented instead. Making indentation always significant for function arguments would be too restrictive and fragile. +Generally, the possible indentation regions coincide with those regions where braces `{...}` are also legal, no matter whether the braces enclose an expression or a set of definitions. There is one exception, though: Arguments to functions can be enclosed in braces but they cannot be simply indented instead. Making indentation always significant for function arguments would be too restrictive and fragile. To allow such arguments to be written without braces, a variant of the indentation scheme is implemented under language import ```scala import language.experimental.fewerBraces ``` -This variant is more contentious and less stable than the rest of the significant indentation scheme. 
In this variant, a colon `:` at the end of a line is also one of the possible tokens that opens an indentation region. Examples: +In this variant, a `` token is also recognized where function argument would be expected. Examples: ```scala times(10): @@ -462,24 +466,44 @@ times(10): or +```scala +credentials `++`: + val file = Path.userHome / ".credentials" + if file.exists + then Seq(Credentials(file)) + else Seq() +``` + +or + ```scala xs.map: x => val y = x - 1 y * y ``` - -The colon is usable not only for lambdas and by-name parameters, but -also even for ordinary parameters: +What's more, a `:` in these settings can also be followed on the same line by the parameter part and arrow of a lambda. So the last example could be compressed to this: ```scala -credentials ++ : - val file = Path.userHome / ".credentials" - if file.exists - then Seq(Credentials(file)) - else Seq() +xs.map: x => + val y = x - 1 + y * y ``` +and the following would also be legal: +```scala +xs.foldLeft(0): (x, y) => + x + y +``` + +The grammar changes for this variant are as follows. -How does this syntax variant work? Colons at the end of lines are their own token, distinct from normal `:`. -The Scala grammar is changed so that colons at end of lines are accepted at all points -where an opening brace enclosing an argument is legal. Special provisions are taken so that method result types can still use a colon on the end of a line, followed by the actual type on the next. +``` +SimpleExpr ::= ... + | SimpleExpr ColonArgument +InfixExpr ::= ... + | InfixExpr id ColonArgument +ColonArgument ::= colon [LambdaStart] + indent (CaseClauses | Block) outdent +LambdaStart ::= FunParams (‘=>’ | ‘?=>’) + | HkTypeParamClause ‘=>’ +``` diff --git a/docs/_docs/reference/other-new-features/targetName.md b/docs/_docs/reference/other-new-features/targetName.md index d2a654697d15..09886968a232 100644 --- a/docs/_docs/reference/other-new-features/targetName.md +++ b/docs/_docs/reference/other-new-features/targetName.md @@ -29,7 +29,7 @@ The [`@targetName`](https://scala-lang.org/api/3.x/scala/annotation/targetName.h of type `String`. That string is called the _external name_ of the definition that's annotated. - 2. A `@targetName` annotation can be given for all kinds of definitions. + 2. A `@targetName` annotation can be given for all kinds of definitions except a top-level `class`, `trait`, or `object`. 3. The name given in a [`@targetName`](https://scala-lang.org/api/3.x/scala/annotation/targetName.html) annotation must be a legal name for the defined entities on the host platform. diff --git a/docs/_docs/reference/overview.md b/docs/_docs/reference/overview.md index 9b184a7408ba..d982cbeecff8 100644 --- a/docs/_docs/reference/overview.md +++ b/docs/_docs/reference/overview.md @@ -45,7 +45,7 @@ These constructs replace existing constructs with the aim of making the language - [Extension methods](contextual/extension-methods.md) replace implicit classes with a clearer and simpler mechanism. - [Opaque type aliases](other-new-features/opaques.md) - replace most uses of value classes while guaranteeing absence of boxing. + replace most uses of value classes while guaranteeing the absence of boxing. - [Top-level definitions](dropped-features/package-objects.md) replace package objects, dropping syntactic boilerplate. 
- [Export clauses](other-new-features/export.md) @@ -131,7 +131,7 @@ These are additions to the language that make it more powerful or pleasant to us ## Metaprogramming -The following constructs together aim to put metaprogramming in Scala on a new basis. So far, metaprogramming was achieved by a combination of macros and libraries such as [Shapeless](https://github.com/milessabin/shapeless) that were in turn based on some key macros. Current Scala 2 macro mechanisms are a thin veneer on top the current Scala 2 compiler, which makes them fragile and in many cases impossible to port to Scala 3. +The following constructs together aim to put metaprogramming in Scala on a new basis. So far, metaprogramming was achieved by a combination of macros and libraries such as [Shapeless](https://github.com/milessabin/shapeless) that were in turn based on some key macros. Current Scala 2 macro mechanisms are a thin veneer on top of the current Scala 2 compiler, which makes them fragile and in many cases impossible to port to Scala 3. It's worth noting that macros were never included in the [Scala 2 language specification](https://scala-lang.org/files/archive/spec/2.13/) and were so far made available only under an `-experimental` flag. This has not prevented their widespread usage. diff --git a/docs/_docs/reference/syntax.md b/docs/_docs/reference/syntax.md index 57b21659b7c4..015609a450a9 100644 --- a/docs/_docs/reference/syntax.md +++ b/docs/_docs/reference/syntax.md @@ -4,15 +4,29 @@ title: "Scala 3 Syntax Summary" movedTo: https://docs.scala-lang.org/scala3/reference/syntax.html --- + + The following description of Scala tokens uses literal characters `‘c’` when referring to the ASCII fragment `\u0000` – `\u007F`. _Unicode escapes_ are used to represent the [Unicode character](https://www.w3.org/International/articles/definitions-characters/) with the given hexadecimal code: -```ebnf -UnicodeEscape ::= ‘\’ ‘u’ {‘u’} hexDigit hexDigit hexDigit hexDigit ; -hexDigit ::= ‘0’ | … | ‘9’ | ‘A’ | … | ‘F’ | ‘a’ | … | ‘f’ ; +``` +UnicodeEscape ::= ‘\’ ‘u’ {‘u’} hexDigit hexDigit hexDigit hexDigit +hexDigit ::= ‘0’ | … | ‘9’ | ‘A’ | … | ‘F’ | ‘a’ | … | ‘f’ ``` Informal descriptions are typeset as `“some comment”`. @@ -22,70 +36,71 @@ Informal descriptions are typeset as `“some comment”`. The lexical syntax of Scala is given by the following grammar in EBNF form. 
-```ebnf -whiteSpace ::= ‘\u0020’ | ‘\u0009’ | ‘\u000D’ | ‘\u000A’ ; -upper ::= ‘A’ | … | ‘Z’ | ‘\$’ | ‘_’ “… and Unicode category Lu” ; -lower ::= ‘a’ | … | ‘z’ “… and Unicode category Ll” ; -letter ::= upper | lower “… and Unicode categories Lo, Lt, Nl” ; -digit ::= ‘0’ | … | ‘9’ ; -paren ::= ‘(’ | ‘)’ | ‘[’ | ‘]’ | ‘{’ | ‘}’ | ‘'(’ | ‘'[’ | ‘'{’ ; -delim ::= ‘`’ | ‘'’ | ‘"’ | ‘.’ | ‘;’ | ‘,’ ; -opchar ::= “printableChar not matched by (whiteSpace | upper | - lower | letter | digit | paren | delim | opchar | - Unicode_Sm | Unicode_So)” ; -printableChar ::= “all characters in [\u0020, \u007F] inclusive” ; -charEscapeSeq ::= ‘\’ (‘b’ | ‘t’ | ‘n’ | ‘f’ | ‘r’ | ‘"’ | ‘'’ | ‘\’) ; - -op ::= opchar {opchar} ; -varid ::= lower idrest ; +``` +whiteSpace ::= ‘\u0020’ | ‘\u0009’ | ‘\u000D’ | ‘\u000A’ +upper ::= ‘A’ | … | ‘Z’ | ‘\$’ | ‘_’ “… and Unicode category Lu” +lower ::= ‘a’ | … | ‘z’ “… and Unicode category Ll” +letter ::= upper | lower “… and Unicode categories Lo, Lt, Nl” +digit ::= ‘0’ | … | ‘9’ +paren ::= ‘(’ | ‘)’ | ‘[’ | ‘]’ | ‘{’ | ‘}’ +delim ::= ‘`’ | ‘'’ | ‘"’ | ‘.’ | ‘;’ | ‘,’ +opchar ::= ‘!’ | ‘#’ | ‘%’ | ‘&’ | ‘*’ | ‘+’ | ‘-’ | ‘/’ | ‘:’ | + ‘<’ | ‘=’ | ‘>’ | ‘?’ | ‘@’ | ‘\’ | ‘^’ | ‘|’ | ‘~’ + “… and Unicode categories Sm, So” +printableChar ::= “all characters in [\u0020, \u007E] inclusive” +charEscapeSeq ::= ‘\’ (‘b’ | ‘t’ | ‘n’ | ‘f’ | ‘r’ | ‘"’ | ‘'’ | ‘\’) + +op ::= opchar {opchar} +varid ::= lower idrest alphaid ::= upper idrest - | varid ; + | varid plainid ::= alphaid - | op ; + | op id ::= plainid - | ‘`’ { charNoBackQuoteOrNewline | UnicodeEscape | charEscapeSeq } ‘`’ ; -idrest ::= {letter | digit} [‘_’ op] ; -quoteId ::= ‘'’ alphaid ; + | ‘`’ { charNoBackQuoteOrNewline | UnicodeEscape | charEscapeSeq } ‘`’ +idrest ::= {letter | digit} [‘_’ op] +quoteId ::= ‘'’ alphaid +spliceId ::= ‘$’ alphaid ; -integerLiteral ::= (decimalNumeral | hexNumeral) [‘L’ | ‘l’] ; -decimalNumeral ::= ‘0’ | nonZeroDigit [{digit | ‘_’} digit] ; -hexNumeral ::= ‘0’ (‘x’ | ‘X’) hexDigit [{hexDigit | ‘_’} hexDigit] ; -nonZeroDigit ::= ‘1’ | … | ‘9’ ; +integerLiteral ::= (decimalNumeral | hexNumeral) [‘L’ | ‘l’] +decimalNumeral ::= ‘0’ | nonZeroDigit [{digit | ‘_’} digit] +hexNumeral ::= ‘0’ (‘x’ | ‘X’) hexDigit [{hexDigit | ‘_’} hexDigit] +nonZeroDigit ::= ‘1’ | … | ‘9’ floatingPointLiteral ::= [decimalNumeral] ‘.’ digit [{digit | ‘_’} digit] [exponentPart] [floatType] | decimalNumeral exponentPart [floatType] - | decimalNumeral floatType ; -exponentPart ::= (‘E’ | ‘e’) [‘+’ | ‘-’] digit [{digit | ‘_’} digit] ; -floatType ::= ‘F’ | ‘f’ | ‘D’ | ‘d’ ; + | decimalNumeral floatType +exponentPart ::= (‘E’ | ‘e’) [‘+’ | ‘-’] digit [{digit | ‘_’} digit] +floatType ::= ‘F’ | ‘f’ | ‘D’ | ‘d’ -booleanLiteral ::= ‘true’ | ‘false’ ; +booleanLiteral ::= ‘true’ | ‘false’ -characterLiteral ::= ‘'’ (printableChar | charEscapeSeq) ‘'’ ; +characterLiteral ::= ‘'’ (printableChar | charEscapeSeq) ‘'’ stringLiteral ::= ‘"’ {stringElement} ‘"’ - | ‘"""’ multiLineChars ‘"""’ ; + | ‘"""’ multiLineChars ‘"""’ stringElement ::= printableChar \ (‘"’ | ‘\’) | UnicodeEscape - | charEscapeSeq ; -multiLineChars ::= {[‘"’] [‘"’] char \ ‘"’} {‘"’} ; + | charEscapeSeq +multiLineChars ::= {[‘"’] [‘"’] char \ ‘"’} {‘"’} processedStringLiteral ::= alphaid ‘"’ {[‘\’] processedStringPart | ‘\\’ | ‘\"’} ‘"’ - | alphaid ‘"""’ {[‘"’] [‘"’] char \ (‘"’ | ‘$’) | escape} {‘"’} ‘"""’ ; + | alphaid ‘"""’ {[‘"’] [‘"’] char \ (‘"’ | ‘$’) | escape} {‘"’} ‘"""’ processedStringPart - ::= printableChar \ (‘"’ | ‘$’ | ‘\’) | escape ; + ::= 
printableChar \ (‘"’ | ‘$’ | ‘\’) | escape escape ::= ‘$$’ | ‘$’ letter { letter | digit } - | ‘{’ Block [‘;’ whiteSpace stringFormat whiteSpace] ‘}’ ; -stringFormat ::= {printableChar \ (‘"’ | ‘}’ | ‘ ’ | ‘\t’ | ‘\n’)} ; + | ‘{’ Block [‘;’ whiteSpace stringFormat whiteSpace] ‘}’ +stringFormat ::= {printableChar \ (‘"’ | ‘}’ | ‘ ’ | ‘\t’ | ‘\n’)} -symbolLiteral ::= ‘'’ plainid // until 2.13 ; +symbolLiteral ::= ‘'’ plainid // until 2.13 comment ::= ‘/*’ “any sequence of characters; nested comments are allowed” ‘*/’ - | ‘//’ “any sequence of characters up to end of line” ; + | ‘//’ “any sequence of characters up to end of line” -nl ::= “new line character” ; -semi ::= ‘;’ | nl {nl} ; +nl ::= “new line character” +semi ::= ‘;’ | nl {nl} ``` ## Optional Braces @@ -95,20 +110,24 @@ The lexical analyzer also inserts `indent` and `outdent` tokens that represent r In the context-free productions below we use the notation `<<< ts >>>` to indicate a token sequence `ts` that is either enclosed in a pair of braces `{ ts }` or that constitutes an indented region `indent ts outdent`. Analogously, the notation `:<<< ts >>>` indicates a token sequence `ts` that is either enclosed in a pair of braces `{ ts }` or that constitutes an indented region `indent ts outdent` that follows -a `:` at the end of a line. +a `colon` token. -```ebnf +A `colon` token reads as the standard colon "`:`" but is generated instead of it where `colon` is legal according to the context free syntax, but only if the previous token +is an alphanumeric identifier, a backticked identifier, or one of the tokens `this`, `super`, `new`, "`)`", and "`]`". + +``` +colon ::= ':' -- with side conditions explained above <<< ts >>> ::= ‘{’ ts ‘}’ - | indent ts outdent ; + | indent ts outdent :<<< ts >>> ::= [nl] ‘{’ ts ‘}’ - | `:` indent ts outdent ; + | colon indent ts outdent ``` ## Keywords ### Regular keywords -```ebnf +``` abstract case catch class def do else enum export extends false final finally for given if implicit import lazy match new @@ -121,9 +140,8 @@ type val var while with yield ### Soft keywords -```ebnf -as derives end extension infix inline opaque open throws -transparent using | * + - +``` +as derives end extension infix inline opaque open transparent using | * + - ``` See the [separate section on soft keywords](./soft-modifier.md) for additional @@ -135,45 +153,45 @@ The context-free syntax of Scala is given by the following EBNF grammar: ### Literals and Paths -```ebnf +``` SimpleLiteral ::= [‘-’] integerLiteral | [‘-’] floatingPointLiteral | booleanLiteral | characterLiteral - | stringLiteral ; + | stringLiteral Literal ::= SimpleLiteral | processedStringLiteral | symbolLiteral - | ‘null’ ; + | ‘null’ -QualId ::= id {‘.’ id} ; -ids ::= id {‘,’ id} ; +QualId ::= id {‘.’ id} +ids ::= id {‘,’ id} SimpleRef ::= id | [id ‘.’] ‘this’ - | [id ‘.’] ‘super’ [ClassQualifier] ‘.’ id ; + | [id ‘.’] ‘super’ [ClassQualifier] ‘.’ id -ClassQualifier ::= ‘[’ id ‘]’ ; +ClassQualifier ::= ‘[’ id ‘]’ ``` ### Types -```ebnf +``` Type ::= FunType | HkTypeParamClause ‘=>>’ Type | FunParamClause ‘=>>’ Type | MatchType - | InfixType ; + | InfixType FunType ::= FunTypeArgs (‘=>’ | ‘?=>’) Type - | HKTypeParamClause '=>' Type ; + | HKTypeParamClause '=>' Type FunTypeArgs ::= InfixType | ‘(’ [ FunArgTypes ] ‘)’ - | FunParamClause ; -FunParamClause ::= ‘(’ TypedFunParam {‘,’ TypedFunParam } ‘)’ ; -TypedFunParam ::= id ‘:’ Type ; -MatchType ::= InfixType `match` <<< TypeCaseClauses >>> ; -InfixType ::= RefinedType {id [nl] RefinedType} ; 
-RefinedType ::= AnnotType {[nl] Refinement} ; -AnnotType ::= SimpleType {Annotation} ; + | FunParamClause +FunParamClause ::= ‘(’ TypedFunParam {‘,’ TypedFunParam } ‘)’ +TypedFunParam ::= id ‘:’ Type +MatchType ::= InfixType `match` <<< TypeCaseClauses >>> +InfixType ::= RefinedType {id [nl] RefinedType} +RefinedType ::= AnnotType {[nl] Refinement} +AnnotType ::= SimpleType {Annotation} SimpleType ::= SimpleLiteral | ‘?’ TypeBounds @@ -182,37 +200,35 @@ SimpleType ::= SimpleLiteral | Singleton ‘.’ ‘type’ | ‘(’ Types ‘)’ | Refinement - | ‘$’ ‘{’ Block ‘}’ -- unless inside quoted pattern - | ‘$’ ‘{’ Pattern ‘}’ -- only inside quoted pattern | SimpleType1 TypeArgs - | SimpleType1 ‘#’ id ; + | SimpleType1 ‘#’ id Singleton ::= SimpleRef | SimpleLiteral - | Singleton ‘.’ id ; + | Singleton ‘.’ id FunArgType ::= Type - | ‘=>’ Type ; -FunArgTypes ::= FunArgType { ‘,’ FunArgType } ; -ParamType ::= [‘=>’] ParamValueType ; -ParamValueType ::= Type [‘*’] ; -TypeArgs ::= ‘[’ Types ‘]’ ; -Refinement ::= ‘{’ [RefineDcl] {semi [RefineDcl]} ‘}’ ; -TypeBounds ::= [‘>:’ Type] [‘<:’ Type] ; -TypeParamBounds ::= TypeBounds {‘:’ Type} ; -Types ::= Type {‘,’ Type} ; + | ‘=>’ Type +FunArgTypes ::= FunArgType { ‘,’ FunArgType } +ParamType ::= [‘=>’] ParamValueType +ParamValueType ::= Type [‘*’] +TypeArgs ::= ‘[’ Types ‘]’ +Refinement ::= :<<< [RefineDcl] {semi [RefineDcl]} >>> +TypeBounds ::= [‘>:’ Type] [‘<:’ Type] +TypeParamBounds ::= TypeBounds {‘:’ Type} +Types ::= Type {‘,’ Type} ``` ### Expressions -```ebnf +``` Expr ::= FunParams (‘=>’ | ‘?=>’) Expr | HkTypeParamClause ‘=>’ Expr - | Expr1 ; + | Expr1 BlockResult ::= FunParams (‘=>’ | ‘?=>’) Block | HkTypeParamClause ‘=>’ Block - | Expr1 ; + | Expr1 FunParams ::= Bindings | id - | ‘_’ ; + | ‘_’ Expr1 ::= [‘inline’] ‘if’ ‘(’ Expr ‘)’ {nl} Expr [[semi] ‘else’ Expr] | [‘inline’] ‘if’ Expr ‘then’ Expr [[semi] ‘else’ Expr] | ‘while’ ‘(’ Expr ‘)’ {nl} Expr @@ -226,23 +242,22 @@ Expr1 ::= [‘inline’] ‘if’ ‘(’ Expr ‘)’ {nl} Expr [[ | PrefixOperator SimpleExpr ‘=’ Expr | SimpleExpr ArgumentExprs ‘=’ Expr | PostfixExpr [Ascription] - | ‘inline’ InfixExpr MatchClause ; + | ‘inline’ InfixExpr MatchClause Ascription ::= ‘:’ InfixType - | ‘:’ Annotation {Annotation} ; -Catches ::= ‘catch’ (Expr | ExprCaseClause) ; -PostfixExpr ::= InfixExpr [id] -- only if language.postfixOperators is enabled ; + | ‘:’ Annotation {Annotation} +Catches ::= ‘catch’ (Expr | ExprCaseClause) +PostfixExpr ::= InfixExpr [id] -- only if language.postfixOperators is enabled InfixExpr ::= PrefixExpr | InfixExpr id [nl] InfixExpr - | InfixExpr MatchClause ; -MatchClause ::= ‘match’ <<< CaseClauses >>> ; -PrefixExpr ::= [PrefixOperator] SimpleExpr ; -PrefixOperator ::= ‘-’ | ‘+’ | ‘~’ | ‘!’ ; + | InfixExpr MatchClause +MatchClause ::= ‘match’ <<< CaseClauses >>> +PrefixExpr ::= [PrefixOperator] SimpleExpr +PrefixOperator ::= ‘-’ | ‘+’ | ‘~’ | ‘!’ -- unless backquoted SimpleExpr ::= SimpleRef | Literal | ‘_’ | BlockExpr - | ‘$’ ‘{’ Block ‘}’ -- unless inside quoted pattern - | ‘$’ ‘{’ Pattern ‘}’ -- only inside quoted pattern + | ExprSplice | Quoted | quoteId -- only inside splices | ‘new’ ConstrApp {‘with’ ConstrApp} [TemplateBody] @@ -251,174 +266,179 @@ SimpleExpr ::= SimpleRef | SimpleExpr ‘.’ id | SimpleExpr ‘.’ MatchClause | SimpleExpr TypeArgs - | SimpleExpr ArgumentExprs ; + | SimpleExpr ArgumentExprs Quoted ::= ‘'’ ‘{’ Block ‘}’ - | ‘'’ ‘[’ Type ‘]’ ; -ExprsInParens ::= ExprInParens {‘,’ ExprInParens} ; + | ‘'’ ‘[’ Type ‘]’ +ExprSplice ::= spliceId -- if inside quoted block + | ‘$’ ‘{’ Block 
‘}’ -- unless inside quoted pattern + | ‘$’ ‘{’ Pattern ‘}’ -- when inside quoted pattern +ExprsInParens ::= ExprInParens {‘,’ ExprInParens} ExprInParens ::= PostfixExpr ‘:’ Type - | Expr ; + | Expr ParArgumentExprs ::= ‘(’ [‘using’] ExprsInParens ‘)’ - | ‘(’ [ExprsInParens ‘,’] PostfixExpr ‘*’ ‘)’ ; + | ‘(’ [ExprsInParens ‘,’] PostfixExpr ‘*’ ‘)’ ArgumentExprs ::= ParArgumentExprs - | BlockExpr ; -BlockExpr ::= <<< (CaseClauses | Block) >>> ; -Block ::= {BlockStat semi} [BlockResult] ; + | BlockExpr +BlockExpr ::= <<< (CaseClauses | Block) >>> +Block ::= {BlockStat semi} [BlockResult] BlockStat ::= Import | {Annotation {nl}} {LocalModifier} Def | Extension | Expr1 - | EndMarker ; + | EndMarker ForExpr ::= ‘for’ ‘(’ Enumerators0 ‘)’ {nl} [‘do‘ | ‘yield’] Expr | ‘for’ ‘{’ Enumerators0 ‘}’ {nl} [‘do‘ | ‘yield’] Expr - | ‘for’ Enumerators0 (‘do‘ | ‘yield’) Expr ; -Enumerators0 ::= {nl} Enumerators [semi] ; -Enumerators ::= Generator {semi Enumerator | Guard} ; + | ‘for’ Enumerators0 (‘do‘ | ‘yield’) Expr +Enumerators0 ::= {nl} Enumerators [semi] +Enumerators ::= Generator {semi Enumerator | Guard} Enumerator ::= Generator | Guard {Guard} - | Pattern1 ‘=’ Expr ; -Generator ::= [‘case’] Pattern1 ‘<-’ Expr ; -Guard ::= ‘if’ PostfixExpr ; - -CaseClauses ::= CaseClause { CaseClause } ; -CaseClause ::= ‘case’ Pattern [Guard] ‘=>’ Block ; -ExprCaseClause ::= ‘case’ Pattern [Guard] ‘=>’ Expr ; -TypeCaseClauses ::= TypeCaseClause { TypeCaseClause } ; -TypeCaseClause ::= ‘case’ InfixType ‘=>’ Type [semi] ; - -Pattern ::= Pattern1 { ‘|’ Pattern1 } ; -Pattern1 ::= Pattern2 [‘:’ RefinedType] ; -Pattern2 ::= [id ‘@’] InfixPattern [‘*’] ; -InfixPattern ::= SimplePattern { id [nl] SimplePattern } ; + | Pattern1 ‘=’ Expr +Generator ::= [‘case’] Pattern1 ‘<-’ Expr +Guard ::= ‘if’ PostfixExpr + +CaseClauses ::= CaseClause { CaseClause } +CaseClause ::= ‘case’ Pattern [Guard] ‘=>’ Block +ExprCaseClause ::= ‘case’ Pattern [Guard] ‘=>’ Expr +TypeCaseClauses ::= TypeCaseClause { TypeCaseClause } +TypeCaseClause ::= ‘case’ (InfixType | ‘_’) ‘=>’ Type [semi] + +Pattern ::= Pattern1 { ‘|’ Pattern1 } +Pattern1 ::= Pattern2 [‘:’ RefinedType] +Pattern2 ::= [id ‘@’] InfixPattern [‘*’] +InfixPattern ::= SimplePattern { id [nl] SimplePattern } SimplePattern ::= PatVar | Literal | ‘(’ [Patterns] ‘)’ | Quoted | SimplePattern1 [TypeArgs] [ArgumentPatterns] - | ‘given’ RefinedType ; + | ‘given’ RefinedType SimplePattern1 ::= SimpleRef - | SimplePattern1 ‘.’ id ; + | SimplePattern1 ‘.’ id PatVar ::= varid - | ‘_’ ; -Patterns ::= Pattern {‘,’ Pattern} ; + | ‘_’ +Patterns ::= Pattern {‘,’ Pattern} ArgumentPatterns ::= ‘(’ [Patterns] ‘)’ - | ‘(’ [Patterns ‘,’] PatVar ‘*’ ‘)’ ; + | ‘(’ [Patterns ‘,’] PatVar ‘*’ ‘)’ ``` ### Type and Value Parameters -```ebnf -ClsTypeParamClause::= ‘[’ ClsTypeParam {‘,’ ClsTypeParam} ‘]’ ; -ClsTypeParam ::= {Annotation} [‘+’ | ‘-’] id [HkTypeParamClause] TypeParamBounds ; +``` +ClsTypeParamClause::= ‘[’ ClsTypeParam {‘,’ ClsTypeParam} ‘]’ +ClsTypeParam ::= {Annotation} [‘+’ | ‘-’] id [HkTypeParamClause] TypeParamBounds -DefTypeParamClause::= ‘[’ DefTypeParam {‘,’ DefTypeParam} ‘]’ ; -DefTypeParam ::= {Annotation} id [HkTypeParamClause] TypeParamBounds ; +DefTypeParamClause::= ‘[’ DefTypeParam {‘,’ DefTypeParam} ‘]’ +DefTypeParam ::= {Annotation} id [HkTypeParamClause] TypeParamBounds -TypTypeParamClause::= ‘[’ TypTypeParam {‘,’ TypTypeParam} ‘]’ ; -TypTypeParam ::= {Annotation} id [HkTypeParamClause] TypeBounds ; +TypTypeParamClause::= ‘[’ TypTypeParam {‘,’ TypTypeParam} ‘]’ +TypTypeParam ::= 
{Annotation} id [HkTypeParamClause] TypeBounds -HkTypeParamClause ::= ‘[’ HkTypeParam {‘,’ HkTypeParam} ‘]’ ; -HkTypeParam ::= {Annotation} [‘+’ | ‘-’] (id [HkTypeParamClause] | ‘_’) TypeBounds ; +HkTypeParamClause ::= ‘[’ HkTypeParam {‘,’ HkTypeParam} ‘]’ +HkTypeParam ::= {Annotation} [‘+’ | ‘-’] (id [HkTypeParamClause] | ‘_’) TypeBounds -ClsParamClauses ::= {ClsParamClause} [[nl] ‘(’ [‘implicit’] ClsParams ‘)’] ; +ClsParamClauses ::= {ClsParamClause} [[nl] ‘(’ [‘implicit’] ClsParams ‘)’] ClsParamClause ::= [nl] ‘(’ ClsParams ‘)’ - | [nl] ‘(’ ‘using’ (ClsParams | FunArgTypes) ‘)’ ; -ClsParams ::= ClsParam {‘,’ ClsParam} ; -ClsParam ::= {Annotation} [{Modifier} (‘val’ | ‘var’) | ‘inline’] Param ; -Param ::= id ‘:’ ParamType [‘=’ Expr] ; - -DefParamClauses ::= {DefParamClause} [[nl] ‘(’ [‘implicit’] DefParams ‘)’] ; -DefParamClause ::= [nl] ‘(’ DefParams ‘)’ | UsingParamClause ; -UsingParamClause ::= [nl] ‘(’ ‘using’ (DefParams | FunArgTypes) ‘)’ ; -DefParams ::= DefParam {‘,’ DefParam} ; -DefParam ::= {Annotation} [‘inline’] Param ; + | [nl] ‘(’ ‘using’ (ClsParams | FunArgTypes) ‘)’ +ClsParams ::= ClsParam {‘,’ ClsParam} +ClsParam ::= {Annotation} [{Modifier} (‘val’ | ‘var’) | ‘inline’] Param +Param ::= id ‘:’ ParamType [‘=’ Expr] + +DefParamClauses ::= {DefParamClause} [[nl] ‘(’ [‘implicit’] DefParams ‘)’] +DefParamClause ::= [nl] ‘(’ DefParams ‘)’ | UsingParamClause +UsingParamClause ::= [nl] ‘(’ ‘using’ (DefParams | FunArgTypes) ‘)’ +DefParams ::= DefParam {‘,’ DefParam} +DefParam ::= {Annotation} [‘inline’] Param ``` ### Bindings and Imports -```ebnf -Bindings ::= ‘(’ [Binding {‘,’ Binding}] ‘)’ ; -Binding ::= (id | ‘_’) [‘:’ Type] ; +``` +Bindings ::= ‘(’ [Binding {‘,’ Binding}] ‘)’ +Binding ::= (id | ‘_’) [‘:’ Type] Modifier ::= LocalModifier | AccessModifier | ‘override’ - | ‘opaque’ ; + | ‘opaque’ LocalModifier ::= ‘abstract’ | ‘final’ | ‘sealed’ | ‘open’ | ‘implicit’ | ‘lazy’ - | ‘inline’ ; -AccessModifier ::= (‘private’ | ‘protected’) [AccessQualifier] ; -AccessQualifier ::= ‘[’ id ‘]’ ; + | ‘inline’ +AccessModifier ::= (‘private’ | ‘protected’) [AccessQualifier] +AccessQualifier ::= ‘[’ id ‘]’ -Annotation ::= ‘@’ SimpleType1 {ParArgumentExprs} ; +Annotation ::= ‘@’ SimpleType1 {ParArgumentExprs} -Import ::= ‘import’ ImportExpr {‘,’ ImportExpr} ; -Export ::= ‘export’ ImportExpr {‘,’ ImportExpr} ; +Import ::= ‘import’ ImportExpr {‘,’ ImportExpr} +Export ::= ‘export’ ImportExpr {‘,’ ImportExpr} ImportExpr ::= SimpleRef {‘.’ id} ‘.’ ImportSpec - | SimpleRef ‘as’ id ; + | SimpleRef ‘as’ id ImportSpec ::= NamedSelector | WildcardSelector - | ‘{’ ImportSelectors) ‘}’ ; -NamedSelector ::= id [‘as’ (id | ‘_’)] ; -WildCardSelector ::= ‘*' | ‘given’ [InfixType] ; + | ‘{’ ImportSelectors) ‘}’ +NamedSelector ::= id [‘as’ (id | ‘_’)] +WildCardSelector ::= ‘*' | ‘given’ [InfixType] ImportSelectors ::= NamedSelector [‘,’ ImportSelectors] - | WildCardSelector {‘,’ WildCardSelector} ; + | WildCardSelector {‘,’ WildCardSelector} -EndMarker ::= ‘end’ EndMarkerTag -- when followed by EOL ; +EndMarker ::= ‘end’ EndMarkerTag -- when followed by EOL EndMarkerTag ::= id | ‘if’ | ‘while’ | ‘for’ | ‘match’ | ‘try’ - | ‘new’ | ‘this’ | ‘given’ | ‘extension’ | ‘val’ ; + | ‘new’ | ‘this’ | ‘given’ | ‘extension’ | ‘val’ ``` ### Declarations and Definitions -```ebnf +``` RefineDcl ::= ‘val’ ValDcl | ‘def’ DefDcl - | ‘type’ {nl} TypeDcl ; + | ‘type’ {nl} TypeDcl Dcl ::= RefineDcl - | ‘var’ VarDcl ; -ValDcl ::= ids ‘:’ Type ; -VarDcl ::= ids ‘:’ Type ; -DefDcl ::= DefSig ‘:’ Type ; -DefSig ::= id 
[DefTypeParamClause] DefParamClauses ; -TypeDcl ::= id [TypeParamClause] {FunParamClause} TypeBounds [‘=’ Type] ; + | ‘var’ VarDcl +ValDcl ::= ids ‘:’ Type +VarDcl ::= ids ‘:’ Type +DefDcl ::= DefSig ‘:’ Type +DefSig ::= id [DefTypeParamClause] DefParamClauses +TypeDcl ::= id [TypeParamClause] {FunParamClause} TypeBounds [‘=’ Type] Def ::= ‘val’ PatDef | ‘var’ PatDef | ‘def’ DefDef | ‘type’ {nl} TypeDcl - | TmplDef ; + | TmplDef PatDef ::= ids [‘:’ Type] ‘=’ Expr - | Pattern2 [‘:’ Type] ‘=’ Expr ; + | Pattern2 [‘:’ Type] ‘=’ Expr DefDef ::= DefSig [‘:’ Type] ‘=’ Expr - | ‘this’ DefParamClause DefParamClauses ‘=’ ConstrExpr ; + | ‘this’ DefParamClause DefParamClauses ‘=’ ConstrExpr TmplDef ::= ([‘case’] ‘class’ | ‘trait’) ClassDef | [‘case’] ‘object’ ObjectDef | ‘enum’ EnumDef - | ‘given’ GivenDef ; -ClassDef ::= id ClassConstr [Template] ; -ClassConstr ::= [ClsTypeParamClause] [ConstrMods] ClsParamClauses ; -ConstrMods ::= {Annotation} [AccessModifier] ; -ObjectDef ::= id [Template] ; -EnumDef ::= id ClassConstr InheritClauses EnumBody ; -GivenDef ::= [GivenSig] (AnnotType [‘=’ Expr] | StructuralInstance) ; -GivenSig ::= [id] [DefTypeParamClause] {UsingParamClause} ‘:’ -- one of `id`, `DefParamClause`, `UsingParamClause` must be present ; -StructuralInstance ::= ConstrApp {‘with’ ConstrApp} [‘with’ TemplateBody] ; + | ‘given’ GivenDef +ClassDef ::= id ClassConstr [Template] +ClassConstr ::= [ClsTypeParamClause] [ConstrMods] ClsParamClauses +ConstrMods ::= {Annotation} [AccessModifier] +ObjectDef ::= id [Template] +EnumDef ::= id ClassConstr InheritClauses EnumBody +GivenDef ::= [GivenSig] (AnnotType [‘=’ Expr] | StructuralInstance) +GivenSig ::= [id] [DefTypeParamClause] {UsingParamClause} ‘:’ -- one of `id`, `DefParamClause`, `UsingParamClause` must be present +StructuralInstance ::= ConstrApp {‘with’ ConstrApp} [‘with’ WithTemplateBody] Extension ::= ‘extension’ [DefTypeParamClause] {UsingParamClause} - ‘(’ DefParam ‘)’ {UsingParamClause} ExtMethods ; -ExtMethods ::= ExtMethod | [nl] <<< ExtMethod {semi ExtMethod} >>> ; -ExtMethod ::= {Annotation [nl]} {Modifier} ‘def’ DefDef ; -Template ::= InheritClauses [TemplateBody] ; -InheritClauses ::= [‘extends’ ConstrApps] [‘derives’ QualId {‘,’ QualId}] ; -ConstrApps ::= ConstrApp ({‘,’ ConstrApp} | {‘with’ ConstrApp}) ; -ConstrApp ::= SimpleType1 {Annotation} {ParArgumentExprs} ; + ‘(’ DefParam ‘)’ {UsingParamClause} ExtMethods +ExtMethods ::= ExtMethod | [nl] <<< ExtMethod {semi ExtMethod} >>> +ExtMethod ::= {Annotation [nl]} {Modifier} ‘def’ DefDef + | Export +Template ::= InheritClauses [TemplateBody] +InheritClauses ::= [‘extends’ ConstrApps] [‘derives’ QualId {‘,’ QualId}] +ConstrApps ::= ConstrApp ({‘,’ ConstrApp} | {‘with’ ConstrApp}) +ConstrApp ::= SimpleType1 {Annotation} {ParArgumentExprs} ConstrExpr ::= SelfInvocation - | <<< SelfInvocation {semi BlockStat} >>> ; -SelfInvocation ::= ‘this’ ArgumentExprs {ArgumentExprs} ; + | <<< SelfInvocation {semi BlockStat} >>> +SelfInvocation ::= ‘this’ ArgumentExprs {ArgumentExprs} -TemplateBody ::= :<<< [SelfType] TemplateStat {semi TemplateStat} >>> ; +WithTemplateBody ::= <<< [SelfType] TemplateStat {semi TemplateStat} >>> +TemplateBody ::= :<<< [SelfType] TemplateStat {semi TemplateStat} >>> TemplateStat ::= Import | Export | {Annotation [nl]} {Modifier} Def @@ -426,16 +446,16 @@ TemplateStat ::= Import | Extension | Expr1 | EndMarker - | ; + | SelfType ::= id [‘:’ InfixType] ‘=>’ - | ‘this’ ‘:’ InfixType ‘=>’ ; + | ‘this’ ‘:’ InfixType ‘=>’ -EnumBody ::= :<<< [SelfType] EnumStat {semi 
EnumStat} >>> ; +EnumBody ::= :<<< [SelfType] EnumStat {semi EnumStat} >>> EnumStat ::= TemplateStat - | {Annotation [nl]} {Modifier} EnumCase ; -EnumCase ::= ‘case’ (id ClassConstr [‘extends’ ConstrApps]] | ids) ; + | {Annotation [nl]} {Modifier} EnumCase +EnumCase ::= ‘case’ (id ClassConstr [‘extends’ ConstrApps]] | ids) -TopStats ::= TopStat {semi TopStat} ; +TopStats ::= TopStat {semi TopStat} TopStat ::= Import | Export | {Annotation [nl]} {Modifier} Def @@ -443,9 +463,9 @@ TopStat ::= Import | Packaging | PackageObject | EndMarker - | ; -Packaging ::= ‘package’ QualId :<<< TopStats >>> ; -PackageObject ::= ‘package’ ‘object’ ObjectDef ; + | +Packaging ::= ‘package’ QualId :<<< TopStats >>> +PackageObject ::= ‘package’ ‘object’ ObjectDef -CompilationUnit ::= {‘package’ QualId semi} TopStats ; +CompilationUnit ::= {‘package’ QualId semi} TopStats ``` diff --git a/project/resources/referenceReplacements/sidebar.yml b/project/resources/referenceReplacements/sidebar.yml index 06507028c723..be67a0d6da99 100644 --- a/project/resources/referenceReplacements/sidebar.yml +++ b/project/resources/referenceReplacements/sidebar.yml @@ -135,6 +135,7 @@ subsection: index: reference/experimental/overview.md subsection: - page: reference/experimental/fewer-braces.md + hidden: true - page: reference/experimental/canthrow.md - page: reference/experimental/erased-defs.md - page: reference/experimental/erased-defs-spec.md
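
As a concrete illustration of the `colon` token and the `:<<< ts >>>` regions specified in the grammar above, here is a minimal, self-contained Scala 3 sketch; the names `Greetings`, `ConsoleGreeter`, and `Color` are invented for the example. Each `:` at the end of a line below follows an alphanumeric identifier or a closing "`)`", so a `colon` token is generated and the indented region that follows is parsed as a template body.

```scala
// `colon` after the identifier `Greetings`: the indented region is the object's template body.
object Greetings:
  val default = "Hello"

// `colon` directly after the `)` that closes the constructor parameter list.
class ConsoleGreeter(prefix: String):
  def greet(name: String): String = s"$prefix, $name"

// Enum bodies use the same `:<<< ts >>>` production as class and object bodies.
enum Color:
  case Red, Green, Blue

@main def run(): Unit =
  val greeter = ConsoleGreeter(Greetings.default) // universal apply method, no `new` needed
  println(greeter.greet("Scala"))                 // Hello, Scala
  println(Color.values.toList)                    // List(Red, Green, Blue)
```

The same `:<<< ts >>>` pattern also appears in the `Refinement` and `Packaging` productions, so refinements and `package p:` blocks can likewise be written with a colon followed by an indented region instead of braces.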