[AutoDiff] [stdlib] Conform 'Array.DifferentiableView' to collection protocols. #26062


Closed
wants to merge 1,885 commits into from

Conversation

rxwei
Contributor

@rxwei rxwei commented Jul 10, 2019

  • Conform 'Array.DifferentiableView' to collection protocols.
  • Massively revamp precondition messages.
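For context, conforming an array-backed wrapper to the collection protocols typically means forwarding each requirement to the wrapped array. A minimal sketch of that pattern, using a hypothetical `ArrayView` wrapper rather than the real `Array.DifferentiableView`:

```swift
// Hypothetical `ArrayView` standing in for `Array.DifferentiableView`.
struct ArrayView<Element> {
  var base: [Element]
}

// `RandomAccessCollection` implies `Collection` and
// `BidirectionalCollection`; with `Int` indices, `index(after:)` and
// `index(before:)` come for free, so forwarding the index bounds and
// the subscript to the wrapped array is enough.
extension ArrayView: RandomAccessCollection {
  typealias Index = Int
  var startIndex: Int { base.startIndex }
  var endIndex: Int { base.endIndex }
  subscript(position: Int) -> Element { base[position] }
}

let view = ArrayView(base: [10, 20, 30])
print(view.count)             // 3
print(Array(view.reversed())) // [30, 20, 10]
```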

dan-zheng and others added 30 commits May 6, 2019 23:46
- Remove `@trivial` type annotations.
- Update SIL FileCheck based on changes in generated code.
* Fix AutoDiff tests.

- Fix `createAutoDiffThunk` to pass ownership verification.
  - The `@differentiable` function operand must be copied if non-trivial.
- Fix `AutoDiff/builtin_differential_operators.swift` so that it actually runs.

* Revert changes to `emitBuiltinAutoDiffApplyAssociatedFunction`.

Re-add `createDestroyAddr`.
Fix indentation.
Mark `var allKeyPaths` as `@inlinable` only when all stored properties
are effectively public.

Otherwise, SIL verification fails:
"Key path in serialized function cannot reference non-public property".
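The constraint can be illustrated with a hypothetical `allKeyPaths` property: marking it `@inlinable` serializes its body, so every key path it forms must reference properties that are visible to clients.

```swift
public struct Pair {
  public var a: Float = 0
  public var b: Float = 0

  // Safe to serialize: every key path below references a public
  // stored property. If `a` or `b` were internal, SIL verification
  // of the serialized body would reject the key path.
  @inlinable
  public var allKeyPaths: [WritableKeyPath<Pair, Float>] {
    [\Pair.a, \Pair.b]
  }
}

var p = Pair()
for kp in p.allKeyPaths { p[keyPath: kp] += 1 }
print(p.a, p.b) // 1.0 1.0
```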
- Fix `stdlib/public/TensorFlow/CMakeLists.txt`.
  - List `ExecuteOp.swift.gyb` under GYB_SOURCES.
- Replace `count(where:)` with `lazy.filter { ... }.count`.
  - `count(where:)` reverted in swiftlang#22289.
- Fix `ProtocolDecl::computeKnownProtocolKind()`.
- Update `test/TensorFlow/sema_errors.swift`.
  - Unfortunately, diagnostics have regressed.
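For reference, the `count(where:)` rewrite mentioned above looks like this (a small sketch; at the time, `count(where:)` had been reverted from the standard library in swiftlang#22289):

```swift
let values = [1, -2, 3, -4, 5]

// Instead of `values.count(where: { $0 > 0 })`, count matches with a
// lazy filter, which avoids allocating an intermediate array:
let positives = values.lazy.filter { $0 > 0 }.count
print(positives) // 3
```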
* Revamp TensorFlowRuntime tests after GPE removal.

- Remove `%target-run-eager-swift` and `%target-run-gpe-swift`.
- Use `%target-run-simple-swift` in RUN lines.
- Remove `REQUIRES: swift_test_mode_optimize`.

* Move `Tensor.concatenated` differentiation tests to `tensor_autodiff_runtime.swift`.
The type set by `cs.setType()` in the dynamic callable application did not
contain type variables, so the constraint system considered this string
literal already type-checked. The literal made it to SILGen with a nullptr
keyword argument.
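The string-literal keyword arguments come from `@dynamicCallable` call sites, where argument labels are lowered to string literals. A minimal, hypothetical example of such a call:

```swift
// At the call site `summer(a: 1, b: 2)`, the labels "a" and "b" are
// lowered to string literals and packed into `KeyValuePairs`; these
// are the literals the constraint system must type-check.
@dynamicCallable
struct Summer {
  func dynamicallyCall(withKeywordArguments args: KeyValuePairs<String, Int>) -> Int {
    args.reduce(0) { $0 + $1.value }
  }
}

let summer = Summer()
let total = summer(a: 1, b: 2)
print(total) // 3
```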
Remove problematic call to `TC.conformsToProtocol` on synthesized
`AllDifferentiableVariables` struct, which produced an `ErrorType`
and caused an IRGen assertion failure.
- lib/AST/Type.cpp -> Brittle Vector type lookup leads to Self type not
getting erased.
- lib/SILGen/SILGenBuiltin.cpp -> builtin apply macro switched from Expr to
[ManagedValue]. Need to remove all custom freeing logic.
- utils/build-presets.ini -> importing tensorflow means we cannot use
the new build-script impl.
`verify_all_overlays.swift` and `verify_all_overlays_O.swift`
have been deleted upstream.
- test/SILGen/builtins.swift -> Extra assert change hanging around.
- utils/PathSanitizingFileCheck -> Injection bug.
- validation-test/ParseableInterface/verify_all_overlays.py -> Just a fix.
On Linux, adding an extra rpath fixes the following error:
"error while loading shared libraries: lib_InternalSwiftSyntaxParser.so".

Revisit: this change should not be necessary.
swiftlang#24630)

Fix differentiation of `@inlinable public func differentiationFunction(from:)`.
These tests were previously disabled on tensorflow branch, but now pass.
…4640)

This test was originally marked XFAIL on macOS, but now passes because it has
been disabled. Re-enable the XFAIL as part of TF-491.
dan-zheng and others added 13 commits July 1, 2019 15:54
…tlang#25914)

`Differentiable` derived conformances now support class types.
Synthesis works just like for struct types, except `TangentVector = Self`
is never synthesized even if `Self` conforms to `AdditiveArithmetic`.

Class differentiation support requires further differentiation
transform changes.

Resolves TF-630.
Update to tensorflow/swift-apis@9a3f393, getting us function-builder-based `Sequential` layers.
Type-check top-level functions and methods that are marked as the transpose of another function.
…ding. (swiftlang#25958)

Associated functions should be released when an `autodiff_function` instruction has been folded away.
…wiftlang#25967)

In VJPEmitter, if the original function call has substitutions, we `partial_apply` it with no arguments to specialize it. This `partial_apply` is not being released. JVP and VJP are being specialized the same way, but they are not being released either.

To fix this, we release the `@differentiable` function returned by `autodiff_function`, which releases the original function and the associated functions that are filled in later, all at once. If the original function does not have substitutions, we retain it to balance out the later release of the `@differentiable` function. As a result, `ADContext::promoteToDifferentiableFunction` no longer needs to retain the associated functions.

Example where the original function has substitutions:
```
f' = partial_apply f<...>()
f_diff = autodiff_function f'
release_value f_diff
```

Example where the original function does not have substitutions:
```
retain_value f
f_diff = autodiff_function f
release_value f_diff
```

Note: This makes the `autodiff_function` folding optimization no longer able to detect the pattern, but it is necessary. We can rewrite the optimization later.

This should fix [TF-621](https://bugs.swift.org/browse/TF-621).
…ay'. (swiftlang#26023)

Add variants of map and reduce that take a `@differentiable` closure and are themselves differentiable.
```swift
extension Array {
  @differentiable(wrt: self)
  func differentiableMap<Result: Differentiable>(
    _ body: @differentiable (Element) -> Result
  ) -> [Result]

  @differentiable(wrt: (self, initialResult))
  func differentiableReduce<Result: Differentiable>(
    _ initialResult: Result,
    _ nextPartialResult: @differentiable (Result, Element) -> Result
  ) -> Result
}
```

Also make `Array.DifferentiableView` conform to `ExpressibleByArrayLiteral` so that tests and user code are easier to write.
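The `ExpressibleByArrayLiteral` conformance follows the standard pattern; a sketch on a hypothetical wrapper (not the real `Array.DifferentiableView`):

```swift
// Hypothetical wrapper standing in for `Array.DifferentiableView`.
struct ElementView<Element> {
  var base: [Element]
}

// With this conformance, tests and user code can write the wrapper as
// a plain array literal instead of spelling out the base array.
extension ElementView: ExpressibleByArrayLiteral {
  init(arrayLiteral elements: Element...) {
    self.base = elements
  }
}

let literalView: ElementView<Int> = [1, 2, 3]
print(literalView.base) // [1, 2, 3]
```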
Fix formatting, fix unused variable warnings.
…ents. (swiftlang#25974)

This PR enables the functionality shown in this example:

```swift
public protocol Distribution {
  associatedtype Value
  func logProbability(of value: Value) -> Float
}

public protocol DifferentiableDistribution: Differentiable, Distribution {
  @differentiable(wrt: self)
  func logProbability(of value: Value) -> Float
}

struct Foo: DifferentiableDistribution {
  @differentiable(wrt: self)
  func logProbability(of value: Float) -> Float {
    .zero
  }
}

@differentiable
func blah<T: DifferentiableDistribution>(_ x: T) -> Float where T.Value: AdditiveArithmetic {
  x.logProbability(of: .zero)
}
```

The fix is based on the fact that, to avoid redundancy, the Swift compiler does not add witness table entries for protocol requirements that merely override inherited ones. However, adding the `@differentiable` attribute should not be interpreted as a plain override: it adds new functionality, and should result in entries being added to the witness table. This PR adds that check to the override-checking code, enabling support for the aforementioned feature.
@rxwei rxwei added the tensorflow This is for "tensorflow" branch PRs. label Jul 10, 2019
@rxwei rxwei requested a review from dan-zheng July 10, 2019 18:01
@rxwei rxwei changed the title Conform 'Array.DifferentiableView' to collection protocols. [AutoDiff] [stdlib] Conform 'Array.DifferentiableView' to collection protocols. Jul 10, 2019
@rxwei
Contributor Author

rxwei commented Jul 10, 2019

cc @eaplatanios

@rxwei
Contributor Author

rxwei commented Jul 10, 2019

@swift-ci please test tensorflow

@rxwei
Contributor Author

rxwei commented Jul 10, 2019

An IRGen crasher is expected.

@dan-zheng
Contributor

dan-zheng commented Feb 22, 2020

I accidentally deleted the tensorflow branch, which closed this PR. That was not intentional, sorry!
It would be nice to protect the tensorflow branch against deletion to prevent this from happening again.

@dan-zheng dan-zheng reopened this Feb 22, 2020
@dan-zheng dan-zheng removed the tensorflow This is for "tensorflow" branch PRs. label Sep 19, 2020
@dan-zheng dan-zheng changed the base branch from tensorflow to master September 19, 2020 05:34
@shahmishal
Member

Please update the base branch to main by Oct 5th; otherwise, the pull request will be closed automatically.

  • How to change the base branch: (Link)
  • More detail about the branch update: (Link)

@shahmishal shahmishal closed this Oct 5, 2020