Finishing the pytest migration. #1661

Merged
merged 43 commits into tensorflow:master on Apr 24, 2020

Conversation

gabrieldemarmiesse
Member

@gabrieldemarmiesse gabrieldemarmiesse commented Apr 13, 2020

Finishing the pytest migration.

@gabrieldemarmiesse gabrieldemarmiesse changed the title Removed run_all_in_graph_and_eager_mode in attention_wrapper_test.py. [WIP] Removed run_all_in_graph_and_eager_mode in attention_wrapper_test.py. Apr 19, 2020
@boring-cyborg boring-cyborg bot added the test-cases Related to Addons tests label Apr 19, 2020
@gabrieldemarmiesse gabrieldemarmiesse changed the title [WIP] Removed run_all_in_graph_and_eager_mode in attention_wrapper_test.py. Removed run_all_in_graph_and_eager_mode in attention_wrapper_test.py. Apr 19, 2020
@gabrieldemarmiesse gabrieldemarmiesse changed the title Removed run_all_in_graph_and_eager_mode in attention_wrapper_test.py. Finishing the pytest migration. Apr 21, 2020
Member

@seanpmorgan seanpmorgan left a comment

Final boss indeed, haha. I'm about halfway through but wanted to checkpoint where I'm at.

Member

@seanpmorgan seanpmorgan left a comment

LGTM, thanks for this! One note regarding GPU testing that needs to be tracked (possibly in the GPU issue).


with self.cached_session(use_gpu=True):
Member

Could we mark this as a TODO to enable GPU testing for this in #1713?

Member Author

We could, but I believe we don't need to test on GPU here. We don't depend on any custom op, and we don't test in float16. I believe it's TensorFlow's job to do the right thing on GPU here, not ours. That said, I don't have a very strong opinion: in the new setup, adding GPU testing is just an additional decorator.
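The "additional decorator" approach mentioned above can be sketched with a parametrized pytest fixture. This is an illustrative sketch, not the exact tensorflow_addons API; the names `GPU_AVAILABLE`, `resolve_device`, and `device` are assumptions.

```python
import pytest

# Assumption: in a real conftest.py this flag would be set at session start,
# e.g. by probing tf.config.list_physical_devices("GPU").
GPU_AVAILABLE = False

def resolve_device(name):
    """Map a parameter name to a TF-style device string, skipping when absent."""
    if name == "gpu" and not GPU_AVAILABLE:
        pytest.skip("no GPU on this machine")
    return "/device:%s:0" % name.upper()

# Every test that requests this fixture runs once per device.
@pytest.fixture(params=["cpu", "gpu"])
def device(request):
    return resolve_device(request.param)

def test_runs_on_each_device(device):
    # A real test body would build tensors under tf.device(device).
    assert device.startswith("/device:")
```

With a fixture like this, opting a test into GPU coverage is a one-line change to its signature rather than a `cached_session(use_gpu=True)` context manager.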


def _testWithMaybeMultiAttention(
self,
is_multi,
Member

So we remove all logic for handling is_multi. It was never used in the test suite, but I'm less familiar with this module, so I don't know if it'll ever be needed in testing going forward. Okay to remove IMO, since it wasn't used and will remain in the git history.
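For reference, the conversion pattern applied throughout this PR, moving test logic off `self` and out of `tf.test.TestCase`, looks roughly like the sketch below; `np.testing.assert_allclose` is the usual stand-in for `self.assertAllClose`. The values here are illustrative, not taken from the attention tests.

```python
import numpy as np

# Before (unittest style, inside a tf.test.TestCase subclass):
#     def testBasic(self):
#         self.assertAllClose(result, expected)
#
# After (plain pytest function: no self, no TestCase, no is_multi plumbing):
def test_basic():
    result = np.array([0.1, 0.2, 0.3])
    expected = np.array([0.1, 0.2, 0.3])
    np.testing.assert_allclose(result, expected, rtol=1e-6)
```

Dropping the class also lets helper functions live at module level, where pytest can collect and parametrize them directly.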

@gabrieldemarmiesse gabrieldemarmiesse merged commit 5b12ae5 into tensorflow:master Apr 24, 2020
ashutosh1919 pushed a commit to ashutosh1919/addons that referenced this pull request Jul 12, 2020
* Removed run_all_in_graph_and_eager_mode in attention_wrapper_test.py.

* Refactoring.

* Some black bug.

* Done one test.

* Fuse functions.

* Rewrite with unittest2pytest.

* Moved function.

* Removed use gpu.

* Remove prints.

* Removed one assertAllclose.

* Removed some more self.

* Removed some more self.

* Moved function out of class.

* Removed one function.

* Removed some self.

* Moved a function out of tf.test.TestCase.

* Removed decorator.

* IIII

* Removed the run functions eagerly.

* Removed import.

* Removed is_multi.

* Removed self for batch.

* Removed some more self.

* Removed some more self.

* Removed some stuff.

* Unholy stuff there.

* Found a way to replace that.

* Works well.

* Removed self completely from function.

* Removed self from parameters.

* Managed to move the function.

* It works with pytest only.

* Fully converted to pytest.

* Minor simplification.

* Some comment.

* Removed some unused functions.

* Removed some elements from the test_no_deprecated_v1.

* Removed some files from blacklists.

* Forgot to remove run_eagerly.

* Removed tf.executing_eagerly.
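The "Rewrite with unittest2pytest" step in the commit list above mechanically turns unittest assertion methods into bare asserts. A hand-written illustration of the same transformation (the `parse_positive` function is a hypothetical example, not code from this PR):

```python
import pytest

# unittest2pytest rewrites, roughly:
#   self.assertEqual(x, 5)              ->  assert x == 5
#   self.assertTrue(flag)               ->  assert flag
#   with self.assertRaises(ValueError): ->  with pytest.raises(ValueError):

def parse_positive(s):
    """Tiny function under test (illustrative only)."""
    value = int(s)
    if value <= 0:
        raise ValueError("must be positive")
    return value

def test_parse_positive():
    assert parse_positive("5") == 5
    with pytest.raises(ValueError):
        parse_positive("-1")
```

Bare asserts give pytest's assertion introspection full visibility, which is why the migration also removes the remaining `self.assert*` helpers.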
jrruijli pushed a commit to jrruijli/addons that referenced this pull request Dec 23, 2020
(same 43 commits as above)
Labels
cla: yes seq2seq test-cases Related to Addons tests
4 participants