Github Actions: test newer python versions, take 2 #507

New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Closed
wants to merge 4 commits

Conversation

martinetd
Collaborator

The tests themselves look fine; it's only running them with nosetests that is apparently a problem, so adding an ugly case statement seems to work around it.

Coverage isn't easy to make work directly, so I've skipped it for now; we didn't actually use the coverage results anywhere as far as I can see?

While I was at it, I also updated the actions we use, to fix the deprecation warnings printed on every test run.
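The "ugly case" workaround isn't shown in this conversation; a hypothetical sketch of what it could look like (the `pick_runner` helper, the version list, and pytest as the fallback runner are all assumptions, not the PR's actual code):

```shell
#!/bin/sh
# Hypothetical sketch: choose a test runner based on the Python
# minor version, since nosetests no longer works on 3.10+.
pick_runner() {
  case "$1" in
    3.10*|3.11*|3.12*) echo "python -m pytest" ;;  # pytest assumed as the 3.10+ runner
    *)                 echo "nosetests" ;;         # older versions keep nosetests
  esac
}
pick_runner "3.9"
pick_runner "3.11"
```

In a real workflow the version argument would come from `python -c 'import sys; print("%d.%d" % sys.version_info[:2])'` rather than being passed by hand.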

nosetests no longer works with Python 3.10 onwards, but we do not
actually need it.
Try to fix the Node.js 12 deprecation warning when building jobs by using
newer versions of actions/checkout, actions/setup-python, and
slackapi/slack-github-action.
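The commit message doesn't show the workflow hunk for the action bumps; a minimal sketch of what such an upgrade looks like (the exact version tags are assumptions, simply the major releases that were current at the time and run on Node 16 instead of the deprecated Node 12):

```yaml
steps:
  - uses: actions/checkout@v3        # assumed bump from an older node12-based tag
  - uses: actions/setup-python@v4    # assumed bump; runs on node16
    with:
      python-version: ${{ matrix.python-version }}
```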

Assuming usage stays the same...
@@ -8,7 +8,7 @@ jobs:
     strategy:
       matrix:
         os: [ubuntu-latest]
-        python-version: [ '3.7', '3.8', '3.9']
+        python-version: [ '3.7', '3.8', '3.9', '3.10', '3.11', '3.12-dev' ]
Collaborator

Do we really need to test a version as early as 3.12-dev? Those tests need to pass for all the following commits, right?

Collaborator Author

I think we can always ignore build failures?
It's better to know it'll fail so we know we have something to fix for the next python version, than to not know and have someone tell us a few weeks after the release.

Collaborator

The purpose of the tests is to validate that code changes work for all the supported versions. Having the green tick is very helpful for knowing things are working fine. Having an always-red tick, and needing to go check whether it's the perpetually failing test due to some change in 3.12, is burdensome.

dev is dev... As long as the code is not stable, some changes could make some failures only transient... I feel like this adds more effort than it helps, unless we agree to work on failures quickly, and none of us is likely to have time for that.

Collaborator Author

Well, disabling the check is trivial -- can we just leave it on for now, and disable it if that ever becomes a problem (e.g. a check keeps failing for more than a handful of PRs)?
I was personally annoyed by the 3.10 breakage we had a couple of years ago, so if we had had such a check in place at the time, I'd have been happy to fix it before things broke on my laptop. Fedora normally ships the next Python right with its release (new Python releases are synchronized with new Fedora releases), so I'll always be one step ahead and really think I'd benefit from this.
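A middle ground the thread doesn't spell out: GitHub Actions can mark only the dev entry as non-blocking via a per-matrix `continue-on-error` expression, so a 3.12-dev failure is visible without turning the whole tick red. This is a sketch of that idea, not part of this PR:

```yaml
jobs:
  tests:
    runs-on: ${{ matrix.os }}
    # only the dev interpreter is allowed to fail without failing the job
    continue-on-error: ${{ matrix.python-version == '3.12-dev' }}
    strategy:
      fail-fast: false   # keep other matrix entries running if one fails
      matrix:
        os: [ubuntu-latest]
        python-version: [ '3.7', '3.8', '3.9', '3.10', '3.11', '3.12-dev' ]
```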

Collaborator

OK, agreed.

@thiell thiell self-requested a review November 27, 2022 18:29
@thiell thiell added this to the 1.9.1 milestone Nov 27, 2022
@thiell thiell modified the milestones: 1.9.1, 1.9.2 Feb 10, 2023
@thiell thiell added the Tests label Sep 17, 2023
@thiell
Collaborator

thiell commented Sep 18, 2023

Fixed in #531 thanks to nose-py3

@thiell thiell closed this Sep 18, 2023