xfail behaving more like skip #7060
As documented just below the example, the imperative xfail indeed behaves like the imperative skip. As far as I understood, your use case is triggering a declared xfail from within the test. We currently don't have such a concept available, but it is a desired feature, as this is a case where a surprise XPASS is valuable information. I vaguely recall we have an issue about this, but I don't remember the details, so I won't try to search for it for now.
OK, let me try to explain with an example:

```python
import pytest

def test_xfail_1():
    test_should_fail_today = True
    if test_should_fail_today:
        pytest.xfail("I'm gonna fail but I still expect to execute")
    print("I'm doing stuff..")
    raise ValueError("Boooo")
```

Results in:

This isn't an xfail... it's a skip.

```python
def test_xfail_2():
    test_should_fail_today = True
    print("I'm doing stuff..")
    raise ValueError("Boooo")
    if test_should_fail_today:
        pytest.xfail("I'm gonna fail but I still expect to execute")
```

Results in:

I did read the documentation I quoted. The reason is clear and I understand it. My point is: if the usage in my first example here is how it is intended to be used, then this is semantically not xfail at all, and the user may as well just use `pytest.skip`. There is one way around this which does yield semantically correct behavior:

```python
def test_xfail():
    test_should_fail_today = True
    try:
        print("I'm doing stuff..")
        raise ValueError("Boooo")
    except ValueError:
        if test_should_fail_today:
            pytest.xfail("I'm gonna fail but I still expect to execute")
```

Yields:

If this is the intended usage of imperative xfail, then perhaps the docs should be adjusted to reflect this.
That's not the intended usage; that's a bad workaround for the missing imperative marking.
Can you please elaborate?
Imperative xfail is intended to exit a test right at the point where it happens, indicating a known hopeless case. We do not have a delayed xfail which is applied dynamically and then evaluated after a failure/pass.
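A short illustration of why imperative xfail exits the test immediately: `pytest.xfail()` raises a control-flow exception (in current pytest, `XFailed` from `_pytest.outcomes`, a `BaseException` subclass), which the test runner catches — so nothing after the call executes, exactly like `pytest.skip()`. The helper name below is illustrative:

```python
import pytest


def show_xfail_exception():
    # pytest.xfail() raises a control-flow exception; the line after it
    # never runs, which is why a test exits at the call site.
    try:
        pytest.xfail("stopping here")
        print("never reached")
    except BaseException as exc:
        return type(exc).__name__


print(show_xfail_exception())
```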
Which, in my head at least, is an imperative skip, for the reasons described. To refer to it otherwise is confusing because the semantics differ from the declarative variant. I'll beg to disagree on the bad workaround, as it matches the semantics of xfail more closely. I'm not stuck or anything, so I'll close this ticket.
Hi there,
In the docs for xfail it says:
The behavior described doesn't really sound like xfail to me; it's more like skip. The test case isn't run. If I put the pytest.xfail call at the end of the test, an exception will be thrown before I get to it.
Am I misunderstanding? What's the intended usage for this?
The reason I want a dynamic xfail is that I have to read information during the test and use it to determine whether the test can be marked as xfail.
Thank you