
xfail behaving more like skip #7060


Closed
mentaal opened this issue Apr 9, 2020 · 6 comments
Labels
type: question general question, might be closed after 2 weeks of inactivity

Comments

mentaal commented Apr 9, 2020

Hi there,
In the docs for xfail it says:

def test_function2():
    import slow_module

    if slow_module.slow_function():
        pytest.xfail("slow_module taking too long")
These two examples illustrate situations where you don’t want to check for a condition at the module level, which is when a condition would otherwise be evaluated for marks.

This will make test_function XFAIL. Note that no other code is executed after the pytest.xfail call, differently from the marker. That’s because it is implemented internally by raising a known exception.

The behavior described doesn't really sound like xfail to me. It's more like skip: the test case isn't run. If I put the pytest.xfail call at the end of the test, an exception will be thrown before I get to it.

Am I misunderstanding? What's the intended usage for this?
The reason I want a dynamic xfail is that I have to read information during the test and use that to determine whether the test can be marked as xfail.
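
For comparison, with the declarative marker the condition is evaluated up front but the test body still runs, and a failure inside it is reported as XFAIL. A minimal sketch (the condition here is a hypothetical stand-in for the information I would read during the test):

import pytest

# Hypothetical stand-in for information that would normally be read elsewhere;
# the declarative marker evaluates it before the test runs, but the body
# still executes.
KNOWN_BREAKAGE = True

@pytest.mark.xfail(KNOWN_BREAKAGE, reason="known breakage today")
def test_declarative_xfail():
    print("I'm doing stuff..")
    raise ValueError("Boooo")  # reported as XFAIL, not as an ordinary failure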

Thank you

@RonnyPfannschmidt (Member)

As documented just below the example, the imperative xfail indeed behaves like the imperative skip, as it's implemented in terms of a specific exception.

As far as I understood, your use case is triggering a declared xfail from within the test (which would mean still running the code expected to fail).

We currently don't have such a concept available, but it is a desired feature, as this is a case where a surprise XPASS is valuable information.

I vaguely recall we have an issue about this, but I don't remember the details, so I won't try to search for it for now.
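
To illustrate why a surprise XPASS matters, a minimal sketch (test names are illustrative): a declared xfail that unexpectedly passes is reported as XPASS, or turned into a failure with strict=True, which is only possible because the test body actually runs.

import pytest

@pytest.mark.xfail(reason="expected to fail", strict=False)
def test_reports_xpass_if_it_passes():
    assert 1 + 1 == 2  # passes, so the run reports XPASS instead of XFAIL

@pytest.mark.xfail(reason="expected to fail", strict=True)
def test_fails_the_run_if_it_passes():
    assert 1 + 1 == 2  # passes, so strict=True turns the XPASS into a failure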

@RonnyPfannschmidt RonnyPfannschmidt added the type: question general question, might be closed after 2 weeks of inactivity label Apr 9, 2020

mentaal commented Apr 9, 2020

OK, let me try to explain with an example:

import pytest

def test_xfail_1():
    test_should_fail_today = True
    if test_should_fail_today:
        pytest.xfail("I'm gonna fail but I still expect to execute")
    print("I'm doing stuff..")
    raise ValueError("Boooo")

Results in:
..\..\..\temp\test_xfail.py::test_xfail_1 XFAIL

This isn't an xfail... it's a skip.
Try again:

import pytest

def test_xfail_2():
    test_should_fail_today = True
    print("I'm doing stuff..")
    raise ValueError("Boooo")
    if test_should_fail_today:
        pytest.xfail("I'm gonna fail but I still expect to execute")

Results in:
E ValueError: Boooo
Not an xfail at all, though obviously the reason is understood.

I did read the documentation I quoted. The reason is clear and I understand it. My point is that if the usage in my first example here is how it is intended to be used, then this is semantically not xfail at all, and the user may as well just use pytest.skip.
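
A minimal sketch of that comparison (test names are illustrative): both imperative calls abort the test at the call site, and the only practical difference is how the outcome is reported.

import pytest

def test_imperative_skip():
    pytest.skip("not running today")      # reported as SKIPPED
    print("never reached")

def test_imperative_xfail():
    pytest.xfail("known to be hopeless")  # reported as XFAIL
    print("never reached")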

There is one way around this which does yield semantically correct behavior:

import pytest

def test_xfail():
    test_should_fail_today = True
    try:
        print("I'm doing stuff..")
        raise ValueError("Boooo")
    except ValueError:
        if test_should_fail_today:
            pytest.xfail("I'm gonna fail but I still expect to execute")

Yields:

..\..\..\temp\test_xfail.py::test_xfail I'm doing stuff..
XFAIL

If this is the intended usage of imperative xfail, then perhaps the docs should be adjusted to reflect this.

@RonnyPfannschmidt (Member)

That's not the intended usage; that's a bad workaround for the missing imperative marking.


mentaal commented Apr 10, 2020

Can you please elaborate?

@RonnyPfannschmidt (Member)

Imperative xfail is intended to exit a test right at the point where it happens, indicating a known hopeless case.

We do not have a delayed xfail which is applied dynamically and then evaluated after a failure/pass.
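
For what it's worth, a sketch of what such a delayed xfail might look like, using request.node.add_marker. Whether a mark added mid-test is honored depends on the pytest version (as noted above, it was not supported at the time of this thread), so this is an illustration of the concept rather than a working recipe here:

import pytest

def test_delayed_xfail(request):
    test_should_fail_today = True  # stand-in for information read during the test
    if test_should_fail_today:
        # Attach the mark dynamically; the test body keeps running, and the
        # eventual failure/pass would be evaluated against the mark afterwards.
        request.node.add_marker(pytest.mark.xfail(reason="known breakage today"))
    print("I'm doing stuff..")
    raise ValueError("Boooo")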


mentaal commented Apr 10, 2020

Which, in my head at least, is an imperative skip, for the reasons described. Referring to it otherwise is confusing because the semantics are different from those of the declarative variant. I'll beg to disagree about the bad workaround, as it more closely matches the semantics of xfail. I'm not stuck or anything, so I'll close this ticket.
Thanks for taking the time.
