Adding Testing into the README or Contributions Files #254

Open
swzCuroverse opened this issue Sep 24, 2022 · 3 comments

Comments

@swzCuroverse
Contributor

I think we are also designing tests for all the examples. We don't mention in the README or the contribution guidelines that new contributors should do this, or how to do it. Can we expand those files to cover this?

@kinow
Member

kinow commented Sep 24, 2022

I think that's possible. I tried to keep the existing conformance-tests.yml up to date, but it may be missing some files. Furthermore, now that the Sphinx build is able to run the workflows, that can also be used as a way to test that the examples work (for example, we could specify a :status-code: 0 and verify that the process exits with the desired status code).
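For reference, here is a minimal sketch of what one cwltest entry for an example could look like; the tool and job file names below are made-up, while the expected output block reuses the values from the example later in this thread:

    # Hypothetical conformance test entry; echo.cwl and echo-job.yml are illustrative names
    - doc: Check that the example produces the expected output file
      tool: echo.cwl        # the example CWL document under test
      job: echo-job.yml     # its input object
      output:
        example_out:
          class: File
          checksum: sha1$a739a6ff72d660d32111265e508ed2fc91f01a7c
          basename: output.txt
          location: Any
          size: 36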

@MARVEBUKA
Contributor

Hello @kinow @swzCuroverse, I'd like to work on this. Can I have some guidelines please? 🙏

@kinow
Member

kinow commented Oct 22, 2022

> Hello @kinow @swzCuroverse, I'd like to work on this. Can I have some guidelines please? 🙏

As @swzCuroverse said in Matrix, this issue is a little more elaborate than other issues. In case @MARVEBUKA still wants to work on it, or if anyone else would like to give it a try, here's my understanding:

  • In our dependencies (see setup.cfg) we have cwltest:
    cwltest==2.*
  • This dependency was already present in this repository before the migration to Sphinx & Python (when we used Jekyll & Ruby)
  • Our Makefile has a target to run the unit tests of examples, but it is not currently active in GitHub Actions, I think:

    user_guide/Makefile, lines 35 to 36 (at aaef445):

    unittest-examples:
    	cd src/_includes/cwl; cwltest --test=conformance-test.yml --tool=${RUNNER}
  • Previously, for each pull request, CI (Travis, later GitHub Actions) would pull the code, install dependencies, build the Ruby site, and then test the CWL workflows separately with cwltest, effectively running the workflows and testing their outputs, e.g.:

        example_out:
          class: File
          checksum: sha1$a739a6ff72d660d32111265e508ed2fc91f01a7c
          basename: output.txt
          location: Any
          size: 36

Note that currently every CWL example is “tested” by being executed for every pull request. That wasn't the case previously, as Ruby & Jekyll did not build and run the examples the way we now do with the runcmd directive and Python with Sphinx.

I think this issue could be fixed in a few different ways, for example:

  1. Someone adds the Makefile target back to GitHub Actions, after making sure everything still runs, and also verifies that we have conformance tests (unit tests) for all the workflows added recently (I suspect a few will be missing); a rough sketch of such a workflow job follows after this list;
  2. Maybe we start relying on the runcmd directive to “test” the CWL examples. It would be possible to add a parameter to the directive with the expected exit code, and maybe the (optional) expected output, similar to the conformance tests;
  3. Maybe we use runcmd, adding some new arguments to run the conformance tests (i.e. it still runs the command, puts the output in the rendered HTML, but it also runs some conformance test when instructed via an argument).
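As a rough illustration of the first option, the existing Makefile target could be wired into GitHub Actions with a job along these lines (a minimal sketch only; the job layout, Python version, and the choice of cwltool as the runner are assumptions, not the repository's actual workflow):

    # Hypothetical job for a workflow under .github/workflows/
    test-examples:
      runs-on: ubuntu-latest
      steps:
        - uses: actions/checkout@v3
        - uses: actions/setup-python@v4
          with:
            python-version: "3.10"
        - name: Install cwltest and a CWL runner
          run: pip install "cwltest==2.*" cwltool
        - name: Run the conformance tests for the examples
          run: make unittest-examples RUNNER=cwltool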

Definitely not as simple as other issues, but maybe more technically challenging, and it requires discussing the possible solutions, pros and cons, etc. with other CWL devs. Which is something very common when working on Open Source 😉

Hope that helps!

-Bruno
