Automate generation of output files? #148


Closed
tobyhodges opened this issue Jul 27, 2019 · 5 comments · Fixed by #248
@tobyhodges
Contributor

tobyhodges commented Jul 27, 2019

Can we add some kind of hook or additional CI to automate the creation of output files and include them in the user guide pages? This would help us to avoid content rot problems, such as those reported in #146 & #136

@mr-c
Member

mr-c commented Jul 27, 2019

Maybe @tom-tan @manabuishii have suggestions ?

@manabuishii
Contributor

An automated process would help a lot, especially from the testing point of view.

Currently we do not know the exact deployment process for the User Guide. Could you explain how the User Guide is deployed?

The sample code is already run as a test every time, so I think it would not be difficult to generate the output files.

@tobyhodges
Contributor Author

The examples are checked with Travis, but the User Guide (UG) pages are built and deployed directly through GitHub: the pages are built regardless of whether the examples fail to run on Travis.

The UG pages are built from Markdown files at the top level of the repository or in _episodes and _extras. As is the default for GitHub Pages, these pages are built with Jekyll, which gets its configuration from _config.yml (this is where the magic to set up the "Episodes" and "Extras" collections happens). Pages are organised and styled according to layouts defined in _layouts.

One of the nice things Jekyll allows us to do is include files from other locations, which helps to avoid redundancy. It's probably this include magic that will be key to inserting the output of the tested examples from Travis: we would need to somehow add the output of each example to the _includes folder after testing and replace all output blocks currently in the UG Markdown with {% include <exercise-name-out>.txt %}.

This page explains how to set Travis up to build your GitHub Pages. If the output of each exercise has been saved into an appropriate location inside _includes/ then the rest should be fairly straightforward...
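As a minimal sketch of the post-test step this would need (file and folder names here are illustrative, not from the repository): write each example's captured output into _includes/, so that a page can use {% include hello-world-out.txt %} instead of a hard-coded output block.

```python
# Sketch only: stage captured example output for Jekyll's include mechanism.
# The example name and captured text below are made up for illustration.
import pathlib

includes = pathlib.Path("_includes")
includes.mkdir(exist_ok=True)

# Stand-in for output captured while Travis runs an example workflow.
captured = {"hello-world": "Hello world!\n"}

for name, text in captured.items():
    # A UG page would then contain: {% include hello-world-out.txt %}
    (includes / f"{name}-out.txt").write_text(text)
```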

@mr-c
Member

mr-c commented Sep 11, 2019

We can prepend coloredlogs --to-html to the cwltool invocations to get an HTML representation of the colors.

Example:
coloredlogs --to-html cwltool --validate README.rst > cwltool-validate-README.html

<code><span style="color:#010101;font-weight:bold">INFO</span> /home/michael/schema_salad/env3.7/bin/cwltool 1.0.20190906065220<br>
<span style="color:#010101;font-weight:bold">INFO</span> Resolved 'README.rst' to 'file:///home/michael/schema_salad/README.rst'<br>
<span style="color:#010101;font-weight:bold">ERROR</span> <span style="color:#DE382B">I'm sorry, I couldn't load this CWL file, try again with --debug for more information.<br>
The error was: while scanning a block scalar<br>
&nbsp;&nbsp;in &quot;file:///home/michael/schema_salad/README.rst&quot;, line 1, column 1<br>
expected chomping or indentation indicators, but found 'L'<br>
&nbsp;&nbsp;in &quot;file:///home/michael/schema_salad/README.rst&quot;, line 1, column 2</span></code>

@kinow kinow self-assigned this Aug 10, 2022
@kinow
Member

kinow commented Aug 11, 2022

The user guide is now built with Sphinx. However, we still have the same problem: examples are written with no guarantee that they run with the latest cwltool, and the output contains PII such as usernames and directories.

I am now searching for a Sphinx directive to execute commands. Ideally, it would produce output similar to {code-block} console, i.e. include an anchor link and a caption if provided, plus syntax highlighting for different languages.

Finally, it would either isolate the command execution or execute the commands in a container, to produce consistent output (i.e. no paths like /home/kinow/mydirectorypreference/examples/hello.cwl, but output that is the same in every environment).


I will document the findings in the checklist in this comment. Feel free to add any items you would like me to try:


Played a little more with sphinxcontrib-programoutput and looked at its source code too. I think it could work! 🎉

[screenshot: rendered output of the directives below]

```{code-block} console
:name: installing-cwltool-with-pip
:caption: Installing `cwltool` with `pip`.

$ pip install cwltool
```

```{command-output} cwltool --version
:caption: Running `cwltool` test program-output.
:name: test-123
:shell:
```

The examples above show pip install cwltool with Sphinx's vanilla code-block directive, and cwltool --version with the programoutput directive. The latter still shows my username and my home directory, but after reading some blogs I found a few people using Docker to run the Sphinx build.
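For reference, a sketch of the conf.py change that enables the {command-output} directive (only the extension list is shown; the myst_parser entry is my assumption, for the MyST-style fenced directives above):

```python
# conf.py sketch: enable sphinxcontrib-programoutput.
extensions = [
    "myst_parser",                  # assumption: parses the MyST fenced-directive syntax
    "sphinxcontrib.programoutput",  # provides {program-output} and {command-output}
]
```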

I am now testing whether I can move the examples to real CWL files in src/examples (we had a similar directory for the Jekyll build) and run the commands with programoutput. Then I will instruct both ReadTheDocs and GitHub Pages to build the Sphinx site with Docker, which should give consistent output, free of any private information.


Preview of vanilla code-block (no commands executed, i.e. I typed the command and its output), sphinx-programoutput running bash, and sphinx-autorun also running bash:

[screenshot: preview comparing the three approaches]

There are some issues with syntax highlighting, anchor links, captions, and extraneous characters (terminal color escape codes, I think); both sphinx-programoutput and sphinx-autorun appear to need some configuration to display the output correctly.
