Create functional tests for all Debugging scenarios #2932

Closed
rogancarr opened this issue Mar 12, 2019 · 0 comments
Labels
API: Issues pertaining to the friendly API


@rogancarr (Contributor)

As laid out in #2498, we need scenarios to cover the Debugging functionality we want fully supported in V1.

Scenarios

  • I can see how my data was read in to verify that I specified the schema correctly
  • I can see the output at the end of my pipeline to see which columns are available (score, probability, predicted label)
  • I can look at intermediate steps of the pipeline to debug my model. Example: if I had the text "Help I'm a bug!", I should be able to see the steps where it is normalized to "help i'm a bug", then tokenized into ["help", "i'm", "a", "bug"], then mapped into term numbers [203, 25, 3, 511], then projected into the sparse float vector {3:1, 25:1, 203:1, 511:1}, and so on.
  • (P1) I can access the information needed for understanding the progress of my training (e.g. number of trees trained so far out of how many)
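The intermediate steps in the third scenario can be sketched as below. This is an illustration of the transformations a user would want to inspect, not ML.NET API code; the vocabulary mapping and helper function names are hypothetical, chosen only to match the term numbers in the example above.

```python
# Sketch (not ML.NET API) of the text-featurization steps a user would
# want to inspect at each stage of the pipeline.
# The vocabulary below is hypothetical; a real pipeline learns it from data.
vocab = {"help": 203, "i'm": 25, "a": 3, "bug": 511}

def normalize(text):
    """Lowercase and drop punctuation other than apostrophes."""
    return "".join(ch for ch in text.lower() if ch.isalnum() or ch in " '")

def tokenize(text):
    """Split normalized text on whitespace."""
    return text.split()

def to_term_ids(tokens):
    """Map each known token to its term number."""
    return [vocab[t] for t in tokens if t in vocab]

def to_sparse_vector(term_ids):
    """Bag-of-words: sparse {term number: count} representation."""
    vec = {}
    for i in term_ids:
        vec[i] = vec.get(i, 0) + 1
    return vec

text = "Help I'm a bug!"
normalized = normalize(text)         # "help i'm a bug"
tokens = tokenize(normalized)        # ["help", "i'm", "a", "bug"]
term_ids = to_term_ids(tokens)       # [203, 25, 3, 511]
sparse = to_sparse_vector(term_ids)  # {203: 1, 25: 1, 3: 1, 511: 1}
```

A functional test for this scenario would assert on each intermediate value rather than only on the final featurized vector, so a schema or tokenization regression is caught at the step that introduced it.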
@rogancarr rogancarr added the API Issues pertaining the friendly API label Mar 12, 2019
@rogancarr rogancarr self-assigned this Mar 12, 2019
@ghost ghost locked as resolved and limited conversation to collaborators Mar 23, 2022