MultiClassClassifier.Evaluate calculates the confusion matrix, but doesn't make it accessible to the user #2335

Closed
sfilipi opened this issue Jan 30, 2019 · 2 comments

sfilipi (Member) commented Jan 30, 2019

The Evaluate method in MultiClassClassifierEvaluator calculates the overall metrics and the confusion table, but it never exposes the confusion table to the caller.

The confusion table should be part of the MultiClassClassifierMetrics.
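A minimal sketch of what the requested API could look like, not the shipped implementation: it assumes MultiClassClassifierMetrics gains a ConfusionMatrix property exposing the table that Evaluate already computes internally. The model and testData variables are placeholders for an already-trained pipeline and a scored dataset.

```csharp
// Sketch only: "model" and "testData" are assumed to exist, and the
// ConfusionMatrix property is the hypothetical addition this issue requests.
using System;
using Microsoft.ML;

var mlContext = new MLContext();

// Score the held-out data with a previously trained multiclass model.
IDataView predictions = model.Transform(testData);

// Evaluate computes both the overall metrics and the confusion table.
var metrics = mlContext.MulticlassClassification.Evaluate(predictions);

Console.WriteLine($"Micro accuracy: {metrics.MicroAccuracy}");

// Hypothetical accessor requested by this issue: surface the per-class
// confusion counts instead of discarding them after evaluation.
// Console.WriteLine(metrics.ConfusionMatrix.GetFormattedConfusionTable());
```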

@sfilipi sfilipi added the bug Something isn't working label Jan 30, 2019
@sfilipi sfilipi self-assigned this Feb 27, 2019
eerhardt (Member) commented

Can this API be added after v1.0 without a breaking change? If so, it might not qualify for Project 13.

Ivanidzo4ka (Contributor) commented

Duplicate of #2009.

@ghost ghost locked as resolved and limited conversation to collaborators Mar 25, 2022
3 participants