Using PFI with AutoML, possible? #4227
This is a duplicate of #3972 and #3976. Unfortunately, I am not aware of a solution to this at this point. The issue is that the model is saved to disk, and when it is reloaded it can't be used with PFI. See my analysis at #3976 (comment). @codemzs - I believe you mentioned someone was going to be available to fix this. Do you know who? Those 2 issues above are not assigned yet.
@eerhardt - then do you have any idea how we can calculate the weights and bias using AutoML if the above approach can't be used? In other words, is there a second method available? What we need is access to the sub-model. After loading a model for binary classification, we try to get the sub-model and calibrator with the following code:

```csharp
((Microsoft.ML.Calibrators.CalibratedModelParametersBase)((Microsoft.ML.Data.PredictionTransformerBase<Microsoft.ML.IPredictorProducing>)((Microsoft.ML.Data.TransformerChain<Microsoft.ML.ITransformer>)mlModel).LastTransformer).Model).SubModel
```

By contrast, in the CreateModel method we can easily get the SubModel by writing:

```csharp
LinearBinaryModelParameters linearBinaryModelParameters = ((Microsoft.ML.Data.TransformerChain<Microsoft.ML.Data.BinaryPredictionTransformer<Microsoft.ML.Calibrators.CalibratedModelParametersBase<Microsoft.ML.Trainers.LinearBinaryModelParameters, Microsoft.ML.Calibrators.PlattCalibrator>>>)mlModel).LastTransformer.Model.SubModel;
```

Please guide us.
Until the bug is fixed, the only approach I can think of would be to use reflection to get access to the underlying values. The problem is that you can't cast the object to a concrete type, because it is currently using an internal type.
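A reflection-based workaround along these lines might look like the sketch below. The property names (`Model`, `SubModel`) mirror the cast chain quoted earlier in this thread, but they are assumptions about ML.NET internals and may change between versions; treat this as illustrative, not guaranteed.

```csharp
using System.Reflection;
using Microsoft.ML;
using Microsoft.ML.Data;
using Microsoft.ML.Trainers;

// Sketch: walk Model -> SubModel via reflection so that no internal
// generic type has to be named in a cast.
static LinearBinaryModelParameters TryGetSubModel(ITransformer loadedModel)
{
    // The chain loaded from disk is a TransformerChain<ITransformer>.
    var lastTransformer =
        ((TransformerChain<ITransformer>)loadedModel).LastTransformer;

    // The prediction transformer exposes a public "Model" property.
    var calibrated = lastTransformer.GetType()
        .GetProperty("Model", BindingFlags.Public | BindingFlags.Instance)
        ?.GetValue(lastTransformer);

    // The calibrated model exposes the underlying "SubModel".
    var subModel = calibrated?.GetType()
        .GetProperty("SubModel")
        ?.GetValue(calibrated);

    return subModel as LinearBinaryModelParameters;
}
```

With the sub-model in hand, `Weights` and `Bias` are public members of `LinearBinaryModelParameters`, so no further reflection is needed.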
@eerhardt - I have seen many questions raised related to this topic. Please assign this ticket to the AutoML team so they can fix it. Thank you for your cooperation.
So how can we use reflection to get access to the underlying values until the AutoML bug is fixed?
This is indeed a duplicate of #3972. As explained there, it was always possible to use PFI with AutoML by using the appropriate casts: |
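For binary classification, the cast pattern would look roughly like the sketch below, built from the cast already quoted earlier in this thread. It assumes AutoML selected a linear trainer with a Platt calibrator, and that `mlModel`, `mlContext`, and `testData` already exist; a different trainer requires different generic arguments.

```csharp
using Microsoft.ML;
using Microsoft.ML.Calibrators;
using Microsoft.ML.Data;
using Microsoft.ML.Trainers;

// Cast the chain down to its concrete type so that
// PermutationFeatureImportance can infer its type arguments.
var chain = (TransformerChain<BinaryPredictionTransformer<
        CalibratedModelParametersBase<LinearBinaryModelParameters,
                                      PlattCalibrator>>>)mlModel;
var predictor = chain.LastTransformer;

// PFI runs on data transformed by the same chain.
var transformedData = chain.Transform(testData);
var pfi = mlContext.BinaryClassification.PermutationFeatureImportance(
    predictor, transformedData, permutationCount: 30);
```

The cast only succeeds when the generic arguments exactly match the trainer AutoML chose, which is why this approach breaks once the model has been saved to disk and reloaded as a `TransformerChain<ITransformer>`.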
I have a trained model and am now trying to retrieve the feature weights. None of the objects returned exposes a LastTransformer, so when I then try to get the PFI information I get stuck. There appears to be no way to get the LastTransformer object from the trainedModel.
The following cast lets me access the LastTransformer; however, I cannot use it for PFI until I provide a better type for predictor. Debugging, I can see it is of type Microsoft.ML.Data.RegressionPredictionTransformer<Microsoft.ML.IPredictorProducing>, but I am unable to cast to that because Microsoft.ML.IPredictorProducing is not visible, so it seems like we're still stuck.
```csharp
// Setup code similar to famschopman's.
RegressionExperiment experiment = mlContext.Auto().CreateRegressionExperiment(experimentSettings);
var experimentResults = experiment.Execute(split.TrainSet, split.TestSet);
var predictor = ((TransformerChain)experimentResults.BestRun.Model).LastTransformer;

// This will not compile.
var permutationMetrics = mlContext.Regression.PermutationFeatureImportance(predictor, transformedData, permutationCount: 30);
```
The following compile error is produced.
The type arguments for method 'PermutationFeatureImportanceExtensions.PermutationFeatureImportance(RegressionCatalog, ISingleFeaturePredictionTransformer, IDataView, string, bool, int?, int)' cannot be inferred from the usage. Try specifying the type arguments explicitly.
How do we get the bias and weights and use PFI?
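Until the casting issue is fixed, one option for the regression case above is to skip PFI and read the weights and bias directly off the model via reflection. The sketch below assumes the last transformer exposes public `LastTransformer` and `Model` properties and that the underlying model is a `LinearRegressionModelParameters`; both are assumptions about the trainer AutoML happened to pick.

```csharp
using System.Linq;
using Microsoft.ML;
using Microsoft.ML.Trainers;

// Sketch: extract linear regression weights and bias from an AutoML
// best-run model without naming the internal predictor type.
static (float Bias, float[] Weights)? TryGetLinearParams(ITransformer model)
{
    // Fetch LastTransformer reflectively so the chain's generic
    // argument does not matter.
    var last = model.GetType()
        .GetProperty("LastTransformer")?.GetValue(model);

    // The prediction transformer exposes a public "Model" property.
    var inner = last?.GetType()
        .GetProperty("Model")?.GetValue(last);

    if (inner is LinearRegressionModelParameters lin)
        return (lin.Bias, lin.Weights.ToArray());

    return null; // not a linear model, or the layout differs
}
```

This only yields per-feature weights for linear models; for tree-based models there is no flat weight vector, which is part of why PFI is the more general tool once the casting bug is resolved.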