Short Question Description
Is it possible to change the hyper-parameter space of an algorithm (such as
PCA), so that it is restricted or enlarged with respect to the default one?
I know that we can override the get_hyperparameter_search_space method, but
I think it would be easier if there were an accessible field in the
AutoSklearnClassifier / AutoSklearnRegressor object. Such a field could take
the form of a dictionary with keys like pca__keep_variance and values that are
objects from ConfigSpace.hyperparameters. Each algorithm (e.g. the PCA object)
would have a similar dictionary as a static field and a method to override
its entries, while the get_hyperparameter_search_space method would just
return it...
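For illustration, a minimal sketch of how such a field might be used. The
search_space attribute and its semantics are hypothetical (not part of the
current auto-sklearn API); only the ConfigSpace hyperparameter classes are real:

    from ConfigSpace.hyperparameters import UniformFloatHyperparameter

    import autosklearn.classification

    automl = autosklearn.classification.AutoSklearnClassifier(
        time_left_for_this_task=300,
    )

    # Hypothetical field: restrict PCA's kept variance to a narrower range
    # than the default before the search starts.
    automl.search_space = {
        "pca__keep_variance": UniformFloatHyperparameter(
            "keep_variance", 0.8, 0.9999, default_value=0.9999
        ),
    }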
I agree; in general, I would like the pipeline space to be reconfigurable for numerous other reasons. See the pipeline section of our project board.
Would you be able to provide a user example of how this might look in code, so we can discuss it and come to a long-term solution?
In general, the issue with configurable pipelines is that they are intimately tied to our meta-learning, an expensive offline optimization that ships with auto-sklearn, which makes this a research challenge.
I'm on an old version of auto-sklearn, but with more recent versions of ConfigSpace this should be even easier.
For instance, the get_hyperparameter_search_space method in PCA could become:
    from ConfigSpace.configuration_space import ConfigurationSpace
    from ConfigSpace.hyperparameters import (
        CategoricalHyperparameter,
        UniformFloatHyperparameter,
    )

    # This is the only thing that stays in each algorithm.
    # With recent versions of ConfigSpace, the following dict could be a
    # `ConfigurationSpace` object, since it can be constructed in one
    # statement only.
    hyperparameter_space = dict(
        keep_variance=UniformFloatHyperparameter(
            "keep_variance", 0.5, 0.9999, default_value=0.9999
        ),
        whiten=CategoricalHyperparameter(
            "whiten", ["False", "True"], default_value="False"
        ),
    )

    # The following can now go to the base class.
    @classmethod
    def get_hyperparameter_search_space(cls, dataset_properties=None):
        cs = ConfigurationSpace()
        cs.add_hyperparameters([v for v in cls.hyperparameter_space.values()])
        return cs

    # The following should go into the base class as well.
    @classmethod
    def set_hyperparameter_space(cls, new_space):
        """Overwrite the search space of an algorithm.

        Parameters
        ----------
        new_space : Union[ConfigurationSpace, Iterable[Hyperparameter]]
        """
        if isinstance(new_space, ConfigurationSpace):
            # Replace the whole space at once.
            cls.hyperparameter_space = new_space
        else:
            # Otherwise, override individual dimensions by name.
            for dimension in new_space:
                cls.hyperparameter_space[dimension.name] = dimension
I don't know exactly where set_hyperparameter_space should be called, but I think it should be just before the full search space is selected.
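To make the intended call site concrete, a sketch of how a user might invoke
the proposed method before fitting. set_hyperparameter_space is the
hypothetical classmethod from the snippet above, and the PCA import path is
only meant as an example of an auto-sklearn component (it may differ between
versions):

    from ConfigSpace.hyperparameters import UniformFloatHyperparameter

    import autosklearn.classification
    from autosklearn.pipeline.components.feature_preprocessing.pca import PCA

    # Hypothetical: narrow PCA's keep_variance range before the full search
    # space is assembled.
    PCA.set_hyperparameter_space(
        [UniformFloatHyperparameter("keep_variance", 0.8, 0.9999, default_value=0.9999)]
    )

    automl = autosklearn.classification.AutoSklearnClassifier(
        time_left_for_this_task=300,
    )
    # The modified per-component space would then be picked up when
    # get_hyperparameter_search_space is called during fit().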