
Conversation

@MarianoChaves

Here I added the possibility to specify the desired hyperparameters of a given class. It can be useful depending on the use case. I've included it in the documentation and also added a unit test for it.

Disclaimers:

  • Setting hyperparameters is optional;
  • If hyperparameters are set, they do not need to be set for all models, only for the desired ones.

@shankarpandala
Owner

Thanks for your contribution.
Will look into it.

@MarianoChaves MarianoChaves changed the title feat: including the possibility of set the hyperparameter. feat: including the possibility to set the hyperparameter. Dec 27, 2022
@MarianoChaves MarianoChaves changed the title feat: including the possibility to set the hyperparameter. feat: including the possibility to set the hyperparameter of a given model. Dec 27, 2022
@MarianoChaves MarianoChaves changed the title feat: including the possibility to set the hyperparameter of a given model. feat: including the possibility to choose the hyperparameter of a given model. Dec 27, 2022
@shankarpandala
Owner

Closing as this contradicts LazyPredict's core philosophy.

LazyPredict is designed for quick model comparison without hyperparameter tuning. The entire point is to provide baseline performance with default parameters.

If you need hyperparameter tuning, use dedicated tools:

  • scikit-learn's GridSearchCV or RandomizedSearchCV
  • Optuna
  • Ray Tune
  • Hyperopt

The workflow should be:

  1. Use LazyPredict to identify promising models
  2. Then tune the best models with dedicated tuning tools

Adding hyperparameter tuning to LazyPredict would defeat its purpose and make it slow.
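The second step of that workflow can be sketched with scikit-learn's GridSearchCV. This is a minimal, illustrative example: the dataset, the choice of RandomForestClassifier as the "promising model", and the parameter grid are all assumptions standing in for whatever LazyPredict's comparison actually surfaced.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Illustrative dataset; in practice, use the data you ran LazyPredict on.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Suppose LazyPredict ranked RandomForestClassifier among the top baselines.
# Tune only that model, with a small illustrative parameter grid.
param_grid = {"n_estimators": [100, 200], "max_depth": [None, 10]}
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=5)
search.fit(X_train, y_train)

print(search.best_params_)
print(search.score(X_test, y_test))
```

The same pattern applies to the other tools listed above (Optuna, Ray Tune, Hyperopt): LazyPredict narrows the field with default parameters, and the tuner then searches only over the one or two models worth the compute.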
