hgboost’s documentation!¶
hgboost is short for Hyperoptimized Gradient Boosting and is a Python package for hyperparameter optimization of xgboost, catboost, and lightboost using cross-validation, with evaluation of the results on an independent validation set.
hgboost can be applied for classification and regression tasks.
hgboost is fun because:
Hyperoptimization of the parameter space using a Bayesian approach.
Determines the best-scoring model(s) using k-fold cross-validation.
Evaluates the best model on an independent evaluation set.
Fits the model on the entire input dataset using the best hyperparameters.
Works for classification and regression.
Creates a super-hyperoptimized model by ensembling all individually optimized models.
Returns the model, search space, and test/evaluation results.
Makes insightful plots (a minimal usage sketch is shown below).
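The workflow above can be run in a few lines. The following is only a minimal sketch: the constructor arguments (max_eval, cv, test_size, val_size, random_state), the xgboost method, and the plot calls are assumptions based on the library's public API, so check the Examples and Code Documentation sections for the exact signatures.

# Minimal usage sketch (parameter names and methods are assumptions; see Examples).
from sklearn.datasets import make_classification
from hgboost import hgboost

# Toy classification data to keep the example self-contained.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)

# Initialize with the search budget, k-fold CV, and the test/validation split sizes.
hgb = hgboost(max_eval=100, cv=5, test_size=0.2, val_size=0.2, random_state=42)

# Hyperoptimize an xgboost classifier; pos_label marks the positive class.
results = hgb.xgboost(X, y, pos_label=1)

# Inspect the optimization and validation results.
hgb.plot()
hgb.plot_validation()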
Content¶
Background
Installation
Methods
Examples
Code Documentation
Quick install¶
pip install hgboost
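Installing the latest development version directly from the GitHub repository should also work (an assumption, not part of the documented instructions; the PyPI release above is the standard route):

pip install git+https://github.com/erdogant/hgboost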
Source code and issue tracker¶
Available on GitHub at erdogant/hgboost. Please report bugs, issues, and feature requests there.
Colab notebooks¶
Some of the described examples are also available as Colab notebooks.
Citing hgboost¶
Here is an example BibTeX entry:
@misc{erdogant2020hgboost,
  title={hgboost},
  author={Erdogan Taskesen},
  year={2020},
  howpublished={\url{https://github.com/erdogant/hgboost}}
}