hgboost’s documentation!

hgboost is short for Hyperoptimized Gradient Boosting and is a Python package for hyperparameter optimization of xgboost, catboost and lightboost models. It searches the hyperparameter space using cross-validation and evaluates the results on an independent validation set. hgboost can be applied to both classification and regression tasks.

hgboost is fun because:

    1. Hyperparameter optimization over the parameter space using a Bayesian approach.

    2. Determines the best-scoring model(s) using k-fold cross-validation.

    3. Evaluates the best model on an independent evaluation set.

    4. Fits the final model on the entire input data using the best parameters.

    5. Works for both classification and regression tasks.

    6. Creating an ensemble of all available methods is a one-liner (see the sketch after this list).

    7. Returns the model, the search space, and the test/evaluation results.

    8. Makes insightful plots.
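
Below is a minimal sketch of this workflow for a classification task. It assumes the hgboost class and its xgboost(), ensemble(), and plotting methods behave as outlined in the feature list above; the dataset and all parameter values are purely illustrative.

from sklearn.datasets import load_breast_cancer
from hgboost import hgboost

# Load a small binary-classification dataset for illustration.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Initialize with k-fold cross-validation, an independent validation split,
# and a Bayesian search over at most max_eval parameter combinations
# (values shown are illustrative, not recommendations).
hgb = hgboost(max_eval=100, cv=5, test_size=0.2, val_size=0.2, random_state=42)

# Hyperparameter optimization for a single method (here: xgboost).
results = hgb.xgboost(X, y, pos_label=1)

# Creating an ensemble of all available methods is a one-liner (assumed API):
# results = hgb.ensemble(X, y, pos_label=1)

# The returned results hold the best model, the searched space and the
# test/evaluation results.
print(results.keys())

# Insightful plots of the search space, cross-validation and validation set.
hgb.plot_params()
hgb.plot()
hgb.plot_validation()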

Content

Installation

Code Documentation

Quick install

pip install hgboost

Source code and issue tracker

Available on GitHub at erdogant/hgboost. Please report bugs, issues and feature requests there.

Citing hgboost

Here is an example BibTeX entry:

@misc{erdogant2020hgboost,
  title={hgboost},
  author={Erdogan Taskesen},
  year={2020},
  howpublished={\url{https://github.com/erdogant/hgboost}}
}
