(Of course) XGBoost does not always win on tabular datasets. Made a homework assignment where students got to tinker with hyperparameter tuning techniques (grid search, randomized search, Hyperopt, Optuna, successive halving) and algorithms (GBMs and everything in scikit-learn + mlxtend). Top-10 results: pic.twitter.com/aq88Uw5dkF
— Sebastian Raschka (@rasbt) November 16, 2021
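As a rough illustration of one of the tuning techniques the tweet mentions, here is a minimal randomized-search sketch in scikit-learn. The dataset, estimator, and search space are illustrative assumptions, not the assignment's actual setup:

```python
# Minimal sketch: randomized hyperparameter search over a gradient boosting
# classifier with scikit-learn. Dataset and parameter ranges are illustrative.
from scipy.stats import randint
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = load_breast_cancer(return_X_y=True)

# Distributions to sample from, rather than an exhaustive grid.
param_distributions = {
    "n_estimators": randint(50, 150),
    "max_depth": randint(2, 6),
}

search = RandomizedSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_distributions,
    n_iter=10,          # number of sampled configurations
    cv=3,               # 3-fold cross-validation per configuration
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

Swapping `RandomizedSearchCV` for `HalvingRandomSearchCV` (from `sklearn.experimental`) gives the successive-halving variant also mentioned in the tweet.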