771: Gradient Boosting: XGBoost, LightGBM and CatBoost, with Kirill Eremenko

Kirill Eremenko joins Jon Krohn for another exclusive, in-depth teaser of “Machine Learning Level 2”, a new course just released on the SuperDataScience platform. Kirill walks listeners through why decision trees and random forests are fruitful for businesses, and he offers hands-on walkthroughs of the three leading gradient-boosting algorithms today: XGBoost, LightGBM, and CatBoost.

This episode is brought to you by Ready Tensor, where innovation meets reproducibility (https://www.readytensor.ai/), and by Data Universe, the out-of-this-world data conference (https://datauniverse2024.com). Interested in sponsoring a SuperDataScience Podcast episode? Visit passionfroot.me/superdatascience for sponsorship information.

In this episode you will learn:
• All about decision trees [09:17]
• All about ensemble models [21:43]
• All about AdaBoost [36:47]
• All about gradient boosting [45:52]
• Gradient boosting for classification problems [59:54]
• Advantages of XGBoost [1:03:51]
• LightGBM [1:17:06]
• CatBoost [1:32:07]

Additional materials: www.superdatascience.com/771
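For listeners who want a concrete feel for the gradient-boosting idea Kirill discusses (fit a weak learner to the current residuals, then add a scaled copy of it to the ensemble), here is a minimal sketch in plain Python using decision stumps and squared loss. The dataset, function names, and hyperparameters are illustrative assumptions, not material from the episode; real-world use would reach for XGBoost, LightGBM, or CatBoost instead.

```python
# Minimal gradient-boosting sketch: squared loss, one feature, decision stumps.
# Illustrative only -- production work should use XGBoost/LightGBM/CatBoost.

def fit_stump(x, residuals):
    """Find the single threshold split on x that best fits the residuals."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue  # skip splits that leave one side empty
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda xi: lmean if xi <= t else rmean

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    """Start from the mean, then repeatedly fit a stump to the residuals
    (the negative gradient of squared loss) and add it with a learning rate."""
    base = sum(y) / len(y)
    stumps = []
    pred = [base] * len(y)
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: base + lr * sum(s(xi) for s in stumps)

# Toy data with two clear regimes (hypothetical values):
x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.2, 0.9, 7.8, 8.1, 8.0]
model = gradient_boost(x, y)
```

After 50 rounds the ensemble's predictions settle near each regime's mean, which is the essence of boosting: many weak learners, each correcting what the previous ones got wrong.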
