Tree-based AI
This is another new topic I'm opening up 👍, but I won't write this part in much detail. For some parts I'll simply point to the corresponding references; if I wrote them out myself they would probably end up much the same, so I'll skip those and only record the main ideas and the important derivations.
Still updating
Outline:
- Decision Trees / CART
- Bagging / Random Forest
- Boosting / AdaBoost
- GBDT / XGB
References:
- CS229
- https://aman.ai/cs229
- Bloomberg
Decision Tree
Read this article: https://aman.ai/cs229/decision-trees/, together with the Canadian guy's video: video.
Points that still need to be filled in:
- An explanation of the loss function: purity pdf (see the sketch after this list)
- The different kinds of decision trees
- The training process for building a tree
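To make the purity idea concrete, here is a minimal NumPy sketch (my own toy example, not code from the linked notes) of entropy, Gini impurity, and the impurity decrease that a candidate split is scored by:

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(labels):
    """Gini impurity: probability that two independent draws disagree."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_score(parent, left, right, impurity=gini):
    """Impurity decrease of a split; equals information gain when impurity=entropy."""
    n = len(parent)
    weighted = (len(left) / n) * impurity(left) + (len(right) / n) * impurity(right)
    return impurity(parent) - weighted

# Toy example: a perfect split drives the child impurity to zero.
parent = np.array([0, 0, 0, 1, 1, 1])
print(split_score(parent, parent[:3], parent[3:], impurity=entropy))  # 1.0 bit gained
```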
Different Decision Tree Algorithms
different-decision-tree-algorithms-with-comparison-of-complexity-or-performance
https://blog.csdn.net/yinyu19950811/article/details/89575462
Training Process
https://github.com/XuRui314/MIT_6.036_homework_zxr/blob/main/Tree_Based_AI.ipynb
https://blog.csdn.net/jiaoyangwm/article/details/79525237
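A from-scratch sketch of the greedy training loop (CART-style binary splits scored by Gini, recursing until a node is pure or a depth limit is hit). This is a toy illustration under those assumptions, not the exact code in the linked notebook:

```python
import numpy as np

def gini(y):
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Exhaustively search (feature, threshold) minimizing weighted child Gini."""
    best = (None, None, np.inf)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            mask = X[:, j] <= t
            if mask.all() or not mask.any():
                continue
            score = mask.mean() * gini(y[mask]) + (~mask).mean() * gini(y[~mask])
            if score < best[2]:
                best = (j, t, score)
    return best

def build_tree(X, y, depth=0, max_depth=3):
    """Recursive greedy construction: stop when pure or at the depth limit."""
    if depth == max_depth or len(np.unique(y)) == 1:
        values, counts = np.unique(y, return_counts=True)
        return {"leaf": values[np.argmax(counts)]}          # majority-vote leaf
    j, t, _ = best_split(X, y)
    if j is None:
        values, counts = np.unique(y, return_counts=True)
        return {"leaf": values[np.argmax(counts)]}
    mask = X[:, j] <= t
    return {"feature": j, "threshold": t,
            "left": build_tree(X[mask], y[mask], depth + 1, max_depth),
            "right": build_tree(X[~mask], y[~mask], depth + 1, max_depth)}

def predict_one(node, x):
    while "leaf" not in node:
        node = node["left"] if x[node["feature"]] <= node["threshold"] else node["right"]
    return node["leaf"]

# Tiny sanity check: XOR-like data needs depth >= 2.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])
tree = build_tree(X, y, max_depth=2)
print([predict_one(tree, x) for x in X])  # [0, 1, 1, 0]
```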
Bagging
https://carbonati.github.io/posts/random-forests-from-scratch/
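The linked post builds a random forest from scratch; below is a compressed sketch of the same idea, leaning on `sklearn.tree.DecisionTreeClassifier` as the base learner (it assumes non-negative integer class labels, and the variable names are my own):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_bagged_trees(X, y, n_trees=100, max_features="sqrt", seed=0):
    """Bagging: each tree sees a bootstrap sample; restricting the features
    considered at each split (max_features) turns plain bagging into a random forest."""
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), size=len(X))          # sample with replacement
        tree = DecisionTreeClassifier(max_features=max_features,
                                      random_state=int(rng.integers(1 << 31)))
        trees.append(tree.fit(X[idx], y[idx]))
    return trees

def predict_majority(trees, X):
    """Aggregate the ensemble by majority vote (labels must be non-negative ints)."""
    votes = np.stack([t.predict(X) for t in trees])          # shape (n_trees, n_samples)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
```

Roughly one third of the training points are left out of each bootstrap sample, which is what makes out-of-bag error estimation possible.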
Boosting
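A sketch of discrete AdaBoost with decision stumps, assuming binary labels in {-1, +1}; the sample-weight update and the alpha formula follow the standard AdaBoost.M1 derivation:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Discrete AdaBoost with decision stumps; expects labels in {-1, +1}."""
    n = len(X)
    w = np.full(n, 1.0 / n)                          # start with uniform weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        if err >= 0.5:                               # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)               # up-weight the mistakes
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Weighted vote: sign of the alpha-weighted sum of stump outputs."""
    scores = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(scores)
```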
GBDT
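For squared-error regression the negative gradient of the loss is just the residual, so GBDT reduces to repeatedly fitting a small tree to the current residuals and adding it with a shrinkage factor. A minimal sketch under those assumptions:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gbdt_fit(X, y, n_trees=100, lr=0.1, max_depth=3):
    """Gradient boosting for squared-error regression: each tree fits the
    negative gradient, which for L2 loss is exactly the current residual."""
    f0 = np.mean(y)                                  # constant initial model
    pred = np.full(len(y), f0, dtype=float)
    trees = []
    for _ in range(n_trees):
        residual = y - pred                          # negative gradient of 1/2 (y - f)^2
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        pred += lr * tree.predict(X)                 # shrunken additive update
        trees.append(tree)
    return f0, trees

def gbdt_predict(f0, trees, X, lr=0.1):
    # lr must match the value used in gbdt_fit
    return f0 + lr * sum(t.predict(X) for t in trees)
```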
XGB
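The core of the XGBoost derivation is a second-order Taylor expansion of the loss plus an explicit tree-complexity penalty. The standard steps from the XGBoost paper, as a reminder of the key formulas:

$$
\mathcal{L}^{(t)} \simeq \sum_{i=1}^{n}\left[g_i f_t(x_i) + \tfrac{1}{2} h_i f_t^{2}(x_i)\right] + \gamma T + \tfrac{1}{2}\lambda\sum_{j=1}^{T} w_j^{2},
\qquad
g_i = \partial_{\hat y^{(t-1)}}\, l\big(y_i, \hat y^{(t-1)}\big),\;\;
h_i = \partial^{2}_{\hat y^{(t-1)}}\, l\big(y_i, \hat y^{(t-1)}\big)
$$

Grouping samples by the leaf $I_j$ they fall into, with $G_j = \sum_{i\in I_j} g_i$ and $H_j = \sum_{i\in I_j} h_i$, the optimal leaf weight and the resulting structure score are

$$
w_j^{*} = -\frac{G_j}{H_j + \lambda},
\qquad
\tilde{\mathcal{L}}^{(t)} = -\tfrac{1}{2}\sum_{j=1}^{T}\frac{G_j^{2}}{H_j + \lambda} + \gamma T,
$$

and the gain used to score a candidate split into left/right instance sets is

$$
\text{Gain} = \tfrac{1}{2}\left[\frac{G_L^{2}}{H_L + \lambda} + \frac{G_R^{2}}{H_R + \lambda} - \frac{(G_L + G_R)^{2}}{H_L + H_R + \lambda}\right] - \gamma .
$$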