Commit 09b1380: Update notes
1 parent 12ce810

File tree: 2 files changed, +4 −4 lines


notes/main.typ

Lines changed: 4 additions & 4 deletions
@@ -435,21 +435,21 @@ to measure the importance of variable $x$ by comparing the loss when including $
 Ensemble learning, as the name suggests, combines multiple base models to improve prediction performance. Common ensemble learning methods include bagging, boosting, and stacking.

 === Bagging
-Bootstrap aggregating (bagging) was proposed by Leo Breiman @breiman1996bagging. The key idea is to generate multiple bootstrap samples from the original data, train the base model on each bootstrap sample, and then aggregate the predictions of all models, e.g. by averaging for regression or majority voting for classification.
+Bootstrap aggregating (bagging) was proposed by Leo Breiman @breiman1996bagging (40k+ citations). The key idea is to generate multiple bootstrap samples from the original data, train the base model on each bootstrap sample, and then aggregate the predictions of all models, e.g. by averaging for regression or majority voting for classification.

 ==== Theoretical properties of bagging

 - @breiman1996bagging shows that bagging can reduce the variance of unstable base models.
-- Peter Bühlmann and Bin Yu @buhlmann2002analyzing give a convergence-rate analysis for bagging.
+- Peter Bühlmann and Bin Yu @buhlmann2002analyzing (1.2k+ citations) give a convergence-rate analysis for bagging.
 - Many works in the 2000s studied the theory of bagging; little further work seems needed now.

 ==== Random forest

-- Leo Breiman @breiman2001random proposed the random forest, an ensemble learning method that builds multiple decision trees and merges their results to improve accuracy.
+- Leo Breiman @breiman2001random (170k+ citations) proposed the random forest, an ensemble learning method that builds multiple decision trees and merges their results to improve accuracy.

 ==== Causal forest

-- @athey2016recursive proposed the causal tree, which extends the decision tree to estimate heterogeneous treatment effects, namely the conditional average treatment effect (CATE); @wager2018estimation then used random forests to improve estimation accuracy and to provide asymptotic normality for inference.
+- @athey2016recursive (2.5k+ citations) proposed the causal tree, which extends the decision tree to estimate heterogeneous treatment effects, namely the conditional average treatment effect (CATE); @wager2018estimation (4.3k+ citations) then used random forests to improve estimation accuracy and to provide asymptotic normality for inference.

 - @cattaneo2025honest establishes an inconsistency lower bound on the pointwise convergence rate of the causal tree, and challenges the $alpha$-regularity condition (each split leaves at least a fraction $alpha$ of the available samples on each side) needed to establish the convergence rate in @wager2018estimation.
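The bagging procedure described in the notes above (draw bootstrap resamples, fit the base model on each, then average the predictions for regression or take a majority vote for classification) can be sketched as follows. This is a minimal illustration, not code from @breiman1996bagging; the function names `bagging_predict` and `fit_1nn` and the toy 1-nearest-neighbour base learner are hypothetical choices made here.

```python
import random
from statistics import mean

def bagging_predict(data, fit_base, x, n_models=25, seed=0):
    """Bagged regression prediction: average the predictions of
    base models, each fit on one bootstrap resample of `data`."""
    rng = random.Random(seed)
    n = len(data)
    preds = []
    for _ in range(n_models):
        # bootstrap sample: n draws with replacement from the original data
        boot = [data[rng.randrange(n)] for _ in range(n)]
        preds.append(fit_base(boot)(x))
    # for classification, replace the mean with a majority vote over preds
    return mean(preds)

def fit_1nn(sample):
    # toy unstable base learner: 1-nearest-neighbour regression
    # on (x, y) pairs; returns a predictor function
    return lambda q: min(sample, key=lambda p: abs(p[0] - q))[1]

data = [(0.0, 0.0), (1.0, 1.1), (2.0, 1.9), (3.0, 3.2)]
yhat = bagging_predict(data, fit_1nn, 1.4)
```

Averaging over resamples is what smooths the hard jumps of the base predictor, which is the variance-reduction effect for unstable base models noted in the bullet points above.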

static/notes/notes.pdf (128 Bytes): binary file not shown.
