notes/Master.bib
+23 lines changed: 23 additions & 0 deletions
@@ -1,3 +1,26 @@
+@article{agrawal2024automated,
+  title = {Automated efficient estimation using Monte Carlo efficient influence functions},
+  author = {Agrawal, Raj and Witty, Sam and Zane, Andy and Bingham, Elias},
+  journal = {Advances in Neural Information Processing Systems},
+  volume = {37},
+  pages = {16102--16132},
+  year = {2024}
+}
+@article{baydin2018automatic,
+  title = {Automatic differentiation in machine learning: a survey},
+  author = {Baydin, Atilim Gunes and Pearlmutter, Barak A and Radul, Alexey Andreyevich and Siskind, Jeffrey Mark},
+  journal = {Journal of Machine Learning Research},
+  volume = {18},
+  number = {153},
+  pages = {1--43},
+  year = {2018}
+}
+
+@article{paszke2017automatic,
+  title = {Automatic differentiation in {PyTorch}},
+  author = {Paszke, Adam and Gross, Sam and Chintala, Soumith and Chanan, Gregory and Yang, Edward and DeVito, Zachary and Lin, Zeming and Desmaison, Alban and Antiga, Luca and Lerer, Adam},
+  year = {2017}
+}
 @article{wang2025causal,
   title = {Causal Inference: A Tale of Three Frameworks},
   author = {Wang, Linbo and Richardson, Thomas and Robins, James},

notes/main.typ
+13 -5 lines changed: 13 additions & 5 deletions
@@ -320,6 +320,14 @@ Formulation in @van1991differentiable
 === Numerical calculation of influence function
 @mukhinkernel
 @jordan2022empirical
+@agrawal2024automated
+
+
+Automatic differentiation is remarkable! By building in the derivatives of primitive operations such as addition, multiplication, sine, and exponential, it evaluates every intermediate derivative numerically and combines them with the chain rule, so the result matches the exact closed-form derivative up to floating-point rounding.
+@paszke2017automatic @baydin2018automatic
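+
+A minimal PyTorch sketch of this (the function $sin(x) e^x$ here is just a toy example, not taken from either paper):
+
+```python
+import torch
+
+# Build f(x) = sin(x) * exp(x) from primitive ops whose derivatives are built in.
+x = torch.tensor(1.3, requires_grad=True)
+y = torch.sin(x) * torch.exp(x)
+
+# backward() combines the recorded primitive derivatives via the chain rule.
+y.backward()
+
+# Hand-derived exact formula: d/dx [sin(x) exp(x)] = (cos(x) + sin(x)) exp(x).
+exact = ((torch.cos(x) + torch.sin(x)) * torch.exp(x)).detach()
+
+print(x.grad.item(), exact.item())  # agree up to floating-point rounding
+```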
+
+A conversation with #link("https://chat.qwen.ai/s/91761ba1-a011-4804-8f3d-38eadcc90472?fev=0.1.10")[Qwen]. The delta method we use to estimate the variance of coefficients in @chen2024method requires numerical derivatives of a complicated mapping involving a nonlinear solve and an integration. If that mapping is implemented with AD-friendly functions, it becomes differentiable end to end, so #link("https://github.com/cxy0714/Method-of-Moments-Inference-for-GLMs/blob/main/demo_glm_MoM/function_of_glm_mom.R")[our R code] could be replaced with more powerful Python code.
+
 === Von Mises representation
 
 === Tangent space
@@ -332,13 +340,13 @@ S8 in @graham2024towards
 
 @wang2024multi used a slightly different Neyman orthogonality. Their problem can be summarized as follows:
 
-When the model is $X ~ PP_(theta, overline(eta))$ where $overline(eta)$ is the (nuisance) parameter and $theta$ is the finite dimensional parameter of interest and $theta = R( overline(eta) ) = limits("max")_(eta) R( eta )$ where $R(eta) = EE_(X)L(X;eta)$ and $L$ is a loss function.
+When the model is $X ~ PP_(theta, overline(eta))$, where $overline(eta)$ is the (nuisance) parameter and $theta$ is the finite-dimensional parameter of interest, with $theta = R( overline(eta) ) = limits("max")_(eta) R( eta )$, where $R(eta) = EE_(X) L(X; eta)$ and $L$ is a loss function.
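+For instance, with $L(X; eta) = -(X - eta)^2$, $R(eta)$ is maximized at $overline(eta) = EE X$, so $theta = R(overline(eta)) = -"Var"(X)$.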