September 1st, 2025
Novel semi-parametric framework.
Identifying link formation with heterogeneous externalities.
Recursive estimation integrating kernel-density and method-of-moments elements.
Making friends is easy!
Making friends was easy!

\(G_{ij} = {\Large 𝟙}\big\{ \mathrm{H}_{i} + \mathrm{H}_{j} \ge U_{ij} \big\}\)
\(G_{ij} = {\Large 𝟙}\big\{ h(X_{i}, X_{j}) + \mathrm{H}_{i} + \mathrm{H}_{j} \ge U_{ij} \big\}\)
\(h\colon {\mathcal{X}}^{2} \to \mathbb{R}\) examples:
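For instance (illustrative choices on my part, not necessarily the examples from the original slide):
\[
h(x_{i}, x_{j}) = -\lVert x_{i} - x_{j} \rVert \quad\text{(distance homophily)}, \qquad
h(x_{i}, x_{j}) = {\Large 𝟙}\{x_{i} = x_{j}\} \quad\text{(categorical match)}.
\]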
\(G_{ij} = {\Large 𝟙}\big\{ h(X_{i}, X_{j}) + \mathrm{H}_{i} + \mathrm{H}_{j} + \beta \displaystyle\sum_{k\in\gamma_{n}(i, j)} \mathrm{H}_{k} \ge U_{ij} \big\}\)
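A minimal simulation sketch of this formation rule, assuming logistic errors, a distance-based homophily function, and that \(\gamma(i, j)\) is the set of common neighbours of \(i\) and \(j\) (all of these are assumptions on my part; the model is a simultaneous system, so the single update pass below only stands in for its fixed point):

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta = 200, 0.5

# Hypothetical primitives: scalar covariates X_i, fixed effects H_i,
# symmetric logistic errors U_ij, and an assumed homophily function h.
X = rng.normal(size=n)
H = rng.normal(scale=0.5, size=n)
U = rng.logistic(size=(n, n))
U = np.triu(U, 1) + np.triu(U, 1).T          # undirected graph: symmetric errors

def h(xi, xj):
    return -np.abs(xi - xj)                  # assumed distance homophily

# Pair index without the externality term, and a baseline graph
V0 = h(X[:, None], X[None, :]) + H[:, None] + H[None, :]
G0 = (V0 >= U).astype(int)
np.fill_diagonal(G0, 0)

# gamma(i, j) taken as the common neighbours of i and j in the baseline
# graph; one update step stands in for the full equilibrium.
ext = beta * (G0 * H[None, :]) @ G0          # sum of H_k over k in gamma(i, j)
G = ((V0 + ext) >= U).astype(int)
np.fill_diagonal(G, 0)
```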
\[\begin{align*} G_{ij}' &= {\Large 𝟙}\{h'(X_{i}, X_{j}) + \mathrm{H}_{i}' + \mathrm{H}_{j}' + \beta \sum_{k\in\gamma(i, j)} \mathrm{H}_{k}' \ge U_{ij}'\} \\ &= {\Large 𝟙}\{ah(X_{i}, X_{j}) + b + a\mathrm{H}_{i} + a\mathrm{H}_{j} + \beta \sum_{k\in\gamma(i, j)} a \mathrm{H}_{k} \ge aU_{ij} + b\} \\ &= G_{ij} \end{align*}\]
Congruent Nodes:
Copies:
Lemma 1 Under assumptions (A1)-(A6), the order of the fixed effects \({\{\mathrm{H}_{i}\}}_{i\in\mathcal{J}(x)}\) is asymptotically identifiable for any collection \(\mathcal{J}(x)\) of congruent nodes with observable characteristics \(x \in \mathcal{X}\).
\[\begin{align*} \mathbb{E}\left[G_{ij}\mid \gamma(i, j)=\emptyset, \mathrm{H}_{i} = \mathrm{H}_{j} = \eta_{0}, X_{i} = X_{j} = x \right] &= \mathbb{P}\left(U_{ij} \le h(x, x) + \eta_{0} + \eta_{0}\right) \\ &= \mathbb{P}\left(U_{ij} \le u_{0}\right) \\ &= f_{0} \end{align*}\]
\[\begin{align*} f_{2} & = F_{U}\left(u_{2}\right) \\ &= \mathbb{P}\left(U_{ij} \le u_{2}\right) \\ &= \mathbb{P}\left(U_{ij} \le h(x, x) + \eta_{0} + \eta_{1} \right) \\ &= \mathbb{E}\left[G_{ij} \mid X_{i} = X_{j} = x, \mathrm{H}_{i} = \eta_{0}, \mathrm{H}_{j} = \eta_{1}, \gamma(i, j) = \emptyset\right]. \end{align*}\]
\(f = F_{U}(u) = \mathbb{E}\big[G_{ij} \mid X_{i} = X_{j} = x, \mathrm{H}_{i} = \eta', \mathrm{H}_{j} = \eta'', \gamma(i, j) = \emptyset\big]\)
\(f = F_{U}(u) = \mathbb{E}\big[G_{ij} \mid X_{i} = X_{j} = x, \mathrm{H}_{i} = \eta, \mathrm{H}_{j} = \eta_{m}, \gamma(i, j) = \emptyset\big]\)
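The conditional expectations above are simply link frequencies among the relevant congruent pairs, so each pinned-down point of \(F_{U}\) can be read off a sample mean. A hedged sketch (the pair lists and all names are hypothetical):

```python
import numpy as np

def link_frequency(G, pairs):
    """Empirical link frequency over a list of (i, j) index pairs.

    Under the conditioning of the lemma (same covariate value, known
    fixed effects, gamma(i, j) empty), this sample mean estimates
    F_U(h(x, x) + eta_i + eta_j), i.e. one point of the error CDF.
    """
    return np.mean([G[i, j] for i, j in pairs])

# Hypothetical usage: pairs_00 collects pairs with H_i = H_j = eta_0,
# pairs_01 pairs with H_i = eta_0, H_j = eta_1; comparing the two
# frequencies orders eta_1 relative to eta_0 by monotonicity of F_U.
# f0 = link_frequency(G, pairs_00)
# f2 = link_frequency(G, pairs_01)
```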
Theorem 1 Under assumptions (A1)-(A6), a known hyper-diagonal value \(h_{d}\), and an \(\alpha \in (0, 2^{-1})\), the fixed effects, the error distribution, the externality parameter, and the homophily function are asymptotically uniquely identifiable.
Recursive estimator for \(\boldsymbol{\eta}\), \(\beta\), and \(F_{U}\)
\[\begin{align} F_{U}(v) &= \mathbb{P}\left(G_{ij}=1 \mid v_{ij} = v;\, \boldsymbol{\eta},\beta\right) \\ &= \frac{\mathbb{P}\left(G_{ij}=1 \right) f_{v \mid G_{ij}=1}(v)}{\mathbb{P}\left(G_{ij}=1 \right) f_{v \mid G_{ij}=1}(v) + \mathbb{P}\left(G_{ij}=0 \right) f_{v \mid G_{ij}=0}(v)} \\ &\overset{\text{def}}{=} \frac{p_{1}(v)}{p_{1}(v) + p_{0}(v)} \end{align}\]
where \(v_{ij}\) denotes the pair index inside the indicator of the formation rule.
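The ratio can be evaluated directly once the two pieces are estimated; a minimal sketch with a Gaussian kernel (the function name and interface are mine, not the paper's):

```python
import numpy as np

def F_U_hat(v, v_pairs, g_pairs, b):
    """Estimate F_U(v) = p1(v) / (p1(v) + p0(v)) with a Gaussian kernel.

    v_pairs : array of index values v_km for the observed pairs
    g_pairs : array of link indicators g_km in {0, 1}
    b       : bandwidth
    """
    K = np.exp(-0.5 * ((v - v_pairs) / b) ** 2) / np.sqrt(2 * np.pi)
    p1 = np.sum(K * (g_pairs == 1)) / (b * len(v_pairs))   # P(G=1) * f_{v|G=1}(v)
    p0 = np.sum(K * (g_pairs == 0)) / (b * len(v_pairs))   # P(G=0) * f_{v|G=0}(v)
    return p1 / (p1 + p0)
```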
\(F_{U}\) and \(\boldsymbol{\eta}\) can be estimated simultaneously (KS or DBMM estimators)
Pick a normalized candidate \(\boldsymbol{\eta}\) and a bandwidth \(b^{0}\)
\(p_{1} (v_{ij};\, \boldsymbol{\eta})= \frac{1}{b^0(|\mathcal{M}|-1)} \displaystyle{\sum_{km \in \mathcal{M} \setminus \{ij\}}} {\Large 𝟙}_{\{g_{km}=1\}} K \left(\frac{v_{ij}-v_{km}}{b^0} \right)\)
\(p_{0} (v_{ij};\, \boldsymbol{\eta})= \frac{1}{b^0(|\mathcal{M}|-1)} \displaystyle{\sum_{km \in \mathcal{M} \setminus \{ij\}}} {\Large 𝟙}_{\{g_{km}=0\}} K \left(\frac{v_{ij}-v_{km}}{b^0} \right)\)
\(\hat p_{1} (v;\, \boldsymbol{\hat\eta^{0}})= \frac{1}{b^0|\mathcal{M}|} \displaystyle{\sum_{km \in \mathcal{M}}} {\Large 𝟙}_{\{g_{km}=1\}} K \left(\frac{v-\hat\eta_{k}^{0} -\hat\eta_{m}^{0}}{b^0} \right)\)
\(\hat p_{0} (v;\, \boldsymbol{\hat\eta^{0}})= \frac{1}{b^0|\mathcal{M}|} \displaystyle{\sum_{km \in\mathcal{M}}} {\Large 𝟙}_{\{g_{km}=0\}} K \left(\frac{v-\hat\eta_{k}^{0} -\hat\eta_{m}^{0}}{b^0} \right)\)
\(\hat p_{1} (v;\, \boldsymbol{\hat\eta^{1}}, \hat\beta^1)= \frac{1}{b^1|\mathcal{L}|} \displaystyle{\sum_{km \in \mathcal{L}}} {\Large 𝟙}_{\{g_{km}=1\}} K \left(\frac{v-\hat\eta_{k}^{1} - \hat\eta_{m}^{1} - \hat\beta^1\sum_{j\in\gamma(k,m)}\hat\eta_{j}^{1} }{b^1} \right)\)
\(\hat p_{0} (v;\, \boldsymbol{\hat\eta^{1}}, \hat\beta^1)= \frac{1}{b^1|\mathcal{L}|} \displaystyle{\sum_{km \in\mathcal{L}}} {\Large 𝟙}_{\{g_{km}=0\}} K \left(\frac{v-\hat\eta_{k}^{1} - \hat\eta_{m}^{1} - \hat\beta^1\sum_{j\in\gamma(k,m)}\hat\eta_{j}^{1} }{b^1} \right)\)
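A sketch of how the step-1 pieces above might be computed, treating \(\gamma(k, m)\) as the common neighbours of \(k\) and \(m\) and omitting the homophily term, as in the displayed formulas (all names are hypothetical):

```python
import numpy as np

def kde_step(v, eta, beta, G, pairs, b):
    """Step-1 KDE pieces p1_hat(v), p0_hat(v) including the externality term.

    eta   : current fixed-effect estimates (length-n array)
    beta  : current externality parameter estimate
    G     : adjacency matrix used to form gamma(k, m) (common neighbours)
    pairs : list of (k, m) pairs in the estimation set L
    b     : bandwidth
    """
    p1 = p0 = 0.0
    for k, m in pairs:
        gamma_km = np.flatnonzero(G[k] * G[m])            # common neighbours of k and m
        v_km = eta[k] + eta[m] + beta * eta[gamma_km].sum()
        K = np.exp(-0.5 * ((v - v_km) / b) ** 2) / np.sqrt(2 * np.pi)
        if G[k, m] == 1:
            p1 += K
        else:
            p0 += K
    scale = b * len(pairs)
    return p1 / scale, p0 / scale
```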

