Commit cf805ad: Updating all docs

pzivich committed Apr 23, 2024
1 parent 649adfd
Showing 12 changed files with 539 additions and 570 deletions.
28 changes: 24 additions & 4 deletions delicatessen/data.py
@@ -14,7 +14,13 @@ def load_shaq_free_throws():
Returns
-------
ndarray
array :
Returns a 24-by-3 NumPy array.
References
----------
Boos DD, & Stefanski LA. (2013). M-estimation (estimating equations). In Essential Statistical Inference
(pp. 297-337). Springer, New York, NY.
"""
d = np.array([[ 1, 4, 5],
[ 2, 5, 11],
@@ -43,7 +49,7 @@ def load_shaq_free_throws():


def load_inderjit():
"""Load example data from Inderjit et al. (2002) on the dose-response of herbicide on perennial ryegrass growth
"""Load example data from Inderjit et al. (2002) on the dose-response of herbicide on perennial ryegrass growth.
Notes
-----
@@ -53,7 +59,13 @@ def load_inderjit():
Returns
-------
ndarray
array :
Returns a 24-by-2 NumPy array.
References
----------
Inderjit, Streibig JC, & Olofsdotter M. (2002). Joint action of phenolic acid mixtures and its significance in
allelopathy research. *Physiologia Plantarum*, 114(3), 422-428.
"""
d = np.array([[7.5800000, 0.00],
[8.0000000, 0.00],
@@ -83,14 +95,22 @@ def load_inderjit():


def load_robust_regress(outlier=True):
"""Load illustrative example for robust linear regression.
"""Load illustrative example of robust linear regression published in Zivich et al. (2022).
Parameters
----------
outlier : bool, optional
Whether to induce the outlier (``True``) or not (``False``).
Returns
-------
array :
Returns a 15-by-2 NumPy array.
References
----------
Zivich PN, Klose M, Cole SR, Edwards JK, & Shook-Sa BE. (2022). Delicatessen: M-estimation in Python.
*arXiv:2203.11300*.
"""
height = [168.519, 166.944, 164.327, 164.058, 166.212, 167.358,
165.244, 169.352, 159.386, 166.953, 163.876,
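The loaders in `data.py` all return plain NumPy arrays of the shapes documented above. A minimal sketch of consuming one (the array literal below copies the first two Inderjit rows shown in the diff rather than making a real loader call):

```python
import numpy as np

# Illustrative stand-in for load_inderjit(): the loader returns a plain
# NumPy array whose columns are root length (outcome) and herbicide dose.
d = np.array([[7.58, 0.00],
              [8.00, 0.00]])
response = d[:, 0]   # root length column
dose = d[:, 1]       # herbicide dose column
print(d.shape)       # (2, 2)
```

The same column-slicing pattern applies to the other loaders, since each returns an observations-by-variables array rather than a DataFrame.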
41 changes: 10 additions & 31 deletions delicatessen/estimating_equations/basic.py
@@ -20,11 +20,6 @@ def ee_mean(theta, y):
\sum_{i=1}^n (Y_i - \theta) = 0
Note
----
All provided estimating equations are meant to be wrapped inside a user-specified function. Throughout, these
user-defined functions are defined as ``psi``.
Parameters
----------
theta : ndarray, list, vector
@@ -36,7 +31,7 @@ def ee_mean(theta, y):
Returns
-------
array :
Returns a 1-by-n NumPy array evaluated for the input ``theta`` and ``y``
Returns a 1-by-`n` NumPy array evaluated for the input ``theta`` and ``y``
Examples
--------
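The mean estimating equation above has a closed-form root. A minimal NumPy sketch, without assuming delicatessen is installed (`psi` here is an illustrative stand-in for the per-observation contributions that `ee_mean` returns):

```python
import numpy as np

# Sketch of the mean estimating equation: sum_i (Y_i - theta) = 0.
# psi returns the 1-by-n array of contributions, one per observation;
# the root of the summed contributions is the sample mean.
y = np.array([1.0, 2.0, 4.0, 1.0, 2.0])

def psi(theta):
    return y - theta                    # one contribution per observation

theta_hat = y.mean()                    # closed-form root of sum(psi) = 0
print(np.isclose(psi(theta_hat).sum(), 0.0))
```

In actual use, a `psi` like this is passed to delicatessen's `MEstimator`, which root-finds numerically and also produces the sandwich variance.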
@@ -89,11 +84,6 @@ def ee_mean_robust(theta, y, k, loss='huber', lower=None, upper=None):
Tukey's biweight, Andrew's Sine, and Hampel. See ``robust_loss_function`` for further details on the loss
functions for the robust mean.
Note
----
All provided estimating equations are meant to be wrapped inside a user-specified function. Throughout, these
user-defined functions are defined as ``psi``.
Parameters
----------
theta : ndarray, list, vector
@@ -105,18 +95,18 @@ def ee_mean_robust(theta, y, k, loss='huber', lower=None, upper=None):
Tuning or hyperparameter for the chosen loss function. Notice that the choice of hyperparameter depends on the
loss function.
loss : str, optional
Robust loss function to use. Default is 'huber'. Options include 'andrew', 'hampel', 'huber', 'tukey'.
Robust loss function to use. Default is ``'huber'``. Options include ``'andrew'``, ``'hampel'``, ``'huber'``, ``'tukey'``.
lower : int, float, None, optional
Lower parameter for the 'hampel' loss function. This parameter does not impact the other loss functions.
Lower parameter for the Hampel loss function. This parameter does not impact the other loss functions.
Default is ``None``.
upper : int, float, None, optional
Upper parameter for the 'hampel' loss function. This parameter does not impact the other loss functions.
Upper parameter for the Hampel loss function. This parameter does not impact the other loss functions.
Default is ``None``.
Returns
-------
array :
Returns a 1-by-n NumPy array evaluated for the input theta and y
Returns a 1-by-`n` NumPy array evaluated for the input ``theta`` and ``y``.
Examples
--------
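To illustrate the Huber option described above, here is a hedged sketch assuming the common clipped-residual form of the Huber psi (the residual truncated at plus or minus the tuning constant `k`), solved by bisection rather than delicatessen's root finder:

```python
import numpy as np

# Robust-mean sketch: residuals are clipped at +/- k before summing, so a
# single outlier contributes at most k to the estimating equation.
y = np.array([1.0, 2.0, 4.0, 1.0, 2.0, 25.0])   # 25.0 is an outlier
k = 1.345

def psi_huber(theta):
    return np.clip(y - theta, -k, k)

# The summed contributions are monotone non-increasing in theta, so a
# simple bisection between min(y) and max(y) finds the root.
lo, hi = y.min(), y.max()
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if psi_huber(mid).sum() > 0:
        lo = mid
    else:
        hi = mid
theta_hat = 0.5 * (lo + hi)
print(theta_hat < y.mean())   # robust estimate is pulled less by the outlier
```

The clipping keeps the outlier's influence bounded, which is why the robust estimate sits near the bulk of the data while the ordinary mean is dragged toward 25.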
@@ -186,12 +176,6 @@ def ee_mean_variance(theta, y):
Unlike ``ee_mean``, ``theta`` consists of 2 parameters. The output covariance matrix will also provide estimates
for each of the ``theta`` values.
Note
----
All provided estimating equations are meant to be wrapped inside a user-specified function. Throughout, these
user-defined functions are defined as ``psi``.
Parameters
----------
theta : ndarray, list, vector
@@ -204,7 +188,7 @@ def ee_mean_variance(theta, y):
Returns
-------
array :
Returns a 2-by-n NumPy array evaluated for the input theta and y
Returns a 2-by-`n` NumPy array evaluated for the input ``theta`` and ``y``.
Examples
--------
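The 2-by-`n` shape described above comes from stacking one row of contributions per parameter. A NumPy sketch of the stacked mean-and-variance equations (illustrative, not the library's own code; note the variance row uses the non-degrees-of-freedom-corrected form):

```python
import numpy as np

# Stacked estimating equations: row 0 targets the mean (theta[0]) and
# row 1 targets the variance (theta[1]); psi is 2-by-n as documented.
y = np.array([1.0, 2.0, 4.0, 1.0, 2.0])

def psi(theta):
    return np.array([y - theta[0],
                     (y - theta[0]) ** 2 - theta[1]])

theta_hat = [y.mean(), y.var()]   # np.var defaults to ddof=0
print(np.allclose(psi(theta_hat).sum(axis=1), 0.0))
```

Because both rows are solved jointly, the sandwich covariance from an M-estimator covers both parameters at once, which is the point of offering the stacked version alongside `ee_mean`.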
@@ -273,12 +257,12 @@ def ee_percentile(theta, y, q):
1-dimensional vector of n observed values. No missing data should be included (missing data may cause unexpected
behavior when attempting to calculate the percentile).
q : float
Percentile to calculate. Must be (0, 1)
Percentile to calculate. Must be :math:`(0, 1)`
Returns
-------
array :
Returns a 1-by-n NumPy array evaluated for the input theta and y
Returns a 1-by-`n` NumPy array evaluated for the input ``theta`` and ``y``.
Examples
--------
@@ -309,7 +293,7 @@ def ee_percentile(theta, y, q):
>>> estr.theta
Then displays the estimated percentile / median. In this example, there is a difference between the closed form
solution (-0.07978) and M-Estimation (-0.06022).
solution (``-0.07978``) and M-Estimation (``-0.06022``).
References
----------
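The small discrepancy between the closed-form quantile and the M-estimation solution noted above stems from the step-function shape of the percentile contributions. A sketch assuming the standard quantile estimating equation :math:`\psi_i = q - I(Y_i \le \theta)` (an illustration, not delicatessen's exact code):

```python
import numpy as np

# The per-observation contribution is q minus an indicator; the summed
# contributions jump at each observed value, so the sum need not be
# exactly zero even at the sample quantile itself.
y = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
q = 0.5

def psi(theta):
    return q - 1.0 * (y <= theta)

median = np.median(y)        # 4.0
print(psi(median).sum())     # nonzero at the sample median: the sum jumps
```

Because the sum only crosses (rather than touches) zero, a numerical root finder settles near, but not exactly at, the closed-form quantile, matching the behavior the docstring example reports.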
@@ -352,11 +336,6 @@ def ee_positive_mean_deviation(theta, y):
sandwich) cannot be used to estimate the variance. This estimating equation is offered for completeness, but is not
generally recommended for applications.
Note
----
All provided estimating equations are meant to be wrapped inside a user-specified function. Throughout, these
user-defined functions are defined as ``psi``.
Parameters
----------
theta : ndarray, list, vector
@@ -369,7 +348,7 @@ def ee_positive_mean_deviation(theta, y):
Returns
-------
array :
Returns a 2-by-n NumPy array evaluated for the input theta and y
Returns a 2-by-`n` NumPy array evaluated for the input ``theta`` and ``y``.
Examples
--------
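As a rough illustration of the two stacked parameters this equation targets (a hedged sketch: the exact contribution forms live in the delicatessen source, and the definition of the positive mean deviation with the factor of 2 is assumed here):

```python
import numpy as np

# theta[1] plays the role of the median; theta[0] the positive mean
# deviation, here taken as (2/n) * sum of (Y_i - median) over Y_i above
# the median. Computed directly rather than by root-finding.
y = np.array([1.0, 2.0, 3.0, 4.0, 9.0])

median = np.median(y)                                 # 3.0
pos_dev = np.mean(2 * (y - median) * (y > median))    # positive mean deviation
print(pos_dev)
```

As the Notes caution, the indicator in the median contribution breaks the differentiability needed for the sandwich variance, which is why the docstring recommends against this equation in applications.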
