
Vol 28, No 1 (2019)

Article

Bayesian Predictive Distribution for a Negative Binomial Model

Hamura Y., Kubokawa T.

Abstract

Estimation of the predictive probability function of a negative binomial distribution is addressed under the Kullback–Leibler risk. An identity that relates Bayesian predictive probability estimation to Bayesian point estimation is derived. Such identities are known in the cases of normal and Poisson distributions, and the paper extends the result to the negative binomial case. By using the derived identity, a dominance property of a Bayesian predictive probability is studied when the parameter space is restricted.

Mathematical Methods of Statistics. 2019;28(1):1-17
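For orientation only, the following sketch works out the simplest conjugate version of the objects mentioned in this abstract: the Bayesian predictive probability function of a negative binomial with known size r under a Beta(a, b) prior on the success probability, compared with the plug-in rule under Kullback–Leibler loss. The prior, the data-generating values, and all names are illustrative and are not taken from the paper.

```python
# Illustrative sketch (not the authors' construction): Bayesian predictive pmf for
# NB(r, p) with known size r and a conjugate Beta(a, b) prior on p, compared with
# the plug-in pmf NB(r, p_hat) under Kullback-Leibler loss.
import numpy as np
from scipy.special import betaln, gammaln
from scipy.stats import nbinom

def nb_logpmf(y, r, p):
    # log C(y + r - 1, y) + r log p + y log(1 - p)
    return gammaln(y + r) - gammaln(r) - gammaln(y + 1) + r * np.log(p) + y * np.log1p(-p)

def bayes_predictive_logpmf(y, x, r, a=1.0, b=1.0):
    """log p(y | x): beta-negative-binomial pmf with posterior parameters
    a* = a + n*r, b* = b + sum(x)."""
    a_post, b_post = a + len(x) * r, b + np.sum(x)
    return (gammaln(y + r) - gammaln(r) - gammaln(y + 1)
            + betaln(a_post + r, b_post + y) - betaln(a_post, b_post))

rng = np.random.default_rng(0)
r, p_true = 3, 0.4
x = rng.negative_binomial(r, p_true, size=20)
p_hat = r * len(x) / (r * len(x) + x.sum())          # MLE of p when r is known

grid = np.arange(0, 200)                             # truncated support for the KL sums
true_pmf = nbinom.pmf(grid, r, p_true)
kl_plugin = np.sum(true_pmf * (nb_logpmf(grid, r, p_true) - nb_logpmf(grid, r, p_hat)))
kl_bayes = np.sum(true_pmf * (nb_logpmf(grid, r, p_true) - bayes_predictive_logpmf(grid, x, r)))
print(f"KL(plug-in) = {kl_plugin:.4f}, KL(Bayes predictive) = {kl_bayes:.4f}")
```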

Density Estimation for RWRE

Havet A., Lerasle M., Moulines É.

Abstract

We consider the problem of nonparametric density estimation of a random environment from the observation of a single trajectory of a random walk in this environment. We build several density estimators using the beta-moments of the environment distribution. Then we apply the Goldenshluger–Lepski method to select an estimator satisfying an oracle-type inequality. We obtain non-asymptotic bounds for the supremum norm of these estimators that hold when the RWRE is recurrent or transient to the right. A simulation study supports our theoretical findings.

Mathematical Methods of Statistics. 2019;28(1):18-38
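The Goldenshluger–Lepski selection step can be illustrated schematically. The sketch below chooses a bandwidth for an ordinary kernel density estimator in sup-norm by comparing estimators at different bandwidths against a penalty; the penalty V(h) and the constant c are placeholders, and the paper's beta-moment estimators of the environment law are not reproduced here.

```python
# Schematic Goldenshluger-Lepski-type selection among kernel density estimators
# (a simplified comparison-of-estimators variant with an illustrative penalty).
import numpy as np

def kde(sample, grid, h):
    """Gaussian kernel density estimator evaluated on a grid."""
    z = (grid[:, None] - sample[None, :]) / h
    return np.exp(-0.5 * z**2).sum(axis=1) / (len(sample) * h * np.sqrt(2 * np.pi))

def gl_select(sample, grid, bandwidths, c=1.0):
    """Pick h minimizing A(h) + V(h), where V is a variance-type penalty and
    A(h) compares f_h with the estimators at smaller bandwidths."""
    n = len(sample)
    est = {h: kde(sample, grid, h) for h in bandwidths}
    V = {h: c * np.sqrt(np.log(n) / (n * h)) for h in bandwidths}   # placeholder penalty
    crit = {}
    for h in bandwidths:
        A = max(np.max(np.abs(est[hp] - est[h])) - V[hp] for hp in bandwidths if hp <= h)
        crit[h] = max(A, 0.0) + V[h]
    return min(crit, key=crit.get)

rng = np.random.default_rng(1)
sample = rng.beta(2, 5, size=2000)          # stand-in for draws from the environment law
grid = np.linspace(0.01, 0.99, 200)
bandwidths = np.geomspace(0.01, 0.3, 12)
print("selected bandwidth:", gl_select(sample, grid, bandwidths))
```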

A Semi-Parametric Mode Regression with Censored Data

Khardani S.

Abstract

In this work we suppose that the random vector (X, Y) satisfies the regression model Y = m(X) + ϵ, where m(·) belongs to a parametric class \(\{m_\beta(\cdot) : \beta \in \mathbb{K}\}\) and the error ϵ is independent of the covariate X. The response Y is subject to random right censoring. Using nonlinear mode regression, a new estimation procedure for the true unknown parameter vector \(\beta_0\) is proposed that extends the classical least squares procedure for nonlinear regression. We establish asymptotic properties of the proposed estimator under assumptions on the error density, and we investigate its performance through a simulation study.

Mathematical Methods of Statistics. 2019;28(1):39-56
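A minimal version of the mode-regression idea, without the censoring adjustment developed in the paper, can be sketched as follows: instead of minimizing squared residuals, one maximizes a kernel-smoothed measure of how much residual mass lies near zero. The model m_beta, the error law, and the tuning constants below are illustrative only.

```python
# Kernel mode-regression sketch for a parametric nonlinear model (no censoring).
import numpy as np
from scipy.optimize import minimize

def m(beta, x):                              # illustrative nonlinear model m_beta(x)
    return beta[0] * np.exp(beta[1] * x)

def mode_objective(beta, x, y, h=0.3):
    # Average Gaussian kernel at the residuals: large when the residual density
    # is concentrated near zero, i.e. when m_beta tracks the conditional mode.
    resid = y - m(beta, x)
    return -np.mean(np.exp(-0.5 * (resid / h) ** 2))

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 300)
eps = rng.exponential(0.3, 300)              # error with mode 0 but nonzero mean,
y = m([1.0, 1.5], x) + eps                   # so least squares is off-target here
fit = minimize(mode_objective, x0=[0.8, 1.0], args=(x, y), method="Nelder-Mead")
print("estimated beta:", fit.x)
```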

On the Power of Pearson’s Test under Local Alternatives in Autoregression with Outliers

Boldin M.V.

Abstract

We consider a stationary linear AR(p) model with contamination (gross errors in the observations). The autoregression parameters are unknown, as well as the distribution of innovations. Based on the residuals from the parameter estimates, an analog of the empirical distribution function is defined and a test of Pearson’s chi-square type is constructed for testing hypotheses on the distribution of innovations. We obtain the asymptotic power of this test under local alternatives and establish its qualitative robustness under the hypothesis and alternatives.

Mathematical Methods of Statistics. 2019;28(1):57-65
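A bare-bones version of the residual-based Pearson test reads as follows; the least-squares fit, the number of cells, and the classical chi-square reference distribution are placeholders, and the paper's contamination model and local-alternative analysis are not reproduced.

```python
# Pearson-type chi-square test applied to AR(p) residuals (schematic).
import numpy as np
from scipy import stats

def fit_ar_ols(x, p):
    """Least-squares AR(p) coefficients and residuals from one trajectory."""
    Y = x[p:]
    X = np.column_stack([x[p - k:-k] for k in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return coef, Y - X @ coef

rng = np.random.default_rng(3)
n, p = 1000, 2
eps = rng.standard_normal(n)                       # innovations under H0: standard normal
x = np.zeros(n)
for t in range(p, n):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + eps[t]

coef, resid = fit_ar_ols(x, p)
k = 10                                             # number of cells
edges = stats.norm.ppf(np.arange(1, k) / k)        # equiprobable cells under H0
observed = np.bincount(np.digitize(resid, edges), minlength=k)
expected = len(resid) / k
chi2 = np.sum((observed - expected) ** 2 / expected)
# Classical reference; with estimated AR parameters the limit law can differ.
print(f"chi2 = {chi2:.2f}, 95% quantile of chi2({k - 1}) = {stats.chi2.ppf(0.95, k - 1):.2f}")
```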

A Large Deviation Approximation for Multivariate Density Functions

Joutard C.

Abstract

We establish a large deviation approximation for the density of an arbitrary sequence of random vectors, under several assumptions on the normalized cumulant generating function and its derivatives. We give two statistical applications to illustrate the result, the first dealing with a vector of independent sample variances and the second with a Gaussian multiple linear regression model. Numerical comparisons are finally provided for these two examples.

Mathematical Methods of Statistics. 2019;28(1):66-73
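To fix ideas about the objects involved, the classical special case of such approximations, for the mean \(\bar{X}_n\) of n i.i.d. d-dimensional vectors with steep cumulant generating function \(\Lambda(t) = \log E\,e^{\langle t, X_1\rangle}\), reads

\[ f_n(x) = \Big(\frac{n}{2\pi}\Big)^{d/2} \det\big(\nabla^2\Lambda(\tau)\big)^{-1/2} e^{-n\Lambda^*(x)}\,\big(1 + o(1)\big), \qquad \Lambda^*(x) = \sup_t\{\langle t, x\rangle - \Lambda(t)\}, \]

where \(f_n\) is the density of \(\bar{X}_n\) and \(\tau = \tau(x)\) solves \(\nabla\Lambda(\tau) = x\). This textbook saddlepoint form is stated here only for orientation; it is not the paper's general theorem, which concerns an arbitrary sequence of random vectors.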

Outliers and the Ostensibly Heavy Tails

Klebanov L., Volchenkova I.

Abstract

The aim of the paper is to show that the presence of one possible type of outliers is not connected with heavy tails of the distribution. On the contrary, the typical situation for the appearance of such outliers is that of compactly supported distributions.

Mathematical Methods of Statistics. 2019;28(1):74-81