On Predictive Density Estimation under α-Divergence Loss
- Authors: L’Moudden A., Marchand È.
- Institutions: Dept. de math.
- Issue: Vol. 28, No. 2 (2019)
- Pages: 127-143
- Section: Article
- URL: https://journal-vniispk.ru/1066-5307/article/view/225906
- DOI: https://doi.org/10.3103/S1066530719020030
- ID: 225906
Abstract
Based on X ∼ N_d(θ, σ_X² I_d), we study the efficiency of predictive densities under α-divergence loss L_α for estimating the density of Y ∼ N_d(θ, σ_Y² I_d). We identify a large number of cases where improvements on a plug-in density are obtainable by expanding the variance, thus extending earlier findings applicable to Kullback-Leibler loss. The results and proofs are unified with respect to the dimension d, the variances σ_X² and σ_Y², and the choice of loss L_α, α ∈ (−1, 1). The findings also apply to a large number of plug-in densities, as well as to restricted parameter spaces with θ ∈ Θ ⊂ ℝ^d. The theoretical findings are accompanied by various observations, illustrations, and implications dealing, for instance, with robustness with respect to the model variances and simultaneous dominance with respect to the loss.
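To make the variance-expansion phenomenon concrete, here is a minimal numerical sketch (not the authors' code). It assumes the common α-divergence convention D_α(p, q̂) = 4/(1 − α²) (1 − ∫ p(y)^((1−α)/2) q̂(y)^((1+α)/2) dy) for α ∈ (−1, 1), uses the closed form of this integral for isotropic normals, and compares, by Monte Carlo over X, the risk of the plug-in density N_d(X, σ_Y² I_d) with that of a variance-expanded density N_d(X, (σ_X² + σ_Y²) I_d); the function name and parameter choices are illustrative only, and the paper's results cover far more general expansions and plug-in densities.

```python
import numpy as np

def alpha_risk(alpha, d, sigma_x2, sigma_y2, c2, n_rep=200_000, seed=0):
    """Monte Carlo alpha-divergence risk of the predictive density
    q_hat = N_d(X, c2 * I_d) for Y ~ N_d(theta, sigma_y2 * I_d),
    averaging over X ~ N_d(theta, sigma_x2 * I_d).

    Assumed convention, for alpha in (-1, 1):
        D_alpha(p, q_hat) = 4/(1 - alpha^2) *
            (1 - integral of p^{(1-alpha)/2} * q_hat^{(1+alpha)/2} dy).
    For isotropic normals this integral depends on X only through
    ||X - theta||^2 ~ sigma_x2 * chi^2_d, so theta drops out of the risk.
    """
    rng = np.random.default_rng(seed)
    dist2 = sigma_x2 * rng.chisquare(d, size=n_rep)   # draws of ||X - theta||^2
    beta = (1.0 + alpha) / 2.0                        # exponent on q_hat
    a = (1.0 - beta) / sigma_y2                       # precision weight of the true density
    b = beta / c2                                     # precision weight of q_hat
    log_integral = (-0.5 * d * (1.0 - beta) * np.log(sigma_y2)
                    - 0.5 * d * beta * np.log(c2)
                    - 0.5 * d * np.log(a + b)
                    - a * b * dist2 / (2.0 * (a + b)))
    return np.mean(4.0 / (1.0 - alpha**2) * (1.0 - np.exp(log_integral)))

# Illustrative comparison at alpha = 0 (Hellinger-type loss), d = 3, unit variances
d, sx2, sy2 = 3, 1.0, 1.0
plug_in  = alpha_risk(alpha=0.0, d=d, sigma_x2=sx2, sigma_y2=sy2, c2=sy2)        # N_d(X, sigma_Y^2 I_d)
expanded = alpha_risk(alpha=0.0, d=d, sigma_x2=sx2, sigma_y2=sy2, c2=sx2 + sy2)  # expanded variance
print(f"plug-in risk  ~ {plug_in:.4f}")
print(f"expanded risk ~ {expanded:.4f}")   # typically smaller in this setting
```

With these illustrative settings the expanded-variance density shows a smaller estimated risk than the plug-in, consistent with the improvements by variance expansion described in the abstract; the precise conditions under which such dominance holds are what the paper establishes.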
About the authors
A. L’Moudden
Dept. de math.
Corresponding author.
Email: aziz.lmoudden@usherbrooke.ca
Sherbrooke, Qc, Canada
È. Marchand
Dept. de math.
Corresponding author.
Email: eric.marchand@usherbrooke.ca
Sherbrooke, Qc, Canada
