Scientific Library of Tomsk State University


Minimum variance and minimum Kullback-Leibler mean estimation with a known quantile / Z. N. Zenkova, W. Musoni, S. S. Tarima

By: Zenkova, Zhanna N.
Contributor(s): Musoni, Wilson | Tarima, Sergey S.
Material type: Article
Content type: text
Media type: electronic
Subject(s): cumulative distribution function | sample mean | Kullback-Leibler divergence | quantiles
Genre/Form: articles in collections
Online resources: available online
In: Fifth International Conference on Stochastic Methods (МКСМ-5): proceedings of the international scientific conference, Moscow, Russia, November 23-27, 2020, pp. 236-240
Abstract: This work compares two mean estimators, MV and MKL, both of which incorporate information about a known quantile. MV minimizes variance and MKL minimizes the Kullback-Leibler divergence. The two estimators are asymptotically equivalent and normally distributed but differ at finite sample sizes. Monte Carlo simulation studies show that MV has a higher mean squared error than MKL in the majority of simulated scenarios. The authors recommend using MKL when a quantile of the underlying distribution is known.
No physical items for this record

Bibliography: 14 references.
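
The record gives only the abstract, but the comparison it describes is easy to sketch. Below is a minimal Monte Carlo illustration in Python of how a known quantile can be folded into mean estimation: the plain sample mean is compared with a variance-minimizing linear adjustment (one standard "MV"-type construction, using the indicator I(X <= q) as a control variate with known mean p) and with a Kullback-Leibler projection of the empirical distribution onto the quantile constraint (one standard "MKL"-type construction). The concrete constructions, the lognormal test case, and all function names here are illustrative assumptions, not the estimators defined in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def mkl_mean(x, q, p):
    # KL projection of the empirical distribution onto {F : F(q) = p}:
    # observations at or below q share total weight p, the rest 1 - p.
    # Illustrative construction; the paper's MKL estimator may differ.
    below = x <= q
    n_below = below.sum()
    if n_below == 0 or n_below == x.size:
        return x.mean()                      # constraint not enforceable
    w = np.where(below, p / n_below, (1 - p) / (x.size - n_below))
    return float(np.sum(w * x))

def mv_mean(x, q, p):
    # Variance-minimizing linear adjustment  X̄ - λ (p̂ - p), with the
    # plug-in optimal λ = Cov(X, I(X <= q)) / Var(I(X <= q)).
    # Illustrative construction; the paper's MV estimator may differ.
    ind = (x <= q).astype(float)
    if ind.var(ddof=1) == 0.0:
        return x.mean()
    lam = np.cov(x, ind, ddof=1)[0, 1] / ind.var(ddof=1)
    return float(x.mean() - lam * (ind.mean() - p))

# Monte Carlo MSE comparison: LogNormal(0, 1) with its median known.
p, n, reps = 0.5, 30, 20_000
q = 1.0                    # median of LogNormal(0, 1)
true_mean = np.exp(0.5)    # mean of LogNormal(0, 1)

results = {"sample mean": [], "MV": [], "MKL": []}
for _ in range(reps):
    x = rng.lognormal(0.0, 1.0, n)
    results["sample mean"].append(x.mean())
    results["MV"].append(mv_mean(x, q, p))
    results["MKL"].append(mkl_mean(x, q, p))

for name, vals in results.items():
    print(f"{name:>11s}: MSE = {np.mean((np.asarray(vals) - true_mean) ** 2):.4f}")
```

Both adjusted estimators exploit the same piece of information, F(q) = p, in different ways: MV corrects the sample mean linearly, while MKL reweights the empirical distribution so that the constraint holds exactly. Running the sketch shows how the finite-sample mean squared errors of such estimators can separate even though the estimators are asymptotically equivalent, which is the phenomenon the abstract reports.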

