Qualitative Properties of Randomized Maximum Entropy Estimates of Probability Density Functions

Bibliographic Details
Published in: Mathematics
Main Author: Yuri S. Popkov
Format: Article in Journal/Newspaper
Language: English
Published: MDPI AG 2021
Online Access: https://doi.org/10.3390/math9050548
https://doaj.org/article/555f4df5ad7e43869a20acb13498a4d2
Description
Summary: The problem of randomized maximum entropy estimation of the probability density function of random model parameters from real data with measurement noise is formulated. The estimation procedure maximizes an information entropy functional on a set of integral equalities that depend on the real data set. A technique based on Gâteaux derivatives is developed to solve this problem in analytical form. The probability density function estimates depend on Lagrange multipliers, which are obtained by balancing the model's output with the real data. A global theorem on the implicit dependence of these Lagrange multipliers on the length of the data sample is established using the rotation of homotopic vector fields. A theorem on the asymptotic efficiency of the randomized maximum entropy estimate, stated in terms of the stationary Lagrange multipliers, is formulated and proved. The proposed method is illustrated by forecasting the evolution of the thermokarst lake area in Western Siberia.
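For orientation, the LaTeX sketch below writes out a generic constrained entropy maximization of the kind described in the summary and the exponential-family form that its stationarity (Gâteaux derivative) condition yields. The specific entropy functional, the constraint maps phi_k, the measurement-noise model, and the data terms y_k are assumptions of this illustration and are not taken from the article itself.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Illustrative form of a randomized maximum entropy problem (assumed notation,
% not the paper's exact formulation).
Maximize the entropy functional over densities $P(\theta)$ of the model parameters,
\[
  H[P] \;=\; -\int_{\Theta} P(\theta)\,\ln P(\theta)\,d\theta \;\longrightarrow\; \max_{P},
\]
subject to normalization and data-balance (integral equality) constraints,
\[
  \int_{\Theta} P(\theta)\,d\theta = 1,
  \qquad
  \int_{\Theta} \phi_k(\theta)\,P(\theta)\,d\theta = y_k, \quad k = 1,\dots,m,
\]
where $\phi_k$ stands for a model output map and $y_k$ for an observed data term.
Setting the G\^{a}teaux derivative of the Lagrangian to zero gives the
exponential-family estimate
\[
  P^{*}(\theta) \;=\;
  \frac{\exp\!\bigl(-\sum_{k=1}^{m} \lambda_k\,\phi_k(\theta)\bigr)}
       {\int_{\Theta} \exp\!\bigl(-\sum_{k=1}^{m} \lambda_k\,\phi_k(\theta')\bigr)\,d\theta'},
\]
with the Lagrange multipliers $\lambda_1,\dots,\lambda_m$ determined by substituting
$P^{*}$ back into the balance equations, i.e., by balancing the model's output with
the real data.
\end{document}

In this generic setting, the multipliers solve a finite-dimensional system of nonlinear equations; the summary's results on their implicit dependence on the sample length and on asymptotic efficiency concern the behavior of these stationary multipliers.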