Relative information spectra with applications to statistical inference
Main Author:
Format: Article
Language: English
Published: AIMS Press, 2024-12-01
Series: AIMS Mathematics
Subjects:
Online Access: https://www.aimspress.com/article/doi/10.3934/math.20241668
Summary: For any pair of probability measures defined on a common space, their relative information spectra (specifically, the distribution functions of the log-likelihood ratio under either probability measure) fully encapsulate all that is relevant for distinguishing them. This paper explores the properties of the relative information spectra and their connections to various measures of discrepancy, including total variation distance, relative entropy, Rényi divergence, and general $f$-divergences. A simple definition of sufficient statistics, termed $I$-sufficiency, is introduced and shown to coincide with longstanding notions under the assumptions that the data model is dominated and the observation space is standard. Additionally, a new measure of discrepancy between probability measures, the NP-divergence, is proposed and shown to determine the area of the error probability pairs achieved by the Neyman-Pearson binary hypothesis tests. For independent identically distributed data models, that area is shown to approach 1 at a rate governed by the Bhattacharyya distance. (A brief notation sketch follows the record below.)
ISSN: 2473-6988
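For reference alongside the summary, here is a minimal notation sketch of the quantities it names, assuming $P \ll Q$ on the common space; the article's own notation, and its treatment of the general (non-dominated) case, may differ. The relative information spectra are the distribution functions of the relative information (the log-likelihood ratio) under each measure, and the Bhattacharyya distance appearing in the rate statement is the standard one:

$$
\imath_{P\|Q}(x) \;=\; \log\frac{\mathrm{d}P}{\mathrm{d}Q}(x), \qquad
\mathbb{F}_{P\|Q}(\alpha) \;=\; P\!\left[\,\imath_{P\|Q}(X)\le\alpha\,\right], \qquad
\overline{\mathbb{F}}_{P\|Q}(\alpha) \;=\; Q\!\left[\,\imath_{P\|Q}(X)\le\alpha\,\right],
$$
$$
B(P,Q) \;=\; -\log\int\sqrt{\mathrm{d}P\,\mathrm{d}Q}.
$$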