Title: Cumulative α-Jensen–Shannon measure of divergence: Properties and applications
Authors: Haniyeh Riyahi, Mohammad Baratnia, Mahdi Doostparast (access to the full text not allowed by the authors)
Abstract
The problem of quantifying the distance between distributions arises in various fields, including cryptography, information theory, communication networks, machine learning, and data mining. In this paper, by analogy with the cumulative Jensen–Shannon divergence, defined in [Non-parametric Jensen–Shannon divergence, in Joint European Conference on Machine Learning and Knowledge Discovery in Databases, Springer, 2015, 173–189], we propose a new divergence measure based on the cumulative distribution function and call it the cumulative α-Jensen–Shannon divergence, denoted by CJS(α). Properties of CJS(α) are studied in detail, and two upper bounds for CJS(α) are obtained. Simplified results under the proportional reversed hazard rate model are given, and various illustrative examples are analyzed.
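The abstract does not reproduce the formula for CJS(α), so the sketch below is only an illustration of the general idea: replace densities with empirical cumulative distribution functions in a weighted Jensen–Shannon-type integrand. The function names, the mixing weight α, and the exact weighting of the two terms are assumptions here, not the paper's definition.

```python
import numpy as np

def ecdf(sample, grid):
    """Empirical CDF of `sample` evaluated at each point of `grid`."""
    s = np.sort(np.asarray(sample, dtype=float))
    return np.searchsorted(s, grid, side="right") / s.size

def cumulative_js(x, y, alpha=0.5, n_grid=1000):
    """Hypothetical discretization of a cumulative alpha-JS-type divergence.

    Replaces densities by empirical CDFs F, G and integrates the weighted
    Jensen-Shannon integrand  alpha*F*log(F/M) + (1-alpha)*G*log(G/M),
    with mixture M = alpha*F + (1-alpha)*G, over a uniform grid.
    The paper's exact CJS(alpha) may differ; this is an illustrative sketch.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    grid = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), n_grid)
    F, G = ecdf(x, grid), ecdf(y, grid)
    M = alpha * F + (1.0 - alpha) * G
    safe_M = np.where(M > 0, M, 1.0)  # avoid log(0); zero terms are masked below
    term = (
        np.where(F > 0, alpha * F * np.log(np.where(F > 0, F, 1.0) / safe_M), 0.0)
        + np.where(G > 0, (1.0 - alpha) * G * np.log(np.where(G > 0, G, 1.0) / safe_M), 0.0)
    )
    dx = grid[1] - grid[0]
    # Trapezoidal rule on the uniform grid
    return float(dx * (term.sum() - 0.5 * (term[0] + term[-1])))
```

With this weighting the integrand is pointwise nonnegative (by the log-sum inequality), so the sketch returns 0 for identical samples and a positive value for separated ones.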
Keywords
Distribution function; Entropy; Kullback–Leibler divergence; Proportional reversed hazard rate model

@article{paperid:1093893,
author = {Riyahi, Haniyeh and Baratnia, Mohammad and Doostparast, Mahdi},
title = {Cumulative α-Jensen–Shannon measure of divergence: Properties and applications},
journal = {Communications in Statistics - Theory and Methods},
year = {2023},
volume = {53},
number = {17},
month = {August},
issn = {0361-0926},
pages = {5989--6011},
numpages = {22},
keywords = {Distribution function; Entropy; Kullback–Leibler divergence; Proportional reversed hazard rate model},
}