Communications in Statistics - Theory and Methods (ISI), August 2023

Title: Cumulative α-Jensen–Shannon measure of divergence: Properties and applications

Authors: Haniyeh Riyahi, Mohammad Baratnia, Mahdi Doostparast

Access to the full text is not permitted by the authors.

Citation: BibTeX | EndNote

Abstract

The problem of quantifying the distance between distributions arises in various fields, including cryptography, information theory, communication networks, machine learning, and data mining. In this paper, in analogy with the cumulative Jensen–Shannon divergence, defined in [Non-parametric Jensen–Shannon divergence, in Joint European Conference on Machine Learning and Knowledge Discovery in Databases, Springer, 2015, 173–189], we propose a new divergence measure based on the cumulative distribution function and call it the cumulative α-Jensen–Shannon divergence, denoted by CJS(α). Properties of CJS(α) are studied in detail, and two upper bounds for CJS(α) are obtained. Simplified results under the proportional reversed hazard rate model are given. Various illustrative examples are analyzed.
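The abstract does not reproduce the paper's exact definition of CJS(α). As a rough illustration only, the sketch below assumes the common α-weighted generalization of the cumulative Jensen–Shannon divergence: the cumulative Kullback–Leibler divergence between CDFs, CKL(F‖G) = ∫ F log(F/G) dx + ∫ (G − F) dx, applied against the mixture M = αF + (1 − α)G. The function names `cumulative_kl` and `cjs_alpha`, the mixture form, and the empirical-CDF estimation are all assumptions for illustration, not the paper's construction.

```python
import numpy as np

def cumulative_kl(F, G, dx):
    # Cumulative KL divergence between two CDFs evaluated on a common grid:
    # CKL(F||G) = integral of F*log(F/G) dx + integral of (G - F) dx.
    eps = 1e-12
    mask = F > eps  # integrand vanishes where F = 0
    term1 = np.sum(F[mask] * np.log(F[mask] / np.maximum(G[mask], eps))) * dx
    term2 = np.sum(G - F) * dx
    return term1 + term2

def cjs_alpha(x, y, alpha=0.5, grid_size=1000):
    # Illustrative cumulative alpha-JS divergence between two samples
    # (an assumption: alpha-weighted mixture M = alpha*F + (1-alpha)*G).
    lo, hi = min(x.min(), y.min()), max(x.max(), y.max())
    grid = np.linspace(lo, hi, grid_size)
    dx = grid[1] - grid[0]
    # Empirical CDFs of each sample on the common grid.
    F = np.searchsorted(np.sort(x), grid, side="right") / len(x)
    G = np.searchsorted(np.sort(y), grid, side="right") / len(y)
    M = alpha * F + (1 - alpha) * G
    return alpha * cumulative_kl(F, M, dx) + (1 - alpha) * cumulative_kl(G, M, dx)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 5000)
y = rng.normal(0.0, 1.0, 5000)   # same distribution: divergence near zero
z = rng.normal(2.0, 1.0, 5000)   # shifted distribution: larger divergence
print(cjs_alpha(x, y), cjs_alpha(x, z))
```

Because each pointwise integrand a·log(a/b) + (b − a) is nonnegative for a, b > 0, this quantity is nonnegative and equals zero only when the two CDFs coincide, which is the qualitative behavior one expects of a divergence measure.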

Keywords

Distribution function; Entropy; Kullback–Leibler divergence; Proportional reversed hazard rate model

@article{paperid:1093893,
author = {Riyahi, Haniyeh and Baratnia, Mohammad and Doostparast, Mahdi},
title = {Cumulative α-Jensen–Shannon measure of divergence: Properties and applications},
journal = {Communications in Statistics - Theory and Methods},
year = {2023},
month = {August},
issn = {0361-0926},
keywords = {Distribution function; Entropy; Kullback–Leibler divergence; Proportional reversed hazard rate model},
}


%0 Journal Article
%T Cumulative α-Jensen–Shannon measure of divergence: Properties and applications
%A Riyahi, Haniyeh
%A Baratnia, Mohammad
%A Doostparast, Mahdi
%J Communications in Statistics - Theory and Methods
%@ 0361-0926
%D 2023
