Workshop on Copulas and their Applications, 2015-02-26

Title: Properties of Csiszar φ-divergence and links via copula function

Authors: Gholam Reza Mohtashami Borzadaran, Mohammad Amini

Citation: BibTeX | EndNote

Abstract

After Shannon's (1948) work, the notion of a measure of information was extended, with applications in communication theory. The paper of Kullback and Leibler (1951) and Kullback's subsequent book established the link between statistical inference and information theory, with applications in various fields. The Kullback-Leibler (KL) information is a fundamental quantity in probability and statistics that measures the similarity of two distributions. Many divergences, such as the Renyi, Hellinger, χ2, and Tsallis divergences and the Bhattacharyya distance, have attracted the attention of researchers; via the KL information idea, interpretations of these divergences have been applied to discovering phenomena in many fields. We survey generalizations of entropy and of divergences between two probability density functions (or pmfs). Csiszar (1963) defined a general divergence that subsumes many well-known measures. We discuss and review properties of this divergence and most of its special cases, concentrating mainly on bivariate distributions. The mutual information can be expressed as a distance to statistical independence in the space of distributions, measured by the KL information between the actual joint distribution and the product of the two marginals. Basic general properties of φ-divergences, including their axiomatics, and some important classes of φ-divergences are derived via the distance between the joint distribution and the product of its marginals; this can be related to dependence. Minimization of this divergence under constraints is also a direction of this work. The copula, introduced by Sklar (1959), constructs a bivariate (or multivariate) distribution function from its marginal distribution functions. Connecting the above results to their copula-based counterparts is the interest of this paper.
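As an illustrative sketch only (not taken from the paper), the Csiszar φ-divergence for discrete distributions can be written as D_φ(P‖Q) = Σ q(x)·φ(p(x)/q(x)); choosing φ(t) = t·log t recovers the KL information, and applying it to a joint pmf against the product of its marginals gives the mutual information described above. The toy joint pmf below is an assumption for demonstration:

```python
import numpy as np

def phi_divergence(p, q, phi):
    """Csiszar phi-divergence D_phi(P||Q) = sum_x q(x) * phi(p(x)/q(x))
    for discrete distributions (assumes q > 0 wherever p > 0)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(q * phi(p / q)))

# phi(t) = t*log(t) recovers the Kullback-Leibler information
def kl(p, q):
    return phi_divergence(p, q, lambda t: np.where(t > 0, t * np.log(t), 0.0))

# Mutual information as the KL distance from the joint pmf to independence
joint = np.array([[0.3, 0.1],     # toy joint pmf of (X, Y), an assumption
                  [0.1, 0.5]])
px = joint.sum(axis=1)            # marginal of X
py = joint.sum(axis=0)            # marginal of Y
indep = np.outer(px, py)          # product of the two marginals
mi = kl(joint.ravel(), indep.ravel())
```

Since the toy joint pmf differs from the product of its marginals, `mi` is strictly positive, while `kl` of any distribution with itself is zero.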
The copula entropy, defined as the difference between the joint entropy and the sum of the marginal entropies, plays an important role. Our attention is on φ-divergence based on copula entropy and dependence. The principle of maximum entropy provides a method for selecting the unknown pdf (or pmf) that maximizes entropy under specified constraints. This idea was introduced by Jaynes (1957) and formalized in a theorem by Kagan et al. (1973). Arguments similar to those used in minimizing the Kullback-Leibler information are applied to the minimization of φ-divergence under specified constraints. Maximum copula entropy under a constraint is also obtained, with a simple example. Finding the copula entropy of bivariate equilibrium distributions, and an inequality for the φ-divergence under weak negative dependence illustrated through some distributions, form the last part of the paper.
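The copula-entropy identity stated above can be checked numerically in the discrete analogue: H(X,Y) − H(X) − H(Y) equals the negative of the mutual information, so it is nonpositive and vanishes exactly at independence. This is a minimal sketch with an assumed toy joint pmf, not code from the paper:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution, ignoring zero cells."""
    p = np.asarray(p, float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Toy joint pmf for (X, Y) -- an assumption for illustration
joint = np.array([[0.3, 0.1],
                  [0.1, 0.5]])
hx = entropy(joint.sum(axis=1))   # marginal entropy of X
hy = entropy(joint.sum(axis=0))   # marginal entropy of Y
hxy = entropy(joint)              # joint entropy

# Copula entropy = joint entropy minus the sum of marginal entropies,
# i.e. the negative of the mutual information
copula_entropy = hxy - hx - hy
```

Because the toy variables are dependent, `copula_entropy` comes out strictly negative, matching the negated mutual information of the same joint pmf.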
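The maximum-entropy selection principle mentioned above can also be sketched for a finite support with a fixed-mean constraint: the maximizer has exponential-family form p_i ∝ exp(λ·x_i), and λ can be found by bisection. The support, mean, and bracketing interval below are assumptions for illustration, not from the paper:

```python
import numpy as np

def maxent_pmf(support, mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy pmf on a finite support subject to a fixed mean.
    The solution has the form p_i proportional to exp(lam * x_i); the
    multiplier lam is found by bisection on the (monotone) mean constraint."""
    x = np.asarray(support, float)

    def mean_of(lam):
        w = np.exp(lam * x - np.max(lam * x))  # shift exponents for stability
        p = w / w.sum()
        return p @ x

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_of(mid) < mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = np.exp(lam * x - np.max(lam * x))
    return w / w.sum()

# When the required mean sits at the support's midpoint, the constraint is
# inactive and the entropy maximizer is the uniform pmf
p = maxent_pmf([0, 1, 2], mean=1.0)
```

For an off-center mean such as 1.5 the same routine returns a tilted (non-uniform) pmf whose mean matches the constraint.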

Keywords

Maximum entropy, Phi divergence, Renyi entropy, Copula, Dependence

@inproceedings{paperid:1047293,
author = {Mohtashami Borzadaran, Gholam Reza and Amini, Mohammad},
title = {Properties of Csiszar φ-divergence and links via copula function},
booktitle = {Workshop on Copulas and their Applications},
year = {2015},
location = {Kerman, IRAN},
keywords = {Maximum entropy; Phi divergence; Renyi entropy; Copula; Dependence},
}


%0 Conference Proceedings
%T Properties of Csiszar φ-divergence and links via copula function
%A Mohtashami Borzadaran, Gholam Reza
%A Amini, Mohammad
%J Workshop on Copulas and their Applications
%D 2015
