
Some Bounds of Information Divergence Measures In Terms of Relative Arithmetic-Geometric Divergence

K.C. Jain, Ram Naresh Saraswat


Information and divergence measures are very useful and play an important role in many areas, such as sensor networks [11], testing the order in a Markov chain [12], risk for binary experiments [13], and region segmentation and estimation [14]. In this paper we establish upper and lower bounds for the Chi-square divergence, relative J-divergence, Jensen-Shannon divergence, triangular discrimination, etc., in terms of the relative arithmetic-geometric divergence measure, using a new f-divergence measure and associated inequalities.


Chi-square divergence, Jensen-Shannon divergence, Triangular discrimination, etc.
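For readers unfamiliar with the measures named above, a minimal sketch of their standard discrete forms may help. The definitions below are the forms commonly used in the f-divergence literature (the relative arithmetic-geometric divergence follows Taneja's convention); the function names are illustrative, not from the paper.

```python
import math

def chi_square(p, q):
    """Chi-square divergence: sum of (p_i - q_i)^2 / q_i."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

def triangular(p, q):
    """Triangular discrimination: sum of (p_i - q_i)^2 / (p_i + q_i)."""
    return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: average KL divergence of p and q
    to their midpoint distribution (p + q)/2."""
    return 0.5 * sum(pi * math.log(2 * pi / (pi + qi)) +
                     qi * math.log(2 * qi / (pi + qi))
                     for pi, qi in zip(p, q))

def relative_ag(p, q):
    """Relative arithmetic-geometric divergence:
    sum of ((p_i + q_i)/2) * log((p_i + q_i) / (2 * p_i))."""
    return sum((pi + qi) / 2 * math.log((pi + qi) / (2 * pi))
               for pi, qi in zip(p, q))

# Two example probability distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(chi_square(p, q), triangular(p, q),
      jensen_shannon(p, q), relative_ag(p, q))
```

All four quantities are nonnegative and vanish when p = q; the triangular discrimination and Jensen-Shannon divergence are symmetric in their arguments, while the chi-square and relative arithmetic-geometric divergences are not. Bounds of the kind the paper establishes relate such non-symmetric measures to the relative arithmetic-geometric divergence.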

