This article was downloaded by: [University of Limerick], on 22 April 2013, at 08:19. Publisher: Taylor & Francis. Informa Ltd, Registered in England and Wales, Registered Number 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.
Communications in Statistics - Theory and Methods. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/lsta20
An information improvement generating function. D.S. Hooda & Umed Singh, Dept. of Mathematics & Statistics, Haryana Agricultural University, Hisar 125004, India. Version of record first published: 27 Jun 2007.
To cite this article: D.S. Hooda & Umed Singh (1990): An information improvement generating function, Communications in Statistics - Theory and Methods, 19:3, 1037-1046
To link to this article: http://dx.doi.org/10.1080/03610929008830245
COMMUN. STATIST.-THEORY METH., 19(3), 1037-1046 (1990)
AN INFORMATION IMPROVEMENT GENERATING FUNCTION
D.S. Hooda and Umed Singh
Deptt. of Mathematics & Statistics, Haryana Agricultural University,
Hisar-125 004 (INDIA)
Key Words and Phrases: moment generating function; information generating function; information improvement; discrete distribution; variation of information; probability and reference measures.
ABSTRACT
Here we define an information improvement generating function whose derivative at point 1 gives Theil's measure of information improvement, which has wide applications in Economics. It contains Guiasu and Reischer's relative information generating function and Golomb's information generating function as particular cases. Simple expressions for important discrete distributions have been obtained. It has also been shown that the information improvement generating function suggests a new information indicator as the standard deviation of the variation of information.
1. INTRODUCTION
The successive derivatives of the moment generating function at point 0 give the successive moments of a probability distribution, if these moments exist. In the same way, the derivative of the information generating function of a probability distribution, calculated at point 1, gives moments of a "self-information" measure for the probability distribution, as discussed in section 2. For example,
Copyright © 1990 by Marcel Dekker, Inc.
the first derivative of Golomb's (1966) information generating function at point 1 gives Shannon's entropy (in fact, negative entropy) of the corresponding probability distribution. The formulation works equally well for both discrete and continuous distributions, and Golomb derived simple expressions of the information generating function for the Uniform, Geometric, Zeta, Exponential, Pareto and Normal distributions.
Later on, Guiasu and Reischer (1985) introduced the relative information generating function whose derivative gives well-known statistical indices such as the Kullback-Leibler divergence between two probability distributions and Watanabe's measure of interdependence. It contains Golomb's information generating function as a particular case and includes both the binomial and the Poisson distributions, which were not covered in Golomb's work.
Recently, Hooda and Singh (1988) defined a quantitative and qualitative information generating function whose derivative at point 1 gives the useful information measure introduced by Belis and Guiasu (1968). It contains Golomb's information generating function as a particular case. These authors obtained expressions for various discrete and continuous distributions.
In the present paper we define an information improvement generating function whose derivative at point 1 gives Theil's (1967) measure of information improvement, which has a wide application in Economics. It contains Guiasu and Reischer's relative information generating function and Golomb's information generating function as particular cases. Simple expressions for certain discrete distributions, viz., the geometric, binomial and Poisson distributions, have been obtained. The expression has also been derived considering three power (the α-, β- and γ-power) distributions. It has also been shown that this information improvement generating function suggests a new information indicator as the standard deviation of the variation of information. Its application has been given for two standard discrete distributions.
2. THE INFORMATION IMPROVEMENT GENERATING FUNCTION

Let (X, m) be a measure space. The initial measure m is the Lebesgue measure in the continuous case and the one that assigns unit mass to each point in the discrete case. Let v be a reference measure which is revised to the measure w on the basis of a probability measure p defined on the sample space X, such that w << v << p << m, where "<<" means "absolutely continuous with respect to". For more details refer to Halmos (1962). If m is a totally σ-finite measure, then we can define the Radon-Nikodym derivatives as follows:

    f = dp/dm ,   g = dv/dm ,   h = dw/dm ,

which are the densities corresponding to the three measures p, v and w. Suppose h is strictly positive m-almost everywhere. The information improvement generating function of f (or p), given the reference measure g (or v) which was revised by h (or w), is defined as

    R(f:g:h; t) = \int_X f (g/h)^{t-1} \, dm ,    (2.1)

provided that the integrals are convergent.
Obviously R(f:g:h; 1) = 1, and the r-th derivative of (2.1) is

    R^{(r)}(f:g:h; t) = \int_X f (g/h)^{t-1} [\log (g/h)]^r \, dm ,    (2.2)

provided that the integral converges.

In particular,

    R'(f:g:h; 1) = \int_X f \log (g/h) \, dm ,    (2.3)

which is just Theil's (1967) measure of information improvement of v from w. We denote (2.3) by I(p; v; w), which is a measure of inaccuracy when the reference measure w is replaced
by v. If v and w are finite measures, we have, by Jensen's inequality,

    I(p; v; w) \le \log \int_X f (g/h) \, dm = \log R(f:g:h; 2) ,    (2.4)
with equality if and only if v = [v(X)/w(X)] w.
If (2.1) is convergent for t \ge 1, then for 0 \le \theta \le 1, Hölder's inequality gives

    R(f:g:h; \theta t + 1 - \theta) \le [R(f:g:h; t)]^{\theta} [R(f:g:h; 1)]^{1-\theta} = [R(f:g:h; t)]^{\theta} ,

so that \log R(f:g:h; t) is a convex function of t for t \ge 1.
If p and v are probability measures and p ≡ v ≡ m, where "≡" means "equivalent to", then f and g are finite and strictly positive almost everywhere, and from (2.3) we get

    R'(f:g:h; 1) + R'(g:f:h; 1) = \int_X \{ f \log (g/h) + g \log (f/h) \} \, dm ,

which is the amount of J-divergence in information improvement between the probability measures p and v with regard to the reference measure w.

If we put w = m, i.e. h(x) = 1 m-almost everywhere, in (2.1), we obtain

    R(f:g:1; t) = \int_X f g^{t-1} \, dm ,    (2.5)

which is uniformly convergent for any t \ge 1, and in particular

    R'(f:g:1; 1) = \int_X f \log g \, dm ,    (2.6)

the negative of which is Kerridge's (1967) measure of inaccuracy between the probability measure p and the reference measure v.
It is well known that the derivatives of the moment generating function evaluated at zero yield the various moments of the distribution.
The role played by zero in determining moments is played by t = 1 for the information generating functions.

Taking t = 2 in (2.5), we obtain

    R(f:g:1; 2) = \int_X f g \, dm ,

which is another statistical index associated with the probability distributions f and g.
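As a purely illustrative aside (not part of the paper), the two basic properties of the generating function in the discrete case — R(f:g:h; 1) = 1 and R'(f:g:h; 1) = I(p; v; w) — can be checked numerically. The three-point distributions below are arbitrary choices, and the derivative at t = 1 is approximated by a central difference:

```python
import math

def R(f, g, h, t):
    # Discrete information improvement generating function:
    # R(f:g:h; t) = sum_n f(n) * (g(n)/h(n))**(t - 1)
    return sum(fn * (gn / hn) ** (t - 1) for fn, gn, hn in zip(f, g, h))

def theil_improvement(f, g, h):
    # Theil's information improvement I(p; v; w) = sum_n f(n) * log(g(n)/h(n))
    return sum(fn * math.log(gn / hn) for fn, gn, hn in zip(f, g, h))

# hypothetical three-point distributions (f a probability, g and h references)
f = [0.5, 0.3, 0.2]
g = [0.4, 0.4, 0.2]
h = [0.3, 0.3, 0.4]

assert abs(R(f, g, h, 1.0) - 1.0) < 1e-12      # R(f:g:h; 1) = 1

eps = 1e-6                                      # central difference approximates R'(1)
deriv = (R(f, g, h, 1 + eps) - R(f, g, h, 1 - eps)) / (2 * eps)
assert abs(deriv - theil_improvement(f, g, h)) < 1e-6
```

The same finite-difference check applies to any of the closed-form cases below.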
3. THE DISCRETE CASE
Let X be a countable space and m(n) = 1 for every n ∈ X. Then

    R(f:g:h; t) = \sum_{n \in X} f(n) [g(n)/h(n)]^{t-1}    (3.1)

and

    R'(f:g:h; 1) = \sum_{n \in X} f(n) \log [g(n)/h(n)] .    (3.2)

In case f(n) = g(n) for all n ∈ X, (3.2) becomes Kullback's (1968) measure of relative information (directed divergence or amount of information) associated with the two probability distributions g and h. When h(n) = 1 for every n ∈ X, then (3.2) becomes the negative of Kerridge's (1967) measure of inaccuracy associated with the two probability distributions f and g.
As particular examples, we give the information improvement generating functions, and the corresponding information measures derived from them, for the geometric, binomial, Poisson and α-power probability distributions.
(i) Let the three geometric probability distributions f, g and h be respectively given by {m, mP, mP², ...}, P + m = 1, {q, qp, qp², ...}, p + q = 1, and {v, vu, vu², ...}, u + v = 1. Then we have

    R(f:g:h; t) = m (q/v)^{t-1} / [1 - P (p/u)^{t-1}] ,    (3.3)

provided P (p/u)^{t-1} < 1,
and

    R'(f:g:h; 1) = (P/m) \log (p/u) + \log [(1-p)/(1-u)] .    (3.4)
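The geometric closed form can be confirmed against direct summation of the series; the sketch below (not from the paper, with arbitrarily chosen parameters satisfying the convergence condition) assumes R(f:g:h; t) = m (q/v)^{t-1} / [1 - P (p/u)^{t-1}] and checks the derivative formula (3.4) by central difference:

```python
import math

# Geometric f, g, h with hypothetical parameters:
# f(n) = m*P**n (P + m = 1), g(n) = q*p**n (p + q = 1), h(n) = v*u**n (u + v = 1)
m, P = 0.4, 0.6
q, p = 0.7, 0.3
v, u = 0.5, 0.5

def R_series(t, terms=400):
    # direct summation of R(f:g:h; t); converges while P*(p/u)**(t-1) < 1
    return sum(m * P**n * ((q * p**n) / (v * u**n)) ** (t - 1) for n in range(terms))

def R_closed(t):
    # closed form (3.3): m * (q/v)**(t-1) / (1 - P*(p/u)**(t-1))
    return m * (q / v) ** (t - 1) / (1 - P * (p / u) ** (t - 1))

t = 1.5
assert abs(R_series(t) - R_closed(t)) < 1e-10

# derivative at t = 1 against (3.4): (P/m)*log(p/u) + log((1-p)/(1-u))
eps = 1e-6
deriv = (R_closed(1 + eps) - R_closed(1 - eps)) / (2 * eps)
assert abs(deriv - ((P / m) * math.log(p / u) + math.log((1 - p) / (1 - u)))) < 1e-6
```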
(ii) Let us consider α-power, β-power and γ-power probability distributions representing f, g and h respectively, of the types given below:

    f(n) = n^{-α} / η(α) ,   g(n) = n^{-β} / η(β) ,   h(n) = n^{-γ} / η(γ) ,   n = 1, 2, 3, ... ,    (3.5)

where η(·) is the normalizing (zeta) function. Then

    R'(f:g:h; 1) = \log [η(γ)/η(β)] + (β - γ) η'(α)/η(α) .    (3.6)
(iii) If X = {0, 1, 2, ..., N}, let p and v be the binomial probability distributions respectively given by

    f(n) = \binom{N}{n} p^n q^{N-n} ,   g(n) = \binom{N}{n} u^n v^{N-n} ,   p + q = u + v = 1 ,

and let w be the measure given by

    h(n) = \binom{N}{n} ,   n = 0, 1, ..., N .

Then

    R(f:g:h; t) = (p u^{t-1} + q v^{t-1})^N    (3.7)

and

    R'(f:g:h; 1) = N [p \log u + (1-p) \log (1-u)] .    (3.8)
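The binomial case follows directly from the binomial theorem, and is easy to verify numerically; the parameters below are arbitrary (this sketch is illustrative, not from the paper):

```python
import math
from math import comb

# binomial f and g on {0, ..., N} with hypothetical parameters, h(n) = C(N, n)
N = 6
p, u = 0.3, 0.45
q, v = 1 - p, 1 - u

def R_binom(t):
    # direct sum: f(n) = C(N,n) p^n q^(N-n), g(n) = C(N,n) u^n v^(N-n), h(n) = C(N,n)
    return sum(comb(N, n) * p**n * q**(N - n) * (u**n * v**(N - n)) ** (t - 1)
               for n in range(N + 1))

t = 1.7
assert abs(R_binom(t) - (p * u**(t - 1) + q * v**(t - 1)) ** N) < 1e-12   # (3.7)

eps = 1e-6
deriv = (R_binom(1 + eps) - R_binom(1 - eps)) / (2 * eps)
assert abs(deriv - N * (p * math.log(u) + q * math.log(v))) < 1e-6        # (3.8)
```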
In particular, when u = p,

    R'(f:f:h; 1) = N [p \log p + (1-p) \log (1-p)] .
(iv) If X = {0, 1, 2, ...}, let p and v be the Poisson probability distributions

    f(n) = e^{-λ} λ^n / n! ,   g(n) = e^{-γ} γ^n / n! ,   n = 0, 1, 2, ... ,

and let w be the factorial measure

    h(n) = 1/n! ,   n = 0, 1, 2, ... .

Then

    R(f:g:h; t) = \exp [λ γ^{t-1} - λ - γ(t-1)]    (3.9)

and

    R'(f:g:h; 1) = λ \log γ - γ .    (3.10)

In case λ = γ, i.e. f(n) = g(n) for all n ∈ X, then

    R'(f:g:h; 1) = λ \log λ - λ = λ (\log λ - 1) ,

which is a result due to Guiasu and Reischer (1985).
In particular
1 I ( p ;v ; [;-UT ] rn)=Iog m (x) + ~ l ( f : g : h ; i ) = i + ~ l o g y - y
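The Poisson closed form reduces to the exponential series, and the formulas (3.9) and (3.10) can be checked numerically; the parameters below are arbitrary (an illustration, not from the paper):

```python
import math

lam, gam = 1.3, 0.8   # hypothetical Poisson parameters

def R_poisson(t, terms=60):
    # f(n) = e^-lam lam^n/n!, g(n) = e^-gam gam^n/n!, h(n) = 1/n!
    return sum(math.exp(-lam) * lam**n / math.factorial(n)
               * (math.exp(-gam) * gam**n) ** (t - 1) for n in range(terms))

t = 1.6
closed = math.exp(lam * gam**(t - 1) - lam - gam * (t - 1))   # (3.9)
assert abs(R_poisson(t) - closed) < 1e-12

eps = 1e-6
deriv = (R_poisson(1 + eps) - R_poisson(1 - eps)) / (2 * eps)
assert abs(deriv - (lam * math.log(gam) - gam)) < 1e-6        # (3.10)
```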
Remarks: The continuous case is analogous to the discrete case. Thus we can define the information improvement generating function for continuous distributions on the same lines. It may be worth mentioning that the information generating function simplifies the computation of information measures. It also provides a concise expression from which all the moments of the self-information measure can be obtained simply on differentiation.
Let p and v be the probability measure and the reference measure on the product space X^s respectively, and let f_i and g_i,
i = 1, 2, ..., s, be the marginal probability distributions on X of the s-dimensional joint probability distributions f and g defining the probability measure p and the reference measure v on X^s respectively. If the revised reference measure w on X^s is just the direct product probability measure of these marginal probability distributions g_i, viz.,

    h(n_1, n_2, ..., n_s) = g_1(n_1) g_2(n_2) \cdots g_s(n_s) ,

then, supposing g_i(n_i) > 0 for all i = 1, 2, ..., s and all n_i ∈ X, the first derivative of the corresponding information improvement generating function at t = 1 is

    R'(f:g:h; 1) = \sum_{(n_1, ..., n_s) \in X^s} f(n_1, ..., n_s) \log g(n_1, ..., n_s) - \sum_{i=1}^{s} \sum_{n_i \in X} f_i(n_i) \log g_i(n_i) .    (3.12)

We may call (3.12) an inaccuracy measure of interdependence, which is equivalent to the form of Kerridge's inaccuracy rate for s = 2.
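The two-term form of this measure is an algebraic identity once the revised reference is the product of the marginals; the short sketch below (illustrative only, with hypothetical 2×2 joint distributions) verifies it for s = 2:

```python
import math

# hypothetical 2x2 joint distributions on X^2, X = {0, 1}:
# f is the joint probability, g the joint reference
f = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}
g = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.3, (1, 1): 0.2}

def marginal(d, axis):
    out = {}
    for key, prob in d.items():
        out[key[axis]] = out.get(key[axis], 0.0) + prob
    return out

f1, f2 = marginal(f, 0), marginal(f, 1)
g1, g2 = marginal(g, 0), marginal(g, 1)

# left side: R'(f:g:h; 1) with h the product measure g1 x g2
lhs = sum(f[k] * math.log(g[k] / (g1[k[0]] * g2[k[1]])) for k in f)
# right side: the two-term expression (3.12)
rhs = (sum(f[k] * math.log(g[k]) for k in f)
       - sum(f1[n] * math.log(g1[n]) for n in f1)
       - sum(f2[n] * math.log(g2[n]) for n in f2))
assert abs(lhs - rhs) < 1e-12
```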
4. THE STANDARD DEVIATION OF THE VARIATION OF INFORMATION
The information improvement generating function suggests new information indices. One of them is the standard deviation of the divergence of the reference measure v from the reference measure w revised on the basis of the probability measure p:

    σ = \left[ \int_X f (\log (g/h))^2 \, dm - \left( \int_X f \log (g/h) \, dm \right)^2 \right]^{1/2} .    (4.1)
The indicator (4.1) measures to what extent the Kullback-Leibler number reflects the divergence of v from w.

Applying Chebyshev's inequality, we can find upper (or lower) bounds as given below:

    p\{ x : |\log (g(x)/h(x)) - I(p; v; w)| \ge kσ \} \le 1/k^2 .    (4.2)
Examples. (1) Let p, v and w be three Bernoulli distributions with the parameters p, q and r respectively. Then the Kullback-Leibler number is given by

    I(p; v; w) = p \log (q/r) + (1-p) \log [(1-q)/(1-r)]    (4.3)

and the standard deviation of the directed divergence of v and w is

    σ = [p(1-p)]^{1/2} \left| \log \frac{q(1-r)}{r(1-q)} \right| .    (4.4)
(2) If p and v are the Poisson distributions with parameters λ and γ respectively and w is the factorial measure, then from (3.9) and (4.1) we have

    I(p; v; w) = λ \log γ - γ    (4.5)

and

    σ = λ^{1/2} |\log γ| .    (4.6)

In case λ = γ, (4.5) and (4.6) become the results due to Guiasu and Reischer (1985).
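Both examples can be checked against a direct computation of (4.1); the sketch below (illustrative only, with arbitrary parameter values) assumes the Bernoulli closed form σ = [p(1-p)]^{1/2} |log(q(1-r)/(r(1-q)))| and the Poisson closed form σ = λ^{1/2} |log γ|:

```python
import math

def sd_variation(f, g, h):
    # (4.1) in the discrete case: sqrt(E_f[(log g/h)^2] - (E_f[log g/h])^2)
    xs = [math.log(gn / hn) for gn, hn in zip(g, h)]
    mean = sum(fn * x for fn, x in zip(f, xs))
    second = sum(fn * x * x for fn, x in zip(f, xs))
    return math.sqrt(second - mean * mean)

# Bernoulli distributions with hypothetical parameters p, q, r
p, q, r = 0.3, 0.6, 0.45
sigma = sd_variation([1 - p, p], [1 - q, q], [1 - r, r])
closed = math.sqrt(p * (1 - p)) * abs(math.log(q * (1 - r) / (r * (1 - q))))   # (4.4)
assert abs(sigma - closed) < 1e-12

# Poisson(lam) vs Poisson(gam) against the factorial measure: sigma = sqrt(lam)*|log gam|
lam, gam = 1.2, 0.7
n_terms = 80
fp = [math.exp(-lam) * lam**n / math.factorial(n) for n in range(n_terms)]
gp = [math.exp(-gam) * gam**n / math.factorial(n) for n in range(n_terms)]
hp = [1.0 / math.factorial(n) for n in range(n_terms)]
assert abs(sd_variation(fp, gp, hp) - math.sqrt(lam) * abs(math.log(gam))) < 1e-9  # (4.6)
```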
The authors are indebted to the referee for his comments and valuable suggestions.
BIBLIOGRAPHY
Belis, M. and Guiasu, S. (1968). A quantitative-qualitative measure of information in cybernetic systems, IEEE Trans. Inform. Theory, 14, 593-94.

Golomb, S.W. (1966). The information generating function of a probability distribution, IEEE Trans. Inform. Theory, 12, 75-77.

Guiasu, S. and Reischer, C. (1985). The relative information generating function, Information Sciences, 35, 235-41.

Hooda, D.S. and Umed Singh (1988). A quantitative and qualitative information generating function. Accepted for publication.

Halmos, P.R. (1962). Measure Theory, East-West Press Pvt. Ltd., New Delhi.
Kerridge, D.F. (1967). Inaccuracy and inference, J. Royal Stat. Society Series B, 23, 184-94.

Kullback, S. (1968). Information Theory and Statistics, Dover Pub. Inc., New York.

Theil, H. (1967). Economics and Information Theory, North Holland Pub. Co., Amsterdam.
Received July 1989; Revised February 1990.

Recommended by R. S. Chhikara, University of Houston-Clear Lake, Houston, TX.

Refereed by Nabendu Pal, University of Southwestern Louisiana, Lafayette, LA, and Patrick L. Odell, Baylor University, Waco, TX.