Entropy of fuzzy events

F. Criado a,*, T. Gachechiladze b

a Department of Statistics, Málaga University, 29071 Málaga, Spain
b Department of Applied Mathematics and Computer Science, Tbilisi State University, Tbilisi, Georgia

Fuzzy Sets and Systems 88 (1997) 99-106. Received January 1995; revised February 1996.

Abstract

In order to elaborate a theory of communication useful for designing a great variety of real communication systems, it is necessary to consider the superposition of two kinds of uncertainty: probabilistic and possibilistic. Zadeh's general formula for the entropy of fuzzy random events, for example, takes both of these basic uncertainties into account. In this paper we consider Zadeh's entropy, a weighted entropy of De Luca and Termini that is structurally related to it, and the genetic connection of these entropies with Shannon's entropy function. © 1997 Elsevier Science B.V.

Keywords: Measure of information; Membership functions

* Corresponding author.
1. Preliminary remarks

We consider some variants of fuzzy subset entropies (only logarithmic measures) and the linking relations between these quantities, which indicate a similarity of many important characteristics of fuzzy subset entropies and Shannon's entropy. We begin with a brief account of a certain representation of fuzzy subsets which is convenient for our aims.

1.1. Set splitting [1]

Let $X$ be a finite set and $A$ any subset, $A \subseteq X$. Represent the indicator of this subset, $I_A$, as a pair $(I_{\hat A}, I_{\hat A^D})$, where for convenience

  $I_{\hat A} = p\,I_A, \qquad I_{\hat A^D} = (1-p)\,I_A, \qquad p: X \to [0,1],$   (1)

i.e. $I_A = I_{\hat A} + I_{\hat A^D}$. $A$ is the support of the mapping $I_{\hat A}$. According to Zadeh [11] the splitting component $I_{\hat A}$ is a fuzzy subset. We call $I_{\hat A^D}$ the dual subset with respect to $I_{\hat A}$.

Notice that the component $I_{\hat A}$ can be split again, $I_{\hat A} = (I_{\hat{\hat A}}, I_{\hat{\hat A}^D})$, where

  $I_{\hat{\hat A}} = \chi\,I_{\hat A}, \qquad I_{\hat{\hat A}^D} = (1-\chi)\,I_{\hat A}, \qquad \chi: X \to [0,1].$   (2)

Two sequential splittings induce a splitting of the initial subset:

  $I_A = (\chi p\,I_A,\ (1-\chi p)\,I_A).$

Let $A, B \subseteq X$ and let $(I_{\hat A}, I_{\hat A^D})$ and $(I_{\hat B}, I_{\hat B^D})$ be the results of their splitting. How can $A \cap B$ be split when $I_{\hat A}$ and $I_{\hat B}$ are given? We demand that this splitting satisfy the natural requirement

  $I_{\widehat{A\cap B}}(x) \le I_{\hat A}(x),\ I_{\hat B}(x), \qquad x \in X.$
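To make the splitting operation concrete, the following minimal Python sketch (illustrative only, not part of the original paper; the array values and the helper name split_indicator are assumptions) splits a crisp indicator by a membership function $p$ as in (1) and checks that two sequential splittings induce the splitting $(\chi p\,I_A,\ (1-\chi p)\,I_A)$ of (2).

import numpy as np

# Universe X of 6 points; A = {x0, x1, x2, x3}
I_A = np.array([1, 1, 1, 1, 0, 0], dtype=float)

# Membership functions p, chi : X -> [0, 1]
p   = np.array([0.9, 0.6, 0.3, 1.0, 0.7, 0.2])
chi = np.array([0.5, 1.0, 0.8, 0.4, 0.1, 0.9])

def split_indicator(I, m):
    """Split an indicator I into a fuzzy part m*I and its dual (1-m)*I."""
    return m * I, (1.0 - m) * I

I_Ahat, I_AhatD = split_indicator(I_A, p)            # eq. (1)
assert np.allclose(I_Ahat + I_AhatD, I_A)            # I_A = I_Ahat + I_Ahat^D

# Splitting the fuzzy component again with chi, eq. (2)
I_Ahathat, I_AhathatD = split_indicator(I_Ahat, chi)
assert np.allclose(I_Ahathat, chi * p * I_A)                  # fuzzy part of the double splitting
assert np.allclose(I_Ahathat + I_AhathatD + I_AhatD, I_A)     # everything still sums to I_A
assert np.allclose(I_A - I_Ahathat, (1.0 - chi * p) * I_A)    # induced splitting of I_A
print("set splitting checks passed")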
We have

  $I_{A\cap B}(x) = I_A(x)I_B(x) = (I_{\hat A}(x) + I_{\hat A^D}(x))(I_{\hat B}(x) + I_{\hat B^D}(x))$
  $\qquad = I_{\hat A}(x)I_{\hat B}(x) + I_{\hat A}(x)I_{\hat B^D}(x) + I_{\hat A^D}(x)I_{\hat B}(x) + I_{\hat A^D}(x)I_{\hat B^D}(x).$

Taking the above into account, we have two direct possibilities of grouping the terms:

(1) $I_{A\cap B}(x) = I_{\hat A}(x)I_{\hat B}(x) + [I_{\hat A}(x)I_{\hat B^D}(x) + I_{\hat A^D}(x)I_{\hat B}(x) + I_{\hat A^D}(x)I_{\hat B^D}(x)]$
  $\qquad = I_{\hat A}(x)I_{\hat B}(x) + [I_A(x)I_B(x) - I_{\hat A}(x)I_{\hat B}(x)], \qquad x \in X,$

i.e. the grouping corresponds to the splitting of $A \cap B$ with

  $I_{\widehat{A\cap B}}(x) = I_{\hat A}(x)\cdot I_{\hat B}(x), \qquad x \in X.$   (3)

This splitting has the same character as (2). Indeed, consider $I_{A\cap B}$ and first split $I_A$:

  $I_{A\cap B}(x) = (I_{\hat A}(x) + I_{\hat A^D}(x))I_B(x), \qquad x \in X.$

Now split $I_B$:

  $I_{\hat A}(x)I_B(x) = I_{\hat A}(x)I_{\hat B}(x) + [I_{\hat A}(x)I_B(x) - I_{\hat A}(x)I_{\hat B}(x)].$

We come to (3). Evidently the order of the splittings is not essential. For this reason, in [4], (2) and (3) are called "sequential splitting".

(2) $I_{A\cap B}(x) = \begin{cases} I_{\hat A}(x)(I_{\hat B}(x) + I_{\hat B^D}(x)) + I_{\hat A^D}(x)(I_{\hat B}(x) + I_{\hat B^D}(x)), & \text{if } I_{\hat A}(x) \le I_{\hat B}(x),\\ (I_{\hat A}(x) + I_{\hat A^D}(x))I_{\hat B}(x) + (I_{\hat A}(x) + I_{\hat A^D}(x))I_{\hat B^D}(x), & \text{if } I_{\hat B}(x) \le I_{\hat A}(x), \end{cases}$
  $\qquad = I_{\hat A}(x) \wedge I_{\hat B}(x) + [I_A(x)I_B(x) - I_{\hat A}(x) \wedge I_{\hat B}(x)], \qquad x \in X,$

i.e. this grouping corresponds to the following splitting of the intersection:

  $I_{\widehat{A\cap B}}(x) = I_{\hat A}(x) \wedge I_{\hat B}(x) \qquad (\wedge = \min), \qquad x \in X.$   (4)

Such a splitting is called "simultaneous" in [4].

In the case of the union $A \cup B$, fulfilment of the natural requirement

  $I_{\widehat{A\cup B}}(x) \ge I_{\hat A}(x),\ I_{\hat B}(x)$

is demanded and, operating as in the case of the intersection, one easily obtains

  $I_{\widehat{A\cup B}}(x) = I_{\hat A}(x) + I_{\hat B}(x) - I_{\hat A}(x)I_{\hat B}(x), \qquad x \in X$ (sequential splitting)   (5)

and

  $I_{\widehat{A\cup B}}(x) = I_{\hat A}(x) \vee I_{\hat B}(x) \qquad (\vee = \max), \qquad x \in X$ (simultaneous splitting).   (6)

Now consider the cartesian product. Let, for two universal sets $X$ and $Y$, $A \subseteq X$ and $B \subseteq Y$. If their splittings are given, then according to

  $I_{A\times B}(x, y) = I_{(A\times Y)\cap(X\times B)}(x, y)$

we can write

  $I_{\widehat{A\times B}}(x, y) = I_{\widehat{A\times Y}}(x, y)\cdot I_{\widehat{X\times B}}(x, y) = I_{\hat A}(x)\cdot I_{\hat B}(y), \qquad (x, y) \in X \times Y$ (sequential splitting)   (7)

and

  $I_{\widehat{A\times B}}(x, y) = I_{\widehat{A\times Y}}(x, y) \wedge I_{\widehat{X\times B}}(x, y) = I_{\hat A}(x) \wedge I_{\hat B}(y), \qquad (x, y) \in X \times Y$ (simultaneous splitting).   (8)

Finally, we list some simple relations connected with the dual subset which can be proved directly from the definitions:
(1) Involution: $I_{(\hat A^D)^D} = I_{\hat A}$.
(2) Connection with Zadeh's complement: $I_{\widehat{A}^{\,c}} = I_{A^c \cup \hat A^D}$, where $A^c$ is the usual complement of $A$ in the universal set.
(3) Duality laws for the intersection and the union (true for both kinds of splitting):

  $I_{(\widehat{A\cap B})^D} = I_{(\hat A^D \cup \hat B^D)\cap A\cap B},$   (9)

  $I_{(\widehat{A\cup B})^D} = I_{(\hat A^D \cap \hat B^D)\cup(A^c \cap \hat B^D)\cup(\hat A^D \cap B^c)},$   (10)

which turn into De Morgan's laws for fuzzy subsets when $I_{\hat A^D} = 1 - I_{\hat A}$ and $I_{\hat B^D} = 1 - I_{\hat B}$, i.e. when the supports of both mappings $I_{\hat A}$ and $I_{\hat B}$ coincide with the universal set.
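The two groupings can be checked numerically. The following Python sketch (illustrative, with arbitrarily chosen data) computes the sequential and simultaneous splittings (3)-(6) of the intersection and the union and verifies the natural requirements together with the decomposition of $I_{A\cap B}$ into a fuzzy part and its dual.

import numpy as np

I_A = np.array([1, 1, 1, 0, 1], dtype=float)
I_B = np.array([1, 1, 0, 1, 1], dtype=float)
mu_A = np.array([0.8, 0.4, 0.9, 0.3, 0.6])   # splitting function for A
mu_B = np.array([0.5, 0.7, 0.2, 0.9, 0.6])   # splitting function for B

I_Ah, I_AhD = mu_A * I_A, (1 - mu_A) * I_A
I_Bh, I_BhD = mu_B * I_B, (1 - mu_B) * I_B

# Sequential splitting of the intersection, eq. (3), and of the union, eq. (5)
I_cap_seq = I_Ah * I_Bh
I_cup_seq = I_Ah + I_Bh - I_Ah * I_Bh
# Simultaneous splitting, eqs. (4) and (6)
I_cap_sim = np.minimum(I_Ah, I_Bh)
I_cup_sim = np.maximum(I_Ah, I_Bh)

# Natural requirements
assert np.all(I_cap_seq <= np.minimum(I_Ah, I_Bh) + 1e-12)
assert np.all(I_cup_seq >= np.maximum(I_Ah, I_Bh) - 1e-12)

# Dual component of the sequential intersection splitting (grouping (1) in the text)
I_capD_seq = I_Ah * I_BhD + I_AhD * I_Bh + I_AhD * I_BhD
assert np.allclose(I_cap_seq + I_capD_seq, I_A * I_B)   # splits I_{A∩B}
print("intersection/union splitting checks passed")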
1.2. Splitting of Shannon's entropy

Let us perform a splitting of the set $X$ point by point. In this case Shannon's entropy of the probability distribution $\{p_i\}_1^n$, $H(\{p_i\}_1^n)$, turns into $H(\{\mu_ip_i\}_1^n, \{\mu_i^Dp_i\}_1^n)$, where $\mu_i$ is the membership function of the fuzzy point $\hat x_i$, $i = 1,\dots,n$, and $\mu_i^D = 1 - \mu_i$ (here and in what follows $\log(\cdot)$ stands for $\log_2(\cdot)$). If we avail ourselves of the branching property of the function $H(\cdot)$ [3], then

  $H(\{\mu_ip_i\}_1^n, \{\mu_i^Dp_i\}_1^n) = H(\mu_1p_1, \dots, \mu_np_n, \mu_1^Dp_1, \dots, \mu_n^Dp_n)$
  $\qquad = H(P(\hat X), P(\hat X^D)) + P(\hat X)\,H\big(\{\mu_ip_i/P(\hat X)\}_1^n\big) + P(\hat X^D)\,H\big(\{\mu_i^Dp_i/P(\hat X^D)\}_1^n\big),$   (11)

where

  $P(\hat X) = \sum_{i=1}^n \mu_ip_i, \qquad P(\hat X^D) = \sum_{i=1}^n \mu_i^Dp_i.$

We see that (11) is nothing else but Hirota's measure of uncertainty [6].

Introduce the quantities

  $Z(\hat X/\{p_i\}_1^n) = -\sum_{i=1}^n \mu_ip_i \log p_i,$   (12)

  $Z(\hat X^D/\{p_i\}_1^n) = -\sum_{i=1}^n (1-\mu_i)p_i \log p_i.$   (12')

$Z(\hat X/\{p_i\}_1^n)$ is Zadeh's entropy [12], i.e. the entropy of the fuzzy set $\hat X$ with respect to the probability distribution $\{p_i\}_1^n$; clearly

  $Z(\hat X/\{p_i\}_1^n) + Z(\hat X^D/\{p_i\}_1^n) = H(\{p_i\}_1^n).$   (13)

Further, let

  $L(\hat X/\{p_i\}_1^n) = -\sum_{i=1}^n p_i\mu_i \log \mu_i,$   (14)

  $L(\hat X^D/\{p_i\}_1^n) = -\sum_{i=1}^n (1-\mu_i)p_i \log(1-\mu_i).$   (14')

The function $L(\hat X/\{p_i\}_1^n)$ is actually a Kullback directed divergence $I(\{\mu_ip_i\}_1^n : \{p_i\}_1^n)$ [10], and

  $L(\hat X/\{p_i\}_1^n) + L(\hat X^D/\{p_i\}_1^n) = -\sum_{i=1}^n p_i\big(\mu_i\log\mu_i + (1-\mu_i)\log(1-\mu_i)\big)$   (15)

is a weighted entropy of De Luca and Termini [2].

Notice that if we avail ourselves of the branching property in another way, we can rewrite (11) in the following form:

  $H(\{\mu_ip_i\}_1^n, \{\mu_i^Dp_i\}_1^n) = H(\mu_1p_1, \mu_1^Dp_1; \dots; \mu_np_n, \mu_n^Dp_n)$
  $\qquad = Z(\hat X/\{p_i\}_1^n) + Z(\hat X^D/\{p_i\}_1^n) + L(\hat X/\{p_i\}_1^n) + L(\hat X^D/\{p_i\}_1^n).$   (16)

In addition, the functions $Z(\hat X/\{p_i\}_1^n)$ and $L(\hat X/\{p_i\}_1^n)$ are connected by Jumarie's entropy [7]:

  $E_J(\{\mu_i\}_1^n, \{p_i\}_1^n) = H(\{p_i\}_1^n) + L(\hat X/\{p_i\}_1^n)$   (17)
  $\qquad = Z(\hat X/\{p_i\}_1^n) + Z(\hat X^D/\{p_i\}_1^n) + L(\hat X/\{p_i\}_1^n).$   (18)
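As a consistency check of the decompositions (11)-(16), the following Python sketch (an illustration under the notation above, not the authors' code) evaluates the entropy of the split distribution directly and through the branching formulas, and verifies (13) and (15).

import numpy as np

def H(q):
    """Shannon entropy (base 2) of a (sub)distribution, ignoring zero terms."""
    q = np.asarray(q, dtype=float)
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

p  = np.array([0.1, 0.2, 0.3, 0.4])      # probability distribution
mu = np.array([0.6, 0.9, 0.3, 0.7])      # membership values of the fuzzy points
muD = 1.0 - mu

Z   = -np.sum(mu  * p * np.log2(p))      # Zadeh's entropy, eq. (12)
ZD  = -np.sum(muD * p * np.log2(p))      # eq. (12')
L   = -np.sum(p * mu  * np.log2(mu))     # eq. (14)
LD  = -np.sum(p * muD * np.log2(muD))    # eq. (14')

# Shannon entropy of the split distribution {mu_i p_i} U {mu_i^D p_i}
split_entropy = H(np.concatenate([mu * p, muD * p]))

P, PD = np.sum(mu * p), np.sum(muD * p)
branch = H([P, PD]) + P * H(mu * p / P) + PD * H(muD * p / PD)   # eq. (11)

assert np.isclose(split_entropy, branch)
assert np.isclose(split_entropy, Z + ZD + L + LD)   # eq. (16)
assert np.isclose(Z + ZD, H(p))                     # eq. (13)
assert np.isclose(L + LD,
                  -np.sum(p * (mu * np.log2(mu) + muD * np.log2(muD))))   # eq. (15)
print("entropy splitting checks passed")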
1.3. Zadeh's and De Luca and Termini's $\beta$-entropies

Formulas (12) and (14) represent particular cases of Zadeh's and of De Luca and Termini's $\beta$-entropies, respectively. The characteristic equations of $Z(\hat X/\{p_i\}_1^n)$ and $L(\hat X/\{p_i\}_1^n)$ are particular cases of a more general equation whose solutions are the $\beta$-entropies; they are connected with the usual $\beta$-entropy of a probability (sub)distribution in the same way as $(Z(\hat X/\{p_i\}_1^n) + Z(\hat X^D/\{p_i\}_1^n))$ and $(L(\hat X/\{p_i\}_1^n) + L(\hat X^D/\{p_i\}_1^n))$ are connected with Shannon's entropy of the split distribution (see [9]).

The $\beta$-entropy of the probability (sub)distribution $\{\mu_ip_i\}_1^n$, $\sum_{i=1}^n \mu_ip_i \le 1$, is defined as

  $H^\beta(\{\mu_ip_i\}_1^n) = -\sum_{i=1}^n (\mu_ip_i)^{\beta+1}\log(\mu_ip_i) = Z^\beta(\hat X/\{\mu_ip_i\}_1^n) + L^\beta(\hat X/\{\mu_ip_i\}_1^n),$   (19)

where

  $Z^\beta(\hat X/\{\mu_ip_i\}_1^n) = -\sum_{i=1}^n (\mu_ip_i)^{\beta+1}\log p_i$   (20)

is the $\beta$-entropy of Zadeh and

  $L^\beta(\hat X/\{\mu_ip_i\}_1^n) = -\sum_{i=1}^n (\mu_ip_i)^{\beta+1}\log\mu_i$   (21)

is the weighted $\beta$-entropy of De Luca and Termini. Consequently,

  $Z(\hat X/\{p_i\}_1^n) = Z^0(\hat X/\{\mu_ip_i\}_1^n)$   (22)

and

  $L(\hat X/\{p_i\}_1^n) = L^0(\hat X/\{\mu_ip_i\}_1^n).$   (23)

Definitions analogous to (19)-(23) can be given for the dual subsets.
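A small numerical sketch of the $\beta$-entropies (19)-(21) (helper names are illustrative), checking the decomposition (19) and the reductions (22)-(23) at $\beta = 0$.

import numpy as np

p  = np.array([0.1, 0.2, 0.3, 0.4])
mu = np.array([0.6, 0.9, 0.3, 0.7])

def H_beta(mp, beta):        # eq. (19): beta-entropy of the subdistribution {mu_i p_i}
    return -np.sum(mp**(beta + 1) * np.log2(mp))

def Z_beta(mu, p, beta):     # eq. (20)
    return -np.sum((mu * p)**(beta + 1) * np.log2(p))

def L_beta(mu, p, beta):     # eq. (21)
    return -np.sum((mu * p)**(beta + 1) * np.log2(mu))

for beta in (0.0, 0.5, 1.0, 2.0):
    assert np.isclose(H_beta(mu * p, beta),
                      Z_beta(mu, p, beta) + L_beta(mu, p, beta))   # eq. (19)

Z = -np.sum(mu * p * np.log2(p))      # eq. (12)
L = -np.sum(p * mu * np.log2(mu))     # eq. (14)
assert np.isclose(Z, Z_beta(mu, p, 0.0))   # eq. (22)
assert np.isclose(L, L_beta(mu, p, 0.0))   # eq. (23)
print("beta-entropy checks passed")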
2. Characterization of $\beta$-entropy

2.1. Characterization of $Z^\beta(\hat X/\{\mu_ip_i\}_1^n)$

Let $X$ be a set of events $\{x_1,\dots,x_n\}$ having probabilities $p_1,\dots,p_n$, and let $\mu: X \to [0,1]$ define some fuzzy subset of $X$. The realization of one event, in which the two kinds of uncertainty are accumulated, gives an information $I_Z^\beta$ depending on the two variables $p$ and $\mu$:

  $I_Z^\beta = I_Z^\beta(\mu_{\hat x}(x), p(x)).$   (24)

Similarly, for a dual point,

  $I_Z^\beta = I_Z^\beta(\mu_{\hat x^D}(x), p(x)).$   (24')

Let us try to find a function $I_Z^\beta$ that has the following properties:

(1) $I_Z^\beta(\mu, pq) = q^\beta I_Z^\beta(\mu, p) + p^\beta I_Z^\beta(\mu, q)$   (25)

and

(2) $I_Z^\beta(\lambda\mu, p) = \lambda^{\beta+1} I_Z^\beta(\mu, p)$   (26)

for any $0 \le p \le 1$, $0 < \mu \le 1$ and nonnegative number $\lambda$.

Theorem 2.1. Let $I_Z^\beta(\mu, p)$ be a function satisfying conditions (25) and (26). Then

  $I_Z^\beta(\mu, p) = k\mu(\mu p)^\beta\log p,$   (27)

where $k$ is an arbitrary constant.

Proof. Condition (26) means that

  $I_Z^\beta(\mu, p) = \mu^{\beta+1} I_Z^\beta(1, p).$   (28)

For $\mu = 1$, from (25),

  $I_Z^\beta(1, pq) = q^\beta I_Z^\beta(1, p) + p^\beta I_Z^\beta(1, q),$

or

  $(pq)^{-\beta} I_Z^\beta(1, pq) = p^{-\beta} I_Z^\beta(1, p) + q^{-\beta} I_Z^\beta(1, q).$   (29)

Assuming that

  $p^{-\beta} I_Z^\beta(1, p) = F(\log p),$

from (29) one obtains

  $F(x + y) = F(x) + F(y),$   (30)

where $x = \log p$, $y = \log q$. The unique continuous solution of this functional equation is

  $F(\log p) = k\log p,$

where $k$ is an arbitrary constant. Finally,

  $I_Z^\beta(1, p) = p^\beta F(\log p) = kp^\beta\log p,$

and, together with (28), formula (27) is proved. □

Let us return to the subset $\hat X$, regarded as the whole set of fuzzy points $(\hat x_1, \hat x_2, \dots, \hat x_n)$. We have established that the information gained from the realization of the fuzzy event $\hat x_i$ is equal to

  $I_Z^\beta(\mu_i, p_i) = k\mu_i(\mu_ip_i)^\beta\log p_i.$   (31)

The mathematical expectation of this quantity with respect to the probability distribution $\{p_i\}_1^n$ leads to (20).

It is worth dwelling on properties (25) and (26) when $\beta = 0$. In this case

(1') $I_Z^0(\mu, pq) = I_Z^0(\mu, p) + I_Z^0(\mu, q)$   (25')

and

(2') $I_Z^0(\mu, p) = \mu I_Z^0(1, p).$   (26')
We can consider $I_Z^0(1, p)$ as a measure of the probabilistic uncertainty contained in the event $x$. When the event $x$ is split, the probabilistic uncertainty becomes proportional to the membership function of the fuzzy event $\hat x$: $I_Z^0(1, p)$ is split as though it were a set (see [9]) or a probability measure, $p(x) = \mu p(x) + (1-\mu)p(x)$ (see [5]). Eq. (25') reflects the additivity of an unsplit probability measure for independent events, taking (26') into account.
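The functional properties (25)-(26) of the solution (27) and the expectation step leading from (31) to (20) can be verified numerically; the sketch below is illustrative only and fixes $k = -1$.

import numpy as np

k, beta = -1.0, 0.7

def I_Z(mu, p):                      # eq. (27): I_Z^beta(mu, p) = k mu (mu p)^beta log p
    return k * mu * (mu * p)**beta * np.log2(p)

rng = np.random.default_rng(0)
for _ in range(100):
    mu, lam = rng.uniform(0.05, 1.0, size=2)
    p, q = rng.uniform(0.05, 1.0, size=2)
    # property (25): I(mu, pq) = q^beta I(mu, p) + p^beta I(mu, q)
    assert np.isclose(I_Z(mu, p * q), q**beta * I_Z(mu, p) + p**beta * I_Z(mu, q))
    # property (26): I(lam*mu, p) = lam^(beta+1) I(mu, p)
    assert np.isclose(I_Z(lam * mu, p), lam**(beta + 1) * I_Z(mu, p))

# Expectation of (31) over {p_i} reproduces Zadeh's beta-entropy (20) (with k = -1)
p  = np.array([0.1, 0.2, 0.3, 0.4])
mu = np.array([0.6, 0.9, 0.3, 0.7])
expectation = np.sum(p * I_Z(mu, p))
Z_beta = -np.sum((mu * p)**(beta + 1) * np.log2(p))
assert np.isclose(expectation, Z_beta)
print("I_Z characterization checks passed")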
2.2. Characterization of $L^\beta(\hat X/\{\mu_ip_i\}_1^n)$

The characterization of this quantity is analogous to that of the preceding case because, comparing (20) and (21), one can see that they are obtained from one another by the replacement $p \leftrightarrow \mu$. Evidently the interpretation of the postulates will be essentially different. As in the case of Zadeh's entropy, one can assert that

  $I_L^\beta = I_L^\beta(\mu_{\hat x}, p).$   (32)

Suppose that the function $I_L^\beta$ has the following properties:

(1) $I_L^\beta(\mu_1\mu_2, p) = \mu_2^{\beta+1} I_L^\beta(\mu_1, p) + \mu_1^{\beta+1} I_L^\beta(\mu_2, p)$   (33)

and

(2) $I_L^\beta(\mu, \gamma p) = \gamma^\beta I_L^\beta(\mu, p)$   (34)

for any $0 \le p \le 1$, $0 < \mu \le 1$ and nonnegative $\gamma$.

Theorem 2.2. Let $I_L^\beta$ be a function satisfying conditions (33) and (34). Then

  $I_L^\beta(\mu, p) = k(\mu p)^\beta\mu\log\mu,$   (35)

where $k$ is an arbitrary constant.

Proof. Eq. (34) means that

  $I_L^\beta(\mu, p) = p^\beta I_L^\beta(\mu, 1).$   (36)

For $p = 1$, from (33) one can obtain

  $I_L^\beta(\mu_1\mu_2, 1) = \mu_2^{\beta+1} I_L^\beta(\mu_1, 1) + \mu_1^{\beta+1} I_L^\beta(\mu_2, 1),$

or

  $(\mu_1\mu_2)^{-\beta-1} I_L^\beta(\mu_1\mu_2, 1) = \mu_1^{-\beta-1} I_L^\beta(\mu_1, 1) + \mu_2^{-\beta-1} I_L^\beta(\mu_2, 1).$   (37)

Letting

  $\mu^{-\beta-1} I_L^\beta(\mu, 1) = F(\log\mu),$

we come to (30) and consequently

  $I_L^\beta(\mu, 1) = k\mu^{\beta+1}\log\mu,$   (38)

from which, according to (36), (35) is obtained. □

Associating with each fuzzy event the quantity (35) and taking the mathematical expectation with respect to the probability distribution $\{p_i\}_1^n$, we obtain (21).

For $\beta = 0$, (33) and (34) give

(1') $I_L^0(\mu_1\mu_2, p) = \mu_2 I_L^0(\mu_1, p) + \mu_1 I_L^0(\mu_2, p)$   (33')

and

(2') $I_L^0(\mu, p) = I_L^0(\mu, 1) = I_L(\mu).$   (34')

$I_L(\mu)$ can be considered as a measure of the possibilistic uncertainty contained in a fuzzy event $\hat x$. We see that it is independent of classical chance. Relation (33) represents a law for the calculation of possibilistic uncertainty in the case where the fuzzy event is considered as the result of two sequential splittings with membership functions $\mu_1$ and $\mu_2$, and it reflects the independence of the final result from the succession of the splittings.

It seems natural that a measure of possibilistic uncertainty for the fuzzy subset $\hat X$ may be represented by the following sum:

  $L(\{\mu_i\}_1^n) = -\sum_{i=1}^n \mu_i\log\mu_i.$   (39)

Notice that De Luca and Termini's entropy can be represented by means of this function as

  $E_D(\{\mu_i\}_1^n) = L(\{\mu_i\}_1^n) + L(\{\mu_i^D\}_1^n).$   (40)

Comparing properties (25) and (26) ((25') and (26')) with (33) and (34) ((33') and (34')), in spite of the complete symmetry of (20) and (21) relative to $p \leftrightarrow \mu$, one can observe a difference in the structure of the equations which, in the presence of the specific conditions of superposition of probabilistic and possibilistic uncertainties, reflects the fundamental difference between classical chance and fuzziness.
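An analogous numerical check for $I_L^\beta$ (again illustrative, with $k = -1$): properties (33)-(34) of the solution (35), the expectation step giving (21), and the representation (39)-(40) of De Luca and Termini's entropy.

import numpy as np

k, beta = -1.0, 0.7

def I_L(mu, p):                       # eq. (35): I_L^beta(mu, p) = k (mu p)^beta mu log mu
    return k * (mu * p)**beta * mu * np.log2(mu)

rng = np.random.default_rng(1)
for _ in range(100):
    mu1, mu2 = rng.uniform(0.05, 1.0, size=2)
    p, gamma = rng.uniform(0.05, 1.0, size=2)
    # property (33)
    assert np.isclose(I_L(mu1 * mu2, p),
                      mu2**(beta + 1) * I_L(mu1, p) + mu1**(beta + 1) * I_L(mu2, p))
    # property (34)
    assert np.isclose(I_L(mu1, gamma * p), gamma**beta * I_L(mu1, p))

p  = np.array([0.1, 0.2, 0.3, 0.4])
mu = np.array([0.6, 0.9, 0.3, 0.7])
L_beta = -np.sum((mu * p)**(beta + 1) * np.log2(mu))        # eq. (21)
assert np.isclose(np.sum(p * I_L(mu, p)), L_beta)

# eqs. (39)-(40): De Luca-Termini entropy as L({mu_i}) + L({mu_i^D})
L_pos = lambda m: -np.sum(m * np.log2(m))
E_D = L_pos(mu) + L_pos(1 - mu)
assert np.isclose(E_D, -np.sum(mu * np.log2(mu) + (1 - mu) * np.log2(1 - mu)))
print("I_L characterization checks passed")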
3. Some simple properties of $Z(\hat X/\{p_i\}_1^n)$ and $L(\hat X/\{p_i\}_1^n)$

Consider two universal sets of events $X$ and $Y$. Let $\hat X \subseteq X$ and $\hat Y \subseteq Y$, and suppose that the events $x \in X$ and $y \in Y$ are independent. According to (7), (12) and (14) it can easily be proved that

  $Z(\widehat{X\times Y}/\{p_i\}_1^n \times \{q_j\}_1^m) = P(\hat X)\,Z(\hat Y/\{q_j\}_1^m) + Q(\hat Y)\,Z(\hat X/\{p_i\}_1^n)$   (41)

and

  $L(\widehat{X\times Y}/\{p_i\}_1^n \times \{q_j\}_1^m) = P(\hat X)\,L(\hat Y/\{q_j\}_1^m) + Q(\hat Y)\,L(\hat X/\{p_i\}_1^n),$   (42)

where $P(\hat X) = \sum_{i=1}^n \mu_ip_i$ and $Q(\hat Y) = \sum_{j=1}^m \nu_jq_j$, $\nu_j$ being the membership values of $\hat Y$.

The properties of $Z(\hat X/\{p_i\}_1^n)$ and $L(\hat X/\{p_i\}_1^n)$ considered below are represented as inequalities. They are consequences of the relation between the arithmetic and geometric means, which in "logarithmic representation" has the form [3]

  $\sum_{i=1}^n q_i\log a_i \le \log\Big(\sum_{i=1}^n q_ia_i\Big), \qquad q_i \ge 0,\ \sum_{i=1}^n q_i = 1,$   (43)

and which is also the basic inequality in the study of Shannon's entropy. Equality in (43) holds iff all the $a_i$ (with $q_i > 0$) are equal.
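A numerical sketch (illustrative) of the product relations (41)-(42) for independent events, with the joint membership taken according to the sequential product splitting (7).

import numpy as np

p  = np.array([0.2, 0.3, 0.5]);  mu = np.array([0.7, 0.4, 0.9])   # fuzzy set on X
q  = np.array([0.6, 0.4]);       nu = np.array([0.5, 0.8])        # fuzzy set on Y

def Z(m, pr):  return -np.sum(m * pr * np.log2(pr))   # eq. (12)
def L(m, pr):  return -np.sum(pr * m * np.log2(m))    # eq. (14)

# Joint distribution and joint membership under the sequential product splitting (7)
pq   = np.outer(p, q).ravel()
munu = np.outer(mu, nu).ravel()

P, Q = np.sum(mu * p), np.sum(nu * q)
assert np.isclose(Z(munu, pq), P * Z(nu, q) + Q * Z(mu, p))   # eq. (41)
assert np.isclose(L(munu, pq), P * L(nu, q) + Q * L(mu, p))   # eq. (42)
print("product formula checks passed")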
Theorem 3.1. Let $\hat X$ be a fuzzy subset with values of the membership function $\{\mu_i\}_1^n$, and let $\{p_i\}_1^n$ be a probability distribution on $X$. Then

  $Z(\hat X/\{p_i\}_1^n) \ge P(\hat X)\,H\big(\{\mu_ip_i/P(\hat X)\}_1^n\big),$   (44)

where equality holds iff $p_i = \mu_ip_i/P(\hat X)$, $i = 1,\dots,n$, i.e. iff all the $\mu_i$ are equal.

Proof. Eq. (44) is a direct consequence of inequality (43). Taking

  $q_i = \frac{\mu_ip_i}{P(\hat X)}, \qquad a_i = \frac{p_i}{q_i} = \frac{p_iP(\hat X)}{\mu_ip_i}, \qquad i = 1,\dots,n,$

we get

  $-\sum_{i=1}^n \frac{\mu_ip_i}{P(\hat X)}\log p_i \ge -\sum_{i=1}^n \frac{\mu_ip_i}{P(\hat X)}\log\frac{\mu_ip_i}{P(\hat X)},$   (45)

which is (44). □

Remark 3.1. According to (13) and (44) one can write

  $H(\{p_i\}_1^n) \ge P(\hat X)\,H\big(\{\mu_ip_i/P(\hat X)\}_1^n\big) + P(\hat X^D)\,H\big(\{\mu_i^Dp_i/P(\hat X^D)\}_1^n\big).$   (46)

Remark 3.2. Suppose that $\mathrm{supp}\,\hat X = X$. Writing (45) in another form, namely

  $-\sum_{i=1}^n p_i\log p_i \le -\sum_{i=1}^n p_i\log\frac{\mu_ip_i}{P(\hat X)},$

we get

  $E(\log\mu_i) \le \log(E(\mu_i)),$   (47)

where $E$ denotes the expectation with respect to $\{p_i\}_1^n$; equality holds iff all the $\mu_i$ are equal.

Theorem 3.2. Let $\hat X$ be a fuzzy subset and suppose that there are two probability distributions $\{p_i\}_1^n$ and $\{p'_i\}_1^n$ on $X$; in addition let

  $\sum_{i=1}^n \mu_ip_i = \sum_{i=1}^n \mu_ip'_i.$

Then

  $Z(\hat X/\{p_i\}_1^n) \le -\sum_{i=1}^n \mu_ip_i\log p'_i,$   (48)

where equality holds iff $p_i = p'_i$, $i = 1,\dots,n$.

Proof. It follows from inequality (43) with

  $q_i = \frac{\mu_ip_i}{P(\hat X)}, \qquad a_i = \frac{\mu_ip'_i}{q_iP(\hat X)} = \frac{p'_i}{p_i}, \qquad i = 1,\dots,n,$

since then $\sum_{i=1}^n q_ia_i = \sum_{i=1}^n \mu_ip'_i/P(\hat X) = 1$. □
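A quick numerical check (illustrative) of the inequalities (44), (46) and (48) on randomly generated membership values and distributions; note that the auxiliary distribution $p'$ below is rescaled so as to satisfy $\sum\mu_ip'_i = \sum\mu_ip_i$, which is the only property of $p'$ used in the proof of Theorem 3.2.

import numpy as np

def H(q):
    q = np.asarray(q, dtype=float); q = q[q > 0]
    return -np.sum(q * np.log2(q))

rng = np.random.default_rng(2)
for _ in range(200):
    n   = 5
    p   = rng.dirichlet(np.ones(n))
    mu  = rng.uniform(0.05, 1.0, size=n)
    muD = 1.0 - mu
    P, PD = np.sum(mu * p), np.sum(muD * p)
    Z   = -np.sum(mu * p * np.log2(p))

    # Theorem 3.1, eq. (44)
    assert Z >= P * H(mu * p / P) - 1e-9
    # Remark 3.1, eq. (46)
    assert H(p) >= P * H(mu * p / P) + PD * H(muD * p / PD) - 1e-9
    # Theorem 3.2, eq. (48): positive p' with Sum(mu*p') = Sum(mu*p)
    p2 = rng.dirichlet(np.ones(n))
    p2 = p2 * P / np.sum(mu * p2)
    assert Z <= -np.sum(mu * p * np.log2(p2)) + 1e-9
print("Section 3 inequality checks passed")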
Theorem 3.3. Let $\hat X$ be a fuzzy subset and let $\{p_i\}_1^n$ and $\{p'_i\}_1^n$ be probability distributions on $X$; let the probabilities of the fuzzy events, $\{\hat p_i\}_1^n = \{\mu_ip_i\}_1^n$ and $\{\hat p'_i\}_1^n = \{\mu_ip'_i\}_1^n$, be connected by the relation $\hat p'_i = \sum_{j=1}^n a_{ij}\hat p_j$, $i = 1,\dots,n$, where $a_{ij} \ge 0$, $\sum_{i=1}^n a_{ij} = \sum_{j=1}^n a_{ij} = 1$. Then

  $Z(\hat X/\{p'_i\}_1^n) \ge Z(\hat X/\{p_i\}_1^n),$   (49)

where equality holds iff all the $\mu_i$ are equal and $\{p_i\}_1^n$ and $\{p'_i\}_1^n$ coincide up to a permutation.

Proof. From the conditions of the theorem it follows that the total probability of the fuzzy subset remains unchanged:

  $\sum_{i=1}^n \mu_ip'_i = \sum_{i=1}^n\sum_{j=1}^n a_{ij}\mu_jp_j = \sum_{j=1}^n \mu_jp_j.$

Further, according to (43),

  $Z(\hat X/\{p'_i\}_1^n) = -\sum_{i=1}^n \mu_ip'_i\log p'_i = -\sum_{i=1}^n\sum_{j=1}^n a_{ij}\mu_jp_j\log p'_i = -\sum_{j=1}^n \mu_jp_j\log\prod_{i=1}^n (p'_i)^{a_{ij}} \ge -\sum_{j=1}^n \mu_jp_j\log\Big(\sum_{i=1}^n a_{ij}p'_i\Big).$

Since the numbers $r_j = \sum_{i=1}^n a_{ij}p'_i \ge 0$ and $\sum_{j=1}^n r_j = 1$, according to (48) one can write

  $-\sum_{j=1}^n \mu_jp_j\log r_j \ge -\sum_{j=1}^n \mu_jp_j\log p_j = Z(\hat X/\{p_j\}_1^n).$

When all the $\mu_i$ are equal this inequality becomes

  $H(\{p'_i\}_1^n) \ge H(\{p_i\}_1^n),$

where equality holds iff the distributions $\{p_i\}_1^n$ and $\{p'_i\}_1^n$ coincide up to a permutation [3]. □

Theorem 3.4. Let $\hat X \subseteq X$ be a fuzzy subset of events, $\{\mu_i\}_1^n$ the values of the corresponding membership function, and $\{p_i\}_1^n$ a probability distribution on $X$. Then

  $Z(\hat X/\{p_i\}_1^n) \le P(\hat X)\log\frac{|\hat X|}{P(\hat X)},$   (50)

where $|\hat X| = \sum_{i=1}^n \mu_i$ is the cardinal of the fuzzy subset $\hat X$ [8]. Equality holds iff $p_i = P(\hat X)/|\hat X|$, $i = 1,\dots,n$, i.e. for the uniform distribution.

Proof. Relation (50) is a consequence of inequality (43): taking

  $q_i = \frac{\mu_ip_i}{P(\hat X)}, \qquad a_i = \frac{P(\hat X)}{|\hat X|\,p_i}, \qquad i = 1,\dots,n,$

we have $\sum_{i=1}^n q_ia_i = \sum_{i=1}^n \mu_i/|\hat X| = 1$ and hence

  $-\sum_{i=1}^n \frac{\mu_ip_i}{P(\hat X)}\log p_i \le \log\frac{|\hat X|}{P(\hat X)}.$

Equality holds iff all the $a_i$ coincide, i.e. iff $p_i = P(\hat X)/|\hat X|$, $i = 1,\dots,n$. □
Theorems analogous to those considered above hold for the function $L(\hat X/\{p_i\}_1^n)$; their proofs are analogous to the corresponding proofs for $Z(\hat X/\{p_i\}_1^n)$.

Theorem 3.5. Let $\hat X$ and $\hat X'$ be subsets of fuzzy events, $\{\mu_i\}_1^n$ and $\{\mu'_i\}_1^n$ the values of the corresponding membership functions, and let the probabilities of the subsets $\hat X$ and $\hat X'$ be equal: $\sum_{i=1}^n \mu_ip_i = \sum_{i=1}^n \mu'_ip_i$. Then

  $L(\hat X/\{p_i\}_1^n) \le -\sum_{i=1}^n \mu_ip_i\log\mu'_i,$   (51)

where equality holds iff $\mu_i = \mu'_i$, $i = 1,\dots,n$.

Relation (51) is a consequence of inequality (43), exactly as (48) in Theorem 3.2, with the roles of $p_i$ and $\mu_i$ interchanged.

Theorem 3.6. Let $\hat X$ and $\hat X'$ be subsets of fuzzy events, $\{\mu_i\}_1^n$ and $\{\mu'_i\}_1^n$ the values of the corresponding membership functions, and $\{p_i\}_1^n$ a probability distribution on $X$; let the probabilities of the fuzzy points of $\hat X$ and $\hat X'$, $\hat p_i = \mu_ip_i$ and $\hat p'_i = \mu'_ip_i$, be connected by the relation $\hat p'_i = \sum_{j=1}^n a_{ij}\hat p_j$, where $a_{ij} \ge 0$, $\sum_{i=1}^n a_{ij} = \sum_{j=1}^n a_{ij} = 1$. Then

  $L(\hat X'/\{p_i\}_1^n) \ge L(\hat X/\{p_i\}_1^n).$   (52)

Equality holds iff all the $p_i$ are equal and $\{\mu_i\}_1^n$ and $\{\mu'_i\}_1^n$ coincide up to a permutation.

Proof. We have

  $-\sum_{i=1}^n \mu'_ip_i\log\mu'_i = -\sum_{i=1}^n\sum_{j=1}^n a_{ij}\mu_jp_j\log\mu'_i = -\sum_{j=1}^n \mu_jp_j\log\prod_{i=1}^n (\mu'_i)^{a_{ij}} \ge -\sum_{j=1}^n \mu_jp_j\log\Big(\sum_{i=1}^n a_{ij}\mu'_i\Big).$
Since the numbers $\tilde\mu_j = \sum_{i=1}^n a_{ij}\mu'_i \in [0,1]$, $j = 1,\dots,n$, according to (51) one can write

  $L(\hat X'/\{p_i\}_1^n) \ge -\sum_{j=1}^n \mu_jp_j\log\tilde\mu_j \ge -\sum_{j=1}^n \mu_jp_j\log\mu_j = L(\hat X/\{p_i\}_1^n).$

If all the $p_i$ are equal, then the relation between the probabilities of the fuzzy points reduces to $\mu'_i = \sum_{j=1}^n a_{ij}\mu_j$, and (52) leads to

  $-\sum_{i=1}^n \mu'_i\log\mu'_i \ge -\sum_{i=1}^n \mu_i\log\mu_i.$

If, in addition, $\{\mu'_i\}_1^n$ coincides with $\{\mu_i\}_1^n$ up to a permutation, i.e. the cardinalities of $\hat X$ and $\hat X'$ are equal, $|\hat X| = |\hat X'|$, then the last inequality can be written in the form

  $H(\{\mu'_i/|\hat X'|\}_1^n) \ge H(\{\mu_i/|\hat X|\}_1^n),$

where under the existing conditions equality actually takes place. □
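The monotonicity results (49) and (52) under doubly stochastic mixing can be illustrated numerically in the special cases mentioned in the proofs (all $\mu_i$ equal, respectively all $p_i$ equal); the construction of the random doubly stochastic matrix below is an assumption of the example, not part of the paper.

import numpy as np

rng = np.random.default_rng(3)

def doubly_stochastic(n, rng, k=10):
    """Random doubly stochastic matrix (convex combination of permutation matrices)."""
    w = rng.dirichlet(np.ones(k))
    return sum(wk * np.eye(n)[rng.permutation(n)] for wk in w)

n = 6
A = doubly_stochastic(n, rng)

# Theorem 3.3 in the special case of equal memberships: reduces to H(Ap) >= H(p)
p = rng.dirichlet(np.ones(n))
H = lambda q: -np.sum(q[q > 0] * np.log2(q[q > 0]))
assert H(A @ p) >= H(p) - 1e-9          # eq. (49) with all mu_i equal

# Theorem 3.6 in the special case of equal p_i: reduces to -sum mu' log mu' >= -sum mu log mu
mu  = rng.uniform(0.05, 1.0, size=n)
mu2 = A @ mu                             # mu'_i = sum_j a_ij mu_j stays in [0, 1]
f   = lambda m: -np.sum(m * np.log2(m))
assert f(mu2) >= f(mu) - 1e-9            # consequence of eq. (52) with p_i = 1/n
print("doubly stochastic mixing checks passed")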
4. Conclusion

The consideration of Zadeh's entropy function and of De Luca and Termini's weighted entropy function showed that they are genetically connected with Shannon's entropy function. This fact is very important, since Shannon's entropy is the only function that satisfies the requirements generally accepted as necessary for a meaningful measure of uncertainty [9].

The representation of a fuzzy set as the result of a set-splitting procedure makes it possible to show that Hirota's entropy, used as a measure of uncertainty in the presence of chance and fuzziness, is a natural generalization of Shannon's entropy in the case of set splitting, and can be represented as a sum of four terms: the entropies of Zadeh and the weighted entropies of De Luca and Termini of the split components. Each term is the mathematical expectation, with respect to a probability distribution on the universal set of events, of the information function connected with the particular split events.

The characterization of the individual information functions reveals the superposition features of probabilistic and possibilistic uncertainties and a connection between independence and the sequential splitting of events.

The terms connected with probabilistic and possibilistic uncertainties appear explicitly in the formula for Shannon's entropy of the split distribution. This facilitates their comparison and, to some extent, highlights which properties of Shannon's entropy are retained and which new properties reflect the above-mentioned superposition.
References

[1] F. Criado and T. Gachechiladze, Fuzzy random events and their corresponding conditional probability measures (Royal Academy of Sciences, Madrid, to appear).
[2] A. De Luca and S. Termini, A definition of a non-probabilistic entropy in the setting of fuzzy sets theory, Inform. and Control 20 (1972) 301-312.
[3] A. Feinstein, Foundations of Information Theory (McGraw-Hill, New York, 1958).
[4] T. Gachechiladze and T. Manjaparashwili, Fuzzy sets, Scientific Rep. of Tbilisi University, Appl. Math. 133 (1988).
[5] T. Gachechiladze and T. Manjaparashwili, Fuzzy random events and their relative probability measures, Scientific Rep. of Tbilisi University, Appl. Math. 139 (3) (1989).
[6] K. Hirota, Ambiguity based on the concept of subjective entropy, in: M.M. Gupta and E. Sanchez, Eds., Fuzzy Information and Decision Processes (North-Holland, Amsterdam, 1982).
[7] G. Jumarie, Further advances on the general thermodynamics of open systems via information theory, effective entropy, negative information, Internat. J. Systems Sci. 6 (1975) 249-268.
[8] A. Kaufmann, Introduction à la théorie des sous-ensembles flous (Masson, Paris, 1977).
[9] G.J. Klir and M. Mariano, On the uniqueness of possibilistic measure of uncertainty and information, Fuzzy Sets and Systems 24 (2) (1987) 197-220.
[10] S. Kullback, Information Theory and Statistics (Wiley, New York, 1958).
[11] L. Zadeh, Fuzzy sets, Inform. and Control 8 (1965) 338-353.
[12] L. Zadeh, Probability measures of fuzzy events, J. Math. Anal. Appl. 23 (1968) 421-427.