OPTIMAL CODING FOR MAXIMUM EFFICIENCY

A thesis submitted in partial satisfaction of the
requirements for the degree of Master of Science in
Engineering

by
Bruce Menn

San Fernando Valley State College
June, 1969
TABLE OF CONTENTS

Section

ABSTRACT
I. OVERVIEW
II. THE SHANNON AND FANO METHOD
III. THE HUFFMAN METHOD
IV. DETERMINING THE AVERAGE LENGTH IN HUFFMAN CODING
V. THE ENTROPY IN HUFFMAN CODING
VI. PROOF OF THE LOWER BOUND
VII. EVALUATION OF EFFICIENCY AT EACH ITERATION OF THE HUFFMAN PROCESS
VIII. DERIVATION OF THE MINIMUM AVERAGE LENGTH
IX. CONCLUSION
APPENDIX: PROOF OF THE ADDITIVITY RULE FOR ENTROPY
BIBLIOGRAPHY
LIST OF TABLES

Table
II. CODED SYMBOLS AND PROBABILITIES FOR CALCULATION OF AVERAGE LENGTH EXAMPLE
III. SYMBOLS AND PROBABILITIES FOR A CODE NOT OPTIMIZED BY THE SHANNON-FANO METHOD
V. OPTIMAL CODE
LIST OF FIGURES

2. THE FOURTH RESTRICTION
3. CALCULATION OF AVERAGE LENGTH
6. EXAMPLE OF THE CALCULATION OF ENTROPY FOR A HUFFMAN CODE
ABSTRACT

OPTIMAL CODING FOR MAXIMUM EFFICIENCY

by
Bruce Menn
Master of Science in Engineering
June, 1969

In a digital communication system, the parameters most often chosen for optimization are the error, the minimum time, and the cost in transmitting a message. The optimal procedure most often chosen for minimizing the average length of a coded message is the Shannon-Fano method.1,2 It was developed independently by them in 1948. The optimality of their procedure applies only to special cases. Later, in 1952, a general optimal procedure was developed by Huffman.3
The Huffman method is relatively simple to implement; however, its nature is quite subtle and is very much obscured in the existing proofs, which make use of other individual results, such as the Kraft and McMillan inequalities. Since the method is basically iterative in nature, parameters describing the ensemble code developed by the procedure will be expressed in this paper by recursion formulas. The main contribution was to develop a simple recurrence relation, show how it simplifies the computation of the average length of a code, and finally provide the optimal code and the procedure to obtain this code. A similar relation for entropy is developed, and by comparison of the two recursion formulas, a concise justification of the lower bound of the average length is given. The recursion formulas also provide a means of determining the characteristic parameters for each step of the process and of establishing its optimality.
1. C.E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, Volume 27, pp. 379-423, 1948.

2. R. Fano, "The Transmission of Information, I," Massachusetts Institute of Technology, Research Laboratory of Electronics, Technical Report 65, 1949.

3. D.A. Huffman, "A Method for the Construction of Minimum Redundancy Codes," Proceedings of the Institute of Radio Engineers, Volume 40, pp. 1098-1101, September, 1952.
I. OVERVIEW

This paper will begin by describing the appropriate concepts of information theory which, slightly redirected, will be shown to govern the optimality of the Huffman procedure. By then developing recursive relations for expressing the characteristic parameters of the code at each iteration, it becomes possible to derive the optimality of the method and to determine the efficiency of each step. For clarity, the concepts and notation used here follow those presented in 1963 by Abramson.4

The codes considered are instantaneous: each symbol can be decoded without having to examine any of the digits of the next symbol. Table I contrasts an instantaneous code with one of equivalent length that is not instantaneous.

TABLE I

Symbol    Instantaneous code    Non-instantaneous code
s1        0                     0
s2        10                    01
s3        110                   011
s4        1110                  0111

4. N. Abramson, Information Theory and Coding, McGraw-Hill Book Company, New York, 1963.
II. THE SHANNON AND FANO METHOD

The information conveyed by an event X with probability P(X) is I(X) = -log P(X). This relation ensures that information is additive: the information obtained from multiple independent events is the sum of their individual informations. The statistical average of the information of a source is its entropy,

H(X) = -Σ P(X) log P(X)

where the sum is taken over all events X. Entropy is often referred to as uncertainty because the larger its value, the more uncertain one is about the
information to be transmitted by the source.
An efficient system will provide a maximum amount of information for each symbol transmitted. In a simple case, if one symbol is being transmitted, its statistical average information, namely its entropy, will be maximum when the uncertainty of whether it is a one or a zero is greatest. This occurs when the probability of a one or a zero occurring is one half. The entropy for this case is

H(X) = -(.5 log .5 + .5 log .5) = 1 bit/symbol

The total system efficiency is a maximum if every symbol represents one bit of information. This maximum can only be obtained for special cases. These cases will be dealt with in detail later in the paper.

In optimizing the efficiency of a system, one tries to maximize the information per symbol transmitted, which corresponds to minimizing the number of symbols required to transmit a message. Due to the statistical nature of the information source, our goal is to minimize the average number of digits required to transmit a symbol.
Referring to the number of digits that represent a symbol as the length, a system will have maximum efficiency if the average length is minimum. The average length is

L = Σ P_i L_i

where L_i is the length of the i-th symbol and P_i is its probability of occurrence.
Consider the symbols in Table II with their respective probabilities of occurrence.

TABLE II
CODED SYMBOLS AND PROBABILITIES FOR CALCULATION OF AVERAGE LENGTH EXAMPLE

Symbol    Code    Probability
s1        0       .5
s2        10      .25
s3        110     .125
s4        111     .125

L = Σ P_i L_i = .5(1) + .25(2) + .125(3) + .125(3) = 1.75
This code assigns its digits to successively divided groups of more and less probable events; a rigorous justification is given in Section IV. In this way, if each digit represents a decision between two equally probable events, it conveys a maximum amount of information. Here s1 occurs one half the time, and s2, s3, and s4 combined occur one half the time. Therefore, the first bit is a one or a zero dependent upon whether the symbol is or is not s1. Thus the first bit represents a decision between two equally probable events and contains a maximum amount of information, one bit. Since the code is to be instantaneous, and s1 is represented by zero, no other symbol can have zero as a prefix. The symbols s2, s3, and s4 have one as a prefix and are further subdivided into equally probable events, since s2 occurs one quarter of the time and s3 and s4 combined occur one quarter of the time. Thus these are equally probable events, and the second bit represents the decision between s2 and the combination of s3 and s4. The third bit in a similar manner divides s3 and s4.
Consider now the five symbols in Table III with their respective probabilities of occurrence.

TABLE III
SYMBOLS AND PROBABILITIES FOR A CODE NOT OPTIMIZED BY THE SHANNON-FANO METHOD

Symbol    Probability
s1        .4
s2        .1
s3        .3
s4        .1
s5        .1

These probabilities cannot be divided into equally probable groups, so the groupings must be made arbitrarily. The resulting Shannon-Fano code is tabulated in Table IV.

TABLE IV

Symbol    Code    Probability
s1        00      .4
s2        01      .1
s3        11      .3
s4        100     .1
s5        101     .1

L = 2.2
This code is not optimal. The code tabulated in Table V has a smaller average length. The average length for this code is

L = Σ P_i L_i = .4(1) + .1(3) + .3(2) + .1(4) + .1(4) = 2.1

yielding a resultant, smaller average length. An optimal code with a minimum average length for a given set of symbols with known probabilities of occurrence can always be derived by the Huffman method, although its beginning digits do not carry maximum efficiency as do their counterparts in the Shannon-Fano method.

TABLE V
OPTIMAL CODE

Symbol    Code    Probability
s1        1       .4
s2        001     .1
s3        01      .3
s4        0001    .1
s5        0000    .1
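The two average lengths may be verified with a few lines of Python (an illustrative modern addition, not part of the original text; the symbol order follows the tables above):

# Probabilities of Table III paired with the codes of Tables IV and V
probs    = [.4, .1, .3, .1, .1]
table_iv = ["00", "01", "11", "100", "101"]
table_v  = ["1", "001", "01", "0001", "0000"]

for name, code in (("Table IV", table_iv), ("Table V", table_v)):
    # average length L = sum of P_i * L_i
    L = sum(p * len(c) for p, c in zip(probs, code))
    print(name, round(L, 3))   # prints 2.2 and 2.1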
III�
In devolopL1g hiG rnc'tho1, Hm'?fun111. impor:;ec1 :f'Jvu
:tOf.Jtrict�.on.s
on
a:r::
o};tiT,Jt>,l
ii.1Bt<=tn·trt:neous coc:�,e�
IHrJ p;;o��
menta c; 0oding digite.
once
are
the starting point of
arbitrarily ordered
in
a
lf}
a
sequence of
fashion such that
massQ
s
lf'P
3)
r""l
·� f,>·�.
·L "'
Cj •
U..!. c
& ""�
1(1)
T"/,J
'.·.i.r:·<J•
.l
:t'
�
"''
L(2) -
t�t''<.D
-.\..V
.._,
";.. .-,{''"'�""0
IJ···-�cvulO)
..Z .o
.A..!.
V
{"ioT) (c1E'i]
\.-1.",\.·.•
]J',,H
.,....., ·'·
, •.•.,){>
11!0>'"''"
-::11""
.O;VO
1.!_ '<::•
part
TI1e final
which
WOAD
:R�ot
ro�triction is
co
th
N��.�� �:.:·y1nl;v l lli t11
COd€;.
p:rof:tx
C\
m-::
f1
that each poseible sequence
BYTl'bGl
)?(1{i ��·�.c t }_fJil i�:t:l t11e
�
it G•YaJ.ci :eeplt:l,CJ<D tho
n�ve :es)�;t3
I c:>:t3�f�t11
Cl :f
·t11e
l8
step-by-stop basi
.
Bmallest probabilities,(represontoa
symbol.,
in the optimal
code.
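The procedure can be summarized in a short sketch in present-day Python (an illustrative addition rather than part of the original development; the function name huffman_code and the heap-based bookkeeping are assumptions of the sketch):

import heapq

def huffman_code(probs):
    """Repeatedly combine the two least probable members of the ensemble,
    prefixing one coding digit to every symbol inside each combined state."""
    # Heap entries: (probability, tie-breaking counter, {symbol: partial code})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        pa, _, a = heapq.heappop(heap)   # least probable state
        pb, _, b = heapq.heappop(heap)   # next least probable state
        merged = {s: "0" + c for s, c in a.items()}
        merged.update({s: "1" + c for s, c in b.items()})
        heapq.heappush(heap, (pa + pb, count, merged))
        count += 1
    return heap[0][2]

probs = {"s1": .4, "s2": .1, "s3": .3, "s4": .1, "s5": .1}
code = huffman_code(probs)
print(sum(probs[s] * len(code[s]) for s in probs))   # 2.1, as for Table V

Applied to the probabilities of Table III, the sketch reproduces the minimum average length of 2.1 found for the optimal code of Table V; the particular codewords assigned among the equally probable symbols may differ.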
IV. DETERMINING THE AVERAGE LENGTH IN HUFFMAN CODING

Consider the five symbols s1 through s5 with probabilities .3, .2, .2, .2, and .1, coded by the Huffman procedure as shown in Figure 3. Each time two states are combined, every symbol contained in the combination acquires one additional coding digit; a symbol that required two digits before a combination takes three digits instead of two afterward.

[FIGURE 3. CALCULATION OF AVERAGE LENGTH]

The average length computed directly from the completed code is

L = .3(2) + .2(2) + .2(2) + .2(3) + .1(3) = .6 + .4 + .4 + .6 + .3 = 2.3

The same result is obtained much more easily by recursion. Since each combination of two states with probabilities P_a and P_b lengthens every symbol within them by one digit, the average length increases by exactly P_a + P_b at each iteration. Thus

L̃_{i+1} = L̃_i + P_a + P_b

where P_a and P_b are the probabilities of the two states forming the combination state, L̃_i is the length from the previous iteration, and L̃_{i+1} is the value derived for the present iteration. For the first iteration, L̃_0 is initialized at zero.
The following calculations are shown for the example used earlier in this section.

L̃_1 = L̃_0 + P_a + P_b = 0 + .2 + .1 = .3
L̃_2 = L̃_1 + P_a + P_b = .3 + .2 + .2 = .7
L̃_3 = L̃_2 + P_a + P_b = .7 + .3 + .3 = 1.3
L̃_4 = L̃_3 + P_a + P_b = 1.3 + .6 + .4 = 2.3
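These calculations can be carried out mechanically. The following Python sketch (an illustrative addition; the function name average_length is assumed) accumulates L̃ by the recursion without ever constructing the code itself:

import heapq

def average_length(probs):
    """Average code length via the recursion L~[i+1] = L~[i] + Pa + Pb."""
    heap = list(probs)
    heapq.heapify(heap)
    L = 0.0                        # L~[0] is initialized at zero
    while len(heap) > 1:
        pa = heapq.heappop(heap)   # the two smallest probabilities
        pb = heapq.heappop(heap)
        L += pa + pb               # each combination adds exactly Pa + Pb
        heapq.heappush(heap, pa + pb)
    return L

print(average_length([.3, .2, .2, .2, .1]))   # 2.3, as computed above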
The recursion formula for the average length may be used to justify the optimality of the Huffman coding procedure. To minimize the increase in average length due to the formation of a combination state, the sum of the probabilities of the two branches which form the new state must be minimum. This occurs when the two states or symbols with the lowest probabilities are chosen to be combined. If the increase in average length is kept minimal for each combination, L̃_i will be minimum at each step in the process and, since P_a + P_b is chosen as a minimum, L̃_{i+1} will also be minimum at each step. Since the number of combinations required in a Huffman code is independent of which symbols or states are chosen at each step for combination, the final average length will be minimum for the case where the least probable branches are chosen for combination at each step of the process.
V. THE ENTROPY IN HUFFMAN CODING

A recurrence relation, similar to that for the average length, will now be developed for determining the entropy of a set of symbols coded by the Huffman process. By the additivity rule for entropy (proved in the Appendix), the entropy of the n events P_1, P_2, ..., P_{n-1}, P_n is related to the entropy of the n-1 events P_1, P_2, ..., P_{n-2}, P_{n-1} + P_n. Applying this rule to each combination of the process, the entropy of the source may be calculated recursively:

H̃_{i+1} = H̃_i + (P_a + P_b) H[ P_a/(P_a + P_b), P_b/(P_a + P_b) ]

where H[x, y] denotes the entropy of a two-event source with probabilities x and y. H̃_0 is initialized at zero, and after the final combination H̃ is equivalent to the entropy of the source.
'"
.
H1
:.:-::.25
H2
-- Ill
H
..
z
H.,
F.>
•
•
-�
f(
- H2.
H
3
H
,
-.")
•.
·-
H
2!5 +
•
fi
+
1 7s
•
(
25
+
(P a + P h )
,=
n
to the entrojJy of the
<•
[
25) H
H
F.
. ,.>�
G
F.
od
Gs
-]
.
•
5
J
28
.5
o
G�
- -?
�
F-'
�o
""-·«··-'
Ji'IG-URJ\: 6
.12t5
log P,
1o[:;
II
-·
l. '75
l
•
5
+
.25
+
.125 log
.125
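The entropy recursion can be mechanized in the same way as the average-length recursion. A Python sketch under the same assumptions as the earlier routines (an illustrative addition, not from the original text):

import heapq
from math import log2

def entropy_by_recursion(probs):
    """Source entropy via H~[i+1] = H~[i] + (Pa+Pb) H[Pa/(Pa+Pb), Pb/(Pa+Pb)]."""
    heap = list(probs)
    heapq.heapify(heap)
    H = 0.0
    while len(heap) > 1:
        pa = heapq.heappop(heap)
        pb = heapq.heappop(heap)
        s = pa + pb
        # entropy of the two-event split within this combination
        H += s * (-(pa / s) * log2(pa / s) - (pb / s) * log2(pb / s))
        heapq.heappush(heap, s)
    return H

probs = [.5, .25, .125, .125]
print(entropy_by_recursion(probs))        # 1.75
print(-sum(p * log2(p) for p in probs))   # 1.75 by direct calculation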
VI. PROOF OF THE LOWER BOUND
The increase in entropy for each combination is

ΔH = (P_a + P_b) H[ P_a/(P_a + P_b), P_b/(P_a + P_b) ]

while the corresponding increase in average length is

ΔL = P_a + P_b

The maximum of H[ P_a/(P_a + P_b), P_b/(P_a + P_b) ] is one bit, and it occurs when P_a equals P_b, that is, when the two branches of the combination are equiprobable. As stated by Fano, maximum information is conveyed when the events are equiprobable. Therefore ΔH is at most equal to ΔL for every combination, so that H̃_i ≤ L̃_i at every step and the entropy is a lower bound on the average length. If P_a equals P_b for every combination, each ΔH will equal the corresponding ΔL, and the entropy will in turn equal the average length for that code.
For the example of Figure 6, every combination joins equiprobable branches, so each digit represents one bit of information and the average length equals the entropy at every step:

L̃_1 = .25      H̃_1 = 0 + (.125 + .125)(1) = .25
L̃_2 = .75      H̃_2 = .25 + (.25 + .25)(1) = .75
L̃_3 = 1.75     H̃_3 = .75 + (.5 + .5)(1) = 1.75
VII. EVALUATION OF EFFICIENCY AT EACH ITERATION OF THE HUFFMAN PROCESS

A measure of the efficiency of a code is generally given by the ratio of the entropy of the source to the average length of the code. Using the material developed in the previous sections, a means of determining the efficiency of each combination of symbols or states in Huffman coding is developed. The efficiency for the i-th iteration is expressed below.
Since

N_i = ΔH / ΔL

and

ΔH = (P_a + P_b) H[ P_a/(P_a + P_b), P_b/(P_a + P_b) ]

ΔL = P_a + P_b

then

N_i = H[ P_a/(P_a + P_b), P_b/(P_a + P_b) ]
For the special case where the probabilities of occurrence of the symbols to be coded are integral powers of one half, P_a is equal to P_b for each iteration, resulting in N_i being equal to one at every step. If at any step of the coding process N_i is not equal to one, the overall efficiency of the derived code will be less than one.

For the example used in Figure 3, the efficiency of each iteration is

N_1 = H[ .1/.3, .2/.3 ] = .918
N_2 = H[ .2/.4, .2/.4 ] = 1.00
N_3 = H[ .3/.6, .3/.6 ] = 1.00
N_4 = H[ .4/1.0, .6/1.0 ] = .971
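The efficiency of every iteration can be obtained with the same mechanical bookkeeping used in the earlier sketches (illustrative Python; the function name iteration_efficiencies is assumed):

import heapq
from math import log2

def iteration_efficiencies(probs):
    """N_i = H[Pa/(Pa+Pb), Pb/(Pa+Pb)] for the i-th combination."""
    heap = list(probs)
    heapq.heapify(heap)
    effs = []
    while len(heap) > 1:
        pa = heapq.heappop(heap)
        pb = heapq.heappop(heap)
        s = pa + pb
        effs.append(-(pa / s) * log2(pa / s) - (pb / s) * log2(pb / s))
        heapq.heappush(heap, s)
    return effs

print(iteration_efficiencies([.3, .2, .2, .2, .1]))
# approximately [.918, 1.00, 1.00, .971], matching N1 through N4 above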
VIII. DERIVATION OF THE MINIMUM AVERAGE LENGTH

The derivation follows from the recurrence relation

L̃_{i+1} = L̃_i + P_a + P_b

which relates the average length at stage i + 1 to that at stage i. This relation is iterated for N stages and is initialized by L̃_0 = 0. The optimal average length, denoted by L̃*, is obtained when at every stage P_a and P_b are the probabilities of the two least probable members of the ensemble S of all symbols and states remaining to be coded; that is,

P_a ≤ P_b ≤ P_i for all S_i contained in state S.

This policy of choosing the smallest probabilities for each combination minimizes every increment of the sum and therefore minimizes the final average length.

The recurrence relation for entropy is

H̃_{i+1} = H̃_i + (P_a + P_b) H[ P_a/(P_a + P_b), P_b/(P_a + P_b) ]

In the case where P_a equals P_b for each combination, H[ P_a/(P_a + P_b), P_b/(P_a + P_b) ] is equal to one; i.e., the increment of entropy equals P_a + P_b, and the entropy equals the average length, H̃_i = L̃_i for all i. In the general case, (P_a + P_b) H[ P_a/(P_a + P_b), P_b/(P_a + P_b) ] is less than P_a + P_b for at least one value of i, and the entropy is less than the average length, with equivalence occurring when P_a = P_b for all i.
IX. CONCLUSION

The Shannon-Fano method suggested the idea of maximizing the amount of information contained in each digit of the code in order to obtain maximum efficiency. Their method fails to be optimal in general because it attempts to maximize the information contained in each of the beginning digits without accounting for the information loss in all the digits comprising the code. The Huffman method minimizes the increase in average length for each tree combination.

The recurrence relations developed provide a means for determining the characteristic parameters at each step of the Huffman process. The recursion formula for average length greatly simplifies the calculation of the mean length of the derived code. It is also the basis of the constructive proof showing the optimality of the process. This proof is very straightforward because of the simplicity of the recurrence relation for average length and its direct bearing on the Huffman process.

The iterative formula for entropy portrays much meaning when considered in conjunction with the relation for average length. Comparing the entropy and the average length at each iteration, the amount of information obtained for the associated increase in length can be determined. It is now possible to calculate the efficiency for each step of the Huffman process, as well as the overall efficiency of the derived code. The heretofore complex proof of the lower bound is simplified by the comparison of the two formulas. It is of interest to note the similarity of the two relations, which differ by only a weighting factor equal to the efficiency of the present iteration.

This has been the first known attempt to develop relations that describe the Huffman process sequentially. This is quite appropriate because of the direct relation of the recursion formulas developed to the iterative nature of the Huffman process. It is hoped that the simplicity of the concepts developed will enable the reader to obtain a clear understanding of the Huffman coding procedure.
APPENDIX

PROOF OF THE ADDITIVITY RULE FOR ENTROPY

H(P_1, ..., P_n) = -Σ_{i=1}^{n} P_i log P_i

= -Σ_{i=1}^{n-2} P_i log P_i - P_{n-1} log P_{n-1} - P_n log P_n

Adding and subtracting (P_{n-1} + P_n) log (P_{n-1} + P_n) gives

= -Σ_{i=1}^{n-2} P_i log P_i - (P_{n-1} + P_n) log (P_{n-1} + P_n)
  + (P_{n-1} + P_n) log (P_{n-1} + P_n) - P_{n-1} log P_{n-1} - P_n log P_n

The first two terms constitute the entropy of the n-1 events P_1, ..., P_{n-2}, P_{n-1} + P_n. The remaining terms may be rewritten as

P_{n-1} log (P_{n-1} + P_n) + P_n log (P_{n-1} + P_n) - P_{n-1} log P_{n-1} - P_n log P_n

= -P_{n-1} log [ P_{n-1}/(P_{n-1} + P_n) ] - P_n log [ P_n/(P_{n-1} + P_n) ]

= (P_{n-1} + P_n) [ -(P_{n-1}/(P_{n-1} + P_n)) log (P_{n-1}/(P_{n-1} + P_n))
                    - (P_n/(P_{n-1} + P_n)) log (P_n/(P_{n-1} + P_n)) ]

= (P_{n-1} + P_n) H[ P_{n-1}/(P_{n-1} + P_n), P_n/(P_{n-1} + P_n) ]

Therefore

H(P_1, ..., P_n) = H(P_1, ..., P_{n-2}, P_{n-1} + P_n)
                 + (P_{n-1} + P_n) H[ P_{n-1}/(P_{n-1} + P_n), P_n/(P_{n-1} + P_n) ]
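The rule is easily checked numerically; an illustrative Python verification using the probabilities of Table II:

from math import log2

def H(ps):
    # entropy of a probability ensemble
    return -sum(p * log2(p) for p in ps)

P = [.5, .25, .125, .125]
a, b = P[-2], P[-1]
lhs = H(P)
rhs = H(P[:-2] + [a + b]) + (a + b) * H([a / (a + b), b / (a + b)])
print(lhs, rhs)   # both sides equal 1.75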
BIBLIOGRAPHY
1. Shannon, C.E., "A Mathematical Theory of Communication," Bell System Technical Journal, Volume 27, pp. 379-423, 1948.

2. Fano, R., "The Transmission of Information, I," Massachusetts Institute of Technology, Research Laboratory of Electronics, Technical Report 65, 1949.

3. Fano, R., Transmission of Information, John Wiley and Sons, Inc., New York, 1961.

4. Huffman, D.A., "A Method for the Construction of Minimum Redundancy Codes," Proceedings of the Institute of Radio Engineers, Volume 40, No. 10, pp. 1098-1101, September, 1952.

5. Abramson, N., Information Theory and Coding, McGraw-Hill Book Company, Inc., New York, 1963.

6. Reza, F.M., An Introduction to Information Theory, McGraw-Hill Book Company, Inc., New York, 1961.

7. Panter, P.F., Modulation, Noise, and Spectral Analysis, McGraw-Hill Book Company, New York, 1965.

8. Hartley, R.V.L., "Transmission of Information," Bell System Technical Journal, Volume 7, pp. 535-563, 1928.

11. McMillan, B., "Two Inequalities Implied by Unique Decipherability," IRE Transactions on Information Theory, Volume IT-2, pp. 115-116, December, 1956.

12. Peterson, W.W., Error Correcting Codes, John Wiley and Sons, Inc., New York, 1961.

13. Freund, J.E., Mathematical Statistics, Prentice-Hall, Inc., Englewood Cliffs, 1962.

14. Ash, R., Information Theory, Interscience Publishers, New York, 1965.

15. Bellman, R., Dynamic Programming, Princeton University Press, Princeton, New Jersey, 1957.