Successive Refinement of Information Stanford University


270 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 37, NO. 2, MARCH 1991

In Theorem 2 we prove that successive refinement from a coarse description X̂1 with distortion D1 to a finer description X̂2 with distortion D2 can be achieved if and only if the conditional distributions p(x̂1|x) and p(x̂2|x) which achieve I(X; X̂i) = R(Di), i = 1, 2, are Markov compatible, in the sense that we can write X → X̂2 → X̂1 as a Markov chain.

Section IV then uses these necessary and sufficient conditions to exhibit a counterexample for which successive refinement cannot be achieved. In Section V we prove that all finite alphabet distributions with Hamming distortion are successively refinable, and also exhibit two specific continuous-valued problems in which successive refinement is achievable.

Fig. 2. Successive refinement problem.

II. BACKGROUND AND STATEMENT OF THE PROBLEM

We recall the definition of the rate distortion function.

Definition 1 (Rate distortion function): For a probability mass function p(x), x in the source alphabet 𝒳, and distortion function d(x, x̂) on 𝒳 × 𝒳̂, the rate distortion function R(D) is given by

    R(D) = min I(X; X̂),   (1)

where the minimum is over all conditional pmfs p(x̂|x) satisfying Σ p(x) p(x̂|x) d(x, x̂) ≤ D. The distortion rate function D(R) is the inverse function of R(D), which can be characterized as

    D(R) = min E d(X, X̂),   (2)

where the minimum is over all conditional pmfs p(x̂|x) satisfying I(X; X̂) ≤ R.

The rate distortion theorem states that a rate R(D) description of X1, X2, ..., Xn (independent and identically distributed, i.i.d.) suffices to estimate the process to within distortion D. We now describe what we mean by successive refinement.

We consider a sequence of i.i.d. random variables X1, X2, ..., where each Xi is drawn from a source alphabet 𝒳. We are given a reconstruction alphabet 𝒳̂ and consider the distortion measure

    d: 𝒳 × 𝒳̂ → [0, ∞).   (3)

The distortion measure on n-sequences in 𝒳ⁿ × 𝒳̂ⁿ is defined by the average per-letter distortion

    d(x, x̂) = (1/n) Σ_{i=1}^{n} d(xi, x̂i),   (4)

where x = (x1, x2, ..., xn) and x̂ = (x̂1, x̂2, ..., x̂n).

We say that we are successively refining a sequence of random variables X1, ..., Xn when we use a two-stage description that is optimal at each stage. First, as shown in Fig. 2, we describe the X sequence at rate R1 bits per symbol and incur distortion D1. Then we provide an addendum to the first message at rate R2 − R1 bits per symbol, so that the resulting two-stage message has distortion D2. We shall say we have successively refined the sequence X1, ..., Xn if R1 = R(D1) and R2 = R(D2). In other words, we demand that we achieve the rate distortion limit at each of the two stages. In general we will demand that we be able to achieve all points on the rate distortion curve.

Definition 2 (Successive refinement from distortion D1 to distortion D2): We shall say that successive refinement from distortion D1 to distortion D2 is achievable (D1 ≥ D2) if there exists a sequence of encoding functions f1: 𝒳ⁿ → {1, ..., 2^{nR1}} and f2: 𝒳ⁿ → {1, ..., 2^{n(R2−R1)}} and reconstruction functions g1: {1, ..., 2^{nR1}} → 𝒳̂ⁿ and g2: {1, ..., 2^{nR1}} × {1, ..., 2^{n(R2−R1)}} → 𝒳̂ⁿ such that, for X̂1 = g1(f1(X)) and X̂2 = g2(f1(X), f2(X)),

    lim sup E d(X, X̂1) ≤ D1 = D(R1)   (5)

and

    lim sup E d(X, X̂2) ≤ D2 = D(R2),   (6)

where D(R) is the distortion rate function.

Definition 3 (Successive refinement in general): We say that a problem defined by a source distribution p(x) and distortion measure d(x, x̂) is successively refinable in general (or simply successively refinable) if successive refinement from distortion D1 to distortion D2 is achievable for every D1 ≥ D2.

A. Related Problems

Our main tool is the achievable rate region for the multiple descriptions problem investigated by Gersho, Witsenhausen, Wolf, Wyner, Ziv, Ozarow, El Gamal, Cover, Berger, Zhang, and Ahlswede [2]-[6]. In this problem a sender wishes to describe the same sequence of random variables X1, ..., Xn to more than one receiver. The ith receiver will receive description fi(X) ∈ {1, ..., 2^{nRi}}, from which it will produce an estimate (X̂i1, X̂i2, ..., X̂in) of the original message. The distortion associated with representing the source symbol x with the symbol x̂ is given by d(x, x̂), and the distortion between the sequences x = (x1, ..., xn) and x̂ = (x̂1, ..., x̂n) is given by (4).

Fig. 3. Multiple descriptions problem.

An important special case is shown in Fig. 3, where there are three receivers, two of which receive individual descriptions and the third of which has access to both descriptions. Information about the source is transmitted to receivers 0 and 1 at rates R0 and R1 respectively, and the two receivers individually generate estimates X̂0 and X̂1 with distortions D0 and D1 respectively. When they pool their information, a third estimate X̂2 with distortion D2 is generated, with D2 ≤ D0 and D2 ≤ D1. The rate distortion region is the set of achievable quintuples (R0, R1, D0, D1, D2).

The successive refinement problem is a special case of the multiple descriptions problem in which there is no constraint on E d(X, X̂0) and in which we require R1 = R(D1) and R0 + R1 = R(D2).
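The minimization in Definition 1 is a convex program and can be evaluated numerically with the Blahut-Arimoto algorithm. The following sketch is not part of the paper; the function name, the slope parameter beta, and the Bernoulli test source are our own choices. It computes one point on the R(D) curve and checks it against the known binary-Hamming formula R(D) = 1 − h(D).

```python
import numpy as np

def blahut_arimoto(p_x, d, beta, n_iter=500):
    """One point on the R(D) curve of Definition 1 via Blahut-Arimoto.
    beta > 0 is the (negated) slope of R(D); larger beta -> smaller D."""
    q = np.full(d.shape[1], 1.0 / d.shape[1])   # output marginal q(x_hat)
    A = np.exp(-beta * d)                       # unnormalized likelihoods
    for _ in range(n_iter):
        # optimal p(x_hat | x) for the current output marginal q
        P = q * A
        P /= P.sum(axis=1, keepdims=True)
        q = p_x @ P                             # re-estimate output marginal
    D = float(np.sum(p_x[:, None] * P * d))     # achieved distortion
    R = float(np.sum(p_x[:, None] * P *
                     np.log2(P / np.where(q > 0, q, 1.0))))  # I(X;X_hat), bits
    return R, D

# Bernoulli(1/2) source with Hamming distortion, where R(D) = 1 - h(D).
p_x = np.array([0.5, 0.5])
d = np.array([[0.0, 1.0],
              [1.0, 0.0]])
R, D = blahut_arimoto(p_x, d, beta=2.0)
h = lambda t: -t * np.log2(t) - (1 - t) * np.log2(1 - t)
print(R, D, 1 - h(D))   # R agrees with 1 - h(D) at the fixed point
```

Sweeping beta traces out the whole R(D) curve; each fixed point of the iteration achieves the minimum in (1) for its own distortion level.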
We require the following achievable region established by El Gamal and Cover [5].

Theorem 1: A rate distortion quintuple (R0, R1, D0, D1, D2) is achievable if there exists a probability mass distribution p(x) p(x̂0, x̂1, x̂2 | x) with

    E d(X, X̂m) ≤ Dm,   m = 0, 1, 2,

such that

    R0 ≥ I(X; X̂0),   (7)
    R1 ≥ I(X; X̂1),   (8)
    R0 + R1 ≥ I(X; X̂0, X̂1, X̂2) + I(X̂0; X̂1).   (9)

Ahlswede [7] showed in the no excess rate case, i.e., R0 + R1 = R(D2), that the conditions given by El Gamal and Cover are necessary as well as sufficient. Zhang and Berger [6] exhibit a simple counterexample that shows that the conditions of El Gamal and Cover are sometimes not tight when R0 + R1 > R(D2). However, successive refinement is indeed the case of no excess rate, which is the constraint under which (7)-(9) yield the entire rate region.

Also relevant to the successive refinement problem is the closely related conditional rate distortion problem formulated by R. M. Gray [8]-[10], which deals with the question of determining the minimum rate needed to describe a source at distortion D when side information Y is present. (See also Gray and Wyner [11].) Gray defines the conditional rate distortion function R_{X|Y}(D) as

    R_{X|Y}(D) = min I(X; X̂ | Y),   (10)

where the minimum is over all p(x̂|x, y) satisfying Σ p(x, y) p(x̂|x, y) d(x, x̂) ≤ D. Conditional rate distortion theory is relevant to our question because one might say that in successive refinement R1 = R(D1) and R2 − R1 = R_{X|X̂1}(D2). Of course, R_{X|X̂1} is defined only when (X, X̂1) is an i.i.d. sequence, which in general is not the case. Also relevant is the work of Yamamoto [12] and Kaspi and Berger [13].

III. ACHIEVABILITY OF SUCCESSIVE REFINEMENT

A. The Markov Conditions

Here we prove that successive refinement from coarse description X̂1 with distortion D1 to fine description X̂2 with distortion D2 is achievable if and only if the individual rate distortion solutions p(x̂1|x) and p(x̂2|x) for D1 ≥ D2 are such that we can write X → X̂2 → X̂1 as a Markov chain. We do this by considering the successive refinement problem as a special case of the multiple descriptions problem and by applying theorems from [5] and [7]. Let the source distribution p(x) and the distortion d(x, x̂) be given. Let R(D) be defined as in (1).

Theorem 2 (Markovian characterization of successive refinement): Successive refinement with distortions D1 and D2, D1 ≥ D2, can be achieved if and only if there exists a conditional distribution p(x̂1, x̂2|x) with

    E d(X, X̂1) ≤ D1,   (11)
    E d(X, X̂2) ≤ D2,   (12)
    I(X; X̂1) = R(D1),   (13)
    I(X; X̂2) = R(D2),   (14)

and

    p(x̂1, x̂2|x) = p(x̂2|x) p(x̂1|x̂2).   (15)

Remark: The last condition is equivalent to saying that X, X̂2, X̂1 can be written as the Markov chain X → X̂2 → X̂1, or equivalently as X̂1 → X̂2 → X.

Proof:

Sufficiency: Let p(x̂2|x) and p(x̂1|x̂2) satisfy (11)-(14). Let x̂0 be a dummy symbol (some constant). Fix the joint pmf p = p(x̂2|x) p(x̂1|x̂2). The achievable region of Theorem 1 becomes

    R0 ≥ I(X; X̂0) = 0,   (16)
    R1 ≥ I(X; X̂1) = R(D1),   (17)
    R0 + R1 ≥ I(X; X̂0, X̂1, X̂2) + I(X̂0; X̂1)   (18)
            = I(X; X̂2) = R(D2),   (19)

where we have used (13) and (14). But the total rate R2 is given by R2 = R0 + R1. Thus (R1, R2) = (R(D1), R(D2)) is achievable.

Necessity: Successive refinement requires R1 = R(D1) and R0 + R1 = R(D2). This is the no excess rate condition of Ahlswede, for which the region of Theorem 1 is the entire achievable rate region. Thus there must exist a conditional pmf p(x̂0, x̂1, x̂2|x) with E d(X, X̂1) ≤ D1, E d(X, X̂2) ≤ D2, and

    R1 = R(D1) ≥ I(X; X̂1),   (20)
    R0 = R(D2) − R(D1) ≥ I(X; X̂0),   (21)
    R0 + R1 = R(D2) ≥ I(X; X̂0, X̂1, X̂2) + I(X̂0; X̂1).   (22)

The definition of the rate distortion function implies that I(X; X̂1) ≥ R(D1), so (20) is satisfied if and only if I(X; X̂1) = R(D1). Expanding (22) by the chain rule yields

    R(D2) ≥ I(X; X̂0, X̂1, X̂2) + I(X̂0; X̂1)   (23)
          = I(X; X̂2) + I(X; X̂0, X̂1 | X̂2) + I(X̂0; X̂1)   (24)
          ≥ I(X; X̂2)   (25)
          ≥ R(D2),   (26)
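Theorem 2's Markov condition can be checked concretely in a case where the rate distortion solutions are known. For a Bernoulli(1/2) source with Hamming distortion, the R(D)-achieving test channel p(x̂|x) is a binary symmetric channel (BSC) with crossover D. The sketch below is our own illustration, not part of the paper: it verifies numerically that the coarse channel BSC(D1) factors through the fine channel BSC(D2), so X → X̂2 → X̂1 holds and this source is successively refinable.

```python
import numpy as np

def bsc(eps):
    """Transition matrix of a binary symmetric channel, rows indexed by input."""
    return np.array([[1 - eps, eps],
                     [eps, 1 - eps]])

D1, D2 = 0.25, 0.10          # coarse and fine distortions, D1 >= D2
P1 = bsc(D1)                 # p(x_hat1 | x), achieves R(D1)
P2 = bsc(D2)                 # p(x_hat2 | x), achieves R(D2)

# Markov compatibility: find Q = p(x_hat1 | x_hat2) with P1 = P2 @ Q.
Q = np.linalg.solve(P2, P1)

# Q comes out as a valid channel (nonnegative, unit row sums): the BSC with
# crossover (D1 - D2) / (1 - 2*D2), i.e., the degraded-BSC picture.
delta = (D1 - D2) / (1 - 2 * D2)
print(Q)
print(np.allclose(Q, bsc(delta)))   # prints True: Markov compatible
```

If the solved-for Q had negative entries, no interposed channel would exist and the Markov condition of (15) would fail; that is exactly the mechanism exploited by the counterexample of Section IV.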
R Dl Expanding 22 by the chain rule yields,A The Markov Conditions. R D 2Z X R R Z R 23, Here we p rove that successive refinement from coar e. description X with distortion D to fine description X z x x z x 2 l2 J. with distortion D is achievable if and only if the individual. rate distortion solutions p P lx and p P x for D 2 D z x l 2 z x 24. are such that we can write X X X as a Markov chain 2z x 25. We do this by considering the successive refinement problem. as a special case of the multiple descriptions problem and by 2 R D 26. 272 IEEE TRAl ISACTIONS ON INFORMATION THEORY VOL 37 NO 2 MARCH 1991. where the last inequality follows from the definition of the. rate distortion function and inequality 25 follows from the. nonnegativity of mutual information Since the start and end. of the chain are equal all inequalities must be satisfied with. equality which implies that,I 2 2 0 27,z x IR 2 0 C 28. z x R l 0 29 I,Z X g2 R D 30,Fig 4 Choice of D and D in counterexample. Equation 29 is equivalent to the Markovity o f X X2. Xi while 28 forces the Markovity of X X1 X a, Thus from the above we must have X X2 Xi Xc with 0 p 1 and d x P Ix PI We assume for this.
Finally 27 requires X0 to be independent of 2 We example that p 3 2fi Let. conclude that the achievability of R R2 R D R D z eWD. guarantees the existence of a pmf p x p R lx satisfying 32. 15 Thus successive refinement is achievable only by joint where R D is the derivative of the rate distortion function. pmf s of the form p x p R l p f l P Po i e only if at D Then by the Kuhn Tucker conditions it can be shown. there exists p a such that X X2 2 0 that the solution to the rate distortion problem for distortion. B Codes for SuccessiveRefinement,Let 2 and p a 121 be probability mass functions. achieving the bound in Theorem 2 To generate the code. book for the first refinement we draw 2nRl i i d code vectors. according to the distribution lI p R J We index these and. code vectors a i E 1 2nR1 Then for each i we, generate a codebook for the second refinement with 2n Rz Rl. codewords drawn according to the conditional distribution. n p l i We index these code vectors z i j,i E 1 2nR1 j E 1 JMRz RI. We describe the first refinement of a source vector x if z 1 2X1 pII I 1 4Np 6 1 and. with the index i of the codeword that minimizes d x a. Next we describe the second refinement of x by the index I 1 I 1 Pb2. jE l 2n R2 Rl that minimizes d x Z i j Because, successive refinement is a special case of the multiple de 1 z2 p 1 2. scriptions problem the proof of Theorem 1 5 establishes x12. 1 p z2 l p 1, that this method of encoding will achieve the desired rates.
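The tree structure of such codes can be illustrated with a deterministic stand-in. The toy example below is ours, not the random codebooks of the proof: a uniform scalar quantizer whose fine index extends the coarse index, so the refinement message is exactly the extra R2 − R1 bits, as in a tree code. We assume a Uniform[0,1) source and squared error.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random(100_000)                 # i.i.d. Uniform[0,1) source samples

def refine(x, bits):
    """Uniform scalar quantizer with 2**bits levels; the first R1 bits of the
    index form the coarse description, later bits refine it within the cell."""
    idx = np.floor(x * 2**bits).astype(int)
    return (idx + 0.5) / 2**bits        # midpoint-of-cell reconstruction

R1, R2 = 2, 5                           # coarse rate and total rate, bits/symbol
x1 = refine(x, R1)                      # coarse reconstruction
x2 = refine(x, R2)                      # refined reconstruction; its index is
                                        # the coarse index plus R2-R1 extra bits
D1 = np.mean((x - x1)**2)
D2 = np.mean((x - x2)**2)
print(D1, D2)                           # ~ 2**(-2*R)/12 for each stage
print(D1 / D2)                          # ~ 4**(R2 - R1) = 64
```

Each extra bit deepens the tree by one level and cuts the squared error by about a factor of four, mirroring the half-log-per-stage behavior of the Gaussian example in Section V.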
We can now see that the codes which achieve successive refinement have a tree structure, where the coarse descriptions occur near the root of the tree and the finer descriptions near the leaves. Although these tree-structured codes will usually only be optimal asymptotically, in the limit as the block length n grows to infinity, it is possible to use the idea of tree-structured codes in practical finite block length schemes for describing messages with successive refinements. One such method is described in [14], [15].

IV. COUNTEREXAMPLE

In this section we show that not all problems are successively refinable. We now provide a sketch of a counterexample that has its roots in a problem described by Gerrish [16], which forms the basis for an exercise in Berger's textbook [17, p. 61]. A detailed analysis of this counterexample can be found in [18]. Let 𝒳 = 𝒳̂ = {1, 2, 3},

    p(x) = (p/2, 1 − p, p/2),   (31)

with 0 < p < 1, and d(x, x̂) = |x − x̂|. We assume for this example that p < 3 − 2√2. Let

    z = e^{R′(D)},   (32)

where R′(D) is the derivative of the rate distortion function at D. Then by the Kuhn-Tucker conditions it can be shown that the solution to the rate distortion problem for distortion D has two active regions, with the optimal reconstruction distributions and conditionals p(x|x̂) given explicitly in [18]: in one range of z the optimal reconstruction places no mass on the middle symbol, while in the adjacent range all three reconstruction symbols carry positive probability.

Let D2 be in the region for which p_{X̂2}(2) = 0, and let D1 ≥ D2 be chosen to lie in the three-symbol active region. (See Fig. 4: Choice of D1 and D2 in the counterexample.)

We shall argue that we cannot find a (necessarily 3 × 2) transition matrix P_{X̂2|X̂1} such that

    p(x|x̂1) = Σ_{x̂2} p(x|x̂2) p(x̂2|x̂1).   (38)

This is because there is a bottleneck in X̂1 → X̂2 → X: since X̂2 has only two active states, p(x|x̂1) is prevented from having the degrees of freedom necessary to match the three-symbol rate distortion solution at D1. We consider
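The bottleneck can be phrased as a rank constraint: if X̂1 → X̂2 → X holds with X̂2 restricted to two states, then the 3 × 3 matrix of conditionals p(x|x̂1) is a product of a 3 × 2 and a 2 × 3 stochastic matrix, hence has rank at most 2. The matrix below is an illustrative stand-in of our own, not the Kuhn-Tucker solution of the paper; any full-rank conditional matrix exhibits the obstruction.

```python
import numpy as np

# Columns are hypothetical conditional pmfs p(x | x_hat1) for x_hat1 = 1, 2, 3.
# If this matrix factored through a two-state x_hat2, it would equal a
# (3 x 2) @ (2 x 3) product and so could have rank at most 2.
P_x_given_h1 = np.array([[0.8, 0.1, 0.1],
                         [0.1, 0.8, 0.1],
                         [0.1, 0.1, 0.8]])

print(np.linalg.matrix_rank(P_x_given_h1))   # 3: cannot pass through 2 states

# The smallest singular value measures how far the matrix is from rank 2,
# i.e., a lower bound on the error of any two-state factorization.
U, s, Vt = np.linalg.svd(P_x_given_h1)
print(s[2])                                  # strictly positive
```

In the paper's counterexample the obstruction is sharper than rank alone: inspecting the entries of the product shows the required equality can hold only in a degenerate case, as (40) below makes precise.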
the matrix equation corresponding to (38), which we rewrite as

    P_{X|X̂1} = P_{X|X̂2} Q,   (40)

where the columns of P_{X|X̂1} (3 × 3) and P_{X|X̂2} (3 × 2) are the rate distortion conditional pmfs and Q (2 × 3) carries the conditional probabilities p(x̂2|x̂1). We observe that every product P_{X|X̂2} Q has equal entries in its second column, whereas the three-symbol rate distortion solution P_{X|X̂1} on the left-hand side of (40) has equal entries in that column only if z1 = 1. But z1 = e^{R′(D1)} < 1, since R(D) is strictly decreasing. Thus there exists no Markov chain X → X̂2 → X̂1 satisfying the rate distortion conditional marginals p(x|x̂1) and p(x|x̂2). So for 0 < p < 3 − 2√2, the problem p(x) = (p/2, 1 − p, p/2), d(x, x̂) = |x − x̂| is not successively refinable in general.

We can also characterize exactly when successive refinement is achievable from distortion D1 to distortion D2; the details are developed in [18], which also treats the Laplacian source with absolute error distortion d(x, x̂) = |x − x̂|.

A. Gaussian Distribution with Squared Error Distortion

If X is Gaussian, N(0, σ²), then R(D) is achieved by X̂ ~ N(0, σ² − D), p(x|x̂) ~ N(x̂, D). It follows from the work of Gray and Wyner [11] that this problem is successively refinable in our context. It is easy to show for D1 ≥ D2 that

    X̂1 ~ N(0, σ² − D1),   (43)
    X̂2 | X̂1 = x̂1 ~ N(x̂1, D1 − D2),   (44)
    X | X̂2 = x̂2 ~ N(x̂2, D2)   (45)

yields a joint density p(x, x̂1, x̂2) = p(x̂1) p(x̂2|x̂1) p(x|x̂2) having the desired marginal p(x) ~ N(0, σ²) and satisfying the conditions of Theorem 2, thus guaranteeing the achievability of

    (R1, R2) = (R(D1), R(D2)) = ((1/2) log(σ²/D1), (1/2) log(σ²/D2)).   (46)
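The construction in (43)-(45) is easy to check by simulation: drawing the chain X̂1 → X̂2 → X with the stated variances reproduces the N(0, σ²) marginal and hits both target distortions, and the rates in (46) satisfy R(D2) = R(D1) + (1/2) log(D1/D2), so the refinement stage costs exactly the rate increment. A sketch, with parameter values of our own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2, D1, D2, n = 1.0, 0.4, 0.1, 1_000_000

# Rate side of (46): the refinement stage costs R(D2) - R(D1) = 0.5*log(D1/D2).
R = lambda D: 0.5 * np.log2(sigma2 / D)          # Gaussian R(D), in bits
print(R(D2) - R(D1), 0.5 * np.log2(D1 / D2))     # the two are equal

# Distribution side, eqs. (43)-(45): draw the chain X_hat1 -> X_hat2 -> X.
x_hat1 = rng.normal(0.0, np.sqrt(sigma2 - D1), n)        # (43)
x_hat2 = x_hat1 + rng.normal(0.0, np.sqrt(D1 - D2), n)   # (44)
x      = x_hat2 + rng.normal(0.0, np.sqrt(D2), n)        # (45)

print(np.var(x))                 # ~ sigma2: correct source marginal
print(np.mean((x - x_hat1)**2))  # ~ D1: coarse-stage distortion
print(np.mean((x - x_hat2)**2))  # ~ D2: fine-stage distortion
```

The variances telescope, (σ² − D1) + (D1 − D2) + D2 = σ², which is exactly why the Gaussian refinement can be layered without rate loss.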
We can also characterize exactly when successive refine N O a D Z J Label them P l f 2 f 2 R D1. ment is achievable from distortion D to D One interesting Let 2n R Dz R Dl D1 D2 n 2 u s be drawn i i d N. case is described in the following theorem which is true for N O Dt D Z Label them u l u 2 e. Successive Refinement of Information William H R Equitz Member IEEE and Thomas M Cover Fellow IEEE 269 Abslruct The successive refinement of information consists of first approximating data using a few hits of information then iteratively improving the approximation as more and more information is supplied
