STEIN'S METHOD FOR GEOMETRIC APPROXIMATION

EROL A. PEKÖZ, University of California, Los Angeles


Our approach, in contrast, directly yields error bounds for estimating P(W ∈ B) for any set B; in the applications discussed, these bounds specialized to tail probabilities are roughly half those obtainable using the usual Poisson approach, and specialized to point probabilities they are better than the square of the bounds obtainable using the Poisson approach with the triangle inequality. Note that although the approximations we obtain are nearly identical to Poisson approximations, our methods can yield better bounds. For cases where λ is small, see [15] for a method of improving the actual approximations.

In Section 2 we present a theorem bounding the errors for geometric approximations. In Section 3 we apply it to Markov chain hitting times, and in Section 4 we apply it to pattern waiting times. In Section 5 we give a method for improving the approximations in cases where neighboring trials are strongly positively correlated, so that successes occur in clumps.

2. Main result

Let W be the number of failures before the first success in a sequence of dependent Bernoulli trials with p = 1 − q = P(W = 0). The main result below gives bounds on the error of the geometric approximation to the distribution of W. Let 1 + V have the distribution of W given W > 0, and let G_p have a geometric distribution starting at zero with parameter p, so that P(G_p = k) = pq^k for k ≥ 0.

Theorem 1. With the above definitions,

(a) |P(W ∈ B) − P(G_p ∈ B)| ≤ qp^{-1}(1 − q^{|B|}) P(W ≠ V), and

(b) d(W, G_p) ≤ qp^{-1} P(W ≠ V), where d(X, Y) = sup_A |P(X ∈ A) − P(Y ∈ A)|.

To prove the theorem we first need the following lemma; our argument closely follows the argument for the Stein–Chen method given in [5] for Poisson approximation. For any set B and any p = 1 − q we construct the function f = f_{p,B} defined by f(0) = 0 and, for k ≥ 0,

qf(k + 1) − f(k) = 1{k ∈ B} − P(G_p ∈ B),

and it can be easily verified that the solution to the above equations is given by

f(k) = Σ_{i∈B} q^i − Σ_{i∈B, i≥k} q^{i−k}.

Lemma 1. For any j, k ≥ 0, |f(j) − f(k)| ≤ (1 − q^{|B|})/p.

Proof. Note that

f(j) − f(k) = Σ_{i∈B, i≥k} q^{i−k} − Σ_{i∈B, i≥j} q^{i−j},

and since neither term on the right can be larger than Σ_{i=0}^{|B|−1} q^i = (1 − q^{|B|})/p, the result follows.
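As a quick numerical check of this construction (with the arbitrary choices p = 0.3 and B = {1, 3, 4}), the following Python sketch verifies that the displayed f satisfies the recurrence with f(0) = 0, and that the bound of Lemma 1 holds over a range of arguments.

    # Numerical check of the Stein equation and Lemma 1 (illustrative values only).
    p = 0.3                                   # arbitrary success probability
    q = 1.0 - p
    B = {1, 3, 4}                             # arbitrary finite set B
    PGB = sum(p * q**i for i in B)            # P(G_p in B)

    def f(k):
        # f(k) = sum_{i in B} q^i  -  sum_{i in B, i >= k} q^(i-k)
        return sum(q**i for i in B) - sum(q**(i - k) for i in B if i >= k)

    assert f(0) == 0.0
    for k in range(30):
        # Stein equation: q f(k+1) - f(k) = 1{k in B} - P(G_p in B)
        lhs = q * f(k + 1) - f(k)
        rhs = (1.0 if k in B else 0.0) - PGB
        assert abs(lhs - rhs) < 1e-12

    # Lemma 1: |f(j) - f(k)| <= (1 - q^|B|)/p for all j, k
    bound = (1.0 - q**len(B)) / p
    vals = [f(k) for k in range(60)]
    assert max(vals) - min(vals) <= bound + 1e-12
    print("Stein equation and Lemma 1 bound verified numerically")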
Proof of Theorem 1. Substituting k = W in the recurrence relation for f, taking expectations, and noting that f(0) = 0, we obtain

|P(W ∈ B) − P(G_p ∈ B)| = |qE f(W + 1) − E f(W)|
= |qE f(W + 1) − E[f(W) | W > 0] P(W > 0)|
= q|E f(W + 1) − E f(V + 1)|
≤ qE[|f(W + 1) − f(V + 1)| 1{W ≠ V}].

Since Lemma 1 gives |f(W + 1) − f(V + 1)| ≤ (1 − q^{|B|})/p ≤ 1/p, the result follows.

3. Application to Markov chain hitting times

In this section we consider an ergodic Markov chain started according to its stationary distribution π, and use a geometric distribution with parameter p = 1 − q = Σ_{i∈A} π_i to approximate the distribution of the time W of the first visit to the set of states A. The result is as follows, where P^n_{ij} denotes the n-step transition probabilities for the Markov chain.

Theorem 2. Let W = min{i ≥ 0: X_i ∈ A} be the time of the first visit to A for the stationary Markov chain X_i, i ≥ 0, and let p = 1 − q = Σ_{i∈A} π_i. Then

|P(W ∈ B) − P(G_p ∈ B)| ≤ p^{-1}(1 − q^{|B|}) Σ_{n≥1} Σ_{i,j∈A} π_i |P^n_{ij} − π_j|.

Proof. In light of Theorem 1, we proceed by creating a coupling to bound P(W ≠ V). First let Z = (X_0, X_1, ...) be the desired Markov chain started according to its stationary distribution, and let Z' = (Y_0, Y_1, ...) be a coupled copy started according to the stationary distribution restricted to A^c. Letting W = min{i ≥ 0: X_i ∈ A} and V = min{i ≥ 0: Y_{i+1} ∈ A}, note that W has the same distribution as is described in the theorem and 1 + V has the distribution of W given W > 0. Next note that

P(W ≠ V) ≤ Σ_{n≥0} Σ_{j∈A} [P(X_n = j, Y_{n+1} ≠ j) + P(Y_{n+1} = j, X_n ≠ j)].

We then couple the two chains as is done in [5], p. 166, using the maximal coupling of Griffeath [12] (see also [17]), so that

P(X_n = j, Y_{n+1} ≠ j) + P(Y_{n+1} = j, X_n ≠ j) ≤ |π_j − P(Y_{n+1} = j)| ≤ q^{-1} Σ_{i∈A} π_i |P^{n+1}_{ij} − π_j|,

where the last inequality follows from P(Y_{n+1} = j) = q^{-1} Σ_{i∉A} π_i P^{n+1}_{ij} and Σ_i π_i P^{n+1}_{ij} = π_j. We can then deduce

P(W ≠ V) ≤ q^{-1} Σ_{n≥1} Σ_{i,j∈A} π_i |P^n_{ij} − π_j|,

and the result follows upon application of Theorem 1.

Remark. The bound given by Theorem 2 can be compared with the related bound of Theorem 8.H presented in [5] for a Poisson approximation to the distribution of the number of visits to A by time n, in the case where tail probabilities of W are of interest.
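As a concrete illustration of Theorem 2 (the two-state transition matrix below is an arbitrary choice), the following sketch computes the exact law of W for a stationary two-state chain with A = {1}, its total variation distance from G_p, and a truncated evaluation of the right-hand side of the theorem; the computed distance falls well below the bound.

    import numpy as np

    # Illustrative two-state chain; A = {1}. The entries are arbitrary.
    P = np.array([[0.9, 0.1],
                  [0.5, 0.5]])
    pi1 = P[0, 1] / (P[0, 1] + P[1, 0])       # stationary probability of state 1
    pi0 = 1.0 - pi1
    p, q = pi1, 1.0 - pi1

    N = 2000                                   # truncation level; both tails are geometric
    # Exact law of W for the stationary chain:
    # P(W = 0) = pi1, and P(W = n) = pi0 * P00^(n-1) * P01 for n >= 1.
    PW = [pi1] + [pi0 * P[0, 0]**(n - 1) * P[0, 1] for n in range(1, N)]
    PG = [p * q**n for n in range(N)]          # geometric G_p started at zero

    d_exact = 0.5 * sum(abs(a - b) for a, b in zip(PW, PG))

    # Right-hand side of Theorem 2 (d version): (1/p) * sum_n pi_1 * |P^n_{11} - pi_1|
    rhs, Pn = 0.0, P.copy()
    for _ in range(N):
        rhs += pi1 * abs(Pn[1, 1] - pi1) / p
        Pn = Pn @ P

    print(f"d(W, G_p) ~= {d_exact:.4f}   Theorem 2 bound ~= {rhs:.4f}")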
Example. This example is taken from [5], p. 168. Let X^1, X^2, ..., X^m be a collection of m independent two-state Markov chains on {0, 1}, each with P(X^i_{n+1} = 0 | X^i_n = 0) = p + ε(1 − p) and P(X^i_{n+1} = 1 | X^i_n = 1) = εp + (1 − p), for some 0 ≤ ε < 1. Note that when ε = 0 the sequence X^i_j, j ≥ 0, is just a sequence of i.i.d. Bernoulli random variables, but for ε > 0 there is a greater tendency for runs to occur. Starting from stationarity, we are interested in the distribution of W, the time when all of the X^i are simultaneously zero. Next define a Markov chain with state space {0, 1}^m and n-step transition probabilities P^n_{ij}, whose ith coordinate at time n stores the value of X^i at time n. Calculations from [5] show that P(W = 0) = p^m and, letting j be the state where all of the X^i are zero, that Σ_{n≥1} |P^n_{jj} − p^m| ≤ 2εp/(1 − ε) when m ≥ 2 and ε is sufficiently small, so Theorem 2 gives d(W, G_{p^m}) ≤ 2εp/(1 − ε) under these conditions on ε and m.

4. Application to sequence patterns

A biased coin is flipped in succession. What is the distribution of the number of flips preceding the first appearance of a given pattern of heads and tails? Here we approximate this distribution using a geometric distribution. Brown and Ge [7], using reliability-theory methods, obtain the result of Corollary 1(b) below for the restricted class of non-overlapping patterns. The results of [4], Theorem 8.F of [5], and [10] apply to general patterns, but only directly bound the errors when estimating tail probabilities of the waiting time distribution. Though these can then be used with the triangle inequality to bound other probabilities, Theorem 3 can give much better bounds.

Let X_0, X_1, ... be a sequence of i.i.d. Bernoulli trials and let some desired k-digit binary pattern be specified. Let I_i be the indicator of the event that the desired pattern appears in (X_i, X_{i+1}, ..., X_{i+k−1}), and let W = min{i: I_i = 1} be the number of trials preceding the first appearance of the pattern. Also, for 1 ≤ i ≤ k − 1, let c_i = P(I_i = 1 | I_0 = 1) be the probability that, given an occurrence of the pattern, there is also an overlapping occurrence i trials later. For example, with Bernoulli(1/2) trials the pattern 11011 gives c_1 = c_2 = 0, c_3 = 1/8 and c_4 = 1/16.

Theorem 3. For k-digit patterns, with p = 1 − q = P(W = 0),

(a) |P(W ∈ B) − P(G_p ∈ B)| ≤ (1 − q^{|B|}) Σ_{i=1}^{k−1} |c_i − p|, and

(b) d(W, G_p) ≤ Σ_{i=1}^{k−1} |c_i − p|.

Proof. Define a Markov chain with 2^k states, so that the state at time n encodes the outcomes of the k consecutive Bernoulli trials starting with trial n. Let A consist of the single state corresponding to the desired pattern. The result then follows from Theorem 2 and the fact that

P^n_{AA} = c_n for 1 ≤ n ≤ k − 1, and P^n_{AA} = p for n ≥ k.
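The overlap probabilities are simple to compute directly. The following sketch (an added illustration; the fair coin and the pattern 11011 from the example above are the only inputs) reproduces c_1 = c_2 = 0, c_3 = 1/8, c_4 = 1/16 and evaluates the bound of Theorem 3(b).

    # Overlap probabilities c_i and the Theorem 3(b) bound for a given pattern.
    alpha = 0.5                      # P(trial = 1); fair coin as in the text
    pattern = "11011"
    k = len(pattern)

    def prob(bits):                  # probability of observing a given string of bits
        out = 1.0
        for b in bits:
            out *= alpha if b == "1" else 1.0 - alpha
        return out

    p = prob(pattern)                # p = P(W = 0)
    c = {}
    for i in range(1, k):
        # Overlap i trials later: the last k-i symbols of the pattern must equal
        # its first k-i symbols, and the final i symbols must then appear afresh.
        c[i] = prob(pattern[k - i:]) if pattern[i:] == pattern[:k - i] else 0.0

    bound = sum(abs(c[i] - p) for i in range(1, k))   # Theorem 3(b)
    print(c)          # {1: 0.0, 2: 0.0, 3: 0.125, 4: 0.0625}
    print(bound)      # d(W, G_p) is at most this value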
We next give an immediate corollary for non-overlapping patterns, patterns which have c_i = 0 for all i.

Corollary 1. For a non-overlapping k-digit pattern,

(a) |P(W ∈ B) − P(G_p ∈ B)| ≤ (k − 1)(1 − q^{|B|})p, and

(b) d(W, G_p) ≤ (k − 1)p.

Remark. The result of Corollary 1(b) was obtained by Brown and Ge [7]. Note that for point probabilities Corollary 1(a) gives |P(W = k) − pq^k| ≤ (k − 1)p².
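As a numerical illustration of the Remark (the non-overlapping pattern 0111 and the fair coin are arbitrary choices), the following brute-force sketch computes P(W = k) exactly by enumerating the first 2k trials, which determine the event {W = k}, and compares the point-probability error with (k − 1)p².

    from itertools import product

    pattern = "0111"                 # a non-overlapping pattern (c_i = 0 for all i)
    k = len(pattern)
    p = 0.5**k                       # fair coin: p = P(W = 0) = 2^(-k)
    q = 1.0 - p

    # P(W = k): the first occurrence starts exactly k trials in; this event is
    # determined by the first 2k trials, so enumerate them all (fair coin).
    count = 0
    for bits in product("01", repeat=2 * k):
        s = "".join(bits)
        if s.find(pattern) == k:
            count += 1
    PWk = count / 2**(2 * k)

    print(abs(PWk - p * q**k))       # point-probability error
    print((k - 1) * p**2)            # bound from the Remark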
For patterns where many overlaps are possible, the bound from Theorem 3 can be improved using a geometric with larger mean. The following corollary illustrates this, as does Corollary 3 below.

Corollary 2. Let W count the number of trials preceding the first appearance of a run of k consecutive 1's in a sequence of i.i.d. Bernoulli(α) trials. Letting p = (1 − α)α^k, we have

d(W, G_p) ≤ kp + α^k.

Proof. Consider instead the non-overlapping pattern consisting of a 0 followed by k 1's. We then apply Corollary 1 and add α^k, because this new pattern's appearance time differs from W − 1 only if the first k trials are 1's. Finally, note that a result similar to Theorem 3 and Corollary 1 can be established using 1 + W instead of W.

Remark. The bound in Corollary 2 is half the bound in [5], p. 164. For a simple derivation of the exact distribution of W defined as in Corollary 2, see [16].

5. Improving approximations when clumping occurs

For a Markov chain X, when visits to A generally occur in clumps, that is, when P^n_{ij} is much larger than π_j for i, j ∈ A and n small, the bound of Theorem 2 can be poor. In this case the hitting time generally has mean much larger than 1/p, where p = Σ_{i∈A} π_i. In the case where the Markov chain is m-dependent, i.e. when X_i is independent of X_j for |i − j| > m, much better bounds can be obtained as follows, in the same spirit as Corollary 2 above, using a geometric with a larger but less easily calculable mean.

Theorem 4. Let W = min{i ≥ 0: X_i ∈ A} be the time of the first visit to A for the stationary m-dependent Markov chain X_i, i ≥ 0, let p = 1 − q = Σ_{i∈A} π_i, and let a = P(W = m). Then

d(W, G_a) ≤ 2mp + ma.

Proof. Consider instead starting the chain at time −m, and let Z = min{i ≥ 0: X_{i−m} ∉ A, ..., X_{i−1} ∉ A, X_i ∈ A}; note that Z can be viewed as the hitting time of a set of states A' for a related stationary 2m-dependent Markov chain whose state at time n encodes the values X_{n−m}, ..., X_n. We next apply Theorem 2 to this related new chain, letting P'^n_{ij} and π'_j refer to this new chain, while noting that P(Z = 0) = P(W = m) = a. Since this chain is 2m-dependent we have P'^n_{ij} = π'_j for n > 2m, and also, for i, j ∈ A', we have P'^n_{ij} = 0 when n ≤ m. Note that every state in A' is of the form (a_0, ..., a_m), where a_0 ∉ A, ..., a_{m−1} ∉ A, a_m ∈ A, and for a given state j = (a_0, ..., a_m) ∈ A' there is a corresponding state k = a_m ∈ A. Since the original chain was m-dependent, we have for m < n ≤ 2m and i, j ∈ A' that P'^n_{ij} ≤ P(X_n = k) = π_k, where k is the state in A corresponding to j ∈ A'. These together imply

Σ_{j∈A'} |P'^n_{ij} − π'_j| ≤ p for 1 ≤ n ≤ m,  ≤ a for m < n ≤ 2m,  and = 0 for n > 2m.

Theorem 2 then gives d(Z, G_a) ≤ mp + ma, and the result follows upon noting that Z differs from W only if a visit to A happens before time 0 in the original chain, an event of probability at most mp.
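To see the gain numerically, the following sketch (an added illustration; the fair coin, k = 5 and the truncation level are arbitrary choices) computes the exact law of the waiting time W for a run of k ones via the obvious run-length chain, takes a = P(W = k − 1) as in Theorem 4 with m = k − 1, and compares the total variation distances from G_p (with p = P(W = 0)) and from G_a; the geometric with the larger mean is the visibly better fit.

    import numpy as np

    alpha, k, N = 0.5, 5, 20000        # fair coin, run of k ones, truncation level
    # Run-length chain on states 0..k; state k (run completed) is absorbing.
    T = np.zeros((k + 1, k + 1))
    for s in range(k):
        T[s, s + 1] = alpha            # another 1 extends the current run
        T[s, 0] = 1.0 - alpha          # a 0 resets it
    T[k, k] = 1.0

    # comp[n-1] = P(the run is first completed at trial n)
    dist = np.zeros(k + 1)
    dist[0] = 1.0
    comp, prev = [], 0.0
    for n in range(1, N + 1):
        dist = dist @ T
        comp.append(dist[k] - prev)
        prev = dist[k]

    PW = comp[k - 1:]                  # PW[w] = P(W = w): the run ends at trial W + k
    p = alpha**k                       # p = P(W = 0)
    a = PW[k - 1]                      # a = P(W = k - 1), as in Theorem 4 with m = k - 1

    def tv_from_geometric(PW, par):
        PG = [par * (1.0 - par)**n for n in range(len(PW))]
        return 0.5 * sum(abs(x - y) for x, y in zip(PW, PG))

    print("d(W, G_p) ~=", round(tv_from_geometric(PW, p), 4))
    print("d(W, G_a) ~=", round(tv_from_geometric(PW, a), 4))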
As mentioned before, the bound from Theorem 3 can be poor for long patterns where many overlaps are possible, e.g. the pattern 101010101. To improve the bound in this case, Theorem 4 can be immediately applied (a situation with (k − 1)-dependence).

Corollary 3. Let W count the number of i.i.d. Bernoulli trials preceding the first appearance of a given k-digit binary pattern, with p = P(W = 0) and a = P(W = k − 1). Then

d(W, G_a) ≤ 2(k − 1)p + (k − 1)a.

Remark. A result related to Corollary 3 for Poisson approximation appears as Theorem 2.4 in [11].

To compute the value of a = P(W = k − 1) for k-digit patterns, the recursion in the following lemma can be used.

Lemma 2. For k-digit patterns, with c_i defined as in Theorem 3, the following recursion holds for 0 ≤ i ≤ k − 1:

P(W = i) = p − Σ_{j=0}^{i−1} c_{i−j} P(W = j).

Proof. Writing {X_i ∈ A} for the event that the pattern occurs (not necessarily for the first time) starting with trial i,

P(X_i ∈ A) = p = P(X_i ∈ A, W = i) + Σ_{j=0}^{i−1} P(X_i ∈ A, W = j) = P(W = i) + Σ_{j=0}^{i−1} c_{i−j} P(W = j),

and the result follows upon rearranging.
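The recursion is straightforward to implement. The sketch below (an added illustration; the pattern 11011 and the fair coin are the same arbitrary choices as above) computes P(W = i) for 0 ≤ i ≤ k − 1, hence a = P(W = k − 1) as needed in Corollary 3, and cross-checks P(W = 1) by direct enumeration of the first k + 1 trials.

    from itertools import product

    alpha, pattern = 0.5, "11011"
    k = len(pattern)

    def prob(bits):
        out = 1.0
        for b in bits:
            out *= alpha if b == "1" else 1.0 - alpha
        return out

    p = prob(pattern)
    c = {i: (prob(pattern[k - i:]) if pattern[i:] == pattern[:k - i] else 0.0)
         for i in range(1, k)}

    # Lemma 2: P(W = i) = p - sum_{j < i} c_{i-j} P(W = j), for 0 <= i <= k-1.
    PW = [p]
    for i in range(1, k):
        PW.append(p - sum(c[i - j] * PW[j] for j in range(i)))
    a = PW[k - 1]                      # the quantity needed in Corollary 3

    # Cross-check P(W = 1) by enumerating the first k+1 fair-coin trials.
    exact = sum(1 for bits in product("01", repeat=k + 1)
                if "".join(bits).find(pattern) == 1) / 2**(k + 1)
    print(PW, a, exact)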
Acknowledgements

I would like to thank the referee for many comments which greatly improved the presentation of this paper. I would also like to thank Sheldon Ross and Michael Klass for very many helpful comments and inspirational discussions, and thanks are also due to Jim Pitman, David Aldous and David Freedman for their encouragement.

References

[1] Aldous, D. (1989) Probability Approximations via the Poisson Clumping Heuristic. Springer, Berlin.
[2] Aldous, D. (1982) Markov chains with almost exponential hitting times. Stoch. Proc. Appl. 13.
[3] Barbour, A. D., Chen, L. H. Y. and Loh, W. L. (1992) Compound Poisson approximation for non-negative random variables via Stein's method. Ann. Prob. 20, 1843-1866.
[4] Barbour, A. D. and Grübel, R. (1995) The first divisible sum. J. Theoret. Prob. 8, 39-47.
[5] Barbour, A. D., Holst, L. and Janson, S. (1992) Poisson Approximation. Oxford University Press.
[6] Blom, G. and Thorburn, D. (1982) How many random digits are required until given sequences are obtained? J. Appl. Prob. 19, 518-531.
[7] Brown, M. and Ge, G. (1984) On the waiting time for the first occurrence of a pattern. In Reliability Theory and Models, ed. M. Abdel-Hameed, E. Çinlar and J. Quinn.
[8] Chryssaphinou, O. and Papastavridis, S. (1988) A limit theorem for the number of non-overlapping occurrences of a pattern in a sequence of independent trials. J. Appl. Prob. 25, 428-431.
[9] Ehm, W. (1991) Binomial approximation to the Poisson binomial distribution. Statist. Prob. Lett. 11.
[10] Godbole, A. P. (1991) Poisson approximations for runs and patterns of rare events. Adv. Appl. Prob. 23, 851-865.
[11] Godbole, A. P. and Schaffner, A. A. (1993) Improved Poisson approximations for word patterns. Adv. Appl. Prob. 25, 334-347.
[12] Griffeath, D. (1976) Partial coupling and loss of memory for Markov chains. Ann. Prob. 4, 550-558.
[13] Keilson, J. (1979) Markov Chain Models: Rarity and Exponentiality. Springer, Berlin.
[14] Loh, W. L. (1992) Stein's method and multinomial approximation. Ann. Appl. Prob. 2, 536-554.
[15] Peköz, E. and Ross, S. M. (1994) Improving Poisson approximations. Prob. Eng. Inf. Sci. 8, 449-462.
[16] Peköz, E. and Ross, S. M. (1995) A simple derivation of exact reliability formulas for linear and circular consecutive-k-of-n:F systems. J. Appl. Prob. 32, 554-557.
[17] Pitman, J. W. (1976) On coupling of Markov chains. Z. Wahrscheinlichkeitsth. 35, 315-322.
