
Note on the regeneration-based bootstrap for atomic Markov chains

Bertail, Patrice
Laboratoire de Statistiques, CREST

Clémençon, Stéphan
MODALX, Université Paris X Nanterre

Laboratoire de Probabilités et Modèles Aléatoires, UMR CNRS 7599 - Universités Paris VI et VII

Abstract

In this paper, we show how the original Bootstrap method introduced by Datta & McCormick (1993), namely the regeneration-based Bootstrap, for approximating the sampling distribution of sample mean statistics in the atomic Markovian setting may be modified, so as to be second order correct. We prove that the drawback of the original construction mainly relies on a wrong estimation of the skewness of the sampling distribution, and that it is possible to correct it by suitable standardization of the regeneration-based bootstrap statistic and recentering of the bootstrap distribution. An asymptotic result establishing the second order accuracy of this bootstrap estimate up to $O(n^{-1}\log(n))$ (close to the rate obtained in an i.i.d. setting) is also stated under weak moment assumptions.

Résumé

In this article, we show how the regenerative bootstrap method introduced by Datta and McCormick (1993) can be modified so as to obtain second order validity results for estimating linear functionals based on the observation of a stationary, atomic Markov chain. We show that the original method does not correctly estimate the skewness coefficient of the distribution, but that it is possible to correct this problem by an adequate standardization and recentering. We thereby obtain the second order validity of this form of bootstrap with an error of order $O_P(n^{-1}\log(n))$, within a factor $\log(n)$ of the i.i.d. case, under very weak moment conditions.

1 Introduction

Among the numerous methods that have been suggested to adapt Efron's Bootstrap to weakly dependent settings, the view underlying the construction proposed in Datta & McCormick (1993) (see also Athreya & Fuh (1989)) for bootstrapping sample mean statistics in the atomic Markovian framework is one of the most interesting ones. Curiously, the beautiful ideas introduced in this paper, based on the renewal properties of Markov chains with an atom, do not seem to be widely known and used in the statistical and econometric Bootstrap literature. This may be partly explained by the fact that they only consider the restrictive case of Markov chains possessing a known atom, under rather strong assumptions regarding ergodicity properties. Moreover, because of an inappropriate standardization, the method proposed in Datta & McCormick (1993) is not second order correct and performs poorly in applications. The purpose of this paper is, on the one hand, to explain why the original regeneration-based Bootstrap procedure fails to be second order accurate and, on the other hand, to show how it is possible to correct it by a specific standardization and recentering. It is noteworthy that the regeneration-based bootstrap, modified in this way, achieves an accuracy, in the case when the chain is stationary, very close to the one obtained in the i.i.d. setting, which is not the case for other Bootstrap methods introduced to deal with the dependent case (see Götze & Künsch (1996) for instance).

For the sake of simplicity, we only focus here on the case of the sample mean statistic renormalized by its true asymptotic variance. In section 2, the atomic Markovian framework we consider is set out and some notations are given for later use. In section 3, a preliminary result is established, which provides an explicit expression for the asymptotic skewness coefficient of the sample mean statistic in our setting. Our proposal, based on this preliminary result, for correcting the original regeneration-based bootstrap is described in section 4. An asymptotic result proving the second order accuracy of this bootstrap procedure, with a remainder of order $O_P(n^{-1}\log(n))$, is also stated. The proof is given in section 5.

2 Assumptions and notation

Here and throughout, $X = (X_n)_{n \in \mathbb{N}}$ is a time-homogeneous positive recurrent Markov chain valued in a countably generated state space $(E, \mathcal{E})$ with transition probability $\Pi(x, dy)$ and stationary probability distribution $\mu(dy)$ (see Revuz (1984) for an exhaustive treatment of the basic concepts of Markov chain theory). For any probability distribution $\nu$ on $(E, \mathcal{E})$ (respectively, for any $x \in E$), let $P_\nu$ (resp., $P_x$) denote the probability on the underlying space such that $X_0 \sim \nu$ (resp., $X_0 = x$), and let $E_\nu(\cdot)$ (resp., $E_x(\cdot)$) denote the $P_\nu$-expectation (resp., the $P_x$-expectation). In what follows, we will suppose that the underlying probability space is the canonical space of the Markov chain, which is by no means restrictive for our results. Let us assume further that the chain possesses a known accessible atom $A$, i.e. a measurable set $A \in \mathcal{E}$ such that $\mu(A) > 0$ and $\Pi(x, \cdot) = \Pi(y, \cdot)$ for all $x, y$ in $A$. We will denote by $P_A$ (respectively, by $E_A(\cdot)$) the probability measure on the underlying space such that $X_0 \in A$ (resp., the $P_A$-expectation). We denote the consecutive return times to the atom $A$ by

$$\tau_A = \tau_A(1) = \inf\{n \geq 1,\; X_n \in A\}, \qquad \tau_A(k+1) = \inf\{n > \tau_A(k),\; X_n \in A\}, \ \text{for } k \geq 1.$$

Throughout this paper, $I\{A\}$ denotes the indicator function of the event $A$. In this setting, the stationary distribution may be represented as an occupation measure (see Theorem 17.1.7 in Meyn & Tweedie (1996) for instance): for any $B \in \mathcal{E}$,

$$\mu(B) = E_A(\tau_A)^{-1} E_A\Big(\sum_{i=1}^{\tau_A} I\{X_i \in B\}\Big).$$

Moreover, the study of the asymptotic properties of such a Markov chain is made much easier by applying the so-called regenerative method. This consists in dividing its trajectories into blocks corresponding to pieces of the sample path between successive visits to the atom $A$, $B_k = (X_{\tau_A(k)+1}, \ldots, X_{\tau_A(k+1)})$, $k \geq 1$, and in exploiting the fact that, by virtue of the strong Markov property, the $B_k$'s are i.i.d. random variables valued in the torus $\mathbb{T} = \cup_{n=1}^{\infty} E^n$. In the sequel, we shall denote by $l(B_k) = \tau_A(k+1) - \tau_A(k)$ the length of the block $B_k$, $k \geq 1$, and, for any measurable function $f: E \to \mathbb{R}$, by $f(B_k) = \sum_{i=\tau_A(k)+1}^{\tau_A(k+1)} f(X_i)$, $k \geq 1$.
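To make the block decomposition concrete, here is a minimal sketch in Python (our illustration, not part of the paper): it simulates a toy atomic chain, a random walk on the non-negative integers reflected at 0 with downward drift, for which the singleton $A = \{0\}$ is an accessible atom, and splits the observed path into the non-regenerative initial block, the i.i.d. regeneration blocks $B_1, \ldots, B_{l_n-1}$, and the last incomplete block. All identifiers are illustrative.

```python
import random

def simulate_chain(n, p_up=0.4, seed=0):
    """Toy atomic chain: reflected random walk on {0, 1, 2, ...} with
    downward drift (p_up < 1/2), hence positive recurrent; the singleton
    A = {0} is an accessible atom."""
    rng = random.Random(seed)
    x, path = 0, []
    for _ in range(n):
        x = max(x + (1 if rng.random() < p_up else -1), 0)
        path.append(x)
    return path

def regeneration_blocks(path, atom=0):
    """Split (X_1, ..., X_n) at the visits to the atom: returns the initial
    segment B_0, the i.i.d. blocks B_1, ..., B_{l_n - 1} (each ending with a
    visit to the atom), and the last, incomplete block."""
    hits = [i for i, x in enumerate(path) if x == atom]  # visit times to A
    if len(hits) < 2:
        return path, [], []
    b0 = path[: hits[0] + 1]
    blocks = [path[h + 1 : k + 1] for h, k in zip(hits, hits[1:])]
    last = path[hits[-1] + 1 :]
    return b0, blocks, last

path = simulate_chain(10_000)
b0, blocks, last = regeneration_blocks(path)
```

By the strong Markov property the elements of `blocks` are i.i.d.; the common distribution of their lengths is that of $\tau_A$ under $P_A$.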

We point out that the atomic setting includes the whole class of Harris recurrent Markov chains with a countable state space (for which any recurrent state is an accessible atom), as well as many other specific Markovian models, widely used for modeling storage and queuing systems for instance (refer to Asmussen (1987) for an overview).

3 Preliminary result

Let $f$ be a real valued function defined on the state space $(E, \mathcal{E})$ and set $S_n(f) = \sum_{i=1}^n f(X_i)$ and $S_{\tau_A}(f) = \sum_{i=1}^{\tau_A} f(X_i)$. Under the assumption that the expectation $E_A(S_{\tau_A}(|f|))$ is finite, the function $f$ is clearly $\mu$-integrable (note that $\mu(f) = E_\mu(f(X_1)) = \mu(A) E_A(S_{\tau_A}(f))$ by the representation of $\mu$ using the atom $A$) and, with the additional assumption that the initial probability distribution $\nu$ is such that $P_\nu(\tau_A < \infty) = 1$, the regenerative method mentioned above shows straightforwardly that $\mu_n(f) = S_n(f)/n$ is a strongly consistent estimator of the parameter $\mu(f)$ under $P_\nu$: $S_n(f)/n \to \mu(f)$, $P_\nu$-a.s., as $n \to \infty$. Moreover, under the further assumptions that the expectations $E_A(\tau_A^2)$, $E_\nu(\tau_A)$, $E_A(S_{\tau_A}(|f|)^2)$ and $E_\nu(S_{\tau_A}(|f|))$ are finite, the CLT holds too under $P_\nu$:

$$n^{1/2}(S_n(f)/n - \mu(f)) \xrightarrow{d} \mathcal{N}(0, \sigma_f^2), \quad \text{as } n \to \infty,$$

with limiting variance $\sigma_f^2 = \mu(A) E_A(S_{\tau_A}(f - \mu(f))^2)$ (see Theorems 17.2.1 and 17.2.2 in Meyn & Tweedie (1996) for instance).
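As a numerical illustration (ours, not the paper's), the occupation-measure identity and the variance formula above can be checked by plug-in on a toy chain: a random walk on the non-negative integers reflected at 0 with up-probability 0.4, whose atom is $A = \{0\}$ and whose stationary mean for $f(x) = x$ equals 2. All names below are hypothetical.

```python
import random

def chain(n, p_up=0.4, seed=1):
    # Reflected random walk on {0, 1, ...}: atom A = {0}; the stationary law
    # is geometric with ratio p_up/(1 - p_up) = 2/3, so mu(f) = 2 for f(x) = x
    # and mu(A) = 1/3.
    rng, x, out = random.Random(seed), 0, []
    for _ in range(n):
        x = max(x + (1 if rng.random() < p_up else -1), 0)
        out.append(x)
    return out

path = chain(200_000)
hits = [i for i, v in enumerate(path) if v == 0]
blocks = [path[h + 1 : k + 1] for h, k in zip(hits, hits[1:])]

mu_A = len(hits) / len(path)            # estimates mu(A) = 1 / E_A(tau_A)
mu_f = sum(path) / len(path)            # estimates mu(f) = mu(A) E_A(S_tau(f))
# sigma_f^2 = mu(A) E_A( S_tau(f - mu(f))^2 ), estimated by plug-in:
block_sums = [sum(v - mu_f for v in b) for b in blocks]
sigma2_f = mu_A * sum(s * s for s in block_sums) / len(blocks)
```

The plug-in simply replaces $\mu(A)$ by the visit frequency $l_n/n$ and the $E_A$ moments by averages over the observed regeneration blocks.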

Even if it entails replacing $f$ by $f - \mu(f)$, we assume that $\mu(f) = 0$ in the remainder of this section. The following theorem gives two different forms for the asymptotic skewness of $n^{-1/2} S_n(f)$, which determines the main term in its Edgeworth expansion (see Datta & McCormick (1993b)).

Theorem 1 If the series $\sum_{i \geq 1} \{E_\mu(f^2(X_1) f(X_{i+1})) + E_\mu(f(X_1) f^2(X_{i+1}))\}$ and $\sum_{i, j \geq 1} E_\mu(f(X_1) f(X_{i+1}) f(X_{i+j+1}))$ converge absolutely, then we have:

$$\lim_{n \to \infty} n^{-1} E_\mu((S_n(f))^3) = E_\mu(f(X_1)^3) + 3 \sum_{i=1}^{\infty} \{E_\mu(f^2(X_1) f(X_{i+1})) + E_\mu(f(X_1) f^2(X_{i+1}))\} + 6 \sum_{i, j=1}^{\infty} E_\mu(f(X_1) f(X_{i+1}) f(X_{i+j+1})). \quad (1)$$

Moreover, if the expectations $E_A(\tau_A^4)$ and $E_A(S_{\tau_A}(|f|)^4)$ are finite, $\sigma_f^2 > 0$ and $\lim_{|t| \to \infty} |E_A(\exp(it S_{\tau_A}(f)))| < 1$, then we have also:

$$\lim_{n \to \infty} n^{-1} E_\mu((S_n(f))^3) = E_A(\tau_A)^{-1} \{E_A(S_{\tau_A}(f)^3) - 3 \sigma_f^2 E_A(\tau_A S_{\tau_A}(f))\}. \quad (2)$$


Proof. For all $n \geq 1$ we have by stationarity

$$n^{-1} E_\mu(S_n(f)^3) = E_\mu(f(X_1)^3) + 3 n^{-1} \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} \big\{E_\mu(f(X_1)^2 f(X_{j-i+1})) + E_\mu(f(X_1) f(X_{j-i+1})^2)\big\} + 6 n^{-1} \sum_{i=1}^{n-2} \sum_{j=i+1}^{n-1} \sum_{k=j+1}^{n} E_\mu(f(X_1) f(X_{j-i+1}) f(X_{k-j+1}))$$

$$= E_\mu(f(X_1)^3) + 3 \sum_{l=1}^{n-1} \frac{n-l}{n} \big\{E_\mu(f(X_1)^2 f(X_{l+1})) + E_\mu(f(X_1) f(X_{l+1})^2)\big\} + 6 \sum_{l=1}^{n-2} \sum_{m=1}^{n-l-1} \frac{n-l-m}{n} E_\mu(f(X_1) f(X_{l+1}) f(X_{m+l+1}))$$

and thus one clearly gets (1) from the convergence of the right hand side as $n \to \infty$. Besides, under the assumption that the block moment conditions

$$E_A(\tau_A^4) < \infty, \qquad E_A(S_{\tau_A}(|f|)^4) < \infty$$

where $\mathcal{F}_{X_i}$ denotes the $\sigma$-field generated by the $X_j$'s, for $j \leq i$. Finally, we have

$$E_\mu(\tau_A^3) = \mu(A) E_A\Big(\sum_{i=1}^{\tau_A} (\tau_A - i)^3\Big) \leq \mu(A) E_A(\tau_A^4) < \infty.$$

In a similar fashion, we have

$$E_\mu(S_{\tau_A}(|f|)^3) = \mu(A) E_A\Big(\sum_{i=1}^{\tau_A}\Big(\sum_{j=i}^{\tau_A} |f(X_j)|\Big)^3\Big) \leq \mu(A) E_A(\tau_A S_{\tau_A}(|f|)^3) \leq \mu(A) (E_A(\tau_A^4))^{1/4} (E_A(S_{\tau_A}(|f|)^4))^{3/4} < \infty$$

by Hölder's inequality. Moreover, we have $E_\mu(f(X_1)) = \mu(f)$. By the representation of the stationary distribution using the atom $A$ again, we have

$$E_\mu\Big(\sum_{i=1}^{\tau_A} f(X_i)\Big) = E_A\Big(\sum_{i=1}^{\tau_A} E_{X_i}\Big(\sum_{k=1}^{\tau_A} f(X_k)\Big)\Big) = E_A\big(\cdots\big).$$

The series $\sum_{k \geq 1} \{f^2(X_k) - E_\mu(f^2(X_k))\}$ converges absolutely under $P_\mu$; hence we have that $\sum_{k=1}^{\infty} f^2(X_k) = \infty$, $P_\mu$ a.s. (note that $f^2(X_k)$ is not centered under $P_\mu$).

Thus, this crucial observation shows that exchanging the expectation and the summation in $\sum_{i=1}^{\infty} E_\mu(f(X_1) f^2(X_{i+1}))$, as done in Datta & McCormick (1993a), is not possible. Observe further that such an illicit operation would allow one to derive the false identity claiming that the sum

$$E_\mu(f(X_1)^3) + 3 \sum_{i=1}^{\infty} \{E_\mu(f^2(X_1) f(X_{i+1})) + E_\mu(f(X_1) f^2(X_{i+1}))\} + 6 \sum_{i, j=1}^{\infty} E_\mu(f(X_1) f(X_{i+1}) f(X_{i+j+1}))$$

equals the term $E_A(\tau_A)^{-1} E_A(S_{\tau_A}(f)^3)$ (or, equivalently, that $k_{3,f}$ equals $\sigma_f^{-3} E_A(\tau_A)^{-1} E_A(S_{\tau_A}(f)^3)$), on which the argument of Datta & McCormick (1993a), for studying the second order properties of the regeneration-based bootstrap methodology they introduced, is based. As will be shown precisely in the next section, this particularly entails that the regeneration-based bootstrap estimate of the sampling distribution of the sample mean statistic $\mu_n(f) = S_n(f)/n$, originally proposed by Datta & McCormick (1993a), has an Edgeworth expansion that does not match the expansion of $\mu_n(f)$ (the mismatch being due to the skewness term), and consequently is not second order accurate.
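The block form (2) lends itself to a simple plug-in estimate. The sketch below (our illustration, on a toy reflected random walk with atom $\{0\}$; all identifiers hypothetical) estimates the right hand side of (2) by replacing $E_A$ with averages over the observed regeneration blocks, with $f(x) = x$ recentered empirically so that $\mu(f) = 0$ as assumed in Theorem 1.

```python
import math, random

def chain(n, p_up=0.4, seed=2):
    # Toy atomic chain: reflected random walk on {0, 1, ...}, atom A = {0}.
    rng, x, out = random.Random(seed), 0, []
    for _ in range(n):
        x = max(x + (1 if rng.random() < p_up else -1), 0)
        out.append(x)
    return out

path = chain(300_000)
hits = [i for i, v in enumerate(path) if v == 0]
blocks = [path[h + 1 : k + 1] for h, k in zip(hits, hits[1:])]
mu_f = sum(path) / len(path)                      # recenter f by its empirical mean

taus = [len(b) for b in blocks]                   # block lengths, i.i.d. ~ tau_A
S = [sum(v - mu_f for v in b) for b in blocks]    # S_tau(f - mu(f)) per block
E_tau = sum(taus) / len(taus)
sigma2_f = sum(s * s for s in S) / sum(taus)      # mu(A) E_A(S^2) = E_A(S^2) / E_A(tau)
# plug-in estimate of the limit (2): E_A(tau)^{-1} {E_A(S^3) - 3 sigma_f^2 E_A(tau S)}
third_moment_rate = (sum(s ** 3 for s in S) / len(S)
                     - 3 * sigma2_f * sum(t * s for t, s in zip(taus, S)) / len(S)) / E_tau
```

The point of the estimate is precisely that the covariance term $E_A(\tau_A S_{\tau_A}(f))$ cannot be dropped: omitting it is the error in the original construction that the next section corrects.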

4 The stationary regenerative block-bootstrap

The preliminary result of section 3 clearly advocates for the following modification of the regeneration-based bootstrap procedure introduced by Datta & McCormick (1993a) to deal with atomic Markov chains, which we call the stationary regenerative block-bootstrap (SRBB). For estimating the sampling distribution $H_\mu(x) = P_\mu(n^{1/2} \sigma_f^{-1}(\mu_n(f) - \mu(f)) \leq x)$ of the studentized sample mean statistic (see below the definition of the asymptotic variance estimator $\sigma_n^2(f)$) computed from observations $X_1, \ldots, X_n$ drawn from a stationary version of the chain $X$, it is performed in four steps as follows.

1. Count the number of visits $l_n = \sum_{i=1}^n I\{X_i \in A\}$ to the atom $A$ up to time $n$, and divide the observed sample path $X^{(n)} = (X_1, \ldots, X_n)$ into $l_n + 1$ blocks, valued in the torus $\mathbb{T} = \cup_{n=1}^{\infty} E^n$, corresponding to the pieces of the sample path between consecutive visits to the atom $A$:

$$B_0 = (X_1, \ldots, X_{\tau_A(1)}),\; B_1 = (X_{\tau_A(1)+1}, \ldots, X_{\tau_A(2)}),\; \ldots,\; B_{l_n-1} = (X_{\tau_A(l_n-1)+1}, \ldots, X_{\tau_A(l_n)}),\; B_{l_n}^{(n)} = (X_{\tau_A(l_n)+1}, \ldots, X_n).$$

2. Draw an array of $l_n - 1$ bootstrap data blocks $(B_{1,n}^*, \ldots, B_{l_n-1,n}^*)$ independently from the empirical distribution $F_n = (l_n - 1)^{-1} \sum_{i=1}^{l_n-1} \delta_{B_i}$ of the blocks $B_1, \ldots, B_{l_n-1}$, conditioned on $X^{(n)}$. Practically, the bootstrap blocks are drawn with replacement from the primary blocks.

3. From the bootstrap data blocks generated at step 2, reconstruct a pseudo-trajectory by binding the blocks together, getting the reconstructed SRBB sample path

$$X^{*(n)} = (B_{0,n}^*, B_{1,n}^*, \ldots, B_{l_n-1,n}^*, B_{l_n,n}^*), \quad \text{with } B_{0,n}^* = B_0 \text{ and } B_{l_n,n}^* = B_{l_n}^{(n)}.$$

Whereas the number of blocks $l_n - 1$ is fixed (conditionally on the data sample), the length of the reconstructed segment $(B_{1,n}^*, \ldots, B_{l_n-1,n}^*)$ of the pseudo-trajectory is random. We denote by $n^* = \sum_{i=1}^{l_n-1} l(B_{i,n}^*)$ the length of this segment.

4. Compute the SRBB statistic, with the usual convention regarding empty summation,

$$S_n^*(f) = \sum_{j=0}^{l_n} f(B_{j,n}^*),$$

and the following estimate of $\sigma_f^2$,

$$\sigma_n^2(f) = (\tau_A(l_n) - \tau_A)^{-1} \sum_{j=1}^{l_n-1} \big(f(B_j) - \bar{\mu}_n(f)\, l(B_j)\big)^2,$$

with $\bar{\mu}_n(f) = (\tau_A(l_n) - \tau_A)^{-1} \sum_{j=1}^{l_n-1} f(B_j)$. Then, the recentered distribution of

$$t_n^* = n^{*\,-1/2}\, \frac{S_n^*(f) - S_n(f)}{\sigma_n(f)},$$

conditioned on $X^{(n)}$, is the SRBB distribution

$$H_{SRBB}(x) = P^*\big(t_n^* - E^*(t_n^* \mid X^{(n)}) \leq x \mid X^{(n)}\big),$$

where $P^*(\cdot \mid X^{(n)})$ (respectively, $E^*(\cdot \mid X^{(n)})$) denotes the conditional probability (resp., the conditional expectation) given $X^{(n)}$.
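As a hedged illustration of steps 1-4 (ours, not the paper's; the toy chain, the choice $f(x) = x$, and all identifiers are assumptions), the following sketch produces one draw of the SRBB statistic $t_n^*$: it cuts the path into blocks at the atom, resamples the $l_n - 1$ complete blocks with replacement, and studentizes $S_n^*(f) - S_n(f)$ by the random length $n^*$ of the rebuilt segment and by $\sigma_n(f)$.

```python
import math, random

def srbb_draw(path, rng, atom=0):
    """One SRBB replicate t_n^* for f(x) = x on a chain with atom {atom}."""
    hits = [i for i, v in enumerate(path) if v == atom]
    # Step 1: the l_n - 1 complete regeneration blocks between atom visits.
    blocks = [path[h + 1 : k + 1] for h, k in zip(hits, hits[1:])]
    span = hits[-1] - hits[0]                  # tau_A(l_n) - tau_A(1)
    mu_bar = sum(map(sum, blocks)) / span      # \bar{mu}_n(f)
    sigma_n = math.sqrt(sum((sum(b) - mu_bar * len(b)) ** 2 for b in blocks) / span)
    # Step 2: draw l_n - 1 bootstrap blocks with replacement.
    star = [rng.choice(blocks) for _ in blocks]
    # Step 3: the rebuilt segment has *random* length n^*.
    n_star = sum(len(b) for b in star)
    # Step 4: S_n^*(f) - S_n(f); the non-regenerative first and last blocks
    # cancel in the difference, leaving only the complete-block sums.
    diff = sum(map(sum, star)) - mu_bar * span
    return diff / (math.sqrt(n_star) * sigma_n)

# toy path: reflected random walk with downward drift, atom A = {0}
rng, walk = random.Random(42), random.Random(7)
x, path = 0, []
for _ in range(50_000):
    x = max(x + (1 if walk.random() < 0.4 else -1), 0)
    path.append(x)
t_star = srbb_draw(path, rng)
```

The only departure from the procedure of Datta & McCormick (1993a) visible in the code is the `math.sqrt(n_star)` factor: the random, not the deterministic, segment length enters the standardization.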

We point out that the bootstrap estimator $H_{SRBB}(x)$ of $H_\mu(x)$ differs from the bootstrap estimator originally proposed by Datta & McCormick (1993a) in two ways. First, the standardization of the bootstrap statistic depends on the random length $n^*$ of the reconstructed bootstrap data segment, whereas the standardization $n^{-1/2}(S_n^*(f) - S_n(f))/\sigma_n(f)$ is used in Datta & McCormick (1993a). Secondly, the bootstrap distribution is recentered so as to be unbiased. As will be shown below, this random standardization actually allows one to recover the correct skewness coefficient $k_{3,f}$, at the price of an additional bias that may be rectified by suitably recentering the statistic $t_n^*$ of interest (observe that, because of the random standardization by $n^{*\,1/2}$, recentering the distribution does not amount to recentering the SRBB statistic $S_n^*(f)$).

The construction of the estimator $\sigma_n^2(f)$ naturally relies on the expression $\sigma_f^2 = \mu(A) E_A\big((S_{\tau_A}(f) - \mu(f)\tau_A)^2\big)$ for the asymptotic variance; its properties are studied in Bertail & Clémençon (2003). Besides, we have not used the first and last (non regenerative) data blocks $B_0$ and $B_{l_n}^{(n)}$ in the computation of our estimate $\sigma_n^2(f)$, because this would make its study much more tricky, while making no difference from the estimation point of view in the stationary framework considered here.

We also emphasize that one may naturally compute a Monte-Carlo approximation to $H_{SRBB}(x)$ by the following scheme: repeat the procedure above independently $Q$ times, so as to generate $t_{n,1}^*, \ldots, t_{n,Q}^*$, and compute

$$H_{SRBB}^{(Q)}(x) = Q^{-1} \sum_{q=1}^{Q} I\Big\{t_{n,q}^* - Q^{-1} \sum_{p=1}^{Q} t_{n,p}^* \leq x\Big\}.$$
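The Monte-Carlo scheme above is a one-liner once the $Q$ replicates are available; here is a sketch (ours) of the recentered empirical distribution function:

```python
def H_srbb_Q(t_reps, x):
    """Monte-Carlo approximation of H_SRBB(x): recenter the Q bootstrap
    replicates by their average, then take the empirical CDF at x."""
    Q = len(t_reps)
    t_bar = sum(t_reps) / Q
    return sum(1 for t in t_reps if t - t_bar <= x) / Q
```

For instance, `H_srbb_Q([1.0, 2.0, 3.0], 0.0)` recenters the replicates to (-1, 0, 1) and returns 2/3.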

The following theorem establishes the second order validity of the SRBB estimator up to order $O_P(n^{-1}\log(n))$, which is close to the rate $O_P(n^{-1})$ that can be obtained in the i.i.d. case.


Theorem 2 Assume that the chain $X$ fulfills the following conditions.

(i) Conditional Cramer condition:

$$\lim_{t \to \infty} E_A\big|E_A\big(\exp(it S_{\tau_A}(f)) \mid \tau_A\big)\big| < 1.$$

(ii) Non degenerate asymptotic variance: $\sigma_f^2 > 0$.

(iii) Block moment conditions:

$$E_A(\tau_A^4) < \infty, \qquad E_A(S_{\tau_A}(|f|)^6) < \infty.$$

(iv) Non trivial regeneration set: $E_A(\tau_A) > 1$.

Then the following Edgeworth expansion is valid uniformly over $\mathbb{R}$:

$$\Delta_n = \sup_{x \in \mathbb{R}} \big|H_\mu(x) - E_n^{(2)}(x)\big| = O(n^{-1}), \quad \text{as } n \to \infty, \quad (3)$$

with

$$E_n^{(2)}(x) = \Phi(x) - n^{-1/2} \frac{k_{3,f}}{6} (x^2 - 1) \phi(x), \qquad k_{3,f} = E_A(\tau_A)^{-1} \big\{E_A(S_{\tau_A}(f)^3) - 3 \sigma_f^2 E_A(\tau_A S_{\tau_A}(f))\big\} / \sigma_f^3,$$

and the SRBB estimator is second order accurate in the following sense:

$$\Delta_n^{SRBB} = \sup_{x \in \mathbb{R}} \big|H_{SRBB}(x) - H_\mu(x)\big| = O_P(n^{-1} \log(n)), \quad \text{as } n \to \infty. \quad (4)$$

The proof essentially relies on the Edgeworth expansion (E.E. in abbreviated form) obtained in Malinovskii (1987), and dealing with the Bootstrap part mainly reduces to studying the E.E. of a bootstrapped $V$-statistic of degree 2 based on i.i.d. r.v.'s (the bootstrap blocks). The validity of E.E.'s for $V$-statistics has been proved in Götze (1979) and Bickel, Götze & van Zwet (1986) for instance. The accuracy of the Bootstrap for $U$-statistics of degree 2 is easy to obtain up to $o_P(n^{-1/2})$, but further conditional Cramer conditions are generally assumed to check the validity up to $O_P(n^{-1})$. Here we use the results of Lai & Wang (1993), proving the validity of the Bootstrap of $U$- and $V$-statistics up to $O_P(n^{-1})$ under conditions which reduce to the conditional Cramer condition (i) in our case. The validity of the SRBB under weaker Cramer conditions will be investigated elsewhere.

When $f$ is bounded, (iii) reduces to the condition $E_A(\tau_A^6) < \infty$, which typically holds as soon as the strong mixing coefficients sequence decreases at a polynomial rate $n^{-\beta}$ for some $\beta > 5$ (see Bolthausen (1982)).


5 Proof of the main result

The proof of the E.E. (3) for the non studentized sample mean may be found in Malinovskii (1987) (see Theorem 1 therein; refer also to Bertail & Clémençon (2003)). Notice that the conditional Cramer condition implies the usual Cramer condition $\lim_{|t| \to \infty} |E_A(\exp(it S_{\tau_A}(f)))| < 1$ and that the bias vanishes in the stationary case. Consider the recentered variables, for $j \geq 1$:

$$F(B_j) = f(B_j) - \mu(f)\, l(B_j), \qquad F(B_{j,n}^*) = f(B_{j,n}^*) - \bar{\mu}_n(f)\, l(B_{j,n}^*).$$

Notice that the mean length of the bootstrap data blocks $B_{j,n}^*$, $j \geq 1$, for given $X^{(n)}$ is

$$\bar{l}_B \stackrel{def}{=} E^*\big(l(B_{j,n}^*) \mid X^{(n)}\big) = (l_n - 1)^{-1} \sum_{k=1}^{l_n-1} l(B_k),$$

and observe further that $E^*(F(B_{j,n}^*) \mid X^{(n)}) = 0$ and

$$V^*\big(F(B_{j,n}^*) \mid X^{(n)}\big) = \frac{1}{l_n - 1} \sum_{k=1}^{l_n-1} F(B_k)^2 = \bar{l}_B\, \sigma_n^2(f) \stackrel{def}{=} \hat{\sigma}_F^2,$$

denoting by $V^*(\cdot \mid X^{(n)})$ the conditional variance for given $X^{(n)}$. Note that the empirical estimator $\sigma_n^2(f)$ of the asymptotic variance is essentially a bootstrap estimator of the variance of the recentered blocks, rescaled by an estimator of $E_A(\tau_A)$, namely $\bar{l}_B$. The following technical results will be useful in the proof. Lemma 3 is a standard result due to Chibisov (1972).

Lemma 3 Assume that $W_n$ admits an E.E. on the normal distribution up to $O(n^{-1} \log(n)^{\gamma})$, $\gamma > 0$, as $n \to \infty$. Assume further that $R_n$ is such that, for some $c > 0$, $P(n|R_n| > c \log(n)) = O(n^{-1} \log(n))$ or $O(n^{-1})$ as $n \to \infty$; then $W_n + R_n$ has the same E.E. as $W_n$ up to $O(n^{-1} \log(n))$.

Lemma 4 Under the hypotheses of Theorem 2, we have for some constant $c > 0$:

$$P\Big(n^{1/2} \big| n\, l_n^{-1} - E_A(\tau_A) \big| > c (\log(n))^{1/2}\Big) = O(n^{-1}), \quad \text{as } n \to \infty. \quad (5)$$

Proof. Following the argument given in Clémençon (2001), based on the Fuk & Nagaev inequality for sums of independent unbounded r.v.'s (see also Theorem 6.1 in Rio (2000) for a proof based on block mixing techniques), there exist constants $c_0$ and $c_1$ such that the following probability inequality holds for all $n$:

$$P\big(\big|l_n/n - E_A(\tau_A)^{-1}\big| > x\big) \leq c_0 \Big\{\exp\Big(-\frac{n x^2}{c_1 + x y}\Big) + n P_A(\tau_A > y) + P_A(\tau_A > n/2) + P_\mu(\tau_A > n/2)\Big\}.$$

On the one hand, choosing $y = n^{1/2}$ and bounding the last three terms on the right hand side by Chebyshev's inequality (given that the expectations $E_A(\tau_A^2)$ and $E_\mu(\tau_A)$ are finite), one gets that, for a constant $c > 0$ large enough,

$$P\big(\big|l_n/n - E_A(\tau_A)^{-1}\big| > c\big) = O(n^{-1}), \quad \text{as } n \to \infty, \quad (6)$$

and on the other hand, with the choices $x = c(\log n / n)^{1/2}$ and $y = (n / \log n)^{1/2}$ and using Chebyshev's inequality again (given that the expectations $E_A(\tau_A^4)$ and $E_\mu(\tau_A)$ are finite), one obtains that

$$P\big(n^{1/2} \big|l_n/n - E_A(\tau_A)^{-1}\big| > c (\log n)^{1/2}\big) = O(n^{-1}), \quad \text{as } n \to \infty. \quad (7)$$

Now, by combining the bounds (6) and (7), the proof is finished by straightforward calculations.

Notice first that, because of the recentering of $S_n^*(f)$ by the original statistic $S_n(f)$, the data of the first and last (non regenerative) blocks $B_0$ and $B_{l_n}^{(n)}$ disappear in the numerator. Hence, we may rewrite the bootstrap version of the studentized sample mean in the following way:

$$t_n^* = \frac{\sum_{j=1}^{l_n-1} F(B_{j,n}^*)}{\big(\sum_{j=1}^{l_n-1} l(B_{j,n}^*)\big)^{1/2} \sigma_n(f)} \quad (8)$$

$$= \frac{\sum_{j=1}^{l_n-1} \{f(B_{j,n}^*) - \bar{\mu}_n(f)\, l(B_{j,n}^*)\}}{(l_n - 1)^{1/2} (1 + L_n)^{1/2}\, \hat{\sigma}_F}$$

with

$$L_n = \bar{l}_B^{-1} \Big\{(l_n - 1)^{-1} \sum_{j=1}^{l_n-1} l(B_{j,n}^*) - \bar{l}_B\Big\}.$$

Using standard bootstrap results in the i.i.d. case (see Singh (1981) for the lattice case), we have for a constant $c > 0$ large enough,

$$P^*\big((l_n - 1) L_n^2 > c \log(l_n) \mid X^{(n)}\big) = O(l_n^{-1}), \quad \text{as } n \to \infty.$$

It follows from Lemma 3 with $\gamma = 1$ that, up to $O(l_n^{-1} \log(l_n))$, we can linearize (8), and the problem reduces to finding the E.E. of

$$\tilde{t}_n^* = (l_n - 1)^{-1/2}\, \hat{\sigma}_F^{-1} \sum_{j=1}^{l_n-1} F(B_{j,n}^*) \Big\{1 - \frac{1}{2} (l_n - 1)^{-1} \sum_{k=1}^{l_n-1} \big(l(B_{k,n}^*) - \bar{l}_B\big)/\bar{l}_B\Big\}.$$

This may be seen as a bootstrapped $V$-statistic of degree 2 based on the i.i.d. blocks $B_{j,n}^*$, $j \geq 1$. The main (linear) part of the corresponding $U$-statistic is $\hat{\sigma}_F^{-1} F(B_{j,n}^*)$; the (degenerate) quadratic term is given by

$$\omega_n(B_{j,n}^*, B_{k,n}^*) = -\frac{1}{2}\, \hat{\sigma}_F^{-1} \Big\{F(B_{j,n}^*)\, \frac{l(B_{k,n}^*) - \bar{l}_B}{\bar{l}_B} + F(B_{k,n}^*)\, \frac{l(B_{j,n}^*) - \bar{l}_B}{\bar{l}_B}\Big\}.$$

The validity of the Bootstrap for general $U$- or $V$-statistics is proved in Lai & Wang (1993), up to $O_P(n^{-1})$, under assumptions on the second gradient of the $U$-statistic, which are easier to check than the usual conditional Cramer conditions or conditions on the eigenvalues of the second order gradient of the $U$-statistic (see also Bickel, Götze & van Zwet (1986)). The conditional Cramer condition used here implies their Cramer type condition (see p. 521 of their paper, as well as related results in Bai & Rao (1991) for the validity of E.E.'s under conditional Cramer type conditions). Using the arguments in Lai & Wang (1993), one may thus check that, conditionally on $X^{(n)}$, $\tilde{t}_n^*$ admits, up to $O(l_n^{-1} \log(l_n))$, an E.E. of the form (see also Barbe & Bertail (1995) for the form of the E.E. up to $o_P(n^{-1/2})$):

$$P^*\big(\tilde{t}_n^* \leq x \mid X^{(n)}\big) = \Phi(x) - \frac{\Phi^{(3)}(x)}{6\sqrt{l_n - 1}} \Big\{\frac{1}{l_n - 1} \sum_{j=1}^{l_n-1} \frac{\{f(B_j) - \bar{\mu}_n(f)\, l(B_j)\}^3}{\hat{\sigma}_F^3}\Big\} - \frac{x\, \Phi^{(2)}(x)}{2\sqrt{l_n - 1}} \Big\{\frac{1}{l_n - 1} \sum_{j=1}^{l_n-1} \{f(B_j) - \bar{\mu}_n(f)\, l(B_j)\}\, \frac{l(B_j) - \bar{l}_B}{\hat{\sigma}_F\, \bar{l}_B}\Big\} + O(l_n^{-1} \log(l_n)). \quad (9)$$

Now from Lemma 4 we obtain (unconditionally), as $n \to \infty$,

$$\frac{1}{(l_n - 1)^{1/2}} = \frac{E_A(\tau_A)^{1/2}}{n^{1/2}} + O_P\big(n^{-1} \log(n)^{1/2}\big), \quad (10)$$

and similarly

$$l_n^{-1} \log(l_n) = O_P(n^{-1} \log(n)). \quad (11)$$

Now, under assumption (iii), by the SLLN and the CLT for the i.i.d. blocks we have, as $n \to \infty$ (see also Bertail & Clémençon (2003)),

$$\frac{1}{l_n - 1} \sum_{j=1}^{l_n-1} \frac{\{f(B_j) - \bar{\mu}_n(f)\, l(B_j)\}^3}{\hat{\sigma}_F^3} = \frac{E_A(S_{\tau_A}(f)^3)}{E_A(\tau_A)^{3/2}\, \sigma_f^3} + O_P(n^{-1/2}) \quad (12)$$

and

$$\frac{1}{l_n - 1} \sum_{j=1}^{l_n-1} \big(f(B_j) - \bar{\mu}_n(f)\, l(B_j)\big)\, \frac{l(B_j) - \bar{l}_B}{\hat{\sigma}_F\, \bar{l}_B} = \frac{E_A(\tau_A S_{\tau_A}(f))}{E_A(\tau_A)^{3/2}\, \sigma_f} + O_P(n^{-1/2}), \quad (13)$$


as $n \to \infty$, provided that the denominator is defined, which is the case as soon as $l_n > 1$; and we have $P_\mu(l_n \leq 1) = O(n^{-1})$ as $n \to \infty$ (see Lemma 4 for instance). Combining the conditional E.E. (9) with the approximations (10), (11), (12) and (13), it follows that the Bootstrap distribution has, in $P_\mu$ probability, an E.E. of the form

$$P^*\big(t_n^* \leq x \mid X^{(n)}\big) = \Phi(x) - n^{-1/2} \frac{k_{3,f}}{6} (x^2 - 1) \phi(x) + n^{-1/2}\, \frac{E_A(\tau_A S_{\tau_A}(f))}{2\, E_A(\tau_A)\, \sigma_f}\, \phi(x) + O_P(n^{-1} \log(n)).$$

Notice the bias $n^{-1/2} E_A(\tau_A S_{\tau_A}(f)) / (2 E_A(\tau_A) \sigma_f)$, which appears because of the random standardization. Recentering by the conditional expectation of $t_n^*$ given $X^{(n)}$ immediately leads to the asymptotic result (4) of Theorem 2.

References

[1] Asmussen, S. (1987). Applied Probability and Queues. John Wiley & Sons, NY.

[2] Athreya, K.B., Fuh, C.D. (1989). Bootstrapping Markov chains: countable case. Tech. Rep. 89-7, Institute of Statistical Science, Academia Sinica, Taiwan.

[3] Bai, Z.D., Rao, C.R. (1991). Edgeworth expansion of a function of sample means. Ann. Statist., 19, 1295-1315.

[4] Barbe, Ph., Bertail, P. (1995). The Weighted Bootstrap. Lecture Notes in Statistics, 98, Springer Verlag, New York.

[5] Bertail, P., Clémençon, S. (2003). Edgeworth expansions for suitably normalized sample mean statistics for atomic Markov chains. Submitted for publication.

[6] Bickel, P.J., Götze, F., van Zwet, W.R. (1986). The Edgeworth expansion for U-statistics of degree 2. Ann. Statist., 14, 1463-1484.

[7] Bolthausen, E. (1982). The Berry-Esseen Theorem for strongly mixing Harris recurrent Markov Chains. Z. Wahrsch. Verw. Gebiete, 60, 283-289.

[8] Chibisov, D.M. (1972). An asymptotic expansion for the distribution of a statistic admitting an asymptotic expansion. Theory Probab. Appl., 17, 620-630.

[9] Clémençon, S. (2001). Moment and probability inequalities for sums of bounded additive functionals of regular Markov chains via the Nummelin splitting technique. Stat. Prob. Letters, 55, 227-238.

[10] Datta, S., McCormick, W.P. (1993a). Regeneration-based bootstrap for Markov chains. The Canadian Journal of Statistics, 21, No. 2, 181-193.

[11] Datta, S., McCormick, W.P. (1993b). On the first-order Edgeworth expansion for a Markov chain. J. Mult. Anal., 44, 345-359.

[12] Götze, F. (1979). Asymptotic expansions for von Mises functionals. Z. Wahrsch. Verw. Gebiete, 50, 333-355.

[13] Götze, F., Künsch, H.R. (1996). Second order correctness of the blockwise bootstrap for stationary observations. Ann. Statist., 24, 1914-1933.

[14] Malinovskii, V.K. (1985). On some asymptotic relations and identities for Harris recurrent Markov Chains. Statistics and Control of Stochastic Processes, 317-336.

[15] Malinovskii, V.K. (1987). Limit theorems for Harris Markov chains I. Theory Prob. Appl., 31, 269-285.

[16] Meyn, S.P., Tweedie, R.L. (1996). Markov Chains and Stochastic Stability. Springer, Berlin.

[17] Lai, T.L., Wang, J.Q. (1993). Edgeworth expansions for symmetric statistics with applications to bootstrap methods. Statistica Sinica, 3, 517-542.

[18] Revuz, D. (1984). Markov Chains. North-Holland, 2nd edition.

[19] Rio, E. (2000). Théorie asymptotique des processus aléatoires faiblement dépendants. Springer Verlag.

[20] Singh, K. (1981). On the asymptotic accuracy of Efron's Bootstrap. Ann. Statist., 9, 1187-1195.