Augmented Truncations of Infinite Stochastic Matrices. Diana Gibson; E. Seneta. Journal of Applied Probability, Vol. 24, No. 3 (Sep., 1987), pp. 600-608. Stable URL: http://links.jstor.org/sici?sici=0021-9002%28198709%2924%3A3%3C600%3AATOISM%3E2.0.CO%3B2-U. Journal of Applied Probability is published by the Applied Probability Trust.




J. Appl. Prob. 24, 600-608 (1987). Printed in Israel. © Applied Probability Trust 1987

AUGMENTED TRUNCATIONS OF INFINITE STOCHASTIC MATRICES

DIANA GIBSON* AND E. SENETA*, University of Sydney

Abstract

We consider the problem of approximating the stationary distribution of a positive-recurrent Markov chain with infinite transition matrix $P$, by stationary distributions computed from $(n \times n)$ stochastic matrices formed by augmenting the entries of the $(n \times n)$ northwest corner truncations of $P$, as $n \to \infty$.

STATIONARY DISTRIBUTION; AUGMENTATION; LAST-EXIT PROBABILITIES; UPPER-HESSENBERG; LOWER-HESSENBERG

1. Introduction

We are concerned throughout with approximating the stationary distribution $\pi$ of an infinite positive-recurrent Markov chain on the positive integers $\mathbb{N}$, with transition matrix $P$, through the finite northwest corner truncations of $P$. Let ${}_{(n)}P$ denote the truncation of size $n$. It is aesthetically pleasing to try to approximate the stationary distribution $\pi = (\pi_j)$ by a sequence of stationary distributions $\{{}_{(n)}\tilde\pi\}_{n=1}^{\infty}$, where ${}_{(n)}\tilde\pi$ is obtained from an $n \times n$ stochastic matrix ${}_{(n)}\tilde P$ with ${}_{(n)}\tilde P \ge {}_{(n)}P$ elementwise, and to ask for what kinds of $P$ and what sequences $\{{}_{(n)}\tilde P\}_{n=1}^{\infty}$ it is true that ${}_{(n)}\tilde\pi \to \pi$. (By convergence of probability vectors we mean convergence in $l_1$, which is equivalent to elementwise convergence (see Wolf (1975), Lemma 1).) In this paper we prove that for a Markov matrix $P$ or an upper-Hessenberg $P$ any sequence $\{{}_{(n)}\tilde P\}_{n=1}^{\infty}$ will do; that certain methods of constructing ${}_{(n)}\tilde P$ work for all $P$; and that for lower-Hessenberg $P$ we must be somewhat careful in generalizing these. The motivating papers in the investigation of this problem are Seneta (1980) and Wolf (1980), Section 5, although earlier papers by both authors play a role.

Returning to our basic context of positive-recurrent $P$, with $(n \times n)$ northwest corner truncation ${}_{(n)}P$, and $(n \times n)$ stochastic ${}_{(n)}\tilde P$ where ${}_{(n)}\tilde P \ge {}_{(n)}P$, let
Received 16 June 1986; revision received 7 August 1986. * Postal address: Department of Mathematical Statistics, University of Sydney, NSW 2006, Australia.



$l_{ij}^{(k)}$, ${}_{(n)}l_{ij}^{(k)}$, ${}_{(n)}\tilde l_{ij}^{(k)}$ denote the last-exit probabilities from state $i$ to state $j$ (with respect to $P$, ${}_{(n)}P$ and ${}_{(n)}\tilde P$ respectively), and $L_{ij}(z)$, ${}_{(n)}L_{ij}(z)$, ${}_{(n)}\tilde L_{ij}(z)$ the corresponding generating functions, $|z| \le 1$. (See Seneta (1981), Chapters 5 and 6 for amplification on these and the following introductory remarks.) Note that

(1.1) $\quad \pi_j/\pi_i = L_{ij}(1),$

and similarly, if $C_n$ is any essential class of indices (states) of ${}_{(n)}\tilde P$,

(1.2) $\quad {}_{(n)}\tilde\pi_j / {}_{(n)}\tilde\pi_i = {}_{(n)}\tilde L_{ij}(1), \qquad i, j \in C_n,$

where ${}_{(n)}\tilde\pi = ({}_{(n)}\tilde\pi_j)$ is the corresponding stationary distribution of ${}_{(n)}\tilde P$. Finally, recall that as $n \to \infty$

(1.3) $\quad {}_{(n)}L_{ij}(1) \uparrow L_{ij}(1).$
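Relation (1.1) can be checked directly on any finite irreducible stochastic matrix, where $L_{ij}(1)$ is the expected number of visits to $j$ between successive visits to $i$ and can be computed by a taboo-matrix inversion. A minimal sketch (the 5-state matrix is an arbitrary illustration, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary irreducible stochastic matrix (all entries positive).
P = rng.random((5, 5))
P /= P.sum(axis=1, keepdims=True)

# Stationary distribution: solve pi'(I - P) = 0' together with sum(pi) = 1.
A = np.vstack([(np.eye(5) - P).T, np.ones(5)])
b = np.zeros(6); b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]

i = 0                                   # the taboo state
others = [s for s in range(5) if s != i]
B = P[np.ix_(others, others)]           # one-step transitions avoiding i
r = P[i, others]                        # first step out of i
# L_{ij}(1) = expected visits to j between successive visits to i
#           = [r (I - B)^{-1}]_j for j != i.
L = r @ np.linalg.inv(np.eye(4) - B)

print(np.round(L, 6))
print(np.round(pi[others] / pi[i], 6))  # should match, by (1.1)
```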

In consequence of these relations, last-exit generating functions will play a central role in our discussion.

2. General augmentation for special matrices

Lemma 2.1. Let ${}_{(n)}\tilde P$ be any $(n \times n)$ stochastic matrix with ${}_{(n)}\tilde P \ge {}_{(n)}P$, and suppose that for all sufficiently large $n$, ${}_{(n)}\tilde P$ has a unique essential class $C_n$ which contains, for all such $n$, a fixed pair of indices $i, j$. Let ${}_{(n)}\tilde\pi$ then denote the unique stationary distribution of ${}_{(n)}\tilde P$. Then as $n \to \infty$

$${}_{(n)}\tilde\pi_j / {}_{(n)}\tilde\pi_i \to \pi_j/\pi_i.$$

Proof. Since ${}_{(n)}\tilde P \ge {}_{(n)}P$, we have ${}_{(n)}\tilde L_{ij}(1) \ge {}_{(n)}L_{ij}(1)$ if $n \ge \max(i, j)$. Now for $n$ so large that $i, j \in C_n$, from (1.2)

$${}_{(n)}L_{ij}(1) \le {}_{(n)}\tilde L_{ij}(1) = {}_{(n)}\tilde\pi_j / {}_{(n)}\tilde\pi_i = 1/{}_{(n)}\tilde L_{ji}(1) \le 1/{}_{(n)}L_{ji}(1).$$

But $\lim_{n\to\infty} {}_{(n)}L_{ij}(1) = L_{ij}(1) = \pi_j/\pi_i = 1/L_{ji}(1) = \lim_{n\to\infty} 1/{}_{(n)}L_{ji}(1)$, using (1.1) and (1.3). Therefore $\lim_{n\to\infty} {}_{(n)}\tilde\pi_j/{}_{(n)}\tilde\pi_i$ exists and equals $\pi_j/\pi_i$.

This result leaves open the general question of convergence of ${}_{(n)}\tilde\pi$ to $\pi$ for a positive-recurrent $P$, which, as we shall see from Section 3, does not necessarily hold under the conditions of the lemma. However, it does hold if the infinite matrix $P$ has special structure.

Definition 2.1. A stochastic matrix $P = \{p_{ij}\}$ is said to be a Markov matrix if the elements of at least one column are bounded away from 0, i.e. there exist a $j_0$ and an $\varepsilon > 0$ such that $p_{ij_0} > \varepsilon$ for all $i$. Such a matrix has a single essential class, which is positive-recurrent, aperiodic, and contains $j_0$.

Theorem 2.1. Let $P$ be a Markov matrix and for each $n \in \mathbb{N}$ let ${}_{(n)}\tilde P$ be an $(n \times n)$ stochastic matrix satisfying ${}_{(n)}\tilde P \ge {}_{(n)}P$. Then for all $n$ sufficiently large ${}_{(n)}\tilde P$ has a unique stationary distribution ${}_{(n)}\tilde\pi$, and ${}_{(n)}\tilde\pi \to \pi$ as $n \to \infty$.

Proof. ${}_{(n)}\tilde P$ is a Markov matrix for all $n$ sufficiently large to take in the column (say the $j_0$th) that is uniformly bounded away from 0 in $P$. The rest of the proof is precisely as in Seneta (1980), §2, or Seneta (1981), Theorem 7.3, where the unnecessary assumption is made that the column $j_0$ is the one augmented in ${}_{(n)}P$ to form ${}_{(n)}\tilde P$.
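Theorem 2.1 can be illustrated numerically. The sketch below uses a Markov matrix of our own choosing (first column identically 1/2, superdiagonal 1/2, for which $\pi_j = 2^{-j}$), truncates it, pushes each row's deficiency into an arbitrarily chosen column (here the last), and watches the $l_1$ error shrink:

```python
import numpy as np

def truncation(n):
    # Northwest corner of the Markov matrix p_{i1} = 1/2, p_{i,i+1} = 1/2
    # (our illustrative choice of P); its stationary law is pi_j = 2^{-j}.
    P = np.zeros((n, n))
    P[:, 0] = 0.5
    for i in range(n - 1):
        P[i, i + 1] = 0.5
    return P

def stationary(P):
    n = len(P)
    A = np.vstack([(np.eye(n) - P).T, np.ones(n)])
    b = np.zeros(n + 1); b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

errs = []
for n in (5, 10, 20):
    Pt = truncation(n)
    Pt[:, -1] += 1.0 - Pt.sum(axis=1)   # arbitrary augmentation: last column
    tilde_pi = stationary(Pt)
    pi = 0.5 ** np.arange(1, n + 1)     # exact stationary distribution
    errs.append(np.abs(tilde_pi - pi).sum() + 0.5 ** n)  # + neglected tail
print(errs)
```

Any other choice of augmented column gives the same qualitative picture, as the theorem asserts.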

Definition 2.2. A stochastic matrix $P = \{p_{ij}\}$ is said to be upper-Hessenberg if $p_{ij} = 0$ for $i > j + 1$.

Since any Markov chain governed by such a $P$, in passing from a state $i$ to a state $j$ where $i > j$, must pass through every intermediate state, it follows that

$$l_{ij}^{(k)} = {}_{(n)}l_{ij}^{(k)}, \qquad n \ge i > j, \quad k \in \mathbb{N},$$

and, since ${}_{(n)}\tilde P \ge {}_{(n)}P$ elementwise, ${}_{(n)}\tilde l_{ij}^{(k)} \ge {}_{(n)}l_{ij}^{(k)}$ for such $n, i, j$ and $k \in \mathbb{N}$. Therefore for such $i, j$

(2.1) $\quad {}_{(n)}\tilde L_{ij}(1) \ge {}_{(n)}L_{ij}(1) = L_{ij}(1),$

(2.2) $\quad {}_{(n)}L_{ij}(z) = L_{ij}(z), \qquad |z| \le 1.$

In the sequel the blanket assumption that $P$ is positive-recurrent is to be understood.

Theorem 2.2. Suppose $P$ is upper-Hessenberg and for each $n \in \mathbb{N}$ let ${}_{(n)}\tilde P$ be an $n \times n$ stochastic matrix satisfying ${}_{(n)}\tilde P \ge {}_{(n)}P$. Then ${}_{(n)}\tilde P$ has a unique stationary distribution ${}_{(n)}\tilde\pi$, and ${}_{(n)}\tilde\pi \to \pi$ as $n \to \infty$.

Proof. Since $P$ is irreducible, all entries on its subdiagonal are positive, i.e. $p_{j,j-1} > 0$ for all $j \ge 2$. Hence $j \to 1$ with respect to ${}_{(n)}\tilde P$ for all $j \in \{2, \ldots, n\}$. So ${}_{(n)}\tilde P$ has just one essential class, $C_n$ say, and $1 \in C_n$. Note that $\bigcup_{n=1}^{\infty} C_n = \mathbb{N}$, since any index $j$ communicates with 1 with respect to ${}_{(n)}\tilde P$ for large enough $n$. Take $i = 1$ in (1.2) and sum over $j$ to obtain

$$1/{}_{(n)}\tilde\pi_1 = \sum_{j \in C_n} {}_{(n)}\tilde L_{1j}(1).$$

But ${}_{(n)}\tilde L_{1j}(1) = 1/{}_{(n)}\tilde L_{j1}(1) \le 1/L_{j1}(1) = L_{1j}(1)$ for any $j \in C_n$, $j \le n$, by (1.2), (2.1) and (1.1), so by dominated convergence, Lemma 2.1 and (1.3),

$$\lim_{n \to \infty} \sum_{j \in C_n} {}_{(n)}\tilde L_{1j}(1) = \sum_{j=1}^{\infty} L_{1j}(1).$$

Also, $\sum_{j=1}^{\infty} L_{1j}(1) = \sum_{j=1}^{\infty} \pi_j/\pi_1 = 1/\pi_1$, by (1.1) again. Thus

(2.3) $\quad {}_{(n)}\tilde\pi_1 \to \pi_1 \quad \text{as } n \to \infty.$

Since $j \in C_n$ for large enough $n$, we can use (2.3) together with Lemma 2.1 to show

$${}_{(n)}\tilde\pi_j = ({}_{(n)}\tilde\pi_j / {}_{(n)}\tilde\pi_1)\, {}_{(n)}\tilde\pi_1 \to (\pi_j/\pi_1)\, \pi_1 = \pi_j \quad \text{as } n \to \infty.$$

A version of this theorem (where ${}_{(n)}\tilde P$ is formed from ${}_{(n)}P$ by augmenting only the last column of ${}_{(n)}P$, but leading to a stronger conclusion) was proved by Golub and Seneta (1974) (see Seneta (1981), Lemma 7.3). Indeed, in that case ${}_{(n)}\tilde L_{1j}(1) = L_{1j}(1)$, $1 \le j \le n$, so ${}_{(n)}\tilde\pi_j/{}_{(n)}\tilde\pi_1 = \pi_j/\pi_1$, $1 \le j \le n$. (Similar notions apply to the generalized renewal matrix treated in the same sources.)
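The exactness of these ratios under last-column augmentation can be observed directly. The chain below (a reflected random walk, our illustration rather than one from the paper) is upper-Hessenberg with $\pi_j/\pi_1 = (p/(1-p))^{j-1}$:

```python
import numpy as np

# An upper-Hessenberg chain (our illustration): reflected random walk with
# p_{i,i+1} = p, p_{i,i-1} = 1 - p, p_{11} = 1 - p, and p < 1/2.
p, n = 0.3, 12

Pt = np.zeros((n, n))                   # northwest corner truncation (n)P
Pt[0, 0] = 1 - p
for i in range(n):
    if i + 1 < n:
        Pt[i, i + 1] = p
    if i >= 1:
        Pt[i, i - 1] = 1 - p

Pt[:, -1] += 1.0 - Pt.sum(axis=1)       # augment the last column only

A = np.vstack([(np.eye(n) - Pt).T, np.ones(n)])
b = np.zeros(n + 1); b[-1] = 1.0
tilde_pi = np.linalg.lstsq(A, b, rcond=None)[0]

ratio = tilde_pi / tilde_pi[0]
exact = (p / (1 - p)) ** np.arange(n)   # pi_j / pi_1 of the infinite chain
print(np.abs(ratio - exact).max())      # agreement to rounding error
```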

3. Linear augmentation for general matrices

Consider the method, which we shall call linear augmentation, of constructing a stochastic $(n \times n)$ matrix ${}_{(n)}\tilde P \ge {}_{(n)}P$, suggested in Seneta (1980):

(3.1) $\quad {}_{(n)}\tilde P = {}_{(n)}P + ({}_{(n)}\mathbf{1} - {}_{(n)}P\,{}_{(n)}\mathbf{1})\,{}_{(n)}a',$

where ${}_{(n)}a$ is a probability $n$-vector and ${}_{(n)}\mathbf{1}$ is an $n$-vector of 1's. Seneta (1980); (1981), Section 7.2, showed that ${}_{(n)}\tilde P$ thus formed has a unique essential class, and correspondingly a unique stationary distribution, given by

(3.2) $\quad {}_{(n)}\tilde\pi' = {}_{(n)}a'(I - {}_{(n)}P)^{-1} \big/ \{ {}_{(n)}a'(I - {}_{(n)}P)^{-1}\,{}_{(n)}\mathbf{1} \}.$
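Formulae (3.1) and (3.2) are easy to check numerically. In the sketch below the truncation comes from a renewal-type chain of our own choosing and ${}_{(n)}a$ is uniform; the vector produced by (3.2) is verified to be stationary for the matrix produced by (3.1):

```python
import numpy as np

n = 8
# A strictly substochastic northwest-corner truncation (our illustration):
# renewal-type chain with p_{i1} = 1/3, p_{i,i+1} = 2/3.
Pn = np.zeros((n, n))
Pn[:, 0] = 1 / 3
for i in range(n - 1):
    Pn[i, i + 1] = 2 / 3

a = np.full(n, 1.0 / n)                    # the probability vector (n)a
deficiency = 1.0 - Pn.sum(axis=1)          # (n)1 - (n)P (n)1
P_tilde = Pn + np.outer(deficiency, a)     # linear augmentation (3.1)

# Stationary distribution via (3.2): a'(I - (n)P)^{-1}, renormalized.
x = a @ np.linalg.inv(np.eye(n) - Pn)
pi_tilde = x / x.sum()

print(np.round(pi_tilde, 4))
print(np.abs(pi_tilde @ P_tilde - pi_tilde).max())  # stationarity residual
```

The key identity behind (3.2) is that $ {}_{(n)}a'(I - {}_{(n)}P)^{-1}({}_{(n)}\mathbf{1} - {}_{(n)}P\,{}_{(n)}\mathbf{1}) = {}_{(n)}a'\,{}_{(n)}\mathbf{1} = 1$, which the residual above confirms numerically.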

Let ${}_{(n)}e_i$ be the $n$-vector with unity in the $i$th position and zeros elsewhere, $1 \le i \le n$.

Theorem 3.1. For fixed $i \ge 1$ and $n \ge i$, let ${}_{(n)}\tilde P$ be formed from ${}_{(n)}P$ by linear augmentation using ${}_{(n)}a = {}_{(n)}e_i$ (i.e. by increasing the elements of its $i$th column only), and let ${}_{(n)}\tilde\pi$ be the unique stationary distribution of ${}_{(n)}\tilde P$. Then ${}_{(n)}\tilde\pi \to \pi$ as $n \to \infty$.

That ${}_{(n)}\tilde\pi_j/{}_{(n)}\tilde\pi_i \to \pi_j/\pi_i$ as $n \to \infty$ was proved by Seneta (1967), (1968) in a different guise; see Seneta (1980). Theorem 3.1 as a whole was proved by Wolf


(1975), Satz 3, essentially using this fact; see also Wolf (1980), Section 5, and Allen et al. (1977). The result can be extended as follows; we omit the proof here (as elsewhere) for brevity.

Theorem 3.2. Let $a = \{a_i\}_{i=1}^{\infty}$ be a probability vector with $\sum_{i=1}^{N} a_i = 1$ for some fixed finite $N$, and let ${}_{(n)}a$ consist of the first $n$ entries of $a$, $n \ge N$. Let ${}_{(n)}\tilde P$ be formed by linear augmentation of ${}_{(n)}P$ using ${}_{(n)}a$, $n \ge N$. Then ${}_{(n)}\tilde\pi \to \pi$ as $n \to \infty$, where ${}_{(n)}\tilde\pi$ is the unique stationary distribution of ${}_{(n)}\tilde P$.

That arbitrary linear augmentation is not always successful, and that the manner of growth of the probability vectors ${}_{(n)}a$ as $n \to \infty$ needs to be restricted, is demonstrated by the following example, where ${}_{(n)}a = {}_{(n)}e_n$ (so augmentation of ${}_{(n)}P$ to form ${}_{(n)}\tilde P$ occurs only in the last column).

Example. Consider the stochastic renewal matrix

$$P = \begin{pmatrix} 1 - p_1 & p_1 & 0 & 0 & \cdots \\ 1 - p_2 & 0 & p_2 & 0 & \cdots \\ 1 - p_3 & 0 & 0 & p_3 & \cdots \\ \vdots & & & & \ddots \end{pmatrix}$$

where $0 < p_i < 1$ for all $i \in \mathbb{N}$. $P$ is clearly irreducible. Define $a_0 = 1$ and $a_j = \prod_{i=1}^{j} p_i$, $j \in \mathbb{N}$. It is easy to see that positive recurrence of $P$ is equivalent to $\sum_{j=0}^{\infty} a_j < \infty$ (e.g. Seneta (1981), Section 5.6). In this case the stationary equations yield $\pi_1 = 1/\sum_{j=0}^{\infty} a_j$, $\pi_j = a_{j-1}\pi_1$, $j \in \mathbb{N}$. Fix $N \ge 3$ and define

$$p_j = \begin{cases} 1 - 1/j^2 & \text{if } j \equiv 0 \pmod N, \\ (j-1)^4 \big/ \{((j-1)^2 - 1)(j+1)^2\} & \text{if } j \equiv 1 \pmod N \text{ but } j \ne 1, \\ j^2/(j+1)^2 & \text{otherwise.} \end{cases}$$

Then for $j \ge 1$

$$a_j = \begin{cases} 1/(j+1)^2 & \text{if } j \not\equiv 0 \pmod N, \\ (j^2 - 1)/j^4 & \text{if } j \equiv 0 \pmod N, \end{cases}$$

so $P$ is positive-recurrent.


Notice that ${}_{(n)}\tilde P$ is irreducible for all $n \in \mathbb{N}$, and that the conditions of Lemma 2.1 are satisfied. The stationary equations ${}_{(n)}\tilde\pi' = {}_{(n)}\tilde\pi'\,{}_{(n)}\tilde P$ give

$${}_{(n)}\tilde\pi_1 = 1 \Big/ \Big( \sum_{j=0}^{n-2} a_j + a_{n-1}/q_n \Big), \qquad q_n = 1 - p_n,$$

since ${}_{(n)}\tilde\pi_{j+1} = {}_{(n)}\tilde\pi_j p_j$, $1 \le j \le n - 2$, and ${}_{(n)}\tilde\pi_n q_n = {}_{(n)}\tilde\pi_{n-1} p_{n-1}$. But for $n \equiv 0 \pmod N$, $a_{n-1}/q_n = 1$, so that along this subsequence $1/{}_{(n)}\tilde\pi_1 \to 1 + \sum_{j=0}^{\infty} a_j > 1/\pi_1$. Hence ${}_{(n)}\tilde\pi_1 \not\to \pi_1$ as $n \to \infty$.
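The bookkeeping in this example can be verified with exact rational arithmetic (the choice $N = 4$ below is ours):

```python
from fractions import Fraction as F

N = 4                                    # the fixed N >= 3 of the example

def p(j):
    # The p_j of the example.
    if j % N == 0:
        return 1 - F(1, j * j)
    if j % N == 1 and j != 1:
        return F((j - 1) ** 4, ((j - 1) ** 2 - 1) * (j + 1) ** 2)
    return F(j * j, (j + 1) ** 2)

# a_0 = 1, a_j = p_1 p_2 ... p_j.
a = [F(1)]
for j in range(1, 41):
    a.append(a[-1] * p(j))

# Closed form for a_j stated in the text.
for j in range(1, 41):
    want = F(j * j - 1, j ** 4) if j % N == 0 else F(1, (j + 1) ** 2)
    assert a[j] == want

# The offending term: for n ≡ 0 (mod N), a_{n-1}/q_n = 1 exactly, ...
print([a[n - 1] / (1 - p(n)) for n in (8, 12, 16)])
# ... while for other n it is small and vanishing.
print(float(a[9] / (1 - p(10))))
```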

4. Lower-Hessenberg P

Definition 4.1. A stochastic matrix $P = \{p_{ij}\}$ is said to be lower-Hessenberg if $p_{ij} = 0$, $j > i + 1$.

Such matrices satisfy a property dual to (2.2). Specifically, if $f_{ij}^{(k)}$, ${}_{(n)}f_{ij}^{(k)}$ denote the first-passage probabilities from state $i$ to state $j$, then $f_{ij}^{(k)} = {}_{(n)}f_{ij}^{(k)}$, $i < j \le n$, whence

(4.1) $\quad F_{ij}(z) = {}_{(n)}F_{ij}(z), \qquad i < j \le n, \quad |z| \le 1.$

We should also note the properties dual to (1.1) and (1.3) for positive-recurrent $P$ (Seneta (1981), Chapter 5): $F_{jj}'(1) = 1/\pi_j$, and, as $n \to \infty$, ${}_{(n)}F_{ij}(1) \uparrow F_{ij}(1) = 1$.

Although there is an obvious duality between upper- and lower-Hessenberg $P$, property (2.2) of the former is far more pertinent to our problem than property (4.1) of the latter, because it links the left Perron-Frobenius structure of the truncations with that of the infinite matrix. The example of Section 3 shows how difficulties may arise with positive-recurrent lower-Hessenberg $P$, in contrast to Theorem 2.2 for such upper-Hessenberg $P$. If, however, as suggested by the example and Theorem 3.1, we require that the sequence $\{{}_{(n)}\tilde P\}_{n=1}^{\infty}$ be constructed by linear augmentation (3.1) using a sequence $\{{}_{(n)}a\}_{n=1}^{\infty}$ which is more 'stable' than the sequence $\{{}_{(n)}e_n\}$, then the desired convergence of the corresponding stationary distributions obtains for lower-Hessenberg $P$.

Theorem 4.1. Suppose that $P$ is lower-Hessenberg and let $\{{}_{(n)}a\}_{n=1}^{\infty}$ be a tight sequence of probability vectors with $\sum_{j=1}^{n} {}_{(n)}a_j = 1$ for all $n \in \mathbb{N}$. If ${}_{(n)}\tilde P$ is formed by (3.1) for each $n \in \mathbb{N}$, then ${}_{(n)}\tilde\pi \to \pi$ as $n \to \infty$.


5. On applications and interpretation

Wolf (1980) mentions two examples of positive-recurrent matrices $P$ arising in queueing systems for which several specific methods of augmentation to form ${}_{(n)}\tilde P$ from ${}_{(n)}P$ will result in ${}_{(n)}\tilde\pi \to \pi$. A special case is the upper-Hessenberg matrix $P$ which arises as the imbedded chain of an M/G/1 queueing system: this is defined by $p_{1j} = a_{j-1}$, $1 \le j$; $p_{ij} = a_{j-i+1}$, $2 \le i \le j + 1$; $p_{ij} = 0$ otherwise, where $\{a_j\}$, $j \ge 0$, is a probability distribution for which we assume all elements are positive and $\sum_j j a_j < 1$, which renders $P$ positive-recurrent. Thus by Theorem 2.2 any augmentation may be used. Some numerical investigations on this matrix for various $\{a_j\}$ are contained in Allen et al. (1977), but only insofar as approximation as $n \to \infty$ of $\pi_j/\pi_1$, $j \ge 1$, is concerned. Below we give the results of numerical investigation of ${}_{(n)}\tilde\pi \to \pi$, where $a_j = \rho^j/(1 + \rho)^{j+1}$, $j \ge 0$, for $\rho = 0.5, 0.75, 0.9, 0.95$, and various $n$, as shown in Table 1. This special structure of $\{a_j\}$ is chosen because the form of $\pi$ is known analytically: $\pi_i = (1 - \rho)\rho^{i-1}$, $i \ge 1$. (Previous numerical investigations of ${}_{(n)}\tilde\pi \to \pi$, for $P$ of Markov and generalized-renewal structure only, are reported in Seneta (1980).) Table 1 shows the $l_1$ distance between ${}_{(n)}\tilde\pi$ and $\pi$ for the various methods of augmentation used, which are:

(i) Linear augmentation (3.1) with ${}_{(n)}a = {}_{(n)}e_1$ (i.e. augmentation of the first column of ${}_{(n)}P$ only).

(ii) Linear augmentation with ${}_{(n)}a = {}_{(n)}e_n$ (i.e. last column only).

Table 1. $l_1$ distance between ${}_{(n)}\tilde\pi$ and $\pi$, by method of approximation (i)-(v) and truncation size $n$. [Numerical entries not recoverable in this copy.]

(iii) Linear augmentation with ${}_{(n)}a = {}_{(n)}\mathbf{1}/n$.

(iv) Normalization of rows of ${}_{(n)}P$: ${}_{(n)}\tilde p_{ij} = {}_{(n)}p_{ij} \big/ \sum_{k=1}^{n} {}_{(n)}p_{ik}$, $i, j = 1, \ldots, n$.

(v) Augmentation of diagonal entries of ${}_{(n)}P$ only: ${}_{(n)}\tilde p_{ij} = {}_{(n)}p_{ij}$, $i \ne j$; ${}_{(n)}\tilde p_{ii} = {}_{(n)}p_{ii} + 1 - \sum_{k=1}^{n} {}_{(n)}p_{ik}$.
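A small experiment in the spirit of Table 1 can be set up as follows (our own re-computation, with $\rho = 0.5$ and $n = 15$ chosen by us; the printed values are not those of the paper's table):

```python
import numpy as np

rho, n = 0.5, 15
idx = np.arange(n + 2)
aj = rho ** idx / (1 + rho) ** (idx + 1)    # a_j = rho^j / (1 + rho)^{j+1}

# Truncated imbedded M/G/1 chain: p_{1j} = a_{j-1}; p_{ij} = a_{j-i+1}.
Pn = np.zeros((n, n))
Pn[0, :] = aj[:n]
for i in range(1, n):
    for k in range(i - 1, n):
        Pn[i, k] = aj[k - i + 1]
d = 1.0 - Pn.sum(axis=1)                    # row deficiencies

def stat(P):
    A = np.vstack([(np.eye(n) - P).T, np.ones(n)])
    b = np.zeros(n + 1); b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

e1, en = np.eye(n)[0], np.eye(n)[-1]
methods = {
    "i: first column":  Pn + np.outer(d, e1),
    "ii: last column":  Pn + np.outer(d, en),
    "iii: uniform":     Pn + np.outer(d, np.full(n, 1.0 / n)),
    "iv: row-normalize": Pn / Pn.sum(axis=1, keepdims=True),
    "v: diagonal":      Pn + np.diag(d),
}

pi = (1 - rho) * rho ** np.arange(n)        # exact pi_i = (1-rho) rho^{i-1}
err = {k: np.abs(stat(P) - pi).sum() + rho ** n for k, P in methods.items()}
for k, v in err.items():
    print(f"{k:17s} l1 error = {v:.2e}")
```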

It is noticeable that, for fixed $P$ and $n$, (ii) provides the best $l_1$ approximation to $\pi$ while (i) provides the worst. For an upper-Hessenberg matrix and augmentation (ii), ${}_{(n)}\tilde\pi_j = \pi_j \big/ \sum_{k=1}^{n} \pi_k$, as noted at the end of Section 2, so this result is not altogether surprising. Further plausible arguments as to why (ii) should be the 'best' augmentation may be provided on account of $P$ being stochastically monotone. For fixed $n$, the stationary distribution vector of each ${}_{(n)}\tilde P$ was found as the unique solution of the $(n \times n)$ non-singular equation system obtained from ${}_{(n)}\tilde\pi'({}_{(n)}\tilde P - I) = 0'$ on replacing one of its equations by the normalization condition ${}_{(n)}\tilde\pi'\,{}_{(n)}\mathbf{1} = 1$ (cf. Seneta (1980), Section 3).

Augmentation methods can sometimes be given a natural physical interpretation in relation to a Markov chain $\{X_t\}$ described by the underlying infinite matrix $P$. For example, linear augmentation of ${}_{(n)}P$ by the probability $n$-vector ${}_{(n)}a$ to form ${}_{(n)}\tilde P$ (see (3.1)) amounts to saying that as soon as the Markov chain $\{X_t\}$ leaves the set of states $E = \{1, 2, \ldots, n\}$ it is immediately returned to $E$ with probability distribution ${}_{(n)}a$ over $E$, irrespective of the state from which the exit occurred. The new Markov chain described by ${}_{(n)}\tilde P$ is then called, in the applied literature, the 'return process', and the stationary distribution vector given by (3.2) describes (in relation to $\{X_t\}$) the relative proportions of mean time spent in the various states $1, 2, \ldots, n$ before exit from this set. Augmentation of diagonal elements only, as in (v) above, amounts to immediately returning the process, whenever exit from $E$ is made, to the state of $E$ from which the exit occurred. Another augmentation which has occurred in the literature is 'the Markov chain $X_t$ watched in $E$', where, if the chain $\{X_t\}$ leaves $E$, the next time-point in the chain described by ${}_{(n)}\tilde P$ is that at which the original chain reappears in $E$. For an upper-Hessenberg (positive-recurrent) $P$, this corresponds to forming ${}_{(n)}\tilde P$ by augmenting the last column of ${}_{(n)}P$ (since re-entry into $E$ must occur through state $n$).

6. Concluding remarks

Proofs of Theorems 3.1 and 4.1, together with a generalization of this work to convergence of quasi-stationary distributions of ${}_{(n)}P$ to $\pi$, are available from the authors in a more extensive account (Gibson and Seneta (1986)). A discussion of the theory in the situation where $P$ is stochastically monotone will be given elsewhere.


References

ALLEN, B., ANDERSSEN, R. S. AND SENETA, E. (1977) Computation of stationary measures for infinite Markov chains. TIMS Studies in the Management Sciences, Vol. 7: Algorithmic Methods in Probability, ed. M. F. Neuts, North-Holland, Amsterdam, pp. 13-23.
GIBSON, D. AND SENETA, E. (1986) Augmented truncations of infinite stochastic matrices. Technical Report, Department of Mathematical Statistics, University of Sydney, NSW 2006, Australia.
GOLUB, G. H. AND SENETA, E. (1974) Computation of the stationary distribution of an infinite stochastic matrix of special form. Bull. Austral. Math. Soc. 10, 255-261.
SENETA, E. (1967) Finite approximations to infinite non-negative matrices. Proc. Camb. Phil. Soc. 63, 983-992; Part II: Refinements and applications, 64 (1968), 465-470.
SENETA, E. (1980) Computing the stationary distribution for infinite Markov chains. Linear Algebra Appl. 34, 259-267.
SENETA, E. (1981) Non-Negative Matrices and Markov Chains, 2nd edn. Springer-Verlag, New York.
WOLF, D. (1975) Approximation homogener Markoff-Ketten mit abzählbarem Zustandsraum durch solche mit endlichem Zustandsraum. In Proceedings in Operations Research 5, Physica-Verlag, Würzburg, pp. 137-146.
WOLF, D. (1980) Approximation of the invariant probability measure of an infinite stochastic matrix. Adv. Appl. Prob. 12, 710-726.
