SIMULATION OF IMPROVED GAUSSIAN TIME HISTORY

By Loren D. Lutes,¹ Fellow, ASCE, and Jim Wang,² Student Member, ASCE

INTRODUCTION
A conventional technique for simulating a Gaussian time history generates the Gaussian signal by summing a number of sine waves with random phase angles and either deterministic or random amplitudes. After this simulated process has been used as the input to some system, unknown response expectations are generally estimated from time averages for stationary processes. However, the expectation estimates based on a single time history may exhibit considerable scatter, or statistical inaccuracy, for two reasons: only a limited number of terms have been used in the frequency summation, and the time history is of finite length. This statistical inaccuracy is usually not a serious problem for estimating the mean and variance, but it may be a problem in estimating probabilities of failure, which are often more affected by the tails of the probability distribution. Presented herein is an approach for obtaining an improved Gaussian time history whose time-average moments are exactly equal to the corresponding theoretical expectations for a Gaussian process. It is shown that the joint moments and the moments of extrema of the improved time history also agree with the corresponding properties of the Gaussian process, providing stronger evidence for considering the time history a representative sample from a Gaussian process.
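As a concrete sketch of this conventional technique, the sum of sine waves with random phase angles can be implemented as follows. This is a minimal illustration, not the authors' code; the flat narrow-band PSD shape, the unit target variance, and the record length below are assumptions chosen only for demonstration.

```python
import math
import random

def simulate_gaussian(psd, freqs, dt, n_points, seed=0):
    """Conventional technique: sum sinusoids with random phases.

    psd[i] is the one-sided PSD G_X(w_i) at midpoint frequency freqs[i],
    assumed to lie on a uniform grid with spacing dw.
    """
    rng = random.Random(seed)
    dw = freqs[1] - freqs[0]
    # Deterministic amplitudes sqrt(2 * G_X(w_i) * dw)
    amps = [math.sqrt(2.0 * g * dw) for g in psd]
    # Random phase angles, uniform on [0, 2*pi]
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in freqs]
    return [sum(a * math.sin(w * k * dt + th)
                for a, w, th in zip(amps, freqs, phases))
            for k in range(n_points)]

# Hypothetical narrow-band example: flat PSD over 0.9 to 1.1 rad/s
# (bandwidth 20% of the average frequency), 40 components, scaled so
# that the target variance sum(G*dw) equals 1.
n_comp, dw = 40, 0.2 / 40
freqs = [0.9 + (i + 0.5) * dw for i in range(n_comp)]
psd = [1.0 / (n_comp * dw)] * n_comp
x = simulate_gaussian(psd, freqs, dt=0.2, n_points=40000)
```

By construction the target variance is the sum of the squared amplitudes divided by two, equal to 1 here; the time-average moments estimated from any one such record scatter around the Gaussian values, and the scatter is exactly the statistical inaccuracy this note addresses.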
PROBLEM FORMULATION
A standard procedure, commonly referred to as the Gaussian technique, for simulating a Gaussian time history generates the random Gaussian signal from a one-sided power spectral density (PSD) G_X(ω), according to the equation

X(t) = Σ_{i=1}^{N} √(2 G_X(ω_i) Δω_i) sin(ω_i t + θ_i)    (1)
in which Δω_i = the frequency interval; ω_i = the midpoint of the ith frequency interval; and θ_i = the random phase angle, uniform on [0, 2π]. Time-average expectation estimates are calculated by Ê(X^m) = Σ_{j=1}^{N} X_j^m / N, where X_j = X(t_j).

¹Prof. of Civ. Engrg., Texas A&M Univ., College Station, TX 77843-3136.
²Grad. Res. Asst., Texas A&M Univ., College Station, TX.
Note. Discussion open until June 1, 1991. To extend the closing date one month, a written request must be filed with the ASCE Manager of Journals. The manuscript for this paper was submitted for review and possible publication on February 2, 1990. This paper is part of the Journal of Engineering Mechanics, Vol. 117, No. 1, January, 1991. ©ASCE, ISSN 0733-9399/91/0001-0218/$1.00 + $.15 per page. Paper No. 25407.
218. Downloaded 18 Jun 2010 to 129.15.14.53. Redistribution subject to ASCE license or copyright.

It is not uncommon that the time-average moments evaluated from time
histories simulated using the conventional Gaussian technique show considerable variations, and the variations grow as the moment order m grows. In engineering applications, it is often necessary to look at moments higher than the variance. For example, fatigue damage generally varies like some measure of stress to a power m. For welded connections or points of high stress concentration, m is usually about 3 or 4 (Lutes et al. 1984). However, if the stress is a nonlinear function, say a cubic function, of a Gaussian process X, then the fatigue damage may involve moments of X as high as 9 or 12.

Improved simulation might be done by replacing the deterministic amplitude terms in Eq. 1 with independent Rayleigh variables having the same mean-square value, by increasing the number of sinusoidal terms in the summation, or by using more sophisticated techniques (Yamazaki and Shinozuka 1990). More commonly, a long simulated time history or an ensemble of time histories is used to get stable moment estimates in simulation studies. In this situation, the procedure of numerically simulating the response of a complicated nonlinear system will become increasingly time consuming, even though the simulation of the Gaussian time history itself may be done very efficiently by an FFT procedure.

The technique presented here to obtain a sequence having exact Gaussian moments is to use a nonlinear transformation of a weakly non-Gaussian sequence. This is the inverse of a commonly used technique for generating a non-Gaussian process by a nonlinear transformation of a Gaussian process. In particular, the initial nearly Gaussian time history is simulated using Eq. 1. The question then is whether it is possible to modify this nearly Gaussian time history by a nonlinear transformation to obtain a single representative time history whose time-average moments agree with the expectations for a Gaussian process, even though the time history is of limited length.

FITTING TIME-AVERAGE MOMENTS
A stationary Gaussian process {X(t)} with zero mean has the moment property

E(X^m) = 1 · 3 · 5 ... (m - 1) σ_X^m,   for m = 2, 4, 6, ...    (2)

in which σ_X = the standard deviation of X(t), and Y(t) = the nearly Gaussian simulated time history. Then X(t) may be obtained by a transformation function of Y(t), i.e.

X(t) = g[Y(t)]
(3)
where g(·) = a nonlinear monotonic function. This nonlinear transformation can be approximated by a polynomial expansion as in

X(t) = Σ_n β_n U^n(t)
(4)
in which U(t) = [Y(t) - μ_Y]/σ_Y, with zero mean and unit standard deviation. To relate the unknown coefficients in Eq. 4 to the time-average moments
[FIG. 1. Flowchart for Transformation and Iteration. Steps: generate the time sequence (Y_1, Y_2, ..., Y_N); take the initial X sequence as X_i = Y_i; find the mean μ_X = (1/N) Σ_i X_i and variance σ_X^2 = (1/N) Σ_i (X_i - μ_X)^2; obtain the standardized sequence (U_1, U_2, ..., U_N) with U_i = (X_i - μ_X)/σ_X; evaluate the moments (μ_3, μ_4, ..., μ_{m+1}); check for convergence of (μ_3, μ_4, ..., μ_{m+1}): if yes, stop; if no, evaluate the Hermite coefficients h_k = (1/k!)(1/N) Σ_i He_k(U_i) for k = 3, ..., m + 1, modify the X sequence as X_i = U_i - Σ_k h_{k+1} He_k(U_i), and repeat.]
of U(t), Eq. 4 can be rearranged into a Hermite series (Winterstein 1988) as follows

X = U - Σ_n h_{n+1} He_n(U)
(5)
in which

He_n(u) = (-1)^n exp(u^2/2) (d^n/du^n) exp(-u^2/2),   n = 0, 1, 2, ...
(6)
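The polynomials of Eq. 6 satisfy the recursion He_{n+1}(u) = u He_n(u) - n He_{n-1}(u), and the iteration diagrammed in Fig. 1 can be sketched as follows. This is a minimal illustration rather than the authors' implementation; the convergence tolerance, the maximum iteration count, and the default Hermite order are assumptions.

```python
import math

def hermite(n, u):
    """Probabilists' Hermite polynomial He_n(u) via the recursion
    He_{n+1}(u) = u*He_n(u) - n*He_{n-1}(u)."""
    h_prev, h = 1.0, u
    if n == 0:
        return h_prev
    for k in range(1, n):
        h_prev, h = h, u * h - k * h_prev
    return h

def improve_sequence(y, herm_order=7, tol=1e-9, max_iter=100):
    """Fig. 1 iteration: standardize the sequence, estimate the
    coefficients h_k = E{He_k(U)}/k! (Eq. 7), and modify the sequence
    as X_i = U_i - sum_k h_{k+1} He_k(U_i) (Eq. 5) until the
    coefficients, and hence the moments, converge."""
    x = list(y)
    n = len(x)
    u = x
    for _ in range(max_iter):
        mu = sum(x) / n
        sd = math.sqrt(sum((xi - mu) ** 2 for xi in x) / n)
        u = [(xi - mu) / sd for xi in x]      # standardized sequence
        # h_3, ..., h_{herm_order+1} from the sample moments of U
        h = [sum(hermite(k, ui) for ui in u) / (n * math.factorial(k))
             for k in range(3, herm_order + 2)]
        if max(abs(hk) for hk in h) < tol:
            break
        # Modify: pair h_{k+1} with He_k, k = 2, ..., herm_order
        x = [ui - sum(hk * hermite(k, ui)
                      for hk, k in zip(h, range(2, herm_order + 1)))
             for ui in u]
    return u
```

With herm_order = 7 the coefficients h_3 through h_8 are driven to zero, so the time-average moments through eighth order match the standard Gaussian values, mirroring the numerical example discussed below.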
For example, He_0(u) = 1, He_1(u) = u, He_2(u) = u^2 - 1, He_3(u) = u^3 - 3u, He_4(u) = u^4 - 6u^2 + 3, etc. The coefficients h_n can be approximated by

h_n = E{He_n[U(t)]}/n!
(7)
which directly relates to the nth moment of the U(t) process; for example, h_3 = μ_3/6 and h_4 = (μ_4 - 3)/24, in which μ_n = E(U^n). Note that Eq. 7 is only a first-order approximate formulation, but high accuracy in matching the moments can be achieved using an iterative approach, as diagrammed in Fig. 1. A more complicated second-order formulation is also available (Winterstein et al. 1989), but it does not seem to be warranted here. If the (m - 1)th-order Hermite polynomial is used, the iteration will converge toward a final time history which has its first m time-average moments exactly equal to those of the standard Gaussian process. In fact, numerical results show that higher moments (e.g., m + 2, m + 4, etc.) may also be quite close to the theoretical values even though they would
TABLE 1. Comparison of Moments

Approach                     m = 2    m = 4    m = 6     m = 8      m = 10     m = 12
(1)                          (2)      (3)      (4)       (5)        (6)        (7)

(a) E(X^m)
Theoretical                  1        3        15        105        945        10,395
Original simulation          1.000    2.903    13.58     85.02      645.29     5,595.2
Improved simulation          1.000    3.000    15.00     105.0      938.0      9,955.8

(b) E(Z|Z|^{m-1})
Theoretical                  1.874    7.497    44.98     359.9      3,598.9    43,188
Original simulation          1.868    7.125    38.58     270.8      2,261.7    20,071
Improved simulation          1.872    7.420    44.09     347.5      3,422.1    40,348

(c) E(R^m) (ensemble of 25 records)
Original simulation, μ       7.935    125.8    2,930.1   9.205E4    3.549E6    1.445E8
Original simulation, CoV     0.0139   0.0473   0.1458    0.2967     0.4828     0.6987
Improved simulation, μ       7.947    127.3    2,968.4   9.528E4    3.826E6    1.581E8
Improved simulation, CoV     0.0118   0.0208   0.0324    0.0504     0.0763     0.1061

[FIG. 2. Comparison of Joint Moments. Joint moments are plotted against the correlation coefficient, ρ, over the range 0 to 1.0; an inset shows the total nonlinear transformation.]
not generally converge to the exact values, since moment information only to order m has been used in the transformation. It usually takes from two to about 12 iterations to get the desired results.

To illustrate the procedure, numerical results have been obtained from a simulated narrow-band time history, which contained approximately 2,600
cycles with approximately 28 points per cycle. The PSD had a bandwidth of 20% of the average frequency and was approximated by 40 frequency components. Results are shown in Table 1, which lists the moment values to the 12th order before and after the iteration using seventh-order Hermite polynomials. Note that only the moments to eighth order are matched exactly, but the 10th and 12th moments are also greatly improved. The total nonlinear transformation resulting from the iterations is evaluated numerically and shown in the inset of Fig. 2. In this particular example, the moments of the original time history are lower than the theoretical values, as can be seen from Table 1 and the inset of Fig. 2, where the original time history covers -3.5σ to +3.5σ while the improved time history covers -4σ to +4σ after the nonlinear transformation. It should be noted that while the time-average moments have been "tuned" to the Gaussian values, the frequency spectrum of the original time history may well be distorted by such a nonlinear transformation. However, numerical results from Fourier analysis have shown that the changes in the spectrum are not significant as long as the original time history is not strongly non-Gaussian.

JOINT MOMENTS AND PEAK MOMENTS
For the Gaussian process {X(t)}, the random variables X_1, X_2, ..., X_N are jointly Gaussian. In general, the following joint-moment properties hold for zero-mean Gaussian processes:

E(X_1 X_2 ... X_{2n-1}) = 0
(8)
E(X_1 X_2 ... X_{2n}) = Σ E(X_j X_k) ... E(X_r X_s)
(9)
where the summation in Eq. 9 is to be taken over all different ways in which 2n elements may be grouped into n pairs. The number of terms M in the summation is given by

M = (2n)!/(2^n n!)
(10)
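Eq. 10 can be verified by brute-force enumeration of the pairings that appear in Eq. 9. The small recursive generator below is illustrative only; any enumeration scheme would do.

```python
from math import factorial

def pairings(elems):
    """Yield every way to group an even-sized list of indices into
    pairs, by pairing the first element with each possible partner."""
    if not elems:
        yield []
        return
    first, rest = elems[0], elems[1:]
    for i in range(len(rest)):
        remaining = rest[:i] + rest[i + 1:]
        for tail in pairings(remaining):
            yield [(first, rest[i])] + tail

# The number of pairings of 2n elements equals (2n)!/(2^n n!), Eq. 10:
# 1, 3, 15, 105 for n = 1, 2, 3, 4.
counts = [sum(1 for _ in pairings(list(range(2 * n)))) for n in range(1, 5)]
```

Each of the M pairings contributes one product of covariances to the sum in Eq. 9, which is how closed forms for specific joint moments are obtained.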
This property of the joint moments is unique to Gaussian processes. It can be used to calculate theoretical values for joint moments of the type E(X_i^m X_j^k) by letting X_1 = X_2 = ... = X_m = X_i and X_{m+1} = X_{m+2} = ... = X_{m+k} = X_j in Eq. 9. For example

E(X_i^5 X_j) = 15 σ_X^6 ρ    (11)

E(X_i^4 X_j^2) = 3 σ_X^6 (1 + 4ρ^2)    (12)

E(X_i^3 X_j^3) = 3 σ_X^6 ρ(3 + 2ρ^2)    (13)
where ρ = E(X_i X_j)/σ_X^2 is the correlation coefficient. These joint moments can be estimated by taking time averages of the time history, i.e., Ê(X_k X_{k+r}) = Σ_{k=1}^{N-r} X_k X_{k+r}/(N - r). Thus, if a simulated time history is a good approximation of an ideal sample from a Gaussian process, its time-average joint moments should agree with the theoretical values given on the right-hand sides of Eqs. 11, 12, and 13. The joint moments have been evaluated for
the original time history and for the improved time history for different ρ values. The results are shown in Fig. 2, along with the theoretical joint-moment curves. It can be seen that the joint moments of the improved time history agree well with the theoretical values for the joint Gaussian distribution.

The process formed by the extrema of {X(t)}, the envelope process, is usually of more interest than the process itself for problems like first-excursion probability, fatigue damage, and crack propagation. For narrow-band Gaussian processes, the envelope process follows the well-known Rayleigh distribution, but more generally it is governed by the S.O. Rice distribution. The moments of the peak distribution can be expressed as

E(Z|Z|^{m-1}) = 2^{m/2} α Γ(1 + m/2) σ_X^m    (14)
where Z denotes a peak of the Gaussian process and α is the irregularity factor (Lutes et al. 1984). For a perfectly Gaussian time history, the peak moments should agree with those given by Eq. 14, and this can be used as another test of normality. Care must be taken, though, since the simulated time history represents a discrete sampling of a continuous process. A simple way to improve the estimates of the peaks and valleys (Z) of the continuous process is to use parabolic interpolation through each set of simulated values Z_{i-1}, Z_i, Z_{i+1} with Z_i being either larger or smaller than both Z_{i-1} and Z_{i+1}. The maximum or minimum of the parabola is then used in the time-average evaluation of the E(Z|Z|^{m-1}) moment. Comparisons of peak moments from the original and improved time histories with theoretical values are also shown in Table 1 for m = 2 to 12. It can be seen that the improved time history gives much better peak-moment estimates.

Application of this method to the rain-flow ranges commonly used in fatigue calculations has shown that the sample variations in the rain-flow-range moments are significantly reduced by the modification procedure. Furthermore, the range moments of the improved time histories compare well with an ensemble average for the original process, as can be seen in Table 1, which gives mean and coefficient of variation values. More detailed discussion of the fatigue results is not possible within the confines of this technical note.

SUMMARY AND CONCLUSIONS
Moment calculations for a limited-length time history simulated by the conventional Gaussian technique make the time history appear to be weakly non-Gaussian. An improved Gaussian time history with time-average moments exactly equal to the expectations for a Gaussian process can be obtained by a nonlinear transformation of a weakly non-Gaussian time history, and this transformation can be found through a simple iteration procedure. It has been shown that the improved time history can be considered representative of a Gaussian process in the sense that: (1) Its time-average moments agree with the expectations of the Gaussian distribution; (2) its time-average joint moments agree with the joint moments from the joint Gaussian distribution; and (3) its peak moments agree with the S.O. Rice distribution.
ACKNOWLEDGMENT
This research was partially supported by the Offshore Technology Research Center at Texas A&M University, through the National Science Foundation Engineering Research Centers Program, grant number CDR87-21512.

APPENDIX. REFERENCES
Lin, Y. K. (1967). Probabilistic theory of structural dynamics. McGraw-Hill, Inc., New York, N.Y.
Lutes, L. D., Corazao, M., Hu, S. J., and Zimmerman, J. (1984). "Stochastic fatigue analysis." J. Struct. Engrg., ASCE, 110(11), 2585-2601.
Winterstein, S. R. (1988). "Nonlinear vibration models for extremes and fatigue." J. Struct. Engrg., ASCE, 114(10), 1772-1790.
Winterstein, S. R., De, R. S., and Bjerager, P. (1989). "Correlated non-Gaussian models in offshore structural reliability." Proc., Fifth Int. Conf. on Structural Safety and Reliability, San Francisco, Calif., 239-242.
Yamazaki, F., and Shinozuka, M. (1990). "Simulation of stochastic fields by statistical preconditioning." J. Struct. Engrg., ASCE, 116(2), 268-287.
Yang, J.-N. (1973). "On the normality and accuracy of simulated random processes." J. Sound Vib., 26(3), 417-428.