Chapter 1
Statistical Analysis and Spectral Method
Statistical and spectral analyses are commonly used to characterize random phenomena that are seemingly chaotic, irregular, and unpredictable. Since turbulent flow is random, they also serve as effective tools for studying and describing it.
Many different computational approaches have been developed to attack the turbulence problem, which is notoriously difficult. The most widely used models include statistical models and pseudospectral methods, based on statistical and spectral analysis, respectively. In the statistical models, such as turbulent-viscosity models (e.g., the $k$–$\varepsilon$ model), Reynolds-stress models, probability-density-function (PDF) methods, and large eddy simulation (LES), the turbulent flow is described in terms of certain statistics. In the pseudospectral methods, the turbulent flow is described in terms of spectral coefficients in spectral space.
Moreover, in order to extract the essential kinetic and dynamic characteristics of turbulence and to interpret experimental and numerical results properly, statistical and spectral analyses are also used to process and analyze these data, the amount of which has grown through the application of new experimental and computational tools.
Throughout the book, turbulence control problems are solved numerically by pseudospectral methods, the experimental and numerical data involved are treated by statistical or spectral analysis, and the discussions of turbulent flow are based on statistical and spectral descriptions. Therefore, statistical analysis and the spectral method are introduced briefly in Chapter 1.
In Section 1.1, statistical analysis is presented, and in Section 1.2, the statistical representation of turbulent flow is discussed. Sections 1.3 and 1.4 are concerned with spectral series expansions in Fourier series and other orthogonal bases. The fundamental concepts and techniques of the spectral method, a numerical method for partial differential equations, are introduced in Section 1.5, and its applications to the Navier–Stokes (N–S) equations for turbulent flows are discussed in Section 1.6.
1.1 Statistical Analysis[1-3]
1.1.1 Average Value
The fluid velocity field in turbulent flow is random: it varies significantly and irregularly in both position and time and is described by random variables. Consider a random event expressed by a random variable $u(t)$ that is statistically steady. Experimental measurements are taken repeatedly at a specified position under the same set of conditions, and $N$ independently measured curves, $u^{(1)}(t), u^{(2)}(t), \ldots, u^{(N)}(t)$, are obtained. The curve $u^{(i)}(t)$ is called a realization of the random event $u(t)$.
As shown in Figure 1.1, the time average for one realization is defined as

$$\bar{u} = \lim_{T\to\infty}\frac{1}{T}\int_{t_0}^{t_0+T} u(t)\,\mathrm{d}t \qquad (1.1)$$

Figure 1.1 Time average and ensemble average of random variables
Therefore, a random variable $u$ is decomposed into a mean $\bar{u}$ and a fluctuating part $u'$, representing the deviation from the mean, such that

$$u = \bar{u} + u' \qquad (1.2)$$

The mean value of the fluctuating quantity itself is zero, $\overline{u'} = 0$.
Similarly, the space average is defined as

$$\langle u\rangle = \lim_{L\to\infty}\frac{1}{L}\int_{x_0}^{x_0+L} u(x)\,\mathrm{d}x \qquad (1.3)$$

The ensemble average of the $N$ realizations under the same set of conditions is defined by

$$\langle u\rangle_N = \lim_{N\to\infty}\frac{1}{N}\sum_{i=1}^{N} u^{(i)}(t) \qquad (1.4)$$

Hence, for a statistically steady (ergodic) process, the time average equals the ensemble average:

$$\bar{u} = \langle u\rangle_N \qquad (1.5)$$
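As a discrete numerical sketch of the averages above (with synthetic, hypothetical data; `numpy` assumed), the time average, the fluctuation decomposition, and the ensemble average can be estimated as follows:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: N realizations of a statistically steady signal,
# sampled at T discrete times (rows = realizations, columns = times).
N, T = 200, 10_000
u = 2.0 + rng.standard_normal((N, T))

# Time average of one realization, the discrete analogue of Eq. (1.1).
time_avg = u[0].mean()

# Fluctuating part u' = u - u_bar, Eq. (1.2); its mean is zero by construction.
u_fluct = u[0] - time_avg

# Ensemble average across realizations at each time instant, Eq. (1.4).
ensemble_avg = u.mean(axis=0)
```

For a statistically steady signal the two averages agree closely, illustrating the ergodic relation of Eq. (1.5).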
1.1.2 Probability Density and Statistical Moments
1.1.2.1 Probability Density
A realization of the random event at a specified position is shown in Figure 1.2.
Figure 1.2 Indicator function
Define the indicator function

$$I(v,t) = \begin{cases} 1, & u(t) < v \\ 0, & u(t) \ge v \end{cases} \qquad (1.6)$$

where $v$ is a given value and $u(t)$ is an arbitrary value. Then we have

$$\bar{I}(v) = \lim_{T\to\infty}\frac{1}{T}\sum_i \Delta t_i$$

where $\Delta t_i$ is a time interval during which $u(t) < v$; that is, $\bar{I}(v)$ is the ratio between the total duration for which the condition is satisfied and the total duration of averaging, representing the fraction of time spent by $u(t)$ under the given level $v$. $F(v) = \bar{I}(v)$ is called the cumulative distribution function (CDF) ($0 \le F(v) \le 1$) and monotonically increases as $v$ increases, as shown in Figure 1.3.
Figure 1.3 Cumulative distribution function
The probability density function (PDF) is defined to be the derivative of the CDF:

$$f(v) = \frac{\mathrm{d}F(v)}{\mathrm{d}v} \qquad (1.7)$$

which represents the probability of events with $v \le u < v + \mathrm{d}v$. That is,

$$f(v)\,\mathrm{d}v = P(v \le u < v + \mathrm{d}v)$$

where $P$ denotes probability. It satisfies the normalization condition

$$\int_{-\infty}^{\infty} f(v)\,\mathrm{d}v = 1 \qquad (1.8)$$

Obviously, the average (or mean) of the random variable can be obtained from the PDF:

$$\bar{u} = \int_{-\infty}^{\infty} v f(v)\,\mathrm{d}v \qquad (1.9)$$

1.1.2.2 Statistical Moments
The mean values of the various powers of $u$ are called moments. The $n$th moment is defined by

$$M_n = \overline{u^n} = \int_{-\infty}^{\infty} v^n f(v)\,\mathrm{d}v \qquad (1.10)$$

where the first moment $M_1$, that is, $\bar{u}$, is the familiar mean value.
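As a rough numerical check (synthetic, hypothetical samples; `numpy` assumed), the PDF can be estimated with a normalized histogram and the moments evaluated by discrete analogues of Eqs. (1.8)–(1.10):

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.normal(loc=1.0, scale=0.5, size=200_000)  # hypothetical samples of u

# PDF estimated as a normalized histogram, a discrete analogue of Eq. (1.7).
f, edges = np.histogram(u, bins=200, density=True)
v = 0.5 * (edges[:-1] + edges[1:])   # bin centers
dv = edges[1] - edges[0]

norm = np.sum(f * dv)       # normalization, Eq. (1.8): should be ~1
M1 = np.sum(v * f * dv)     # first moment (mean), Eqs. (1.9)/(1.10)
M2 = np.sum(v**2 * f * dv)  # second moment, Eq. (1.10)
```

The histogram-based moments agree with the direct sample averages up to binning and sampling error.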
The $n$th central moment is defined by

$$m_n = \overline{(u')^n} = \int_{-\infty}^{\infty} (v - \bar{u})^n f(v)\,\mathrm{d}v \qquad (1.11)$$

where $u' = u - \bar{u}$ is the fluctuation. The first central moment $m_1$, of course, is zero. The second central moment $m_2 = \sigma^2$ is called the variance, characterizing the magnitude of the fluctuations with respect to the mean; its square root $\sigma$ is the standard deviation, often called the root mean square (rms), which is a measure of the width of $f(v)$.
The skewness associated with the third central moment is defined by

$$S = \frac{m_3}{\sigma^3} \qquad (1.12)$$

which gives an idea of the asymmetry of $f(v)$ about the mean. If the values of all odd central moments are zero, $f(v)$ is symmetric about the mean. If $S < 0$, the distribution curve of $f(v)$ shifts toward the negative fluctuation direction: the tail of the curve on the left side is longer, and negative fluctuations prevail.
The kurtosis associated with the fourth central moment is defined by

$$K = \frac{m_4}{\sigma^4} \qquad (1.13)$$

which is a measure of whether the curve of $f(v)$ is peaked or flat relative to the normal distribution introduced later. A curve of $f(v)$ with high kurtosis has a distinct peak near the mean, declines rather rapidly, and has heavy tails.
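The rms, skewness, and kurtosis of Eqs. (1.11)–(1.13) can be computed directly from samples; for Gaussian-distributed data (a hypothetical stand-in for a measured fluctuation record) one expects $S \approx 0$ and $K \approx 3$:

```python
import numpy as np

rng = np.random.default_rng(2)
u = rng.standard_normal(500_000)  # hypothetical fluctuation record

up = u - u.mean()                 # fluctuation u', Eq. (1.2)
sigma = np.sqrt(np.mean(up**2))   # rms, the square root of m2 in Eq. (1.11)

S = np.mean(up**3) / sigma**3     # skewness, Eq. (1.12)
K = np.mean(up**4) / sigma**4     # kurtosis, Eq. (1.13)
```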
1.1.2.3 Characteristic Function
A random variable can be written in the complex exponential form $e^{\mathrm{i}ku}$, where $k$ is the wave number. Its mean, called the characteristic function, is

$$\phi(k) = \overline{e^{\mathrm{i}ku}} = \int_{-\infty}^{\infty} e^{\mathrm{i}kv} f(v)\,\mathrm{d}v \qquad (1.14)$$

Differentiating Eq. (1.14) $n$ times with respect to $k$ and evaluating at $k = 0$, we have

$$\left.\frac{\mathrm{d}^n\phi}{\mathrm{d}k^n}\right|_{k=0} = \mathrm{i}^n\,\overline{u^n} = \mathrm{i}^n M_n \qquad (1.15)$$

This means that the moments are related to the derivatives of the characteristic function at the origin $k = 0$.
The characteristic function can be written as a Taylor series in the moments,

$$\phi(k) = \sum_{n=0}^{\infty} \frac{(\mathrm{i}k)^n}{n!} M_n \qquad (1.16)$$

therefore the characteristic function in principle can be determined from all of its derivatives at the origin, that is, from all the moments.
Inverting the Fourier transform in Eq. (1.14) yields

$$f(v) = \frac{1}{2\pi}\int_{-\infty}^{\infty} \phi(k)\,e^{-\mathrm{i}kv}\,\mathrm{d}k \qquad (1.17)$$

Then, $f(v)$ is recovered from the determined characteristic function $\phi(k)$.
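Eqs. (1.14) and (1.15) can be illustrated numerically: the empirical characteristic function is the sample mean of $e^{\mathrm{i}ku}$, and a central-difference derivative at $k = 0$ recovers the first moment (a sketch with synthetic, hypothetical data; `numpy` assumed):

```python
import numpy as np

rng = np.random.default_rng(3)
u = rng.normal(loc=0.7, scale=1.0, size=300_000)  # hypothetical samples

def phi(k):
    """Empirical characteristic function, Eq. (1.14)."""
    return np.mean(np.exp(1j * k * u))

# Central-difference derivative at the origin; by Eq. (1.15),
# phi'(0) = i * M1, so the mean is the imaginary part of phi'(0).
h = 1e-3
M1 = ((phi(h) - phi(-h)) / (2 * h)).imag
```

The recovered `M1` matches the sample mean up to the $O(h^2)$ finite-difference error.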
1.1.2.4 Normal Distribution
In many cases, there is a probability density function called the standard normal distribution (or standard Gaussian distribution), expressed by

$$f(v) = \frac{1}{\sqrt{2\pi}}\,e^{-v^2/2} \qquad (1.18)$$

Obviously, we have $\bar{u} = 0$ and $\sigma = 1$. If the probability density function is expressed by

$$f(v) = \frac{1}{\sigma\sqrt{2\pi}}\exp\!\left[-\frac{(v-\bar{u})^2}{2\sigma^2}\right] \qquad (1.19)$$

then it is said to be the normal distribution (or Gaussian distribution), in which the first central moment is $m_1 = 0$, the rms is $\sigma$, all odd central moments are zero, and the even central moments are expressed by

$$m_{2n} = (2n-1)!!\,\sigma^{2n} \qquad (1.20)$$

where "!!" denotes the double factorial. Also, the skewness $S = 0$ and the kurtosis $K = 3$.
The characteristic function is

$$\phi(k) = \exp\!\left(\mathrm{i}k\bar{u} - \frac{\sigma^2 k^2}{2}\right) \qquad (1.21)$$

The Gaussian distribution is symmetric about the peak at $v = \bar{u}$, with steep gradients near the peak, as shown in Figure 1.4. For an arbitrary PDF, the skewness $S$ and the kurtosis $K$ (often redefined as the excess kurtosis $K - 3$) represent the deviation from the symmetric shape of the Gaussian distribution.
Figure 1.4 Normal distribution function
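The even-moment formula of Eq. (1.20) can be checked against Gaussian samples; for $n = 2$ it predicts $m_4 = 3\sigma^4$ (a numerical sketch with synthetic, hypothetical data):

```python
import numpy as np

rng = np.random.default_rng(6)

sigma = 1.5
u = rng.normal(loc=0.0, scale=sigma, size=1_000_000)  # hypothetical Gaussian samples

m4 = np.mean(u**4)        # fourth central moment (the mean is zero here)
m4_theory = 3 * sigma**4  # Eq. (1.20) with n = 2: (2n - 1)!! sigma^(2n) = 3 sigma^4
```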
Let us consider statistically independent random variables $u_1, u_2, \ldots, u_N$, where $u_i$ and $u_j$ are said to be independent if $\overline{u_i u_j} = \overline{u_i}\,\overline{u_j}$ for $i \ne j$. We assume that all $u_i$ have identical probability densities and that their mean values are zero. Then, the normalized sum of all $u_i$,

$$z = \frac{1}{\sqrt{N}}\sum_{i=1}^{N} u_i,$$

has a Gaussian probability density as $N \to \infty$; that is, the PDF of $z$ approaches a normal distribution function. This result is called the central limit theorem.
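The central limit theorem can be demonstrated numerically: normalized sums of independent, zero-mean, non-Gaussian variables (uniform here, a hypothetical choice) acquire a nearly Gaussian PDF, with kurtosis approaching 3:

```python
import numpy as np

rng = np.random.default_rng(4)

# Each trial sums N independent zero-mean uniform variables (non-Gaussian).
N, trials = 50, 200_000
x = rng.uniform(-1.0, 1.0, size=(trials, N))
z = x.sum(axis=1) / np.sqrt(N)   # normalized sum over each trial

zp = z - z.mean()
sigma = np.sqrt(np.mean(zp**2))
K = np.mean(zp**4) / sigma**4    # kurtosis of z: ~3 if z is nearly Gaussian
```

A single uniform variable has kurtosis 1.8; after summing 50 of them the kurtosis of `z` is already close to the Gaussian value 3.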
1.1.3 Correlation Function
1.1.3.1 Auto-Correlation Function
Only the distributions of fluctuations at one point in time or space were discussed in the previous section. The relations between neighboring fluctuations are discussed further here.
Using a superscript to denote the measured position, the indicator function is defined by

$$I(v_1, t_1; v_2, t_2) = \begin{cases} 1, & u(t_1) < v_1 \ \text{and}\ u(t_2) < v_2 \\ 0, & \text{otherwise} \end{cases} \qquad (1.22)$$

where $v_1$ and $v_2$ are two given values and $u$ is an arbitrary value at the measured position; $t_1$ and $t_2$ represent different times.
The cumulative probability distribution function is

$$F(v_1, t_1; v_2, t_2) = \overline{I(v_1, t_1; v_2, t_2)} \qquad (1.23)$$

The correlated (joint) probability density function is

$$f(v_1, t_1; v_2, t_2) = \frac{\partial^2 F(v_1, t_1; v_2, t_2)}{\partial v_1\,\partial v_2} \qquad (1.24)$$

It is the fraction of time during which the random variable $u$ is between $v_1$ and $v_1 + \mathrm{d}v_1$ at time $t_1$ and between $v_2$ and $v_2 + \mathrm{d}v_2$ at time $t_2$. It can also be written as

$$f(v_1, t_1; v_2, t_2)\,\mathrm{d}v_1\,\mathrm{d}v_2 = P(v_1 \le u(t_1) < v_1 + \mathrm{d}v_1,\ v_2 \le u(t_2) < v_2 + \mathrm{d}v_2) \qquad (1.25)$$

which represents the statistical mass of a small square area shown in Figure 1.5, if $f$ is regarded as a density.
Figure 1.5 Correlated probability density
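As a numerical sketch (synthetic, correlated data; `numpy` assumed), the joint PDF can be estimated with a normalized two-dimensional histogram, whose discrete marginals reproduce Eqs. (1.26) and (1.27):

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical correlated pair standing in for u(t1) and u(t2):
# u2 is u1 plus independent noise, giving a genuine joint structure.
n = 400_000
u1 = rng.standard_normal(n)
u2 = 0.8 * u1 + 0.6 * rng.standard_normal(n)

# Joint PDF via a normalized 2D histogram, a discrete analogue of Eq. (1.24).
f2, e1, e2 = np.histogram2d(u1, u2, bins=100, density=True)
dv1 = e1[1] - e1[0]
dv2 = e2[1] - e2[0]

total = f2.sum() * dv1 * dv2        # Eq. (1.27): total probability ~1
f1_marginal = f2.sum(axis=1) * dv2  # Eq. (1.26): marginal PDF of u1
```

Integrating out $v_2$ recovers the one-point PDF of `u1`, matching a direct histogram of `u1` on the same bins.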
The joint PDF also satisfies the following equations:

$$\int_{-\infty}^{\infty} f(v_1, t_1; v_2, t_2)\,\mathrm{d}v_2 = f(v_1, t_1) \qquad (1.26)$$

$$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(v_1, t_1; v_2, t_2)\,\mathrm{d}v_1\,\mathrm{d}v_2 = 1 \qquad (1.27)$$

The central moment...