Part a)
Discrete-time, continuous-state Markov processes are widely used; autoregressive processes are an important example.
In fact, if you relax the Markov property and consider discrete-time, continuous-state stochastic processes in general, you arrive at the subject matter of a large part of time series analysis and signal processing.
The best-known examples are ARMA processes, conditionally heteroscedastic models, and a large subclass of hidden Markov models. A minimal AR(1) simulation is sketched below.
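For illustration, here is a minimal Python sketch of an AR(1) process (the coefficient, noise scale, and sample size are arbitrary choices, not taken from the text); each new value depends only on the current one, which is exactly the Markov property on a continuous state space.

```python
import numpy as np

# Minimal sketch of an AR(1) process: x[t] = phi * x[t-1] + eps[t].
# The next value depends only on the current one, so the path is a
# discrete-time, continuous-state Markov process.
rng = np.random.default_rng(seed=0)
phi, sigma, n = 0.8, 1.0, 1000   # illustrative parameters, not from the text
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + sigma * rng.standard_normal()
```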
Part b)
Given the random process X(t) = K cos(ωt), t ≥ 0, where K is uniformly distributed on [0, 1].
The expected value of K is
E[K] = \frac{1+0}{2} = \frac{1}{2}
The variance of K is
\sigma_K^2 = \frac{(1-0)^2}{12} = \frac{1}{12}
Thus
"E[K^2] = \\sigma_K^2+ [E[K]]^2 \\\\\nE[K^2] = \\frac{1}{12} + (0.5)^2 \\\\\nE[K^2] = \\frac{1}{ 3}"
The mean of X(t) is,
"E[X(t)] = E[Kcos(wt)] \\\\\nE[X(t)] = E[K]cos(wt) \\\\\nE[X(t)] = \\frac{1}{2} cos(wt)"
The autocorrelation function of X(t) is,
"R_{xx}(t, s) = E[X(t)X(s)] \\\\\nR_{xx}(t, s) = E[K^2cos(wt)cos(ws)] \\\\\nR_{xx}(t, s) = E[K^2]cos(wt)cos(ws)\\\\\n R_{xx}(t, s) = \\frac{1}{3} cos(wt)cos(ws)"
The autocovariance function of X(t) is,
"C_{xx}(t, s) = R_{xx}(t, s) \u2014 E[X(t)]E[X (s)]\\\\\nC_{xx}(t, s) = \\frac{1}{3}cos(wt)cos(ws) -\\frac{1}{4} cos(wt)cos(ws)\\\\\n C_{xx}(t, s) = \\frac{1}{12}cos(wt)cos(ws)"