
Parameter Identification

For on-line parameter identification, these simulations use a sequential least squares algorithm [3] (often called recursive least squares, although it is not truly recursive) with a forgetting factor. This algorithm is known to have conditioning problems when the inputs vary slowly [4].

Equations 7 and 8 give the parameter vector and input vector, respectively, for the least squares algorithm.

\begin{displaymath}
{\mathbf\theta} = \left[\begin{array}{ccccccccc}
m_0 & m_1 & m_2 & m_3 & m_4 & m_5 & m_6 & m_7 & \ldots
\end{array}\right]^T
\end{displaymath} (7)


\begin{displaymath}
{\mathbf w} = \left[\begin{array}{ccccccccc}
1 & \alpha & \delta_e & \ldots & \delta_e^3 & \alpha\delta_e^2 & \ldots
\end{array}\right]^T
\end{displaymath} (8)

The output is $C_{M{\mathrm est}}={\mathbf w}^T{\mathbf \theta}$, and the output error is $C_{M{\mathrm est}} - C_{M_0}$.
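As a concrete sketch, the regressor and output computation can be written as follows. The basis shown here is a hypothetical truncation of the input vector in Equation 8 (only a few representative terms in $\alpha$ and $\delta_e$ are included), and the function names are illustrative, not from the original simulation.

```python
import numpy as np

def input_vector(alpha, delta_e):
    # Representative polynomial basis terms in alpha and delta_e;
    # the full regressor of Equation 8 contains more terms.
    return np.array([1.0, alpha, delta_e,
                     alpha**2, alpha*delta_e,
                     delta_e**3, alpha*delta_e**2])

def cm_estimate(theta, alpha, delta_e):
    # C_M_est = w^T theta
    return input_vector(alpha, delta_e) @ theta
```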

The update formula used is given by Equations 9 and 10.

\begin{displaymath}
{\mathbf P}_{k+1} = \lambda^{-1}{\mathbf P}_k -
\frac{\lambda^{-2}{\mathbf P}_k{\mathbf w}_k{\mathbf w}_k^T{\mathbf P}_k}
{1+\lambda^{-1}{\mathbf w}_k^T{\mathbf P}_k{\mathbf w}_k}
\end{displaymath} (9)


\begin{displaymath}
{\mathbf \theta}_{k+1} = {\mathbf \theta}_k
+ {\mathbf P}_k{\mathbf w}_k(C_{M{\mathrm est}} - C_{M_0})
\end{displaymath} (10)

where ${\mathbf P}_k$ is the covariance matrix at step $k$, and $\lambda$ is the forgetting factor. ${\mathbf P}_0$ is initialized to $\epsilon I$, where $\epsilon$ is a small positive number.
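A minimal sketch of one step of this update follows. The covariance update is Equation 9 verbatim; in the parameter update the error is computed here as measurement minus estimate so that the correction drives the output error toward zero, and the updated covariance is used as the gain (a common formulation of recursive least squares with forgetting). The function name and argument names are illustrative.

```python
import numpy as np

def rls_update(theta, P, w, cm_measured, lam=0.99):
    # Equation 9: covariance update with forgetting factor lam.
    Pw = P @ w
    denom = 1.0 + (w @ Pw) / lam
    P_new = P / lam - np.outer(Pw, Pw) / (lam**2 * denom)
    # Equation 10 (error taken as measured minus estimated):
    # theta_new = theta + P_new w (C_M_measured - C_M_est)
    err = cm_measured - w @ theta
    theta_new = theta + P_new @ w * err
    return theta_new, P_new
```

Repeated calls with fresh regressor/measurement pairs drive the parameter estimates toward the least squares solution, with $\lambda < 1$ discounting older data.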


Carl Banks 2002-05-17