
Lyapunov Search using Genetic Programming

And so, having examined genetic programming and how to measure fitness, we can formulate an algorithm to search for a Lyapunov function. I have implemented this algorithm in Mathematica.

The input to the genetic programming procedure is the following: the equations of motion of the system, $f(x)$, the list of state variables $x_1,x_2,\ldots$, the domain of interest $D$ (for my implementation, $D$ is given by limits on the state variables), a list of mathematical functions that will be used to construct the initial random population (called the nonterminal set), and certain genetic programming parameters. The genetic programming parameters include the population size, the maximum number of generations to run, the number of test points, and other minor parameters.

The first task is to generate a random population. This requires two sets of building blocks: the terminal and nonterminal sets. The terminal set is the set of terminal nodes that can be used in a random function. In my implementation, this set includes the state variables $x_1$ through $x_n$, and the integers ranging from $-10$ to $10$. I have found it very beneficial to also include the components of the right-hand side of the equations of motion: $f_1(x)$ through $f_n(x)$. (Although the equations of motion can be complex expressions, the components always remain a single node in the tree; crossover cannot break them up.) Thus, the terminal set for my implementation is:

\begin{displaymath}
T=\{x_1,x_2,\ldots,x_n,f_1(x),f_2(x),\ldots,f_n(x),-10,-9,\ldots,9,10\}
\end{displaymath}

Although it might be beneficial to be able to adjust this set, in my implementation this is not currently possible.

The nonterminal set is the list of functions input by the user. The user should almost always include the four arithmetic operations: addition, subtraction, multiplication, and division. The square function ($\cdot ^2$) is often useful because of the requirement that the Lyapunov function be positive definite. Depending on the system, other operations might be included. For example, for the damped pendulum, the sine and cosine functions might be useful because the sine appears in the equations of motion. The damped pendulum nonterminal set might look like this:

\begin{displaymath}
N=\{+,-,\times,\div,\cdot ^2,\sin,\cos\}
\end{displaymath}

There are limitless possibilities in choosing the nonterminal set, and deciding which functions to include is an art. Too many unnecessary functions decrease performance; however, omitting a really useful one is crippling.

Using these two sets, the implementation generates trees recursively: the Mathematica function creating the random tree chooses a node from one of the sets randomly, and if the node is nonterminal, calls itself to generate subtrees for each of the arguments. To generate trees of manageable size (for it is easy to see that this can create very large functions), the chance of selecting a nonterminal node decreases exponentially with the depth of the recursion.
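The recursive construction just described can be sketched in Python (the actual implementation is in Mathematica). The terminal and nonterminal sets below are illustrative, following the damped-pendulum example; the decay constants are assumptions, not the implementation's actual parameters:

```python
import random

# Illustrative terminal set: state variables, equation-of-motion
# components (kept as opaque single nodes), and integers -10..10.
TERMINALS = ["x1", "x2", "f1", "f2"] + [str(i) for i in range(-10, 11)]
# Illustrative nonterminal set: (symbol, arity) pairs chosen by the user.
NONTERMINALS = [("+", 2), ("-", 2), ("*", 2), ("/", 2), ("sq", 1)]

def random_tree(depth=0, p0=0.9, decay=0.6):
    """Build a random expression tree as nested tuples.

    The chance of picking a nonterminal node is p0 * decay**depth,
    decreasing exponentially with recursion depth, which keeps the
    generated trees to a manageable size."""
    if random.random() < p0 * decay ** depth:
        op, arity = random.choice(NONTERMINALS)
        return (op,) + tuple(random_tree(depth + 1, p0, decay)
                             for _ in range(arity))
    return random.choice(TERMINALS)

random.seed(0)
tree = random_tree()  # a random individual: either a lone terminal or a tuple
```

A nonterminal node is a tuple whose first element is the operator symbol; a terminal node is a bare string.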

Then, having generated the population, the implementation measures the fitness of each individual. First, it evaluates each individual with all states set to zero. The result, if defined, is subtracted from the individual when it is tested for positive definiteness, so that the candidate satisfies $V(0)=0$. Individuals for which this result is undefined have their fitness set to zero.

The implementation then approximates the fitness of the rest of the individuals. For each individual, it tests a random set of points for satisfaction of the two criteria $V(x) > 0$ and $\dot V(x) = (\partial V/\partial x)^T f(x)\leq 0$. (Incidentally, because Mathematica can differentiate symbolically, it can evaluate $\dot V(x)$ without using finite differences.) The approximate fitness is the ratio of points satisfying $V(x) > 0$ plus the ratio of points satisfying $\dot V(x)\leq 0$, divided by two.

The number of points tested depends on how well the points satisfy the criteria. The implementation first tests a set of random points for one criterion. If the ratio of satisfying points is less than 1/2, that ratio is accepted as the approximation. Otherwise, it tests another set of points; if less than 3/4 of the points in both sets satisfy the criterion, that ratio is accepted as the approximation. Otherwise a third set is tested, with a threshold of 7/8, then a fourth, and so on. This method frees the computer from wasting time on the least fit individuals, while giving precise results for the most fit individuals. The number of random points per set, and the maximum number of sets, are settable parameters.
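The adaptive point-testing scheme and the fitness combination above can be sketched as follows. This is a minimal Python sketch, not the Mathematica implementation; the point counts, domain, and example system ($\dot x_1=-x_1$, $\dot x_2=-x_2$ with candidate $V=x_1^2+x_2^2$) are assumptions for illustration:

```python
import random

def adaptive_ratio(criterion, sample_point, points_per_set=50, max_sets=5):
    """Estimate the fraction of random points satisfying `criterion`.

    After k sets the running ratio is accepted if it falls below
    1 - 2**-k (i.e. 1/2, 3/4, 7/8, ...): unfit individuals get a cheap,
    rough estimate, while fit individuals are measured precisely."""
    passed = tested = 0
    for k in range(1, max_sets + 1):
        for _ in range(points_per_set):
            tested += 1
            if criterion(sample_point()):
                passed += 1
        if passed / tested < 1 - 2.0 ** -k:
            break
    return passed / tested

def fitness(V, Vdot, sample_point):
    """Fitness = (ratio with V > 0  +  ratio with Vdot <= 0) / 2."""
    r1 = adaptive_ratio(lambda p: V(*p) > 0, sample_point)
    r2 = adaptive_ratio(lambda p: Vdot(*p) <= 0, sample_point)
    return (r1 + r2) / 2

# Toy example: V = x1^2 + x2^2 for the system x1' = -x1, x2' = -x2,
# so Vdot = -2*x1^2 - 2*x2^2 <= 0 everywhere; fitness is perfect.
random.seed(1)
domain = lambda: (random.uniform(-1, 1), random.uniform(-1, 1))
f = fitness(lambda x1, x2: x1 * x1 + x2 * x2,
            lambda x1, x2: -2 * x1 * x1 - 2 * x2 * x2,
            domain)
```

Here `Vdot` is supplied directly; in the Mathematica implementation it is obtained by symbolic differentiation.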

Having measured each individual's fitness, the genetic programming procedure breeds a new population. To breed a fitter population, the procedure tends to choose individuals with higher fitness, but also allows some less-than-optimal individuals to breed to retain genetic diversity. My implementation uses tournament selection, where it randomly selects some number (a settable parameter) of individuals. Of those selected, the individual with the highest fitness gets to breed.
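Tournament selection is straightforward; a minimal sketch (tournament size and names are illustrative) might be:

```python
import random

def tournament_select(population, fitnesses, size=4):
    """Randomly pick `size` individuals; the fittest of them breeds."""
    contenders = random.sample(range(len(population)), size)
    best = max(contenders, key=lambda i: fitnesses[i])
    return population[best]

pop = ["V1", "V2", "V3", "V4", "V5"]
fits = [0.2, 0.9, 0.5, 0.1, 0.7]
# A tournament over the whole population always returns the global best.
winner = tournament_select(pop, fits, size=5)
```

Smaller tournaments are less greedy: a weak individual can still win if it never faces a stronger contender, which preserves genetic diversity.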

The size of the new population is the same as the size of the original one. The implementation creates most of the individuals in the new population by crossing two parents, both of whom have been selected by tournament. A small percentage of new individuals undergo mutation after crossover. The mutation rate is usually very small, less than two percent, and is a settable parameter.
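Subtree crossover on tuple-represented expression trees can be sketched as below; this is an illustrative Python sketch, not the Mathematica code (mutation, which would similarly replace a random subtree with a freshly generated one, is omitted):

```python
import random

def subtrees(tree, path=()):
    """Yield (path, subtree) for every node; a path is the tuple of
    child indices leading from the root to that node."""
    yield path, tree
    if isinstance(tree, tuple):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(tree, path, new):
    """Return a copy of `tree` with the node at `path` replaced by `new`."""
    if not path:
        return new
    i = path[0]
    return tree[:i] + (replace(tree[i], path[1:], new),) + tree[i + 1:]

def crossover(a, b):
    """Graft a random subtree of `b` onto a random point of `a`
    (this returns only the child inheriting its root from `a`)."""
    pa, _ = random.choice(list(subtrees(a)))
    _, sb = random.choice(list(subtrees(b)))
    return replace(a, pa, sb)

a = ("+", "x1", ("*", "x2", "2"))
b = ("sq", ("+", "x1", "x2"))
child = crossover(a, b)
```

Because equation-of-motion components such as `f1` are single terminal nodes, crossover can move them around but never break them apart, as noted earlier.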

Some individuals in the new population are not formed by crossover, but are instead copies of a single parent (still subject to mutation). The purpose of such asexual reproduction is to save the population from serious regression in the case of an unluckily bred generation. The percentage of individuals formed this way should be small, around ten percent, and is a settable parameter.

After the generation of the new population, the algorithm loops back to measuring its fitness. It continues looping, and hopefully finding better individuals with each generation, until some termination criterion is met. My implementation has two termination criteria. One is when a maximum number of generations, specified by the user, has been run. The other is the appearance of an individual with a perfect score: all of the points tested satisfy the Lyapunov criteria. The procedure stops and returns the fittest individual when termination occurs.
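The overall loop can be summarized by the following skeleton. It is a generic sketch under stated assumptions: the `random_individual`, `fitness`, and `breed` callables stand in for the pieces described above, and the toy problem in the demo (maximizing a number) merely exercises the termination logic; it is not a Lyapunov search:

```python
import random

def gp_search(random_individual, fitness, breed, pop_size=100, max_gens=50):
    """Skeleton of the GP loop: stop on a perfect score (fitness 1.0,
    i.e. all test points satisfy the Lyapunov criteria) or after a
    maximum number of generations; return the fittest individual seen."""
    pop = [random_individual() for _ in range(pop_size)]
    best_ind, best_fit = None, -1.0
    for _ in range(max_gens):
        fits = [fitness(ind) for ind in pop]
        i = max(range(pop_size), key=fits.__getitem__)
        if fits[i] > best_fit:
            best_ind, best_fit = pop[i], fits[i]
        if best_fit == 1.0:          # perfect score: terminate early
            break
        pop = breed(pop, fits)       # tournament + crossover + mutation
    return best_ind, best_fit

# Toy demo: individuals are numbers, "breeding" nudges them upward,
# and any value above 0.99 counts as a perfect score.
random.seed(2)
ind, fit = gp_search(
    random_individual=lambda: random.random(),
    fitness=lambda v: 1.0 if v > 0.99 else v,
    breed=lambda pop, fits: [min(v + 0.01, 1.0) for v in pop],
    pop_size=20, max_gens=200)
```

The early-exit on a perfect score matters in practice: once an individual satisfies the criteria at every test point, further generations cannot improve the measured fitness.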

Because genetic programming is a heuristic method, there is no guarantee that it will find a true Lyapunov function. This problem is compounded by the fact that the fitness measure is only approximate. We hope that in practice the problem is not serious.


Carl Banks 2002-05-17