Automatic Generation of Problem Solving Techniques using a Genetic Programming Approach

Ricardo A. Garcia, Machine Listening Group, MIT Media Lab.

Date: April 26, 2000

Introduction

It would be very interesting if the user could present a "problem" to the computer, and the computer returned not just the "solution" to that problem, but also the "technique" or process that was used to arrive at the solution. This is exactly what we are trying to do: to find a way to present problems to the computer and ask it to solve them, and, especially, to give us the "method" used to solve each problem.

The method that we will discuss is called genetic programming (GP), as proposed by Koza et al. [1]. It is meant to generate a computer program that will solve a problem given a high-level statement about it, with no (or limited) knowledge of how to solve it. To make things easier, we will introduce a simple problem of training a creature in a simple world to achieve a simple goal, and discuss all the aspects that have to be taken into account when using a GP approach for this goal.

 

The "toy" problem

Our goal is to design a system that shows adaptive behavior, such as the behavior found in simple life forms.

Figure 1

Imagine a "creature" (triangle) in a two-dimensional grid (Figure 1), with some "food" (dots) and some "poison" (asterisks). The creature needs to eat as much food as it can, and no poison at all. The creature will have some sensors (to see, smell, or touch the food, poison, and walls) and some actuators (to move and eat). Our goal is to find an algorithm or program that will keep these creatures alive in different environments (different "worlds" with various sizes and food/poison distributions).

We have control over the world sizes and food/poison distributions in both the training and the testing, but we are supposed to have no control over the actual creature once the simulation is running. Also, the idea is to find an algorithm that will perform very well in different situations without changing any of its parameters (that is the adaptive part).

 

Solving Problems

To solve a problem in general, we must be clear about a few things:

  • The problem itself: well, this sounds obvious, but it is very important. We have to know what the real problem is that we are trying to solve. It is important to separate the knowledge that we have before solving the problem from the knowledge that we expect to have after solving it. All the surrounding knowledge related to the problem is also very important: in GP, this knowledge is actually what will help us find a solution (or at least narrow the search).
  • Problem/solution architecture: the problem and the actual solution each have an architecture that should be known in some instances. Usually, the problem's architecture is given by the problem itself (which means that we can't change it), but the architecture of the solution is usually not known. If the user decides to select a particular architecture for the solution, then the task is to search for the optimal parameters that make this solution the one that solves our particular problem. In GP, the goal is to leave the architecture of the solution open, and make this topology part of the solution to be found, along with the parameters associated with it. Therefore, the space to be searched is bigger (all the allowable topologies plus all the possible parameters of each topology), but this flexibility in topologies will be a very important characteristic.
  • Some way to test the solution: every solution has to be tested under supervised conditions to "rank" it and decide if it is suitable for the problem at hand.
  • Definition of inputs and outputs: because the architecture of the solution is open, we need some standard way to put information into the system and also to retrieve information from it. This can be seen as a "black box" system, where we don't care about what is inside, but we have some standard interfaces to talk to the box.
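The "black box" view above can be made concrete with a short Python sketch. All names here (`BlackBox`, `score`) are illustrative, not from any actual GP library: a candidate solution is visible only through its standard inputs and outputs, and a supervised test ranks it.

```python
class BlackBox:
    """A candidate solution seen only through its standard interface."""
    def __init__(self, fn):
        self._fn = fn                 # the hidden internals

    def run(self, inputs):
        return self._fn(inputs)

def score(box, test_cases):
    """Rank a solution under supervised conditions: count mismatches."""
    penalty = 0
    for inputs, expected in test_cases:
        if box.run(inputs) != expected:
            penalty += 1
    return penalty                    # 0 = suitable for the problem at hand

adder = BlackBox(lambda xs: xs[0] + xs[1])
cases = [((1, 2), 3), ((5, 5), 10)]
```

The search can then swap any internals into the box without changing how it is tested.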

Open Architecture systems

Imagine the "problem" of finding a computer program that adds two numbers. If we want to use an automatic program maker that creates this computer program, several approaches can be taken. Maybe the most "low level" one is to start with a bunch of random bits and hope that they will be understood by the processor when we run them. This has a lot of drawbacks: the program will depend on the processor, the chance of producing "bad" instructions (with no meaning) is very high, etc. Also, the search space grows exponentially with every new "bit" that is added (i.e., a program 1000 bits long will have 2^1000 possibilities).

If, instead of putting bits together, the system uses higher-level commands (i.e., opcodes or even C instructions), the search space is reduced, because there are no bad instructions, and the program can be platform independent.

At an even higher level, the solution can be presented as a flow diagram composed of "functional blocks" and "connections", with a few (but strict) constraints (on the types of connections between blocks, the types of blocks allowed, loops, etc.). Such a system is expected to find at least one solution to the problem (and in many cases, several topologies with identical results).

These functional blocks can represent simple operations or higher-level (and nonlinear) processes. Figure 2 shows an example of a block diagram that represents a program.

Figure 2
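A block diagram like this can be sketched in code as a table of blocks plus a list of connections. The names and data layout below are illustrative choices; the example wires two inputs through an "add" block to an output.

```python
import operator

# block name -> (kind, function)
blocks = {
    "in_a": ("input", None),
    "in_b": ("input", None),
    "add1": ("op", operator.add),
    "out":  ("output", None),
}

# connections: (source block, destination block, destination port)
connections = [("in_a", "add1", 0), ("in_b", "add1", 1), ("add1", "out", 0)]

def evaluate(inputs):
    """Propagate values along the connections (listed in firing order)."""
    values = dict(inputs)   # block name -> computed output value
    ports = {}              # destination block -> {port: value}
    for src, dst, port in connections:
        ports.setdefault(dst, {})[port] = values[src]
        kind, fn = blocks[dst]
        ready = ports[dst]
        if kind == "op" and len(ready) == 2:
            values[dst] = fn(ready[0], ready[1])   # fire when both ports arrive
        elif kind == "output":
            values[dst] = ready[0]
    return values["out"]
```

The topology is entirely in the data, so a different program is just a different table of blocks and connections.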

 

Also, this approach has an implicit advantage when applied to computer programs: it is inherently multithreaded (several processes or flow paths can run at the same time). This will be a crucial point when evolving a system, because the "order" in which the blocks are processed will be decided automatically when needed.

 

Genetic Programming

The genetic programming idea is based on the biological theory of evolution and survival of the fittest. In this theory, only the fittest individuals (the ones that performed best) in the present generation are allowed to reproduce into the next generation.

Each generation is then composed of the "best" individuals of the previous generation together with mutations of, and crossovers between, these individuals. Usually, the population size is kept constant between generations.

Figure 3

The individuals in a GP run are "programs" that are expected to solve a problem. These programs are composed of the "functional blocks" and connections (topologies) mentioned earlier. Because a topology can have one or several loops and strange structures, it is better not to evolve the actual program itself (the topology) but a "description" of each topology. That is, the GP evolves a "program" that writes a program. In this way, the evolved program can be a simple (and linear, with no loops) syntactic description of the actual "topology" of the program that will solve our original problem.

Once a "syntactic description" of the program is evolved, it is translated into the actual program (i.e., compiled) and tested to see if it solves the problem. The result of this test is called the "fitness score" (following the biological language) and measures how closely the evolved program's output resembles the desired output. This gives us a way to quantify (indirectly) how good each evolved program is and to make decisions based on this. The "template" or "ideal fitness" used to compute the final fitness can be composed of several cases (different input/output templates), and the average performance over them is the fitness score for the individual. This enforces that the fittest individuals accomplish the desired behavior. It also gives a good termination criterion, to know when an acceptable individual (with an acceptable fitness value) is found.
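The fitness computation just described can be sketched in a few lines (hypothetical names; the penalty measure, here absolute error, is an illustrative choice): run the compiled individual on several input/output templates and average the penalties.

```python
def fitness(program, cases):
    """Average penalty over several input/output templates; 0.0 is perfect."""
    total = 0.0
    for inputs, expected in cases:
        total += abs(program(inputs) - expected)   # penalty for this case
    return total / len(cases)

cases = [((0, 0), 0), ((2, 3), 5), ((4, 4), 8)]   # templates for "add two numbers"
good = lambda xs: xs[0] + xs[1]   # a compiled individual that adds
bad  = lambda xs: xs[0] * xs[1]   # one that multiplies instead
```

Averaging over several templates is what keeps an individual from scoring well on one lucky case.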

The programs (individuals) have to be changed between generations. After the fitness score is computed for all the individuals, the fittest are selected to be the basis of the next generation. Usually, the population size is kept the same between generations. Some of the selected programs are just copied intact. The rest of the individuals are derived from the selected ones by performing some operations on them, usually mutation and crossover. Mutation takes one individual and changes some of the instructions (in the syntactic description) at random (deleting some or introducing new ones), and this is reflected in the topology of the new individual. Crossover takes two (or more) "parent" individuals and copies part of the syntactic description of one into the other.

When a new population is created, it is evaluated again, and the process continues this way until the desired result is found (or the GP decides it is time to stop). This can be seen in Figure 3.
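The whole generational loop can be sketched with a toy representation: a short list of integers stands in for a syntactic description, and the fitness is its distance to a target list. Everything here is illustrative; a real GP would mutate and cross over topology descriptions instead.

```python
import random

TARGET = [3, 1, 4, 1, 5]                     # stands in for "the desired behavior"

def fitness(ind):
    return sum(abs(a - b) for a, b in zip(ind, TARGET))   # 0 = solved

def mutate(ind):
    out = ind[:]
    out[random.randrange(len(out))] = random.randrange(10)  # change one instruction
    return out

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]                 # copy part of one parent into the other

pop = [[random.randrange(10) for _ in range(5)] for _ in range(30)]  # random first generation
best = pop[0]
for gen in range(1000):
    pop.sort(key=fitness)
    best = pop[0]
    if fitness(best) == 0:
        break                                # stop criterion: an acceptable individual
    survivors = pop[:10]                     # selection of the fittest
    pop = [ind[:] for ind in survivors]      # some individuals are copied intact
    while len(pop) < 30:                     # population size kept constant
        a, b = random.sample(survivors, 2)
        pop.append(mutate(crossover(a, b)))
```

With the random first generation, the best score starts high and falls toward zero, exactly the behavior described above.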

 

Taking all of this into account, to solve a problem we have to keep in mind:

  • Construction blocks: what kinds of construction blocks are needed in a topology to find a suitable solution to the problem. The number of construction blocks, the kinds of operations they perform, the kind and number of inputs/outputs, and in general all the details involved with the blocks are very important.
  • Syntactic description: find (decide on) a syntactic description that can represent ANY allowed block configuration. This is the actual "program" that will be grown by the GP. It is usually text based, or an acyclic graph or tree.
  • Fitness function: the grown "syntactic descriptions" have to be translated into executable programs (i.e., compiled). Then, each of these programs is run against one (or several) test input/output sets. For each input set, the real output is compared with the expected one and a score is given. Usually, penalty points are assigned when the grown system doesn't behave in the desired way; with this criterion, a score of "0" is a perfect match. With several input/output sets, the average over all the trials is the "fitness score" for that individual. Note that it is in this part that the knowledge about the "desired behavior" is introduced. This can be seen as the "training set" of classical nonlinear search.
  • Selection for breeding: selection of the individuals that are going to be copied, mutated, or crossed over for the next generation.
  • Genetic operations: define the mechanism that performs each of the genetic operations (copy, mutation, or crossover) on the syntactic descriptions.
  • Stop criterion: decide when it is a good moment to stop and pick a winner.

The first generation of a genetic program is usually composed of "random" programs, so the scores obtained in the fitness measurement start very high (a lot of penalty points) and tend to go down with each generation.

Role of previous knowledge and constraints

Most problems have some kind of constraints, especially on the number and kind of inputs and outputs that the grown system or program is expected to have. These are usually incorporated as "default" blocks in each program, which can't be modified by the GP. This is one way to force the grown programs to fit a desired interface model.
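One simple way to sketch this mechanism (names are hypothetical) is to mark the interface blocks as protected and have the genetic operators draw only from the rest:

```python
PROTECTED = {"in", "out"}  # "default" interface blocks, fixed by the problem

def mutable_blocks(topology):
    """Blocks the GP is allowed to mutate or delete."""
    return [b for b in topology if b not in PROTECTED]
```

Mutation and crossover then operate only on `mutable_blocks(...)`, so every grown program keeps the same external interface.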

 

Implementation of our "toy" problem

Now, let's talk again about the "toy problem" using the GP language described above and discuss the implications for all its parts.

The idea is to evolve a "creature" that will wander around a fictitious world (a grid) with some "food" and "poison". The creature is expected to avoid getting stuck against a wall (the boundaries of the square world), and it will survive if it eats as much "food" as possible and no "poison" at all.

The creature can be seen as a "program" that will be run in a simulated environment. The fitness function can be composed of many "random" worlds (in size and food/poison distribution). Each evolved creature is then released (its program run) and tested on whether it ate the food and no poison at all. The programs (creatures) with better performance will survive.
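This evaluation setup can be sketched as follows. All names, the starting position, and the penalty weights are illustrative choices, not fixed by the method; the `sweeper` is a trivial hand-written creature used here only to exercise the harness.

```python
import random

def random_world(size, n_food, n_poison):
    """Grid world with food and poison scattered at distinct cells."""
    cells = random.sample([(x, y) for x in range(size) for y in range(size)],
                          n_food + n_poison)
    return {"size": size,
            "food": set(cells[:n_food]),
            "poison": set(cells[n_food:])}

def release(creature, world, steps=50):
    """Run a creature program; 0 penalty = all food eaten, no poison."""
    pos, penalty = (0, 0), 0
    for _ in range(steps):
        pos = creature(pos, world)       # the creature picks its next cell
        if pos in world["food"]:
            world["food"].discard(pos)   # food eaten
        elif pos in world["poison"]:
            penalty += 10                # eating poison is heavily penalized
    return penalty + len(world["food"])  # leftover food also costs

def sweeper(pos, world):
    """A trivial hand-written creature that walks the grid row by row."""
    x, y = pos
    x += 1
    if x >= world["size"]:
        x, y = 0, (y + 1) % world["size"]
    return (x, y)
```

An evolved creature's fitness would then be the average of `release(...)` over many calls to `random_world(...)`.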

These evolved programs are composed of "functional blocks", which come in three big types:

Inputs (sensors): these blocks have no "input" connection. They receive information from the exterior and put a signal on their "output" connection. The proposed input sensors for these creatures are:

  • Food Smell: a sensor that gives a signal related to the amount and distance of food.
  • Poison Smell: signal related to the amount and distance of poison.
  • Touch: signal related to the type of object in front, ie: wall, food or poison.
  • Vision: gives a signal if it "sees" food/poison.

Process: these blocks take input signals, do some processing on them, and put the result on their output.

  • Add: add two signals
  • Mult: Multiply.
  • Split: duplicate a signal (one input, two outputs).
  • Comp: compares: if A then B else C.
  • Memory: store/retrieve a past value.

Outputs (actuators): these blocks have only inputs, and they interact with the "exterior world". They respond to control signals and perform some action.

  • Move: makes the creature advance or move in a given direction (defined by another control input).
  • Eat: eats the food/poison that is in front of the creature.
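The block set above can be written down as a simple table a GP system could draw from when growing topologies. The arities below are illustrative guesses (e.g. that Eat takes one control trigger), not fixed by the problem statement.

```python
BLOCKS = {
    # name:         (kind,       n_inputs, n_outputs)
    "food_smell":   ("sensor",   0, 1),
    "poison_smell": ("sensor",   0, 1),
    "touch":        ("sensor",   0, 1),
    "vision":       ("sensor",   0, 1),
    "add":          ("process",  2, 1),
    "mult":         ("process",  2, 1),
    "split":        ("process",  1, 2),   # one input, two outputs
    "comp":         ("process",  3, 1),   # if A then B else C
    "memory":       ("process",  1, 1),   # store/retrieve a past value
    "move":         ("actuator", 1, 0),   # direction given by a control input
    "eat":          ("actuator", 1, 0),   # assumed control trigger
}

def sensors():
    """All blocks that bring information in from the exterior."""
    return [name for name, (kind, _, _) in BLOCKS.items() if kind == "sensor"]
```

Such a table is exactly the "construction blocks" decision listed earlier: it fixes the vocabulary from which every topology is built.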

 

 

Things to be covered in the next Project Update:

Syntactic description and functional blocks

The syntactic description can be a text- or graphic-based description of all the blocks and their connections, but it should be done in a non-cyclical fashion, which means that the description itself shouldn't have any loops. The purpose of this, and two different approaches to it, will be explained.

Embryos and Topology modification functions

A popular way to approach the growth of a topology is to start with an embryo and perform topology modifications on it (adding branches, loops, blocks), literally "growing" the solution from there. The GP-evolved program is therefore composed of instructions on "how to modify" the original embryo.

The embryo is one of the simplest topologies that satisfies the input/output requirements, but it actually performs no useful computation at all. It is useful only for "mutating" or deriving topologies from it.
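This idea can be sketched as follows: the evolved "program" is a loop-free list of edit instructions applied to a minimal starting graph. The specific operation (`insert` into an existing connection) and all names are illustrative.

```python
def embryo():
    """Simplest topology meeting the I/O interface: input wired to output."""
    return {"blocks": ["in", "out"], "edges": [("in", "out")]}

def insert_block(topo, name, edge):
    """Topology-modification operation: splice a block into a connection."""
    src, dst = edge
    topo["blocks"].append(name)
    topo["edges"].remove(edge)
    topo["edges"] += [(src, name), (name, dst)]

def grow(instructions):
    """The evolved program: a linear list of edits applied to the embryo."""
    topo = embryo()
    for op, name, edge in instructions:
        if op == "insert":
            insert_block(topo, name, edge)
    return topo

# a toy evolved program: grow in -> add1 -> mem1 -> out
program = [("insert", "add1", ("in", "out")),
           ("insert", "mem1", ("add1", "out"))]
```

Note that `program` itself is linear and loop-free even though the topology it grows could contain loops, which is exactly why the description, rather than the topology, is what gets mutated and crossed over.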

 

Figure 4

 

Evaluation of topologies

Before assigning the fitness score, the syntactic description has to be translated (i.e., compiled) into a computer program that can be run. Because of the lack of restrictions on the grown topologies, it is very probable that "weird" topologies will emerge, and all of them have to be tested to decide the fitness score. A "general" object-oriented approach is recommended to evaluate the flow of information between the blocks that belong to the actual topology. In this way, even infinite loops are "allowed" as long as they receive or produce information, without destabilizing the system.
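One way to make such a general evaluator robust, sketched below with hypothetical names, is to push signals between blocks event by event under a fixed firing budget: a cyclic topology keeps exchanging information, but it cannot hang the system.

```python
def run_topology(edges, funcs, inputs, budget=100):
    """Fire blocks as signals arrive, stopping after `budget` firings."""
    queue = list(inputs.items())   # pending (block, value) events
    values, fired = {}, 0
    while queue and fired < budget:
        block, value = queue.pop(0)
        # blocks without a function just pass the signal through
        values[block] = funcs.get(block, lambda v: v)(value)
        fired += 1
        for src, dst in edges:
            if src == block:
                queue.append((dst, values[block]))
    return values

# a two-block infinite loop: terminates anyway because of the budget
looped = run_topology([("a", "b"), ("b", "a")],
                      {"a": lambda v: v + 1, "b": lambda v: v + 1},
                      {"a": 0}, budget=10)
```

Every "weird" topology then gets the same bounded evaluation, so a fitness score can be assigned even to individuals that would otherwise loop forever.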

 

Implementation and Performance

Hopefully, an implementation of this "toy" problem will be possible. It will use the C++ genetic programming library modules being developed by Ricardo Garcia at the Machine Listening Group of the MIT Media Lab, as part of his research on the Automatic Generation of Sound Synthesis Techniques.

References:

 

[1] Koza, J. R., Bennett, F. H., Andre, D., Keane, M. A. Genetic Programming III: Darwinian Invention and Problem Solving. Morgan Kaufmann Publishers, San Francisco, CA, 1999.

 

[2] Koza, J. R., Bennett, F. H., Andre, D., Keane, M. A. "Automated Synthesis of Analog Electrical Circuits by Means of Genetic Programming." IEEE Transactions on Evolutionary Computation, Vol. 1, No. 2, July 1997.

 

[3] Digital Biology website: http://www.digitalbiology.com/

 

[4] The Genetic Programming Notebook, by Jaime A. Fernandez. http://www.geneticprogramming.com/

 

[5] UCLA Artificial Intelligence. http://www.cs.ucla.edu/csd/fields/ai.html

 


Ricardo A. Garcia  MIT Media Lab © 2001