When used for function optimization, a genetic algorithm manipulates a population of strings with probabilistic, genetics-inspired operators, such as pairwise string recombination and string mutation, to produce a new population, with the intent of generating strings that map to higher function values. The members of the population act as a primitive memory for the algorithm, and the genetic operators are chosen so that manipulating the population tends to lead the algorithm away from unpromising areas of the search space and toward promising ones, without the algorithm having to remember its trail through the search space explicitly.
In contrast to other subsymbolic search algorithms, such as neural networks and simulated annealing, the probabilistic primitives that genetic algorithms use to manipulate their population, together with their lack of an explicit memory, make them very fast on contemporary hardware. Various theoretical results of fundamental significance to their operation have been proved, and much is known about their basic behavior; however, there are few formal results about the tradeoffs they make. Recently, Rawlins has focused on some of these basic questions. For example, we know little about the tradeoff these algorithms make between rapid convergence in, and wide exploration of, the search space. The analytic task is to discover exactly what assumptions about the search space any search algorithm implicitly makes. This would let us make principled judgments about which algorithms are useful for which problems, and which problem encodings should be used, without having to first program the algorithm and run it.
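The operator loop described above can be sketched as follows. This is a minimal generic illustration, not an algorithm from any of the papers below; all function names and parameter values are invented for the sketch:

```python
import random

def genetic_maximize(fitness, n_bits=16, pop_size=30, generations=60,
                     crossover_rate=0.9, mutation_rate=0.02, seed=0):
    """Maximize `fitness` over fixed-length bit strings with a simple
    generational GA: tournament selection, one-point crossover, and
    bit-flip mutation."""
    rng = random.Random(seed)
    # The population of strings is the algorithm's only memory of its search.
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def select():
        # Fitness-biased (tournament) selection steers the population away
        # from unpromising regions without recording an explicit trail.
        return max(rng.sample(pop, 2), key=fitness)

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            a, b = select(), select()
            if rng.random() < crossover_rate:     # pairwise recombination
                cut = rng.randrange(1, n_bits)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):                  # bit-flip mutation
                nxt.append([bit ^ (rng.random() < mutation_rate)
                            for bit in child])
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

# "OneMax" example: the function value is the number of ones in the string.
best = genetic_maximize(fitness=sum)
```

Under this scheme the population rapidly concentrates on high-valued strings; how rapidly, and at what cost in exploration, is precisely the kind of tradeoff the work described above analyzes.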
(with Gary Parker) ``Cyclic genetic algorithms for the locomotion of hexapod robots,'' Proceedings of the Sixth International Symposium on Robotics and Automated Manufacturing (ISRAM '96), Montpellier, France, May 1996.
(with Terry Jones) ``Reverse hillclimbing, genetic algorithms, and the busy beaver problem,'' Proceedings of the Fifth International Conference on Genetic Algorithms, S. Forrest (ed.), 70--83, Morgan Kaufmann, 1993.
(with Sushil Louis) ``Pareto optimality, GA-easiness, and deception,'' Proceedings of the Fifth International Conference on Genetic Algorithms, S. Forrest (ed.), 118--123, Morgan Kaufmann, 1993.
(with Sushil Louis) ``Why genetic algorithms?,'' Proceedings of the Fifth Midwest Artificial Intelligence and Cognitive Science Society Conference, T. E. Ahlswede (ed.), MAICSS, 1--5, 1993.
(with Sushil Louis) ``Syntactic analysis of convergence in genetic algorithms,'' Foundations of Genetic Algorithms 2, D. Whitley (ed.), 141--151, Morgan Kaufmann, 1993.
(with Sushil Louis) ``Designer genetic algorithms: Genetic algorithms in structure design,'' Proceedings of the Fourth International Conference on Genetic Algorithms, R. Belew and L. B. Booker (eds.), Morgan Kaufmann, 53--60, 1991.
A large part of computer science can be classified as the development and refinement of search algorithms. The classical algorithms (for example, binary search, interpolation search, and quadtrees) work well only when the searched-for target can be well specified and the search space is well defined. These criteria fail in two important cases: in robotics, when robots must explore unknown terrain, and in artificial intelligence, when the beginning user of an expert system or knowledge base has little idea how best to structure a query. Rawlins has investigated both areas by developing theoretical limits on what is achievable given the navigator's state of knowledge (whether robot or human), developing optimal search algorithms when various pieces of information are known, and producing a general (but mainly heuristic) adaptive algorithm to aid human search of very high-dimensional information spaces (in particular, searching large corpora of pictures in a database, or searching for electronic documents on the World Wide Web).
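One concrete instance of search with limited information is finding a point at an unknown distance and direction on a line: the classic doubling (``zig-zag'') strategy guarantees total travel within a constant factor of the straight-line distance (a factor of 9 in the standard continuous analysis). The sketch below illustrates the strategy on the integer line; the function name and details are invented here, not taken from the papers:

```python
def zigzag_search(target):
    """Find a nonzero integer `target` on the number line, starting at 0,
    by walking to +1, then -2, then +4, ... (doubling the reach each leg,
    alternating sides). Returns the total distance walked."""
    cost = 0      # total distance walked so far
    pos = 0       # current position of the searcher
    reach = 1     # how far the next leg extends from the origin
    side = 1      # which side of the origin the next leg explores
    while True:
        turn = side * reach                   # turning point of this leg
        lo, hi = min(pos, turn), max(pos, turn)
        cost += abs(turn - pos)               # walk to the turning point
        if lo <= target <= hi:                # passed the target on the way?
            return cost - abs(turn - target)  # stop at the target instead
        pos, side, reach = turn, -side, 2 * reach
```

For example, a target at +1 costs 1, while a target at -1 costs 3 (walk right to +1, then back and left to -1); in every case the cost stays within the constant-factor bound of the distance an all-knowing searcher would pay.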
(with Yue-Herng Lin and Marc VanHeyningen) ``Pic1: A visual database interface,'' International Journal of Expert Systems 8, (3): 237--245, 1995.
(with Marc VanHeyningen) ``Pic1: Image searching by interactive evolution,'' Proceedings of the Seventh Annual Florida Artificial Intelligence Research Symposium, Douglas D. Dankel II (ed.), 315--319, 1994.
(with Ricardo Baeza-Yates and Joseph Culberson) ``Searching in the plane,'' Information and Computation 106, 234--252, 1993.
As computer hardware has improved over the past two decades, computer science has begun seriously exploring real parallel machines as an aid to computation. Now that parallel machines are available even to general practitioners, the need for parallel algorithms has grown. Progress has been stymied, however, by the difficulty of transferring expertise in writing large sequential programs to a parallel domain with a few hundred processors. One such apparently ``inherently sequential'' problem is the Matrix Chain Order Problem. Rawlins has been working with Phil Bradford to find efficient parallel algorithms for this problem, and they have managed to bring the solution down to polylogarithmic time with a number of processors linear in the size of the problem. This result is still mainly of theoretical interest, since the resulting algorithm is quite complex. Work continues on this problem.
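For reference, the Matrix Chain Order Problem asks for the cheapest parenthesization of a product of matrices; the standard sequential dynamic program below (a textbook sketch, not the parallel algorithm from the papers) shows the recurrence that makes the problem seem inherently sequential, since each table entry depends on entries for all shorter chains:

```python
def matrix_chain_order(dims):
    """Minimum number of scalar multiplications needed to compute the
    product of matrices M_0 ... M_{n-1}, where M_i has dimensions
    dims[i] x dims[i+1]. Classic O(n^3) dynamic program."""
    n = len(dims) - 1
    # cost[i][j] = cheapest way to multiply out the chain M_i ... M_j.
    cost = [[0] * n for _ in range(n)]
    for span in range(1, n):                  # chains of increasing length
        for i in range(n - span):
            j = i + span
            # Try every split point k: (M_i..M_k)(M_{k+1}..M_j).
            cost[i][j] = min(
                cost[i][k] + cost[k + 1][j]
                + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)
            )
    return cost[0][n - 1]
```

The dependence of each entry on a full row and column of earlier entries is what the polylogarithmic-time parallel algorithm must break apart, which is one reason that algorithm is intricate.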
(with Phillip Bradford and Gregory Shannon) ``Efficient matrix chain ordering in polylog time with linear processors,'' SIAM Journal on Computing, (to appear) 1996.
(with Phillip Bradford and Venkatesh Choppella) ``Lower bounds for the matrix chain ordering problem'' (extended abstract), Proceedings of LATIN '95: Theoretical Informatics, Ricardo Baeza-Yates, Eric Goles, and Patricio V. Poblete (eds.), Lecture Notes in Computer Science 911, 112--130, Springer-Verlag, 1995.
(with Phillip Bradford and Gregory Shannon) ``Efficient matrix chain ordering in polylog time with linear processors,'' (extended abstract), Proceedings of the 8th IEEE International Parallel Processing Symposium, H. J. Siegel (ed.), IEEE Press, 234--241, 1994.