Teaching projects

Programming and cognitive science

For several years, I have been exploring the idea of teaching introductory Java programming in conjunction with some basic ideas in cognitive modeling. The goal has been to teach each new programming concept as it is needed to implement a particular algorithm.

Of course, implementing an algorithm such as competitive learning or minimax adversarial search from scratch is not a reasonable assignment in a beginning class, so the approach is to start with Smarts, an artificial life program with a graphical user interface, and have students modify it. For example, in the genetic algorithm module, the creatures in the artificial world know how to move about, see the area in front of them, and take actions, but they don't know what actions are appropriate for what they see. The genetic algorithm is missing several critical procedures. Once the students implement these, the creatures can evolve to behave appropriately.

It turns out that programming and cognitive science meet sometimes in interesting ways. For example, the object orientation of languages such as Java originated with inheritance hierarchies in AI, and it is natural to relate them in the class.
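The connection can be made concrete with a very small example. The following sketch (the class names are invented for illustration, not taken from the course materials) shows how a Java class hierarchy mirrors an AI-style is-a hierarchy, with subclasses inheriting default behavior and overriding it for exceptions:

```java
// A tiny is-a hierarchy in the spirit of AI semantic networks:
// a Bird is an Animal, and a Penguin is a Bird that overrides
// the inherited default, just as exception nodes do in AI hierarchies.
class Animal {
    String move() { return "walks"; }       // default behavior
}

class Bird extends Animal {
    @Override
    String move() { return "flies"; }       // birds override the default
}

class Penguin extends Bird {
    @Override
    String move() { return "swims"; }       // exceptions override inherited defaults
}

public class InheritanceDemo {
    public static void main(String[] args) {
        Animal[] animals = { new Animal(), new Bird(), new Penguin() };
        for (Animal a : animals) {
            System.out.println(a.move());   // prints walks, flies, swims
        }
    }
}
```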

The course and the software are described at the course website.

Exploring learning and evolution through artificial life

I have also been extending the Smarts program to allow cognitive science students to investigate some adaptive algorithms in more depth. The ultimate goal is a set of modules within which students can explore genetic algorithms, competitive learning, reinforcement learning, and supervised categorization. What all of these modules will share is the same two-dimensional world in which simple agents encounter and potentially act on other agents, food, and obstacles. One module, designed to introduce reinforcement learning, was tested at the Indiana University Summer Cognitive Science Workshop in June 2000.

In the reinforcement learning module, agents experience the world through their sensory systems, including touch, vision, and sonar. On the basis of their sensory input, they perform one of a small set of actions and receive a reward or punishment based on the consequences. Each agent has a neural network which uses the Q-learning algorithm to adjust the values associated with particular combinations of sensory inputs and actions. Students have control over the parameters that determine what rewards and punishments result from different actions taken in different situations. They can also control three parameters that affect the learning process itself.
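The core of the Q-learning algorithm can be sketched in a few lines. Note that the module itself uses a neural network; the table below is a simplification that just illustrates the update rule. The states, the payoffs, and the particular parameter values are all invented for illustration, though the three parameters correspond to the kind students can adjust: learning rate, discount factor, and exploration rate.

```java
import java.util.Random;

// Minimal tabular Q-learning sketch. States stand in for combinations
// of sensory inputs; actions are indexed 0 and 1.
public class QLearningSketch {
    static final int STATES = 4, ACTIONS = 2;
    static final double ALPHA = 0.1;     // learning rate
    static final double GAMMA = 0.9;     // discount factor
    static final double EPSILON = 0.2;   // exploration probability

    double[][] q = new double[STATES][ACTIONS];
    Random rng = new Random(42);

    // Epsilon-greedy action selection: usually exploit, sometimes explore.
    int chooseAction(int state) {
        if (rng.nextDouble() < EPSILON) return rng.nextInt(ACTIONS);
        return q[state][0] >= q[state][1] ? 0 : 1;
    }

    // The Q-learning update: move the old estimate toward
    // reward + discounted best value of the next state.
    void update(int s, int a, double reward, int next) {
        double best = Math.max(q[next][0], q[next][1]);
        q[s][a] += ALPHA * (reward + GAMMA * best - q[s][a]);
    }

    public static void main(String[] args) {
        QLearningSketch agent = new QLearningSketch();
        // Hypothetical world: action 1 in state 0 is rewarded,
        // everything else mildly punished.
        for (int step = 0; step < 1000; step++) {
            int s = agent.rng.nextInt(STATES);
            int a = agent.chooseAction(s);
            double reward = (s == 0 && a == 1) ? 1.0 : -0.1;
            agent.update(s, a, reward, agent.rng.nextInt(STATES));
        }
        // The agent has learned to prefer the rewarded action in state 0.
        System.out.println(agent.q[0][1] > agent.q[0][0]);  // prints true
    }
}
```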

In the genetic algorithm module, agents evolve in response to different properties of the environment. Built into their genomes is hard-wired knowledge about what actions to take in what situations, and the success of particular actions in particular situations leads some of the agents to have advantages over others and to be more likely to mate. As in the reinforcement learning module, students can control the punishments and reinforcement resulting from the actions that the agents take in different situations. They also have control over two of the parameters affecting the genetic algorithm itself.
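A minimal version of this scheme can be sketched as follows. Each genome hard-wires one action per situation, fitness is the payoff accumulated by acting on those genes, and fitter agents are more likely to reproduce. The situations, the payoff function, and the parameter values are illustrative assumptions, not the ones in Smarts itself.

```java
import java.util.Random;

// Minimal genetic-algorithm sketch: genomes map situations to actions,
// tournament selection favors fitter agents, and point mutation
// maintains variation.
public class GASketch {
    static final int SITUATIONS = 4, ACTIONS = 3, POP = 20;
    static final double MUTATION_RATE = 0.05;   // per-gene mutation probability
    static final Random rng = new Random(7);

    // Hypothetical payoff: in situation s, action (s % ACTIONS) is best.
    static double fitness(int[] genome) {
        double f = 0;
        for (int s = 0; s < SITUATIONS; s++)
            f += (genome[s] == s % ACTIONS) ? 1.0 : -0.5;
        return f;
    }

    // Tournament selection: the fitter of two random agents reproduces.
    static int[] select(int[][] pop) {
        int[] a = pop[rng.nextInt(POP)], b = pop[rng.nextInt(POP)];
        return fitness(a) >= fitness(b) ? a : b;
    }

    public static void main(String[] args) {
        int[][] pop = new int[POP][SITUATIONS];
        for (int[] g : pop)                          // random initial genomes
            for (int s = 0; s < SITUATIONS; s++) g[s] = rng.nextInt(ACTIONS);

        for (int gen = 0; gen < 100; gen++) {        // evolve for 100 generations
            int[][] next = new int[POP][];
            for (int i = 0; i < POP; i++) {
                int[] child = select(pop).clone();
                for (int s = 0; s < SITUATIONS; s++) // point mutation
                    if (rng.nextDouble() < MUTATION_RATE)
                        child[s] = rng.nextInt(ACTIONS);
                next[i] = child;
            }
            pop = next;
        }
        int[] best = pop[0];
        for (int[] g : pop) if (fitness(g) > fitness(best)) best = g;
        // Evolution should have pushed the best genome close to the
        // optimal policy (at least 3 of 4 genes correct).
        System.out.println(fitness(best) >= 2.5);
    }
}
```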

The latest version of the program, as well as documentation, is available on the Smarts website.

Introductory linguistics

I have taught introductory linguistics, to both undergraduate and graduate students, many times and have never been happy with the textbooks. Linguistics is still a relatively divided field, and since there is no way to cover the basic controversies in an introductory textbook, the writer has to take a stand. Unfortunately, most of them don't admit this to the reader; it is as if what they claim throughout the book were established fact. Further, the stand taken by linguistics textbook writers is almost always an outmoded version of Chomsky's theory. For example, students are often forced to learn about syntactic transformations that no one believes in any more. While these are probably of historical interest to syntacticians, they are unlikely to be so for students who are not going on in linguistics. But worse from my own perspective is the fact that the stand in these books is a generative one in the first place. There are coherent alternatives to generative linguistics, and they have the advantages that they are more closely related to other fields within cognitive science, especially psychology, and that they are more accessible to students with no linguistics background.

All of this becomes most irritating for me in the treatments of language acquisition, an area close to my heart. The usual approach is to assume that the view associated with generative grammar, that Universal Grammar is part of everyone's genetic endowment, is close to established fact. Again there is a coherent, increasingly supported alternative, one that relies on powerful statistical learning and the idea that the environment has much more regularity than we were originally led to believe. In any case, the jury is still out.

I am in the process of writing an introductory linguistics textbook on the web. I am using the book in my undergraduate course this semester. It differs from all but one or two textbooks in explicitly taking a cognitive-functional stand. It differs from most textbooks in covering a relatively small number of topics relatively thoroughly. It further differs in drawing its examples from a set of ten languages rather than an unconstrained set. This has the advantage that students may begin to see how different aspects of these languages work together. Because these are all languages I know something about, it also avoids the inevitable mistakes that creep into textbooks (at least all of the ones I've seen) when data come from secondary sources. Finally, it makes use of sound and movies in ways that are not possible in a paper book.

The latest version of the book, How Language Works, is here. Comments are welcome.

I am also developing a software package that will help introduce these concepts in a novel way. Rather than starting with data from real languages, the students will first experiment with artificial "mini-languages" using the software. These will allow them to focus on fundamental properties of language, such as compositionality and phonemes, without having to worry about the complexities of real languages. Once the concepts are familiar to them, they will be exposed to those concepts in the context of real languages.
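To give a sense of what such a mini-language might look like, here is a small sketch of compositionality: the gloss of a word is built entirely from the glosses of its parts. The forms and meanings below are invented for illustration and are not taken from the software package.

```java
import java.util.Map;

// A hypothetical two-morpheme "mini-language": each word consists of a
// stem plus a suffix, and its meaning is composed from the meanings of
// those pieces.
public class MiniLanguage {
    static final Map<String, String> STEMS =
        Map.of("ka", "dog", "mu", "cat");
    static final Map<String, String> SUFFIXES =
        Map.of("ti", "PLURAL", "na", "SINGULAR");

    // Compose the gloss of a word from the glosses of its morphemes.
    static String gloss(String word) {
        String stem = word.substring(0, 2), suffix = word.substring(2);
        return STEMS.get(stem) + "-" + SUFFIXES.get(suffix);
    }

    public static void main(String[] args) {
        System.out.println(gloss("kati"));  // prints dog-PLURAL
        System.out.println(gloss("muna"));  // prints cat-SINGULAR
    }
}
```

Because the mapping from forms to meanings is perfectly regular, students can see the principle of compositionality at work before confronting the irregularities of real languages.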