Saturday, October 13, 2012

Schedules of Reinforcement

Schedules and Patterns of Response
Patterns of response develop as a result of the organism interacting with a schedule of reinforcement (Ferster & Skinner, 1957). These patterns come about after an animal has experience with the contingency of reinforcement (SD : R → Sr arrangement) defined by a particular schedule. Subjects are exposed to a schedule of reinforcement, and, following an acquisition period, behavior typically settles into a consistent or steady-state performance (Sidman, 1960). It may take many experimental sessions before a particular pattern emerges, but once it does, the orderliness of behavior is remarkable.
The first description of schedule performance was provided by B. F. Skinner (1938) in his book, The Behavior of Organisms. In the preface to the seventh printing of that book, Skinner writes that “the cumulative records . . . purporting to show orderly changes in the behavior of individual organisms, occasioned some surprise and possibly, in some quarters, suspicion” (p. xii). Any suspicion was put to rest when Skinner’s observations were replicated in many other experiments (see Morse, 1966, for a review of early work on schedules of reinforcement).

The steady-state behavior generated when a fixed number of responses is reinforced illustrates one of these patterns. For example, a hungry rat might be required to press a lever 10 times to get a food pellet. Following reinforcement, the animal has to make another 10 responses to produce the next bit of food, then 10 more responses, and so on. In industry this requirement is referred to as piece rate. When organisms (rat, pigeon, or man) are reinforced after a fixed number of responses, a pause-and-run pattern of behavior develops. Responses required by the schedule are made rapidly and result in reinforcement. Following each reinforcement, there is a pause in responding, then another quick burst of responses. (See the section on fixed ratio later in this chapter for more detail.) This pattern repeats over and over and occurs even when the size of the schedule is changed. A pause-and-run pattern has been found for many species including horses (Myers & Mesker, 1960), chickens (Lane, 1961), a vulture (Witoslawski, Anderson, & Hanson, 1963), and children (Orlando & Bijou, 1960).
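As a rough illustration, the fixed-ratio contingency described above can be sketched in a few lines of code. This is a hypothetical sketch, not from the original text; the class name `FixedRatio` and its interface are illustrative assumptions. Note that the code captures only the contingency (which response produces reinforcement), not the post-reinforcement pause, which is a property of the organism's behavior, not of the schedule.

```python
# Hypothetical sketch of a fixed-ratio (FR) schedule: a reinforcer is
# delivered after every `ratio`-th response, e.g., FR 10 for the rat
# pressing a lever 10 times per food pellet.

class FixedRatio:
    """Deliver a reinforcer after every `ratio`-th response."""

    def __init__(self, ratio):
        self.ratio = ratio
        self.count = 0  # responses since the last reinforcer

    def respond(self):
        """Record one response; return True if it produces reinforcement."""
        self.count += 1
        if self.count == self.ratio:
            self.count = 0  # counter resets after each reinforcer
            return True
        return False

fr10 = FixedRatio(10)
outcomes = [fr10.respond() for _ in range(30)]
# Only every 10th response is reinforced.
print([i + 1 for i, r in enumerate(outcomes) if r])  # [10, 20, 30]
```

Under this arrangement the schedule is indifferent to how the 10 responses are spaced in time; the characteristic pause-and-run pattern emerges from the organism's interaction with this contingency.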
Schedules and Natural Contingencies
In the everyday environment, behavior is often reinforced on an intermittent basis. That is, operants are reinforced occasionally rather than each time they are emitted. A child who cries is not given attention every time; a predator is not successful on every hunt. When you dial the number for airport information, you get through sometimes, but often the exchange is busy. Buses do not immediately arrive when you go to a bus stop. It is clear that persistence is often essential for survival, and therefore being able to account for such behavior on the basis of the schedule that maintains it is a major discovery. In concluding his review of schedule research, Dr. Michael Zeiler (1977) states:

“It is impossible to study behavior either in or outside the laboratory without encountering a schedule of reinforcement: whenever behavior is maintained by a reinforcing stimulus, some schedule is in effect and is exerting its characteristic influences. Only when there is a clear understanding of how schedules operate will it be possible to understand the effects of reinforcing stimuli on behavior.” (p. 229)
Consider a bird foraging for food. The bird turns over sticks or leaves and once in a while finds a seed or insect. These bits of food occur only every now and then, and the distribution of reinforcement is the schedule that maintains the animal’s foraging behavior. If you were watching this bird hunt for food, you would probably see the animal’s head bobbing up and down. You might also see the bird pause and look around, change direction, and so on. This sort of activity is often attributed to the animal’s instinctive behavior patterns. However, labeling the behavior as instinctive does not explain it. Although biology certainly plays some role in this episode, perhaps more importantly, so does the schedule of food reinforcement.
Dr. Carl Cheney and his colleagues created a laboratory analog of foraging that allowed pigeons to choose between two food patches by pecking keys (Cheney, Bonem, & Bonem, 1985). The density of food available from pecking either key was based on two concurrent progressive-ratio schedules of reinforcement that increased or decreased with the amount of foraging. As reinforcers were removed from one patch, they became more scarce and therefore required more responses to produce; this was a progressively increasing ratio schedule (or depleting patch of food). Concurrently, the number of responses required for each reinforcement decreased in the other patch (or a repleting patch of food). As would be expected, this reciprocal change in reinforcement density generated switching back and forth from patch to patch as density decreased and increased. However, in order to change patches, the center key had to be pecked, which simulated travel time and effort between patches (the side keys). The researchers found that the cost of hunting (the schedule of reinforcement for pecking) in a patch, the effort (number of responses) required to change patches, and the rate of replacement in the alternative patch all contributed to the likelihood that an animal would change patches. This research is an interesting laboratory model of animals foraging in the wild that uses schedules of reinforcement to simulate several natural contingencies.

Schedules of intermittent reinforcement play an important role in the regulation of human social interaction. In this case, the behavior of one person affects what another individual does and vice versa. For example, Paul asks his friend Erin, who is looking out the window, if the pizza delivery person has arrived yet. The operant is Paul’s question, “Is the pizza here?” Reinforcement for the question is the reply from Erin. Importantly, Erin’s reply is not certain and depends on many factors.
Erin may not hear the question; she may be preoccupied with other things; she may have just had an argument with Paul and refuse to talk. No matter what the reason, Paul’s question may not be reinforced on this occasion. Of course, most of the time Erin answers when asked a question. This means that Paul’s verbal behavior is on an intermittent schedule of social reinforcement. Thus, one reason schedules are important is that they approximate some of the complex contingencies that operate with humans in the everyday environment. This type of interactive verbal conversation is cleverly simulated with pigeons in the video Cognition, Creativity and Behavior (Baxley, 1982).
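The depleting/repleting patch arrangement of Cheney, Bonem, and Bonem (1985), described earlier, also lends itself to a simple sketch. What follows is a hypothetical simulation, not the published procedure: the function name, step size, travel (changeover) cost, and the switching rule are all illustrative assumptions. Each reinforced run in a patch raises that patch's ratio requirement (depletion) and lowers the other's (repletion), and the simulated forager switches only when the saving offsets the changeover cost.

```python
# Hypothetical sketch of two concurrent progressive-ratio patches.
# ratios[i] is the current response requirement per reinforcer in patch i.

def forage(steps=200, start_ratio=10, step=2, travel_cost=5):
    """Count patch switches over `steps` reinforced runs."""
    ratios = [start_ratio, start_ratio]
    patch, switches = 0, 0
    for _ in range(steps):
        other = 1 - patch
        # Switch only when the other patch is cheaper by more than
        # the changeover (center-key "travel") cost.
        if ratios[other] + travel_cost < ratios[patch]:
            patch, other = other, patch
            switches += 1
        ratios[patch] += step                          # current patch depletes
        ratios[other] = max(1, ratios[other] - step)   # other patch repletes
    return switches

print(forage())  # regular back-and-forth switching emerges
print(forage(travel_cost=1000))  # prohibitive travel cost: 0 switches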
