Presence and absence of stimulus control in temporally defined schedule

Inhibitory stimulus control in concurrent schedules

Journal of the Experimental Analysis of Behavior, 1970

Six pigeons were exposed to two keys, a main key and a changeover key. Pecking the main key was reinforced on a variable-interval 5-min schedule when the key was blue and never reinforced when the key displayed a vertical line on a blue background. Each peck on the changeover key changed the stimulus displayed on the main key. Each subject was given two generalization tests, consisting of presentations on the main key of six orientations of the line on the blue background, with no reinforcements being given. In one test changeover-key pecks changed the stimulus; in the other test the changeover key was covered and the experimenter controlled stimulus changes. Both responses to the six stimuli and time spent in the presence of the stimuli gave U-shaped gradients when the changeover key was operative. With most subjects, absolute rates of responding to each stimulus produced unsystematic gradients, whether or not the changeover key was operative.
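
As a reading aid, here is a minimal Python sketch of the training arrangement described above: a main key whose stimulus is toggled by changeover-key pecks, with variable-interval (VI) 5-min reinforcement operating only while the main key is blue. The class name, the one-second tick, and the exponential approximation to the VI timer are illustrative assumptions, not details from the study.

```python
import random

VI_MEAN_S = 300                      # VI 5-min schedule
STIMULI = ["blue", "line-on-blue"]   # S+ and S- on the main key

class ConcurrentChangeoverProcedure:
    """Illustrative sketch of the training procedure, not the authors' program."""

    def __init__(self):
        self.stimulus = "blue"
        self.reinforcer_armed = False
        self.vi_timer = random.expovariate(1 / VI_MEAN_S)   # time to next setup

    def tick(self, dt=1.0):
        # In this sketch the VI timer runs only while the S+ (blue) is displayed.
        if self.stimulus == "blue" and not self.reinforcer_armed:
            self.vi_timer -= dt
            if self.vi_timer <= 0:
                self.reinforcer_armed = True
                self.vi_timer = random.expovariate(1 / VI_MEAN_S)

    def peck_changeover_key(self):
        # Each changeover-key peck switches the stimulus on the main key.
        self.stimulus = STIMULI[1 - STIMULI.index(self.stimulus)]

    def peck_main_key(self):
        # Main-key pecks produce food only in the presence of blue (S+);
        # pecks to the line-on-blue stimulus are never reinforced.
        if self.stimulus == "blue" and self.reinforcer_armed:
            self.reinforcer_armed = False
            return "food"
        return None
```

During the generalization tests, six line orientations would be presented with the food routine disabled; in the covered-key test, stimulus changes would be issued by the experimenter rather than by `peck_changeover_key`.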

Temporal control of periodic schedules: signal properties of reinforcement and blackout

Journal of the Experimental Analysis of Behavior, 1974

Pigeons were exposed to periodic food-reinforcement schedules in which intervals ended with equal probability in either reinforcement or brief blackout. The effects on the pattern of key pecking of sequential probability of reinforcement, interval duration, and time to reinforcement opportunity were investigated in three experiments. The major results were: (1) at short absolute interval durations, time to reinforcement opportunity determined both postreinforcement and postblackout pause (time to first key peck within an interval); (2) at long intervals, postblackout pause was consistently shorter than postreinforcement pause, even if both events signalled the same time to the next reinforcement opportunity (omission effect); (3) when reinforcement and blackout signalled different times to the next reinforcement opportunity, within the same experiment, there was some evidence for interactions analogous to behavioral contrast.
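
The schedule structure can be summarized in a short, hedged sketch: each interval ends, with probability .5, in food or in a brief blackout, and the dependent measure is the pause from that terminal event to the first key peck of the next interval. The function names and the probability written as an argument are assumptions made for illustration.

```python
import random

def schedule_of_intervals(n_intervals, p_food=0.5):
    """Yield the terminal event programmed for each successive interval."""
    for _ in range(n_intervals):
        yield "food" if random.random() < p_food else "blackout"

def postevent_pauses(event_times, peck_times):
    """Pause after each event = latency to the first key peck that follows it."""
    pauses = []
    for t_event in event_times:
        later = [t for t in peck_times if t > t_event]
        pauses.append(later[0] - t_event if later else None)
    return pauses
```

The "omission effect" reported above corresponds to the `postevent_pauses` entries that follow blackouts being consistently shorter than those that follow food at long interval durations.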

Contingency and Stimulus Change in Chained Schedules of Reinforcement

Journal of the Experimental Analysis of Behavior, 1980

Higher rates of pecking were maintained by pigeons in the middle component of three-component chained fixed-interval schedules than in that component of corresponding multiple schedules (two extinction components followed by a fixed-interval component). This rate difference did not occur in equivalent tandem and mixed schedules, in which a single stimulus was correlated with the three components. The higher rates in components of chained schedules demonstrate a reinforcing effect of the stimulus correlated with the next component; the acquired functions of this stimulus make the vocabulary of conditioned reinforcement appropriate. Problems in defining conditioned reinforcement arise not from difficulties in demonstrating reinforcing effects but from disagreements about which experimental operations allow such reinforcing effects to be called conditioned.
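
The four arrangements compared can be laid out as a small descriptive structure, a sketch assuming arbitrary key colours rather than the stimuli actually used: chained and multiple schedules present a distinct stimulus in each component, tandem and mixed schedules present one stimulus throughout, and only the final component ends in food.

```python
# Descriptive sketch of the four three-component arrangements; colours are
# placeholders, and only the last component of each sequence ends in food.
THREE_COMPONENT_ARRANGEMENTS = {
    "chained":  {"stimuli": ["red", "green", "blue"],     # distinct stimulus per component
                 "components": ["FI", "FI", "FI"]},
    "tandem":   {"stimuli": ["white", "white", "white"],  # single stimulus throughout
                 "components": ["FI", "FI", "FI"]},
    "multiple": {"stimuli": ["red", "green", "blue"],     # distinct stimulus per component
                 "components": ["EXT", "EXT", "FI"]},     # two extinction components, then FI
    "mixed":    {"stimuli": ["white", "white", "white"],  # single stimulus throughout
                 "components": ["EXT", "EXT", "FI"]},
}
```

The comparison of interest is the middle entry of each sequence: higher pecking rates there under "chained" than under "multiple", with no such difference between "tandem" and "mixed".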

Conjunctive schedules of reinforcement II: response requirements and stimulus effects

Journal of the Experimental Analysis of Behavior, 1975

Responding of three pigeons was maintained under conjunctive fixed-ratio, fixed-interval schedules where a key peck produced food after both schedule requirements were completed. The individual schedule requirements were then successively removed and reinstated with responding maintained under the following conditions: conjunctive fixed-ratio, fixed-time; fixed-time; and fixed-interval schedules. Patterns of responding changed in accord with the successive removal of the schedule requirements. Compared to the conjunctive fixed-ratio, fixed-interval schedule, pause duration increased and response rate decreased under conjunctive fixed-ratio, fixed-time schedules and under fixed-time schedules alone. Overall mean rates of responding were highest and pause duration lowest under fixed-interval schedules. When changes in the keylight colors were correlated with completion of the fixed-ratio, the end of the fixed-interval, or both of these conditions, the pattern of responding was modified and indicated a greater degree of control by the individual schedules. Although two birds showed large increases in interreinforcement time when they were initially exposed to the conjunctive schedule, when responding stabilized this measure was largely invariant for all birds across most schedule conditions.
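
A minimal sketch of the conjunctive fixed-ratio fixed-interval contingency may help: food is delivered for the first key peck made after both the response-count requirement and the elapsed-time requirement have been met. The class and parameter names are assumptions for illustration only.

```python
class ConjunctiveFRFI:
    """Sketch of a conjunctive FR FI schedule (not the original control program)."""

    def __init__(self, ratio, interval_s):
        self.ratio = ratio            # fixed-ratio requirement, e.g. 10 pecks
        self.interval_s = interval_s  # fixed-interval requirement, e.g. 60 s
        self._reset(t=0.0)

    def _reset(self, t):
        self.start_time = t
        self.peck_count = 0

    def peck(self, t):
        """Return True (food) for the first peck after both requirements are met."""
        self.peck_count += 1
        ratio_met = self.peck_count >= self.ratio
        interval_met = (t - self.start_time) >= self.interval_s
        if ratio_met and interval_met:
            self._reset(t)
            return True
        return False
```

Dropping the ratio test leaves an ordinary fixed-interval schedule, and delivering food at `interval_s` without requiring a terminal peck corresponds to the fixed-time conditions described above.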

IRT–Stimulus Contingencies in Chained Schedules: Implications for the Concept of Conditioned Reinforcement

Journal of the Experimental Analysis of Behavior, 2007

Two experiments with pigeons investigated the effects of contingencies between interresponse times (IRTs) and the transitions between the components of 2- and 4-component chained schedules (Experiments 1 and 2, respectively). The probability of component transitions varied directly with the most recent (Lag 0) IRT in some experimental conditions and with the 4th (Lag 4) IRT preceding the most recent one in others. Mean component durations were constant across conditions, so the reinforcing effect of stimulus change was dissociated from that of delay to food. IRTs were longer in the Lag-0 than in the Lag-4 conditions of both experiments, thus demonstrating that stimulus change functioned as a reinforcer. In the Lag-0 conditions of Experiment 2, the Component-1 IRTs increased more than the Component-2 IRTs, which in turn increased more than the Component-3 IRTs. This finding runs counter to the conditioned-positive-reinforcement account of chained-schedule responding, which holds that the reinforcing effect of stimulus change should vary in strength as an inverse function of the delay to the unconditioned reinforcer at the end of the chain because conditioned reinforcement is due to first- or higher-order classical conditioning. Therefore, we present other possible explanations for this effect.
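
The IRT contingency can be expressed as a short sketch, with arbitrary constants standing in for the scheduling details reported in the paper: in Lag-0 conditions the most recent interresponse time sets the probability that the next response produces a component transition, whereas in Lag-4 conditions that probability is set by the IRT emitted four responses earlier.

```python
def transition_probability(irts, lag, scale_s=3.0, p_max=0.5):
    """Probability that the current response produces a stimulus change.

    irts           -- interresponse times in seconds, most recent last
    lag            -- 0 uses the current IRT, 4 uses the IRT four responses back
    scale_s, p_max -- illustrative constants, not values from the study
    """
    if len(irts) < lag + 1:
        return 0.0
    controlling_irt = irts[-(lag + 1)]
    # probability varies directly with the controlling IRT, capped at p_max
    return min(p_max, (controlling_irt / scale_s) * p_max)
```

Because mean component durations were held constant, only this IRT-to-transition relation, and not the delay to food, differs between conditions.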

Automaintenance without stimulus-change reinforcement: Temporal control of key pecks

Journal of the Experimental Analysis of Behavior, 1979

Yoked pairs of experimentally naive pigeons were exposed to a modified autoshaping procedure in which key pecking by the leader birds postponed both keylight termination and access to grain for the leader and the follower bird. Key pecking developed and was maintained in all birds and continued through two reversals of roles in the yoked procedure. Although temporal control developed more slowly in follower birds, asymptotic temporal distributions of key pecking were similar for all birds in both leader and follower roles; maximum responding occurred soon after keylight onset and decreased to a minimum prior to reinforcement. Response distributions for both leader and follower birds were described by a mathematical model of temporal control. Follower birds received response-independent reinforcement, and the development by these birds of temporal distributions which are minimal immediately prior to reinforcement is without precedent in Pavlovian appetitive conditioning. However, maintenance of key pecking by the leader birds, whose responses postponed both stimulus-change and food reinforcement, supports an interpretation of autoshaped and automaintained key pecking as responding elicited by signaled grain presentation.
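
A rough sketch of the yoked trial contingency, assuming a reset-type postponement and arbitrary 12-s values in place of the actual trial parameters: grain (and keylight offset) is scheduled at a fixed time after keylight onset, every leader peck pushes that time back, and follower pecks have no programmed consequence.

```python
TRIAL_S = 12.0      # keylight duration if no peck occurs (assumed value)
POSTPONE_S = 12.0   # postponement produced by each leader peck (assumed value)

def trial_food_time(leader_peck_times):
    """Time of grain delivery (and keylight offset) measured from keylight onset.

    Food is scheduled TRIAL_S after onset; each leader peck that occurs before
    the currently scheduled time postpones food by POSTPONE_S.  Follower pecks
    are simply not passed to this function.
    """
    food_at = TRIAL_S
    for t in sorted(leader_peck_times):
        if t < food_at:
            food_at = t + POSTPONE_S
    return food_at
```

Under this arrangement the follower's food is response-independent, which is why the followers' pre-reinforcement minimum in pecking is noted above as being without precedent in Pavlovian appetitive conditioning.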

Reinforcement Schedules: The Role of Responses Preceding the One That Produces the Reinforcer

Journal of the Experimental Analysis of Behavior, 1971

In a two-key pigeon chamber, variable-interval reinforcement was scheduled for a specified number of pecks, emitted either on a single key or in a particular sequence on the two keys. Although the distribution of pecks between the two keys was affected by whether pecks were required on one or on both keys, the total number of pecks emitted was not; the change from a one-key to a two-key requirement simply moved some pecks from one key to the other. Thus, each peck preceding the one that produced the reinforcer contributed independently to the subsequent rate of responding; the contribution of a particular peck in the sequence was determined by the time between its emission and the delivery of the reinforcer (delay of reinforcement), and was identified by the proportion of pecks moved from one key to the other when the response requirement at that point in the sequence was moved from one key to the other.
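
A small sketch of the response requirement, assuming 'L' and 'R' as key labels: once the variable-interval timer has set up a reinforcer, food is delivered only when the immediately preceding pecks match the required sequence across the two keys (a one-key requirement is simply a sequence confined to a single key).

```python
def sequence_satisfied(recent_pecks, required_sequence):
    """True when the most recent pecks match the required key sequence.

    recent_pecks      -- pecks in order of emission, e.g. ['L', 'L', 'R']
    required_sequence -- e.g. ['L', 'L', 'R'] means two left-key pecks and then
                         a right-key peck must immediately precede the reinforcer
    """
    n = len(required_sequence)
    return recent_pecks[-n:] == required_sequence
```

The delay-of-reinforcement analysis described above then characterizes each position in `required_sequence` by the time separating that peck from food delivery.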

The Response-Reinforcement Dependency in Fixed-Interval Schedules of Reinforcement

Journal of the Experimental Analysis of Behavior, 1970

Pigeons were exposed to four different schedules of food reinforcement that arranged a fixed minimum time interval between reinforcements (60 sec or 300 sec). The first was a standard fixed‐interval schedule. The second was a schedule in which food was presented automatically at the end of the fixed time interval as long as a response had occurred earlier. The third and fourth schedules were identical to the first two except that the first response after reinforcement changed the color on the key. When the schedule required a peck after the interval elapsed, the response pattern consisted of a pause after reinforcement followed by responding at a high rate until reinforcement. When a response was not required after the termination of the interval, the pattern consisted of a pause after reinforcement, followed by responses and then by a subsequent pause until reinforcement. Having the first response after reinforcement change the color on the key had little effect on performance. Pos...
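
The four arrangements can be summarized with a hedged sketch: each guarantees a minimum interval between reinforcers, and they differ in whether a peck is required after the interval elapses and in whether the first post-reinforcement peck changes the key colour. The function below captures only the first of these distinctions; names and structure are illustrative assumptions.

```python
def reinforcer_time(peck_times, interval_s, terminal_response_required):
    """Time of food delivery within one inter-reinforcement interval.

    peck_times -- peck times measured from the preceding reinforcer
    Returns None if the requirement is never met within the recorded pecks.
    """
    if terminal_response_required:
        # schedules 1 and 3: food for the first peck after the interval elapses
        later = [t for t in peck_times if t >= interval_s]
        return later[0] if later else None
    # schedules 2 and 4: food arrives automatically at the end of the interval,
    # provided at least one peck occurred earlier in the interval
    return interval_s if any(t < interval_s for t in peck_times) else None
```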

Preference and discrimination between response-dependent and response-independent schedules of reinforcement

Journal of the Experimental Analysis of Behavior, 1975

Four Asian quail (Coturnix coturnix japonica) were exposed to concurrent-chain schedules, the terminal links of which were either variable-interval 30 sec and variable-time 30 sec, or fixed-interval 30 sec and fixed-time 30 sec. Except for one bird that exhibited a preference for the variable-interval schedule over the variable-time schedule, no consistent preferences were demonstrated for response-dependent or response-independent schedules. However, response rates were three times greater on response-dependent than on response-independent schedules. The discrimination between terminal-link schedules was rapidly recovered after the schedule positions were reversed. Casual observations revealed that the birds engaged in stereotypic circling and pecking while the response-independent schedules were operative.
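
A compact sketch of the two kinds of terminal link being compared, with assumed function names and a simple preference measure taken from initial-link responding: in the response-dependent links (variable- or fixed-interval 30 sec) food requires a peck at or after the scheduled time, while in the response-independent links (variable- or fixed-time 30 sec) food arrives at that time regardless of behaviour.

```python
def terminal_link_food_time(scheduled_s, peck_times, response_dependent):
    """Time of food in a terminal link, or None if it is never collected."""
    if response_dependent:                                  # VI 30 s or FI 30 s
        later = [t for t in peck_times if t >= scheduled_s]
        return later[0] if later else None
    return scheduled_s                                      # VT 30 s or FT 30 s

def initial_link_preference(pecks_left, pecks_right):
    """Proportion of initial-link responding on the left key (.5 = indifference)."""
    total = pecks_left + pecks_right
    return pecks_left / total if total else 0.5
```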

Waiting in pigeons: the effects of daily intercalation on temporal discrimination

1992

Pigeons trained on cyclic-interval schedules adjust their postfood pause from interval to interval within each experimental session. But on regular fixed-interval schedules, many sessions at a given parameter value are usually necessary before the typical fixed-interval "scallop" appears. In the first case, temporal control appears to act from one interfood interval to the next; in the second, it appears to act over hundreds of interfood intervals. The present experiments look at the intermediate case: daily variation in schedule parameters. In Experiments 1 and 2 we show that pauses proportional to interfood interval develop on short-valued response-initiated-delay schedules when parameters are changed daily, that additional experience under this regimen leads to little further improvement, and that pauses usually change as soon as the schedule parameter is changed. Experiment 3 demonstrates identical waiting behavior on fixed-interval and response-initiated-delay schedules when the food delays are short (<20 s) and conditions are changed daily. In Experiment 4 we show that daily intercalation prevents temporal control when interfood intervals are longer (25 to 60 s). The results of Experiment 5 suggest that downshifts in interfood interval produce more rapid waiting-time adjustments than upshifts. These and other results suggest that the effects of short interfood intervals are more persistent than those of long intervals.

Key words: linear waiting, timing, fixed-interval schedules, response-initiated delay schedules, key peck, pigeons

One of the most reliable aspects of performance on any reinforcement schedule is the postreinforcement pausing observed when reinforcers are delivered at regular time intervals. Independent of any response-reinforcer contingency, birds and mammals (including humans, under some conditions) learn to postpone food-related responses after each food delivery for a time proportional to the typical interfood interval (temporal control: Chung &
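
The "linear waiting" rule listed in the key words above can be written out as a brief sketch, applied to a response-initiated-delay (RID) schedule: after food the bird waits, its first peck starts a fixed delay that ends in food, and the next wait is approximately a linear function of the preceding interfood interval. The constants below are illustrative placeholders, not fitted values from these experiments.

```python
def next_wait(previous_interfood_interval_s, a=0.25, b=1.0):
    """Linear-waiting prediction: wait = a * preceding interfood interval + b."""
    return a * previous_interfood_interval_s + b

def simulate_rid(delay_t_s, n_intervals, wait_s=2.0):
    """Iterate the rule on a response-initiated-delay schedule.

    On a RID schedule each interfood interval is wait + programmed delay, so the
    wait settles toward the fixed point (a * delay_t_s + b) / (1 - a).
    """
    history = []
    for _ in range(n_intervals):
        interfood = wait_s + delay_t_s
        history.append((wait_s, interfood))
        wait_s = next_wait(interfood)
    return history
```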