The Response-Reinforcement Dependency in Fixed-Interval Schedules of Reinforcement
Related papers
Journal of the Experimental …, 1992
Key pecking of 4 pigeons was maintained under a multiple variable-interval 20-s variable-interval 120-s schedule of food reinforcement. When rates of key pecking were stable, a 5-s unsignaled, nonresetting delay to reinforcement separated the first peck after an interval elapsed from reinforcement in both components. Rates of pecking decreased substantially in both components. When rates were stable, the situation was changed such that the peck that began the 5-s delay also changed the color of the keylight for 0.5 s (i.e., the delay was briefly signaled). Rates increased to near-immediate reinforcement levels. In subsequent conditions, delays of 10 and 20 s, still briefly signaled, were tested. Although rates of key pecking during the component with the variable-interval 120-s schedule did not change appreciably across conditions, rates during the variable-interval 20-s component decreased greatly in 1 pigeon at the 10-s delay and decreased in all pigeons at the 20-s delay. In a control condition, the variable-interval 20-s schedule with 20-s delays was changed to a variable-interval 35-s schedule with 5-s delays, thus equating nominal rates of reinforcement. Rates of pecking increased to baseline levels. Rates of pecking, then, depended on the value of the briefly signaled delay relative to the programmed interfood times, rather than on the absolute delay value. These results are discussed in terms of similar findings in the literature on conditioned reinforcement, delayed matching to sample, and classical conditioning.
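The contingency described in this abstract — a variable-interval schedule in which the first peck after the interval elapses starts a fixed, nonresetting delay to food — can be sketched as a toy discrete-time simulation. This is a minimal illustration under assumed values (1-s time steps, a constant peck probability standing in for the pigeon, exponentially distributed intervals); the function and parameter names are not from the paper:

```python
import random

def simulate_vi_delay(mean_interval, delay, session_len=3600, peck_prob=0.5):
    """Toy sketch of a variable-interval schedule with an unsignaled,
    nonresetting delay to reinforcement. Time advances in 1-s steps;
    peck_prob is a crude stand-in for the subject's behavior. All
    parameter values here are illustrative assumptions."""
    reinforcers = 0
    # time at which the next reinforcer is "set up" by the VI schedule
    next_setup = random.expovariate(1.0 / mean_interval)
    pending_delivery = None  # time food will arrive once a delay has started
    for t in range(session_len):
        pecked = random.random() < peck_prob
        if pending_delivery is not None and t >= pending_delivery:
            # delay has elapsed: deliver food and arm the next interval
            reinforcers += 1
            pending_delivery = None
            next_setup = t + random.expovariate(1.0 / mean_interval)
        elif pending_delivery is None and pecked and t >= next_setup:
            # first peck after the interval elapses starts the delay;
            # nonresetting: further pecks during the delay do not postpone food
            pending_delivery = t + delay
    return reinforcers
```

Lengthening `delay` stretches each cycle and lowers the obtained reinforcement rate, which is why the control condition in the abstract equated nominal reinforcement rates when comparing delay values.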
Journal of the Experimental Analysis of Behavior, 1990
Two experiments with pigeons examined the relation of the duration of a signal for delay ("delay signal") to rates of key pecking. The first employed a multiple schedule comprised of two components with equal variable-interval 60-s schedules of 27-s delayed food reinforcement. In one component, a short (0.5-s) delay signal, presented immediately following the key peck that began the delay, was increased in duration across phases; in the second component the delay signal initially was equal to the length of the programmed delay (27 s) and was decreased across phases. Response rates prior to delays were an increasing function of delay-signal duration. As the delay signal was decreased in duration, response rates were generally higher than those obtained under identical delay-signal durations as the signal was increased in duration. In Experiment 2 a single variable-interval 60-s schedule of 27-s delayed reinforcement was used. Delay-signal durations were again increased gradually across phases. As in Experiment 1, response rates increased as the delay-signal duration was increased. Following the phase during which the signal lasted the entire delay, shorter delay-signal-duration conditions were introduced abruptly, rather than gradually as in Experiment 1, to determine whether the gradual shortening of the delay signal accounted for the differences observed in response rates under identical delay-signal conditions in Experiment 1. Response rates obtained during the second exposures to the conditions with shorter signals were higher than those observed under identical conditions as the signal duration was increased, as in Experiment 1. In both experiments, rates and patterns of responding during delays varied greatly across subjects and were not systematically related to delay-signal durations. 
The effects of the delay signal may be related to the signal's role as a discriminative stimulus for adventitiously reinforced intradelay behavior, or the delay signal may have served as a conditioned reinforcer by virtue of the temporal relation between it and presentation of food.
Animal Learning & Behavior, 1976
Four pigeons pecked for food reinforcement on variable-interval 1-min schedules and on the variable-interval 1-min components of multiple, concurrent, and pseudoconcurrent schedules. The pseudoconcurrent schedule provided only one schedule of reinforcement, but any reinforcer could be collected by responding on either of two keys. The rate of responding generated by the variable-interval schedule was not greater than the rates of responding generated by the components of the complex schedules. But the rate of reinforcement obtained from the variable-interval schedule was greater than the rates of reinforcement obtained from the components of the multiple schedule. These results may contradict the equation proposed by Herrnstein (1970). The equation predicts that the rate of responding generated by a schedule of reinforcement will be greater when the schedule appears alone than when it appears as one component of a complex schedule.
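The equation referred to here is Herrnstein's (1970) hyperbola, B = kR/(R + R0), with R0 augmented by alternative reinforcement when the schedule is a component of a complex schedule. A minimal numerical sketch (the parameter values k and R0 are illustrative assumptions, not fitted values):

```python
def herrnstein_rate(r, k=100.0, r0=20.0):
    """Herrnstein's (1970) hyperbola for a single schedule:
    predicted response rate B = k*r / (r + r0), where r is the obtained
    reinforcement rate, k the asymptotic response rate, and r0 the rate
    of reinforcement from unscheduled sources. Values are illustrative."""
    return k * r / (r + r0)

def herrnstein_component(r1, r_alt, k=100.0, r0=20.0):
    """Component of a complex schedule: reinforcement from the other
    component (r_alt) adds to the denominator, so the same schedule is
    predicted to support a LOWER rate than when it appears alone."""
    return k * r1 / (r1 + r_alt + r0)
```

For example, with 60 reinforcers per hour the single-schedule prediction is 100·60/80 = 75 responses per unit time, while the same schedule paired with an equal alternative source yields 100·60/140 ≈ 43 — the ordering that the data described above failed to show.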
Journal of the Experimental Analysis of Behavior, 1998
Three experiments were conducted to test an interpretation of the response-rate-reducing effects of unsignaled nonresetting delays to reinforcement in pigeons. According to this interpretation, rates of key pecking decrease under these conditions because key pecks alternate with hopper-observing behavior. In Experiment 1, 4 pigeons pecked a food key that raised the hopper provided that pecks on a different variable-interval-schedule key met the requirements of a variable-interval 60-s schedule. The stimuli associated with the availability of the hopper (i.e., houselight and keylight off, food key illuminated, feedback following food-key pecks) were gradually removed across phases while the dependent relation between hopper availability and variable-interval-schedule key pecks was maintained. Rates of pecking the variable-interval-schedule key decreased to low levels and rates of food-key pecks increased when variable-interval-schedule key pecks did not produce hopper-correlated stimuli.
A Quantitative Analysis of the Responding Maintained by Interval Schedules of Reinforcement
Journal of the Experimental Analysis of Behavior, 1968
Interval schedules of reinforcement maintained pigeons' key-pecking in six experiments. Each schedule was specified in terms of mean interval, which determined the maximum rate of reinforcement possible, and distribution of intervals, which ranged from many-valued (variable-interval) to single-valued (fixed-interval). In Exp. 1, the relative durations of a sequence of intervals from an arithmetic progression were held constant while the mean interval was varied. Rate of responding was a monotonically increasing, negatively accelerated function of rate of reinforcement over a range from 8.4 to 300 reinforcements per hour. The rate of responding also increased as time passed within the individual intervals of a given schedule. In Exp. 2 and 3, several variable-interval schedules made up of different sequences of intervals were examined. In each schedule, the rate of responding at a particular time within an interval was shown to depend at least in part on the local rate of reinforcement.
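The "arithmetic progression" construction mentioned here fixes the relative durations of the intervals while the mean is varied, and the mean interval in turn sets the maximum reinforcement rate (3600/mean per hour). A minimal sketch of one such construction — the exact progression used in the experiments is an assumption here, not taken from the paper:

```python
def arithmetic_vi(mean_interval, n=15):
    """Build a variable-interval sequence from an arithmetic progression
    whose mean equals mean_interval, running from near zero up to about
    twice the mean. Illustrative only; the paper's actual interval list
    may differ. mean_interval is in seconds."""
    step = 2.0 * mean_interval / n
    return [step * (i + 0.5) for i in range(n)]

def max_reinforcement_rate(mean_interval):
    """Maximum possible reinforcers per hour for a given mean interval (s)."""
    return 3600.0 / mean_interval
```

Scaling `mean_interval` rescales every interval by the same factor, which is exactly what "relative durations held constant while the mean interval was varied" requires.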
Waiting in pigeons: the effects of daily intercalation on temporal discrimination
1992
Pigeons trained on cyclic-interval schedules adjust their postfood pause from interval to interval within each experimental session. But on regular fixed-interval schedules, many sessions at a given parameter value are usually necessary before the typical fixed-interval "scallop" appears. In the first case, temporal control appears to act from one interfood interval to the next; in the second, it appears to act over hundreds of interfood intervals. The present experiments look at the intermediate case: daily variation in schedule parameters. In Experiments 1 and 2 we show that pauses proportional to interfood interval develop on short-valued response-initiated-delay schedules when parameters are changed daily, that additional experience under this regimen leads to little further improvement, and that pauses usually change as soon as the schedule parameter is changed. Experiment 3 demonstrates identical waiting behavior on fixed-interval and response-initiated-delay schedules when the food delays are short (<20 s) and conditions are changed daily. In Experiment 4 we show that daily intercalation prevents temporal control when interfood intervals are longer (25 to 60 s). The results of Experiment 5 suggest that downshifts in interfood interval produce more rapid waiting-time adjustments than upshifts. These and other results suggest that the effects of short interfood intervals seem to be more persistent than those of long intervals. Key words: linear waiting, timing, fixed-interval schedules, response-initiated delay schedules, key peck, pigeons.

One of the most reliable aspects of performance on any reinforcement schedule is the postreinforcement pausing observed when reinforcers are delivered at regular time intervals.
Independent of any response-reinforcer contingency, birds and mammals (including humans, under some conditions) learn to postpone food-related responses after each food delivery for a time proportional to the typical interfood interval (temporal control: Chung &
Latency and frequency of responding under discrete-trial fixed-interval schedules of reinforcement
Journal of the Experimental Analysis of Behavior, 1974
Pigeons' key pecking was studied under a number of discrete-trial fixed-interval schedules of food reinforcement. Discrete trials were presented by briefly illuminating the keylight repetitively throughout the interreinforcement interval. A response latency counterpart to the fixed-interval scallop was found, latency showing a gradual, negatively accelerated decrease across the interval. This latency pattern was largely invariant across changes in fixed-interval length, number of trials per interval, and maximum trial duration. Frequency of responding during early trials in the intervals varied, however, with different schedule parameters, being directly related to fixed-interval length, inversely related to number of trials, and complexly affected by conjoint variations of fixed-interval length and number of trials. Response latency thus was found to be simply related to elapsed time during the interval while response frequency was complexly determined by other factors as well.
Food duration and signal-controlled responding by pigeons
Bulletin of the Psychonomic Society, 1985
Six pigeons were exposed to a procedure in which a tone preceded response-dependent grain. Pecks during the tone omitted grain for that trial. The magnitude of reinforcement, as manipulated by feeder duration, had no effect on the frequency of omission responding. The results are interpreted in the context of Scalar Expectancy Theory.
Journal of the Experimental Analysis of Behavior, 1969
The effect of several reinforcement schedules on the variability in topography of a pigeon's key-peck response was determined. The measure of topography was the location of a key peck within a 10-in. wide by 0.75-in. high response key. Food reinforcement was presented from a magazine located below the center of the response key. Variability in response locus decreased to a low value during training in which each response produced reinforcement. Variability increased when fixed intervals, variable intervals, random intervals, or extinction were scheduled.