Similar Articles
 20 similar articles were found.
1.
A progressive-ratio reinforcement schedule, in which successive reinforcements required an additional 50 responses, was programmed on one key. A response on a second key reset the progressive-ratio schedule to the first step. Before punishment, all pigeons consistently reset the schedule after reinforcement on the first step, thereby minimizing the number of responses required for reinforcement. Punishment was a brief electric shock contingent upon each response on the reset key. The first effect of punishment was to change the frequency of extra responses on the reset key. Under higher intensities of punishment, the pigeons completed the advanced steps of the progressive-ratio schedule before resetting to the first step. Completions of advanced steps were accompanied by decreases in the overall rate of responding and the rate of reinforcement. When the punishment contingency was removed, the major features of pre-punishment performance were recovered.
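The reset contingency described above lends itself to a brief illustration. Below is a minimal sketch (in Python) of a progressive-ratio schedule with a reset key, assuming the 50-response step increment named in the abstract; the class name and the 500-peck run are illustrative assumptions, while the always-reset policy mirrors the pre-punishment behavior described.

    # Sketch of a progressive-ratio schedule with a reset key. Each
    # reinforcement adds 50 responses to the next requirement (per the
    # abstract); a reset-key peck returns the requirement to the first step.
    STEP_INCREMENT = 50

    class ProgressiveRatioWithReset:
        def __init__(self, increment=STEP_INCREMENT):
            self.increment = increment
            self.requirement = increment   # responses needed for the next food delivery
            self.count = 0                 # responses emitted toward that requirement

        def food_key_peck(self):
            """Peck the progressive-ratio key; return True if food is delivered."""
            self.count += 1
            if self.count >= self.requirement:
                self.count = 0
                self.requirement += self.increment  # advance to the next step
                return True
            return False

        def reset_key_peck(self):
            """Peck the reset key: the requirement returns to the first step."""
            self.requirement = self.increment
            self.count = 0

    # Pre-punishment pattern described in the abstract: reset after every
    # reinforcement, so each food delivery costs only 50 responses.
    schedule = ProgressiveRatioWithReset()
    reinforcers = 0
    for _ in range(500):                   # illustrative number of pecks
        if schedule.food_key_peck():
            reinforcers += 1
            schedule.reset_key_peck()
    print(reinforcers)                     # 500 / 50 = 10 reinforcers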

2.
The effects of experimental history on responding under a progressive-ratio schedule of reinforcement were examined. Sixteen pigeons were divided into four equal groups. Groups 1 to 3 were trained to peck a key for food under a fixed-ratio, variable-ratio, or differential-reinforcement-of-low-rate schedule of reinforcement. After training, these pigeons were shifted to a progressive-ratio schedule, later were shifted back to their original schedule (with decreased rates of reinforcement), and finally were returned to the progressive-ratio schedule. Pigeons in Group 4 (control) were maintained on the progressive-ratio schedule for the entire experiment. To test for potential "latent history" effects, pigeons responding under the progressive-ratio schedule were injected with d-amphetamine and given behavioral-momentum tests of prefeeding and extinction. Experimental histories affected responding in the immediate transition to the progressive-ratio schedule; response rates of pigeons with variable-ratio and fixed-ratio histories were higher than rates of pigeons with differential-reinforcement-of-low-rate and progressive-ratio-only histories. Pigeons with differential-reinforcement-of-low-rate histories, and to a lesser degree pigeons with variable-ratio and fixed-ratio histories, also had shorter postreinforcement pauses than pigeons with only a progressive-ratio history. No consistent long-term effects of prior contingencies on responding under the progressive-ratio schedule were evident. d-Amphetamine and resistance-to-change tests failed to reveal consistent latent history effects. The data suggest that history effects are sometimes transitory and not susceptible to latent influences.

3.
In Experiment 1, two conditions were compared: (a) a variability schedule in which food reinforcement was delivered for the fourth peck in a sequence that differed from the preceding N four-peck sequences, with the value of N continuously adjusted to maintain reinforcement probability approximately constant; and (b) a control condition in which the variability constraint was dropped but reinforcement probability remained constant. Pigeons responded approximately randomly under the variability schedule but showed strong stereotyped behavior under the control condition. Experiments 2 and 3 tested the idea that variability is the outcome of a type of frequency-dependent selection, namely differential reinforcement of infrequent behavior patterns. The results showed that pigeons alternate when frequency-dependent selection is applied to single pecks because alternation is an easy-to-learn stable pattern that satisfies the frequency-dependent condition. Nevertheless, 2 of 4 pigeons showed random behavior when frequency-dependent selection was applied to two pecks, even though double alternation is a permissible and stable stereotype under these conditions. It appears that random behavior results when pigeons are unable to acquire the stable stereotyped behavior under a given frequency-dependent schedule.
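The variability contingency in condition (a) is essentially a lag-type comparison against a window of recent sequences. A minimal sketch follows, assuming a fixed window of N = 5 rather than the continuously adjusted N used in the experiment, and hypothetical responders choosing between left (L) and right (R) keys.

    # Sketch of the variability contingency: a four-peck sequence is
    # reinforced only if it differs from each of the preceding N sequences.
    # N is fixed here; in the experiment it was adjusted to hold
    # reinforcement probability roughly constant.
    from collections import deque
    import random

    SEQ_LEN = 4          # pecks per trial (per the abstract)
    N = 5                # lag window; illustrative fixed value
    recent = deque(maxlen=N)

    def trial(sequence):
        """sequence: tuple of SEQ_LEN key choices, e.g. ('L', 'R', 'R', 'L').
        Returns True if the sequence meets the variability criterion."""
        reinforced = sequence not in recent
        recent.append(sequence)
        return reinforced

    # A quasi-random responder satisfies the contingency most of the time;
    # a stereotyped one is reinforced only on the first trial.
    random_bird = lambda: tuple(random.choice('LR') for _ in range(SEQ_LEN))
    print(sum(trial(random_bird()) for _ in range(1000)))            # high count
    recent.clear()
    print(sum(trial(('L', 'L', 'R', 'R')) for _ in range(1000)))     # 1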

4.
In Experiment 1, pigeons chose between variable- and fixed-interval schedules. The timer for 1 schedule was reset by a reinforcement on that schedule or on either schedule. In both cases, the pigeons timed reinforcement on each schedule from trial onset. The data further suggest that their behavior reflects 2 independent processes: 1 deciding when a response should be emitted and responsible for the timing of the overall activity, and the other determining what this response should be and responsible for the allocation of behavior between the 2 response keys. Results from Experiment 2, which studied choice between 2 fixed-interval schedules, support those 2 conclusions. These results have implications for the study of operant choice in general.

5.
Pigeons pecked a key and rats pressed a lever for food reinforcement under large values of the differential-reinforcement-of-low-rate schedule. Each subject was tested under 10 different schedule values ranging from 1 to 45 min and was exposed to each schedule value at least twice. The mean interresponse time and mean interreinforcement time increased with the schedule value according to power functions. Response-probability functions were computed for schedule values below 20 min and showed an increase in response probability as a function of time since the last response in most cases. Mean responses per reinforcer increased as a function of schedule value for the rats, but decreased as a function of schedule value for the pigeons. The proportion of responses with interresponse times shorter than 1 sec was an increasing function of schedule value for the pigeons, but did not vary as a function of schedule value for the rats.
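The power-function relation reported here has the general form y = a·x^b, which is conventionally fit by linear regression on log-log coordinates. A short sketch with hypothetical mean interresponse times (not values from the study):

    # Fit a power function y = a * x**b to mean interresponse time as a
    # function of DRL schedule value by regressing log(y) on log(x).
    # The data points are illustrative placeholders.
    import numpy as np

    schedule_min = np.array([1, 2, 5, 10, 20, 45])               # schedule values (min)
    mean_irt_min = np.array([1.2, 2.1, 4.8, 8.9, 16.5, 33.0])    # hypothetical means

    b, log_a = np.polyfit(np.log(schedule_min), np.log(mean_irt_min), 1)
    a = np.exp(log_a)
    print(f"mean IRT ~= {a:.2f} * (schedule value) ** {b:.2f}")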

6.
The effect of the size of the floor area of the operant test chamber on behavior was tested using a standard-size test chamber and a test chamber with one-fourth of the floor area of the standard chamber. Two groups of pigeons were tested under a differential-reinforcement-of-low-rate 15-sec schedule or a variable-interval 60-sec schedule. Both groups of pigeons had higher response rates while in the smaller floor area. Pigeons under the differential-reinforcement-of-low-rate schedule also showed a decrease in rate of reinforcement, an increase in ratio of responses to reinforcements, and an alteration in interresponse-time-per-opportunity distributions when tested in the reduced floor-area condition. These effects are similar to those found under physical restraint, indicating that amount of floor space available for locomotion interacts with schedule behavior and that physical restraint may be regarded as the lower limiting value of amount of floor area available for locomotion.

7.
The relative magnitude and relative frequency of reinforcement for two concurrent interresponse times (1.5 to 2.5 sec and 3.5 to 4.5 sec) were simultaneously varied in an experiment in which pigeons obtained grain by pecking on a single key. Visual discriminative stimuli accompanied the two time intervals in which reinforcements were arranged by a one-minute variable-interval schedule. The resulting interresponse times of each of three pigeons fell into two groups: "short" (1.0 to 2.5 sec) and "long" (3.0 to 4.5 sec). Steady-state relative frequencies of these interresponse times were orderly functions of both reinforcement variables. The combined effects of both independent variables were well summarized by a linear function of one variable, relative access to food. Unlike corresponding two-key concurrent variable-interval schedules, the present schedule did not produce an equality between the relative frequency of an operant and either the relative magnitude or the relative frequency of reinforcement of that operant. A tentative account is provided for this difference between one-key and two-key functions.

8.
Three experiments were conducted to test an interpretation of the response-rate-reducing effects of unsignaled nonresetting delays to reinforcement in pigeons. According to this interpretation, rates of key pecking decrease under these conditions because key pecks alternate with hopper-observing behavior. In Experiment 1, 4 pigeons pecked a food key that raised the hopper provided that pecks on a different variable-interval-schedule key met the requirements of a variable-interval 60-s schedule. The stimuli associated with the availability of the hopper (i.e., houselight and keylight off, food key illuminated, feedback following food-key pecks) were gradually removed across phases while the dependent relation between hopper availability and variable-interval-schedule key pecks was maintained. Rates of pecking the variable-interval-schedule key decreased to low levels and rates of food-key pecks increased when variable-interval-schedule key pecks did not produce hopper-correlated stimuli. In Experiment 2, pigeons initially pecked a single key under a variable-interval 60-s schedule. Then the dependent relation between hopper presentation and key pecks was eliminated by arranging a variable-time 60-s schedule. When rates of pecking had decreased to low levels, conditions were changed so that pecks during the final 5 s of each interval changed the keylight color from green to amber. When pecking produced these hopper-correlated stimuli, pecking occurred at high rates, despite the absence of a peck-food dependency. When peck-produced changes in keylight color were uncorrelated with food, rates of pecking fell to low levels. In Experiment 3, details (obtained delays, interresponse-time distributions, eating times) of the transition from high to low response rates produced by the introduction of a 3-s unsignaled delay were tracked from session to session in 3 pigeons that had been initially trained to peck under a conventional variable-interval 60-s schedule. Decreases in response rates soon after the transition to delayed reinforcement were accompanied by decreases in eating times and alterations in interresponse-time distributions. As response rates decreased and became stable, eating times increased and their variability decreased. These findings support an interpretation of the effects of delayed reinforcement that emphasizes the importance of hopper-observing behavior.

9.
Four experiments examined the relationship between rate of reinforcement and resistance to change in rats' and pigeons' responses under simple and multiple schedules of reinforcement. In Experiment 1, 28 rats responded under either simple fixed-ratio, variable-ratio, fixed-interval, or variable-interval schedules; in Experiment 2, 3 pigeons responded under simple fixed-ratio schedules. Under each schedule, rate of reinforcement varied across four successive conditions. In Experiment 3, 14 rats responded under either a multiple fixed-ratio schedule or a multiple fixed-interval schedule, each with two components that differed in rate of reinforcement. In Experiment 4, 7 pigeons responded under either a multiple fixed-ratio or a multiple fixed-interval schedule, each with three components that also differed in rate of reinforcement. Under each condition of each experiment, resistance to change was studied by measuring schedule-controlled performance under conditions with prefeeding, response-independent food during the schedule or during timeouts that separated components of the multiple schedules, and by measuring behavior under extinction. There were no consistent differences between rats and pigeons. There was no direct relationship between rates of reinforcement and resistance to change when rates of reinforcement varied across successive conditions in the simple schedules. By comparison, in the multiple schedules there was a direct relationship between rates of reinforcement and resistance to change during most tests of resistance to change. The major exception was delivering response-independent food during the schedule; this disrupted responding, but there was no direct relationship between rates of reinforcement and resistance to change in simple- or multiple-schedule contexts. The data suggest that rate of reinforcement determines resistance to change in multiple schedules, but that this relationship does not hold under simple schedules.

10.
Staddon and Simmelhag's proposal that behavior is produced by “principles of behavioral variation” instead of contingencies of reinforcement was tested in two experiments. In the first experiment pigeons were exposed to either a fixed-interval schedule of response-contingent reinforcement, an autoshaping schedule of stimulus-contingent reinforcement, or a fixed-time schedule of noncontingent reinforcement. Pigeons exposed to contingent reinforcement came to peck more rapidly than those exposed to noncontingent reinforcement. Staddon and Simmelhag's “principles of behavioral variation” included the proposal that patterns (interim and terminal) were a function of momentary probability of reinforcement. In the second experiment pigeons were exposed to either a fixed-time or a random-time schedule of noncontingent reinforcement. Pecking showed a constant frequency of occurrence over postfood time on the random-time schedule. Most behavior showed patterns on the fixed-time schedule that differed in overall shape (i.e., interim versus terminal) from those shown on the random-time schedule. It was concluded that both the momentary probability of reinforcement and postfood time can affect patterning.

11.
A fixed-interval schedule of reinforcement was modified by dividing each interval into 4-sec trial periods. No more than one response could occur during each trial because the operandum was inactivated for the remainder of any trial in which a response occurred. For example, under a 28-sec schedule, no more than seven responses could be emitted between reinforcements. Probabilities of responding by pigeons under six values of this discrete-trial fixed-interval schedule were best described by a two-state model: responding was either absent or infrequent immediately after reinforcement; then, at some variable time after reinforcement, there was an abrupt transition to a high and constant probability of responding on each trial. Performances under the discrete-trial procedure were less affected by uncontrolled sources of variance than performances under equivalent free-operant fixed-interval schedules.
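The discrete-trial arrangement is simple to state formally: with 4-s trials and at most one response per trial, a 28-s interval caps responding at seven responses per reinforcement. A minimal sketch follows, with an illustrative (not fitted) two-state response-probability profile.

    # Sketch of the discrete-trial fixed-interval procedure: the interval is
    # divided into 4-s trials and the key is inactivated for the rest of any
    # trial containing a response, so at most FI / 4 responses can occur.
    import random

    TRIAL_SEC = 4
    FI_SEC = 28
    TRIALS_PER_INTERVAL = FI_SEC // TRIAL_SEC     # 7

    def responses_in_one_interval(p_respond):
        """p_respond[i]: probability of a response in trial i (illustrative)."""
        return sum(random.random() < p for p in p_respond)

    # Two-state-like profile: low, constant probability early, then an abrupt
    # jump to a high, constant probability (values are placeholders).
    profile = [0.05, 0.05, 0.05, 0.90, 0.90, 0.90, 0.90]
    assert len(profile) == TRIALS_PER_INTERVAL
    print(responses_in_one_interval(profile))     # never exceeds 7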

12.
The purpose of this study was to examine effects of d-amphetamine on choice controlled by reinforcement delay. Eight pigeons responded under a concurrent-chains procedure in which one terminal-link schedule was always fixed-interval 8 s, and the other terminal-link schedule changed from session to session between fixed-interval 4 s and fixed-interval 16 s according to a 31-step pseudorandom binary sequence. After sufficient exposure to these contingencies (at least once through the pseudorandom binary sequence), the pigeons acquired a preference for the shorter reinforcement delay within each session. Estimates of the sensitivity to reinforcement immediacy were similar to those obtained in previous studies. For all pigeons, at least one dose of d-amphetamine attenuated preference and, hence, decreased estimates of sensitivity to reinforcement immediacy; in most cases, this effect occurred without a change in overall response rates. In many cases, the reduced sensitivity to reinforcement delay produced by d-amphetamine resulted primarily from a decrease in the asymptotic level of preference achieved within the session; in some cases, d-amphetamine produced complete indifference. These findings suggest that a reduction in the sensitivity to reinforcement delay may be an important behavioral mechanism of the effects of psychomotor stimulants.
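Sensitivity to reinforcement immediacy in concurrent-chains studies is commonly estimated as the slope of a generalized-matching regression of log initial-link response ratios on log immediacy ratios, where immediacy is the reciprocal of terminal-link delay. A brief sketch with hypothetical response counts (not data from the study):

    # Estimate sensitivity to immediacy as the slope of
    # log(B_other / B_fixed) against log(I_other / I_fixed),
    # where I = 1 / terminal-link delay. All numbers are illustrative.
    import numpy as np

    delay_fixed = 8.0                                    # the constant FI 8-s terminal link
    delay_other = np.array([4.0, 16.0, 4.0, 16.0])       # the alternating terminal link (s)
    resp_fixed = np.array([420.0, 610.0, 440.0, 590.0])  # hypothetical initial-link responses
    resp_other = np.array([660.0, 350.0, 640.0, 380.0])

    log_resp_ratio = np.log10(resp_other / resp_fixed)
    log_immediacy_ratio = np.log10((1 / delay_other) / (1 / delay_fixed))

    sensitivity, bias = np.polyfit(log_immediacy_ratio, log_resp_ratio, 1)
    print(f"estimated sensitivity to immediacy ~= {sensitivity:.2f}")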

13.
Three behavioral options were available to food-deprived pigeons: (1) pecking one key resulted in food reinforcement according to a 50-response progressive-ratio schedule, (2) pecking a second key reset the progressive-ratio schedule to the initial progressive-ratio step, and (3) pecking a third key produced a 3-min timeout period. Pecks on the reset key were shocked. Under low and intermediate shock intensities, timeouts were not produced; under high shock levels, timeouts were produced regularly. Timeouts occurred during the initial period of a progressive-ratio step and were more frequent during the longer steps of the progressive-ratio schedule. Response-produced timeouts under these experimental conditions could be interpreted either as an escape from aversive behavioral options or as a low-probability behavior emerging when the food reinforcement schedule exerted weaker control.

14.
In a multiple schedule, exteroceptive stimuli change when the reinforcement schedule is changed. Each performance in a multiple schedule may be considered concurrent with other behavior. Accordingly, two variable-interval schedules of reinforcement were arranged in a multiple schedule, and a third, common variable-interval schedule was programmed concurrently with each of the first two. A quantitative statement was derived that relates as a ratio the response rates for the first two (multiple) variable-interval schedules. The value of the ratio depends on the rates of reinforcement provided by those schedules and the reinforcement rate provided by the common variable-interval schedule. The following implications of the expression were evaluated in an experiment with pigeons: (a) if the reinforcement rates for the multiple variable-interval schedules are equal, then the ratio of response rates is unity at all reinforcement rates of the common schedule; (b) if the reinforcement rates for the multiple schedules are unequal, then the ratio of response rates increases as the reinforcement rate provided by the common schedule increases; (c) the limit of the ratio is equal to the ratio of the reinforcement rates. Satisfactory confirmation was obtained for the first two implications, but the third was left in doubt.
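The abstract does not reproduce the derived quantitative statement itself. One algebraic form that satisfies all three listed implications (offered here only as a hypothetical reconstruction, not as the authors' equation) is B1/B2 = R1(R2 + Rc) / [R2(R1 + Rc)], where R1 and R2 are the reinforcement rates of the two multiple variable-interval schedules and Rc is the rate provided by the common concurrent schedule. A short check that this form behaves as implications (a) through (c) require:

    # Verify that a candidate expression satisfies implications (a)-(c).
    # The expression is a hypothetical reconstruction, not quoted from the study.
    def response_rate_ratio(r1, r2, rc):
        """Ratio of response rates in the two multiple VI components."""
        return (r1 * (r2 + rc)) / (r2 * (r1 + rc))

    # (a) equal multiple-schedule reinforcement rates -> ratio is 1 at any common rate
    print(response_rate_ratio(30, 30, 10), response_rate_ratio(30, 30, 120))

    # (b) unequal rates -> ratio grows as the common reinforcement rate increases
    print([round(response_rate_ratio(60, 20, rc), 3) for rc in (0, 20, 60, 200)])

    # (c) in the limit of a large common rate, the ratio approaches 60 / 20 = 3
    print(round(response_rate_ratio(60, 20, 1e6), 3))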

15.
Four pigeons received pre-training that included presentation of the reinforcer independently of behavior and then baseline training on a variable-interval schedule of reinforcement. With the introduction of a multiple schedule, in which the first stimulus was associated with a response-contingent 1-min variable-interval schedule and a second stimulus with a response-independent 1-min variable-interval schedule, a reduction in response rate was obtained in the second component, which was not accompanied by a behavioral contrast effect in the first component. A further three pigeons were given the same pre-training and baseline training before the introduction of an otherwise identical multiple schedule, in which no reinforcement occurred in the second component. Behavioral contrast was obtained from all three subjects. The results indicated that under conditions of constant reinforcement density a reduction in responding is not a sufficient condition for the occurrence of behavioral contrast.

16.
A two-state analysis of fixed-interval responding in the pigeon
The behavior of pigeons on six geometrically spaced fixed-interval schedules ranging from 16 to 512 sec is described as a two-state process. In the first state, which begins immediately after reinforcement, response rate is low and constant. At some variable time after reinforcement there is an abrupt transition to a high and approximately constant rate. The point of rapid transition occurs, on the average, at about two-thirds of the way through the interval. Response rate in the second state is an increasing, negatively accelerated function of rate of reinforcement in the second state.
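A minimal sketch of the two-state description, assuming a single illustrative interval length, arbitrary low and high rates, and a breakpoint drawn around two-thirds of the interval with some trial-to-trial variability:

    # Two-state account of fixed-interval responding: a low, constant rate
    # after reinforcement, then an abrupt switch to a high, constant rate at
    # a variable breakpoint averaging about 2/3 of the interval.
    # The interval length and the two rates are illustrative values.
    import random

    def simulate_interval(fi_sec=128, low_rate=0.05, high_rate=1.5, dt=1.0):
        """Return the expected peck rate (pecks/s) for each second of one interval."""
        breakpoint_s = random.gauss(mu=2 / 3 * fi_sec, sigma=0.1 * fi_sec)
        rates, t = [], 0.0
        while t < fi_sec:
            rates.append(low_rate if t < breakpoint_s else high_rate)
            t += dt
        return rates

    one_interval = simulate_interval()
    print(f"overall rate: {sum(one_interval) / len(one_interval):.2f} pecks/s")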

17.
Key pecking and treadle pressing in pigeons were compared under concurrent (key-treadle) and single-operant differential-reinforcement-of-low-rate schedules of food reinforcement ranging from 5 to 60 sec (concurrent procedure) or 5 to 120 sec (single-operant procedure). Under both procedures, the two operants followed the same general law: decreasing response rate and reinforcement rate and increasing number of responses per reinforcement as a function of increasing schedule interval. High correlations were found between key pecking and treadle pressing for the measures of response rate, reinforcement rate, and responses per reinforcement. Regression equations allowed the prediction of treadle pressing from key pecking. More bursting occurred in responding to the key, and key pecking showed a more precise temporal discrimination than treadle pressing. A test for sequential dependencies between key and treadle responses showed significant dependencies not only under the concurrent procedure but also in data created artificially by merging key and treadle sequences from different pigeons under the concurrent procedure and from the same pigeon under the single-operant procedure. It seems likely that the sequential dependencies found were due to the independent action of the schedule on each operant and that behavioral dependencies did not occur with the concurrent training procedure. The key-peck operant does not appear to have any special qualities that preclude its use in discovering general laws of behavior, at least under the differential-reinforcement-of-low-rate schedule. The usefulness of the key peck in other situations requires direct experimental study.

18.
The duration and frequency of food presentation were varied in concurrent variable-interval variable-interval schedules of reinforcement. In the first experiment, in which pigeons were exposed to a succession of eight different schedules, neither relative duration nor relative frequency of reinforcement had as great an effect on response distribution as they have when they are manipulated separately. These results supported those previously reported by Todorov (1973) and Schneider (1973). In a second experiment, each of seven pigeons was exposed to only one concurrent schedule in which the frequency and/or duration of reinforcement differed on the two keys. Under these conditions, each pigeon's relative rate of response closely matched the relative total access to food that each schedule provided. This result suggests that previous failures to obtain matching may be due to factors such as an insufficient length of exposure to each schedule or to the pigeons' repeated exposure to different concurrent schedules.
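The matching relation tested in the second experiment can be stated compactly: relative response rate on a key should equal that key's share of total access to food, i.e., reinforcement frequency multiplied by reinforcer duration. A small sketch with illustrative numbers (not data from the study):

    # Matching to relative total access to food: predicted relative response
    # rate on the left key = (freq_L * dur_L) / (freq_L * dur_L + freq_R * dur_R).
    def relative_total_access(freq_left, dur_left, freq_right, dur_right):
        left = freq_left * dur_left
        right = freq_right * dur_right
        return left / (left + right)

    # e.g., left key: 40 reinforcers/hr at 6 s each; right key: 20/hr at 3 s each
    predicted = relative_total_access(40, 6, 20, 3)
    print(f"predicted relative response rate (left): {predicted:.2f}")   # 0.80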

19.
Operant reinforcement of aggression was studied in food-deprived pigeons by delivering food for attacks against a target pigeon. The food was delivered according to a fixed-interval schedule and attack behavior was recorded automatically. Attack could be conditioned and extinguished, and the proportion of time spent in attack was a direct function of the frequency of reinforcement of the attack. The fixed-interval schedule produced an increasing rate of attack during the interval between food reinforcements. This positive curvature was an inverse function of the duration of the interval. The findings revealed that the duration and temporal patterning of the complex social behavior of attack can be influenced in a substantial and predictable manner by the schedule and frequency of operant reinforcement.

20.
Key pecking by pigeons was maintained on a chained fixed-interval 4-min (12-min for 1 subject) fixed-ratio 1 schedule of food presentation. Attacks toward a restrained and protected conspecific were recorded. In the first experiment, the amount of food presented per interval was manipulated across phases by varying the number of fixed ratios required in the terminal link of the chain. Measures of attack for all pigeons during the fixed-interval component increased monotonically as a function of food amount. In the second experiment, two different food amounts alternated within each experimental session under a multiple schedule. For both pigeons in this experiment, measures of attack were higher during the component that delivered the larger food amount per interval. The differences in levels of attack induced by the two food amounts in Experiment 2, however, were not as great as in Experiment 1; apparently this was because attack during the first interval of each component was controlled in part (P-5626) or entirely (P-7848) by the reinforcement amount delivered at the end of the previous component. Attack was also a function of the location of the interfood interval within the session. For both pigeons, attack tended to decrease throughout the session. The results of both experiments suggest that attack is an increasing function of reinforcement amount under fixed-interval schedules, but that this function may be influenced by the manner in which reinforcement amount is manipulated, by the duration of the interfood interval, and by the location of the interfood interval within the experimental session. In general, these results are compatible with theories of induced attack and other schedule-induced behavior that emphasize aversive after-effects of reinforcement presentation.

