Squirrel monkeys were trained on a multiple schedule in which 10-min periods on a continuous shock avoidance schedule, indicated by a yellow light, alternated with 10-min periods on a 1.5-min variable interval schedule of food reinforcement (VI 1.5). A white light indicated that VI 1.5 was in effect, except during the middle 2 min of each VI 1.5 period, when a blue light appeared and terminated with the delivery of a 0.5-sec unavoidable shock. Stable response rates developed in the avoidance and VI 1.5 components; however, the highest response rates occurred during the blue, preshock stimulus. A series of experiments showed that responding in the blue stimulus persisted even after responding had been extinguished on both the VI schedule of food reinforcement and the shock avoidance schedule. Responding in the blue stimulus ceased when the blue stimulus terminated without shock or when it terminated with a response-contingent shock. Each time responding ceased, it was restored by terminating the blue stimulus with an unavoidable shock. When the blue stimulus was on throughout each session and unavoidable shocks were delivered at regular 10-min intervals, responding was well maintained. These results show that, in monkeys trained on a continuous avoidance schedule, unavoidable shocks can maintain responding even under conditions where responses have no programmed consequences.
The progressive ratio schedule requires the subject to emit an increasing number of responses for each successive reinforcement. Eventually, the response requirement becomes so large that the subject fails to respond for a period of 15 min and thereby terminates the session. This point is arbitrarily defined as the “breaking point” of the subject's performance. The measure is quantified in terms of the number of responses in the final completed (i.e., reinforced) ratio run of the session. Previous work has shown that this measure varies as a function of several motivational variables and may thus be useful as an index of reinforcement strength. The present study is an extension of that work. The subjects were four rats. In the first experiment, the effects of the size of the increment by which each ratio run increased were studied. In two additional experiments, the volume of a liquid reinforcer was varied using both large and small ratio increments. The results indicate that the number of responses in the final completed ratio run increases as a function of the size of the ratio increment. However, the number of reinforcements obtained by the animals per session declines sharply. When large ratio increments are used, the number of responses in the final ratio increases as a function of the volume of the reinforcer, but when small increments are used, progressive satiation results in a decline in performance with the larger volumes of liquid.
Two procedures were used in an investigation of the effects of deprivation upon counting and timing. Under the first procedure, fixed minimum interval (FMI), the rat received liquid reinforcement every time it pressed bar B after having waited a minimum of 5 sec following a press on bar A. Under the second procedure, fixed consecutive number (FCN), reinforcement was delivered every time the rat pressed bar B following a run of at least four consecutive responses on bar A.
Water deprivation was varied over a set of values ranging from 4 to 56 hr. Deprivation had almost no effect on the waiting time in the FMI procedure, or on the number of responses per run in the FCN procedure. With both procedures, increasing deprivation shortened the pause between reinforcement and the next response. In the FCN procedure, the speed with which the runs were executed increased with increasing deprivation, although the number of responses in these runs was relatively unaffected.
Of 23 pigeons, 11 received continuous reinforcement for key pecking, and 12 received an FR 10 schedule of reinforcement. The birds were then tested without food, but with potential conditioned reinforcers presented either on the same schedule as in training, on the other schedule, or not at all. Each bird in the subgroup trained on CRF and tested with Sr's at FR 10 not only gave more responses in testing than did each bird in both subgroups receiving no Sr's, but also gave more responses than did each bird in the Sr subgroup receiving CRF training and Sr's at CRF. Cumulative records are presented to show the effects of different schedules of conditioned reinforcers.
A more direct method than the usual ones for obtaining inhibitory gradients requires that the dimension of the nonreinforced stimulus selected for testing be orthogonal to the dimensions of the reinforced stimulus. In that case, the test points along the inhibitory gradient are all equally distant from the reinforced stimulus. An attempt was made to realize this condition by obtaining inhibitory gradients along the frequency dimension of a pure tone after discrimination training in which the nonreinforced stimulus was a pure tone (or tones), and the reinforced stimulus was either white noise or the absence of a tone. The results showed that some degree of specific inhibitory control was exerted by the frequency of the tone, although the gradients were broad and shallow.
A further experiment was conducted to see whether the modification of an excitatory gradient resulting from training to discriminate neighboring tones could arise from a simple interaction of inhibitory and excitatory gradients. The results indicated that it could not, since discrimination training produced a concentration of responding in the vicinity of the reinforced stimulus which cannot be derived from any plausible gradient of inhibition.