Similar Documents
20 similar documents found (search time: 0 ms)
1.
Dual-process models of the word-frequency mirror effect posit that low-frequency words are recollected more often than high-frequency words, producing the hit rate differences in the word-frequency effect, whereas high-frequency words are more familiar, producing the false-alarm-rate differences. In this pair of experiments, the authors demonstrate that the analysis of receiver operating characteristic (ROC) curves provides critical information in support of this interpretation. Specifically, when participants were required to discriminate between studied nouns and their plurality reversed complements, the ROC curve was accurately described by a threshold model that is consistent with recollection-based recognition. Further, the plurality discrimination ROC curves showed characteristics consistent with the interpretation that participants recollected low-frequency items more than high-frequency items.

2.
Young and older adults were tested on recognition memory for pictures. The Yonelinas high threshold (YHT) model, a formal implementation of 2-process theory, fit the response distribution data of both young and older adults significantly better than a normal unequal variance signal-detection model. Consistent with this finding, nonlinear z-transformed receiver operating characteristic curves were obtained for both groups. Estimates of recollection from the YHT model were significantly higher for young than for older adults. This deficit was not a consequence of a general decline in memory; older adults showed comparable overall accuracy and in fact a nonsignificant increase in their familiarity scores. Implications of these results for theories of recognition memory and the mnemonic deficit associated with aging are discussed.
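As a rough numerical illustration of the dual-process idea behind the YHT model, the sketch below shows how a recollection probability R and a familiarity strength d' jointly generate ROC points. The parameterization and parameter values are assumptions for illustration, not the authors' fitting code: raising R lifts the hit rate at every criterion while leaving false alarms unchanged, producing the asymmetric ROCs described above.

```python
from math import erf, sqrt

def phi(x):
    # Standard normal cumulative distribution function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def dual_process_roc(recollection, d_prime, criteria):
    """Predicted (false-alarm, hit) pairs under a simple dual-process
    parameterization: hit = R + (1 - R) * Phi(d'/2 - c),
    false alarm = Phi(-d'/2 - c)."""
    points = []
    for c in criteria:
        fa = phi(-d_prime / 2.0 - c)
        hit = recollection + (1.0 - recollection) * phi(d_prime / 2.0 - c)
        points.append((fa, hit))
    return points

# Hypothetical parameters: higher recollection for young adults,
# identical familiarity (d') for both groups.
criteria = [-1.0, -0.5, 0.0, 0.5, 1.0]
young = dual_process_roc(recollection=0.4, d_prime=1.0, criteria=criteria)
older = dual_process_roc(recollection=0.1, d_prime=1.0, criteria=criteria)
```

With equal d', the higher-recollection curve dominates at every criterion, which is the qualitative signature the YHT parameter estimates capture.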

3.
4.
According to dual-process models, associative recognition memory mainly relies on recollection without benefiting from familiarity. This study investigates the circumstances under which familiarity may support associative recognition judgements by comparing recognition memory for arbitrarily paired items (i.e., pairs of face stimuli depicting two different persons; interitem associations) with recognition memory for pairs of items that are highly overlapping and can be unitised into a coherent whole (i.e., pairs of physically different but very similar face stimuli depicting the same person; intraitem associations). Estimates of familiarity and recollection were derived from receiver operating characteristics. Consistent with the hypothesis that familiarity is able to support associative recognition memory, but only when the to-be-associated stimuli can be unitised, results from two experiments revealed higher familiarity estimates for intra- compared to interitem associations. By contrast, recollection for recombined pairs was higher for inter- compared to intraitem associations. We propose a hypothetical model of how intraitem associations may benefit from familiarity.

5.
Receiver operating characteristics (ROCs) have been used extensively to study the processes underlying human recognition memory, and this method has recently been applied in studies of rats. However, it is not entirely clear how far the results from human and animal studies converge, nor is it known how the different methods used to obtain ROCs in different species affect the results. A recent study used a response bias ROC manipulation with rats and demonstrated that speeding memory responses reduced the contribution of recollection, not familiarity. The current study confirms this finding in humans using a comparable response bias method. Moreover, a comparison of the response bias methods commonly used in animal studies with the confidence rating method typically employed in human studies produced similar ROC functions. The present results suggest that the analysis of recognition memory ROCs provides a fruitful method for bridging the human and animal memory literatures.

6.
Receiver operating characteristic graphs are shown to be a variant form of ordinal dominance graphs. The area above the latter graph and the area below the former graph are useful measures both of the size (or importance) of a difference between two populations and of the accuracy of discrimination performance. The usual estimator for this area is closely related to the Mann-Whitney U statistic. Statistical literature on this area estimator is reviewed. For large sample sizes, the area estimator is approximately normally distributed. Formulas for the variance and the maximum variance of the area estimator are given. Several different methods of constructing confidence intervals for the area measure are presented, and the strengths and weaknesses of each are discussed. Finally, the Appendix presents the derivation of a new mathematical result: the maximum variance of the area estimator over convex ordinal dominance graphs.
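The area estimator described above can be sketched directly. The scores below are invented, and the conservative variance bound A(1 − A)/min(m, n) is a generic assumption used only for illustration; the paper derives exact and maximum variance formulas that are not reproduced here.

```python
from math import sqrt

def auc_mann_whitney(signal, noise):
    """Area under the ROC / ordinal dominance graph, estimated as the
    probability that a random signal score exceeds a random noise score
    (ties count one half) -- the Mann-Whitney U statistic divided by m*n."""
    m, n = len(signal), len(noise)
    wins = 0.0
    for s in signal:
        for x in noise:
            if s > x:
                wins += 1.0
            elif s == x:
                wins += 0.5
    return wins / (m * n)

def auc_variance_bound(a, m, n):
    """A simple conservative variance bound, A(1 - A)/min(m, n);
    an assumption here, not the paper's exact formula."""
    return a * (1.0 - a) / min(m, n)

signal = [3.1, 2.4, 2.9, 3.6, 2.2]  # invented signal-trial scores
noise = [1.8, 2.0, 2.5, 1.1, 2.3]   # invented noise-trial scores
a = auc_mann_whitney(signal, noise)
half_width = 1.96 * sqrt(auc_variance_bound(a, len(signal), len(noise)))
ci = (max(0.0, a - half_width), min(1.0, a + half_width))
```

The normal-approximation interval above relies on the large-sample result mentioned in the abstract; with samples this small it is illustrative only.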

7.
In two experiments we investigated recognition and classification judgements using an artificial grammar learning paradigm. In Experiment 1, when only new test items had to be judged, analysis of z-transformed receiver operating characteristics (z-ROCs) revealed no differences between classification and recognition. In Experiment 2, where we included old test items, z-ROCs in the two tasks differed, suggesting that judgements relied on different types of information. The results are interpreted in terms of heuristics that people use when making classification and recognition judgements.
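A minimal sketch of how z-ROC points are computed from confidence-rating data (the counts below are hypothetical, and `statistics.NormalDist` requires Python 3.8+): hit and false-alarm rates are accumulated from the strictest criterion outward, then passed through the inverse normal CDF.

```python
from statistics import NormalDist

def zroc_points(old_counts, new_counts):
    """Convert confidence-rating frequencies (highest-confidence 'old'
    responses first) into z-transformed ROC points: (z(FA), z(hit))
    at each cumulative criterion. The most lenient criterion is dropped
    because rates of 1.0 have no finite z value."""
    nd = NormalDist()
    n_old, n_new = sum(old_counts), sum(new_counts)
    points, cum_old, cum_new = [], 0, 0
    for o, n in zip(old_counts[:-1], new_counts[:-1]):
        cum_old += o
        cum_new += n
        hit = cum_old / n_old
        fa = cum_new / n_new
        points.append((nd.inv_cdf(fa), nd.inv_cdf(hit)))
    return points

# Hypothetical 6-point confidence distributions for old and new items.
old_counts = [30, 20, 15, 10, 15, 10]
new_counts = [5, 10, 15, 20, 25, 25]
pts = zroc_points(old_counts, new_counts)
```

Under a Gaussian signal-detection account the resulting points fall on a line whose slope estimates the noise-to-signal standard-deviation ratio; departures from linearity are what analyses like those above look for.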

9.
10.
A number of Minnesota Multiphasic Personality Inventory-2 (MMPI-2) items have been hypothesized to reflect neurologic symptomatology, rather than psychopathology, among closed-head-injury (CHI) patients. Some investigators have proposed a correction factor interpretive approach, which involves the deletion of such items from the MMPI-2 profile, as a method of reducing the probability of artificial clinical scale elevations due to the symptoms of CHI. The present study employed receiver operating characteristic (ROC) analysis to evaluate the sensitivity and specificity of three correction factors. All three factors demonstrated strong sensitivity when discriminating CHI patients from normal individuals but demonstrated poor specificity when discriminating CHI patients from psychiatric patients. These findings suggest that caution should be applied in using MMPI-2 neurologic correction factors, particularly with patients who might have comorbid psychiatric conditions.
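A toy sketch of the sensitivity/specificity pattern described above, with invented correction-factor scores rather than MMPI-2 data: a cutoff tuned to separate CHI patients from normals can fail entirely against psychiatric patients whose scores are elevated for non-neurologic reasons.

```python
def sensitivity_specificity(cases, controls, cutoff):
    """Sensitivity: proportion of cases scoring at or above the cutoff.
    Specificity: proportion of controls scoring below the cutoff."""
    sens = sum(s >= cutoff for s in cases) / len(cases)
    spec = sum(s < cutoff for s in controls) / len(controls)
    return sens, spec

# Invented correction-factor scores for illustration only.
chi_patients = [8, 9, 7, 10, 8, 9]
normals = [2, 3, 1, 4, 2, 3]
psychiatric = [7, 8, 6, 9, 7, 8]  # elevated for non-neurologic reasons

sens_vs_normal, spec_vs_normal = sensitivity_specificity(chi_patients, normals, cutoff=6)
sens_vs_psych, spec_vs_psych = sensitivity_specificity(chi_patients, psychiatric, cutoff=6)
```

In this constructed example the factor is perfectly sensitive against both comparison groups but has zero specificity against the psychiatric group, which is the qualitative pattern the ROC analysis revealed.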

11.
It is shown that the symmetry of the receiver operating characteristic curve implies that the Kullback–Leibler divergences between the signal and noise populations are equal when the arguments are interchanged.

12.
For receiver operating characteristic curves to be symmetric, the signal distribution must be an orientation-reversing involution of the noise distribution on the strength axis.
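The claim in items 11 and 12 can be checked numerically for the Gaussian case using the closed-form KL divergence between normal distributions: with equal variances the ROC is symmetric and the two divergences coincide (both equal d'²/2), whereas unequal variances break both the symmetry and the equality. The parameter values below are arbitrary.

```python
from math import log

def kl_gauss(mu1, s1, mu2, s2):
    """Closed-form KL divergence D(N(mu1, s1^2) || N(mu2, s2^2))."""
    return log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2.0 * s2**2) - 0.5

d = 1.5  # arbitrary sensitivity

# Equal-variance SDT: symmetric ROC, divergences coincide (= d^2 / 2).
kl_sn = kl_gauss(d, 1.0, 0.0, 1.0)
kl_ns = kl_gauss(0.0, 1.0, d, 1.0)

# Unequal-variance SDT: asymmetric ROC, divergences differ.
kl_sn_uv = kl_gauss(d, 1.25, 0.0, 1.0)
kl_ns_uv = kl_gauss(0.0, 1.0, d, 1.25)
```

This is only a consistency check for one family of distributions; the theoretical results above hold more generally.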

13.
Receiver operating characteristic (ROC) analysis is a widely used and accepted method for improving decision making performance across a range of diagnostic settings. The goal of this paper is to demonstrate how ROC analysis can be used to improve the quality of decisions made routinely in a policing context. To begin, I discuss the general principles underlying the ROC approach and demonstrate how one can conduct the analysis. Several practical applications of ROC analysis are then presented by drawing on a number of policing tasks where the procedure has been used already (bite mark identification and linking serial crimes) or where it could be used in the future (statement validity assessment and determining the veracity of suicide notes). I conclude by considering briefly some of the potential difficulties that may be encountered when using ROC analysis in the policing context and offer some possible solutions to these problems. Copyright © 2005 John Wiley & Sons, Ltd.

14.
Is recollection a continuous/graded process or a threshold/all-or-none process? Receiver operating characteristic (ROC) analysis can answer this question as the continuous model and the threshold model predict curved and linear recollection ROCs, respectively. As memory for plurality, an item's previous singular or plural form, is assumed to rely on recollection, the nature of recollection can be investigated by evaluating plurality memory ROCs. The present study consisted of four experiments. During encoding, words (singular or plural) or objects (single/singular or duplicate/plural) were presented. During retrieval, old items with the same plurality or different plurality were presented. For each item, participants made a confidence rating ranging from “very sure old”, which was correct for same plurality items, to “very sure new”, which was correct for different plurality items. Each plurality memory ROC was the proportion of same versus different plurality items classified as “old” (i.e., hits versus false alarms). Chi-squared analysis revealed that all of the plurality memory ROCs were adequately fit by the continuous unequal variance model, whereas none of the ROCs were adequately fit by the two-high threshold model. These plurality memory ROC results indicate recollection is a continuous process, which complements previous source memory and associative memory ROC findings.
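The contrast between the two models' ROC predictions can be sketched as follows (parameter values are invented, and this shows only the shape prediction, not the chi-squared fitting procedure used in the study): the two-high-threshold model traces a straight line as the guessing rate varies, while the unequal-variance signal-detection model produces a curved ROC.

```python
from statistics import NormalDist

nd = NormalDist()

def uvsd_roc(d_prime, sigma_old, criteria):
    """Continuous unequal-variance signal-detection model:
    a curved ROC in probability space."""
    return [(nd.cdf(-c), nd.cdf((d_prime - c) / sigma_old)) for c in criteria]

def tht_roc(p_old, p_new, guess_rates):
    """Two-high-threshold model: hit = Po + (1 - Po) * g,
    false alarm = (1 - Pn) * g -- a straight line as g varies."""
    return [((1.0 - p_new) * g, p_old + (1.0 - p_old) * g) for g in guess_rates]

def slopes(points):
    # Slopes between consecutive ROC points; constant iff linear.
    return [(y2 - y1) / (x2 - x1)
            for (x1, y1), (x2, y2) in zip(points, points[1:])]

curved = uvsd_roc(1.0, 1.25, [-0.5, 0.5, 1.5])
linear = tht_roc(0.4, 0.4, [0.2, 0.5, 0.8])
```

Comparing the consecutive slopes of each point set makes the linear-versus-curved distinction concrete, which is exactly the property the model comparison above exploits.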

15.
Signal detection theory (SDT) requires that the slope of the receiver operating characteristic (ROC) is independent of the probability of signal and noise. But it has been shown that when the rating procedure is applied to detection, the slope of the receiver operating characteristic may increase as a function of the probability of the signal (Schulman & Greenberg, 1970). This presents a serious problem for signal detection theory. This problem is examined in relation to a recent theory of criterion setting (Treisman & Williams, 1984), and an explanation for the effect of signal probability on ROC slope is derived which is compatible with SDT. Further data on the relation between signal probability and ROC slope are reported.

16.
In 3 experiments, young and older adults studied lists of unrelated word pairs and were given confidence-rated item and associative recognition tests. Several different models of recognition were fit to the confidence-rating data using techniques described by S. Macho (2002, 2004). Concordant with previous findings, item recognition data were best fit by an unequal-variance signal detection theory model for both young and older adults. For both age groups, associative recognition performance was best explained by models incorporating both recollection and familiarity components. Examination of parameter estimates supported the conclusion that recollection is reduced in old age, but inferences about age differences in familiarity were highly model dependent. Implications for dual-process models of memory in old age are discussed.

17.
Receiver operating characteristics (ROCs) in recognition memory: a review
Receiver operating characteristic (ROC) analysis is being used increasingly to examine the memory processes underlying recognition memory. The authors discuss the methodological issues involved in conducting and analyzing ROC results, describe the various models that have been developed to account for these results, review the behavioral empirical literature, and assess the models in light of those results. The empirical literature includes studies of item recognition, relational recognition (e.g., source and associative tests), as well as exclusion and remember-know tasks. Nine empirical regularities are described, and a number of unresolved empirical issues are identified. The results indicate that several common classes of recognition models, such as pure threshold and pure signal detection models, are inadequate to account for recognition memory, whereas several hybrid models that incorporate a signal detection-based process and a threshold recollection or attention process are in better agreement with the results. The results indicate that there are at least two functionally distinct component processes underlying recognition memory. In addition, the ROC results have various implications for how recognition memory performance should be measured.

18.
A theory for automatized performance in item recognition tasks is outlined. Automation occurs during practice with consistent mappings of stimuli to responses. Automatized processing at stages of pattern recognition, binary coding, and response evocation is described in sufficient detail to provide a qualitative explanation for (a) extant data on transfer of training (where previous theories fail), (b) known effects of visual confusability and positive set size, and (c) the null effect of negative set size (demonstrated in Experiment 1). The effects of visual confusability and positive set size are ascribed to the stage of pattern recognition; it is suggested that their joint effects on reaction times and error rates (measured in Experiment 2) be explained at a quantitative level by a parallel random walk recognition model.

19.
Standard factorial designs in psycholinguistics have been complemented recently by large-scale databases providing empirical constraints at the level of item performance. At the same time, the development of precise computational architectures has led modelers to compare item-level performance with item-level predictions. It has been suggested, however, that item performance includes a large amount of undesirable error variance that should be quantified to determine the amount of reproducible variance that models should account for. In the present study, we provide a simple and tractable statistical analysis of this issue. We also report practical solutions for estimating the amount of reproducible variance for any database that conforms to the additive decomposition of the variance. A new empirical database consisting of the word identification times of 140 participants on 120 words is then used to test these practical solutions. Finally, we show that increases in the amount of reproducible variance are accompanied by the detection of new sources of variance.
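One simple way to estimate reproducible item-level variance under an additive decomposition (participant + item + error) is sketched below. This is an illustrative construction, not the paper's exact solution, and the data are noise-free toy numbers: participant main effects are removed by row-centering, and the bias that error variance adds to the variance of item means is subtracted out.

```python
from statistics import mean, variance

def reproducible_item_variance(rt_matrix):
    """rt_matrix[p][i] is participant p's response time on item i.
    After centering each participant's row, the variance of the item
    means overshoots the true item variance by error_var / n_p, so
    that bias is subtracted to estimate the variance a model could
    hope to account for."""
    n_p = len(rt_matrix)
    n_i = len(rt_matrix[0])
    centered = [[x - mean(row) for x in row] for row in rt_matrix]
    item_cols = [[row[i] for row in centered] for i in range(n_i)]
    item_means = [mean(col) for col in item_cols]
    var_of_means = variance(item_means)
    # Pooled within-item (error) variance around the item means.
    error_var = mean(variance(col) for col in item_cols)
    return max(0.0, var_of_means - error_var / n_p)

# Toy data: item effects (0, 2, 4, 6) ms plus participant offsets, no noise.
rts = [[500 + e for e in (0, 2, 4, 6)],
       [510 + e for e in (0, 2, 4, 6)]]
rv = reproducible_item_variance(rts)
```

With zero noise the estimate recovers the sample variance of the item effects exactly; with real data the error-variance correction is what keeps models from being asked to explain unexplainable variance.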

20.
Do the processing and online manipulation of stimuli that are less familiar require more working memory (WM) resources? Is it more difficult to solve demanding problems when the symbols involved are less rather than more familiar? We explored these questions with a dual-task paradigm in which subjects had to solve algebra problems of different complexities while simultaneously holding novel symbol–digit associations in WM. The symbols were previously unknown Chinese characters, whose familiarity was manipulated by differential training frequency with a visual search task for nine hour-long sessions over 3 weeks. Subsequently, subjects solved equations that required one or two transformations. Before each trial, two different integers were assigned to two different Chinese characters of the same training frequency. Half of the time, those characters were present as variables in the equation and had to be substituted for the corresponding digits. After attempting to solve the equation, subjects had to recognize which two characters were shown immediately before that trial and to recall the integer associated with each. Solution accuracy and response times were better when the problems required one transformation only; variable substitution was not required; or the Chinese characters were high frequency. The effects of stimulus familiarity increased as the WM demands of the equation increased. Character–digit associations were also recalled less well with low-frequency characters. These results provide strong support for the claim that WM capacity depends not only on the number of chunks of information one is attempting to process but also on their strength or familiarity.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号