Similar Documents
Found 20 similar documents (search time: 8 ms)
1.
The interference in colour naming may extend beyond critical Stroop trials. This “slow” effect was first discovered in emotional Stroop tasks, but is extended here to classical Stroop. In two experiments, meaningless coloured letter strings followed a colour word or neutral word. Student participants (Experiment 1), and 18 stroke patients and 18 matched controls (Experiment 2) showed substantial interference by incongruent colour words, both in the word trial (fast component) and in the subsequent string trial (slow component). Different patient subgroups emerged from the comparison of Stroop performance with the controls. An association of fast and slow components was only found in one subgroup. Exploratory analyses revealed no clear differences in damage location between subgroups. Fast interference caused by colour-meaning conflict may be specific for classical Stroop, but the broader occurrence of slow effects suggests a more generalised process of disengagement from attention-demanding stimuli.

2.
3.
The error of inferring dispositional causes for constrained behavior was investigated in the domain of personality. Subjects were randomly assigned to write essays presenting themselves as strongly introverted or extraverted. Within groups, subjects exchanged essays and estimated the actual (self-rated) introversion/extraversion of the writer. The procedure minimized the likelihood of certain factors conducive to correspondent inference, e.g., the low salience of constraint or nonrepresentative sample of dispositions among essay writers. The experiment included an instructional set variable which involved accentuating the situational constraint or reinforcing the subjects' inclination to individuate the writer. In all conditions, a significant pattern of correspondent inference occurred, with attributions aligned to the directionality of the essays. The results, consistent with findings from attitude attribution research, suggest that the direction of the essay provides an initial hypothesis of correspondent inference. Subjects may then use their impression of the essay's extremity as a basis upon which to adjust their attribution in accord with the constraint of the position assignment.

4.
5.
6.
A robust vision-based staircase identification method is proposed, comprising 2D staircase detection and 3D staircase localization. The 2D detector pre-screens the input image, and the 3D localization algorithm then retrieves the geometry of the staircase from the reported region of the image. A novel set of principal component analysis-based Haar-like features is introduced, which extends the classical Haar-like features from the local to the global domain and is extremely efficient at rejecting non-object regions in the early stages of the cascade. The Viola–Jones rapid object detection framework is adapted to the context of staircase detection, with modifications to the scanning scheme, the scheme for integrating multiple detections, and the final detection evaluation metrics. The V-disparity concept is applied to detect the planar regions on the staircase surface and locate 3D planes quickly from disparity maps, after which the 3D position of the staircase is localized robustly. Finally, experiments demonstrate the performance of the proposed method.
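The V-disparity representation mentioned in this abstract can be sketched in a few lines. The following is a minimal illustration assuming a dense disparity map stored as a NumPy array; the function name and parameters are illustrative, not taken from the paper:

```python
import numpy as np

def v_disparity(disparity, max_d=64):
    # Each row of the V-disparity image is a histogram of the disparity
    # values occurring in the corresponding row of the disparity map.
    # Planar scene regions (e.g. stair treads) project to line segments
    # here, which can then be extracted by line fitting or a Hough transform.
    h = disparity.shape[0]
    vd = np.zeros((h, max_d), dtype=np.int32)
    for v in range(h):
        row = disparity[v].astype(np.int64)
        valid = row[(row >= 0) & (row < max_d)]
        vd[v] = np.bincount(valid, minlength=max_d)
    return vd
```

Dominant lines in the resulting image correspond to the ground plane and stair surfaces, which is what makes the representation a fast route to 3D plane localization.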

7.
In an earlier paper (Kashdan, Biswas-Diener, & King, 2008), we outlined a critique of the distinction being made between eudaimonic and hedonic forms of happiness. That paper seems to have had the desired effect in stimulating discourse on this important subject, as evidenced by a number of responses from our colleagues. In this paper, we address these responses collectively. In particular, we outline common intellectual ground with the responding authors as well as points of difference.

8.
This paper presents a robust steganography method for HEVC based on secret sharing. To prevent intra-frame distortion drift, three classes of intra-frame prediction are defined. To improve the robustness of the secret message, the embedded message is encoded into n sub-secrets using (t, n)-threshold secret sharing. The encoded message is then embedded into the multi-coefficients of the selected 4 × 4 luminance DST blocks that meet the classes mentioned above. When the test video is subjected to different loss rates, the survival rate of the proposed steganography method increases by around 24.27%, 27.38%, 61.91% and 32.26% compared with Swati, Hayat, and Shahid (2014), respectively, and by around 28.74%, 34.60%, 56.05% and 41.40% compared with Liu, Liu, and Zhao (2017), respectively; when the loss rate is less than 15%, the proposed method achieves a 100% survival rate. The experiments show that the proposed method achieves higher performance than previous HEVC-based video steganography methods, especially in visual quality and robustness.
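The (t, n)-threshold scheme at the heart of this method is Shamir's secret sharing: any t of the n sub-secrets recover the message, so up to n − t shares can be lost in transmission. A minimal sketch over a small prime field follows; the field size and function names are illustrative, and the paper's actual embedding into DST coefficients is not shown:

```python
import random

PRIME = 2**13 - 1  # small Mersenne prime field, for illustration only

def make_shares(secret, t, n, prime=PRIME):
    # Random polynomial of degree t-1 whose constant term is the secret;
    # each share is a point (x, f(x)) on that polynomial.
    coeffs = [secret] + [random.randrange(prime) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, k, prime) for k, c in enumerate(coeffs)) % prime
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares, prime=PRIME):
    # Lagrange interpolation at x = 0 recovers the constant term
    # from any t or more shares.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % prime
                den = (den * (xi - xj)) % prime
        secret = (secret + yi * num * pow(den, prime - 2, prime)) % prime
    return secret
```

Because reconstruction succeeds from any t shares, packet loss only destroys the message when fewer than t sub-secrets survive, which is the source of the robustness figures quoted above.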

9.
10.
In this centenary of Freud's The Interpretation of Dreams it is important to revisit this classic, to discuss why it is a classic, to consider what has been learned since its publication, and to discuss what changes in our understanding of dreams and dreaming are called for. To this end, we briefly discuss some of the main themes of the book. Then we review both changes in psychoanalytic thinking and theory and the results of many studies made possible by the discovery of the electro‐encephalographic changes that occur during sleep and their relevance for understanding dreams and their function. We suspect that Freud would have been delighted to know about this explosion of information about the physiology of dreaming. With this in mind, we consider the need for modification of some of Freud's theories while noting that his basic contribution, that dreams are meaningful and understandable, has been amply confirmed. We then discuss these observations in relation to how we approach working with dreams.

11.
We assembled National Basketball Association and Major League Baseball player performance data from recent years, tracking 3 year periods in players’ careers: pre-contract year (baseline), contract year (CY; salient external incentive present), and post-contract year (salient external incentive removed). In both sports, we examined both individual scoring statistics (points scored, batting average) and non-scoring statistics (e.g. blocked shots, fielding percentage) over the 3 years. Using extrinsic motivation theories, we predicted and found a boost in some scoring statistics during the CY (relative to the pre-CY), but no change in non-scoring statistics. Using intrinsic motivation theories, we predicted and found an undermining of many statistics in the post-CY, relative to both the CY and the pre-CY baseline. Boosted CY scoring performance predicted post-CY salary raises in both sports, but salary raises were largely unrelated to post-CY performance. The CY performance boost is real, but team managers should know that it might be followed by a performance crash—the CY “syndrome.”

12.
Researchers often want to demonstrate a lack of interaction between two categorical predictors on an outcome. To justify a lack of interaction, researchers typically accept the null hypothesis of no interaction from a conventional analysis of variance (ANOVA). This method is inappropriate, as failure to reject the null hypothesis does not provide statistical evidence to support a lack of interaction. This study proposes a bootstrap‐based intersection–union test for negligible interaction that provides coherent decisions between the omnibus test and post hoc interaction contrast tests and is robust to violations of the normality and variance homogeneity assumptions. Further, a multiple comparison strategy for testing interaction contrasts following a non‐significant omnibus test is proposed. Our simulation study compared the Type I error control, omnibus power and per‐contrast power of the proposed approach to the non‐centrality‐based negligible interaction test of Cheng and Shao (2007, Statistica Sinica, 17, 1441). For 2 × 2 designs, the empirical Type I error rates of the Cheng and Shao test were very close to the nominal α level when the normality and variance homogeneity assumptions were satisfied; however, only our proposed bootstrapping approach was satisfactory under non‐normality and/or variance heterogeneity. For general a × b designs, although the omnibus Cheng and Shao test is, as expected, the most powerful, it is not robust to assumption violations and yields incoherent omnibus and interaction contrast decisions that are not possible with the intersection–union approach.
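The logic of a percentile-bootstrap equivalence test for a 2 × 2 interaction can be sketched as follows. This is a simplified single-contrast illustration under assumptions of my own (delta is the researcher's negligibility bound; the authors' full intersection–union procedure over all contrasts is not reproduced):

```python
import random

def interaction_contrast(cells):
    # 2 x 2 interaction contrast on the four cell means:
    # (mu11 - mu12) - (mu21 - mu22)
    m = [sum(c) / len(c) for c in cells]
    return m[0] - m[1] - m[2] + m[3]

def negligible_interaction(cells, delta, n_boot=2000, alpha=0.05, seed=1):
    # Declare the interaction negligible only if the 1 - 2*alpha
    # percentile-bootstrap CI for the contrast lies entirely
    # inside the equivalence bounds (-delta, delta).
    rng = random.Random(seed)
    boots = []
    for _ in range(n_boot):
        resampled = [[rng.choice(c) for _ in c] for c in cells]
        boots.append(interaction_contrast(resampled))
    boots.sort()
    lo = boots[int(alpha * n_boot)]
    hi = boots[int((1 - alpha) * n_boot) - 1]
    return -delta < lo and hi < delta
```

Note how this inverts the usual logic: instead of failing to reject a nil null, the data must actively demonstrate that the interaction is smaller than delta.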

13.
The goal of this study was to investigate the performance of Hall’s transformation of the Brunner-Dette-Munk (BDM) and Welch-James (WJ) test statistics and Box-Cox’s data transformation in factorial designs when normality and variance homogeneity assumptions were violated separately and jointly. On the basis of unweighted marginal means, we performed a simulation study to explore the operating characteristics of the methods proposed for a variety of distributions with small sample sizes. Monte Carlo simulation results showed that when data were sampled from symmetric distributions, the error rates of the original BDM and WJ tests were scarcely affected by the lack of normality and homogeneity of variance. In contrast, when data were sampled from skewed distributions, the original BDM and WJ rates were not well controlled. Under such circumstances, the results clearly revealed that Hall’s transformation of the BDM and WJ tests provided generally better control of Type I error rates than did the same tests based on Box-Cox’s data transformation. Among all the methods considered in this study, we also found that Hall’s transformation of the BDM test yielded the best control of Type I errors, although it was often less powerful than either of the WJ tests when both approaches reasonably controlled the error rates.
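The Box-Cox family referred to above is the standard power transformation for positive data. A minimal sketch follows (selection of λ, e.g. by maximum likelihood, is omitted):

```python
import math

def box_cox(x, lam):
    # Box-Cox power transformation of a positive observation x:
    #   (x**lam - 1) / lam   for lam != 0
    #   log(x)               for lam == 0 (the limit as lam -> 0)
    if x <= 0:
        raise ValueError("Box-Cox requires positive data")
    if lam == 0:
        return math.log(x)
    return (x ** lam - 1) / lam
```

λ = 1 leaves the shape of the data unchanged (up to a shift), while λ < 1 pulls in a long right tail, which is why the transformation is a candidate remedy for the skewed distributions studied here.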

14.
Classic matching theory, which is based on Herrnstein's (1961) original matching equation and includes the well-known quantitative law of effect, is almost certainly false. The theory is logically inconsistent with known experimental findings, and experiments have shown that its central constant-k assumption is not tenable. Modern matching theory, which is based on the power function version of the original matching equation, remains tenable, although it has not been discussed or studied extensively. The modern theory is logically consistent with known experimental findings, it predicts the fact and details of the violation of the classic theory's constant-k assumption, and it accurately describes at least some data that are inconsistent with the classic theory.
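For reference, the two equations contrasted in this abstract can be written down directly; a small sketch with illustrative parameter values (r1, r2 are reinforcement rates):

```python
def classic_matching(r1, r2, k):
    # Herrnstein's (1961) hyperbolic form: B1 = k * R1 / (R1 + R2),
    # with k assumed constant across conditions -- the constant-k
    # assumption that the abstract reports is not tenable.
    return k * r1 / (r1 + r2)

def generalized_matching(r1, r2, a=0.8, b=1.0):
    # Modern (power-function) matching: B1/B2 = b * (R1/R2)**a,
    # where a is sensitivity and b is bias; a = b = 1 recovers
    # strict matching of the response ratio to the reinforcement ratio.
    return b * (r1 / r2) ** a
```

The sensitivity exponent a is the extra degree of freedom that lets the modern theory absorb the data that violate the classic constant-k form.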

15.
Non-human primates compare quantities in a crude manner, by approximating their values. Less is known about the mental transformations that non-humans can perform over approximate quantities, such as arithmetic transformations. There is evidence that human symbolic arithmetic has a deep psychological connection with the primitive, approximate forms of quantification of non-human animals. Here, we ask whether the subtle performance signatures that humans exhibit during symbolic arithmetic also bear a connection to primitive arithmetic. Specifically, we examined the problem size effect, the tie effect, and the practice effect—effects which are commonly observed in children’s math performance in school. We show that, like humans, monkeys exhibited the problem size and tie effects, indicating commonalities in arithmetic algorithms with humans. Unlike humans, however, monkeys did not exhibit a practice effect. Together, these findings provide new evidence for a cognitive relation between non-symbolic and symbolic arithmetic.

16.
Let r1 and r2 be two dependent estimates of Pearson's correlation. There is a substantial literature on testing H0 : ρ1 = ρ2, the hypothesis that the population correlation coefficients are equal. However, it is well known that Pearson's correlation is not robust. Even a single outlier can have a substantial impact on Pearson's correlation, resulting in a misleading understanding of the strength of the association among the bulk of the points. A way of mitigating this concern is to use a correlation coefficient that guards against outliers, many of which have been proposed. But apparently there are no results on how to compare dependent robust correlation coefficients when there is heteroscedasticity. Extant results suggest that a basic percentile bootstrap will perform reasonably well. This paper reports simulation results indicating the extent to which this is true when using Spearman's rho, a Winsorized correlation or a skipped correlation.
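A minimal sketch of the basic percentile bootstrap for one common dependent-correlation design, overlapping correlations rho(x, y1) versus rho(x, y2), using Spearman's rho. Function names and defaults are illustrative, not the paper's:

```python
import random

def rankdata(xs):
    # Average ranks, with ties sharing the mean of their rank positions.
    idx = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[idx[j + 1]] == xs[idx[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[idx[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    # Spearman's rho: Pearson correlation computed on the ranks.
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def bootstrap_diff_ci(x, y1, y2, n_boot=2000, alpha=0.05, seed=1):
    # Percentile bootstrap CI for rho(x, y1) - rho(x, y2). Rows are
    # resampled jointly, preserving the dependence between the two
    # correlation estimates; H0 is rejected if the CI excludes zero.
    rng = random.Random(seed)
    n = len(x)
    diffs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        xb = [x[i] for i in idx]
        y1b = [y1[i] for i in idx]
        y2b = [y2[i] for i in idx]
        diffs.append(spearman(xb, y1b) - spearman(xb, y2b))
    diffs.sort()
    lo = diffs[int(alpha / 2 * n_boot)]
    hi = diffs[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Because the bootstrap makes no equal-variance assumption, the same recipe applies under the heteroscedasticity the paper is concerned with; only the choice of correlation coefficient changes.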

17.
The authors tested whether a simple model based on the cancellation of the rate of change in bearing angle could account for the behavioral adaptations produced when individuals intercept moving balls while walking. In Experiment 1, the place of arrival of the ball and the angle of approach were varied. In accord with the model, velocity regulations were earlier and more pronounced the larger the angle of approach. In Experiment 2, ball speed unexpectedly changed during a trial, once again highlighting participants' functional velocity adaptations. A direct test of the model on the basis of each individual trial (N = 256) revealed that, on average, 70% of the total variance could be explained. Together, those results confirm the usefulness of such a robust strategy in the control of interceptive tasks.
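The quantity the model cancels is the time derivative of the bearing angle. A small sketch of that derivative and the speed regulation it implies; the planar geometry and function names here are my own illustration, not the authors' implementation:

```python
import math

def bearing_rate(x, vx, ball, ball_v):
    # Walker moves along the x-axis at speed vx; the bearing angle to the
    # ball is beta = atan2(by, bx - x). Its analytic time derivative is
    #   d/dt atan2(dy, dx) = (dx*ddy - dy*ddx) / (dx**2 + dy**2).
    dx = ball[0] - x
    dy = ball[1]
    ddx = ball_v[0] - vx
    ddy = ball_v[1]
    return (dx * ddy - dy * ddx) / (dx * dx + dy * dy)

def speed_adjustment(x, vx, ball, ball_v, gain=1.0):
    # Cancelling the bearing-angle rate: accelerate when the rate is
    # negative (the ball is drifting ahead of the current course),
    # decelerate when it is positive.
    return -gain * bearing_rate(x, vx, ball, ball_v)
```

Holding the bearing angle constant is the classical constant-bearing interception strategy; nulling its rate requires no prediction of the ball's landing point, which is what makes the model robust to the unexpected speed changes of Experiment 2.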

18.
We examined 633 procedures that can be used to compare the variability of scores across independent groups. The procedures, except for one, were modifications of the procedures suggested by Levene (1960) and O'Brien (1981). We modified their procedures by substituting robust measures of the typical score and variability, rather than relying on classical estimators. The robust measures that we utilized were either based on a priori or empirically determined symmetric or asymmetric trimming strategies. The Levene‐type and O'Brien‐type transformed scores were used with either the ANOVA F test, a robust test due to Lee and Fung (1985), or the Welch (1951) test. Based on four measures of robustness, we recommend a Levene‐type transformation based upon empirically determined 20% asymmetric trimmed means, involving a particular adaptive estimator, where the transformed scores are then used with the ANOVA F test.
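The shape of a Levene-type transformation can be sketched simply. The version below uses a priori symmetric 20% trimming for brevity; the recommended procedure's empirically determined asymmetric trimming with an adaptive estimator is more involved:

```python
def trimmed_mean(xs, prop=0.2):
    # Symmetric trimmed mean: drop the lowest and highest prop fractions.
    xs = sorted(xs)
    g = int(prop * len(xs))
    kept = xs[g:len(xs) - g]
    return sum(kept) / len(kept)

def levene_transform(groups, prop=0.2):
    # Levene-type transformation: absolute deviation of each score from
    # its group's trimmed mean. Group differences in the means of these
    # transformed scores reflect group differences in spread.
    return [[abs(x - trimmed_mean(g, prop)) for x in g] for g in groups]

def anova_f(groups):
    # One-way ANOVA F statistic, applied to the transformed scores.
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ssb = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ssw = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ssb / (k - 1)) / (ssw / (n - k))
```

Substituting a trimmed mean for the ordinary mean is what keeps the deviation scores, and hence the F test on them, from being dominated by a few extreme observations.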

19.
Multiple 3-D interpretations in a classic stereokinetic effect
It is known that a flat ellipse rotating in the frontoparallel plane sooner or later appears as a rigid circular disc tilting in 3-D space. An experiment is reported in which prolonged exposure to the same flat pattern produces a second previously unnoticed 3-D percept: an elongated egg slanted in 3-D space, which points towards the observer and the end parts of which describe a circular trajectory in the frontal plane. It is shown that the achievement of this alternative percept is not affected by the particular shape of the ellipse, although the time needed to reach it increases with an ellipse with a 2:3 axis ratio.

20.
The relation between spatial ability (body orientation and body awareness) and field dependence/independence was assessed by measuring performance on the Embedded Figures Test and WISC or WAIS Block Design at four different training levels in classical ballet, beginners to advanced, and a control group with no training in ballet, gymnastics, or sports (n = 70). No significant correlation was found between perceptual style or Block Design performance and body orientation or body awareness at any of the four levels of training. No significant differences were observed between the advanced group and the control group. Training in a particular type of spatial skill such as orientation within surrounding space was not related to the skill required to manipulate an abstractly represented space.
