20 similar documents found
1.
Judith E. Larkin. European Journal of Personality, 1991, 5(1): 15-34
In the face of uncertainty and disagreement about the meaning and measurement of the self-monitoring construct, the author proposes an implicit theories approach to shed light on what self-monitoring scales may be tapping. The first study explored people's notions of what high and low self-monitors are like, based on the statements in the 18-item Self-Monitoring Scale (Gangestad and Snyder, 1985). The second study compared that measure with Lennox and Wolfe's (1984) Revised Self-Monitoring Scale and examined defensive motivation within the scales. The third study consisted of two experiments to determine whether subjects perceived the items of Gangestad and Snyder's Self-Monitoring Scale as reflecting a unitary latent entity or separate, contradictory variables. It was concluded that the implicit theories approach appears to be a useful complement to traditional factor analytic studies, providing new ways of looking at a personality construct, clarifying some theoretical issues, and generating hypotheses for future research.
2.
Straatemeier M, van der Maas HL, Jansen BR. Journal of Experimental Child Psychology, 2008, 100(4): 276-296
In the field of children’s knowledge of the earth, much debate has concerned the question of whether children’s naive knowledge—that is, their knowledge before they acquire the standard scientific theory—is coherent (i.e., theory-like) or fragmented. We conducted two studies with large samples (N = 328 and N = 381) using a new paper-and-pencil test, denoted the EARTH (EArth Representation Test for cHildren), to discriminate between these two alternatives. We performed latent class analyses on the responses to the EARTH to test mental models associated with these alternatives. The naive mental models, as formulated by Vosniadou and Brewer, were not supported by the results. The results indicated that children’s knowledge of the earth becomes more consistent as children grow older. These findings support the view that children’s naive knowledge is fragmented.
3.
Terrace HS. Trends in Cognitive Sciences, 2005, 9(4): 202-210
Recent advances have allowed the application of behaviorism's rigor to the control of complex cognitive tasks in animals. This article examines recent research on serially organized behavior in animals. 'Chaining theory', the traditional approach to the study of such behavior, reduces intelligent action to sequences of discrete stimulus-response units in which each overt response is evoked by a particular stimulus. However, such theories are too weak to explain many forms of serially organized cognition, both in humans and animals. By training non-human primates to produce arbitrary sequences that cannot be learned as chains of particular motor responses, the simultaneous chaining paradigm has overcome limitations of chaining theory in experiments on serial expertise, the use of numerical rules, knowledge of ordinal position, and distance and magnitude effects.
4.
Michael Borenstein, Jacob Cohen, Hannah R. Rothstein, Simcha Pollack, John M. Kane. Behavior Research Methods, 1992, 24(4): 565-572
Computer programs for statistical power analysis typically require the user to provide a series of values and respond by reporting the corresponding power. These programs provide essentially the same functions as a published text, albeit in a more convenient form. In this paper, we describe a program that instead uses innovative graphic techniques to provide insight into the interaction among the factors that determine power. For example, for t tests, the means and standard deviations of the two distributions, sample sizes, and alpha are displayed as bar graphs. As the researcher modifies these values, the corresponding values of beta (also displayed as a bar graph) and power are updated and displayed immediately. By displaying all of the factors that are instrumental in determining power, the program ensures that each will be addressed. By allowing the user to determine the impact that any modifications will have on power, the program encourages an appropriate balance between alpha and beta while working within the constraints imposed by a limited sample size. The program also allows the user to generate tables and graphs to document the impact of the various factors on power. In addition, the program enables the user to run on-screen Monte Carlo simulations to demonstrate the importance of adequate statistical power, and as such, it can serve as a unique educational tool.
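The relationship the program visualizes can be sketched numerically. The Python snippet below is an illustration only (it is not part of the program described above, and the input values are hypothetical); it computes power and beta for a two-sample t test from the same ingredients the bar graphs display: the group means, a common standard deviation, the sample sizes, and alpha.

```python
# Illustrative sketch: power of a two-sided, two-sample t test from hypothetical
# means, a common standard deviation, sample sizes, and alpha.
import numpy as np
from scipy import stats

def two_sample_t_power(mean1, mean2, sd, n1, n2, alpha=0.05):
    """Return (power, beta) for an independent-samples t test with a common SD."""
    d = (mean1 - mean2) / sd                    # standardized effect size
    nc = d * np.sqrt(n1 * n2 / (n1 + n2))       # noncentrality parameter
    df = n1 + n2 - 2
    t_crit = stats.t.ppf(1 - alpha / 2, df)     # two-sided critical value
    # Power = P(|T| > t_crit) under the noncentral t distribution.
    power = (1 - stats.nct.cdf(t_crit, df, nc)) + stats.nct.cdf(-t_crit, df, nc)
    return power, 1 - power

power, beta = two_sample_t_power(mean1=100, mean2=110, sd=15, n1=30, n2=30)
print(f"power = {power:.3f}, beta = {beta:.3f}")
```

Increasing either sample size (or alpha) raises power and lowers beta, which is exactly the trade-off the program's bar graphs make visible.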
5.
To uncover those factors that buffer the impact of stressful negative experiences on adolescent adjustment, a theoretical model of adolescent stress and coping, with social support and social problem solving proposed as moderators, was investigated using path analysis. The study was conducted with 122 ninth- and tenth-grade nonreferred high school students. Using the LISREL statistical package (Jöreskog & Sörbom, 1986), it was found that a recursive loop leading from stress outcomes back to negative stressors did not allow for a successful solution to the model. However, the effects of stressful events on adjustment were mediated by coping resources, which included a combination of problem-solving abilities and social support. Overall, the findings replicated previous investigations that have demonstrated direct relationships among stressful life events, social support, problem solving, and adolescent adjustment. While a successful fit to the theoretical model was not attained, it was concluded that a refined model may provide a more acceptable solution.
6.
In many human movement studies angle-time series data on several groups of individuals are measured. Current methods for comparing groups either compare the mean value in each group or apply multivariate techniques such as principal components analysis and test the principal component scores. Such methods have been useful, though they discard a large amount of information. Functional data analysis (FDA) is an emerging statistical analysis technique in human movement research which treats the angle-time series data as a function rather than a series of discrete measurements. This approach retains all of the information in the data. Functional principal components analysis (FPCA) is an extension of multivariate principal components analysis which examines the variability of a sample of curves and has been used to examine differences in movement patterns of several groups of individuals. Currently the functional principal components (FPCs) for each group are either determined separately (yielding components that are group-specific), or by combining the data for all groups and determining the FPCs of the combined data (yielding components that summarize the entire data set). The group-specific FPCs contain both within- and between-group variation, and issues arise when comparing FPCs across groups because the order of the FPCs can differ in each group. The FPCs of the combined data may not adequately describe all groups of individuals, and comparisons between groups typically use t-tests of the mean FPC scores in each group. When these differences are statistically non-significant it can be difficult to determine how a particular intervention is affecting movement patterns or how injured subjects differ from controls. In this paper we aim to perform FPCA in a manner allowing sensible comparisons between groups of curves. A statistical technique called common functional principal components analysis (CFPCA) is implemented. CFPCA identifies the common sources of variation evident across groups but allows the order of each component to change for a particular group. This allows for the direct comparison of components across groups. We use our method to analyze a biomechanical data set examining the mechanisms of chronic Achilles tendon injury and the functional effects of orthoses.
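To make the idea of scoring different groups on a shared set of components concrete, the following Python sketch performs a simple discretized FPCA on curves sampled over a common time grid. It is a simplified illustration with synthetic data, not the CFPCA estimator used in the paper.

```python
# Simplified, discretized FPCA: curves sampled on a common grid are rows of a matrix,
# and principal component functions come from the pooled, centered data. Both groups
# are then scored on the SAME components, enabling direct comparison. Synthetic data;
# this is not the CFPCA procedure of the paper.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 101)                                   # normalized time grid
group_a = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal((20, t.size))
group_b = np.sin(2 * np.pi * t + 0.3) + 0.1 * rng.standard_normal((20, t.size))

pooled = np.vstack([group_a, group_b])
mean_curve = pooled.mean(axis=0)
centered = pooled - mean_curve

# SVD of the centered data matrix: rows of Vt are the principal component functions.
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)

# Project each group onto the same (common) components to obtain FPC scores.
scores_a = (group_a - mean_curve) @ Vt[:2].T
scores_b = (group_b - mean_curve) @ Vt[:2].T
print("variance explained by first two components:", explained[:2].round(3))
print("mean FPC1 score, group A vs. B:",
      scores_a[:, 0].mean().round(3), scores_b[:, 0].mean().round(3))
```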
7.
A simple method, based on cross-correlation functions (CCFs) between two time series of kinematic or physiological measurements, is proposed for the analysis of multisegmental movements. Special emphasis is placed on measuring accelerations. When the movements of two body segments are coordinated but consistently time lagged, their CCF displays a peak at the corresponding time abscissa. The reproducible positions of the peaks reflect biomechanical or physiological constraints. Several significantly large peaks can be observed in a CCF. It is possible to identify coordinated movements involving more than two segments by applying simple rules of compatibility between the time lags and between the signs of the correlation peaks. With the method proposed, it is possible to determine the signs of relative variation and the time lags of the successive statistically correlated segmental movements. This is particularly useful in the case of both continuous and periodic sensorimotor control, where classical poststimulus methods cannot be applied. Unlike the classical poststimulus methods, this method does not require a time origin, and it is not necessary to monitor the muscles or even to specify exactly which ones are involved. The method is also applicable to experiments involving a time origin (e.g., an applied perturbation), although in this case it is less accurate than the averaging technique. Individual postural strategies can be identified, which suggests some interesting potential applications of the method to clinical studies.
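The core computation is simply a normalized cross-correlation followed by locating its peak. The Python sketch below uses synthetic acceleration signals (the lag and noise level are hypothetical) to illustrate recovering the time lag between two coordinated segments.

```python
# Illustrative CCF sketch with synthetic data: standardize two acceleration signals,
# compute their normalized cross-correlation, and report the lag of the largest peak.
# A positive peak lag means the second signal follows (lags) the first.
import numpy as np

rng = np.random.default_rng(1)
n, true_lag = 1000, 25                          # samples; segment B lags A by 25 samples
accel_a = rng.standard_normal(n)
accel_b = np.roll(accel_a, true_lag) + 0.5 * rng.standard_normal(n)

a = (accel_a - accel_a.mean()) / accel_a.std()
b = (accel_b - accel_b.mean()) / accel_b.std()

ccf = np.correlate(b, a, mode="full") / n       # approximate correlation at each lag
lags = np.arange(-(n - 1), n)                   # lag axis for 'full' mode
peak = np.argmax(np.abs(ccf))
print(f"peak correlation {ccf[peak]:.2f} at lag {lags[peak]} samples")
```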
8.
Researchers and practitioners have long debated the structural nature of mental disorders. Until recently, arguments favoring categorical or dimensional conceptualizations have been based primarily on theoretical speculation and indirect empirical evidence. Within the depression literature, methodological limitations of past studies have hindered their capacity to inform this important controversy. Two studies were conducted using MAXCOV and MAMBAC, taxometric procedures expressly designed to assess the underlying structure of a psychological construct. Analyses were performed in large clinical samples with high base rates of major depression and a broad range of depressive symptom severity. Results of both studies, drawing on 3 widely used measures of depression, corroborated the dimensionality of depression. Implications for the conceptualization, investigation, and assessment of depression are discussed.
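For readers unfamiliar with MAMBAC, its logic can be sketched in a few lines: sort cases on one indicator, and at a series of cuts compute the mean of a second indicator above the cut minus the mean below. A clearly peaked curve is taken as evidence of a latent taxon, whereas a flat or dish-shaped curve suggests a dimension. The Python snippet below is a bare-bones illustration with simulated dimensional data, not the authors' implementation.

```python
# Bare-bones MAMBAC-style curve (illustration only, not the authors' implementation):
# sort cases on an "input" indicator; at each cut, take mean(output above cut) minus
# mean(output below cut). A sharp central peak suggests a taxon; a flat or
# dish-shaped curve suggests a dimension.
import numpy as np

def mambac_curve(input_scores, output_scores, n_cuts=50, trim=25):
    order = np.argsort(input_scores)
    out_sorted = np.asarray(output_scores)[order]
    n = len(out_sorted)
    cuts = np.linspace(trim, n - trim, n_cuts).astype(int)   # keep cases on both sides
    diffs = np.array([out_sorted[c:].mean() - out_sorted[:c].mean() for c in cuts])
    return cuts, diffs

# Simulated dimensional data: two correlated continuous depression indicators.
rng = np.random.default_rng(42)
latent = rng.standard_normal(2000)
x = latent + rng.standard_normal(2000)
y = latent + rng.standard_normal(2000)
cuts, diffs = mambac_curve(x, y)
print("mean-difference curve (should lack a sharp central peak):", diffs.round(2)[::10])
```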
9.
10.
P. E. Meehl and N. G. Waller (2002) proposed an innovative method for assessing path analysis models wherein they subjected a given model, along with a set of alternatives, to risky tests using selected elements of a sample correlation matrix. Although the authors find much common ground with the perspective underlying the Meehl-Waller approach, they suggest that there are aspects of the proposed procedure that require close examination and further development. These include the selection of only one subset of correlations to estimate parameters when multiple solutions are generally available, the fact that the risky tests may test only a subset of parameters rather than the full model of interest, and the potential for different results to be obtained from analysis of equivalent models.
11.
12.
Graham Oddie 《Synthese》2013,190(9):1647-1687
Theories of verisimilitude have routinely been classified into two rival camps—the content approach and the likeness approach—and these appear to be motivated by very different sets of data and principles. The question thus naturally arises as to whether these approaches can be fruitfully combined. Recently Zwart and Franssen (Synthese 158(1):75–92, 2007) have offered precise analyses of the content and likeness approaches, and shown that given these analyses any attempt to meld content and likeness orderings violates some basic desiderata. Unfortunately their characterizations of the approaches do not embrace the paradigm examples of those approaches. I offer somewhat different characterizations of these two approaches, as well as of the consequence approach (Schurz and Weingartner, Synthese 172(3):415–436, 2010), which happily embrace their respective paradigms. Finally I prove that the three approaches are indeed compatible, but only just, and that the cost of combining them is too high. Any account which combines the strictures of what I call the strong likeness approach with the demands of either the content or the consequence approach suffers from precisely the same defect as Popper’s—namely, it entails the trivialization of truthlikeness. The downside of eschewing the strong likeness constraints and embracing the content constraints alone is the underdetermination of the concept of truthlikeness.
13.
Gluck GA. Journal of Personality Assessment, 1979, 43(5): 541-543
Replicated the conditions established by Kramer in his attempt to contribute to the construct validation of the FIRO-B. Froehle's apparent later replication produced significantly different results from Kramer's original study. In replicating Kramer's design this researcher wished to establish whether the earlier failure to replicate was due to a difference in design or to an actual lack of construct validity of the FIRO-B. Kramer's findings were supported and an alternative explanation for the difference in Kramer's and Froehle's findings is discussed.
14.
Ehrlich LT. Journal of the American Psychoanalytic Association, 2004, 52(4): 1075-1093
Given the decline in the average psychoanalytic practice, it is crucial to examine the variables affecting the individual analyst's practice. One such variable is the analyst's reluctance to begin a new analysis. Literature exploring its origins, possible manifestations, and effects on the analyst's thinking and practicing is reviewed. The analyst's reluctance is considered (1) as a defense against powerful affects, (2) as a co-created resistance, and (3) as a manifestation of the analyst's conflicts. Two clinical examples illustrate how this reluctance and its subsequent recognition influence the analyst's work. It is suggested that the present reality of a socioeconomic climate adverse to psychoanalysis, with fewer patients willing to engage in analysis from the outset, might be used to rationalize the analyst's reluctance to begin. It is also suggested that the analyst's reluctance to begin a new analysis is much more pervasive and influential than is presently recognized.
15.
We propose the use of the bootstrap resampling technique as a tool to assess the within-subject reliability of experimental modulation effects on event-related potentials (ERPs). The assessment of the within-subject reliability is relevant in all those cases when the subject score is obtained by some estimation procedure, such as averaging. In these cases, possible deviations from the assumptions on which the estimation procedure relies may lead to severely biased results and, consequently, to incorrect functional inferences. In this study, we applied bootstrap analysis to data from an experiment aimed at investigating the relationship between ERPs and memory processes. ERPs were recorded from two groups of subjects engaged in a recognition memory task. During the study phase, subjects in Group A were required to make an orthographic judgment on 160 visually presented words, whereas subjects in Group B were only required to pay attention to the words. During the test phase all subjects were presented with the 160 previously studied words along with 160 new words and were required to decide whether the current word was “old” or “new.” To assess the effect of word imagery value, half of the words had a high imagery value and half a low imagery value. Analyses of variance performed on ERPs showed that an imagery-induced modulation of the old/new effect was evident only for subjects who were not engaged in the orthographic task during the study phase. This result supports the hypothesis that this modulation is due to some aspect of the recognition memory process and not to the stimulus encoding operations that occur during the recognition memory task. However, bootstrap analysis on the same data showed that the old/new effect on ERPs was not reliable for all the subjects. This result suggests that only a cautious inference can be made from these data.
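The within-subject logic the authors advocate can be illustrated in a few lines of code: resample the trial-level amplitudes for a single subject, form a bootstrap confidence interval for that subject's old/new difference, and treat the effect as reliable for that subject only if the interval excludes zero. The Python sketch below uses synthetic, hypothetical values and is not the authors' analysis pipeline.

```python
# Minimal single-subject bootstrap sketch (synthetic, hypothetical amplitudes):
# resample trials with replacement and build a 95% CI for the old/new ERP difference.
import numpy as np

rng = np.random.default_rng(7)
old_trials = rng.normal(loc=4.0, scale=6.0, size=150)   # mean amplitude per "old" trial (µV)
new_trials = rng.normal(loc=2.5, scale=6.0, size=150)   # mean amplitude per "new" trial (µV)

n_boot = 5000
diffs = np.empty(n_boot)
for i in range(n_boot):
    old_bs = rng.choice(old_trials, size=old_trials.size, replace=True)
    new_bs = rng.choice(new_trials, size=new_trials.size, replace=True)
    diffs[i] = old_bs.mean() - new_bs.mean()

lo, hi = np.percentile(diffs, [2.5, 97.5])
observed = old_trials.mean() - new_trials.mean()
print(f"old/new effect: {observed:.2f} µV, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
# The effect is deemed reliable for this subject only if the CI excludes zero.
```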
16.
Daniel A. Wilkenfeld. Synthese, 2014, 191(14): 3367-3391
In this paper, I argue that explanations just ARE those sorts of things that, under the right circumstances and in the right sort of way, bring about understanding. This raises the question of why such a seemingly simple account of explanation, if correct, would not have been identified and agreed upon decades ago. The answer is that only recently has it been made possible to analyze explanation in terms of understanding without the risk of collapsing both to merely phenomenological states. For the most part, theories of explanation were for 50 years held hostage to the historical accident that they far outstripped corresponding accounts of understanding in sophistication.
17.
Thomas W. Dougherty, Robert D. Pritchard. Organizational Behavior and Human Decision Processes, 1985, 35(2): 141-155
New measures of role ambiguity, role conflict, and role overload were developed for a group of attorneys located in the headquarters of a large energy company. These measures were based upon a recently developed theory of behavior in organizations, which focuses on specific job products as an essential component of organizational roles. The measures have an attractive potential for applied efforts (e.g., training) to rectify or diminish role stress problems. Forty respondents completed the product-based measures in addition to commonly used measures of the role variables and a number of outcome measures. Results indicated that (1) the product-based role measures displayed patterns of relationships with outcomes which were quite similar to the patterns for commonly used role measures, and (2) the product-based measures of role variables compared favorably to commonly used measures in terms of frequency of relationships to outcome variables and appeared to be somewhat superior in terms of method variance problems.
18.
This paper formalizes and provides static and dynamic estimators for a scaling model for rating chess players. The model was suggested by the work of Arpad Elo, the inventor of the chess rating system in current use by both the United States and international chess federations. The model can be viewed as a Thurstone Case V model that permits draws (ties). A related model based on a linear approximation is also analyzed. In the chess application, possibly changing ability parameters are estimated sequentially from sparse data structures that often involve relatively few observations on the M players to be rated. In contrast, psychological applications of paired-comparison scaling generally use models with no draw provision to estimate static parameters from a systematically obtained data structure such as a replicated “round robin” involving all M entities to be scaled. In the paper, both static and sequential estimators are provided and evaluated for a number of different data structures. Sampling theory for the estimators is developed. The application of rating systems to track temporally changing ability parameters may prove useful in many areas of psychology.
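As a point of reference, the sequential-updating idea behind such rating systems can be sketched with the standard Elo update, in which a draw is scored as 0.5. Note that Elo's practical system uses a logistic curve, whereas the paper's model is a Thurstone Case V model permitting draws; the step size K below is a hypothetical value.

```python
# Minimal sketch of the standard Elo sequential update (logistic curve, draw = 0.5).
# Illustration only; the paper analyzes a Thurstone Case V model permitting draws.
def expected_score(r_a: float, r_b: float) -> float:
    """Expected score of player A against player B."""
    return 1.0 / (1.0 + 10.0 ** ((r_b - r_a) / 400.0))

def update(r_a: float, r_b: float, score_a: float, k: float = 24.0):
    """score_a is 1 for an A win, 0.5 for a draw, 0 for an A loss."""
    e_a = expected_score(r_a, r_b)
    new_a = r_a + k * (score_a - e_a)
    new_b = r_b + k * ((1.0 - score_a) - (1.0 - e_a))
    return new_a, new_b

r1, r2 = 1600.0, 1500.0
r1, r2 = update(r1, r2, score_a=0.5)    # a draw nudges the lower-rated player upward
print(round(r1, 1), round(r2, 1))       # e.g., 1596.6 1503.4
```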
19.
The concurrent detection task is a powerful method for assessing interactions in the processing of two sensory signals. On each trial, a stimulus is presented that is composed of one, both, or neither signal, and the observer makes a detection rating for each stimulus. A classical bivariate signal-detection analysis applies to these data, but is limited by its inability to differentiate certain types of sensory interactions from more cognitive components, and by the lack of an associated testing procedure. The present paper presents an alternative analysis, based on the contingency table of sensory ratings. Six classes of effect can be distinguished and tested: (1) simple response bias, (2) detection of the two signals, (3) interference of each signal on the response to the other signal, (4) sensory and response correlation, (5) bivariate response biases, and (6) higher order association. Complete computational detail is provided.
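To give a sense of the raw material such an analysis works from, the Python sketch below builds a toy summary from hypothetical counts: four stimulus types (neither signal, A only, B only, both), the number of "yes" responses to each signal, and a crude d' comparison of detecting one signal with and without the other present. It only illustrates the data layout, not the paper's six-class testing procedure.

```python
# Hypothetical concurrent-detection counts: for each stimulus type, (n trials,
# "yes A" responses, "yes B" responses). We compare d' for signal A when B is
# absent vs. present as a crude interference index. Illustration only; this is
# not the contingency-table analysis developed in the paper.
from scipy.stats import norm

counts = {
    "neither": (200, 20, 18),
    "A only":  (200, 150, 22),
    "B only":  (200, 25, 140),
    "both":    (200, 130, 120),
}

def dprime(hits, n_signal, false_alarms, n_noise):
    """Equal-variance Gaussian d' with a small correction to avoid rates of 0 or 1."""
    h = (hits + 0.5) / (n_signal + 1)
    f = (false_alarms + 0.5) / (n_noise + 1)
    return norm.ppf(h) - norm.ppf(f)

# Detection of A with B absent (hits from "A only", false alarms from "neither")
# versus B present (hits from "both", false alarms from "B only").
d_a_alone = dprime(counts["A only"][1], counts["A only"][0],
                   counts["neither"][1], counts["neither"][0])
d_a_with_b = dprime(counts["both"][1], counts["both"][0],
                    counts["B only"][1], counts["B only"][0])
print(f"d' for A alone: {d_a_alone:.2f}; d' for A with B present: {d_a_with_b:.2f}")
```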
20.
The measurement of human behavior is a complex task, both for psychologists and human sciences researchers and with respect to technology, since advanced and sophisticated instruments may have to be implemented to manage the plurality of variables involved. In this article, an observational study is presented in which a quantitative procedure, the external variables method (Duncan & Fiske, 1977), was integrated with a structural analysis (Magnusson, 1993, 2000) in order to detect the hidden organization of nonverbal behavior in Italian and Icelandic interactions. To this aim, Theme software was introduced and employed. The results showed that both the frequency and the typology of gestures change markedly as a function of culture. Moreover, a high number of patterns was detected in both Italian and Icelandic interactions: they appeared to be complex sequences in which a large number of events were constantly happening and recurring. In this domain, Theme software provides a methodological progression from the quantitative to the structural approach.