Similar Documents
A total of 20 similar documents were found (search time: 937 ms).
1.
In this study a hierarchical model for the structure of vocational interests is proposed. Theoretical and methodological considerations, reanalysis of the C. E. Lunneborg and P. E. Lunneborg (Journal of Vocational Behavior, 1975, 7, 313–326) data, and an alternative interpretation of existing findings suggest that this model accounts for the interrelations among the vocational interest fields better than the hexagonal-circular models, or the four-factorial structure proposed by Lunneborg and Lunneborg (1975). The implications of this hierarchical model for vocational theory and some applications in vocational guidance are discussed.

2.
The class of first order polynomial measurement representations is defined, and a method for proving the existence of such representations is described. The method is used to prove the existence of first order polynomial generalizations of expected utility theory, difference measurement, and additive conjoint measurement. It is then argued that first order polynomial representations provide a deep and far reaching characterization of psychological invariance for subjective magnitudes of multiattributed stimuli. To substantiate this point, two applications of first order polynomial representation theory to the foundations of psychophysics are described. First, Relation theory, a theory of subjective magnitude proposed by Shepard (Journal of Mathematical Psychology, 1981, 24, 21–57) and Krantz (Journal of Mathematical Psychology, 1972, 9, 168–199), is generalized to a theory of magnitude for multiattributed stimuli. The generalization is based on a postulate of context invariance for the constituent uniattribute magnitudes of a multiattribute magnitude. Second, the power law for subjective magnitude is generalized to a multiattribute version of the power law. Finally, it is argued that a common logical pattern underlies multiattribute generalizations of psychological theories to first order polynomial representations. This abstract pattern suggests a strategy for theory construction in multiattribute psychophysics.

3.
The attribution made by an observer (O) to an actor in the forced compliance situation was regarded as a probability revision process which can be described by a Bayesian inference model. Os' perceptions of the forced compliance situation were analyzed in terms of the input components into the Bayesian model: prior probabilities of the relevant attitudes and the diagnostic values of the behaviors which the actor may choose. In order to test propositions made by attribution theory about such perceptions (Kelley, 1967; Messick, 1971), Os viewed actors under conditions of Low Inducement (LI) and High Inducement (HI). Before observing the actor's decision, Os estimated the prior probabilities of the relevant attitudes and the conditional probabilities of compliance and refusal given each of the attitudes. After observing the actor's decision, Os estimated the posterior probabilities of the attitudes. As expected, in the LI condition, compared to the HI condition, compliance was seen as less probable and more diagnostic about the actor's attitudes, and the posterior probability of the corresponding attitude was higher. Contrary to expectations, within both conditions, compliance, compared to refusal, was seen as less diagnostic and more probable.
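A minimal Python sketch of the Bayesian revision described in this abstract. The prior and conditional probabilities below are hypothetical placeholders; in the study these values were estimated by the observers themselves.

```python
# Bayes' rule over candidate attitudes: posterior P(attitude | behavior).
# All numeric values are illustrative, not data from the study.

def posterior(prior, likelihood):
    """prior: attitude -> P(attitude); likelihood: attitude -> P(behavior | attitude)."""
    joint = {a: prior[a] * likelihood[a] for a in prior}
    norm = sum(joint.values())
    return {a: p / norm for a, p in joint.items()}

# Low-inducement condition: compliance is improbable but highly diagnostic,
# so observing it sharply revises belief toward the corresponding attitude.
prior = {"favorable": 0.5, "unfavorable": 0.5}
p_comply = {"favorable": 0.9, "unfavorable": 0.2}  # hypothetical conditional probabilities
print(posterior(prior, p_comply))  # posterior for "favorable" rises well above 0.5
```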

4.
Various statistics have been proposed as standard methods for calculating and reporting interobserver agreement scores. The advantages and disadvantages of each have been discussed in this journal recently but without resolution. A formula is presented that combines separate measures of occurrence and nonoccurrence percentages of agreement, with weight assigned to each measure, varying according to the observed rate of behavior. This formula, which is a modification of a formula proposed by Clement (1976), appears to reduce distortions due to "chance" agreement encountered with very high or low observed rates of behavior while maintaining the mathematical and conceptual simplicity of the conventional method for calculating occurrence and nonoccurrence agreement.
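The abstract does not reproduce the formula itself, so the sketch below is only a hypothetical stand-in showing the general idea: blend occurrence and nonoccurrence agreement, with the weights shifting according to the observed rate of behavior.

```python
# Illustrative only: this weighting scheme is an assumption, not the article's
# actual modification of Clement (1976).

def weighted_agreement(occ_agree, nonocc_agree, observed_rate):
    """occ_agree, nonocc_agree: percentage agreement scores in [0, 1];
    observed_rate: proportion of intervals in which the behavior occurred."""
    # Lean on nonoccurrence agreement when the behavior is frequent (where
    # occurrence agreement is inflated by chance), and vice versa.
    w_occ, w_nonocc = 1.0 - observed_rate, observed_rate
    return w_occ * occ_agree + w_nonocc * nonocc_agree

# Very high rate of behavior: the score is dominated by nonoccurrence agreement.
print(weighted_agreement(occ_agree=0.95, nonocc_agree=0.40, observed_rate=0.9))  # ~0.455
```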

5.
Suppes (1969, Journal of Mathematical Psychology, 6) apparently proved that any finite automaton could be mimicked by the asymptotic behavior of a stimulus-sampling S–R model, broadly implying that cognitive theory could be reduced to S–R theory. Here it is shown that the finite automata used by Suppes are more restricted and less powerful than finite automata in general. Furthermore, the S–R models proposed by Suppes are limited to producing the behavior of only these restricted automata, and cannot mimic all finite automata. Hence, the formal claim that S–R models are adequate to produce all behaviors obtainable from finite automata, and the informal claim that cognitive theory reduces to S–R theory, do not follow from Suppes's (1969) result. Some alternative S–R models and their problems are also briefly discussed.
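For reference, a plain deterministic finite automaton in Python; the argument above turns on whether asymptotic S–R behavior can reproduce the input-output behavior of all such machines or only of a restricted subclass. This is ordinary textbook DFA simulation, not Suppes's construction.

```python
def run_dfa(transitions, start, accepting, inputs):
    """transitions: dict mapping (state, symbol) -> next state."""
    state = start
    for symbol in inputs:
        state = transitions[(state, symbol)]
    return state in accepting

# Parity automaton: accepts strings containing an even number of 1s.
trans = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd", "0"): "odd", ("odd", "1"): "even"}
print(run_dfa(trans, "even", {"even"}, "1011"))  # False: three 1s
```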

6.
This experiment sought to determine whether previously found metric violations of additive expectancy-value models (cf. J. C. Shanteau, Journal of Experimental Psychology, 1974, 103, 680–691; J. G. Lynch and J. L. Cohen, Journal of Personality and Social Psychology, 1978, 36, 1138–1151) were attributable to the inappropriateness of these models or to nonlinearities in the relationship between numerical ratings and underlying psychological impressions. Undergraduate participants performed two tasks employing the same experimental stimuli. In the first task, they rated the subjective values of hypothetical bets, judged separately and in combination. In the second task, they made pairwise comparisons of the same bets in terms of preference. The use of the same experimental stimuli in both tasks allowed a test of alternative models of utility judgment through application of the criterion of scale convergence (M. H. Birnbaum & C. T. Veit, Perception and Psychophysics, 1974, 15, 7–15). Results suggested that the additive expectancy-value model of judgments of the utilities of combinations of outcomes should be replaced by a weighted averaging rule in which the weight given to the value of each outcome in the averaging process is greater when this value is negative and extreme than when it is neutral.
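A hypothetical sketch of the kind of weighted-averaging rule the results favor, in which an outcome's weight grows when its value is negative and extreme. The weighting function and constants are placeholders, not parameters estimated in the study.

```python
def outcome_weight(value, base=1.0, extremity_gain=0.02):
    # Negative, extreme values receive more weight than neutral ones (assumed form).
    return (base + extremity_gain * abs(value)) if value < 0 else base

def judged_utility(values):
    weights = [outcome_weight(v) for v in values]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# A neutral outcome combined with a large loss: the judgment is pulled toward
# the loss more than a simple equally weighted average (-25) would be.
print(judged_utility([0, -50]))  # about -33.3
```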

7.
One of the most notable counterexamples to expected utility theory is the “Allais paradox” (M. Allais, 1953, Econometrica, 21, 503–546). A number of alternative theories have been proposed in an attempt to resolve this paradox, notably including subjectively weighted utility (SWU) theory (Karmarkar, 1978; Karmarkar, 1979, Organizational Behavior and Human Performance, 24, 67–72). It is shown that SWU theory necessarily involves violations of dominance, but that the theory can be modified to avoid these violations. The result is a special case of J. Quiggin's anticipated utility theory (1982, Journal of Economic Behavior and Organization, 3, 323–343).
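A brief Python contrast between pointwise probability weighting (SWU-style) and rank-dependent weighting of the kind used in Quiggin's anticipated utility. The weighting function w is a hypothetical example, not one fitted to data; the point is only that pointwise weights need not sum to one, which is what opens the door to dominance violations.

```python
def w(p):
    return p ** 0.5  # illustrative weighting function; overweights small probabilities

def swu(outcomes):
    """outcomes: list of (value, probability); weight applied to each probability separately."""
    return sum(w(p) * v for v, p in outcomes)

def anticipated_utility(outcomes):
    """Rank-dependent form: decision weights are differences of w() at decumulative probabilities."""
    total, cum = 0.0, 0.0
    for v, p in sorted(outcomes, key=lambda vp: vp[0], reverse=True):  # best to worst
        total += (w(cum + p) - w(cum)) * v
        cum += p
    return total

gamble = [(10, 0.5), (20, 0.5)]
print(swu(gamble))                  # ~21.2, exceeds the best outcome: dominance trouble
print(anticipated_utility(gamble))  # ~17.1, stays between the two outcomes
```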

8.
The quality of life of people with end stage kidney disease (ESKD) has traditionally been measured using instruments that emphasise objective health status. The present study validates an alternative measure, the Personal Wellbeing Index (PWI), which measures subjective wellbeing. An Australian ESKD sample (N = 172, Mean age = 64.04, SD = 14.82) completed the PWI as well as health-specific quality of life measures. The PWI was subject to confirmatory factor analysis, and a series of regressions and between-group comparisons were performed to reveal that it is a psychometrically appropriate measure for this sample. The PWI and health-specific measures each yield different and complementary results. Thus, the PWI is proposed as a complement to existing health-related quality of life tools, in order to broaden understanding of the patient’s subjective experience. The resulting profile is argued to better inform targeted interventions to improve the quality of life of people with ESKD.

9.
We present a new mathematical notion, the dissimilarity function, and based on it, a radical extension of Fechnerian Scaling, a theory dealing with the computation of subjective distances from pairwise discrimination probabilities. The new theory is applicable to all possible stimulus spaces subject to the following two assumptions: (A) that discrimination probabilities satisfy the Regular Minimality law and (B) that the canonical psychometric increments of the first and second kind are dissimilarity functions. A dissimilarity function Dab for pairs of stimuli in a canonical representation is defined by the following properties: (1) a ≠ b implies Dab > 0; (2) Daa = 0; (3) if Da_na′_n → 0 and Db_nb′_n → 0, then Da′_nb′_n - Da_nb_n → 0; and (4) for any sequence {a_nX_nb_n}, n = 1, 2, …, where each X_n is a chain of stimuli, Da_nX_nb_n → 0 implies Da_nb_n → 0. The expression DaXb refers to the dissimilarity value cumulated along successive links of the chain aXb. The subjective (Fechnerian) distance between a and b is defined as the infimum of DaXb + DbYa across all possible chains X and Y inserted between a and b.
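On a finite stimulus set the infimum over chains becomes a shortest-path computation, which makes the construction easy to illustrate. The dissimilarity matrix below is invented for illustration; the sketch only shows how cumulated dissimilarities are minimized over chains and then combined in both directions.

```python
from itertools import product

D = {  # D[x][y]: dissimilarity of y from x (need not be symmetric); made-up values
    "a": {"a": 0.0, "b": 0.9, "c": 0.4},
    "b": {"a": 0.3, "b": 0.0, "c": 0.5},
    "c": {"a": 0.4, "b": 0.2, "c": 0.0},
}

def oriented_distance(D, src, dst):
    """Smallest cumulated dissimilarity over all chains from src to dst
    (Floyd-Warshall-style relaxation over the finite set)."""
    G = {x: dict(D[x]) for x in D}
    for k, i, j in product(D, D, D):  # k varies slowest: standard relaxation order
        if G[i][k] + G[k][j] < G[i][j]:
            G[i][j] = G[i][k] + G[k][j]
    return G[src][dst]

def fechnerian_distance(D, a, b):
    # Symmetrized: cheapest chain a -> ... -> b plus cheapest chain b -> ... -> a.
    return oriented_distance(D, a, b) + oriented_distance(D, b, a)

print(fechnerian_distance(D, "a", "b"))  # 0.9: a->c->b (0.6) beats direct a->b (0.9), plus b->a (0.3)
```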

10.
Verbal phrases denoting uncertainty are of two kinds: positive, suggesting the occurrence of a target outcome, and negative, drawing attention to its nonoccurrence (Teigen & Brun, 1995). This directionality is correlated with, but not identical to, high and low p values. Choice of phrase will in turn influence predictions and decisions. A treatment described as having “some possibility” of success will be recommended, as opposed to when it is described as “quite uncertain,” even if the probability of cure referred to by these two expressions is judged to be the same (Experiment 1). Individuals who formulate their chances of achieving a successful outcome in positive terms are supposed to make different decisions than individuals who use equivalent, but negatively formulated, phrases (Experiments 2 and 3). Finally, negative phrases lead to fewer conjunction errors in probabilistic reasoning than do positive phrases (Experiment 4). For instance, a combination of 2 “uncertain” outcomes is readily seen to be “very uncertain.” But positive phrases lead to fewer disjunction errors than do negative phrases. Thus verbal probabilistic phrases differ from numerical probabilities not primarily by being more “vague,” but by suggesting more clearly the kind of inferences that should be drawn.

11.
In a recent paper, Chrobak and Zaragoza (Journal of Experimental Psychology: General, 142(3), 827–844, 2013) proposed the explanatory role hypothesis, which posits that the likelihood of developing false memories for post-event suggestions is a function of the explanatory function the suggestion serves. In support of this hypothesis, they provided evidence that participant-witnesses were especially likely to develop false memories for their forced fabrications when their fabrications helped to explain outcomes they had witnessed. In three experiments, we test the generality of the explanatory role hypothesis as a mechanism of eyewitness suggestibility by assessing whether this hypothesis can predict suggestibility errors in (a) situations where the post-event suggestions are provided by the experimenter (as opposed to fabricated by the participant), and (b) across a variety of memory measures and measures of recollective experience. In support of the explanatory role hypothesis, participants were more likely to subsequently freely report (E1) and recollect the suggestions as part of the witnessed event (E2, source test) when the post-event suggestion helped to provide a causal explanation for a witnessed outcome than when it did not serve this explanatory role. Participants were also less likely to recollect the suggestions as part of the witnessed event (on measures of subjective experience) when their explanatory strength had been reduced by the presence of an alternative explanation that could explain the same outcome (E3, source test + warning). Collectively, the results provide strong evidence that the search for explanatory coherence influences people’s tendency to misremember witnessing events that were only suggested to them.

12.
13.
Verbal probability phrases (e.g., "possible" or "doubtful") have a feature called "directionality" (Teigen & Brun, 1995), which focuses listeners on event occurrence or nonoccurrence. We conducted an experiment on certainty estimations based on verbal probabilities in order to examine the effect of directionality on perceived certainty. In measuring perceived certainty, we used a scale-based method, in which responses were made on a rating scale (e.g., a 101-point scale, 0 = unlikely to 100 = likely), and a numerical method, in which responses were given as numbers such as "50%." We found that, although the effect of directionality on perceived certainty was observed with the scale-based method, the effect disappeared when the numerical method was used. We discuss these results in terms of two types of information processing: intuitive, associative processing and deliberate, rule-based processing.

14.
15.
16.
The Sunk Costs Fallacy or Argument from Waste
This project tackles the problem of analyzing a specific form of reasoning called sunk costs in economics and argument from waste in argumentation theory. The project is to build a normative structure representing the form of the argument, and then to apply this normative structure to actual cases in which the sunk costs argument has been used. The method is partly structural and partly empirical. The empirical part is carried out through the analysis of case studies of the sunk costs argument found in business decision-making, as well as other areas like medical decision-making and everyday conversational argumentation. The structural part is carried out by using existing methods and techniques from argumentation theory, like argumentation schemes. The project has three especially significant findings. First, the sunk costs argument is not always fallacious, and in many cases it can be seen to be a rational precommitment strategy. Second, a formal model of argumentation, called practical reasoning, can be constructed that helps a rational critic to judge which sunk costs arguments are fallacious and which are not. Third, this formal model represents an alternative model of rationality to the cost-benefit model based on Bayesian calculation of probabilities. This alternative model is called the argumentation model, and it is based on interpersonal reasoning in dialogue as the model of rational thinking. This model in turn is based on the underlying notion of commitment in dialogue.

17.
A stochastic model of the calibration of subjective probabilities based on support theory (Rottenstreich & Tversky, 1997; Tversky & Koehler, 1994) is presented. This model extends support theory—a general representation of probability judgment—to the domain of calibration, the analysis of the correspondence between subjective and objective probability. The random support model can account for the common finding of overconfidence, and also predicts the form of the relationship between overconfidence and item difficulty (the “hard–easy effect”). The parameters of the model have natural psychological interpretations, such as discriminability between correct and incorrect hypotheses, and extremity of judgment. The random support model can be distinguished from other stochastic models of calibration by: (a) using fewer parameters, (b) eliminating the use of variable cutoffs by mapping underlying support directly into judged probability, (c) allowing validation of model parameters with independent assessments of support, and (d) applying to a wide variety of tasks by framing probability judgment in the integrative context of support theory.
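A minimal sketch of the support-theory core that the random support model builds on: the judged probability of a focal hypothesis is its support divided by the summed support of the focal and alternative hypotheses, with support treated as a random variable. The lognormal form and the parameter values here are illustrative assumptions, not the article's fitted model.

```python
import random
import statistics

def judged_probability(mu_focal, mu_alternative, sigma=0.5, rng=random):
    s_focal = rng.lognormvariate(mu_focal, sigma)      # random support for the focal hypothesis
    s_alt = rng.lognormvariate(mu_alternative, sigma)  # random support for the alternative
    return s_focal / (s_focal + s_alt)

random.seed(0)
# A larger gap between the support means plays the role of discriminability;
# a larger sigma yields more extreme (and potentially overconfident) judgments.
judgments = [judged_probability(mu_focal=0.6, mu_alternative=0.0) for _ in range(10_000)]
print(round(statistics.mean(judgments), 3))
```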

18.
Neither M. G. McGee (Developmental Review, 1981, 1, 289–295) nor M. J. Allen, M. A. Wittig, and K. Butler (Developmental Review, 1981, 1, 284–288) suggest any alternative explanation for our finding that water-level performance appears to have an X-linked genetic basis. The power calculations of Allen et al. are found to be faulty, and McGee confuses the hypothesis we tested with a weaker hypothesis. Although the X-linked genetic model is not an adequate model of water-level performance, the water-level data fit the X-linked model far better than the color blindness and HCN data that McGee presents as exemplars of X-linked characteristics.

19.
Drawing on the theory and research of psychophysics, a nonlinear model is hypothesized to explain the connection between education and income, on the one hand, and occupational prestige, on the other. To achieve this, Weber's (R. L. Gregory, 1981, Mind in Science, Cambridge, Cambridge Univ. Press, pp. 501–503) and Stevens' (S. S. Stevens, 1970, Science, 170, 1043–1050) laws are brought together in an intrinsically nonlinear model. Guided by the earlier work of R. L. Hamblin (1971, Sociometry, 34, 423–452) and others, the work of O. D. Duncan (1961, in A. J. Reiss, Jr., O. D. Duncan, P. K. Hatt, & C. C. North (Eds.), Occupations and Social Status, New York, Free Press) is reanalyzed to test the possibility that the socioeconomic index can be understood as a prestige allocation process that follows psychophysical principles. That is, prestige is assigned to occupations, given specifiable levels of educational and income attainment, in a manner parallel to the way in which individuals respond to changes in the intensity of other stimuli. Using first the data developed by Duncan (1961) to test the model and the 1963 NORC data (R. W. Hodge, P. M. Siegel, & P. H. Rossi, 1964, American Journal of Sociology, 70, 286–302) to replicate it, a measurement model consistent with the theoretical model is evaluated. Comparing the results of the nonlinear model with those of the linear model, it is concluded that the nonlinear model yields theoretical confirmation with no loss in predictive accuracy. The resulting nonlinear model also yields substantive implications concerning the relative influence of income and education on occupational prestige that differ from those inferred from linear models. Perhaps most important, however, is the candidacy these results give to psychophysics as the explanatory mechanism in the prestige allocation process.
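A sketch of the intrinsically nonlinear, Stevens-style form described above, in which prestige responds to education and income as a product of power functions rather than as a weighted sum. The exponents and scaling constant are hypothetical placeholders, not the reanalyzed estimates.

```python
def prestige(education, income, k=1.0, alpha=0.6, beta=0.4):
    """education and income are attainment levels on arbitrary positive scales."""
    return k * (education ** alpha) * (income ** beta)

# Under this form, equal *ratios* of education or income produce equal ratio
# changes in allocated prestige, in the spirit of Weber's and Stevens' laws;
# a linear model would instead treat equal absolute increments alike.
print(prestige(education=12, income=30_000))
print(prestige(education=16, income=60_000))
```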

20.
An empirical evaluation of temporal aspects of contextual conditioning was conducted in relation to Asratyan’s (1965) theory of transswitching and to an alternative explanation that was partly stimulated by the Rescorla-Wagner model (Rescorla & Wagner, 1972). On the basis of a human electrodermal conditioning preparation suggested by Kimmel and Ray (1978), five groups with 12 subjects each were run. The results indicated that the basic phenomena of transswitching are robust and could be replicated, but the Asratyan theory was rejected. All the results supported an alternative explanation: in contextual conditioning, duration of contextual stimuli is less important than order. Phasic switching is due to simultaneous occurrence of stimuli (differential compound conditioning) and is therefore compatible with the Rescorla-Wagner model. Tonic switching is due to signals that occur before a marked sequence of conditioning trials, which in part challenges the Rescorla-Wagner model. Long delays between critical events can perhaps be compensated for by mediating memory processes.
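For reference, the textbook Rescorla-Wagner update that the abstract uses as a benchmark: on every trial each present cue's associative strength moves toward the obtained outcome in proportion to the compound prediction error. The learning-rate value and the toy trial sequence are illustrative only.

```python
def rescorla_wagner_trial(V, present_cues, lam, alpha_beta=0.3):
    """V: dict of cue -> associative strength, updated in place and returned."""
    error = lam - sum(V[c] for c in present_cues)  # prediction error for the compound
    for c in present_cues:
        V[c] += alpha_beta * error
    return V

# Phasic switching as differential compound conditioning: the same CS (A) is
# reinforced in context X but not in context Y, so the contexts acquire
# opposite associative roles.
V = {"A": 0.0, "X": 0.0, "Y": 0.0}
for _ in range(50):
    rescorla_wagner_trial(V, ["A", "X"], lam=1.0)  # reinforced trials in context X
    rescorla_wagner_trial(V, ["A", "Y"], lam=0.0)  # non-reinforced trials in context Y
print({cue: round(v, 2) for cue, v in V.items()})
```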

