Similar Documents
Found 20 similar documents (search time: 62 ms)
1.
We present a new mathematical notion, dissimilarity function, and based on it, a radical extension of Fechnerian Scaling, a theory dealing with the computation of subjective distances from pairwise discrimination probabilities. The new theory is applicable to all possible stimulus spaces subject to the following two assumptions: (A) that discrimination probabilities satisfy the Regular Minimality law and (B) that the canonical psychometric increments of the first and second kind are dissimilarity functions. A dissimilarity function Dab for pairs of stimuli in a canonical representation is defined by the following properties: (1) a ≠ b ⟹ Dab > 0; (2) Daa = 0; (3) if Da′ₙaₙ → 0 and Db′ₙbₙ → 0, then Da′ₙb′ₙ − Daₙbₙ → 0; and (4) for any sequence {aₙXₙbₙ}, n ∈ ℕ, where Xₙ is a chain of stimuli, DaₙXₙbₙ → 0 ⟹ Daₙbₙ → 0. The expression DaXb refers to the dissimilarity value cumulated along successive links of the chain aXb. The subjective (Fechnerian) distance between a and b is defined as the infimum of DaXb + DbYa across all possible chains X and Y inserted between a and b.
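In a discrete (finite) stimulus space, the infimum over chains reduces to a shortest-path computation over the dissimilarity values. The sketch below (illustrative Python with invented dissimilarity values, not data from the paper) cumulates dissimilarities by Floyd-Warshall to obtain the quasimetric G* and then symmetrizes, G(a, b) = G*(a, b) + G*(b, a), in the manner the abstract describes:

```python
from itertools import product

def fechnerian_distances(D):
    """D[a][b] is a dissimilarity: nonnegative, zero iff a == b, and
    possibly asymmetric.  Returns the symmetrized Fechnerian distances
    G[a][b] = G*[a][b] + G*[b][a], where G* is the infimum of chain
    lengths (here: Floyd-Warshall shortest paths over the finite space)."""
    n = len(D)
    Gq = [row[:] for row in D]  # quasimetric G*, updated in place
    # product(..., repeat=3) iterates with k (the relay point) outermost.
    for k, i, j in product(range(n), repeat=3):
        if Gq[i][k] + Gq[k][j] < Gq[i][j]:
            Gq[i][j] = Gq[i][k] + Gq[k][j]
    return [[Gq[i][j] + Gq[j][i] for j in range(n)] for i in range(n)]

# Toy three-stimulus space with an asymmetric dissimilarity:
D = [[0.0, 0.5, 0.9],
     [0.2, 0.0, 0.3],
     [0.8, 0.4, 0.0]]
G = fechnerian_distances(D)
```

The resulting G is symmetric, vanishes only on the diagonal, and satisfies the triangle inequality even though D itself need not.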

2.
3.
A discrimination function shows the probability or degree with which stimuli are discriminated from each other when presented in pairs. In a previous publication [Kujala, J.V., & Dzhafarov, E.N. (2008). On minima of discrimination functions. Journal of Mathematical Psychology, 52, 116–127] we introduced a condition under which the conformity of a discrimination function with the law of Regular Minimality (which says, essentially, that “being least discriminable from” is a symmetric relation) implies the constancy of the function’s minima (i.e., the same level of discriminability of every stimulus from the stimulus least discriminable from it). This condition, referred to as “well-behavedness,” turns out to be unnecessarily restrictive. In this note we give a significantly more general definition of well-behavedness, applicable to all Hausdorff arc-connected stimulus spaces. The definition employs the notion of the smallest transitively and topologically closed extension of a relation. We provide a transfinite-recursive construction for this notion and illustrate it by examples.
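For a finite relation the transfinite recursion collapses: topological closure is trivial in the discrete case, and the smallest transitively closed extension is reached after finitely many stages of a fixed-point iteration. A minimal illustrative sketch (Python, toy relation):

```python
def transitive_closure(pairs):
    """Smallest transitive extension of a finite binary relation,
    computed as a fixed point; for finite relations the transfinite
    recursion described in the text terminates after finitely many
    stages."""
    R = set(pairs)
    while True:
        # One stage: add every pair obtainable by composing two pairs.
        new = {(a, c) for (a, b) in R for (b2, c) in R if b == b2} - R
        if not new:
            return R
        R |= new

R = transitive_closure({(1, 2), (2, 3), (3, 4)})
```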

4.
On the law of Regular Minimality: Reply to Ennis
Ennis's critique touches on issues important for psychophysics, but the points he makes against the hypothesis that Regular Minimality is a basic property of sensory discrimination are not tenable.

(1) Stimulus variability means that one and the same apparent stimulus value (as measured by the experimenter) is a probabilistic mixture of true stimulus values. The notion of a true stimulus value is a logical necessity: variability and distribution presuppose the values that vary and are distributed (even if these values are represented by processes or sets rather than real numbers). Regular Minimality is formulated for true stimulus values. That a mixture of probabilities satisfying Regular Minimality does not satisfy this principle (unless it also satisfies Constant Self-Similarity) is an immediate consequence of my 2003 analysis. Stimulus variability can be controlled or estimated: the cases when observed violations of Regular Minimality can be accounted for by stimulus variability corroborate rather than falsify this principle. In this respect stimulus variability is no different from fatigue, perceptual learning, and other factors creating mixtures of discrimination probabilities in an experiment.

(2) Could it be that well-behaved Thurstonian-type models are true models of discrimination but their parameters are so adjusted that the violations of Regular Minimality they lead to (due to my 2003 theorems) are too small to be detected experimentally? This is possible, but this amounts to admitting that Regular Minimality is a law after all, albeit only approximate: nothing in the logic of the Thurstonian-type representations per se prevents them from violating Regular Minimality grossly rather than slightly. Moreover, even very small violations predicted by a given class of Thurstonian-type models can be tested in specially designed experiments (perhaps under additional, independently testable assumptions). The results of one such experiment, in which observers were asked to alternately adjust to each other the values of stimuli in two observation areas, indicate that violations of Regular Minimality, if any, are far below the limits of plausible interpretability.

5.
The computation of subjective (Fechnerian) distances from discrimination probabilities involves cumulation of appropriately transformed psychometric increments along smooth arcs (in continuous stimulus spaces) or chains of stimuli (in discrete spaces). In a space where any two stimuli that are each other's points of subjective equality are given identical physical labels, psychometric increments are positive differences ψ(x,y) − ψ(x,x) and ψ(y,x) − ψ(x,x), where x ≠ y and ψ is the probability of judging two stimuli different. In continuous stimulus spaces the appropriate monotone transformation of these increments (called overall psychometric transformation) is determined uniquely in the vicinity of zero, and its extension to larger values of its argument is immaterial. In discrete stimulus spaces, however, Fechnerian distances critically depend on this extension. We show that if overall psychometric transformation is assumed (A) to be the same for a sufficiently rich class of discrete stimulus spaces, (B) to ensure the validity of the Second Main Theorem of Fechnerian Scaling in this class of spaces, and (C) to agree in the vicinity of zero with one of the possible transformations in continuous spaces, then this transformation can only be identity. This result is generalized to the broad class of “discrete-continuous” stimulus spaces, of which continuous and discrete spaces are proper subclasses.

6.
A Thurstonian-type model for pairwise comparisons is any model in which the response (e.g., “they are the same” or “they are different”) to two stimuli being compared depends, deterministically or probabilistically, on the realizations of two randomly varying representations (perceptual images) of these stimuli. The two perceptual images in such a model may be stochastically interdependent but each has to be selectively dependent on its stimulus. It has been previously shown that all possible discrimination probability functions for same–different comparisons can be generated by Thurstonian-type models of the simplest variety, with independent percepts and deterministic decision rules. It has also been shown, however, that a broad class of Thurstonian-type models, called “well-behaved” (and including, e.g., models with multivariate normal perceptual representations whose parameters are smooth functions of stimuli) cannot simultaneously account for two empirically plausible properties of same–different comparisons, Regular Minimality (which essentially says that “being least discriminable from” is a symmetric relation) and nonconstancy of the minima of discrimination probabilities (the fact that different pairs of least discriminable stimuli are discriminated with different probabilities). These results have been obtained for stimulus spaces represented by regions of Euclidean spaces. In this paper, the impossibility for well-behaved Thurstonian-type models to simultaneously account for Regular Minimality and nonconstancy of minima is established for a much broader notion of well-behavedness applied to a much broader class of stimulus spaces (any Hausdorff arc-connected ones). The universality of Thurstonian-type models with independent perceptual images and deterministic decision rules is shown (by a simpler proof than before) to hold for arbitrary stimulus spaces.
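The simplest variety mentioned above (independent percepts, deterministic decision rule) is easy to simulate. In this sketch all numeric parameters are illustrative assumptions, not values from the paper: each stimulus produces an independent Gaussian percept, and the response is “different” exactly when the percepts differ by more than a fixed criterion.

```python
import random

def discrimination_prob(x, y, sigma=0.3, criterion=0.5,
                        trials=20000, seed=1):
    """Monte-Carlo estimate of psi(x, y) under a toy Thurstonian-type
    model: independent Gaussian percepts around the stimulus values and
    a deterministic decision rule ('different' iff the percepts differ
    by more than the criterion)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        px = rng.gauss(x, sigma)  # percept of the first stimulus
        py = rng.gauss(y, sigma)  # independent percept of the second
        if abs(px - py) > criterion:
            hits += 1
    return hits / trials

psi_same = discrimination_prob(0.0, 0.0)  # self-discrimination is not 0
psi_diff = discrimination_prob(0.0, 1.0)
```

Note that even identical stimuli are judged “different” with nonzero probability, since the two independent percepts rarely coincide.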

7.
A discrimination function ψ(x,y) assigns a measure of discriminability to stimulus pairs x,y (e.g., the probability with which they are judged to be different in a same-different judgment scheme). If for every x there is a single y least discriminable from x, then this y is called the point of subjective equality (PSE) for x, and the dependence h(x) of the PSE for x on x is called a PSE function. The PSE function g(y) is defined in a symmetrically opposite way. If the graphs of the two PSE functions coincide (i.e., g = h⁻¹), the function is said to satisfy the Regular Minimality law. The minimum level functions are restrictions of ψ to the graphs of the PSE functions. The conjunction of two characteristics of ψ, (1) whether it complies with Regular Minimality, and (2) whether the minimum level functions are constant, has consequences for possible models of perceptual discrimination. By a series of simple theorems and counterexamples, we establish set-theoretic, topological, and analytic properties of ψ which allow one to relate to each other these two characteristics of ψ.
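For a finite stimulus set these two characteristics can be read directly off the matrix of ψ values. The sketch below (illustrative Python; the example matrix is invented) computes the PSE functions as row and column argminima, tests whether they are mutual inverses (Regular Minimality), and tests whether the minimum level functions are constant:

```python
def check_regular_minimality(psi):
    """Given a finite discrimination matrix psi[x][y], compute the PSE
    functions h (row argmin) and g (column argmin), then test
    (1) whether they are mutual inverses (Regular Minimality) and
    (2) whether the minimum levels are constant.  Assumes each row and
    column has a unique minimum."""
    n = len(psi)
    h = [min(range(n), key=lambda y: psi[x][y]) for x in range(n)]
    g = [min(range(n), key=lambda x: psi[x][y]) for y in range(n)]
    rm = all(g[h[x]] == x for x in range(n))
    minima = [psi[x][h[x]] for x in range(n)]
    return rm, len(set(minima)) == 1

# Regular Minimality holds while the minima are nonconstant: every
# row's minimum lies on the diagonal, but at different levels.
psi = [[0.1, 0.6, 0.9],
       [0.7, 0.2, 0.8],
       [0.9, 0.7, 0.3]]
rm, const = check_regular_minimality(psi)
```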

8.
The paper presents different representation theorems for the Bradley-Terry-Luce (BTL) models of Beaver and Gokhale and of Davidson and Beaver. In particular, algorithms that can be used in constructing BTL scales are provided. The uniqueness theorems show that the Davidson-Beaver model should be preferred to the Beaver-Gokhale model since the multiplicative order effect parameter is uniquely determined whereas the additive effect parameter is merely a ratio scale. Finally, a relationship to the simple BTL model is established. Let p(a, b) denote the probability that a is chosen when (a, b) is presented in a fixed order. Then the probabilities p(a, b) satisfy the Beaver-Gokhale model if and only if the balanced probabilities p_b(a, b) := ½(p(a, b) + 1 − p(b, a)) satisfy the simple BTL model.
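The balancing identity is easy to verify numerically. In the sketch below the Beaver-Gokhale probabilities are taken, as an illustrative assumption, to be simple BTL choice probabilities plus a constant additive order-effect term delta for the first-presented stimulus; balancing then cancels delta exactly:

```python
def balance(p_ab, p_ba):
    """Balanced probability p_b(a, b) = 1/2 * (p(a, b) + 1 - p(b, a))."""
    return 0.5 * (p_ab + 1.0 - p_ba)

# Hypothetical scale values and additive order effect (invented):
v = {"a": 2.0, "b": 1.0}
delta = 0.05

def p(x, y):
    """Illustrative Beaver-Gokhale-style choice probability:
    simple BTL plus a constant order-effect term."""
    return v[x] / (v[x] + v[y]) + delta

# Balancing cancels the additive order effect, recovering simple BTL:
pb = balance(p("a", "b"), p("b", "a"))
```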

9.
Two statistical problems prevent meaningful item analysis of Family Relations Test (FRT) data: (a) multiple assignment invalidates the assumption of independence and (b) low frequencies prevent computation of probabilities. It is suggested that: (a) multiple assignment be disallowed and (b) items be made more representative of family interactions.

10.
Dzhafarov [(2002). Multidimensional Fechnerian scaling: Pairwise comparisons, regular minimality, and nonconstant self-similarity. Journal of Mathematical Psychology, 46, 583-608] claims that Regular Minimality (RM) is a fundamental property of “same-different” discrimination probabilities and supports his claim with some empirical evidence. The key feature of RM is that the mapping, h, between two observation areas based on minimum discrimination probability is invertible. Dzhafarov [(2003a). Thurstonian-type representations for “same-different” discriminations: Deterministic decisions and independent images. Journal of Mathematical Psychology, 47, 184-204; (2003b). Thurstonian-type representations for “same-different” discriminations: Probabilistic decisions and interdependent images. Journal of Mathematical Psychology, 47, 229-243] also demonstrates that well-behaved Thurstonian models of “same-different” judgments are incompatible with RM and Nonconstant Self-Similarity (NCSS). There is extensive empirical support for the latter. Stimulus and neural sources of perceptual noise are discussed and two points are made:
Point 1: Models that require discrimination probabilities for noisy stimuli to possess the property that h is invertible would be too restrictive.
Point 2: In the absence of stimulus noise, violations of RM may be so subtle that their detection would be unlikely.

11.
Following the emergence of two four-member equivalence classes (A1B1C1D1 and A2B2C2D2), 5 students were exposed to a series of phases including a baseline conditional discrimination reversal (i.e., choosing D2 was reinforced and D1 punished given Sample A1; choosing D1 was reinforced and D2 punished given Sample A2), the delayed introduction of CD/DC transitivity/equivalence probes, DE conditional discrimination training, a second baseline conditional discrimination reversal (i.e., choosing C2 was reinforced given B1, etc.), and a return to original baseline reinforcement contingencies. Results showed that baseline and symmetry probe performances were extremely sensitive to baseline modifications. In contrast, patterns on transitivity/equivalence probes remained predominantly consistent with the originally established equivalence classes, although there were exceptions on some E probe relations for 2 subjects. The dissociation between baseline and symmetry versus transitivity/equivalence patterns may have important implications because it is not easily accounted for by current models of equivalence phenomena.

12.
A new definition of the perceptual separability of stimulus dimensions is given in terms of discrimination probabilities. Omitting technical details, stimulus dimensions are considered separable if the following two conditions are met: (a) the probability of discriminating two sufficiently close stimuli is computable from the probabilities with which one discriminates the projections of these stimuli on the coordinate axes; (b) the psychometric differential for discriminating two sufficiently close stimuli that differ in one coordinate only does not depend on the value of their matched coordinates (the psychometric differential is the difference between the probability of discriminating a comparison stimulus from a reference stimulus and the probability with which the reference is discriminated from itself). Thus defined perceptual separability is analyzed within the framework of the regular variation version of multidimensional Fechnerian scaling. The result of this analysis is that the Fechnerian metric of a stimulus space with perceptually separable dimensions has the structure of a Minkowski power metric with respect to these dimensions. The exponent of this metric equals the psychometric order of the stimulus space, or 1, whichever is greater.
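The concluding claim amounts to a two-line function. In this sketch the raw physical coordinates stand in for the appropriately calibrated dimensions, which is an illustrative simplification:

```python
def minkowski_power_metric(x, y, exponent):
    """Minkowski power metric with respect to separable dimensions:
    D(x, y) = (sum_i |x_i - y_i|**mu) ** (1/mu), where mu is the
    psychometric order of the space or 1, whichever is greater
    (mu >= 1 also guarantees the triangle inequality)."""
    mu = max(exponent, 1.0)
    return sum(abs(a - b) ** mu for a, b in zip(x, y)) ** (1.0 / mu)

d2 = minkowski_power_metric((0.0, 0.0), (3.0, 4.0), 2.0)  # Euclidean case
d1 = minkowski_power_metric((0.0, 0.0), (3.0, 4.0), 0.5)  # clipped to mu = 1
```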

13.
This paper uses a non-distributive system of Boolean fractions (a|b), where a and b are 2-valued propositions or events, to express uncertain conditional propositions and conditional events. These Boolean fractions, ‘a if b’ or ‘a given b’, ordered pairs of events, which did not exist for the founders of quantum logic, can better represent uncertain conditional information just as integer fractions can better represent partial distances on a number line. Since the indeterminacy of some pairs of quantum events is due to the mutual inconsistency of their experimental conditions, this algebra of conditionals can express indeterminacy. In fact, this system is able to express the crucial quantum concepts of orthogonality, simultaneous verifiability, compatibility, and the superposition of quantum events, all without resorting to Hilbert space. A conditional (a|b) is said to be “inapplicable” (or “undefined”) in those instances or models for which b is false. Otherwise the conditional takes the truth-value of proposition a. Thus the system is technically 3-valued, but the 3rd value has nothing to do with a state of ignorance, nor with some half-truth. People already routinely put statements into three categories: true, false, or inapplicable. As such, this system applies to macroscopic as well as microscopic events. Two conditional propositions turn out to be simultaneously verifiable just in case the truth of one implies the applicability of the other. Furthermore, two conditional propositions (a|b) and (c|d) reside in a common Boolean sub-algebra of the non-distributive system of conditional propositions just in case b = d, that is, their conditions are equivalent. Since all aspects of quantum mechanics can be represented with this near-classical logic, there is no need to adopt Hilbert space logic as ordinary logic, just a need perhaps to adopt propositional fractions to do logic, just as we long ago adopted integer fractions to do arithmetic.
The algebra of Boolean fractions is a natural, near-Boolean extension of Boolean algebra adequate to express quantum logic. While this paper explains one group of quantum anomalies, it nevertheless leaves the ‘influence-at-a-distance’ phenomena of quantum entanglement no less mysterious. A quantum realist must still embrace non-local influences to hold that “hidden variables” are the measured properties of particles. But that seems easier than imagining wave-particle duality and instant collapse, as offered by proponents of the standard interpretation of quantum mechanics. Partial support for this work is gratefully acknowledged from the In-House Independent Research Program and from Code 2737 at the Space & Naval Warfare Systems Center (SSC-SD), San Diego, CA 92152-5001. Presently this work is supported by Data Synthesis, 2919 Luna Avenue, San Diego, CA 92117.
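A minimal rendering of the three-valued evaluation and of the verifiability criterion stated above, checked over an explicit finite set of models (an illustrative sketch, not the paper's algebraic development):

```python
def conditional(a, b):
    """Evaluate the Boolean fraction (a|b) in one instance: inapplicable
    (None) when the condition b is false, otherwise the value of a."""
    return a if b else None

def simultaneously_verifiable(models, c1, c2):
    """Per the abstract, two conditionals are simultaneously verifiable
    just in case the truth of either implies the applicability of the
    other; here this is checked over an explicit finite set of models."""
    for m in models:
        v1 = conditional(*c1(m))
        v2 = conditional(*c2(m))
        if (v1 is True and v2 is None) or (v2 is True and v1 is None):
            return False
    return True

# Models assign truth values to two atomic events p and q.
models = [{"p": p, "q": q} for p in (True, False) for q in (True, False)]

# (p|q) and (q|q) share the condition q, so they are simultaneously
# verifiable; (p|p) and (q|q) are not.
same_cond = simultaneously_verifiable(
    models, lambda m: (m["p"], m["q"]), lambda m: (m["q"], m["q"]))
diff_cond = simultaneously_verifiable(
    models, lambda m: (m["p"], m["p"]), lambda m: (m["q"], m["q"]))
```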

14.
This experiment examined the relationship between reinforcer magnitude and quantitative measures of performance on progressive‐ratio schedules. Fifteen rats were trained under a progressive‐ratio schedule in seven phases of the experiment in which the volume of a 0.6‐M sucrose solution reinforcer was varied within the range 6–300 μl. Overall response rates in successive ratios conformed to a bitonic equation derived from Killeen's (1994) Mathematical Principles of Reinforcement. The “specific activation” parameter, a, which is presumed to reflect the incentive value of the reinforcer, was a monotonically increasing function of reinforcer volume; the “response time” parameter, δ, which defines the minimum response time, increased as a function of reinforcer volume; the “currency” parameter, b, which is presumed to reflect the coupling of responses to the reinforcer, declined as a function of volume. Running response rate (response rate calculated after exclusion of the postreinforcement pause) decayed monotonically as a function of ratio size; the index of curvature of this function increased as a function of reinforcer volume. Postreinforcement pause increased as a function of ratio size. Estimates of a derived from overall response rates and postreinforcement pauses showed a modest positive correlation across conditions and between animals. Implications of the results for the quantification of reinforcer value and for the use of progressive‐ratio schedules in behavioral neuroscience are discussed.

15.
This study investigated beliefs about gender discrimination in opportunities for promotion in organisations and their relation to gender and gender-focused ambivalent beliefs as measured, respectively, by the Ambivalent Sexism Inventory (ASI) and the Ambivalence toward Men Inventory (AMI) (Glick and Fiske, Ambivalent sexism. In M.P. Zanna (Ed.), Advances in experimental social psychology, 33: pp. 115-188, San Diego, CA: Academic, 2001a). These two inventories were administered to 225 students at Flinders University in Adelaide, Australia along with discrimination items concerning advantage, responsibility, guilt, and resentment about the advancement of men and women in the workplace. Results showed gender differences in discrimination beliefs and in the hostile and benevolent scales from the ASI and AMI. Gender differences and relations between these scales and the discrimination variables were interpreted in terms of system-justification, self and group interests, and the effects of values and beliefs about deservingness and entitlement. This study was supported by a grant from the Australian Research Council.

16.
Ariel Cohen, Studia Logica, 2008, 90(3): 369-383
Most solutions to the sorites reject its major premise, the quantified conditional. This rejection appears to imply a discrimination between two elements that are supposed to be indiscriminable. Thus, the puzzle of the sorites involves in a fundamental way the notion of indiscriminability. This paper analyzes this relation and formalizes it, in a way that makes the rejection of the major premise more palatable. The intuitive idea is that we consider two elements indiscriminable by default, i.e. unless we know some information that discriminates between them. Specifically, following Rough Set Theory, two elements are defined to be indiscernible if they agree on the vague property in question. Then, a is defined to be indiscriminable from b if a is indiscernible by default from b. That is to say, a is indiscriminable from b if it is consistent to assume that a and b agree on the relevant vague property. Indiscernibility by default is formalized with the use of Default Logic, and is shown to have intuitively desirable properties: it is entailed by equality, is reflexive and symmetric. And while the relation is neither transitive nor substitutive, it is “almost” substitutive. This definition of indiscriminability is incorporated into three major theories of vagueness, namely the supervaluationist, epistemic, and contextualist views. Each one of these theories is reduced to a different strategy dealing with multiple extensions in Default Logic, and the rejection of the major premise is shown to follow naturally. Thus, while the proposed notion of indiscriminability does not solve the sorites by itself, it does make the unintuitive conclusion of many of its proposed solutions—the rejection of the major premise—a bit easier to accept.
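The resulting relation can be mimicked on a finite toy example: represent the vague property as a partial assignment (True, False, or unknown), and call a indiscriminable from b when it is consistent to assume they agree. The sketch below (invented values, not the paper's Default Logic machinery) exhibits exactly the properties listed, a relation that is reflexive and symmetric but not transitive:

```python
def indiscriminable(a, b, prop):
    """a is indiscriminable from b iff it is consistent to assume that
    they agree on the vague property; prop maps elements to True, False,
    or None (unknown).  A toy rendering of 'indiscernibility by
    default' on a finite domain."""
    va, vb = prop[a], prop[b]
    return va is None or vb is None or va == vb

# A sorites-style series: early elements are clearly heaps, the last is
# clearly not, and the middle cases are unsettled.
heap = {0: True, 1: True, 2: None, 3: None, 4: False}
```

On these values 1 is indiscriminable from 2, and 2 from 4, yet 1 is discriminable from 4, which is the sorites pattern the abstract describes.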

17.
In a definition (∀x)((x ∈ r) ↔ D[x]) of the set r, the definiens D[x] must not depend on the definiendum r. This implies that all quantifiers in D[x] are independent of r and of (∀x). This cannot be implemented in traditional first-order logic, but can be expressed in IF logic. Violations of such independence requirements are what created the typical paradoxes of set theory. Poincaré’s Vicious Circle Principle was intended to bar such violations. Russell nevertheless misunderstood the principle; for him a set a can depend on another set b only if (b ∈ a) or (b ⊆ a). Likewise, the truth of an ordinary first-order sentence with the Gödel number of r is undefinable in Tarski’s sense because the quantifiers of the definiens depend unavoidably on r.

18.
Analogies must be symmetric. If a is like b, then b is like a. So if a has property R, and if R is within the scope of the analogy, then b (probably) has R. However, analogical arguments generally single out, or depend upon, only one of a or b to serve as the basis for the inference. In this respect, analogical arguments are directed by an asymmetry. I defend the importance of this neglected – even when explicitly mentioned – feature in understanding analogical arguments.

19.
Mirels (1976) demonstrated that derivations from Implicit Personality Theory were compromised by marked absolute discrepancies between subjects’ estimates of the coendorsement of personality items and the empirical relations between the items. Arguments presented by Jackson, Chan, and Stricker (1979) in a recent critique of this demonstration were shown to be based on an arbitrary and severely restrictive view of implicit theory. Logical and empirical considerations were brought to bear in the present paper to indicate that (a) empirical conditional probabilities of test item coendorsement are an appropriate comparison standard for estimates of those probabilities, (b) large absolute discrepancies between estimated and empirical coendorsement must be regarded as seriously impugning the accuracy of IPT, and (c) exclusive reliance on the correlational correspondence between estimated and empirical coendorsement results in an overly sanguine view of the accuracy of IPT. Moreover, it was shown that subjects fail to discriminate between highly asymmetrical conditional probabilities, a finding directly at variance with the assertion that the presumably veridical postulates of implicit theory are inductively extracted from experience. Also discussed were the relation between IPT and everyday social judgments, and the influence of IPT on behavioral ratings.

20.
This paper continues the development of the Dissimilarity Cumulation theory and its main psychological application, Universal Fechnerian Scaling [Dzhafarov, E.N., & Colonius, H. (2007). Dissimilarity Cumulation theory and subjective metrics. Journal of Mathematical Psychology, 51, 290-304]. In arc-connected spaces the notion of a chain length (the sum of the dissimilarities between the chain’s successive elements) can be used to define the notion of a path length, as the limit inferior of the lengths of chains converging to the path in some well-defined sense. The class of converging chains is broader than that of converging inscribed chains. Most of the fundamental results of the metric-based path length theory (additivity, lower semicontinuity, etc.) turn out to hold in the general dissimilarity-based path length theory. This shows that the triangle inequality and symmetry are not essential for these results, provided one goes beyond the traditional scheme of approximating paths by inscribed chains. We introduce the notion of a space with intermediate points, which generalizes the notion of a convex space in the sense of Menger (and specializes to it when the dissimilarity is a metric). A space is with intermediate points if for any distinct a, b there is a point m, different from both, such that Dam + Dmb ≤ Dab (where D is dissimilarity). In such spaces the metric G induced by D is intrinsic: Gab coincides with the infimum of the lengths of all arcs connecting a to b. In Universal Fechnerian Scaling D stands for either of the two canonical psychometric increments, ψ(a,b) − ψ(a,a) or ψ(b,a) − ψ(a,a) (ψ denoting discrimination probability). The choice between the two makes no difference for the notions of arc-connectedness, convergence of chains and paths, intermediate points, and other notions of the Dissimilarity Cumulation theory.
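Two of the notions above, chain length and intermediate points, are directly computable in a finite toy space (the dissimilarity values below are invented for illustration):

```python
def chain_length(D, chain):
    """Length of a chain: the sum of dissimilarities between its
    successive elements (no symmetry or triangle inequality assumed)."""
    return sum(D[a][b] for a, b in zip(chain, chain[1:]))

def has_intermediate_point(D, a, b):
    """Does some m distinct from a and b satisfy
    D(a, m) + D(m, b) <= D(a, b)?"""
    return any(D[a][m] + D[m][b] <= D[a][b]
               for m in range(len(D)) if m not in (a, b))

# Invented dissimilarities on a three-point space:
D = [[0.0, 0.5, 0.9],
     [0.2, 0.0, 0.3],
     [0.8, 0.4, 0.0]]

via = chain_length(D, [0, 1, 2])  # cumulate through stimulus 1
direct = D[0][2]                  # cumulating shortens the direct value
```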


Copyright©北京勤云科技发展有限公司  京ICP备09084417号