Similar Documents
20 similar documents found (search time: 31 ms)
1.
Principal components analysis of sampled functions (cited 3 times: 0 self-citations, 3 by others)
This paper describes a technique for principal components analysis of data consisting of n functions each observed at p argument values. This problem arises particularly in the analysis of longitudinal data in which some behavior of a number of subjects is measured at a number of points in time. In such cases information about the behavior of one or more derivatives of the function being sampled can often be very useful, as for example in the analysis of growth or learning curves. It is shown that the use of derivative information is equivalent to a change of metric for the row space in classical principal components analysis. The reproducing kernel for the Hilbert space of functions plays a central role, and defines the best interpolating functions, which are generalized spline functions. An example is offered of how sensitivity to derivative information can reveal interesting aspects of the data. This research was supported by Grant PA 0320 to the second author by the Natural Sciences and Research Council of Canada. We are grateful to the reviewers of an earlier version and to J. B. Kruskal and S. Winsberg for many helpful comments concerning exposition.
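As a minimal sketch of the setup (n curves sampled at p argument values, then principal components of the centered data matrix), the following simulates rank-2 functional data and recovers its dominant components with an SVD. This uses the ordinary identity metric only; the paper's derivative-based change of metric and reproducing-kernel interpolation are omitted, and the data are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)                  # p = 50 argument values
# n = 30 curves: random combinations of two underlying shapes, plus noise
scores = rng.normal(size=(30, 2))
shapes = np.vstack([np.sin(np.pi * t), t**2])
curves = scores @ shapes + 0.01 * rng.normal(size=(30, 50))

centered = curves - curves.mean(axis=0)    # subtract the mean function
# Classical PCA of the sampled functions via SVD of the centered matrix
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)            # proportion of variance per component
print(explained[:2].sum())                 # two components carry nearly all variance
```

With two true shapes and small noise, the first two components account for essentially all the variance; the derivative-sensitive metric of the paper would reweight this decomposition rather than change its mechanics.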

2.
Parallel factor analysis (PARAFAC) is a useful multivariate method for decomposing three-way data that consist of three different types of entities simultaneously. This method estimates trilinear components, each of which is a low-dimensional representation of a set of entities, often called a mode, to explain the maximum variance of the data. Functional PARAFAC permits the entities in different modes to be smooth functions or curves, varying over a continuum, rather than a collection of unconnected responses. The existing functional PARAFAC methods handle functions of a one-dimensional argument (e.g., time) only. In this paper, we propose a new extension of functional PARAFAC for handling three-way data whose responses are sequenced along both a two-dimensional domain (e.g., a plane with x- and y-axis coordinates) and a one-dimensional argument. Technically, the proposed method combines PARAFAC with basis function expansion approximations, using a set of piecewise quadratic finite element basis functions for estimating two-dimensional smooth functions and a set of one-dimensional basis functions for estimating one-dimensional smooth functions. In a simulation study, the proposed method appeared to outperform the conventional PARAFAC. We apply the method to EEG data to demonstrate its empirical usefulness.
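To make the trilinear model concrete, here is a plain (non-functional) PARAFAC fitted by alternating least squares on a small synthetic rank-2 tensor. The basis-expansion machinery of the proposed method is omitted; dimensions, rank, and data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
I, J, K, R = 6, 5, 4, 2

def khatri_rao(U, V):
    # Column-wise Kronecker product; row index (u, v) with v varying fastest
    return np.einsum('ur,vr->uvr', U, V).reshape(-1, U.shape[1])

# Synthetic rank-2 three-way array built from known factor matrices
A, B, C = (rng.normal(size=(d, R)) for d in (I, J, K))
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Alternating least squares: update each mode's loadings in turn
Ah, Bh, Ch = (rng.normal(size=(d, R)) for d in (I, J, K))
for _ in range(300):
    Ah = np.linalg.lstsq(khatri_rao(Bh, Ch),
                         X.reshape(I, -1).T, rcond=None)[0].T
    Bh = np.linalg.lstsq(khatri_rao(Ah, Ch),
                         X.transpose(1, 0, 2).reshape(J, -1).T, rcond=None)[0].T
    Ch = np.linalg.lstsq(khatri_rao(Ah, Bh),
                         X.transpose(2, 0, 1).reshape(K, -1).T, rcond=None)[0].T

Xhat = np.einsum('ir,jr,kr->ijk', Ah, Bh, Ch)
rel_err = np.linalg.norm(X - Xhat) / np.linalg.norm(X)
print(rel_err)
```

The functional extension replaces the free loading matrices with smooth functions expanded in finite element (2-D) or spline (1-D) bases, but the alternating estimation scheme has the same shape.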

3.
Norihiro Kamide. Studia Logica, 2005, 80(2–3): 265–289
A general Gentzen-style framework for handling both bilattice (or strong) negation and usual negation is introduced based on the characterization of negation by a modal-like operator. This framework is regarded as an extension, generalization or refinement of not only bilattice logics and logics with strong negation, but also traditional logics including classical logic LK, classical modal logic S4 and classical linear logic CL. Cut-elimination theorems are proved for a variety of proposed sequent calculi including CLS (a conservative extension of CL) and CLScw (a conservative extension of some bilattice logics, LK and S4). Completeness theorems are given for these calculi with respect to phase semantics, for SLK (a conservative extension and fragment of LK and CLScw, respectively) with respect to a classical-like semantics, and for SS4 (a conservative extension and fragment of S4 and CLScw, respectively) with respect to a Kripke-type semantics. The proposed framework allows for an embedding of the proposed calculi into LK, S4 and CL.

4.
We investigate M-trees, that is, trees with structure possible at each node or level. M is a mathematical structure such as a set or a Cartesian product. An extension of Pólya's theorem is proved which allows the number of M-trees for a given number of nodes to be counted. The special case of componential trees is investigated. Here M is a Cartesian product of 0's and 1's. A componential analysis is a componential tree of depth 1, that is, there is no hierarchy. We prove that for any componential tree there exists a componential analysis which makes the same predictions on a triad test of judged similarity. A brief empirical example is given, in which a componential tree is applied as a model of a semantic domain.

5.
The belief-bias effect is one of the most-studied biases in reasoning. A recent study of the phenomenon using the signal detection theory (SDT) model called into question all theoretical accounts of belief bias by demonstrating that belief-based differences in the ability to discriminate between valid and invalid syllogisms may be an artifact stemming from the use of inappropriate linear measurement models such as analysis of variance (Dube et al., Psychological Review, 117(3), 831–863, 2010). The discrepancy between Dube et al.'s (Psychological Review, 117(3), 831–863, 2010) results and the previous three decades of work, together with the former's methodological criticisms, suggests the need to revisit earlier results, this time collecting confidence-rating responses. Using a hierarchical Bayesian meta-analysis, we reanalyzed a corpus of 22 confidence-rating studies (N = 993). The results indicated that extensive replications using confidence-rating data are unnecessary as the observed receiver operating characteristic functions are not systematically asymmetric. These results were subsequently corroborated by a novel experimental design based on SDT's generalized area theorem. Although the meta-analysis confirms that believability does not influence discriminability unconditionally, it also confirmed previous results that factors such as individual differences mediate the effect. The main point is that data from previous and future studies can be safely analyzed using appropriate hierarchical methods that do not require confidence ratings. More generally, our results set a new standard for analyzing data and evaluating theories in reasoning. Important methodological and theoretical considerations for future work on belief bias and related domains are discussed.
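A minimal sketch of the SDT machinery at issue: from confidence-rating counts one builds a cumulative ROC for valid ("signal") versus invalid ("noise") syllogisms, and the slope of the z-transformed ROC indexes (a)symmetry. The counts below are invented, not data from the corpus; this is not the hierarchical Bayesian model itself.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical endorsement counts on a 1-6 confidence scale
# ("sure invalid" .. "sure valid") for valid and invalid syllogisms
valid   = np.array([ 5, 10, 15, 20, 60, 90])
invalid = np.array([60, 50, 40, 25, 15, 10])

# Cumulative hit and false-alarm rates, accumulating from the
# highest-confidence "valid" end of the scale
hits = np.cumsum(valid[::-1]) / valid.sum()
fas  = np.cumsum(invalid[::-1]) / invalid.sum()

# z-transformed ROC; a slope near 1 indicates a symmetric
# (equal-variance) ROC, the condition the meta-analysis examines
zh, zf = norm.ppf(hits[:-1]), norm.ppf(fas[:-1])
slope = np.polyfit(zf, zh, 1)[0]
print(slope)
```

The article's point is that such ROC-based quantities, unlike raw accuracy differences fed into ANOVA, do not confound response bias with discriminability.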

6.
PMETRIC is a computer program for the analysis of observed psychometric functions. It can estimate the parameters of these functions, using either probit analysis (a parametric technique) or the Spearman-Kärber method (a nonparametric one). For probit analysis, either a maximum likelihood or a minimum χ2 criterion may be used for parameter estimation. In addition, standard errors of parameter estimates can be estimated via bootstrapping. The program can be used to analyze data obtained from either yes-no or m-alternative forced-choice tasks. To facilitate the use of PMETRIC in simulation work, an associated program, PMETGEN, is provided for the generation of simulated psychometric function data. Use of PMETRIC is illustrated with data from a duration discrimination task.
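The maximum-likelihood probit criterion mentioned here can be sketched in a few lines: the psychometric function is a cumulative normal, and the binomial negative log-likelihood of the observed "yes" counts is minimized over its location and spread. This is an illustration, not the PMETRIC program; the stimulus levels and counts are made up.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical yes-no duration-discrimination data:
# stimulus levels, trials per level, and "yes" counts
x = np.array([1., 2., 3., 4., 5.])
n = np.full(5, 40)
k = np.array([2, 8, 20, 32, 38])

def nll(params):
    mu, sigma = params
    p = norm.cdf(x, mu, np.abs(sigma) + 1e-9)   # probit psychometric function
    p = np.clip(p, 1e-9, 1 - 1e-9)
    # Binomial negative log-likelihood of the observed counts
    return -np.sum(k * np.log(p) + (n - k) * np.log(1 - p))

# Maximum-likelihood probit fit (one of the two probit criteria described)
fit = minimize(nll, x0=[3.0, 1.0], method='Nelder-Mead')
mu_hat, sigma_hat = fit.x
print(mu_hat, abs(sigma_hat))
```

Bootstrapped standard errors, as in the program, would repeat this fit on binomially resampled counts.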

7.
Two groups of Ss received either two or 16 paired classical conditioning trials beyond the peak CR. A third group received the same stimuli as in the 16 postpeak condition but in an unpaired and random order. The stimuli in all three groups were delivered directly to S. Subsequently, all three groups, including a fourth which was not given any prior direct classical conditioning, were exposed to vicariously instigated classical conditioning. This consisted of having S observe someone (a model) employed by E who received the same CS as was delivered during direct conditioning. The CS was paired with the feigned arm movement of the model, simulating a reaction to shock. This vicarious classical conditioning procedure when compared to direct classical conditioning resulted in smaller GSR magnitudes for both the CRs and UCRs. Previous experience with direct classical conditioning seems to have had an attenuating effect on GSR magnitude during the vicarious situation. A postexperimental questionnaire tended to support the results, and the relationship between the present study and current classical conditioning theory is discussed.

8.
The security level models of Gilboa [1988. A combination of expected utility and maxmin decision criteria. Journal of Mathematical Psychology, 32, 405-420] and of Jaffray [1988. Choice under risk and the security factor: An axiomatic model. Theory and Decision, 24, 169-200] as well as the security and potential level model of Cohen [1992. Security level, potential level, expected utility: A three-criteria decision model under risk. Theory and Decision, 33, 101-104] and Essid [1997. Choice under risk with certainty and potential effects: A general axiomatic model. Mathematical Social Sciences, 34, 223-247] successfully accommodate classical Allais paradoxes while they offer an interesting explanation for their occurrence. However, experimental data suggest a systematic violation of these models when lotteries with low probabilities of bad or good outcomes are involved. In our opinion, one promising candidate for the explanation of these violations is the assumption of thresholds in the perception of security and potential levels. The present paper develops an axiomatic model that allows for such thresholds, so that the derived representation of preferences can accommodate the observed violations of the original security and potential level models.
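To give the flavor of a security-level representation, here is a deliberately crude lexicographic rule: a lottery whose worst outcome falls below a security threshold is ranked behind any lottery that meets the threshold, and expected utility breaks ties. This is an illustration of the general idea only, not the axiomatized representations of the cited papers; the utility function, threshold, and lotteries are arbitrary.

```python
# Illustrative security-level evaluation: security first, EU second.
def evaluate(lottery, u=lambda z: z, threshold=0.0):
    """lottery: list of (outcome, probability) pairs."""
    eu = sum(p * u(z) for z, p in lottery)
    worst = min(z for z, p in lottery if p > 0)
    # Lexicographic pair: does the lottery meet the security level,
    # and only then, how large is its expected utility?
    return (worst >= threshold, eu)

safe  = [(1.0, 0.5), (2.0, 0.5)]     # worst outcome 1.0 meets the threshold
risky = [(-1.0, 0.01), (5.0, 0.99)]  # higher EU, but violates security
print(max([safe, risky], key=evaluate))
```

A small probability of a bad outcome vetoes the high-EU lottery here, which is the Allais-style pattern these models accommodate; the paper's contribution is to let the security and potential levels themselves be perceived only up to thresholds.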

9.
A general one-way analysis of variance components with unequal replication numbers is used to provide unbiased estimates of the true and error score variance of classical test theory. The inadequacy of the ANOVA theory is noted and the foundations for a Bayesian approach are detailed. The choice of prior distribution is discussed and a justification for the Tiao-Tan prior is found in the particular context of the “n-split” technique. The posterior distributions of reliability, error score variance, observed score variance and true score variance are presented with some extensions of the original work of Tiao and Tan. Special attention is given to simple approximations that are available in important cases and also to the problems that arise when the ANOVA estimate of true score variance is negative. Bayesian methods derived by Box and Tiao and by Lindley are studied numerically in relation to the problem of estimating true score. Each is found to be useful and the advantages and disadvantages of each are discussed and related to the classical test-theoretic methods. Finally, some general relationships between Bayesian inference and classical test theory are discussed. Supported in part by the National Institute of Child Health and Human Development under Research Grant 1 PO1 HDO1762. Reproduction, translation, use or disposal by or for the United States Government is permitted.
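The classical starting point of the paper, the one-way variance-components ANOVA with unequal replication numbers, can be sketched directly. This shows only the unbiased ANOVA estimates (whose possible negativity motivates the Bayesian treatment), not the Bayesian posteriors; the scores are hypothetical.

```python
import numpy as np

# Hypothetical "n-split" data: replicate measurements per person,
# with unequal numbers of replications
scores = [np.array([10., 12., 11.]),
          np.array([14., 15.]),
          np.array([ 9., 10., 9., 10.]),
          np.array([13., 12., 14.])]

a = len(scores)
n_i = np.array([len(s) for s in scores], dtype=float)
N = n_i.sum()
grand = np.concatenate(scores).mean()
means = np.array([s.mean() for s in scores])

ss_within = sum(((s - s.mean())**2).sum() for s in scores)
ss_between = (n_i * (means - grand)**2).sum()
ms_within = ss_within / (N - a)            # unbiased error-score variance
n0 = (N - (n_i**2).sum() / N) / (a - 1)    # effective replication number
# ANOVA estimate of true-score variance; can come out negative,
# which is one problem the Bayesian approach addresses
var_true = (ss_between / (a - 1) - ms_within) / n0
reliability = var_true / (var_true + ms_within)
print(ms_within, var_true, reliability)
```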

10.
A recursive dynamic programming strategy is discussed for optimally reorganizing the rows and simultaneously the columns of an n × n proximity matrix when the objective function measuring the adequacy of a reorganization has a fairly simple additive structure. A number of possible objective functions are mentioned along with several numerical examples using Thurstone's paired comparison data on the relative seriousness of crime. Finally, the optimization tasks we propose to attack with dynamic programming are placed in a broader theoretical context of what is typically referred to as the quadratic assignment problem and its extension to cubic assignment. Partial support for this research was provided by NIJ Grant 80-IJ-CX-0061.
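A recursion of this kind can be sketched for one simple additive objective: choose a row/column ordering maximizing the sum of above-diagonal entries. Because the gain from placing an object last in a set depends only on the set, a dynamic program over subsets finds the optimum. The matrix values here are hypothetical, not Thurstone's data.

```python
from itertools import combinations

# Hypothetical asymmetric proximity matrix (e.g., paired-comparison dominances)
A = [[0, 3, 5, 2],
     [1, 0, 4, 1],
     [2, 3, 0, 1],
     [4, 5, 6, 0]]
n = len(A)

# best[S] = maximum total of A[i][j] over ordered pairs (i before j)
# achievable by some ordering of the objects in S
best = {frozenset(): 0.0}
choice = {}
for size in range(1, n + 1):
    for S in map(frozenset, combinations(range(n), size)):
        # If j is placed last within S, it collects A[i][j] from every earlier i
        val, j = max(((best[S - {j}] + sum(A[i][j] for i in S if i != j), j)
                      for j in S), key=lambda t: t[0])
        best[S], choice[S] = val, j

# Backtrack to recover the optimal ordering
order, S = [], frozenset(range(n))
while S:
    j = choice[S]
    order.append(j)
    S = S - {j}
order.reverse()
print(order, best[frozenset(range(n))])
```

The 2^n table is what distinguishes this from the intractable general quadratic assignment problem: additivity collapses the search over n! orderings to subsets.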

11.
A weighted Euclidean distance model for analyzing three-way dissimilarity data (stimuli by stimuli by subjects) for heterogeneous subjects is proposed. First, it is shown that INDSCAL may fail to identify a common space representative of the observed data structure in presence of heterogeneity. A new model that removes the rotational invariance of the classical multidimensional scaling problem and specifies K common homogeneous spaces is proposed. The model, called mixture INDSCAL in K classes, or briefly K-INDSCAL, still includes individual saliencies. However, the large number of parameters in K-INDSCAL may produce instability of the estimates and therefore a parsimonious model will also be discussed. The parameters of the model are estimated in a least-squares fitting context and an efficient coordinate descent algorithm is given. The usefulness of K-INDSCAL is demonstrated by both artificial and real data analyses.

12.
In nonexperimental data, at least three possible explanations exist for the association of two variables x and y: (1) x is the cause of y, (2) y is the cause of x, or (3) an unmeasured confounder is present. Statistical tests that identify which of the three explanatory models fits best would be a useful adjunct to the use of theory alone. The present article introduces one such statistical method, direction dependence analysis (DDA), which assesses the relative plausibility of the three explanatory models on the basis of higher-moment information about the variables (i.e., skewness and kurtosis). DDA involves the evaluation of three properties of the data: (1) the observed distributions of the variables, (2) the residual distributions of the competing models, and (3) the independence properties of the predictors and residuals of the competing models. When the observed variables are nonnormally distributed, we show that DDA components can be used to uniquely identify each explanatory model. Statistical inference methods for model selection are presented, and macros to implement DDA in SPSS are provided. An empirical example is given to illustrate the approach. Conceptual and empirical considerations are discussed for best-practice applications in psychological data, and sample size recommendations based on previous simulation studies are provided.
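Two of the three DDA components can be illustrated with a toy simulation (this is not the SPSS macros the article provides, and the effect size and distributions are arbitrary): when x causes y and x is skewed, the outcome is "more normal" than the cause, and the residuals of the mis-specified reverse regression inherit the nonnormality.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(2)
# Simulate the true causal model x -> y with a nonnormal (skewed) cause
x = rng.exponential(size=5000)
y = 0.7 * x + rng.normal(scale=0.5, size=5000)

# Component 1, observed distributions: the outcome is less skewed
# than the cause under x -> y with normal errors
print(abs(skew(x)), abs(skew(y)))

# Component 2, residual distributions of the competing models:
# the correct model's residuals are (approximately) normal,
# the reverse model's residuals remain skewed
b_xy = np.cov(x, y)[0, 1] / np.var(x)   # slope for the model x -> y
b_yx = np.cov(x, y)[0, 1] / np.var(y)   # slope for the model y -> x
res_correct = y - b_xy * x
res_wrong = x - b_yx * y
print(abs(skew(res_correct)), abs(skew(res_wrong)))
```

The third component (independence of predictor and residual) would be probed with nonlinear-correlation or HSIC-type tests, which need more machinery than fits here.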

13.
Frank Saunders Jr. Dao, 2014, 13(2): 215–229
In this paper, I examine the concept of truth in classical Chinese philosophy, beginning with a critical examination of Chad Hansen’s claim that it has no such concept. By using certain passages that emphasize analogous concepts in the philosophy of language of the Later Mohist Canons, I argue that while there is no word in classical Chinese that, for grammatical reasons, functions as truth generally does in Western philosophy, the Later Mohists were certainly working with a notion of semantic adequacy in which a language-to-world relationship is made an object of investigation. This challenges Hansen’s position that classical Chinese functions within a primarily pragmatic linguistic framework in which a language-to-user relationship determines the meaning of words.

14.
Quantitative opponent-colors theory is based on cancellation of redness by admixture of a standard green, of greenness by admixture of a standard red, of yellowness by blue, and of blueness by yellow. The fundamental data are therefore the equilibrium colors: the set A1 of lights that are in red/green equilibrium and the set A2 of lights that are in yellow/blue equilibrium. The result that a cancellation function is linearly related to the color-matching functions can be proved from more basic axioms, particularly, the closure of the set of equilibrium colors under linear operations. Measurement analysis treats this as a representation theorem, in which the closure properties are axioms and in which the colorimetric homomorphism has the cancellation functions as two of its coordinates. Consideration of equivalence relations based on opponent cancellation leads to a further step: analysis of equivalence relations based on direct matching of hue attributes. For additive whiteness matching, this yields a simple extension of the representation theorem, in which the third coordinate is luminance. For other attributes, precise representation theorems must await a better qualitative characterization of various nonlinear phenomena, especially the veiling of one hue attribute by another and the various hue shifts.

15.
The concept of scaffolding is generally invoked to refer to the ways in which a more expert individual assists a child by performing part of a task or by otherwise directing or supporting a child's task-related actions. A coactive systems model of development provides a framework for examining other ways in which person-environment relations may scaffold development. From a coactive systems view, the unit of analysis for understanding development is the coactive person-environment system. Within such a system, although individual actors exert control over their actions, thoughts and feelings, action is the product of coactions among each element of the system over time. From this view, coactive scaffolding refers to any process outside of an individual's direct control that functions to direct individual action toward novel or higher-order forms. Three broad categories (and subtypes) of coactive scaffolding are proposed and illustrated: ecological scaffolding, social scaffolding, and self-scaffolding.

16.
Michael Baumgartner. Synthese, 2014, 191(7): 1349–1373
A natural language argument may be valid in at least two nonequivalent senses: it may be interpretationally or representationally valid (Etchemendy in The concept of logical consequence. Harvard University Press, Cambridge, 1990). Interpretational and representational validity can both be formally exhibited by classical first-order logic. However, as these two notions of informal validity differ extensionally and first-order logic fixes one determinate extension for the notion of formal validity (or consequence), some arguments must be formalized by unrelated nonequivalent formalizations in order to formally account for their interpretational or representational validity, respectively. As a consequence, arguments must be formalized subject to different criteria of adequate formalization depending on which variant of informal validity is to be revealed. This paper develops different criteria that formalizations of an argument have to satisfy in order to exhibit the latter’s interpretational or representational validity.

17.
Luce's Axiom is interpreted in terms of a sequence of measures on the unit interval, and their limit properties are discussed. In particular, all limit laws are found to be either absolutely continuous with density proportional to x^α for α ∈ (−1, ∞) or else degenerate laws consisting of a point mass at 0 or 1. A close connection between Luce's choice theory and Karamata's theory of regularly varying functions is established and systematically used.
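For readers unfamiliar with the axiom itself, it is equivalent to choice probabilities arising from a ratio scale of positive values, which entails independence from irrelevant alternatives. A toy illustration with arbitrary scale values:

```python
# Luce's choice rule: each alternative carries a positive scale value v,
# and P(choose a from S) = v[a] / sum of v over S (values are arbitrary)
v = {'a': 2.0, 'b': 1.0, 'c': 1.0}

def p_choose(a, S):
    return v[a] / sum(v[x] for x in S)

# Independence from irrelevant alternatives: the a:b odds are the same
# whether or not c is available
print(p_choose('a', {'a', 'b'}),
      p_choose('a', {'a', 'b', 'c'}) / p_choose('b', {'a', 'b', 'c'}))
```

The paper's analysis concerns limits of such choice measures as the choice sets grow, not the rule itself, but the rule is the object all of its measures are built from.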

18.
19.
The published and unpublished research pertaining to Caldwell's Home Observation for Measurement of the Environment (HOME) inventory was recently reviewed by R. Elardo and R. H. Bradley (Developmental Review, 1981, 1, 113–145). Their review was unfortunately deficient in a number of respects. For example, certain methodological issues bearing on the interpretation of the available data were either unrecognized or dismissed without adequate consideration. In addition, some of the data in the studies cited which contradicted or clouded their conclusions were omitted. An extension and reanalysis of the conclusions reached by Elardo and Bradley in light of recently published studies which were not included in their review, and data not discussed in the studies which were reviewed, is provided. Methodological problems are highlighted, and directions for future research are suggested.

20.
Research on temporal-order perception uses temporal-order judgment (TOJ) tasks or synchrony judgment (SJ) tasks in their binary SJ2 or ternary SJ3 variants. In all cases, two stimuli are presented with some temporal delay, and observers judge the order of presentation. Arbitrary psychometric functions are typically fitted to obtain performance measures such as sensitivity or the point of subjective simultaneity, but the parameters of these functions are uninterpretable. We describe routines in MATLAB and R that fit model-based functions whose parameters are interpretable in terms of the processes underlying temporal-order and simultaneity judgments and responses. These functions arise from an independent-channels model assuming arrival latencies with exponential distributions and a trichotomous decision space. Different routines fit data separately for SJ2, SJ3, and TOJ tasks, jointly for any two tasks, or also jointly for the three tasks (for common cases in which two or even the three tasks were used with the same stimuli and participants). Additional routines provide bootstrap p-values and confidence intervals for estimated parameters. A further routine is included that obtains performance measures from the fitted functions. An R package for Windows and source code of the MATLAB and R routines are available as Supplementary Files.
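The core of the independent-channels assumption has a closed form that is easy to check by simulation (this is a sketch of the latency model only, not the MATLAB/R routines; rates and SOA are arbitrary): with exponential arrival latencies in each channel, the probability that the first stimulus arrives first is an exponential function of the SOA on either side of zero.

```python
import numpy as np

def p_first_before_second(soa, l1=1.0, l2=1.0):
    # Independent channels with exponential latencies (rates l1, l2):
    # probability that the stimulus presented first arrives first when
    # the second stimulus is delayed by `soa`. This arrival-order
    # probability underlies the model-based TOJ psychometric function.
    if soa >= 0:
        return 1.0 - (l2 / (l1 + l2)) * np.exp(-l1 * soa)
    return (l1 / (l1 + l2)) * np.exp(l2 * soa)

# Monte Carlo check of the closed form at one SOA
rng = np.random.default_rng(3)
soa = 0.4
sim = np.mean(rng.exponential(1.0, 100000) <
              soa + rng.exponential(1.0, 100000))
print(p_first_before_second(soa), sim)
```

The trichotomous decision space of the model then maps arrival-time differences onto "first", "simultaneous", and "second" responses via a resolution threshold, which adds interpretable parameters on top of this function.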


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号