Paid full text: 19 items
Free full text: 0 items
By year: 2023 (1), 2020 (1), 2019 (1), 2013 (3), 2012 (1), 2011 (2), 2010 (1), 2009 (1), 2008 (1), 2007 (1), 2006 (1), 2005 (2), 1994 (1), 1993 (1), 1991 (1)
19 results in total.
1.
Twelve rats made repeated choices on an adjusting-delay schedule between a smaller reinforcer (A) that was delivered immediately after a response and a larger reinforcer (B) that was delivered after a delay which increased or decreased by 20% depending on the subject's choices in successive blocks of trials. In two phases of the experiment (100 sessions and 40 sessions), reinforcer sizes were selected which enabled theoretical parameters expressing the rate of delay discounting and sensitivity to reinforcer size to be estimated from the ratio of the indifference delays obtained in the two phases. Indifference delays, calculated from adjusting delays in the last 10 sessions of each phase, were shorter when the sizes of A and B were 14 and 25 μl of a 0.6 M sucrose solution than when they were 25 and 100 μl of the same solution. The ratio of the indifference delays was significantly smaller than that predicted on the basis of an assumed linear relation between reinforcer size and instantaneous reinforcer value, consistent with a previous proposal that this relation may be hyperbolic in form. Estimates of the rate of delay discounting based on the ratio of the two indifference delays (mean, 0.08 s⁻¹) were similar to values obtained previously using different intertemporal choice protocols. Estimates of the size-sensitivity parameter (mean, 113 μl) were similar to estimates recently derived from performance on progressive-ratio schedules. In both phases of the experiment, adjusting delays in successive blocks of trials were analyzed using the Fourier transform. The power spectrum obtained from individual rats had a dominant frequency that corresponded to a period of oscillation of the adjusting delay between 30 and 100 trial blocks (mean, 78). Power in the dominant frequency band was highest in the early sessions of the first phase and declined with extended training. It is suggested that this experimental protocol may have utility in neurobehavioral studies of intertemporal choice.
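As an illustration of the parameter-estimation logic summarized above, the sketch below assumes the multiplicative hyperbolic value model V = 1 / [(1 + Q/q)(1 + K·d)] commonly used in this research tradition; the model form, the observed delay ratio, and the observed indifference delay plugged in are illustrative assumptions, not the paper's data.

```python
# Hedged sketch: estimating Q (size sensitivity) and K (delay discounting rate) from the
# ratio of indifference delays in two phases, assuming V = 1 / [(1 + Q/q)(1 + K*d)].
# All numeric inputs below are hypothetical illustrations, not the paper's results.
from scipy.optimize import brentq

def indifference_delay(q_a, q_b, K, Q):
    """Delay to reinforcer B at which an immediate A and a delayed B are equally valued."""
    return ((1 + Q / q_a) / (1 + Q / q_b) - 1) / K

def estimate_Q(ratio_observed, sizes_phase1, sizes_phase2):
    """Solve for Q from the ratio of indifference delays; the ratio is independent of K."""
    def mismatch(Q):
        d1 = indifference_delay(*sizes_phase1, K=1.0, Q=Q)  # K cancels in the ratio
        d2 = indifference_delay(*sizes_phase2, K=1.0, Q=Q)
        return d1 / d2 - ratio_observed
    return brentq(mismatch, 1e-3, 1e4)

if __name__ == "__main__":
    phase1 = (14.0, 25.0)    # sizes of A and B in phase 1 (microlitres)
    phase2 = (25.0, 100.0)   # sizes of A and B in phase 2 (microlitres)
    Q = estimate_Q(ratio_observed=0.4, sizes_phase1=phase1, sizes_phase2=phase2)  # hypothetical ratio
    d2_observed = 20.0       # hypothetical phase-2 indifference delay (s)
    K = indifference_delay(*phase2, K=1.0, Q=Q) / d2_observed
    print(f"Q ~ {Q:.1f} ul, K ~ {K:.3f} per s")
```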
2.
Adaptive learning games should provide opportunities for the student to learn as well as motivate playing until goals have been reached. In this paper, we give a mathematically rigorous treatment of the problem in the framework of Bayesian decision theory. To quantify the opportunities for learning, we assume that the learning tasks that yield the most information about the current skills of the student, while being desirable for measurement in their own right, would also be among those that are efficient for learning. Indeed, optimization of the expected information gain appears to naturally avoid tasks that are exceedingly demanding or exceedingly easy as their results are predictable and thus uninformative. Still, tasks that are efficient for learning may be experienced as too challenging, and the resulting failures can lower motivation. Therefore, in addition to quantifying the expected informational benefit for learning of any prospective task to be presented next, we also model the expected motivational cost of its presentation, measured simply as the estimated probability of failure in our example model. We propose a “learner-friendly” adaptation algorithm that chooses the learning tasks by optimizing the expected benefit divided by the expected cost. We apply this algorithm to a Rasch-like student model implemented within a real-world application and present initial results of a pilot experiment.
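The selection rule the abstract describes (expected information gain divided by expected motivational cost, with cost taken as the estimated probability of failure) can be sketched for a discretized Rasch learner model as follows; the skill grid, prior, candidate difficulties, and function names are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of a "learner-friendly" task-selection step: expected information gain
# about the learner's skill, divided by the expected probability of failure.
import numpy as np

def rasch_p_correct(theta, b):
    """Rasch model: probability of success for skill theta on a task of difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def gain_and_cost(prior, grid, b):
    """Expected entropy reduction of the skill posterior, and expected failure probability."""
    p_hit = rasch_p_correct(grid, b)
    marg_hit = np.sum(prior * p_hit)                      # marginal success probability
    post_hit = prior * p_hit / marg_hit
    post_miss = prior * (1 - p_hit) / (1 - marg_hit)
    expected_post_entropy = marg_hit * entropy(post_hit) + (1 - marg_hit) * entropy(post_miss)
    return entropy(prior) - expected_post_entropy, 1 - marg_hit

def choose_task(prior, grid, difficulties, eps=1e-6):
    """Pick the candidate task maximizing expected benefit / expected cost."""
    scores = [g / (c + eps) for g, c in (gain_and_cost(prior, grid, b) for b in difficulties)]
    return difficulties[int(np.argmax(scores))]

if __name__ == "__main__":
    grid = np.linspace(-4, 4, 81)                         # discretized skill values
    prior = np.exp(-0.5 * grid ** 2); prior /= prior.sum()
    print(choose_task(prior, grid, difficulties=np.linspace(-3, 3, 13)))
```

The benefit-over-cost ratio steers selection toward tasks that are informative but not too likely to end in failure, matching the motivation given in the abstract.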
3.
We describe new Fourier- and shape-based methods for quantifying variation in phase-portraits, and re-analyze previously published ontogenetic and adult data [Clark, J. E., & Phillips, S. J. (1993). A longitudinal study of intralimb coordination in the first year of independent walking: A dynamical systems approach. Child Development, 64, 1143–1157]. Results show considerable variation between individuals and through development, but after 6 months of walking some gait patterns stabilize.
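One concrete way to realize a Fourier- and shape-based quantification of a phase portrait (an illustrative construction under stated assumptions, not necessarily the authors' algorithm) is to treat one gait cycle of the angle versus angular-velocity portrait as a closed complex contour and summarize it by normalized harmonic amplitudes.

```python
# Hedged sketch: Fourier shape descriptors of one closed phase-portrait cycle.
# The normalization choices here are illustrative assumptions.
import numpy as np

def fourier_shape_descriptors(angle, velocity, n_harmonics=8):
    """Translation-, scale-, and direction-invariant harmonic amplitudes of one cycle."""
    z = np.asarray(angle) + 1j * np.asarray(velocity)     # portrait as a complex contour
    z = z - z.mean()                                       # remove the centroid (translation)
    coeffs = np.fft.fft(z) / len(z)
    # combine the +k and -k harmonics so the descriptor ignores traversal direction
    amps = np.hypot(np.abs(coeffs[1:n_harmonics + 1]),
                    np.abs(coeffs[-1:-n_harmonics - 1:-1]))
    return amps / amps.max()                               # scale-normalize

if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
    angle = np.cos(t) + 0.2 * np.cos(2 * t)                # toy joint angle over one cycle
    velocity = -np.sin(t) - 0.4 * np.sin(2 * t)            # its time derivative
    print(np.round(fourier_shape_descriptors(angle, velocity), 3))
```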
4.
In scholarship on the history of philosophy, it is widely assumed that Charles Fourier was a utopian socialist who could not have exerted a significant influence on the development of Karl Marx's thought. Indeed, both Marx and Engels seem to have advanced this view. In contrast, I argue that in 1844 when Marx was developing his anthropology and social critique, he relied upon Fourier's thought to supply a key assumption. After establishing this connection, I explain why Marx's tacit reliance on Fourier creates a problematic undercurrent in his thought.
5.
Principles of quantitative electroencephalography (EEG) relevant to neurotherapy are reviewed. A brief history of EEG, the general properties of human EEG, and the issues and obstacles associated with quantitative methods are discussed. Fourier analysis is also described.
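As a concrete companion to the Fourier-analysis material such a review covers, here is a minimal relative band-power computation; the band limits, sampling rate, and Welch settings are common conventions assumed for illustration.

```python
# Hedged sketch: relative power in the classical EEG bands from a Welch (Fourier) spectrum.
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def relative_band_power(eeg, fs=256.0):
    """Each band's share of total 1-30 Hz power for one EEG channel."""
    f, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))        # 4-second windows
    total = psd[(f >= 1) & (f <= 30)].sum()
    return {name: psd[(f >= lo) & (f < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    print(relative_band_power(rng.standard_normal(60 * 256)))  # 60 s of noise as a stand-in
```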
6.
We propose a new psychometric model for two-dimensional stimuli, such as color differences, based on parameterizing the threshold of a one-dimensional psychometric function as an ellipse. The Ψ Bayesian adaptive estimation method applied to this model yields trials that vary in multiple stimulus dimensions simultaneously. Simulations indicate that this new procedure can be much more efficient than the more conventional procedure of estimating the psychometric function on one-dimensional lines independently, requiring one-fourth or less of the trials for equivalent performance in typical situations. In a real psychophysical experiment with a yes-no task, as few as 22 trials per estimated threshold ellipse were enough to consistently demonstrate certain color appearance phenomena. We discuss the practical implications of the multidimensional adaptation. In order to make the application of the model practical, we present two significantly faster algorithms for running the Ψ method: a discretized algorithm utilizing the Fast Fourier Transform for better scaling with the sampling rates and a Monte Carlo particle filter algorithm that should be able to scale into even more dimensions.
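A minimal sketch of the model class described above: a one-dimensional psychometric function whose threshold contour in the two-dimensional stimulus plane is an ellipse. The logistic link and the guess/lapse parameters below are assumptions for illustration; a Ψ-style procedure would then choose the next two-dimensional stimulus by maximizing expected information gain over the posterior on (a, b, φ).

```python
# Hedged sketch: a 2-D psychometric model whose threshold contour is an ellipse.
# Link function and parameter names are illustrative assumptions.
import numpy as np

def elliptical_distance(x, y, a, b, phi):
    """Normalized distance of stimulus (x, y) from the origin; equals 1 on the ellipse
    with semi-axes a, b rotated by angle phi."""
    xr = np.cos(phi) * x + np.sin(phi) * y
    yr = -np.sin(phi) * x + np.cos(phi) * y
    return np.sqrt((xr / a) ** 2 + (yr / b) ** 2)

def p_yes(x, y, a, b, phi, slope=4.0, gamma=0.02, lam=0.02):
    """P("different" response) for a 2-D stimulus difference (x, y) in a yes-no task."""
    d = elliptical_distance(x, y, a, b, phi)
    core = 1.0 / (1.0 + np.exp(-slope * np.log(d + 1e-12)))   # mid-point of the curve at d = 1
    return gamma + (1 - gamma - lam) * core                    # guess and lapse rates

if __name__ == "__main__":
    # Detection probability at a few points in the stimulus plane; a Psi-style procedure
    # would choose the next (x, y) by maximizing expected information gain.
    for pt in [(0.2, 0.0), (1.0, 0.0), (0.0, 2.0)]:
        print(pt, round(float(p_yes(*pt, a=1.0, b=2.0, phi=0.3)), 3))
```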
7.
In this article, some additive models of behavioral measures are defined, and distributional tests of them are proposed. Major theoretical results include (a) conditions for additivity of components to predict the highest level of dominance in a model-free stochastic dominance hierarchy, (b) methods of identifying the shape of the hazard rate function of an added component from certain relationships among the observable density and distribution functions, and (c) effects of stochastic dependence between components on the distributional tests. Feasibility and usefulness of the methods are demonstrated by application to choice RT and ratings experiments. The author was supported by grants MH44640 to Roger Ratcliff and AFOSR 90-0246 (jointly funded by NSF) to Gail McCoon. Parts of this work were presented at the European Mathematical Psychology meetings of 1991. My thanks to F. Gregory Ashby, Bruce Bloxom, Roger Ratcliff, W. Schwarz, Jim Townsend, and an anonymous reviewer for their many helpful suggestions.
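As one illustration of the kind of distributional consequence such additive models carry (a sketch under simulated data, not the article's specific test): if one condition's latency equals the other's plus a nonnegative component, its empirical distribution function should never exceed the faster condition's, which can be checked directly.

```python
# Hedged sketch: checking first-order stochastic dominance between two RT conditions,
# where the "slow" condition is the "fast" one plus a nonnegative added component.
# The simulated distributions are illustrative assumptions.
import numpy as np

def ecdf(sample, t):
    """Empirical CDF of `sample` evaluated at the points `t`."""
    sample = np.sort(sample)
    return np.searchsorted(sample, t, side="right") / len(sample)

def dominance_violation(fast_rts, slow_rts, n_points=200):
    """Largest amount by which F_slow exceeds F_fast (near 0 under dominance)."""
    t = np.linspace(min(fast_rts.min(), slow_rts.min()),
                    max(fast_rts.max(), slow_rts.max()), n_points)
    return float(np.max(ecdf(slow_rts, t) - ecdf(fast_rts, t)))

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    base = rng.gamma(shape=4.0, scale=80.0, size=5000) + 200   # baseline RTs (ms)
    added = rng.exponential(scale=60.0, size=5000)             # added serial component
    print("violation:", round(dominance_violation(base, base + added), 4))
```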
8.
J. O. Ramsay, Psychometrika, 1991, 56(4), 611–630
The option characteristic curve, the relation between ability and probability of choosing a particular option for a test item, can be estimated by nonparametric smoothing techniques. What is smoothed is the relation between some function of estimated examinee ability rankings and the binary variable indicating whether or not the option was chosen. This paper explores the use of kernel smoothing, which is particularly well suited to this application. Examples show that, with some help from the fast Fourier transform, estimates can be computed about 500 times as rapidly as when using commonly used parametric approaches such as maximum marginal likelihood estimation using the three-parameter logistic distribution. Simulations suggest that there is no loss of efficiency even when the population curves are three-parameter logistic. The approach lends itself to several interesting extensions. The author wishes to acknowledge the support of the Natural Sciences and Engineering Research Council of Canada through grant A320 and the support of Educational Testing Service during his leave there.
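A minimal sketch of the kernel-smoothing idea, using a Gaussian kernel on a rank-to-normal-quantile ability proxy; the bandwidth, grid, and rank transform are illustrative choices rather than Ramsay's exact settings, and the weighted-average step below is the computation his FFT trick accelerates.

```python
# Hedged sketch: Nadaraya-Watson kernel estimate of an option characteristic curve,
# regressing the binary "chose this option" indicator on a normal-quantile ability proxy.
import numpy as np
from scipy.stats import norm, rankdata

def option_characteristic_curve(total_scores, chose_option, grid=None, bandwidth=0.3):
    """Nonparametric estimate of P(choose option | ability) on an ability grid."""
    ranks = rankdata(total_scores) / (len(total_scores) + 1)   # ranks mapped into (0, 1)
    theta = norm.ppf(ranks)                                    # normal-quantile ability proxy
    if grid is None:
        grid = np.linspace(-2.5, 2.5, 51)
    # Gaussian kernel weights between every grid point and every examinee
    w = norm.pdf((grid[:, None] - theta[None, :]) / bandwidth)
    return grid, (w @ chose_option) / w.sum(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    ability = rng.standard_normal(2000)
    total_scores = ability + 0.5 * rng.standard_normal(2000)   # noisy proxy for ability
    chose = (rng.random(2000) < 1 / (1 + np.exp(-(ability - 0.2)))).astype(float)
    grid, occ = option_characteristic_curve(total_scores, chose)
    print(np.round(occ[::10], 2))
```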
9.
Summary. The two Heisenberg Uncertainties (UR) entail an incompatibility between the two pairs of conjugated variables E, t and p, q. But incompatibility comes in two kinds, exclusive of one another. There is incompatibility definable as (p → ¬q) & (q → ¬p), or definable as [(p → ¬q) & (q → ¬p)] ↔ r. The former kind is unconditional, the latter conditional. The former, in accordance, is fact independent, and thus a matter of logic; the latter is fact dependent, and thus a matter of fact. The two types are therefore diametrically opposed. In spite of this, however, the existing derivations of the Uncertainties are shown here to entail both types of incompatibility simultaneously. ΔE Δt ≥ h is known to derive from the quantum relation E = hν plus the Fourier relation Δν Δt ≥ 1. And the Fourier relation assigns a logical incompatibility between Δν = 0 and Δt = 0. (Defining a repetitive phenomenon at an instant t → 0 is a self-contradictory notion.) An incompatibility, therefore, which is fact independent and unconditional. How can one reconcile this with the fact that ΔE Δt ≥ h holds if and only if h > 0, which latter supposition is a factual truth, entailing that a ΔE = 0, Δt = 0 incompatibility should itself be fact dependent? Are we to say that E and t are unconditionally incompatible (via Δν Δt ≥ 1) on condition that E = hν is at all true? Hence, as presently standing, the UR express a self-contradicting type of incompatibility. To circumvent this undesirable result, I reinterpret E = hν as relating the energy with a period (though only one such period), and not with frequency literally. (It is false that E = ν. It is true that E = ν times the quantum.) In this way, the literal concept of frequency does not enter as before, rendering Δν Δt ≥ 1 inapplicable, so the above-noted contradiction disappears. Nevertheless, the Uncertainties are derived. If energy is only to be defined over a period, and momentum only over a distance (formerly a wavelength) resulting during such a period, thus yielding quantized action of dimensions Et = pq, then energies will become indefinite at instants, momenta indefinite at points, leading, as demanded, to (symmetric!) ΔE Δt = Δp Δq ≥ h.
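For reference, the derivation the abstract refers to can be written out in its own conventions (taking the Fourier bandwidth relation in the form Δν Δt ≥ 1):

```latex
% Sketch of the standard route from the Fourier relation to the energy-time uncertainty.
\begin{align*}
  E = h\nu \;&\Longrightarrow\; \Delta E = h\,\Delta\nu,\\
  \Delta\nu\,\Delta t \ge 1 \;&\Longrightarrow\;
  \Delta E\,\Delta t = h\,\Delta\nu\,\Delta t \ge h.
\end{align*}
```

The lower bound on ΔE Δt is thus inherited entirely from h > 0, the fact-dependent ingredient the author contrasts with the fact-independent Fourier constraint.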
10.
Since Berger's discovery of the electroencephalogram (EEG), its analysis has generally been restricted to the visual range (at most 100 Hz) and has ignored higher-frequency components. One reason may be that there are no reliable methods to distinguish brain potentials from muscle activity. We have introduced fluctuation analysis, a method widely used in basic physiology, into clinical electrophysiology. In our previous study, we showed that the power spectral density (PSD) of human high-frequency EEG is composed of two Lorentzians and decays to the white-noise level within 1 kHz. The purpose of this study is to describe "Automated Fluctuation Analysis," which enables evaluation of these higher-frequency components, and to examine their physiological meaning, focusing on the level of consciousness from wakefulness to sleep stage 1. Seventy-four scalp EEG recordings from twenty normal subjects were studied. In short, "Automated Fluctuation Analysis" consists of three steps: amplification of the EEG signal; A/D conversion and fast Fourier transform on a signal processor; and extraction of the Lorentzian parameters. The PSD of the high-frequency EEG was displayed on a log-log graph, and the fit to the following double-Lorentzian formula was computed with the algorithm of Brown & Dennis: S(f) = S1/[1 + (f/fc1)²] + S2/[1 + (f/fc2)²], where S(f) is the PSD (μV²/Hz) at each frequency f (Hz), S1 and S2 are the plateau levels (zero-frequency power) of the first and second Lorentzians, and fc1 and fc2 are their corner (half-power) frequencies, respectively. As a result, during wakefulness the PSD of high-frequency EEG activity was composed of two Lorentzian fluctuations, and the topographical distribution of S1 was frontally dominant. This frontal S1 pattern disappeared and S2 became lower during sleepiness, and the second Lorentzian disappeared during sleep.
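A minimal sketch of how such a double-Lorentzian fit could be reproduced with standard tools; the sampling rate, Welch settings, fitting band, starting values, and the generic least-squares routine (standing in for the Brown & Dennis algorithm cited above) are assumptions for illustration.

```python
# Hedged sketch: fitting S(f) = S1/[1+(f/fc1)^2] + S2/[1+(f/fc2)^2] to an EEG power spectrum.
# Sampling rate, fit band, and initial guesses are illustrative assumptions.
import numpy as np
from scipy.signal import welch
from scipy.optimize import curve_fit

def double_lorentzian(f, s1, fc1, s2, fc2):
    return s1 / (1.0 + (f / fc1) ** 2) + s2 / (1.0 + (f / fc2) ** 2)

def fit_psd(eeg, fs=2000.0):
    """Estimate (S1, fc1, S2, fc2) from one EEG channel."""
    f, psd = welch(eeg, fs=fs, nperseg=4096)
    band = (f > 1.0) & (f < 1000.0)                   # fit below ~1 kHz, above DC
    p0 = (psd[band][0], 10.0, psd[band][0] / 10.0, 200.0)
    params, _ = curve_fit(double_lorentzian, f[band], psd[band],
                          p0=p0, sigma=psd[band], maxfev=20000)
    return dict(zip(("S1", "fc1", "S2", "fc2"), params))

def ar1(n, rho, rng):
    """First-order autoregressive noise; its spectrum is approximately Lorentzian."""
    x = np.zeros(n)
    e = rng.standard_normal(n)
    for i in range(1, n):
        x[i] = rho * x[i - 1] + e[i]
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 60 * 2000                                     # 60 s at an assumed 2 kHz sampling rate
    fake_eeg = ar1(n, 0.99, rng) + 0.2 * ar1(n, 0.90, rng)   # two Lorentzian-like components
    print(fit_psd(fake_eeg))
```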