Full-text access type
Paid full text | 44 articles |
Free full text | 3 articles |
Publication year
2022 | 1 article |
2021 | 1 article |
2020 | 2 articles |
2019 | 1 article |
2018 | 3 articles |
2017 | 1 article |
2015 | 1 article |
2014 | 1 article |
2013 | 3 articles |
2012 | 1 article |
2011 | 2 articles |
2010 | 1 article |
2009 | 2 articles |
2008 | 1 article |
2006 | 2 articles |
2005 | 3 articles |
2002 | 2 articles |
2001 | 2 articles |
2000 | 1 article |
1998 | 2 articles |
1997 | 1 article |
1992 | 2 articles |
1991 | 2 articles |
1990 | 1 article |
1989 | 1 article |
1988 | 1 article |
1987 | 1 article |
1986 | 1 article |
1984 | 1 article |
1981 | 1 article |
1979 | 2 articles |
Sorted by: 47 results found; search took 15 ms
41.
Multi-valued Calculi for Logics Based on Non-determinism  Total citations: 2 (self-citations: 0, by others: 2)
42.
43.
Learning a Generative Probabilistic Grammar of Experience: A Process‐Level Model of Language Acquisition
We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this manner takes the form of a directed weighted graph, whose nodes are recursively (hierarchically) defined patterns over the elements of the input stream. We evaluated the model in seventeen experiments, grouped into five studies, which examined, respectively, (a) the generative ability of the grammar learned from a corpus of natural language, (b) the characteristics of the learned representation, (c) sequence segmentation and chunking, (d) artificial grammar learning, and (e) certain types of structure dependence. The model's performance largely vindicates our design choices, suggesting that progress in modeling language acquisition can be made on a broad front—ranging from issues of generativity to the replication of human experimental findings—by bringing biological and computational considerations, as well as lessons from prior efforts, to bear on the modeling approach.
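The abstract describes a grammar that takes the form of a directed weighted graph, built incrementally from a token stream and used to generate new data. As a toy illustration only — far simpler than the authors' model, which learns recursively defined hierarchical patterns rather than bare token transitions — a weighted directed graph over adjacent tokens can be accumulated from a stream and then walked to generate new sequences:

```python
# A toy sketch (hypothetical, not the paper's system) of the core idea:
# incrementally accumulate a weighted directed graph over observed token
# transitions, then sample walks from it to generate new sequences.
import random
from collections import defaultdict

class StreamGraph:
    def __init__(self):
        # edge weights: count of observed transitions token -> next token
        self.edges = defaultdict(lambda: defaultdict(int))

    def observe(self, tokens):
        """Incrementally update edge weights from one input sequence."""
        for a, b in zip(tokens, tokens[1:]):
            self.edges[a][b] += 1

    def generate(self, start, length=5, rng=random):
        """Walk the graph from `start`, picking successors by weight."""
        out = [start]
        for _ in range(length - 1):
            successors = self.edges[out[-1]]
            if not successors:
                break  # dead end: no observed continuation
            toks, weights = zip(*successors.items())
            out.append(rng.choices(toks, weights=weights)[0])
        return out

g = StreamGraph()
g.observe("the cat sat on the mat".split())
g.observe("the dog sat on the rug".split())
print(g.generate("the", length=4))
```

The paper's system additionally collapses recurring subsequences into hierarchical pattern nodes; this sketch shows only the flat graph-and-weights substrate underneath that idea.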
44.
Maximality is a desirable property of paraconsistent logics, motivated by the aspiration to tolerate inconsistencies but at the same time retain as much of classical logic as possible. In this paper we introduce the strongest possible notion of maximal paraconsistency, and investigate it in the context of logics that are based on deterministic or non-deterministic three-valued matrices. We show that all reasonable paraconsistent logics based on three-valued deterministic matrices are maximal in our strong sense. This applies to practically all three-valued paraconsistent logics that have been considered in the literature, including a large family of logics developed by da Costa’s school. We then show that, in contrast, paraconsistent logics based on three-valued properly non-deterministic matrices are not maximal, except for a few special cases (which are fully characterized). However, these non-deterministic matrices are useful for representing in a clear and concise way the vast variety of (deterministic) three-valued maximally paraconsistent matrices. The corresponding weaker notion of maximality, called premaximal paraconsistency, captures the “core” of maximal paraconsistency of all possible paraconsistent determinizations of a non-deterministic matrix, thus representing what is really essential for their maximal paraconsistency.
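As background for the matrices the abstract discusses, a standard example of a paraconsistent logic based on a three-valued deterministic matrix is Priest's Logic of Paradox (LP). The sketch below (our illustration, not taken from the paper) encodes the LP matrix and verifies by brute force over valuations that explosion fails, i.e. that A and ¬A do not entail an arbitrary B:

```python
# A minimal sketch of a three-valued deterministic matrix: Priest's
# Logic of Paradox (LP). Truth values are true, "both", and false;
# the designated values are {true, both}.
from itertools import product

T, B, F = 1.0, 0.5, 0.0          # true, both, false
VALUES = (T, B, F)
DESIGNATED = {T, B}

def neg(x): return 1.0 - x       # neg(T)=F, neg(B)=B, neg(F)=T
def conj(x, y): return min(x, y)

def entails(premises, conclusion, atoms):
    """Semantic entailment: every valuation of `atoms` that makes all
    premises designated must also make the conclusion designated.
    Premises/conclusion are functions from a valuation dict to a value."""
    for vals in product(VALUES, repeat=len(atoms)):
        v = dict(zip(atoms, vals))
        if all(p(v) in DESIGNATED for p in premises) \
                and conclusion(v) not in DESIGNATED:
            return False
    return True

# Explosion fails: from A and not-A we cannot derive an arbitrary B,
# witnessed by the valuation v(A) = both, v(B) = false.
exploded = entails([lambda v: v['A'], lambda v: neg(v['A'])],
                   lambda v: v['B'], atoms=['A', 'B'])
print(exploded)  # False -> LP is paraconsistent
```

In a non-deterministic matrix, by contrast, a connective may map its arguments to a *set* of admissible values rather than a single one; the paper's result is that such matrices are (with fully characterized exceptions) not maximal, while deterministic ones like the matrix above are.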
45.
46.
Animal Cognition - Based on past experience, food-related cues can help foragers predict the presence and expected quality of food. However, when the food is already visible there is no need...
47.