911.
James F. Woodward 《Synthese》2011,182(1):165-179
This paper provides a restatement and defense of the data/phenomena distinction introduced by Jim Bogen and me several decades ago (e.g., Bogen and Woodward, The Philosophical Review, 303–352, 1988). Additional motivation for the distinction is introduced, ideas surrounding the distinction are clarified, and an attempt is made to respond to several criticisms.
912.
We describe an ontology of philosophy that is designed to aid navigation through philosophical literature, including literature in the form of encyclopedia articles and textbooks and in both printed and digital forms. The ontology is designed also to serve integration and structuring of data pertaining to the philosophical literature, and in the long term also to support reasoning about the provenance and contents of such literature, by providing a representation of the philosophical domain that is oriented around what philosophical literature is about.
913.
Brigitte Falkenburg 《Synthese》2011,182(1):149-163
Depending on different positions in the debate on scientific realism, there are various accounts of the phenomena of physics. For scientific realists like Bogen and Woodward, phenomena are matters of fact in nature, i.e., the effects explained and predicted by physical theories. For empiricists like van Fraassen, the phenomena of physics are the appearances observed or perceived by sensory experience. Constructivists, however, regard the phenomena of physics as artificial structures generated by experimental and mathematical methods. My paper investigates the historical background of these different meanings of “phenomenon” in the traditions of physics and philosophy. In particular, I discuss Newton’s account of the phenomena and Bohr’s view of quantum phenomena, their relation to the philosophical discussion, and to data and evidence in current particle physics and quantum optics.
914.
Jaakko Hintikka 《Synthese》2011,183(1):69-85
The modern notion of the axiomatic method developed as a part of the conceptualization of mathematics starting in the nineteenth century. The basic idea of the method is the capture of a class of structures as the models of an axiomatic system. The mathematical study of such classes of structures is not exhausted by the derivation of theorems from the axioms but normally includes the metatheory of the axiom system. This conception of axiomatization satisfies the crucial requirement that the derivation of theorems from axioms does not produce new information in the usual sense of the term, called depth information. It can produce new information in a different sense of information, called surface information. It is argued in this paper that the derivation should be based on a model-theoretical relation of logical consequence rather than derivability by means of mechanical (recursive) rules. Likewise, completeness must be understood by reference to a model-theoretical consequence relation. A correctly understood notion of axiomatization does not apply to purely logical theories. In the latter, the only relevant kind of axiomatization amounts to recursive enumeration of logical truths. First-order “axiomatic” set theories are not genuine axiomatizations. The main reason is that their models are structures of particulars, not of sets. Axiomatization cannot usually be motivated epistemologically, but it is related to the idea of explanation.
915.
The distinction between data and phenomena introduced by Bogen and Woodward (Philosophical Review 97(3):303–352, 1988) was meant to help account for scientific practice, especially in relation to scientific theory testing. Their article and the subsequent discussion are primarily viewed as internal to philosophy of science. We shall argue that the data/phenomena distinction can be used much more broadly in modelling processes in philosophy.
916.
Cedric Paternotte 《Synthese》2011,183(2):249-276
Defined and formalized several decades ago, and widely used in philosophy and game theory, the concept of common knowledge is still considered problematic, although not always for the right reasons. I suggest that the epistemic status of a group of human agents in a state of common knowledge has not been thoroughly analyzed. In particular, every existing account of common knowledge, whether formal or not, is either too strong to fit cognitively limited individuals or too weak to adequately describe their state. I provide a realistic definition of common knowledge, based on a formalization of David Lewis’s seminal account, and show that it is formally equivalent to probabilistic common belief. This leads to a philosophical analysis of common knowledge which answers several common criticisms and sheds light on its nature.
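For readers unfamiliar with the formal notions the abstract names, the following is a standard background sketch (not the paper's own Lewisian formalization): common knowledge as the infinite conjunction of iterated "everyone knows" operators, and probabilistic common belief in the style of Monderer and Samet, where knowledge is weakened to belief with probability at least α.

```latex
% Standard iterated definition of common knowledge of a proposition p,
% where E is the mutual-knowledge operator ("everyone in the group knows"):
\[
  C(p) \;=\; \bigwedge_{n \ge 1} E^{n}(p),
  \qquad E^{1}(p) = E(p),\quad E^{n+1}(p) = E\bigl(E^{n}(p)\bigr).
\]
% Probabilistic common belief at level \alpha weakens E to B^{\alpha},
% "everyone believes p with probability at least \alpha":
\[
  C^{\alpha}(p) \;=\; \bigwedge_{n \ge 1} \bigl(B^{\alpha}\bigr)^{n}(p).
\]
```

The abstract's equivalence claim concerns a realistic (Lewis-style) definition and probabilistic common belief; the iterated form above is only the textbook baseline against which such accounts are usually compared.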
917.
Aidan Lyon 《Synthese》2011,182(3):413-432
Some have argued that chance and determinism are compatible in order to account for the objectivity of probabilities in theories that are compatible with determinism, like Classical Statistical Mechanics (CSM) and Evolutionary Theory (ET). Contrarily, some have argued that chance and determinism are incompatible, and so such probabilities are subjective. In this paper, I argue that both of these positions are unsatisfactory. I argue that the probabilities of theories like CSM and ET are not chances, but also that they are not subjective probabilities either. Rather, they are a third type of probability, which I call counterfactual probability. The main distinguishing feature of counterfactual probability is the role it plays in conveying important counterfactual information in explanations. This distinguishes counterfactual probability from chance as a second concept of objective probability.
918.
Over the past 20 years or so, a small but growing literature has emerged with the aim of modeling agents who are unaware of certain things. In this paper we compare two different approaches to modeling unawareness: the object-based approach of Board and Chung (Object-based unawareness: theory and applications. University of Minnesota, Mimeo, 2008) and the subjective-state-space approach of Heifetz et al. (J Econ Theory 130:78–94, 2006). In particular, we show that subjective-state-space models (henceforth HMS structures) can be embedded within object-based models (henceforth OBU structures), demonstrating that the latter are at least as expressive. As long as certain restrictions are imposed on the form of the OBU structure, the embedding can also go the other way. A generalization of HMS structures (relaxing the partitional properties of knowledge) gives us a full converse.
919.
Information protocols (IPs) were developed to describe players who learn their social situation through their experiences. Although IPs look similar to colored multi-graphs (MGs), the two objects are constructed in fundamentally different ways. IPs are constructed using the global concept of history, whereas graphs are constructed using the local concept of edges. We give necessary and sufficient conditions for each theory to be captured by the other. We find that the necessary and sufficient condition for IP theory to be captured by MG theory, which we call SE, excludes relevant game situations. Hence, we conclude that IP theory remains a vital tool and cannot be replaced by MG theory.
920.
Charlie Pelling 《Synthese》2011,178(3):437-459
According to the epistemic theory of hallucination, the fundamental psychological nature of a hallucinatory experience is constituted by its being ‘introspectively indiscriminable’, in some sense, from a veridical experience of a corresponding type. How is the notion of introspective indiscriminability to which the epistemic theory appeals best construed? Following M. G. F. Martin, the standard assumption is that the notion should be construed in terms of negative epistemics: in particular, it is assumed that the notion should be explained in terms of the impossibility that a hallucinator might possess a certain type of knowledge on a certain basis. I argue that the standard assumption is mistaken. I argue that the relevant notion of introspective indiscriminability is better construed in terms of positive epistemics: in particular, I argue that the notion is better explained by reference to the fact that it would be rational for a hallucinator positively to make a certain type of judgement, were that judgement made on a certain basis.