The architecture of complexity: A new blueprint |
| |
Authors: | Peter Turney |
| |
Affiliation: | (1) Department of Philosophy, University of Toronto, M5S 1A1 Toronto, Ontario, Canada |
| |
Abstract: | The logic of scientific discovery is now a concern of computer scientists, as well as philosophers. In the computational approach to inductive inference, theories are treated as algorithms (computer programs), and the goal is to find the simplest algorithm that can generate the given data. Both computer scientists and philosophers want a measure of simplicity, such that simple theories are more likely to be true than complex theories. I attempt to provide such a measure here. I define a measure of simplicity for directed graphs, inspired by Herbert Simon's work. Many structures, including algorithms, can be naturally modelled by directed graphs. Furthermore, I adapt an argument of Simon's to show that simple directed graphs are more stable and more resistant to damage than complex directed graphs. Thus we have a reason for pursuing simplicity, other than purely economic reasons. |
| |
Acknowledgments: | This paper is based on part of my doctoral dissertation. Thanks to my thesis supervisor, Professor Alasdair Urquhart, for his encouragement and constructive criticism, and for directing me to several relevant articles; to my advisor, Professor Ian Hacking, for reminding me to concentrate on results that might have some application in the real world; to Professor Eric Mendelsohn for checking my use of graph theory; and to my friend Wendy Brandts for sharing her ideas on a closely related problem. Thanks to my friends and family (the latter being a (proper) subset of the former, and both including Dr. Louise Linney, of course) for ongoing support in my scholastic endeavors. Thanks to the Social Sciences and Humanities Research Council for financial assistance (Awards 452-86-5885 and 453-87-0513). Thanks to the University of Toronto for financial assistance. |
| |
Keywords: | |