John Collier | University of KwaZulu-Natal
Books by John Collier
Every Thing Must Go argues that the only kind of metaphysics that can contribute to objective knowledge is one based specifically on contemporary science as it really is, and not on philosophers' a priori intuitions, common sense, or simplifications of science. In addition to showing how recent metaphysics has drifted away from connection with all other serious scholarly inquiry as a result of not heeding this restriction, Ladyman and Ross demonstrate how to build a metaphysics compatible with current fundamental physics ('ontic structural realism'), which, when combined with their metaphysics of the special sciences ('rainforest realism'), can be used to unify physics with the other sciences without reducing these sciences to physics itself. Taking science metaphysically seriously, Ladyman and Ross argue, means that metaphysicians must abandon the picture of the world as composed of self-subsistent individual objects, and the paradigm of causation as the collision of such objects.
Every Thing Must Go also assesses the role of information theory and complex systems theory in attempts to explain the relationship between the special sciences and physics, treading a middle road between the grand synthesis of thermodynamics and information, and eliminativism about information. The consequences of the authors' metaphysical theory for central issues in the philosophy of science are explored, including the implications for the realism vs. empiricism debate, the role of causation in scientific explanations, the nature of causation and laws, the status of abstract and virtual objects, and the objective reality of natural kinds.
Papers by John Collier
Biological Theory (forthcoming 2013)
Evolutionary moral realism is the view that there are moral values with roots in evolution that are both specifically moral and exist independently of human belief systems. In beginning to sketch the outlines of such a view, we examine moral goods like fairness and empathetic caring as valuable and real aspects of the environments of species that are intelligent and social, or at least developing along an evolutionary trajectory that could lead to a level of intelligence that would enable individual members of the species to recognize and respond to such things as the moral goods they in fact are. We suggest that what is most morally interesting and important from a biological perspective is the existence and development of such trajectories, rather than the position of one particular species, such as our own, on one particular trajectory.
We argue that living systems process information such that functionality emerges in them on a continuous basis. We then provide a framework that can explain and model the normativity of biological functionality. In addition we offer an explanation of the anticipatory nature of functionality within our overall approach. We appeal to a Peircean approach to semiotics, and especially to Biosemiotics, as well as to a dynamical approach to Digital-Analog relations and the interplay between different levels of functionality in autonomous systems, taking an integrative approach. We then apply the underlying logic to a particular biological system, giving a model of the BCR signaling system, in order to demonstrate how biosemiotic concepts can be used to build an account of biological information and functionality. Next we show how this framework can be used to explain and model more complex aspects of biological normativity, for example, how cross-talk between different signaling pathways can be avoided. Overall, we describe a robust theoretical framework for the emergence of normative functions and, consequently, for the way information is transduced across several interconnected organizational levels in an autonomous system, and we demonstrate how this can be applied in real biological phenomena. Our aim is to open the way towards realistic tools for the modeling of information and normativity in autonomous biological agents.
It is generally agreed that organisms are Complex Adaptive Systems. Since the rise of Cybernetics in the middle of the last century, ideas from information theory and control theory have been applied to the adaptations of biological organisms in order to explain how they work. This does not, however, explain functionality, which is widely but not universally attributed to biological systems. There are two approaches to functionality, one based on etiology (what a trait was selected for), and the other based on autonomy. I argue that the etiological approach, as understood in terms of control theory, suffers from a problem of symmetry, by which function can equally well be placed in the environment as in the organism. Focusing on the autonomy view, I note that it can be understood to some degree in terms of control theory in its version called second order cybernetics. I present an approach to second order cybernetics, due to Hooker, Penfold and Evans, that seems plausible for organisms with limited computational power. They hold that this approach gives something like concepts, certainly abstractions from specific situations, a trait required for functionality in its system adaptive form (i.e., control of the system by itself). Using this cue, I argue that biosemiotics provides the methodology to incorporate these quasi-concepts into an account of functionality.
The paradigm of Laplacean determinism combines three regulative principles: determinism, predictability, and the explanatory adequacy of universal laws together with purely local conditions. Historically, it applied to celestial mechanics, but it has been expanded into an ideal for scientific theories whose cogency is often not questioned. Laplace’s demon is an idealization of mechanistic scientific method. Its principles together imply reducibility, and rule out holism and emergence. I will argue that Laplacean determinism fails even in the realm of planetary dynamics, and that it does not give suitable criteria for explanatory success except within very well defined and rather exceptional domains. Ironically, the very successes of Laplacean method in the Solar System were made possible only by processes that are not themselves tractable to Laplacean methodology. The results of some of these processes were first observed in 1964, and violate the Laplacean requirements of locality and predictability, opening the door to holism and nonreducibility, i.e., emergence. Despite the falsification of Laplacean methodology, the explanatory resources of holism and emergence remain in scientific limbo, though emergence has been used somewhat indiscriminately in recent scientific literature. I make some remarks at the end about the proper use of emergence in its traditional sense going back to C.D. Broad.
Cognition, Communication, Co-operation, Vol. 9, No. 2: 295-304, 2011
There are many different mathematical definitions of information that have their various uses, but I will be concerned with notions of information used in applications in various branches of science that are distinguished by their topic, i.e., what they apply to. I describe the major uses of information, and show their relations to each other. I will argue that the various uses form a nested hierarchy, in which each is a restriction on the previous, inheriting the properties of its predecessor, but adding new features that make it a special case. The lowest level is physical information determined by distinctions, and the highest is explicit representation in linguistic social communication. Is there anything common to information at all these levels? I will argue that there is, and that information in each case is what Donald MacKay (1969) called a distinction that makes a difference. What distinguishes the use of information at each level is what distinctions make a causal difference at that level. At each successive level, distinctions that make a difference at a previous level make no difference at that level. In order to create this sort of filter, new levels have to be formed by cohesion peculiar to the identifying characteristics at that level. A consequence of this view is that information must have causal powers, and that there is a tight connection between information and causation.
Almost fifty years ago Wilfrid Sellars described two competing ways of imagining the world, the Manifest Image and the Scientific Image. The Manifest Image is an idealization of common sense aided by critical philosophy, whereas the Scientific Image is the product of our best science. The methodologies of the two images are very different: the Manifest Image deals with experience and looks only at relations among bits of experience and analysis of experience into the relations that must lie behind it, whereas the Scientific Image is grounded in explanations of experience, typically causal explanations. This need not be a problem if the two images are compatible. Sellars argued, however, that the Manifest Image implies continuity, but the best science of the time told us (or appeared to tell us) that the world is made up of discrete subatomic particles and discrete transitions between quantum states, making the two incompatible. Although Sellars noted that future science might show that the world is continuous, he did not follow this up. Science in the last fifty years has given much more evidence for continuity in the world from complexity studies and Quantum Mechanics, so perhaps the two images can be reconciled after all.
Open Systems and Information Dynamics, 6 (1999): 241-302.
Both natural and engineered systems are fundamentally dynamical in nature: their defining properties are causal, and their functional capacities are causally grounded. Among dynamical systems, an interesting and important sub-class are those that are autonomous, anticipative and adaptive (AAA). Living systems, intelligent systems, sophisticated robots and social systems belong to this class, and the use of these terms has recently spread rapidly through the scientific literature. Central to understanding these dynamical systems is their complicated organisation and their consequent capacities for re- and self-organisation. But there is at present no general analysis of these capacities or of the requisite organisation involved. We define what distinguishes AAA systems from other kinds of systems by characterising their central properties in a dynamically interpreted information theory.
Consideration of an example of successful reference gives rise to two important insights. The first is that reference should be understood most fundamentally in terms of the pragmatic success of each individual utterance. The second is that linguistic conventions need to be understood as on a par with the non-linguistic regularities that competent language users rely upon to refer. Syntax and semantics are part of what Barwise and Perry (1983) call the context of the utterance, contributing to the pragmatics of the utterance. We show why reference should be understood in pragmatic terms and point out that, since success is often achieved in non-standard, creative ways, any formalization of pragmatics can only be partial. We show that the need for such an inventive approach to referring traces back to the need for language to be highly efficient, with expressions underdetermining their interpretation. Our second step is to argue that the semantic and syntactic regularities, which might seem to be independent of the context of an utterance, should actually be understood as also being part of that context. In doing so, our account spells out some of the possible implications of Millikan’s (1998) account of conventions and how it makes the creative use of language possible.
Logic, Epistemology and Philosophy of Science, …, Jan 1, 1987
I argue that Hanson, Kuhn, Feyerabend and Churchland are correct in thinking that observation is theory laden in a way which infects our choice of theories. Although I agree with critics of theory ladenness that theory independent observation is possible, I hold that evaluation of the evidential significance of observations for a theory must rely on the conceptual resources of the theory. Empirical evidence is either not interpreted, and is not rich enough to guide theory choice, or else it is rich enough to guide theory choice, and its significance depends on the theory for which it is evidence. Either we avoid theory ladenness and severely limit science, or else we accept theory ladenness and its consequences. Given a preference for a non-trivial science, we must ask how the difficulties theory ladenness presents for the objective evaluation of theories can be alleviated. I infer some lessons from the analogy of perception.
Although the complexity of biological systems and subsystems like DNA and various transcription and translation pathways is of interest in itself, organization is of fundamental importance to understanding biological systems. It would be convenient to have a general definition of organization applicable to biological systems. I propose that C.H. Bennett’s notion of logical depth is a suitable candidate. I discuss the problems with using complexity measures alone, and then the relations between logical depth and algorithmic complexity. Last, I give some examples in which depth gives a better measure of what might naively be taken by biologists to be complexity in biological systems, and then argue that this must be augmented by consideration of dynamical processes.
In B. Brown, K. de Laplante and K. Peacock, Philosophy of Ecology. (Dordrecht: North-Holland), 2011
The issue of ecosystem individuation is of both theoretical and practical importance. Ecosystems are dynamical systems, so a dynamical account of ecosystems is more appropriate than a static definition. Dynamical definitions are also more useful if we want to study ecosystem change and the possible limits of that change. A dynamical account is especially useful for ecosystem management and intervention, since, aside from the issue of matching management scale with ecosystem scale, these are dynamical interactions themselves, and their dynamics must be incorporated into the existing ecosystem dynamics. Because ecosystems are typically complexly organized, and thus not subject to one grand model, it is useful to develop a number of working models that can be applied in specific cases as appropriate. In many cases more than one model or metamodel will apply, and different models can be used to constrain each other, especially in cases where ecosystems skirt the borders of specific metamodels.
The best management model for self-organized complex social systems encourages diversity with minimal top-down control.
Any top-down control should work with natural properties of the system.
The outcome cannot be determined in advance.
I call this facilitation.
Dynamical systems theory applies to anything that changes with time. In mathematics this is interpreted rather broadly, but in physics, and often in other sciences, it applies to systems with forces and flows, often in a network, that are typically open to exchanges with the outside. This makes it well suited to the study of ecosystems. Ecosystems are not only open to outside influences, but are often nested by scale in space and time. One of the first problems in discussing ecosystem function, then, is to give a definition of ecosystem individuation and its consequences. One of the consequences is that it is reasonable to define functionality within an ecosystem in terms of contributions to the maintenance of this individuation, as I have done elsewhere for organisms, using a dynamical notion of autonomy. I will briefly argue that common etiological accounts of function are not suitable for discussing ecosystem function. We don’t typically think of ecosystems as autonomous, but autonomy comes in degrees, so even if the word is not apt, the idea is. I will distinguish between ecosystem role in general and functionality in particular. Ecosystem role, which is sometimes identified with function, can actually undermine ecosystem functionality. I will also distinguish between ecosystem functions and ecosystem services. The latter serve some larger or separate systems (whence again the importance of individuation). They are important for understanding how nested ecosystems are related to each other through functional dependence.
This course will have two parts. The first part will deal generally with complex systems, specifically complexly organized systems of the kind that are found widely in biology. The issues dealt with will be complex systems and their properties, methods of dealing with complex systems, how to individuate systems in a way that allows testing, emergent properties and entities, hierarchy, autonomy and functionality. These topics will then be applied to ecological systems, respecting their peculiar nature. In particular, individuation in ecological systems is less sharp than for, say, organisms, and their stabilizing properties are less strict than the autonomy we find in organisms. This means that the notion of ecological functionality is less clear, and it must be carefully distinguished from candidate properties like ecological role, ecological services and biodiversity. In many cases the ecological functions within an ecosystem are rather abstract, and individual entities (such as predator and prey roles) play only an indirect function. A proper understanding of ecosystem function will also help towards understanding ecosystem robustness, resilience to perturbations and, to some extent, ecosystem management.
I have provided some background readings for each section. The required readings will be a much smaller set, given for each class.
MDPI Proceedings, 2017
Information is usually in strings, like sentences, programs or data for a computer. Information is a much more general concept, however. Here I look at information systems that can be three or more dimensional, and examine how such systems can be arranged hierarchically so that each level has an associated entropy due to unconstrained information at a lower level. This allows self-organization of such systems, which may be biological, psychological, or social.
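The central quantity here, the entropy associated with unconstrained lower-level information, can be made concrete with a short sketch (my illustration, not code from the paper; the function name `shannon_entropy` and the toy sequences are my own). A higher-level description that leaves the lower-level states equiprobable has maximal entropy per symbol; one that fully constrains them has zero.

```python
import math
from collections import Counter

def shannon_entropy(samples, base=2):
    """Shannon entropy of the empirical distribution of `samples`,
    in units determined by `base` (bits for base 2)."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log(c / n, base) for c in counts.values())

# Four equiprobable lower-level states: nothing is constrained,
# so each symbol carries the maximum of log2(4) = 2 bits.
unconstrained = list("abcd" * 25)

# A fully constrained sequence: the higher level fixes every
# lower-level state, leaving no residual entropy.
constrained = list("a" * 100)

print(shannon_entropy(unconstrained))
print(shannon_entropy(constrained))
```

On this toy picture, the entropy a level contributes is just the entropy of whatever its description leaves undetermined below it.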
MDPI Proceedings 2017, 1, 68; IS4SI 2017 Summit: Digitalisation for a Sustainable Society, Gothenburg, Sweden, 12–16 June 2017
Dimensional analysis is a technique used by scientists and engineers to check the rationality of ... more Dimensional analysis is a technique used by scientists and engineers to check the
rationality of their calculations, but it can also be used to determine the nature of the quantities used.
Information is usually measured in bits, or binary digits, but it could be measured using any other
base. I will be arguing that, given the possibility of an objective measure of information in terms of
asymmetries, and the relation of information to order, Schrödinger’s suggestion that negentropy
was an appropriate measure should be taken seriously. After clarifying this notion, I use
dimensional analysis to show that negentropy has units of degrees of freedom, and that this is a
sensible unit of information.
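A minimal sketch (my own illustration, not from the paper) of the base-dependence the abstract mentions: Shannon entropy measured in different bases only rescales the same underlying count of distinguishable alternatives, which is why the choice of bits is conventional rather than fundamental.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy of a distribution; base 2 gives bits, base e gives nats."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A uniform distribution over N states has entropy log_base(N):
# the number of base-b digits needed to single out one state.
uniform8 = [1 / 8] * 8
print(shannon_entropy(uniform8, base=2))  # 3.0 (bits)
print(shannon_entropy(uniform8, base=8))  # 1.0 (one octal digit)

# Changing base multiplies the measure by a constant; the underlying
# number of independently specifiable alternatives is base-independent.
```

This base-invariance is what makes a unit like "degrees of freedom" a candidate for a measure that does not privilege binary encoding.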
Dynamical systems theory applies to anything that changes with time. In mathematics this is interpreted rather broadly, but in physics, and often in other sciences, it applies to systems in which there are forces and flows, often in a network. In my work with Cliff Hooker on dynamical approaches to mind and cognition we adopt a general approach to the world that we call Dynamical Realism. We used this to develop ideas of unity and individuation, emergence, functionality and intentionality. Our basic working hypothesis is that anything that is real is dynamical, or can be understood dynamically. In some respects this is pretty much trivial: any system is a dynamical system. Nonetheless, dynamical realism imposes some discipline through its implication that things should be viewed dynamically in order to understand what they are really like. I will start with a description of some of the features of dynamical systems, including the intractability of complexly organized systems, and indicate how we might deal with such systems. I then sketch how these ideas can be applied to the issues of individuation, emergence, functionality and intentionality. The last two are explained in terms of a sense of autonomy that was first identified by Kant, who proposed that we need a different notion of causation. In particular, autonomous systems require 1) non-equilibrium conditions, 2) internal dynamical differentiation, 3) hierarchical and interactive process organization, 4) incomplete closure, 5) openness to the world, and 6) openness to infrastructural inputs; 7) the existence of autonomy is identical to the corresponding process closure, and is not something complementary or over and above this closure. I end with some implications for dealing with mind.
I start with a brief summary of kinds of information used in science, showing how they are nested (or hierarchical), with inner kinds inheriting properties of the outer kinds. I further argue that within each kind there is also hierarchical organization, and that the major kinds are distinguished by their dynamics, not just being ordered in a hierarchy. Next I argue that rules similar to those of non-equilibrium thermodynamics apply also to information systems, and give some examples of resulting self-organization, or what we have called “rhythmic entrainment” [1]. I point out that entrainment that results from properties within a system is more efficient than entrainment imposed by outside forces. This also gives a sort of resilience to such systems, and in higher kinds of information allows for self-adaptation via accommodating both external forces and internally generated forces. I then apply these lessons to management and argue that the most efficient and creative form of management comes not from severe control from the top, or from imposed “efficiency”, but through self-organization allowed by a low degree of control and the encouragement of diversity. This form of management I call facilitation. There may be specific people assigned a facilitation role, but this is not required; any member of a group can act as a facilitator. What is required, however, is that members of the group are accustomed to being open-minded and flexible. This form of management is most compatible with anarchism as a political (and management) theory, but has benefits in pretty much any political system. I then go into some complications of this view and some of their consequences.
In everyday usage, information is knowledge or facts acquired or derived from study, instruction or observation. Information is presumed to be both meaningful and veridical, and to have some appropriate connection to its object. Information might be misleading, but it can never be false. Standard information theory, on the other hand, as developed for communications [1], measurement [2], induction [3; 4] and computation [5; 6], entirely ignores the semantic aspects of information. Thus it might seem to have little relevance to our common notion of information. This is especially true considering the range of applications of information theory found in the literature of a variety of fields. Assuming, however, that the mind works computationally and can get information about things via physical channels, then technical accounts of information strongly restrict any plausible account of the vulgar notion. Some more recent information-oriented approaches to epistemology [7] and semantics [8] go further, though my introduction to the ideas was through Michael Arbib, Michael Scriven and Kenneth Sayre in the profoundly inventive late 60s and early 70s.
In this talk I will look at how the world must be in order for us to have information about it. This will take three major sections: 1) intrinsic information -- there is a unique information in any structure that can be determined using group theory, 2) the physical world (including our minds) must have specific properties in order for us to have information about the world, and 3) the nature of information channels that can convey information to us for evaluation and testing. In the process I will outline theories of physical information and semantic information. Much of the talk will be a simplified (I hope) version of [9] and [10], and other sources on my web page, and the book Every Thing Must Go [11].
1. Shannon, C.E. and Weaver, W. 1949. The Mathematical Theory of Communication. Urbana, University of Illinois Press.
2. Brillouin, L. 1962. Science and Information Theory, 2nd edition. New York, Academic Press.
3. Solomonoff, R. 1964. A formal theory of inductive inference, Part I. Information and Control, Vol 7, No. 1: 1-22.
4. Solomonoff, R. 1964. A formal theory of inductive inference, Part II. Information and Control, Vol 7, No. 2: 224-254.
5. Kolmogorov, A.N. 1965. Three approaches to the quantitative definition of information. Problems of Inform. Transmission 1: 1-7.
6. Chaitin, G.J. 1975. A theory of program size formally identical to information theory. J. ACM 22: 329-340.
7. Dretske, F. 1981. Knowledge and the Flow of Information. Cambridge, MA, MIT Press.
8. Barwise, Jon and John Perry. 1983. Situations and Attitudes. Cambridge, MA, MIT Press.
9. Collier, John. 1990. Intrinsic information. In Philip Hanson (ed.), Information, Language and Cognition: Vancouver Studies in Cognitive Science, Vol. 1. University of British Columbia Press, now by Oxford University Press: 390-409.
10. Collier, John. 2012. Information, causation and computation. Information and Computation: Essays on Scientific and Philosophical Understanding of Foundations of Information and Computation. Gordana Dodig Crnkovic and Mark Burgin (eds), Singapore, World Scientific: 89-106.
11. Ladyman, J., Ross, D., with Collier, J., Spurrett, D. 2007. Every Thing Must Go. Oxford, Oxford University Press.
I start with a brief summary of kinds of information used in science, showing how they are nested (or hierarchically arranged), with inner kinds inheriting properties of the outer kinds. I further argue that within each kind there is also hierarchical organization, and that the major kinds are distinguished by their dynamics, not just being ordered in a hierarchy, though similar principles apply at all levels. Next I argue that rules applying to non-equilibrium thermodynamics apply also to information systems, and I give some examples of resulting self-organization, or what we have called “rhythmic entrainment” [1]. I point out that entrainment that results from forces within a system is more efficient than entrainment imposed by outside forces. This gives a sort of resilience to such systems, and in higher kinds of information allows for self-adaptation via accommodating both external forces and internally generated forces. I then apply these lessons to management and argue that the most efficient and creative form of management comes not from severe control from the top, or from imposed “efficiency”, but through self-organization allowed by a low degree of control and the encouragement of diversity. This form of management I call facilitation. There may be specific people assigned a facilitation role, but this is not required; any member of a group can act as a facilitator. What is required, however, is that members of the group are accustomed to being open-minded and flexible. This form of management is most compatible with anarchism as a political (and management) theory, but has benefits in pretty much any political system.
Causation can be understood as a computational process once we understand causation in informational terms. I argue that if we see processes as information channels, then causal processes are most readily interpreted as the transfer of information from one state to another. This directly implies that the later state is a computation from the earlier state, given causal laws, which can also be interpreted computationally. This approach unifies the ideas of causation and computation.
Handbook of Philosophy of Science, vol 8, Philosophy of Information, 2008, Chapter 5f)
The notion of information has developed in a number of different ways (as discussed in this volume), and many of them have been applied to biology, both usefully and gratuitously, and even misleadingly. These multiple notions of information have, not surprisingly, led to apparently contradictory claims by authors who have really been talking past each other, although there are also substantive issues at stake. The aim of this chapter is to review some of the ways that notions of information have been used in biology, to disentangle them, and to evaluate their implications and aptness, as well as to point out some of the more widespread confusions.
In Donald Favereau, Paul Cobley and Kalevi Kull, eds. A More Developed Sign: Interpreting the Work of Jesper Hoffmeyer. Tartu Semiotics Library 10, Tartu University Press, Tartu 2012: 175-177.
The aim of this book is to defend a radically naturalistic metaphysics. By this we mean a metaphysics that is motivated exclusively by attempts to unify hypotheses and theories that are taken seriously by contemporary science. For reasons to be explained, we take the view that no alternative kind of metaphysics can be regarded as a legitimate part of our collective attempt to model the structure of objective reality.
Book chapter written with Don Ross and James Ladyman in Ladyman & Ross's 'Every Thing Must Go'.
Causation, natural laws and explanation, Jan 1, 1999
Four general approaches to the metaphysics of causation are current in
Australasian philosophy. One is a development of the regularity theory (attributed to Hume) that uses counterfactuals (Lewis, 1973; 1994). A second is based in the relations of universals, which determine laws, which in turn determine causal interactions of particulars (with the possible exception of singular causation, Armstrong, 1983). This broad approach goes back to Plato, and was also held in this century by Russell, who, like Plato, but unlike the more recent version of Armstrong (1983), held there were no particulars as such, only universals. A third view, originating with Reichenbach and revived by Salmon (1984), holds that a causal process is one that can be marked. This view relies heavily on ideas about the transfer of information and the relation of information to probability, but it also needs uneliminable counterfactuals. The fourth view was developed recently by Dowe (1992) and Salmon (1994). It holds that a causal process involves the transfer of a non-zero valued conserved quantity. A considerable advantage of this approach over the others is that it requires neither counterfactuals nor abstracta like universals to explain causation.
The theory of causation offered here is a development of the mark approach that entails Dowe’s conserved quantity approach. The basic idea is that causation is the transfer of a particular token of a quantity of information from one state of a system to another. Physical causation is a special case in
which physical information instances are transferred from one state of a physical system to another. The approach can be interpreted as a Universals approach (depending on one's approach to mathematical objects and qualities), and it sheds some light on the nature of the regularity approach.
After motivating and describing this approach, I will sketch how it can be used to ground natural laws and how it relates to the four leading approaches, in particular how each can be conceived as a special case of my approach. Finally, I will show how my approach satisfies the requirements of Humean supervenience. The approach relies on concrete particulars and computational logic alone, and is the second stage of constructing a minimal metaphysics, started in (Collier 1996, The necessity of natural kinds).
Ecology and Society 10(1): 29. (2005)
ABSTRACT. Complex systems are dynamic and may show high levels of variability in both space and time. It is often difficult to decide on what constitutes a given complex system, i.e., where system boundaries should be set, and what amounts to substantial change within the system. We discuss two central themes: the nature of system definitions and their ability to cope with change, and the importance of system definitions for the mental metamodels that we use to describe and order ideas about system change. Systems can only be considered as single study units if they retain their identity. Previous system definitions have largely ignored the need for both spatial and temporal continuity as essential attributes of identity. After considering the philosophical issues surrounding identity and system definitions, we examine their application to modeling studies. We outline a set of five alternative metamodels that capture a range of the basic dynamics of complex systems. Although Holling’s adaptive cycle is a compelling and widely applicable metamodel that fits many complex systems, there are systems that do not necessarily follow the adaptive cycle. We propose that more careful consideration of system definitions and alternative metamodels for complex systems will lead to greater conceptual clarity in the field and, ultimately, to more rigorous research.
Evolutionary Systems, Gertrudis van der Vijver, Stanley N. Salthe and Manuela Delpos (eds) (Dordrecht, Kluwer, 1998): 129-140., 1998
Progress has become a suspect concept in evolutionary biology, not the least because the core concepts of neo-Darwinism do not support the idea that evolution is progressive. There have been a number of attempts to account for directionality in evolution through additions to the core hypotheses of neo-Darwinism, but they do not establish progressiveness, and they are somewhat of an ad hoc collection. The standard account of fitness and adaptation can be rephrased in terms of information theory. From this, an information of adaptation can be defined in terms of a fitness function. The information of adaptation is a measure of the mutual information between biota and their environment. If the actual state of adaptation lags behind the state of optimal adaptation, then it is possible for the information of adaptation to increase indefinitely. Since adaptations are functional, this suggests the possibility of progressive evolution in the sense of increasing adaptation.
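As a toy sketch of the kind of quantity the abstract invokes (my own example; the trait and environment labels are invented for illustration), the mutual information between biota and environment can be estimated from paired samples of trait states and environment states:

```python
import math
from collections import Counter

def mutual_information(pairs, base=2):
    """Mutual information I(X;Y), in bits for base 2, from (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)                  # joint counts
    px = Counter(x for x, _ in pairs)     # marginal counts for X
    py = Counter(y for _, y in pairs)     # marginal counts for Y
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)), base)
               for (x, y), c in pxy.items())

# Perfectly matched trait/environment pairs: the environment state is
# fully predicted by the trait, so I(X;Y) = H(X) = 1 bit here.
matched = [("cold-adapted", "cold"), ("warm-adapted", "warm")] * 50
print(mutual_information(matched))  # 1.0

# Independent pairings carry no mutual information.
independent = [("cold-adapted", "cold"), ("cold-adapted", "warm"),
               ("warm-adapted", "cold"), ("warm-adapted", "warm")] * 25
print(mutual_information(independent))  # 0.0
```

On this picture, increasing adaptation would show up as the joint distribution moving from the independent case toward the matched case, raising the mutual information.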
I argue that natural kinds are necessary for science, and that they are properties, not classes of objects. They depend on causal regularities, not on some transcendent essence. I also argue that their metaphysical necessity arises not from a transcendent necessity, but because they exist in all and only possible worlds in which they have the relevant causal regularities. Thus they can be both necessary and contingent -- they cannot be otherwise, but they might not exist.
I have previously explored autonomy as the foundation of functionality, intentionality and meaning, which are best explained coherently via information theory. Later I argued that autonomous systems accommodate the unexpected through self-organizing processes,
together with some constraints that maintain autonomy. A system is autonomous if it uses its own information to modify itself and its environment to enhance its survival, responding to both environmental and internal stimuli to modify its basic functions to increase its viability. Autonomy has not played much of a role in biology and cognitive science until fairly recently. The first to bring the importance of autonomy to widespread attention were Maturana and Varela, who presented a theory of autopoietic systems based on cells as a paradigm. Autopoietic systems are dynamically closed to information. This gives the curious result that humans, who transfer information if anything does, are either not autonomous or else in some sense information is not really transferred between humans.
Similar problems arise when the autopoietic aspects are cut off from infrastructure in biological cells. This problem also holds for Robert Rosen's account of living systems. The real situation is not a choice between third-person openness and first-person closure. On our account, autonomy is a matter of degree, depending on the relative organization of the system and system-environment interactions. Furthermore, autonomy can come in levels, and the aims of the levels can contradict each other.