Is Our Universe Natural?


In physics, naturalness is the property that the dimensionless ratios between the free parameters or physical constants appearing in a physical theory should take values "of order 1", and that free parameters should not be fine-tuned. That is, a natural theory would have parameter ratios with values like 2.34 rather than 234000 or 0.000234. The requirement that satisfactory theories be "natural" in this sense is a current of thought initiated around the 1960s in particle physics. It is a criterion arising from the apparent non-naturalness of the Standard Model and from the broader topics of the hierarchy problem, fine-tuning, and the anthropic principle. It does, however, point to a possible weakness, or area of future development, in current theories such as the Standard Model, in which some parameters vary by many orders of magnitude and whose current values require extensive "fine-tuning". The concern is that it is not yet clear whether these seemingly precise values have arisen by chance (on the basis of the anthropic principle or something similar) or whether they follow from a more advanced theory, not yet developed, in which they turn out to be expected and well explained by factors not yet part of particle-physics models.
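As a rough worked illustration of the kind of ratio at issue (using standard order-of-magnitude values, not figures taken from the text above), the hierarchy problem can be phrased as the unnaturally small dimensionless ratio between the electroweak scale, set by the Higgs mass, and the gravitational (Planck) scale:

\[
\frac{m_H^2}{M_{\mathrm{Pl}}^2} \approx \frac{(125\ \mathrm{GeV})^2}{(1.22\times 10^{19}\ \mathrm{GeV})^2} \sim 10^{-34},
\]

a value very far from "order 1", which is why the Higgs sector of the Standard Model is said to require fine-tuning.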

Certain results, most famously in classical statistical mechanics and complex systems, but also in quantum mechanics and high-energy physics, yield a coarse-grained stable statistical pattern in the long run. The explanation of these results shares a common structure: the results hold for a 'typical' dynamics, that is, for most of the underlying dynamics. In this paper I argue that the structure of the explanation of these results might shed a different light on philosophical debates about the laws of nature. In the explanation of such patterns, the specific form of the underlying dynamics is almost irrelevant; the conditions required, given a free state-space evolution, suffice to account for the coarse-grained lawful behaviour. An analysis of such conditions might thus provide a different account of how regular behaviour can occur. This paper focuses on drawing attention to this type of explanation, outlining it in the diverse areas of physics in which it appears, and discussing its limitations and significance in the tractable setting of classical statistical mechanics.
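To make the notion of a 'typical' dynamics concrete, here is a minimal toy sketch (my own illustration, not an example drawn from the paper): on a discretized phase space, most randomly chosen dynamics, modelled here as random permutations of the microstates, spend a long-run fraction of time in a coarse-grained macrostate close to that macrostate's phase-space volume, almost regardless of which particular permutation was drawn.

import random

N = 10_000                      # number of microstates in a toy phase space

def in_macrostate(x):
    return x < N // 10          # a macrostate occupying 10% of the space

def time_fraction(seed, steps=50_000):
    rng = random.Random(seed)
    perm = list(range(N))
    rng.shuffle(perm)           # one 'dynamics': a randomly drawn permutation
    x = rng.randrange(N)        # an arbitrary initial microstate
    hits = 0
    for _ in range(steps):
        x = perm[x]             # evolve one step under this dynamics
        hits += in_macrostate(x)
    return hits / steps

# For most dynamics and initial conditions the fraction is close to 0.10,
# the phase-space volume of the macrostate.
for seed in range(5):
    print(seed, round(time_fraction(seed), 3))

The point of the sketch is only structural: the specific permutation, i.e. the 'form of the underlying dynamics', is almost irrelevant to the coarse-grained pattern that emerges.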

This thesis gives a philosophical assessment of a contemporary movement, influential amongst physicists, about the status of microscopic and macroscopic properties. The fountainhead of the movement was a short 1972 paper, `More is Different', written by the condensed-matter physicist Philip Anderson. Each of the chapters is concerned with themes mentioned in that paper, or subsequently expounded by Anderson and his followers. In Chapter 1, I aim to locate Anderson's existence claims for `emergent properties' within the metaphysical, epistemological and methodological doctrines that identify themselves as `emergentist'. I argue, against the commentators' consensus, that the New Emergentists make claims about the metaphysical status of physical properties, and should not be read as concerned only with matters of research methodology for physics. In Chapter 2, I look at the physical examples that the New Emergentists appeal to, and propose a way of formulating their main claims within modern analytic metaphysics. I argue that it is possible to view their thesis as an updated version of `British Emergentism', a movement popular in the early years of the twentieth century. I support this contention by comparing examples of emergent properties put forward by the British and the New Emergentists. Chapter 3 is a discussion of the significance of renormalisation techniques. I attack a set of claims by Robert Batterman, who presents renormalisation as an explanatory strategy unrecognised by the philosophy of science. I separate several different methods of renormalisation analysis, and argue that he has conflated them. In Chapter 4, I discuss the theoretical representation of phase transitions in condensed-matter physics, in particular the appeal to the limit of an infinite system. I examine and refute various claims to the effect that the ineliminability of this limit in modelling phase transitions is of great metaphysical significance. I suggest a definition of phase transitions for finite systems that dissolves this illusion, and gives us reason to trust the results of the theories that apply only in the infinite limit. Chapter 5 revisits the doctrines of the New Emergentists in order to place their views within some wider philosophical debates. I look at how their doctrines bear on issues in the interpretation of quantum mechanics, in the philosophy of mind, and in the philosophy of science. I close by returning to the context within physics in which the New Emergentists originally made their presence felt: the controversy over whether elementary particle physics should enjoy greater status than other areas.

This is a short analysis of the changes in the concept of entropy as applied to the physics of the present-day and Early Universe. Of special interest is the leading role played by the notion of deformation of a physical theory. The relation to the symmetry of the corresponding theory is noted. As this work is not a survey, mainly the author's own relevant works are considered. The paper is to be published in the special issue "Symmetry and Entropy" of the journal SYMMETRY: Culture and Science.