Sigma Club: Past Events
Carina Prunkl (Oxford): “Black Hole Entropy is Entropy and not (necessarily) Information”
The comparison of geometrical properties of black holes with classical thermodynamic variables reveals surprising parallels between the laws of black hole mechanics and the laws of thermodynamics. Since Hawking’s discovery that black holes, when coupled to quantum matter fields, emit radiation at a temperature proportional to their surface gravity, the idea that black holes are genuine thermodynamic objects with a well-defined thermodynamic entropy has become more and more popular. Surprisingly, arguments that justify this assumption are both sparse and rarely convincing. Most of them rely on an information-theoretic interpretation of entropy, which is itself a highly debated topic in the philosophy of physics. Given the amount of disagreement about the nature of entropy and the second law on the one hand, and the growing importance of black hole thermodynamics for the foundations of physics on the other, it is desirable to achieve a deeper understanding of the notion of entropy in the context of black hole mechanics. I discuss some of the pertinent arguments that aim at establishing the identity of black hole surface area (times a constant) and thermodynamic entropy and show why these arguments are not satisfactory. I then present a simple model of a black hole Carnot cycle to establish that black hole entropy is genuine thermodynamic entropy that does not require an information-theoretic interpretation.
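For reference (standard results the abstract alludes to, not part of it): the Bekenstein–Hawking entropy and Hawking temperature of a black hole with horizon area A and surface gravity κ are

$$S_{BH} = \frac{k_B c^3}{4 G \hbar}\, A, \qquad T_H = \frac{\hbar \kappa}{2\pi k_B c},$$

so the “constant” relating area and entropy is fixed once the Hawking temperature is identified with a genuine thermodynamic temperature.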
Matt Farr (Cambridge): “The C Theory of Time”
Abstract: Does time have a direction? Intuitively, it does. After all, our experiences, our thoughts, even our scientific explanations of phenomena are time-directed: things evolve from earlier to later, and it would seem unnecessary and indeed odd to try to expunge such talk from our philosophical lexicon. Nevertheless, in this talk I will make the case for what I call…
James Read (Oxford): “Geometry and conspiracy in relativity theory”
I discuss the debate between advocates of dynamical versus geometrical approaches to spacetime theories, in the context of both special and general relativity. By distinguishing between what I call ‘individual’ versus ‘modal’ constraints, I argue, pace e.g. Brown, that a perfectly viable form of the geometrical approach remains available.
Laszlo Szabo (Eotvos University): “Empirical definitions of spatiotemporal conceptions”
First I will argue for the inevitability of a coherent, non-circular system of operational definitions of the basic spatiotemporal quantities, in terms of which the empirically testable spatiotemporal statements of physics should be expressed. A few examples will illustrate that the task is not trivial, especially if the definitions are to hold with high, relativistic, precision. In my talk, I will outline a possible construction of such a system of operational definitions. It will be seen that the complete collection of operational definitions, by means of which one can reconstruct something similar to our usual spatiotemporal intuitions, would require the satisfaction of certain conditions. Whether these conditions are satisfied is an empirical question which has never truly been examined. Some speculative considerations, however, suggest an interesting picture. If all conditions are empirically satisfied, then the resulting spacetime structure is a Minkowski geometry. If, however, as is expected, some of the important conditions are violated, it is not at all obvious what the resulting spacetime structure is. Nevertheless, straightforward generalizations of Minkowski geometry offer themselves as a suitable mathematical description of the empirically ascertained spacetime structure; more straightforward than Riemannian geometry on a four-dimensional manifold.
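For reference (a standard formula, not part of the abstract): the Minkowski geometry in question is the flat spacetime structure with line element

$$ds^2 = c^2\,dt^2 - dx^2 - dy^2 - dz^2,$$

and the “straightforward generalizations” mentioned would presumably modify this flat structure rather than pass to a curved metric on a four-dimensional manifold.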
Laszlo Szabo (Eotvos University): “Meaning, Truth and Physics”
A physical theory is a partially interpreted axiomatic formal system (L, S), where L is a formal language with some logical, mathematical and physical axioms, and with some derivation rules, and the semantics S is a relationship between the formulas of L and some states of affairs in the physical world. In our ordinary discourse, the formal system L is regarded as an abstract object or structure, and the semantics S as something which involves the mental/conceptual realm. This view is of course incompatible with physicalism. How can physical theory be accommodated in a purely physical ontology? The aim of the talk is to outline an account of the meaning and truth of physical theory within the philosophical framework spanned by three doctrines: physicalism, empiricism, and the formalist philosophy of mathematics.
Balázs Gyenis (LSE): “A proof of tendency towards equilibrium”
When two gases mix, their temperatures equalize. In the talk we take a look at a simple proof that aims to demonstrate this phenomenon from historical, philosophical, and pedagogical perspectives. We argue that the proof can be viewed as a charitable reconstruction of Maxwell's own 1860 argument, and if so, then Maxwell preceded Boltzmann’s first attempt to give a mechanical explanation of the tendency towards equilibrium by at least six years. Although the proof makes a problematic probabilistic independence assumption (and, according to a recent criticism, also a problematic physical assumption when the masses of the molecules are different), in this regard it does not fare worse than other, later attempts. On the other hand, the probabilistic independence assumption of the proof is geometrically intuitive and even invites some speculation about the physical basis of irreversibility. The proof is also simpler than many later attempts and could reasonably be included in a course on classical mechanics.
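For orientation (a standard kinetic-theory statement, not taken from the talk): in Maxwell's 1860 treatment, temperature equalization between two mixed gases amounts to their molecules coming to share the same mean kinetic energy,

$$\tfrac{1}{2} m_1 \langle v_1^2 \rangle = \tfrac{1}{2} m_2 \langle v_2^2 \rangle = \tfrac{3}{2} k_B T,$$

where m₁, m₂ are the molecular masses and T is the common equilibrium temperature.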
Laurenz Hudetz (LSE): “The conceptual schemas account of interpretation”
Philosophers of physics often discuss how particular theories could or should be interpreted. But what exactly is it to interpret a formalism in the first place? This is the question addressed in this talk. I propose a general framework for making talk of interpretations more rigorous. First, I clarify what I mean by a formalism. Second, I give an account of what it is to establish a link between a formalism and data. For this purpose, I draw on the theory of relational databases to explicate what data schemas and collections of data are. I introduce the notion of an interpretive link using ideas from mathematical logic. Third, I address the question of how a formalism can be interpreted in a way that goes beyond a connection to data. The basic idea is that one extends a data schema to an ontological conceptual schema and links the formalism to this extended schema. I illustrate this account of interpretation using the harmonic oscillator as a simple running example. And I highlight how the account can be fruitfully applied to address issues in philosophy of physics, such as the question whether the Newtonian theory of gravitation is fully equivalent to the geometrised Newton-Cartan theory.
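A purely illustrative sketch (my own, hypothetical, and not taken from the talk): for the harmonic oscillator one might take the formalism to comprise the equation of motion and its solutions,

$$\ddot{x}(t) = -\omega^2 x(t), \qquad x(t) = A\cos(\omega t + \varphi),$$

and a data schema to be a relation of records such as Measurement(time, displacement), with an interpretive link mapping evaluated solutions onto such records; an ontological conceptual schema would then add claims about what the quantities x and ω are properties of.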
Talal Debs (RSR Partners, New York): CANCELLED
Unfortunately this event has been cancelled. We apologise for any inconvenience caused.
David Lavis (KCL): “Suppose Temperature is Negative. What Then?”
We show that it is quite easy to construct simple model systems exhibiting negative temperatures, and we explore the thermodynamic consequences of this possibility, particularly in relation to the second law. Approaching thermodynamics from the perspective of the Lieb–Yngvason formulation, we demonstrate that both negative temperatures and negative heat capacities are compatible with the Carathéodory version of the second law. Then, by examining all cases of heat engines and pumps working cyclically between isothermal reservoirs, we show that the Kelvin–Planck version of the second law needs to be modified if the temperature can be negative, and that both the Kelvin–Planck and Clausius versions need to be modified if both the temperature and the heat capacity can be negative. We discuss the relevance of these conclusions for the ongoing dispute about the correct entropy for the microcanonical distribution in statistical mechanics.
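For orientation (a standard textbook example, not necessarily the model used in the talk): in a system of N weakly interacting two-level spins with level spacing ε, the microcanonical entropy S(E) increases up to E = Nε/2 and decreases beyond it, so the thermodynamic temperature defined by

$$\frac{1}{T} = \frac{\partial S}{\partial E}$$

is negative for population-inverted macrostates with E > Nε/2.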
Katie Robertson (University of Birmingham): “The reduction of the second law of thermodynamics”
Abstract: TBA
Gabor Hofer-Szabo (Hungarian Academy of Sciences): “Two concepts of noncontextuality in quantum mechanics”
There are two different and logically independent concepts of noncontextuality in quantum mechanics. First, an ontological (hidden variable) model for quantum mechanics is called noncontextual if every ontic (hidden) state determines the probability of the outcomes of every measurement independently of what other measurements are simultaneously performed. Second, an ontological model is noncontextual if any two measurements which are represented by the same self-adjoint operator, or, equivalently, which have the same probability distribution of outcomes in every quantum state, also have the same probability distribution of outcomes in every ontic state. In the talk I will argue that the Kochen-Specker arguments provide a state-independent proof only against noncontextual ontological models of the second type.
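A schematic gloss (my own notation, not from the abstract): writing λ for an ontic state, a for an outcome of measurement A, and C, C′ for two measurement contexts containing A, the first notion requires

$$p(a \mid A, C, \lambda) = p(a \mid A, C', \lambda),$$

while the second requires that whenever measurements A and B are represented by the same self-adjoint operator, $p(a \mid A, \lambda) = p(a \mid B, \lambda)$ for every ontic state λ.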
James Fraser (Durham): “Understanding Ultraviolet Divergences”
When physicists first tried to formulate quantum field theories they ran into problems with ultraviolet divergences, integrals which blow up in the region of large momenta. Initially, this was taken as a sign that a radically new theoretical framework was needed to unify quantum theory and relativity. The renormalisation techniques developed in the late 40s showed that these infinities could be systematically eliminated, apparently solving the problem without a revolution. Still, the question of why ultraviolet divergences occur in the first place went unanswered, and the suspicion remained that they point to the breakdown of quantum field theories at high energies. In this talk, I sketch the development of the causal perturbation theory programme, a lesser-known approach to perturbative quantum field theory which sheds light on this issue. Originating in the work of Stueckelberg and Bogoliubov in the 50s, this approach brings the machinery of distribution theory to bear on the offending terms in the perturbative expansion. The divergences are taken to result from an improper treatment of products of distributions occurring in the perturbation series, which can, if care is taken, be given a proper definition. Rather than indicating a physical problem with the high-energy behaviour of quantum field theories, then, ultraviolet divergences end up being understood as a purely mathematical issue.
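For orientation (a standard illustration, not drawn from the talk): a typical one-loop integral in a scalar theory,

$$\int^{\Lambda} \frac{d^4 k}{(2\pi)^4}\, \frac{1}{(k^2 - m^2)\big((k+p)^2 - m^2\big)} \;\sim\; \ln \Lambda,$$

diverges logarithmically as the momentum cutoff Λ is removed; on the distributional reading sketched above, the trouble is traced to ill-defined products of propagator distributions at coincident points rather than to the physics at high energies.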
Lamberto Rondoni (Politecnico di Torino): “Does probability turn material? Statistical mechanics questions concerning equilibrium and non-equilibrium ensembles”
As pointed out e.g. by R. Frigg, the foundations of statistical physics, by and large resting on the notion of statistical ensembles, remain in need of clarification. This may not be such a surprise: statistical physics raises probabilities practically to the rank of material properties of physical objects, such as mass itself, while one may legitimately side with De Finetti, who stated that "probability does not exist". We will argue that some difficulties in giving ensembles robust foundations derive from a gap separating probability, dynamics and mass. This observation will lead us to reconsider the role of Hamiltonian models in explaining relaxation to equilibrium. Further, we will contrast necessary with sufficient dynamical conditions, within a response theory suitable for describing open systems, i.e. systems in contact with an external environment.
Carina Prunkl (Oxford): “How Anthropocentric is Thermodynamics?”
Thermodynamics “smells more of its human origin than other branches of physics”, Bridgman famously wrote in 1941. Taking a closer look at the history of thermodynamics and statistical mechanics, we find that this ‘human smell’ enters the subject as early as the writings of Maxwell, who makes use of concepts such as ‘knowledge’, ‘observation’ and ‘the mind’ in order to explain thermodynamic phenomena. Some decades later, E.T. Jaynes goes even further and distinguishes between the ‘physical’ nature of energy and the ‘anthropomorphic’ nature of entropy. Both authors seem to suggest that thermodynamic concepts are in some sense mind-dependent, that they rely on the presence of an external observer. In this talk, I will revisit the question of how and when the ‘human smell’ enters thermodynamics by taking a closer look at Maxwell’s means-relative approach to thermodynamics. I will show that, in fact, Maxwell’s approach does not commit us to an anthropocentric reading of thermodynamics, but that it instead provides us with a powerful conceptual framework that carries over into classical and quantum statistical mechanics.
Sean Gryb (University of Groningen): “Explaining time’s arrow: how scale symmetry affects notions of typicality in the universe”
A persistent problem in the metaphysics of time is explaining the origin of the apparent arrow of time. Popular explanations rely on the ability to define a notion of typicality for the universe as a whole. We argue that such approaches face serious difficulties. Severe mathematical and epistemological obstacles need to be overcome before a precise proposal can even be stated. More importantly, the approach is obstructed by a new symmetry argument related to scale independence in the universe. We explain how one of the oldest known dynamical symmetries has significant, albeit vastly underappreciated, consequences for defining notions of typicality in the universe. We then show how these considerations pose significant difficulties for standard approaches to explaining the arrow of time and suggest a promising new way forward that has the potential to sidestep the more problematic aspects of the standard approach.
Sophie Ritson (Sydney): “Probing Novelty at the LHC: Heuristic appraisal of disruptive experimentation”
In this talk, ‘novelty’ is explored through a recent historical episode from high-energy experimental physics to offer an understanding of novelty as disruption. I call this the ‘750 GeV episode’, an episode in which two Large Hadron Collider (LHC) experiments, CMS and ATLAS, each independently observed indications of a new resonance at approximately 750 GeV. With further data collection, the initial excess was determined to be a statistical fluctuation. The approach taken, in the analysis of interviews conducted with physicists who were involved in the ‘750 GeV episode’, is to identify novelty as a valued difference. Following this conceptually driven approach, I disambiguate between several notions of novelty through the identification of varied differences. This disambiguation is achieved by exploring differences expressed in comparison to varied expressions of the standard model, and by exploring varied ‘types’ of difference (properties and entities), in order to introduce disruptive exploratory experimentation, a complementary understanding of ‘exploratory experimentation’ (Elliott, 2007; Steinle, 1997, 2002). I show that the kinds of novelty framed as most valuable are those that violate expectations and are difficult to incorporate into the existing structures of knowledge. In such instances, disruption to the existing ontology or ways of knowing is valued. This positive appraisal of disruption, and of contradiction over confirmation, is considered in the recent context of high-energy physics, where several physicists have claimed that there is a lack of promising directions for the future, or even that the field is in a ‘crisis’. I show that the role of disruption explains the differences between these notions of novelty. Furthermore, I show that the positive appraisal of disruption is based on forward-looking assessments of future fertility, or heuristic appraisal (Nickles, 1989, 2006). Within the context of concerns about a lack of available promising future directions, disruption becomes a generator of alternative futures.
CANCELLED: Jeremy Butterfield (Cambridge): “On Reduction and Functionalism about Space and Time”
Unfortunately, due to the current COVID-19 situation this event will no longer take place as planned. We apologise for any inconvenience caused.
This is joint work with Henrique Gomes, Cambridge.
Henrique Gomes & Jeremy Butterfield (Cambridge): “Geometrodynamics as Functionalism about Time”
A recent literature about a doctrine called 'spacetime functionalism' focuses on how the physics of matter and radiation contributes to determining, or perhaps even determines or explains, chrono-geometry. Thus spacetime functionalism is closely related to relational, and specifically Machian, approaches to chrono-geometry and dynamics; and to what has recently been called the 'dynamical approach' to chrono-geometry.
We are sympathetic to spacetime functionalism. We have elsewhere argued that in its best form, it says that a chrono-geometric concept (or concepts) is uniquely definable in terms of (and so reducible to) matter and radiation – and then proves a theorem to this effect. We also gave examples of such theorems from the older literature in foundations of chrono-geometry (before the recent label 'functionalism').
This paper argues that three projects in the physics literature give vivid and impressive illustrations of this kind of functionalist reduction, for time. That is: they each provide, within a theory about spatial geometry, a functionalist reduction of the temporal metric and time-evolution. And the reduction is summed up in a theorem that the temporal metric and/or the Hamiltonian governing time-evolution is, in an appropriate sense, unique.
These three projects are all 'general-relativistic'. But they differ substantially in exactly what they assume, and in what they deduce. They are, in short:
(1): The recovery of geometrodynamics, i.e. general relativity's usual Hamiltonian, from requirements on deformations of hypersurfaces in a Lorentzian spacetime. This is due to Hojman et al. (1976).
(2): The programme of Schuller, Duell et al. (2011, 2012, 2018). They deduce, from assumptions about matter and radiation on a 4-dimensional manifold that is not initially assumed to have a Lorentzian metric, the existence of a generalized metric (in some cases a Lorentzian one), and much information about how it relates to matter and radiation.
(3): The deduction of general relativity's usual Hamiltonian in a framework without even a spacetime: that is, without initially assuming a 4-dimensional manifold, let alone one with a Lorentzian metric. This is due to Gomes and Shyam (2017).
We discuss these projects in order. We end by drawing a positive corollary of (3), for a recent programme in the foundations of classical gravity, viz. shape dynamics.
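For reference (the standard ADM/geometrodynamics form, stated here only for orientation and not as a claim about the details of any of the three projects): in units with 16πG = c = 1, general relativity's usual Hamiltonian is a sum of constraints on a spatial metric g_{ab} and its conjugate momentum π^{ab},

$$\mathcal{H}_\perp = \frac{1}{\sqrt{g}}\Big(\pi^{ab}\pi_{ab} - \tfrac{1}{2}\pi^2\Big) - \sqrt{g}\,R, \qquad \mathcal{H}_a = -2\,\nabla_b \pi^{b}{}_{a},$$

and results of type (1) recover this form from requirements on the algebra of hypersurface deformations.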
Chrysovalantis Stergiou (The American College of Greece): “On Empirical Underdetermination of Physical Theories in C*-Algebraic Setting”
Empirical underdetermination of physical theories by observational data lies at the heart of the debate over scientific realism. Antirealists of different strands contend that if observation cannot determine the state of a physical system, then to talk about a uniquely defined state of the system is just a matter of convention. In the context of Algebraic Quantum Field Theory (AQFT) this stance is related to the claim that the physical topology of the state space is the weak*-topology, and to what has become known as Algebraic Imperialism, the operationalist attitude which characterized the first steps of the theory. Aristidis Arageorgis (1995) devised a mathematical argument against empirical underdetermination of the state of a system in a C*-algebraic setting, which rests on two topological properties of the state space: being T1 and being first countable in the weak*-topology. The first property is possessed trivially by the state space, while the latter is highly non-trivial and can be derived from the assumption that the algebra of observables is separable.
In this talk we will reconstruct Arageorgis’ argument and examine its soundness with regard to the separability of the algebra of observables. We will show that separability is related to two factors: (a) the dimension of the algebra, considered as a vector space; and (b) whether it is a C*-algebra or a von Neumann algebra. Finite-dimensional C*-algebras and von Neumann algebras are separable; infinite-dimensional von Neumann algebras are non-separable; and infinite-dimensional C*-algebras can be separable. These considerations will be discussed with reference to classical systems of N particles, the Heisenberg model for ferromagnetism, the Haag-Araki formulation of AQFT, and a separable reformulation of AQFT in Minkowski spacetime suggested by Porrmann (1999, 2004).
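A brief technical gloss (standard functional analysis, not part of the abstract): the weak*-topology on the state space is the topology of pointwise convergence of expectation values,

$$\omega_n \to \omega \ \text{(weak*)} \iff \omega_n(A) \to \omega(A) \ \text{for every } A \in \mathfrak{A},$$

and when the C*-algebra 𝔄 is norm-separable the state space is weak*-metrizable, hence first countable, which is the property the argument requires.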
This talk is dedicated to the memory of my beloved teacher, colleague and friend Aris Arageorgis, who passed away prematurely in 2018.
Neil Dewar (Munich): “On Absolute Units”
What is the best way to characterise the intrinsic structure of physical quantities? Field's programme shows one approach (one that also delivers a nominalist treatment of such quantities); in this talk, I outline how group-theoretic methods can deliver a somewhat simpler, although non-nominalist, way of doing this for scalar and vector quantities. I go on to develop a theory of how such quantities can be algebraically combined, and use this to develop a simple intrinsic treatment of Newtonian gravitation. Finally, I argue that this treatment illuminates a “third way” in the debate over absolutism and comparativism about quantities: namely, a form of anti-quidditist absolutism.