8th Seminar on the Philosophy of Physics

Information

Tübingen, 24–26 October 2025

Poster 2025 #2
Image © Ramona Fontaine, Untitled

Call for Applications

The workshop is aimed in particular at Bachelor's and Master's students of physics and philosophy, but also at students of other disciplines. Through expert talks and discussion groups, participants will be introduced to the research field of the philosophy of physics, with a focus on the topic of information. No prior knowledge of physics or philosophy is required.

In addition to English-language talks by Christopher Timpson (Oxford) on the concept of information in quantum information, Javier Anta (Seville) on the philosophy of information, and Saakshi Dulani (Johns Hopkins) on the black hole information paradox, there will be moderated small-group discussions in German based on preparatory reading (for example on Wheeler's 'It from Bit') and a panel discussion on the question 'What is philosophy of physics?'.

Participants' accommodation costs will be covered in full. On request, (partial) coverage of travel costs can also be considered.

Application deadline: 15 September 2025 (we will respond by 20 September)

Kontakt: seminar(at)philosophiederphysik.de

Webseite: www.philosophiederphysik.de

Organisation: Maren Bräutigam (Cologne), Niels Linnemann (Geneva), Kian Salimkhani (Nijmegen), Annica Vieser (Geneva), Karla Weingarten (Nijmegen)

Hosts: Sophie Atzpodien and Michael Herrmann

The workshop is generously supported by the Forum für Wissenschaftskulturen of the University of Tübingen.

Preliminary Program

Venue: Lecture hall (1st floor), Tübinger Forum für Wissenschaftskulturen, Doblerstr. 33, 72074 Tübingen

Friday, 24.10.
Arrival
15:30 – 16:15  Introduction
16:15 – 17:45  Lecture: Saakshi Dulani (Johns Hopkins), "Not the Measurement Problem’s Problem: Black Hole Information Loss with Schrödinger’s Cat"
18:00 – 19:30  Lecture: Chris Timpson (Oxford), title tba
Dinner

Saturday, 25.10.
9:00 – 10:30  Lecture: Javier Anta (U Seville), title tba
Break
11:00 – 12:30  Discussion groups
Lunch
14:00 – 15:30  Panel discussion: 'What is philosophy of physics?'
Break
16:00 – 17:30  Student talks
Dinner

Sunday, 26.10.
9:00 – 10:30  Lecture: Benjamin Jantzen (Virginia Tech), "Information and dynamics: competing metaphors for biological intelligence"
Break
11:00 – 12:30  Discussion groups
Lunch
14:00 – 15:30  Discussion groups

Talks

Saakshi Dulani (Johns Hopkins): Not the Measurement Problem’s Problem: Black Hole Information Loss with Schrödinger’s Cat
Recently, several philosophers and physicists have noticed the hegemony of unitarity in the black hole information loss discourse and are challenging its legitimacy in the face of the measurement problem. They proclaim that embracing non-unitarity solves two paradoxes for the price of one. Though I share their distaste for the philosophical bias, I disagree with their strategy of still privileging certain interpretations of quantum theory. I argue that information-restoring solutions can be interpretation-neutral because the manifestation of non-unitarity in Hawking’s original derivation is unrelated to what’s found in collapse theories or generalized stochastic approaches, thereby decoupling the two puzzles.

Chris Timpson (Oxford): Is (the concept of) Information Fundamental in Physics?
There has been some considerable excitement around the idea that the concept of information provides a new unifying idea in the foundations of physics, with some even going so far as to suggest that information might be physically fundamental. But what does it even mean to claim that something is physically fundamental? I shall suggest that there are a number of different relevant notions of ‘physically fundamental’, and that when one is clear on these distinctions one can get a much better sense of what work the concept of information is doing for us. One immediate lesson is that there is no reason at all to believe that information is ontologically fundamental in physics.

Javier Anta (U Seville): Disentangling the uses of ‘entropy’ and ‘information’ in classical thermal physics
My aim in this talk is fourfold. First, I will systematically analyze how the notions of ‘entropy’ and ‘information’ have been used in classical thermal physics from the 1930s to the present day, describing the cluster of meanings involved in these conceptual usages and tracing their historical evolution. Second, I will assess the extent to which this all-pervasive, entangled use of entropy and information notions can be defective in classical physics in one way or another (e.g., semantically, epistemically [Anta 2021], or mathematically), such as being poorly defined, being simply meaningless, or even fostering confusion among scientists (see Anta 2023). Third, I will evaluate the two main strategies on offer for improving these defective conceptual practices: (i) substituting non-loaded terms for the highly connoted terms ‘entropy’ and ‘information’, as first argued by Yehoshua Bar-Hillel in the 1950s; or (ii) developing a prescription for how these terms should be employed in order to be technically meaningful, as pioneered by Rudolf Carnap (1977) in Two Essays on Entropy. Finally, I will argue that, for any ameliorative strategy to succeed, it would be necessary to develop a well-defined implementational strategy (Anta 2025) by which ameliorative conceptual prescriptions about how to use ‘entropy’ and ‘information’ could have a significant impact on the scientific community.

Benjamin Jantzen (Virginia Tech): Information and dynamics: competing metaphors for biological intelligence
It is fashionable to consider organism behavior---the variable response of a living thing in a dynamic environment---in terms of information. Organisms are said to harvest or take in information from the environment that is then processed to generate a behavioral response. This informational perspective in turn supports a number of theses concerning behavior, such as the idea that large brains are needed for "complex" behavioral repertoires, or that adaptive behavior involves the storage of states reflecting bits of information about the world external to the organism. There are reasons to be suspicious of these and other consequences of informational thinking, and thus of the underlying metaphor. Here, I will focus just on the connection between internal and external states in producing adaptive behavior.
There is a long tradition of describing organisms as information processors, and their internal states as engaged in storage of information and its transformation by computation. Broadly speaking, this perspective divides behavior along the boundary of the organism. Information enters by way of sensation, is transformed by computation, and exits in the form of behaviors, whether physiological or mechanical. Such a view demands that enough information be passed in from sensory systems at a high enough rate and be transformed quickly enough via computation to produce responsive, adaptive behavior. It thus puts constraints on the relation between the world outside and the world inside an organism. Rate-distortion theory, for instance, tells us, when given some measure of performance as a function of internal and external states, how many bits of mutual information between internal states and the environment are needed to optimize performance. It is presumed that this mutual information between external and internal states is like matter or energy that the organism must then harvest to survive.
However, I argue that two classes of system for realizing adaptive behavior---reservoir computers and systems exhibiting "strong anticipation"---put pressure on this view, or at least on the utility of information talk as anything more than metaphorical. In both cases, it seems that a straightforward application of information-theoretic approaches yields contradictory judgments on the amount of information the system must harvest from its environment, and that it is probably best to view these cases in dynamical rather than information-theoretic terms. This in turn suggests ways of improving our quantitative study and application of such systems by selecting more appropriate metrics of their behavioral capacities.
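The rate-distortion claim in the abstract above can be stated in standard Shannon-theoretic terms. These are the textbook definitions, not the speaker's own notation; here Y stands for environmental states and X (or the reconstruction variable with a hat) for the organism's internal states.

```latex
% Mutual information between internal states X and environmental states Y
I(X;Y) \;=\; \sum_{x,y} p(x,y)\,\log_2 \frac{p(x,y)}{p(x)\,p(y)}
       \;=\; H(Y) - H(Y \mid X)

% Rate-distortion function: the minimum mutual information compatible with
% keeping the expected distortion d between the environment Y and its
% internal representation \hat{Y} below a tolerated level D
R(D) \;=\; \min_{p(\hat{y}\mid y)\,:\;\mathbb{E}[d(Y,\hat{Y})]\le D} I(Y;\hat{Y})
```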

Discussion Groups

(A) Tim Riedel and Annica Vieser: It from Bit
(B) Oxana Shaya: What are resources in quantum computing?
(C) Sophie Atzpodien: Entropy and Information

Student Talks

Piet Fritz Pankratz: A Non-Factivist Approach to the Epistemic Aims of Evolutionary Biology
My talk explores how Potochnik’s (2017, 2020) non-factivist account of scientific understanding can inform the epistemic aims of evolutionary biology. I argue that idealized models are not merely pragmatic tools but essential for scientific inquiry in evolutionary theory, as they isolate key causal mechanisms. This clarifies how evolutionary biology prioritizes understanding over truth.

Arnulf Kung: The Definitions of Entropy in Statistical Mechanics
Statistical mechanics is the theory that links the behavior of physical systems at the macroscopic level with that of their microscopic constituents, and it has been an indispensable part of physics for a good hundred years. It is all the more astonishing that, to this day, two different definitions exist for one of its central quantities, the entropy. The talk will present these definitions and explain their significance for our understanding of the second law of thermodynamics.
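The abstract does not name the two definitions. In the statistical-mechanics literature, the contrast is standardly drawn between the Boltzmann entropy of an individual system's macrostate and the Gibbs entropy of an ensemble distribution; a sketch in conventional notation (not necessarily the speaker's):

```latex
% Boltzmann entropy: assigned to an individual microstate X via the
% phase-space volume of the macrostate M(X) to which it belongs
S_B(X) \;=\; k_B \,\ln \bigl|\Gamma_{M(X)}\bigr|

% Gibbs entropy: assigned to a probability distribution \rho over phase space
S_G[\rho] \;=\; -\,k_B \int \rho(x)\,\ln \rho(x)\,\mathrm{d}x
```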

Rebecca Spörl: Real Patterns and Algorithmic Information Theory
In recent years, there has been a renaissance of the ‘Real Patterns’ approach to the question of the ontological commitments of the special sciences. Originating in Daniel Dennett’s 1991 paper on the ontology of beliefs, Real Patterns have found their way into many areas of the philosophy of science, mainly because they establish a close link between regularities and compression. Real Patterns allow for a compelling account of how regularities structure physical phenomena while offering a high degree of explanatory utility and predictive power. However, since Dennett’s criterion for a pattern’s reality lies solely in its explanatory success, Real Patterns occupy an uneasy position between realism and instrumentalism. By understanding patterns as algorithmic compressions of information, I propose a more realist reading of Real Patterns. Using the methodology of algorithmic information theory, I will illustrate how Real Patterns can establish a relationship between different levels of ontology.
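The compression idea invoked here can be made precise via the central notion of algorithmic information theory, Kolmogorov complexity. The following is the standard definition; the talk's own formal apparatus may differ:

```latex
% Kolmogorov complexity of a string x relative to a universal machine U:
% the length of the shortest program p that makes U output x
K(x) \;=\; \min \{\, |p| \;:\; U(p) = x \,\}

% On the compression reading, a data set x exhibits a real pattern when it
% admits a description much shorter than itself:
K(x) \;\ll\; |x|
```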

Finn Kuhl: Culture in the Quantum-Theoretical Debates on Pragmatism and Ontology