Workshop: Formalization in Philosophy II – University of Copenhagen


Workshop: Formalization in Philosophy II

The second workshop on Formalization in Philosophy brings together researchers applying logical and mathematical tools to philosophical problems, to discuss the methodological, technical and conceptual issues that are common to such areas of inquiry. What is the status of idealization when applying formal methods, and what are acceptable standards of idealization? How do we choose the most appropriate formalism to model a given problem, and what does it mean for a formalization to adequately capture the phenomena it is intended to describe? How do we keep track of the assumptions that are implicit in a given formalism, and how may such assumptions be motivated? The purpose of the workshop is to take a step back and discuss these issues, which are relevant to all areas of philosophy where formal methods are used, such as epistemology, philosophy of language, metaphysics or philosophy of mathematics.

The workshop on Formalization in Philosophy is a joint venture between Lund University and the University of Copenhagen.

Organizers:

Sebastian Engvist, University of Lund, Sweden
Vincent F. Hendricks, University of Copenhagen, Denmark
Henrik A. Boensvang, University of Copenhagen

Invited speakers, their affiliations, titles and abstracts:

Patrick Blackburn, Roskilde University, Denmark

Title:

"Meaning, interaction and natural language"

Abstract:

Understanding how natural language works, and what meaning is, has been an (elusive) goal since ancient times, but the work of such figures as Turing, Chomsky, and Montague, and the rise of computational linguistics, has brought deeper understanding. In the first part of this talk, I will attempt an overview of this line of inquiry. In the second, I will focus on where I feel the new challenge lies: moving from static meanings to the dynamics of interaction, or (to put it in linguistic terminology) to indicate what is involved when we move beyond semantics to pragmatics and attempt to model conversational interactions.

Sven Ove Hansson, Royal Institute of Technology, Sweden

Title:

"Belief change with direct choices"

Abstract:

A framework for belief change is introduced in which the selection mechanism operates directly on potential outcomes rather than on possible worlds or maximal consistent subsets of the original belief set. A general form of belief revision (descriptor revision) is introduced that revises by generalized success conditions; it includes revision and contraction as special cases. A generalization of epistemic entrenchment, epistemic accessibility, is defined that applies to belief change in general, not only to contraction; it has standard entrenchment (for contraction) as a special case. Furthermore, Ramsey test conditionals can be introduced into this system, since the impossibility theorem does not apply.
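In the notation standard in the descriptor revision literature, the special cases mentioned above can be sketched roughly as follows (a brief orientation only; the precise postulates are developed in the talk). Writing $\mathfrak{B}\varphi$ for the success condition "believing $\varphi$", descriptor revision $K \circ \Psi$ selects an outcome satisfying the descriptor $\Psi$ directly from the set of potential outcomes, and then

$K * \varphi = K \circ \mathfrak{B}\varphi$ (sentential revision by $\varphi$),
$K \div \varphi = K \circ \neg\mathfrak{B}\varphi$ (contraction by $\varphi$)

fall out as special cases of the same operation.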

Wlodek Rabinowicz, Lund University, Sweden

Title:

"Transferring Formal Models: From Values to Probabilities"

Abstract:

Setting off from the Fitting-Attitudes analysis of value, I will briefly recapitulate my account of value relations, which makes room for both incommensurability and vagueness in value comparisons. I will then suggest that an analogous fitting-attitude approach and an analogous formal modeling can be used to account for incommensurability and vagueness in probability comparisons. On this form of analysis, to be probable is to be credible: a proposition (or an event) A is probable if it is fitting, given the evidence, to give high credence to A. This 'epistemic' interpretation of probability has a long tradition that culminates with Keynes's A Treatise on Probability (1921). The modeling I use might be seen as an elucidation of Keynes's ideas about incommensurable probabilities.
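As a rough gloss on the kind of modeling involved (my illustration, not the speaker's exact formulation): let $\mathcal{K}$ be the class of permissible preference orderings, or, in the probabilistic analogue, the class of permissible credence functions. Then, for instance, $A$ is better than $B$ iff $A$ is preferred to $B$ in every ordering in $\mathcal{K}$; $A$ and $B$ are incommensurable if some permissible orderings rank $A$ above $B$ and others rank $B$ above $A$; and, analogously, $A$ is more probable than $B$ iff $c(A) > c(B)$ for every permissible credence function $c$, with incommensurable probability comparisons arising when the permissible credence functions disagree.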

Thomas Bolander, Technical University of Denmark, Denmark

Title

"From Human Brains to Computer Programs: Formalisation and Idealisation in Artificial Intelligence"

Abstract:

According to the father of artificial intelligence (AI), John McCarthy, "AI is the science and engineering of making intelligent machines". Towards realising this goal, AI researchers such as myself take substantial inspiration from human reasoning. We A) informally study how humans solve a particular class of problems, and B) try to mimic this in a computer. The intermediate stage between A and B is a mathematical, formal model of the reasoning taking place. It could be a sub-symbolic model, e.g. a neural network, or a symbolic model, e.g. a logic-based automated planner. A natural question is to what extent these mathematical, formal models can be seen as formalisations or idealisations of cognitive processes in human brains. Another question is: should AI researchers even care, if the goal is to build "intelligent machines", not to understand the human brain or human reasoning? In my talk, I will try to address these questions, taking my point of departure in, and relating to, my own research on using dynamic epistemic logics and automated planning to construct AI systems with social reasoning capabilities.
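As a toy illustration of the symbolic end of this spectrum (a hypothetical sketch in Python, not the speaker's actual systems), here is a finite epistemic model together with a public-announcement update, the simplest kind of model change studied in dynamic epistemic logic:

# Toy epistemic model: worlds, per-agent indistinguishability, valuation.
# A public announcement restricts the model to the worlds where the announced
# fact holds -- the simplest update studied in dynamic epistemic logic.
# (Illustrative sketch only; the names and structure are hypothetical.)

worlds = {"w1": {"p": True,  "q": True},
          "w2": {"p": True,  "q": False},
          "w3": {"p": False, "q": True}}

# Agent 'a' cannot tell w1 from w2; agent 'b' cannot tell w1 from w3.
indist = {"a": {("w1", "w2"), ("w2", "w1")},
          "b": {("w1", "w3"), ("w3", "w1")}}

def knows(agent, atom, world, worlds, indist):
    """Agent knows `atom` at `world` iff it holds in every world the
    agent cannot distinguish from `world` (including `world` itself)."""
    accessible = {world} | {v for (u, v) in indist[agent] if u == world}
    return all(worlds[v][atom] for v in accessible)

def announce(atom, worlds, indist):
    """Public announcement of `atom`: keep only the worlds where it is true."""
    kept = {w: val for w, val in worlds.items() if val[atom]}
    new_indist = {ag: {(u, v) for (u, v) in pairs if u in kept and v in kept}
                  for ag, pairs in indist.items()}
    return kept, new_indist

print(knows("a", "q", "w1", worlds, indist))      # False: a still considers w2 possible
worlds2, indist2 = announce("q", worlds, indist)  # publicly announce q
print(knows("a", "q", "w1", worlds2, indist2))    # True: w2 has been eliminated

Before the announcement, agent a cannot rule out the q-less world w2; announcing q eliminates that world, and so a comes to know q.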

Sara Negri, University of Helsinki, Finland

Title:

"Formal counterfactual reasoning"

Abstract:

Counterfactual conditionals escape the usual laws of logic and thus formal systems that represent counterfactual reasoning cannot be obtained as extensions of traditional logical systems. Even the powerful relational semantics, successful on a host of philosophical logics, is not sufficiently expressive to handle them.

Complete proof systems for counterfactual reasoning are here formulated on the basis of a topological semantics inspired by Lewis' sphere semantics.

The approach is a further development of sequent systems extended by labels for possible worlds.

The goal is to obtain with minimal semantical assumptions analytic calculi for counterfactual reasoning that obey the strict methodological requirements of traditional Gentzen-style proof systems.
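For orientation, the Lewisian truth condition that such labelled calculi make explicit can be stated as follows (standard sphere semantics, of which the topological semantics mentioned above is a development): $w \Vdash A \,\Box\!\!\rightarrow\, B$ holds iff either no sphere in the system $S(w)$ contains an $A$-world, or there is a sphere $\alpha \in S(w)$ such that $\alpha \cap [A] \neq \emptyset$ and $\alpha \cap [A] \subseteq [B]$, i.e. some sphere around $w$ contains an $A$-world and all the $A$-worlds in that sphere are $B$-worlds.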

Justine Jacot, Lund University, Sweden

Title:

"Independence-friendly languages for signaling games"

Abstract:

Signaling games as developed by David Lewis represent coordination problems in which a sender observes a state of the world and sends a message (signal) to a receiver, who can only observe the message but not the state. Each of them acts according to a contingency plan chosen in relation to expectations about the other player's choices. Communication is successful if they reach a coordination equilibrium in which the agents' strategies lead to sending the right signal. The phenomenon of signaling in independence-friendly languages has been studied widely by Jaakko Hintikka and Gabriel Sandu through game-theoretic semantics, where Abelard (the player for universal quantifiers and conjunctions in game-theoretic terms) stands for the state of nature, and Eloise (the player for existential quantifiers and disjunctions) stands for a team of a sender and a receiver.

Brian Skyrms developed variants of signaling games in order to account for the emergence of inference in signaling situations. In these games, there are four states, two senders, each of whom has only two signals, and one receiver who has to determine the state of the world given the partial messages she receives. Given that a single message will be sent to signal one state or another, Skyrms thinks that this kind of signaling system provides an example of the emergence of the meaning of 'proto-truth functions', and in turn of logical connectives (conjunction and disjunction).
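A minimal sketch of such a two-sender signaling system (the encoding below is purely illustrative and is not taken from Skyrms or the speaker):

# Toy version of Skyrms's two-sender game: four states, two senders who each
# observe the state and send one of two signals, and a receiver who must
# recover the state from the pair of signals. The strategies below form one
# possible signaling-system equilibrium; the encoding is purely illustrative.

import random

STATES = ["s1", "s2", "s3", "s4"]

# Each sender partitions the four states into two cells and signals the cell.
sender1 = {"s1": 0, "s2": 0, "s3": 1, "s4": 1}   # signals which half the state is in
sender2 = {"s1": 0, "s2": 1, "s3": 0, "s4": 1}   # signals which state within that half

# The receiver decodes the pair of partial messages back to a unique state.
receiver = {(0, 0): "s1", (0, 1): "s2", (1, 0): "s3", (1, 1): "s4"}

def play_round():
    state = random.choice(STATES)
    msg = (sender1[state], sender2[state])
    return receiver[msg] == state            # successful coordination?

print(all(play_round() for _ in range(1000)))  # True: perfect coordination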

The aim of this talk is to explore the possibility of capturing the emergence of these 'proto-truth functions' in terms of independence-friendly languages.

Peter Pagin, Stockholm University, Sweden

Title:

"Semantic theory and linguistic processing"

Abstract:

A semantic theory should have the potential of contributing to the explanation of success in linguistic communication. What does this desideratum require of the relation between the semantic theory on the one hand and cognitive states and processes on the other? In particular, what does it require with respect to meaning operations, which semantically correspond to grammatical constructions? I shall explore the question by way of a theory of higher-order dispositions as the cognitive counterparts to meaning operations.
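To recall what a meaning operation is in the standard compositional picture (a generic textbook illustration, not Pagin's own proposal): the meaning of a complex expression is computed from the meanings of its parts by an operation corresponding to the grammatical construction, e.g. functional application for subject-predicate combination.

# A meaning operation in the compositional picture: the meaning of a complex
# expression is computed from the meanings of its parts by an operation that
# corresponds to the grammatical construction -- here, functional application.
# (Generic illustration; the lexicon and denotations are made up.)

SLEEPERS = {"Mary"}

lexicon = {
    "Mary":   "Mary",                      # a name denotes an individual
    "sleeps": lambda x: x in SLEEPERS,     # a predicate denotes a function
}

def apply_meaning(function_meaning, argument_meaning):
    """Meaning operation for subject-predicate combination:
    apply the predicate's function to the subject's denotation."""
    return function_meaning(argument_meaning)

# "Mary sleeps": combine the two word meanings by functional application.
print(apply_meaning(lexicon["sleeps"], lexicon["Mary"]))   # True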

Stig Andur Pedersen, Roskilde University, Denmark

Title:

"The Role of Ideal Elements in Mathematics"

Abstract:

The physicist Richard Feynman distinguishes between two styles of mathematical reasoning. The Babylonian style is characterized by knowledge of a large number of examples and of some theorems that are central for handling similar problems. The Euclidean style is the modern axiomatic style, where formal demonstrations within a definite framework are essential. Feynman argues that physicists require the Babylonian style and that the Euclidean one may impede progress in physics. We will argue that Feynman's view is to some extent right, both with respect to applications in physics and to other empirical fields. What is less well known is that David Hilbert held a similar view, as can be seen from his work on mathematical physics. According to Hilbert, both styles play a pivotal role in mathematics. We shall argue that they are held together by the construction of ideal elements (idealen Gebilden).

Lars Birkedal, Aarhus University, Denmark

Title:

"Formalized Models and Logics for Reasoning about Programs"

Abstract:

I will give an overview of my group's work on a formalization of higher-order separation logic for an object-oriented programming language in the Coq theorem prover. In particular, I will explain how we make crucial use of the higher-order features of Coq to specify and reason about challenging programs. Moreover, I will outline how we make use of Coq tactics to simplify reasoning in the embedded logic.

Finally, I will discuss some of the assumptions that are underlying our formalization and reflect upon these.
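For readers unfamiliar with separation logic, the flavour of the specifications involved can be conveyed by the standard points-to assertion and the frame rule (textbook separation logic, not the group's Coq development itself): the triple $\{x \mapsto -\}\ [x] := v\ \{x \mapsto v\}$ says that assigning $v$ to the heap cell at $x$ turns ownership of that cell with arbitrary contents into ownership of it with contents $v$, while the frame rule allows any specification $\{P\}\ c\ \{Q\}$ to be extended to $\{P * R\}\ c\ \{Q * R\}$ for a disjoint portion of the heap described by $R$ (provided $c$ does not modify variables free in $R$). Higher-order separation logic additionally allows quantification over such assertions, which is what the Coq formalization exploits.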

Carsten Schürmann, IT University of Copenhagen, Denmark

Title:

"Epistemic Proofs 'R Us"

Abstract:

Security, voting, communication, and cryptographic protocols share some of the same properties: they are mathematical in nature, they are complex and difficult to implement, and society at large depends on them. Specifying and proving protocols correct is an increasingly important socio-technical challenge. Epistemic logic provides a common language that can bridge the two cultures.

In my talk, I will show how we use epistemic logic to express voting schemes and their operational meaning, and I will conjecture how to reason about their properties using hybrid logic.
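As an indication of the kind of property epistemic logic lets one state (a generic example, not drawn from the DemTech formalization): ballot secrecy can be rendered as the requirement that no observer comes to know how a given voter voted, e.g. $\neg K_{obs}\, voted(v, c)$ for every candidate $c$, while individual verifiability asks that the voter herself knows that her ballot was counted, $K_v\, counted(b_v)$.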

This work is part of the DemTech research project, sponsored by Det Strategiske Forskningsrådet under grant CCR-0325808.

Paula Quinon, Lund University, Sweden

Title:

"Is the Church-Turing Thesis a Carnapian Explication?"

Abstract:

In 1950 Carnap denied that semi-formal axiomatic systems constituted explications. He exemplified his argument with Peano arithmetic, which is sometimes taken to be an explication of such arithmetical concepts as "one", "two" or "plus". But according to Carnap, the lack of an explicit interpretation of the primitive terms used in the formulation of the axioms prevents the Peano axioms from fulfilling this demand. In this talk I disagree with Carnap. I claim that some semi-formal axiomatic systems actually do explicate. As my example I consider the concept of recursive functions used to formalise the Church-Turing thesis concerning computable functions.
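For orientation, the recursive functions are generated from basic functions (zero, successor and projections) by composition, primitive recursion and minimization; the primitive recursion schema, with addition as the stock example, runs: $f(\vec{x}, 0) = g(\vec{x})$ and $f(\vec{x}, y+1) = h(\vec{x}, y, f(\vec{x}, y))$, so that e.g. $x + 0 = x$ and $x + (y+1) = (x + y) + 1$. The Church-Turing thesis then identifies the informally computable functions with the (total) recursive functions.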

Staffan Angere, Lund University, Sweden

Title:

"When is the proper time to publish?"

Abstract:

There is more and more pressure on scientists to publish often, and publish early. I will investigate this practice and its consequences from a social-epistemological point of view, using the simulation environment Laputa.

Valentin Goranko, Technical University of Denmark, Denmark

Title:

"Etudes on formalizing knowledge and  ignorance"

Abstract:

I will discuss some classical and some more recent ideas on formalizing the concepts of knowledge and ignorance. In particular, I will focus on modeling evolving knowledge over time and through events. Then I will discuss the notion of ignorance, both of a single agent and in a multi-agent environment. I will propose some approaches and present some challenges in adequately capturing and formalizing the relationship between knowledge and ignorance.
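One classical rendering of the target notions, included here only for orientation (the approaches proposed in the talk may of course differ): in epistemic logic, knowledge of $\varphi$ by agent $a$ is expressed as $K_a\varphi$, and ignorance about $\varphi$ is standardly taken to be not knowing whether $\varphi$, i.e. $\neg K_a\varphi \wedge \neg K_a\neg\varphi$.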

Erik Olsson, Lund University, Sweden

Title:

"Gettier and Explication: A 60 Year Old Solution to a 50 Year Old Problem"

Abstract:

I challenge a cornerstone of the Gettier debate: that a proposed analysis of the concept of knowledge is inadequate unless it entails that people don't know in Gettier cases. I do so from the perspective of Carnap's methodology of explication. It turns out that the Gettier problem per se is not a fatal problem for any account of knowledge, thus understood. It all depends on how the account fares regarding other putative counterexamples and the further Carnapian desiderata of exactness, fruitfulness and simplicity. Carnap proposed his methodology more than a decade before Gettier's seminal paper appeared, making the present solution to the problem a candidate for being the least ad hoc proposal on the market, i.e. one whose independent standing cannot be questioned. As an illustration of the method at work, I reconstruct reliabilism as an attempt to provide an explication of the (externalist) concept of knowledge.