The Mathematical Foundations group at the Department of Computer Science, University of Bath, will host a 2.5-day workshop on structural proof theory, starting on the afternoon of 14 December and preceded by a short course on deep inference on the morning of the same day.
The workshop will focus on various aspects of structural proof theory, including but not limited to the following topics:
There is no fee or formal registration for the workshop, and anyone is welcome to attend. However, at this stage, we ask that anyone who intends to attend inform us as soon as possible (the deadline for expressions of interest was 20 November) so that we may plan coffee breaks and social activities accordingly.
All enquiries should be made to firstname.lastname@example.org.
The current list of participants is available.
Room 1W 2.103, University of Bath.
(Due to the high quality and number of contributions received by the committee, we have decided to replace the previously advertised course on deep inference with an abridged version on the morning preceding the workshop.)
Deep inference is a modern proof theory offering a better understanding of proofs and extending the range of applications of traditional Gentzen proof theory. This course will offer a brief introduction to deep inference.
|Monday, December 14|
|10:30 -- 11:00||Welcome coffee and cookies|
|11:00 -- 13:00||Andrea Aler Tubella, Paola Bruscoli, Alessio Guglielmi, Benjamin Ralph (Bath). Deep Inference -- Normalisation and Complexity.|
|13:00 -- 14:20||Lunch|
|14:20 -- 15:10||Session 1:||Norbert Preining (JAIST). Proof theory of projective geometry -- Orevkov's speed up result in strange surroundings. Invited talk|
|15:10 -- 15:30||Coffee Break|
|15:30 -- 17:00||Session 2:||Arnold Beckmann (Swansea). Hyper natural deduction. Invited talk|
|Sonia Marin (INRIA and LIX). Focused and Synthetic Nested Sequents.|
|Tuesday, December 15|
|9:00 -- 10:30||Session 3:||Roy Dyckhoff (St Andrews). Coherentisation of first-order logic. Invited talk|
|Marco Volpe (INRIA and LIX). Focused Proof Systems for Modal Logic.|
|10:30 -- 10:50||Coffee Break|
|10:50 -- 12:20||Session 4:||Alessio Guglielmi (Bath). Designing a Proof System Around a Normalisation Procedure. Invited talk|
|Andrea Aler Tubella (Bath). A generalised cut-elimination procedure through subatomic logic.|
|12:20 -- 13:40||Lunch|
|13:40 -- 15:10||Session 5:||Michel Parigot (CNRS and PPS). <TBA>. Invited talk|
|Ben Ralph (Bath). A natural cut-elimination procedure for classical first-order logic.|
|15:10 -- 15:30||Coffee Break|
|15:30 -- 17:00||Session 6:||Simona Ronchi della Rocca (Torino). Intersection types and implicit computational complexity. Invited talk|
|Fanny He (Bath). Towards an atomic lambda-mu-calculus.|
Social dinner (served at 19:30)
The Chequers of Bath, 50 Rivers St, Bath BA1 2QA
|Wednesday, December 16|
|9:00 -- 10:30||Session 7:||Stefano Berardi (Torino). Game Semantics and the Complexity of Interaction (about a result of Federico Aschieri). Invited talk|
|Thomas Powell (Innsbruck). Variations on learning: relating the epsilon calculus to proof interpretations.|
|10:30 -- 10:50||Coffee Break|
|10:50 -- 12:20||Session 8:||Marc Bagnol (Ottawa). Complexity of MALL proofnets and binary decision trees.|
|Luca Roversi (Torino). A class of recursive reversible functions. Invited talk|
|12:20 -- 13:40||Lunch|
|13:40 -- 15:10||Session 9:||Tom Gundersen (Red Hat). <TBA>. Invited talk|
|Björn Lellmann (Vienna). Linear Nested Sequents.|
|15:10 -- 15:30||Coffee Break|
|15:30 -- 17:00||Session 10:||Taus Brock-Nannestad (INRIA and Copenhagen). Reconciling two notions of cut-elimination.|
|Georg Moser (Innsbruck). The Epsilon Theorems: Simple Things Made Simple. Invited talk|
Andrea Aler Tubella, Paola Bruscoli, Alessio Guglielmi, Ben Ralph (Bath)
Deep Inference -- Normalisation and Complexity (two-hour mini course)
This is a course for experts in Gentzen proof theory who want to quickly get the basics of deep inference. We know that in Gentzen theory the normalisation options are limited: for example, cut elimination for classical logic is not confluent. Moreover, the complexity of analytic proofs can be high: for example, Statman tautologies need exponential-size analytic proofs. We argue that these problems are due to unnecessary restrictions on proof composition.
Deep inference is nothing other than allowing proofs to be composed by the same connectives used to compose formulae. This simple stipulation has many benefits. In particular, it becomes extremely simple to obtain confluent cut elimination for classical logic and an exponential speed-up over analytic proofs in the sequent calculus; for example, analytic proofs of Statman tautologies only need polynomial size.
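As a minimal illustration (these are the standard switch and medial rules of the deep-inference system SKS for classical propositional logic, added here for readers new to the subject), inference rules in deep inference rewrite formulae at arbitrary depth:

```latex
% Switch (s) and medial (m), two core rules of system SKS.
% Premiss above the line, conclusion below; both rules may be
% applied inside any formula context, i.e. at any depth.
\[
  \mathsf{s}\colon\;
  \frac{A \wedge (B \vee C)}{(A \wedge B) \vee C}
  \qquad\qquad
  \mathsf{m}\colon\;
  \frac{(A \wedge B) \vee (C \wedge D)}{(A \vee C) \wedge (B \vee D)}
\]
```

Medial, in particular, is what allows contraction to be reduced to its atomic form, a typical instance of the locality discussed below.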
More generally, deep inference allows us to break proofs into elementary components of bounded size, a property that we call locality. This way, we can use normalisation methods based on topology. We also obtain much finer control over the complexity of proofs, because the various compression mechanisms, such as cut and sharing, can be separated and dealt with independently.
The aim of this course is to introduce these concepts based on six ideas illustrated by examples, as follows:
1. Free composition of proofs leads to locality (AG) Notes for 1, 3
2. Cut elimination by splitting and decomposition (AAT)
3. Locality leads to topological control of normalisation (AG)
4. Confluent cut elimination based on experiments (BR)
5. Exponential speed-up over Gentzen analytic proofs (PB) Slides for 5, 6
6. Cut and sharing in deep inference vs. the sequent calculus (PB)
Andrea Aler Tubella (Bath)
A generalised cut-elimination procedure through subatomic logic
Through subatomic logic we are able to present sufficient conditions for a proof system to enjoy cut-elimination. In this talk I will present subatomic logic, how it enables us to present proof systems that have a single (linear) rule scheme, and a recent result: the splitting procedure for cut-elimination can be generalised to any proof system whose rules and connectives have certain properties.
Marc Bagnol (Ottawa)
Complexity of MALL proofnets and binary decision trees
We discuss the complexity of proofnets for the multiplicative-additive fragment of linear logic without units (MALL-). Recent results have shown that even a simple fragment such as MLL (multiplicative linear logic with units) cannot have a tractable notion of proofnet, because its equivalence problem is PSpace-complete. In MALL- the situation turns out to be different, with a low-complexity equivalence problem. Still, a correspondence between MALL- proofs and binary decision trees allows for (partial) negative results in this fragment too.
Arnold Beckmann (Swansea)
Hyper Natural Deduction (joint work with Norbert Preining – JAIST)
We introduce Hyper Natural Deduction as an extension of Gentzen's Natural Deduction system by communication-like rules. The motivation is to obtain a natural-deduction-style system which is equivalent to Avron's Hypersequent Calculus for Gödel-Dummett Logic, and which permits natural conversion rules for normalisation as an extension of the well-known conversion rules for Natural Deduction. The ultimate goal of this project is to establish a Curry-Howard correspondence with some (yet to be found) model of parallel computation. An additional aim is to shed further light on Avron's suggestion [in A. Avron: Hypersequents, Logical Consequence and Intermediate Logics for Concurrency. Annals of Mathematics and Artificial Intelligence, 4 (1991), 225-248] that a hypersequent can be thought of as a multiprocess.
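For readers unfamiliar with the framework, a hypersequent (in one standard notation; this reminder is ours, not taken from the abstract) is simply a finite multiset of ordinary sequents, read disjunctively:

```latex
% A hypersequent: components Gamma_i => Delta_i separated by '|',
% interpreted as "some component sequent is valid".
\[
  \Gamma_1 \Rightarrow \Delta_1 \;\mid\; \Gamma_2 \Rightarrow \Delta_2
  \;\mid\; \cdots \;\mid\; \Gamma_n \Rightarrow \Delta_n
\]
```

Avron's communication rule lets two components exchange material, which is what gives the calculus its multiprocess flavour and motivates the communication-like rules above.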
(Supported by a Royal Society Daiwa Anglo-Japanese Foundation International Exchanges Award No. IE130639)
Stefano Berardi (Torino)
Game Semantics and the Complexity of Interaction (about a result of Federico Aschieri)
We present a new abstract complexity result, obtained by F. Aschieri, about Coquand and Hyland-Ong game semantics, which will lead to new bounds on the length of first-order cut-elimination, normalization, interaction between expansion trees, and any other dialogical process that game semantics can model and apply to. In particular, Aschieri provides a novel method to bound the length of interactions between visible strategies and to measure precisely the tower of exponentials defining the worst-case complexity. This study improves the old estimates by possibly several exponentials and is based on the Berardi-de'Liguoro notion of level of backtracking.
Taus Brock-Nannestad (INRIA)
Reconciling two notions of cut elimination (joint work with Nicolas Guenot – ITU Copenhagen)
The admissibility of the cut principle, often called "cut elimination" when it refers to the elimination of all instances of the cut rule, is the central result of proof theory. There is however a distinction to be made between two forms that this result can take, depending on the choice of either the provability viewpoint or the structural viewpoint. We propose a thorough study of the relation between these two ways of assigning a "meaning" to cut elimination.
Roy Dyckhoff (St Andrews)
Coherentisation of first-order logic (joint work with Sara Negri – Helsinki)
In a famous paper of 1920, Skolem showed (as an improvement on a technique of Löwenheim) that from every first-order sentence one can construct a co-satisfiable ∀∃-sentence; co-satisfiability can easily be improved to "forming a conservative extension". The same paper illustrated his approach to lattice theory, where the simple form (i.e. the coherence, aka geometricity) of the theory's axioms allows a simple form of reasoning using rules rather than axioms. Much later, it was realised that his result can be modified to show that every first-order theory has (by use of new relation symbols) a coherent conservative extension, and that the lattice-theory techniques are examples of the replacement of axioms by rules, as in work of modern authors such as Bezem, Coquand, Lombardi, Negri and Simpson. The efficiency and naturality of this method will be illustrated and discussed. Our novel contribution is an algorithm that converts any first-order axiom into a finite set of coherent axioms, using a technique that avoids any major pre-processing such as PNF or NNF and is idempotent, i.e. it leaves axioms that are [almost] of the right form [almost] unchanged.
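For concreteness (this is our sketch of the standard definition, not taken from the abstract), a coherent, or geometric, axiom is a universally closed implication from a conjunction of atoms to a disjunction of existentially quantified conjunctions of atoms:

```latex
% General shape of a coherent (geometric) axiom:
% the A_i are atomic formulas, and each C_j is a
% conjunction of atoms possibly containing the variables y_j.
\[
  \forall \bar{x}\,
  \bigl( A_1 \wedge \cdots \wedge A_m
  \;\rightarrow\;
  \exists \bar{y}_1\, C_1 \vee \cdots \vee \exists \bar{y}_n\, C_n \bigr)
\]
```

An axiom of this shape can be used directly as an inference rule in a sequent calculus, which is the replacement of axioms by rules mentioned in the abstract.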
Alessio Guglielmi (Bath)
Designing a Proof System Around a Normalisation Procedure
I will show a proof system, called KV, which is a multiplicative linear logic with a self-dual non-commutative connective 'seq' and a self-dual modality 'star' providing contraction over seq. Given these characteristics, KV could form the core of a proof system for a Kleene algebra. KV can also be seen as an extension of BV with the star modality.
My interest in KV stems from its design principles. Usually, we give a computational interpretation to the normalisation procedures of existing proof systems. Contrary to this tradition, my starting point has been a certain computational behaviour, and KV arises as a way to comply with that behaviour. More specifically, the design constraints of KV consist in respecting the separation of three compression/composition mechanisms: 1) sharing/contraction, 2) reuse/cut, 3) modularity/substitution. These three mechanisms are subject to three independent normalisation procedures, respectively: 1) decomposition, 2) splitting, 3) 'Formalism B' normalisation.
This work helps to clarify the three techniques mentioned, which are still under development, although decomposition and splitting have been in use for almost fifteen years. Nonetheless, both decomposition and splitting receive a substantial simplification in KV, especially compared to recent related work. One remarkable phenomenon is that, under the natural and logic-independent constraints that I imposed, system KV seems to be canonical, in the sense that there is no room for manoeuvre in the design of the rules.
Tom Gundersen (Red Hat)
Strong normalisation in the typed atomic lambda calculus
Fanny He (Bath)
Towards an atomic lambda-mu-calculus
We develop the atomic lambda-mu-calculus, a lambda-mu-calculus with explicit sharing and atomicity, refining the classical lambda-mu-calculus and implementing smaller steps of reduction on individual constructors. Our aim is to study explicit sharing in a calculus with classical operators using the concept of atomicity, which comes from deep inference.
Björn Lellmann (Vienna)
Linear nested sequents
A major objective of modern proof theory for modal logics is the development of extensions of the sequent framework which permit satisfactory internal calculi for the standard modal logics. In particular, one of the main desiderata for such calculi is the presence of separate left and right introduction rules for the modalities. This objective has been met, for example, in the framework of nested sequents or tree-hypersequents, where sequents are replaced with trees of sequents. However, from a conceptual point of view it is also interesting to develop the simplest extension of the sequent framework in which such calculi can be given. This gives rise to the question of whether the tree structure of nested sequents resp. tree-hypersequents can be restricted to a simpler structure while retaining the advantages of the framework. Inspired by Masini’s work on 2-sequents from the early nineties, recent work introduced the framework of linear nested sequents, where the tree structure of nested sequents is restricted to that of a line. This framework turns out to be reasonably strong: it permits satisfactory formulations of calculi for a number of standard normal modal logics (via linearisation of the standard nested sequent calculi for these logics) and for propositional and first-order intuitionistic logic (via linearisation of the calculi), but also for the standard non-normal (classical) modal logics and for a number of weak conditional logics. The completeness proofs make use of a connection between the structure of linear nested sequents and that of histories in a backwards proof-search procedure in standard sequent calculi. Since linear nested sequents are special instances of standard nested sequents, these results moreover yield quick completeness proofs for the corresponding full nested sequent calculi. In this talk I am going to discuss the framework of linear nested sequents and some of these results.
Sonia Marin (INRIA & LIX)
Focused and synthetic nested sequents (joint work with Kaustuv Chaudhuri and Lutz Straßburger)
Focusing is a general technique for transforming a sequent proof system into one with a syntactic separation of non-deterministic choices without sacrificing completeness. This not only improves proof search using sequent calculi, but also has the representational benefit of distilling sequent proofs into synthetic normal forms. We show how to apply the focusing technique to nested sequent calculi, a generalization of ordinary sequent calculi to tree-like instead of list-like structures. We thus improve the reach of focusing to the most commonly studied modal logics, the logics of the modal S5 cube. Among our key contributions is a focused cut-elimination theorem for focused nested sequents.
Georg Moser (Innsbruck)
The Epsilon Theorems: Simple Things Made Simple
Thomas Powell (Innsbruck)
Variations on learning: relating the epsilon calculus to proof interpretations
Like many computational interpretations of classical logic, the epsilon calculus has its roots in Hilbert’s program, its importance originally lying in its foundational role as a means of proving consistency or characterising provably recursive functions. Modern proof theory utilises computational interpretations in a quite different way: as tools for ‘unwinding’ non-constructive proofs and producing concrete realizers for existential statements. As such, there is increasing interest in understanding the finer details of traditional proof-theoretic techniques such as the epsilon calculus, and in particular in understanding how programs extracted using different methods compare to one another. This is a difficult task: because formal interpretations work on the logical structure of proofs, the process of actually extracting programs from anything but the simplest proof is hugely complex, and the core algorithm implemented by such a program is typically obscured beneath a heavy layer of syntax, with the result that fundamental similarities in the qualitative behaviour of programs extracted using separate methods are not easy to appreciate. The aim of my presentation will be to discuss how programs extracted from classical proofs with the epsilon calculus can be related to those extracted via the more widely used family of proof interpretations; here I will focus in particular on Gödel’s Dialectica interpretation and modified realizability. At first glance the epsilon calculus appears radically different from proof interpretations in its approach to realizing existential formulas: the former uses an iterative updating procedure which attempts to solve critical formulas arising in a translated proof, while the latter extract explicit realizing terms in some calculus of higher-type functionals. In this talk I will explore how, despite their differences, these techniques can be linked on a deep level, in that they can all yield algorithms which carry out some form of ‘learning’.
Norbert Preining (JAIST)
Proof theory of projective geometry -- Orevkov's speed up result in strange surroundings
This talk will introduce projective geometry and a proof-theoretic approach to it, in particular a Gentzen-style calculus for projective geometry. One of the most important tools in the practical proving of theorems in projective geometry is the use of sketches. We present a formalization of sketches and show the equivalence between the full Gentzen system and sketches. Finally, we discuss Orevkov's result on the non-elementary speed-up between cut-free proofs and proofs using cut, and how his ideas and proof can be used to show that under certain circumstances sketches are non-elementarily slower than proofs using cuts.
Ben Ralph (Bath)
A natural cut elimination procedure for classical first-order logic
Deep inference has long afforded proof theorists new and powerful techniques for cut elimination in a wide variety of logics. However, these methods can often appear obscure and unintuitive, especially to those unfamiliar with the subject. In this talk, a new method of cut elimination for first-order predicate logic in a deep inference system will be presented, showing how natural and familiar logical concepts, in this case Herbrand’s Theorem, can be incorporated into a cut elimination procedure in a way not possible in Gentzen-style proof theory.
Simona Ronchi Della Rocca (Torino)
Intersection types and implicit computational complexity
Intersection types in which idempotence and/or associativity is dropped can supply quantitative information on the operational behavior of terms. We explore their use in the Implicit Computational Complexity setting, and we supply a type system characterizing PTIME and a type system sound for elementary time.
Luca Roversi (Torino)
A class of recursive reversible functions (joint work with Luca Paolini and Mauro Piccolo)
Reversible Computation originates in the search for an answer to a fundamental question about the Second Law of Thermodynamics. Reversible Computation can be modeled at various levels of abstraction and in different styles. The talk will address the problems inherent in answering a question like: "Is there any reversible analogue of the Primitive Recursive Functions?"
Marco Volpe (INRIA & LIX)
Focused proof systems for modal logic (joint work with Dale Miller)
Focused proofs are sequent calculus proofs that group inference rules into alternating positive and negative phases. These phases can then be used to define macro-level inference rules from Gentzen's original, tiny introduction and structural rules. We show here that the inference rules of labeled proof systems for modal logics can similarly be described as pairs of such positive and negative phases within the LKF focused proof system for first-order classical logic. In particular, we consider the system G3K of Negri for the modal logic K, define a translation from labeled modal formulas into first-order polarized formulas, and show a strict correspondence between derivations in the two systems: each rule application in G3K corresponds to a bipole, i.e. a pair of a positive and a negative phase in LKF. Since geometric axioms (when properly polarized) induce bipoles, this strong correspondence holds for all modal logics whose Kripke frames are characterized by geometric properties. We extend these results to present a focused labeled proof system for this same class of modal logics and show its soundness and completeness. The resulting proof system allows one to define a rich set of normal forms of modal logic proofs.