This page concerns the Lectures on Logic and its Mathematical Aspects (LLAMA) seminar series at the Institute for Logic, Language and Computation (ILLC) of the University of Amsterdam.
I will give an overview of Dafny, a verification-aware programming language that is currently maintained by my colleagues and me at Amazon Web Services. Dafny is a mature language with support for common programming concepts such as inductive datatypes, lambda expressions, and classes with mutable state. It can be compiled to C#, Java, JavaScript, Python, and Go, making integration with existing workflows possible. At the same time, Dafny offers an extensive toolbox for mathematical proofs about programs, including quantifiers, (co-)induction, and the ability to use and prove lemmas. Programs can be annotated in the style of Hoare logic, with pre- and postconditions, loop invariants, and termination and read/write framing specifications. Proofs are verified automatically, through compilation to the Boogie intermediate language, which uses the Z3 SMT solver to discharge proof obligations. After an initial introduction, I'll give a live demo of Dafny. I'll review some of its publicly known use cases at Amazon, and highlight its potential for research efforts at the ILLC.
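For illustration, the Hoare-logic annotation style mentioned above can be loosely mimicked in Python with runtime assertions (a toy sketch, not Dafny syntax; Dafny checks such contracts statically, and the example below is not from the talk):

```python
def max_index(xs):
    """Return an index of a maximum element.

    Informal Dafny-style contract:
      requires |xs| > 0
      ensures  0 <= result < |xs|
      ensures  forall k :: 0 <= k < |xs| ==> xs[k] <= xs[result]
    """
    assert len(xs) > 0, "precondition: non-empty input"
    best = 0
    for i in range(1, len(xs)):
        # loop invariant: xs[best] is maximal among xs[0..i]
        if xs[i] > xs[best]:
            best = i
    # postcondition checks (Dafny would discharge these at compile time)
    assert 0 <= best < len(xs)
    assert all(xs[k] <= xs[best] for k in range(len(xs)))
    return best

print(max_index([3, 1, 4, 1, 5]))
```

The crucial difference is that Dafny proves the `ensures` clauses once and for all via the SMT solver, while the Python sketch only checks them on each run.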
Conditional Boolean algebras form a variety that provides algebraic semantics for Chellas’s basic conditional logic from [5]. Conditional logics trace their roots to the works of Lewis [6, 7] and Stalnaker [9] and have been used to model a wide range of conditional statements, including counterfactuals and conditional obligations. While most conditional logics studied in the literature are extensions of classical logic, recent research has also explored conditional logics that extend intuitionistic logic.
Any Boolean algebra can be turned into a conditional algebra by expanding it to a structure \(\mathfrak{B} = ⟨B, \cdot, +, \rightarrow, -, 0, 1⟩\) where \(\rightarrow\) is a binary operator that satisfies for all \(a, b, c \in B\) that (C1) \(a \rightarrow 1 = 1\) and (C2) \((a \rightarrow b) \cdot (a \rightarrow c) = a \rightarrow (b \cdot c)\). In this presentation, we focus on those conditional algebras that additionally satisfy (C3) \((a+b) \rightarrow c \leq (a \rightarrow c) \cdot (b \rightarrow c)\). The variety of conditional algebras with (C3) provides algebraic semantics for an extension of Chellas’s conditional logic.
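As a concrete sanity check (an editorial toy example, not from the talk), one can verify by brute force that material implication \(a \rightarrow b := -a + b\) on the four-element Boolean algebra of subsets of a two-element set satisfies (C1)–(C3); here \(\leq\) is set inclusion:

```python
from itertools import product

U = frozenset({0, 1})
B = [frozenset(), frozenset({0}), frozenset({1}), U]

def meet(a, b): return a & b
def join(a, b): return a | b
def neg(a):     return U - a
def imp(a, b):  return neg(a) | b   # candidate conditional: -a + b

for a, b, c in product(B, repeat=3):
    assert imp(a, U) == U                                    # (C1)
    assert meet(imp(a, b), imp(a, c)) == imp(a, meet(b, c))  # (C2)
    assert imp(join(a, b), c) <= meet(imp(a, c), imp(b, c))  # (C3)
print("(C1)-(C3) hold for material implication on the 4-element algebra")
```

In fact, material implication satisfies (C3) with equality, which is the stronger condition (2) of the pseudo-subordination algebras discussed below.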
This variety contains two important subvarieties. On one hand, we have pseudo-subordination algebras [4], which are conditional algebras \(\mathfrak{B} = ⟨B, \cdot, +, \rightarrow, -, 0, 1⟩\) that satisfy (1) \(0 \rightarrow a = 1\) and (2) \((a + b) \rightarrow c = (a \rightarrow c) \cdot (b \rightarrow c)\) for all \(a, b, c \in B\). The variety of pseudo-subordination algebras is term-equivalent to the variety of Boolean algebras with a binary modal operator. The connection between pseudo-subordination algebras and subordination algebras has been established in [4] and [2].
On the other hand, we have the subvariety of Boolean weak Heyting algebras. Recall that a weak Heyting algebra is an algebra \(⟨B, \cdot, +, \rightarrow, 0, 1⟩\) such that its reduct \(⟨B, \cdot, +, 0, 1⟩\) is a bounded distributive lattice, and the binary arrow operator satisfies the following equations: (1) \(a \rightarrow a = 1\), (2) \((a \rightarrow b) \cdot (a \rightarrow c) = a \rightarrow (b \cdot c)\), (3) \((a + b) \rightarrow c = (a \rightarrow c) \cdot (b \rightarrow c)\), (4) \((a \rightarrow b) \cdot (b \rightarrow c) \leq a \rightarrow c\). The class of weak Heyting algebras was defined in [3]; it corresponds to the strict implication fragment of the normal modal logic K, which is also known as the subintuitionistic local consequence of the class of all Kripke models. A Boolean weak Heyting algebra is an algebra \(⟨B, \cdot, +, -, \rightarrow, 0, 1⟩\) such that \(⟨B, \cdot, +, \rightarrow, 0, 1⟩\) is a weak Heyting algebra and \(⟨B, \cdot, +, -, 0, 1⟩\) is a Boolean algebra.
The core concept of conditionality is closely related to the notion of necessity. Specifically, conditionals can be seen as expressions of relative necessity: "if A, then B" means that the proposition expressed by B is, in some sense, necessary with respect to the one expressed by A. From an algebraic perspective, each element \(a \in B\) of the Boolean algebra gives rise to a normal necessity operator \([a]\), defined as \([a](b) := a \rightarrow b\). This yields a family of modal operators indexed by the elements of the algebra (see [8, 10, 5]). It is known that modal operators on B are in one-to-one correspondence with modally definable subordinations on B, as discussed in [2]. In the same way, it can be proved that conditional algebras satisfying condition (C3) are in one-to-one correspondence with a subclass of the weak extended contact algebras introduced in [1], which in turn generalize extended contact algebras and can be seen as Boolean algebras endowed with a family of subordinations.
Our aim is to discuss representation and duality theorems for conditional algebras and use them to study significant subvarieties within this framework. We will also make comparisons with existing duality theorems in the literature for precontact algebras. Additionally, we will use the duality to establish correspondence and canonicity results.
This is joint work with Sergio Celani and Rafał Gruszczyński.
A map \(f : M \to N\) between two structures \(M\) and \(N\) is said to be a homomorphism when for every atomic formula \(\phi(x_1, \ldots , x_n)\) and \(a_1, \ldots , a_n \in M\), \(M \models \phi(a_1, \ldots , a_n)\) implies \(N \models \phi(f(a_1), \ldots , f(a_n))\). Positive model theory is the part of model theory that deals with the formulas that are preserved by homomorphisms (see, e.g., [9, 10, 11, 14, 15, 16]). It is well known that these coincide with the positive formulas, i.e., the formulas built from atomic formulas and \(\bot\) using only \(\exists\), \(\wedge\), and \(\vee\).
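To make the definition concrete (a toy example of mine, not from the talk), take structures in a language with one binary relation symbol E; a homomorphism need only preserve the atomic facts, and preservation of equality is automatic for functions:

```python
# Two structures for one binary relation E.
M_E = {(0, 1), (1, 2)}            # a directed path on {0, 1, 2}
N_E = {("a", "b"), ("b", "a")}    # a directed 2-cycle on {"a", "b"}

def is_homomorphism(f, M_edges, N_edges):
    # f preserves the atomic formula E(x, y)
    return all((f[x], f[y]) in N_edges for (x, y) in M_edges)

f = {0: "a", 1: "b", 2: "a"}      # fold the path onto the cycle
assert is_homomorphism(f, M_E, N_E)

# Positive formulas are preserved along f: the positive formula
# ∃y E(x, y) holds of 0 in M, hence of f(0) = "a" in N.
# Negations need not be preserved in general.
```

The constant map sending everything to `"a"`, by contrast, is not a homomorphism here, since `("a", "a")` is not an edge of N.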
The Keisler–Shelah Isomorphism Theorem states that two structures are elementarily equivalent if and only if they have isomorphic ultrapowers. This celebrated result was first proved by Keisler under the generalized continuum hypothesis (GCH) [4, Thm. 2.4]. This assumption was later shown to be redundant by Shelah [12, p. 244]. The aim of this talk is to prove a version of Keisler’s original theorem in the context of positive model theory.
To this end, we say that two structures are positively equivalent when they have the same positive theory. Then we introduce a generalization of the ultraproduct construction that captures positive equivalence. We term this construction a prime product because it is obtained by replacing the index set I of an ultraproduct by a poset \(\mathbb{X}\) and the ultrafilter over I by a prime filter of the bounded distributive lattice of upsets of the poset \(\mathbb{X}\). The case of traditional ultraproducts is then recovered by requiring the order of \(\mathbb{X}\) to be the identity relation.
Prime products and positive formulas are connected by a positive incarnation of Łoś's Theorem. As a consequence, prime products preserve not only positive formulas, but also the universal closures of the implications between them, known as basic h-inductive sentences [11]. This allows us to describe the classes of models of h-inductive theories as those closed under isomorphisms, prime products, and ultraroots.
Our main result states that under GCH two structures have the same positive theory if and only if they have isomorphic prime powers of ultrapowers. The same result holds without GCH, provided that prime powers are replaced by prime products in the statement. Notably, the presence of ultrapowers cannot be removed from these theorems, as there exist positively equivalent structures without isomorphic prime powers. The results of this talk have been collected in the paper [8].
This is joint work with J.J. Wannenburg and K. Yamamoto.
We shall present an overview of recent results on the minimal undecidable fragments of modal and superintuitionistic predicate logics. In particular, we shall prove that every sublogic of QKTB, QGL, and QGrz, as well as every sublogic of QKC, is undecidable in languages with a single predicate letter and two individual variables. We shall also prove analogous results for predicate logics determined by the Kripke frames with finitely many possible worlds validating QKTB, QGL, QGrz, and QKC. (Here, QL denotes the minimal predicate extension of a propositional logic L.)
This is joint work with Mikhail Rybakov.
In this talk I will introduce a new notion of the degree of the finite model property (the degree of FMP) for intermediate and modal logics. This notion resembles that of the degree of incompleteness of modal logics introduced by Fine (1974). I will show that any countable cardinal, as well as the continuum, can be realized as the degree of FMP for some intermediate and transitive modal logic.
This is joint work with Guram Bezhanishvili and Tommaso Moraschini.
Provability logic is a modal logic in the usual propositional modal language with unary modal operator \(\Box\), in which \(\Box A\) is interpreted as provability of \(A\) in a fixed first-order mathematical theory. An early celebrated result, due to Robert Solovay (1976), classifies the Gödel–Löb logic \(\sf GL\) as the provability logic of all sufficiently strong mathematical first-order theories.
The natural question of characterizing the provability logic of Heyting Arithmetic \(\sf HA\) was considered by Albert Visser since 1980 and has remained a major open problem since then.
In this talk we go through a complete characterization and a decidability proof for the provability logic of \(\sf HA\). As one might expect, some new subjects show up, namely unification, projectivity, and admissible rules. Also, a Kripke-style semantics, which appears to be new, is provided for the purpose of arithmetical completeness. In this semantics, possible worlds are augmented with derivability in a simple modal system.
There is a lively debate in the current literature in epistemology on which type of ignorance may provide a moral excuse. A good candidate is the case in which an agent has never thought about, or never considered as true, a proposition p. From a logical perspective, no formal analysis has been provided for ignorance as an excuse. We fill this gap by providing an original logical setting for modelling this type of ignorance. In particular, we introduce a sound and complete logic in which ignorance is expressed as a primitive modality. Semantically, the logic is characterized by Kripke semantics with possibly incomplete worlds, or, equivalently, Kripke models equipped with a three-valued valuation function. Moreover, to consider the conditions of a possible change of an agent’s ignorance, we will extend the setting to public announcement logic. First, we show that standard update procedures are not suitable for an adequate representation of scenarios involving excusable ignorance. Secondly, we define an original update procedure, which modifies the initial models by creating new worlds and accessibilities, instead of eliminating them. We exemplify the fruitfulness of this procedure via an intuitive example.
Orthologic was introduced by Goldblatt (1974) as a formalism intended to capture some of the logical properties of ortholattices — algebras satisfying the same laws as Boolean algebras, except possibly distributivity. In the same paper, the author introduced a Kripke semantics for orthologic, and a sound and faithful translation of orthologic into the modal logic system KTB, having some striking similarities to the Gödel–McKinsey–Tarski (GMT) translation between intuitionistic logic and the modal logic system S4. Later work by Miyazaki further highlighted these connections by developing a theory of "modal companions", connecting extensions of orthologic with normal extensions of KTB.
In this talk I will show that despite these similarities, the Goldblatt translation does not enjoy most of the nice properties of the GMT translation. Using elementary observations, I show that there can be no isomorphism between the lattices of extensions. Using the duality for ortholattices introduced by Goldblatt (1974) and Bimbo (2004), I also show that a classical theory of modal companions cannot be developed. In the end I will remark on the role of implication in these results, and on some further developments pursued in this direction. I also comment on the consequences of these results for a broader understanding of translations into modal logic. This talk reports on work done in my master's thesis, supervised by N. Bezhanishvili and T. Moraschini.
Non-normal modal logics are so called because they do not validate some axioms or rules of the basic normal modal logic K. They have been studied since the seminal works by C.I. Lewis, Lemmon, Kripke, Segerberg, Scott, and Montague, and are applied in many areas, such as deontic, epistemic, multi-agent, and probabilistic reasoning. In this talk, we present an original semantics and a sequent calculus for non-normal modal logics. We first analyse the properties of the calculus; then we show how semantical properties, like the complexity of the satisfiability problem and the bounded model property, can be derived from the properties of the calculus. In the last part of the talk, we apply these results to some specific systems. This talk is based on joint works with Charles Grellois, Nicola Olivetti, Björn Lellmann, Sara Negri, Elaine Pimentel and Gian Luca Pozzato.
We analyze modal definability and axiomatizability of modal reasoning in commutative groups and vector spaces, showing where standard modal notions apply and where the fit is less ideal. Our positive results include several completeness theorems using techniques from hybrid logic. In the process, we also identify broader connections with complex algebras, substructural logics, and logics of (in-)dependence.
This talk is based on a recently published article.
Turing introduced progressions of theories obtained by iterating the process of extension of a theory by its consistency assertion. Generalized Turing progressions can be used to characterize the theorems of a given arithmetical theory of quantifier complexity level \(\Pi^0_n\), for any specific \(n\). Such characterizations reveal a lot of information about a theory, in particular, yield consistency proofs, bounds on provable transfinite induction and provably recursive functions.
The conservativity spectrum of an arithmetical theory is a sequence of ordinals characterizing its theorems of quantifier complexity levels \(\Pi_1\), \(\Pi_2\), etc., by iterated reflection principles. We describe the class of all such sequences and show that it bears a natural structure of a Kripke frame similar to the one for the variable-free fragment of the polymodal logic GLP introduced by K. Ignatiev. Furthermore, this structure can also be considered as a natural algebraic model of a strictly positive modal logic, the reflection calculus with conservativity modalities.
Dependence logic was introduced by Väänänen (2007) as a novel formalism for reasoning about dependence and independence relations. The logic adopts the team semantics of Hodges (1997). The basic idea of team semantics is that dependency properties can only manifest themselves in multitudes, and thus formulas of dependence logic are evaluated on sets of assignments (called teams) instead of single assignments as in the usual Tarskian semantics.
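To illustrate why multitudes matter (a toy example of mine, not from the talk), the dependence atom =(x, y) holds in a team exactly when the value of x functionally determines the value of y across the whole team; no single assignment can witness or refute this:

```python
def dependence_atom(team, x, y):
    """Does the value of x determine the value of y across the team?"""
    seen = {}
    for s in team:                       # each s is one assignment
        if s[x] in seen and seen[s[x]] != s[y]:
            return False                 # same x-value, different y-values
        seen[s[x]] = s[y]
    return True

team = [{"x": 0, "y": 1}, {"x": 0, "y": 1}, {"x": 1, "y": 2}]
assert dependence_atom(team, "x", "y")       # y is a function of x here

team.append({"x": 1, "y": 3})                # same x, different y
assert not dependence_atom(team, "x", "y")   # dependence is destroyed
```

Note that every singleton team trivially satisfies the atom, which is exactly the sense in which dependence is invisible to single-assignment semantics.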
Teams are essentially relations, which are second-order objects. Dependence logic is known to be equivalent to existential second-order logic, and thus cannot be effectively axiomatized in full. In this talk, I will survey some recent developments in finding partial axiomatizations for dependence logic. In particular, I will also discuss the role of “negation” in this project.
Hodges famously said "model theory = algebraic geometry minus fields". However, the current practice of model theory, to a large extent, is still "set-based" and lacks a functorial approach; the latter is essential to modern algebraic geometry, since it provides very powerful geometric tools for studying algebraic varieties. In this talk, I will discuss how to adapt a functorial language to model theory, give several examples to indicate why a categorical approach conceptualises many classical constructions and theorems in model theory, and show how such a language leads to a duality result for first-order logic, with some applications included. The content of this talk is largely expository, with few genuinely new results, except possibly for some partial observations and a coherent categorical narrative. However, I see this as an invitation to the development of a functorial model theory.
Uniform interpolation is a strengthening of interpolation in which the interpolant depends only on the premise or only on the conclusion of the implication. There are well-known logics with uniform interpolation, such as K and IPC. But interpolation does not imply this stronger property in every logic: the modal logics K4 and S4 are examples of this.
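A classical warm-up makes the "depends only on the premise" idea tangible (an editorial illustration, not from the talk, and much easier than the modal case): in classical propositional logic the uniform interpolant for \(\exists p\, \varphi\) is simply \(\varphi[p := \bot] \vee \varphi[p := \top]\), which can be checked by truth tables:

```python
def phi(p, q):
    # sample formula: p ∧ q
    return p and q

def exists_p_phi(q):
    # candidate uniform interpolant for ∃p (p ∧ q): φ[p:=⊥] ∨ φ[p:=⊤]
    return phi(False, q) or phi(True, q)

for q in (False, True):
    # ∃p (p ∧ q) should be equivalent to q, and indeed it is
    assert exists_p_phi(q) == q
print("∃p (p ∧ q) ≡ q, as computed by the substitution recipe")
```

The interpolant was built from \(\varphi\) alone, with no reference to any intended conclusion; the hard part, addressed in the talk, is doing this proof-theoretically in logics where no such simple substitution recipe exists.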
It is not quite clear, technically or conceptually, why some logics with interpolation have the stronger property while others that seem quite similar do not. Especially in proof theory there is the phenomenon of logics for which there is a perfectly constructive proof-theoretic proof of interpolation that does not extend to a proof of uniform interpolation. In this talk it will be shown how, for reductive sequent calculi, a proof-theoretic proof of interpolation does imply that the corresponding logic has uniform interpolation. Here a calculus is reductive if proof search in the calculus is terminating and the calculus has some other desirable properties.
The motivation behind this research in universal proof theory is the search for methods to establish which logics do and which do not have good sequent calculi, meaning calculi with good structural properties. In the talk the connection between the technical part and this general aim will be explained.
It is customary to think that descriptions of the material reality by different sciences form "layers of abstraction": chemistry is more high-level than physics, biology more high-level than chemistry and so on. We introduce layered props: a diagrammatic framework that is able to handle situations with multiple levels of description, as well as translations between them. To motivate the formalism, we express two examples of scientific modelling as layered props: electrical circuits and retrosynthetic analysis in chemistry. We conclude by showing that the syntactic description of a layered prop can be interpreted naturally as a certain subcategory of pointed profunctors.
This is joint work (in progress) with Fabio Zanasi (UCL) and Ella Gale (Bristol).
The Dutch Association for Logic (VvL) will hold an annual in-person joint seminar organized by the University of Groningen. The event is inspired by the departmental logic seminars that are organized at each university, and aims to bring the universities together for a collaborative seminar. Besides hosting a main speaker, the seminar will also be the location of the award ceremony for the VvL MSc Thesis Prize winners, who will give a short presentation of their theses.
The Goldblatt-Thomason theorem provides a characterization of modally definable classes of Kripke frames. There is a version of this theorem for topological spaces with respect to the topological semantics interpreting the modal diamond as the closure operator. In this talk, we investigate the definability problems for topological spaces with respect to the topological derived-set operator. This is joint work with N. Bezhanishvili and D. Fernandez Duque.
Starting in the thirties, Bruno de Finetti laid the groundwork for the development of subjective (also called Bayesian) probability in the form of a betting game: in his setting, the probability of an event is the amount that a rational agent is willing to bet on it. De Finetti also proved that this point of view on probability is consistent with the widely used axiomatic approach. Indeed, it is possible to define a notion of coherence for the choices of bets on events and to prove that a book is coherent if, and only if, it can be extended to a probability measure on the Boolean algebra generated by the events. Coherence is therefore a bridge between subjective and objective, axiomatic, probability.
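The betting-game reading of incoherence can be computed directly (a toy Dutch-book example with made-up numbers, not from the talk): if an agent's prices for unit bets on an event and its complement sum to more than 1, an opponent can sell the agent both bets and guarantee the agent a loss in every outcome:

```python
from fractions import Fraction as F

# Agent's posted prices for unit bets on A and on its complement.
# Incoherent book: the prices sum to 6/5 > 1.
prices = {"A": F(7, 10), "not_A": F(1, 2)}
total_paid = sum(prices.values())            # agent pays 6/5 up front

# Exactly one of A, not-A occurs, so exactly one bet pays out 1.
for outcome in prices:
    agent_gain = 1 - total_paid
    assert agent_gain == F(-1, 5)            # sure loss in every outcome
print("sure loss:", 1 - total_paid)
```

Had the prices summed to exactly 1 (and each lain in [0, 1]), no such sure-loss strategy would exist, in line with the extendability-to-a-probability-measure characterization of coherence.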
Another point of view on probability theory is the so-called frequentist approach. In this case, the probability of an event is defined by the frequency of that event based on previous observations. In this setting, it is common to define a statistical model that fits the observed data, and to derive the properties of the hidden probability distribution in this way. Perhaps surprisingly, subjective probability is related to frequentist probability again via de Finetti’s work, and in particular via the notion of exchangeability that, loosely speaking, shows how statistical models appear in a Bayesian framework, and how probabilities can come from statistics.
In this talk, we discuss the results of [1, 2]. In particular, we discuss the notions of coherence and exchangeability, and give a version of de Finetti's celebrated theorems in a peculiar infinitary logic that expands the well-known Łukasiewicz logic.
The “happened-before” relation is an order relation over events playing a special role in formal verification. Pomset logic handles this relation logically, that is, as a proper connective of the logic. However, the proof-theoretical tools developed for this logic are limited to series-parallel orders.
In this talk I will present proof systems operating on graphs instead of formulas to overcome the restrictions coming from the use of in-line formulas. To this aim, I will recall some notions from graph theory and explain the proof-theoretic methodologies required to prove analyticity and conservativity in this formalism.
Uniform interpolation is a property of logics that is stronger than the well-known Craig interpolation property, in the sense that a uniform interpolant only depends on the antecedent or the conclusion of the implication. It has close connections to so-called bisimulation quantification in Kripke models. Proof-theoretic methods proving uniform interpolation have been successfully applied to logics described by terminating sequent calculi. In this talk we turn to nested sequents. We reprove the uniform interpolation property for normal modal logics K, D, and T via terminating nested sequents. While our method of defining the interpolants is proof-theoretic, we use semantic notions including bisimulation modulo propositional variables to prove their correctness. We are also able to adjust our method to provide a direct proof of uniform interpolation for S5 using hypersequents.
This is joint work with Raheleh Jalali and Roman Kuznets.
A surprising result of Pitts (1992) says that propositional quantifiers are definable internally in intuitionistic propositional logic (IPC) - this result is commonly known as the uniform interpolation theorem for IPC. We recently completed a formalization of Pitts' result in the Coq proof assistant, and thus a verified implementation of the propositional quantifiers. Our Coq formalization in addition allowed us to extract an OCaml program that computes propositional formulas that realize intuitionistic versions of ∃p φ and ∀p φ from p and φ, and runs quickly on examples.
In this talk I will give some background for the result, and I will make some points about its proof that we clarified during its implementation, which also give some new insights about the "pen-and-paper" proof. In our implementation, as in Pitts' original proof, the propositional quantifiers are defined by a mutual induction based on Dyckhoff's proof calculus LJT. The correctness of the construction is then proved by an induction on the size of an LJT-proof. The techniques used and lessons learned from our implementation may also be of more general interest for verified implementations of other proof-theoretic results.
This is joint work with Hugo Férée (IRIF, Université Paris Cité).
The states of a deterministic finite automaton can be identified with collections of words, each state corresponding to the words arriving at that state when computed by the automaton. But words can be ordered, and among the many possible orders a very natural one is the co-lexicographic order, because it suggests a transfer of the order from words to the automaton’s states. This suggestion is, in fact, concrete: in a number of papers, automata admitting a total ordering of states coherent with the co-lex ordering of the sets of words reaching them have been proposed, studied, and implemented. Such a class of ordered automata — the Wheeler automata — turned out to be efficiently storable and searchable; moreover, the set of words reaching each of their states has been proved to be an interval of the co-lexicographic order, implying some nice properties of the languages accepted, of determinization, and the like.
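The co-lexicographic order simply compares words from the right, i.e. lexicographically on their reversals; a one-line Python sketch (an editorial toy example, not from the talk) makes the grouping-by-suffix effect visible:

```python
def colex_key(w):
    # co-lexicographic order = lexicographic order on reversed words
    return w[::-1]

words = ["ba", "ab", "aa", "b", "a"]
print(sorted(words, key=colex_key))
```

Words ending in the same letters end up adjacent in this order, which is exactly what lets the words reaching a given automaton state form an interval in the Wheeler setting.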
Unfortunately, not all automata can be totally ordered as outlined above. However, automata can always be partially ordered using the co-lexicographic order over words, and an intrinsic measure of their complexity can be defined and effectively determined as the minimum width of one of their admissible partial orders. We will show that this new concept of the width of an automaton has useful consequences.
This talk is an overview of several works done in collaboration with
Transitive Closure Logic (TCL) enriches the language of first-order logic with a recursive operator expressing the transitive closure of a binary relation. Cyclic sequent calculus proofs for this logic have been introduced in the literature. In this talk, I will present a new cyclic proof system for TCL, which employs hypersequents. The main motivation behind this work is that, unlike sequent TCL proofs, hypersequent proofs for TCL successfully simulate cyclic proofs in the transitive fragment of Propositional Dynamic Logic (PDL+), without using the non-analytic rule of cut.
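The operator TCL adds is the familiar transitive closure of a binary relation; a least-fixpoint computation (an editorial toy sketch, not the talk's proof theory) shows the semantics the logic internalizes:

```python
def transitive_closure(R):
    """Least fixpoint: close the relation R under composition."""
    tc = set(R)
    while True:
        new = {(a, d) for (a, b) in tc for (c, d) in tc if b == c}
        if new <= tc:
            return tc
        tc |= new

R = {(1, 2), (2, 3), (3, 4)}
assert transitive_closure(R) == {
    (1, 2), (2, 3), (3, 4), (1, 3), (2, 4), (1, 4)
}
```

The fixpoint character of this definition is what forces the move from ordinary well-founded proofs to cyclic ones: a proof may have to "unfold" the closure an unbounded number of times.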
After introducing the cyclic hypersequent calculus for TCL, I will present two main results. First, I will sketch a proof of soundness for the calculus, which requires a more involved argument than the one used in traditional cyclic proofs. Second, I will show how PDL+ cyclic proofs can be translated into TCL hypersequent proofs. This result allows us to lift the completeness result for cyclic PDL+ to the hypersequent calculus.
This talk is based on joint work with Anupam Das.
Cyclic proof systems permit derivations that are finite graphs rather than conventional derivation trees. The soundness of such proofs is ensured by a condition on the infinite paths through the derivation graph, known as the global trace condition. The usual proof strategy for showing soundness of a cyclic proof system is highly classical. If the cyclic proof system is that of an intuitionistic theory, such as Heyting arithmetic, a constructive soundness proof would be preferable.
We present ongoing work on a procedure which translates proofs in a cyclic proof system of Heyting arithmetic to proofs in a standard proof system of the same. We believe that this procedure is not only effective but also that its correctness can be verified constructively. If this is the case, combining it with the standard soundness proof for Heyting arithmetic would yield a constructive soundness proof for the cyclic proof system of Heyting arithmetic. Much of the procedure deals with combinatorial aspects of cyclic proofs, rather than Heyting arithmetic specifically, and should thus be transferable to other theories to obtain analogous results.
One of the foundational results of (classical) finite model theory, due to Ronald Fagin, shows that every formula of a finite relational language is either almost surely true or almost surely false over finite structures. This has been called a 0-1 law. This talk intends to contribute to the finite model theory of many-valued logics by generalizing this 0-1 law to first-order finite models valued over finite MTL-chains. We will prove that for any first-order (or infinitary with finitely many variables) formula \(\Phi\), there is a truth-value that \(\Phi\) takes almost surely in every finite many-valued model and such that every other truth-value is almost surely not taken. This result also generalizes a theorem for Łukasiewicz logic due to Robert Kosik and Christian G. Fermüller. (This talk is based on joint work with Guillermo Badia.)
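The flavour of almost-sure truth can be seen in a simple counting exercise (my own warm-up illustration of convergence of probabilities, not an instance of Fagin's theorem, which concerns relational structures): over uniformly random binary words, the property "some two consecutive positions both carry a 1" is almost surely true, because the number of words avoiding the pattern 11 grows only like a Fibonacci number:

```python
def avoid_11_count(n):
    """Number of binary words of length n with no two consecutive 1s."""
    if n == 0:
        return 1
    a, b = 1, 2                      # counts for lengths 0 and 1
    for _ in range(n - 1):
        a, b = b, a + b              # Fibonacci-style recurrence
    return b

for n in (10, 20, 40):
    print(n, avoid_11_count(n) / 2**n)   # fraction of words avoiding 11

# The fraction of exceptions vanishes, so the property holds almost surely.
assert avoid_11_count(40) / 2**40 < 0.01
```

In the many-valued setting of the talk, the analogous statement is that each formula concentrates on a single truth-value of the finite MTL-chain.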
Stone’s representation theorem for Boolean algebras gives a bridge between algebra and topology in the form of a categorical duality. In his PhD thesis, de Vries generalized this duality to a duality between compact Hausdorff spaces and what are nowadays called de Vries algebras, which are structures that encode the set of regular open sets of a given compact Hausdorff space. One drawback of the category of de Vries algebras is that the composition of morphisms is not function composition. We propose to work with relations (rather than functions) as morphisms between de Vries algebras: this has the advantage that the composition of morphisms is usual relation composition. Moreover, this approach allows for an extension of de Vries duality to a duality for the category of compact Hausdorff spaces and closed relations between them.
We consider generalisations of Kripke's semantics for intuitionistic logic adequate for a range of constructive, contraction-free logics. These are obtained partly through algebraic insights gleaned from Jipsen and Montagna's work on poset products, as well as more recent work by others, including e.g. Castiglioni and Zuluaga. We compare these approaches to other so-called "relational" approaches from the literature, and consider some problems left open in this line of research.
We introduce a cut-free sequent calculus for the alternation-free fragment of the modal mu-calculus. This system allows both for infinite and for finite, circular proofs and uses a simple focus mechanism to control the unravelling of fixpoints along infinite branches. We show that the proof system is sound and complete for the set of valid guarded formulas of the alternation-free mu-calculus. As an application we use the system to show that the alternation-free mu-calculus is closed under Craig interpolation.
(joint work with Yde Venema)
In this talk, I discuss Kleene algebra as a framework for program verification, and in particular focus on a concurrent extension of NetKAT (CNetKAT). NetKAT is an algebraic theory used to describe network behaviour, and CNetKAT extends it with operators for specifying and reasoning about concurrency in scenarios where multiple packets interact through state. More generally, CNetKAT can be understood as an algebraic framework for reasoning about programs with both local state (in packets) and global state (in a global store). The talk will cover some of the syntax and semantics of CNetKAT through an example, and present a soundness and completeness result. The remaining time will be spent on highlighting some of the issues that were encountered in creating CNetKAT, and in particular the general issues surrounding extensions of KA with concurrency.
In this talk, I will try to present several results concerning the monadic second-order (MSO) theory of (labelled) countable linear orders. This theory is known to be decidable since the results of Rabin (1969) and Shelah (1975). After setting up the more recent algebraic framework of countable monoids, I will give a high-level intuition of how it can be used to answer several questions:
Propositional provability logics capture, in a sense, the always-provable or always-true principles concerning the formalised provability predicate. Since Solovay's work we know that all sound theories of some minimal strength have the same logic: the PSPACE-complete logic GL (Gödel-Löb). The hope arose that something similar could be obtained for predicate provability logics. Through beautiful results by Montagna, Artemov, Vardanyan and McGee, among others, it turned out that the situation is as ghastly as it could possibly be: the resulting logics are highly undecidable or even non-arithmetical. Furthermore, various results by Vardanyan, Berarducci and Visser-de Jonge have shown that, in a sense, this wild behaviour is pretty stable, as the influential paper “No escape from Vardanyan’s Theorem” indeed suggests.
Recently we wrote a paper “An escape from Vardanyan’s Theorem” where we carve out a decidable fragment of quantified provability logic that is sound and complete in a certain sense. The key move is switching to so-called strictly positive fragments. In these fragments, we consider statements of the form “A entails B” where A and B allow only for predicate symbols, the logical constant true, conjunctions, universal quantifications and consistency statements. It is a known phenomenon that strictly positive fragments typically lower the complexity of a logic, often from PSPACE to PTIME. In our case we drop from Π₂-complete all the way down to decidable. We will present the results and comment on some proof techniques. Near the end of the talk we will comment on work in progress where, instead of one consistency statement, we wish to allow for an arbitrary sequence of consistency statements of increasing strength.
This is joint work with Ana de Almeida Borges: https://arxiv.org/abs/2102.13091
Composing monads via distributive laws is tricky business, as too often small mistakes are overlooked. After failing to find a distributive law for the list monad over itself, I started proving that finding such a distributive law is impossible. At the same time, Dan Marsden was studying the famous counterexample by Gordon Plotkin that probability does not distribute over non-determinism. Together we developed an algebraic method to find and generalise such counterexamples, resulting in our no-go theorems. In this talk I will explain the main ideas behind our method, illustrated by a proof that 'plus does not distribute over times'. Then, I will highlight some crucial steps in our method, which tell us which types of monads are at "high risk" of failing to compose with other monads. Lastly (time permitting), I will say a few words about my current research on combining monads with guarded recursion.
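The arithmetic slogan 'plus does not distribute over times' can be checked directly at the level of plain numbers (the talk's result lives at the monad level, but the arithmetic counterexample is the same shape):

```python
# Times distributes over plus: a*(b+c) == a*b + a*c holds for all a, b, c.
# Plus does NOT distribute over times: a+(b*c) == (a+b)*(a+c) fails in general.
a, b, c = 1, 2, 3
assert a * (b + c) == a * b + a * c       # 5 == 5
assert a + (b * c) != (a + b) * (a + c)   # 7 != 12
```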
It is well-known that for any regular language there exists a unique minimal DFA accepting it. Establishing a similar result for NFA is more difficult, but of great practical importance, as NFA can be exponentially more succinct than DFA. The main issue is that a regular language can be accepted by several size-minimal NFA that are not isomorphic. A number of sub-classes of NFA, all of which admit canonical representatives, have been identified in the literature to tackle this issue. In this paper we provide a general categorical framework based on distributive laws that unifies constructions of canonical NFA and unveils new ones.
I present a generic partition refinement algorithm that quotients coalgebraic systems by behavioural equivalence, an important task in system analysis and verification. Coalgebraic generality allows us to cover not only classical relational systems but also, e.g., various forms of weighted systems, and furthermore to flexibly combine existing system types. Under assumptions on the type functor that allow representing its coalgebras in terms of nodes and edges, the algorithm runs in time O(m log n) for input coalgebras with n states and m edges. The generic complexity result and the possibility of combining system types yield a toolbox for efficient partition refinement algorithms. Instances of our generic algorithm match the run-time of the best known algorithms for unlabelled transition systems, Markov chains, deterministic automata, Segala systems, and for color refinement. For weighted tree automata, the instance of the generic algorithm even improves on the best known run-time. The algorithm is implemented in full generality in a tool (CoPaR), which allows the composition of existing system types at run-time.
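As a concrete (and far less general) instance of the underlying idea, here is a naive partition-refinement sketch for plain DFAs in the style of Moore's algorithm; the coalgebraic algorithm in the talk subsumes this and achieves O(m log n). State and transition names below are illustrative.

```python
def moore_refine(states, alphabet, delta, accepting):
    """Refine the partition {accepting, non-accepting} until stable.

    delta maps (state, letter) -> state. Returns a dict assigning each state
    the index of its behavioural-equivalence block.
    """
    # block[s] = index of the block that state s currently belongs to.
    block = {s: (1 if s in accepting else 0) for s in states}
    while True:
        # A state's signature: its own block plus the blocks of its successors.
        sig = {s: (block[s],) + tuple(block[delta[s, a]] for a in alphabet)
               for s in states}
        ids = {t: i for i, t in enumerate(sorted(set(sig.values())))}
        new_block = {s: ids[sig[s]] for s in states}
        # Refinement only ever splits blocks, so an unchanged block count
        # means the partition is stable.
        if len(set(new_block.values())) == len(set(block.values())):
            return new_block
        block = new_block

# Example: state 2 behaves exactly like state 1, so they share a block.
delta = {(0, 'a'): 1, (0, 'b'): 0,
         (1, 'a'): 2, (1, 'b'): 0,
         (2, 'a'): 1, (2, 'b'): 0}
blocks = moore_refine({0, 1, 2}, ['a', 'b'], delta, accepting={1, 2})
assert blocks[1] == blocks[2] and blocks[0] != blocks[1]
```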
We consider Horn formula equations, which are special existential second-order formulas. Our main result states that Horn formula equations are valid iff there are solutions expressible in least fixed-point logic. These canonical solutions correspond to the least and greatest solution in the Horn and dual-Horn case, respectively. As applications and corollaries of the fixed-point theorem we obtain new results and simpler proofs of existing results in second-order quantifier elimination, program verification and proof theory.
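The least solution in the Horn case can be illustrated by naive bottom-up evaluation of ground Horn clauses, e.g. graph reachability; this is a simplified illustration (the predicate and relation names are made up for the example), not the construction from the paper.

```python
def least_model(facts, rules):
    """Compute the least model of a set of ground Horn clauses bottom-up.

    facts: a set of atoms; rules: a list of (body, head) pairs where body is
    a tuple of atoms and head is derived once all body atoms hold.
    """
    model = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in model and all(atom in model for atom in body):
                model.add(head)
                changed = True
    return model

# Reachability from node 1 along edges, as Horn clauses:
#   reach(1).   reach(y) :- reach(x), edge(x, y).
edges = {(1, 2), (2, 3), (4, 5)}
rules = [((('reach', x),), ('reach', y)) for x, y in edges]
model = least_model({('reach', 1)}, rules)
assert model == {('reach', 1), ('reach', 2), ('reach', 3)}
```

The returned set is the least fixed point of the immediate-consequence operator, i.e. the canonical least solution the theorem speaks of in the Horn case.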
This is joint work with Stefan Hetzl, see also https://arxiv.org/abs/2109.04633.
Recently, Baltag & van Benthem introduced a logic of functional dependence LFD with atomic dependence statements and dependence quantifiers that is interpreted over generalised assignment models. Indeed, the logic is closely related to the logic of Cylindrical Relativized Set Algebras (CRS); CRS quantifiers and dependence quantifiers are interdefinable. It follows that LFD can be seen as an extension of CRS with dependence atoms. Decidability was shown by Baltag & van Benthem by a detour via quasi-models. However, this essentially involved an infinite construction, leaving the finite model property (fmp) as an open problem. We solve this by showing that LFD has the fmp, using Herwig's theorem on extending partial isomorphisms. Furthermore, we introduce dependence bisimulations and prove a van Benthem-style characterization theorem, characterizing LFD as the largest fragment of FOL invariant under this notion.
(These results are the culmination of a project I did under the supervision of Alexandru Baltag.)
https://link.springer.com/article/10.1007/s10992-020-09588-z
https://arxiv.org/abs/2109.08313
I will first explain STV counting and the parlous state of computer-counting code implemented by various Election Commissions from around Australia. I will then explain how we used Coq to specify a "vanilla" version of STV as a proof-calculus and used it to extract a computer program which not only counts votes according to this specification but also produces a certificate during the count. The specification of the certificate is derived from the counting rules. We have proved, in Coq, that if the certificate is correct with respect to its specification, then the result it encapsulates must be correct with respect to the relevant specification of STV. The certificate is designed so that an average third-year computer science student could write a computer program to check the correctness of the certificate. In particular, each political party could hire their own programmer to easily scrutinise the count produced by any computer program, including our own, that produces such certificates. The only caveat is that we require the publication of all ballots.
In this talk, we propose a general method to "decode" (propositional) intuitionistic logic and various intermediate logics as (dynamic) epistemic logics of knowing how. Our approach is inspired by some scattered ideas hidden in the vast literature of math, philosophy, CS, and linguistics about intuitionistic logic in the past 100 years, which echoed Heyting's initial (but forgotten?) conception of intuitionistic truth as "knowing how to prove". The core technical idea is to use a bundled know-how modality to unify the formalized Brouwer–Heyting–Kolmogorov interpretation and the Kripke semantics of intuitionistic logic. As we will see, this approach can reveal the hidden, complicated dynamic-epistemic information behind the innocent-looking connectives, while making intuitionistic logic intuitive. With the natural but precise epistemic readings of intuitionistic formulas, various technical results about intuitionistic logic and intermediate logics become transparent, if not trivial. If time permits, we show how to decode inquisitive logic and some version of dependence logic as epistemic logics in order to demonstrate the power of this general approach, which could also be a meeting point of various research themes at ILLC.
(The talk is partly based on the joint work with Haoyu Wang and Yunsong Wang)
I will present recent results establishing decidability and complexity upper bounds for many axiomatic extensions of FLec and FLew. The substructural logics FLec and FLew are obtained from the sequent calculus for intuitionistic propositional logic (FLecw) by omitting the weakening rule and contraction rule respectively. This work can be seen as extending Kripke's decidability argument (1959) and Urquhart's upper bound (1999) for FLec. A notable consequence is a first upper bound for the basic fuzzy logic MTL.
Joint work with A. R. Balasubramanian and Timo Lang. See https://arxiv.org/abs/2104.09716 for details.
While modal logics are robustly decidable, first-order modal logics (FOML) are notoriously undecidable, in the sense that even modal extensions of decidable fragments of first-order logic tend to be undecidable. For example, the two-variable fragment of FOML is undecidable.
Despite such discouragement, this century has seen several positive results on new fragments, such as the monodic fragment, bundled fragment, term-modal logic etc. We give an overview of this exciting area of research, including our own results.
The results presented are joint with Anantha Padmanabha.
Various problems arising in the context of example-driven approaches to conjunctive query discovery turn out to be intimately related to basic structural properties of the homomorphism lattice of finite structures, such as density or the existence of duals. I will discuss some of these connections and highlight relevant results from recent joint work with Victor Dalmau. See also https://arxiv.org/abs/2008.06824.
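For a sense of the order underlying the homomorphism lattice: structure A lies below B iff a homomorphism A → B exists, which for small digraphs can be checked by brute force. An illustrative sketch (not from the paper):

```python
from itertools import product

def homomorphism_exists(nodes_a, edges_a, nodes_b, edges_b):
    """Brute-force check for a digraph homomorphism A -> B: a vertex map h
    such that (h(x), h(y)) is an edge of B for every edge (x, y) of A."""
    nodes_a, nodes_b = list(nodes_a), list(nodes_b)
    for image in product(nodes_b, repeat=len(nodes_a)):
        h = dict(zip(nodes_a, image))
        if all((h[x], h[y]) in edges_b for x, y in edges_a):
            return True
    return False

# The directed 3-cycle maps to itself (identity), but the directed 2-cycle
# does not map into the 3-cycle, since the 3-cycle has no 2-cycles.
c2 = {(0, 1), (1, 0)}
c3 = {(0, 1), (1, 2), (2, 0)}
assert homomorphism_exists(range(3), c3, range(3), c3)
assert not homomorphism_exists(range(2), c2, range(3), c3)
```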
Cyclic and non-well-founded proofs have turned out to be very useful tools in the proof theory of modal fixpoint logics. However, most of the known results in this area treat only one particular modal fixpoint logic at a time. This is in contrast to basic modal logic, where much more general proof-theoretic methods exist. In this talk I will outline a first attempt to combine these existing general methods with non-well-founded proof theory in order to uniformly treat multiple modal fixpoint logics.
More precisely, I will build on a technique presented by Ori Lahav (LICS 2013) for constructing cut-free hypersequent calculi for basic modal logics characterised by frames satisfying so-called 'simple' first-order conditions. I will investigate the generalisation of this technique to a relatively inexpressive modal fixpoint language, namely unimodal logic with the master modality (a.k.a. the reflexive-transitive closure modality). Soundness is obtained for all simple frame conditions, completeness only for those that will be called 'equable'.
Talks from 2020 and earlier can also be found on the Algebra|Coalgebra seminar website.
The seminar is currently organized by Bahareh Afshari, Benno van den Berg, Nick Bezhanishvili, Franziska Jahnke, Tobias Kappé and Yde Venema. If you're interested in giving a talk at the seminar or have any questions regarding the seminar please contact Tobias at t.w.j.kappe@uva.nl.
If you want to subscribe to or unsubscribe from the LLAMA seminar mailing list, you can do so here. Please heed the warning about not re-using a valuable password.
The LLAMA seminar, under the Institute for Logic, Language and Computation (ILLC), located at Science Park 107, 1098 XG Amsterdam, is responsible for the processing of personal data as shown in this privacy statement.
The LLAMA seminar does not process personal data via this website (https://events.illc.uva.nl/llama), because it is not possible to leave personal data behind. We also do not use social media plugins.
The LLAMA seminar does not take decisions based on automated processing related to this website on matters that can have (significant) consequences for individuals, e.g., decisions taken by computer programs or systems, without involving a person (for example an organiser of the LLAMA seminar).
The LLAMA seminar does not use cookies or similar techniques on this website.