A broad overview of a set of research themes I work on. The descriptions and pointers are not meant to be exhaustive -- for this, see the other dedicated pages.
I'm currently working on the question of whether a given type T has a unique inhabitant -- modulo a given notion of program equivalence. I wrote two blog posts (part 1 and part 2) on the idea. I gave a talk in May 2013 at the PLUME seminar in Lyon, whose slides are available. I discussed some example applications at the Dependently Typed Programming (DTP'13) Workshop (abstract, slides).
In early 2015 I proposed an algorithm to decide unique inhabitation in the simply-typed lambda-calculus with sums, modulo beta-eta-equivalence. This work is described in a paper, and a prototype implementation is available as well.
My thesis, defended in March 2016, expands on this. See the dedicated PhD page or the manuscript. Abstract:
Some programming language features (coercions, type-classes, implicits) rely on inferring a part of the code that is determined by its usage context. In order to better understand the theoretical underpinnings of this mechanism, we ask: when is it the case that there is a unique program that could have been guessed, or in other words that all possible guesses result in equivalent program fragments? Which types have a unique inhabitant?
To approach the question of unicity, we build on work in proof theory on more canonical representations of proofs. Using the proofs-as-programs correspondence, we can adapt the logical technique of focusing to obtain more canonical program representations.
In the setting of simply-typed lambda-calculus with sums and the empty type, equipped with the strong beta-eta-equivalence, we show that uniqueness is decidable. We present a saturating focused logic that introduces irreducible cuts on positive types “as soon as possible”. Goal-directed proof search in this logic gives an effective algorithm that returns either zero, one or two distinct inhabitants for any given type.
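To make the question concrete, here is a minimal OCaml illustration (the names `id`, `first` and `second` are mine, not from the thesis): some types determine their inhabitant completely, others do not.

```ocaml
(* 'a -> 'a has exactly one inhabitant modulo beta-eta: the identity. *)
let id : 'a -> 'a = fun x -> x

(* 'a -> 'a -> 'a has exactly two distinct inhabitants, so an
   elaboration mechanism could not guess unambiguously here.
   The algorithm stops counting at two: it only needs to distinguish
   "zero", "one" and "more than one". *)
let first : 'a -> 'a -> 'a = fun x _ -> x
let second : 'a -> 'a -> 'a = fun _ y -> y
```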
To start the research on unique inhabitants, I have been looking at the work on (maximal) (multi-)focusing for type/proof systems with sum types. (See these 2014 slides introducing maximal multi-focusing.)
There are interesting connections between these proof-theoretical techniques and the problem of deciding equivalence of pure programs. I have looked at these connections from two different angles:
If multi-focusing is formulated in a logic closer to the lambda-calculus (intuitionistic natural deduction), it reveals close links with existing work on equivalence algorithms for simply-typed lambda-calculus with sums, in particular by Sam Lindley. This is described in this preprint (written in 2013, revised in early 2015).
Another direction is to bring programming languages closer to the elegant symmetries of the sequent calculus. I collaborated with Guillaume Munch-Maccagnoni (2015) on the study of normal forms in a calculus of sequent-style terms, polarized System L, in the presence of sums. This makes it possible to describe algorithms deciding beta-eta-equivalence in a very uniform way, directly at the level of untyped terms.
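As a small illustration of why sums make equivalence subtle, here is the eta law for sums sketched in OCaml (the type `sum` and the functions `f` and `g` are my own hypothetical names):

```ocaml
(* A sum type, as in the simply-typed lambda-calculus with sums. *)
type ('a, 'b) sum = L of 'a | R of 'b

(* An arbitrary function out of a sum type. *)
let f = function L n -> n + 1 | R b -> if b then 0 else 1

(* The eta law for sums equates g with f: g case-splits on its
   argument only to rebuild it before calling f. Recognizing such
   equivalences in general, modulo beta-eta, is what these
   equivalence algorithms decide. *)
let g x = match x with L a -> f (L a) | R b -> f (R b)
```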
When designing type systems for programming languages, there is often an unnoticed mismatch between the dynamic semantics we use and the soundness guarantees given by the type system. We use weak reduction strategies (for implementation reasons, or to reason about side-effect order), but type soundness ("well-typed programs do not go wrong") would actually still hold if we used full reduction, which also reduces under lambdas.
This is all fine, until we end up designing language features that break the latter property (safety with respect to full reduction) -- and we don't notice, because we only ever exercise weak reduction. It means there is something fundamental about those features that we haven't noticed or understood, something that should matter for reasoning about programs. GADTs are such a feature.
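A hedged sketch of the phenomenon, using OCaml's standard equality-witness GADT (the names `eq`, `cast` and `dead_code` are mine):

```ocaml
(* A GADT of type-equality witnesses. *)
type (_, _) eq = Refl : ('a, 'a) eq

(* Matching on the witness lets the type-checker use the equation a = b. *)
let cast : type a b. (a, b) eq -> a -> b = fun Refl x -> x

(* This typechecks: under the (absurd) assumption int = bool, the body
   is well-typed. With weak reduction nothing goes wrong, since no
   closed value of type (int, bool) eq exists and we never evaluate
   under the lambda. Full reduction, however, would reduce this body
   in an inconsistent typing context -- the mismatch discussed above. *)
let dead_code : (int, bool) eq -> bool = fun w -> cast w 0
```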
Didier and I study this question in the following draft: Full reduction in the face of absurdity (2014). To support this study technically, the document uses the consistent coercion calculus developed by Julien Crétin during his PhD thesis -- a very convenient and expressive framework in which to study these issues. For a simplified setting (System F plus logical assumptions), you may want to look at my slides (with notes) from the Journées LTP du GDR GPL, October 2014.
When GADTs were added to OCaml 4.00 by Jacques Garrigue and Jacques Le Normand, we found out that their interaction with subtyping had not been studied theoretically; in particular, we were not sure how to soundly assign variances to GADT parameters, and a very restrictive criterion was implemented in OCaml. In 2012 I worked on finding the right variance check for GADTs, which resulted in the article GADTs meet subtyping (arxiv), also available as a long version or slides.
In 2011, I worked with Andreas Abel on meta-theoretical aspects of predicative dependent type theories -- the definitional-algorithmic correspondence via logical relations, and universe subtyping and polymorphism. I enjoyed collaborating on his article On Irrelevance and Algorithmic Equality in Predicative Type Theory (arxiv), and also wrote a much less rigorous internship report on the question of universe subtyping.
In an earlier internship with Didier Rémy at Gallium (2010), I worked on MLF, a System F-like type system with type inference; I did some prototyping and explored adding Fω-style type-level operators. A horribly-formatted report is available, but this topic would need more work, as it raised more questions than it answered.
In June-July 2009, I worked with Jérôme Vouillon on Macaque, a statically typed domain-specific language for database queries embedded in OCaml. The library is publicly available and in minimal maintenance mode (if bugs are found, they get fixed). We wrote a French-language article for JFLA'10.