Checking Algorithms for Pure Type Systems

L.S. van Benthem Jutting (1), J. McKinna (2) and R. Pollack (2)

(1) Faculty of Mathematics and Computer Science, University of Nijmegen, Toernooiveld 1, 6525 ED Nijmegen, The Netherlands
(2) Laboratory for Foundations of Computer Science, University of Edinburgh, The King's Buildings, Edinburgh, EH9 3JZ, Scotland

[email protected], [email protected]

This work was supported by the ESPRIT Basic Research Actions on Logical Frameworks and Types for Proofs and Programs, and by grants from the British Science and Engineering Research Council. A version of this paper appears in Types for Proofs and Programs: International Workshop TYPES'93, Nijmegen, May 1993, Selected Papers, LNCS 806.
1 Introduction

This work is motivated by the problem of finding reasonable algorithms for typechecking Pure Type Systems [Bar91] (PTS). There are several implementations of formal systems that are either PTS or closely related to PTS. For example, LEGO [LP92] implements the Pure Calculus of Constructions (PCC) [CH88], the Extended Calculus of Constructions [Luo90] and the Edinburgh Logical Framework (LF) [HHP87]. ELF [Pfe89] implements LF; CONSTRUCTOR [Hel91] implements arbitrary PTS with a finite set of sorts. Are these implementations actually correct? Of course, we may enumerate all derivations of a given PTS, and Jutting [vBJ93] has shown that a large class of normalizing PTS have decidable typechecking by computing the normal forms of types, but such techniques are obviously not usable in practice. Algorithms in the literature for particular type systems, such as Huet's Constructive Engine [Hue89], do not obviously extend even to such tame classes as the normalizing and functional PTS. In the rest of this section we briefly review the definition and well-known theory of PTS, outline the basic approach to checking algorithms, and analyse the difficulty in using this approach for checking PTS. In section 1.5, we outline the plan for the rest of this paper.
1.1 Pure Type Systems
A Pure Type System is a quadruple 𝒮 = {S, V, A, R}, where
– S is the set of sorts; elements of S will be denoted by s, s′, s1, ...;
– V is the set of variables; elements of V will be denoted by x, y, z; S ∩ V = ∅;
– A ⊆ S × S is the set of axioms, which we assume to be nonempty;
– R ⊆ S × S × S is the set of Π-rules.
We usually assume we are discussing some specific PTS, 𝒮 = {S, V, A, R}, which may be assumed to have special properties in later sections. Let us also assume that S is denumerable, and that A and R are decidable relations, although these assumptions are only used in discussing algorithmic properties of inductive presentations of relations, not the relationships between different presentations.

Definition 1 (Pseudoterms). The set T of pseudoterms of 𝒮 is the smallest set satisfying
– S ∪ V ⊆ T,
– if a ∈ T and b ∈ T then (a b) ∈ T,
– if A ∈ T, B ∈ T and x ∈ V then (Πx:A.B) ∈ T,
– if A ∈ T, b ∈ T and x ∈ V then (λx:A.b) ∈ T.
Elements of T will be denoted by a, b, c, ..., A, B, C, ... The notions of free and bound variables are defined as usual, with FV(a) denoting the set of free variables of a. We consider equality between terms to be equivalence modulo α-conversion, and this equivalence will be denoted by =.
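As a concrete (if informal) illustration of these definitions, pseudoterms and the specification of a PTS could be represented as follows. This is our sketch, not the paper's code, and all names are hypothetical.

```haskell
-- Our illustration (not the paper's code): a minimal representation of
-- pseudoterms, assuming named variables and strings for sorts.
type Sort = String
type Var  = String

data Term
  = Srt Sort              -- a sort s
  | V   Var               -- a variable x
  | App Term Term         -- application   a b
  | Pi  Var Term Term     -- product       Pi x:A. B
  | Lam Var Term Term     -- abstraction   lambda x:A. b
  deriving (Eq, Show)

-- A PTS specification: the axioms A and the Pi-rules R, given as decidable
-- predicates, exactly as assumed in the text.
data Spec = Spec
  { axiom :: Sort -> Sort -> Bool          -- (s1,s2) in A ?
  , rule  :: Sort -> Sort -> Sort -> Bool  -- (s1,s2,s3) in R ?
  }

-- Example: the Pure Calculus of Constructions, with sorts * and [] (box).
pcc :: Spec
pcc = Spec
  { axiom = \s1 s2 -> s1 == "*" && s2 == "[]"
  , rule  = \s1 s2 s3 -> s1 `elem` srts && s2 `elem` srts && s3 == s2 }
  where srts = ["*", "[]"]
```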
Convention 2 (Variables). In this presentation we are informal about variables, and omit side conditions such as "x is a fresh variable". Much of this paper has been formalized in the LEGO system in a presentation with explicit variable names [MP93], but thinking in terms of de Bruijn nameless variables will clarify many questions that might arise about variables in the following.
Reduction and conversion. We denote the substitution of a for x in b by b[x := a], and write →β for one-step β-reduction, ↠β for β-reduction and ≃β for β-convertibility.
From section 4 onward we extend our concept of reduction by allowing contractions of the form (Πx:A.B) a →π B[x := a], i.e. application of a product to an argument. Intuitively a π-redex (Πx:A.B) a denotes a coordinate axis in the product Πx:A.B. Such considerations first appear in the pioneering work of the AUTOMATH group [vD80], and they allow a presentation of the basic typing relation free of any direct appeal to substitution, which in contemporary presentations of type theory such as [Bar92] appears explicitly in the rule for typing an application (rule App below). We then have the new notions of βπ-reduction, ↠βπ, and βπ-conversion, ≃βπ, generated by the elementary β- and π-contractions. The well-known properties of substitution and reduction, e.g. the Church-Rosser property, extend to ↠βπ and ≃βπ. These properties will be used freely in the sequel.
Contexts. A context, Γ, is a sequence of assignments x:A. If Γ = x1:A1, x2:A2, ..., xn:An (n ≥ 0) we write xi:Ai ∈ Γ for 1 ≤ i ≤ n, and dom(Γ) for the set {x1, x2, ..., xn}. The free variables of Γ are defined by FV(Γ) = ∪i FV(Ai). The empty context is written ⟨⟩, and the set of all contexts is C. Inclusion between contexts is defined by

Γ1 ⊆ Γ2  ≜  ∀x, A [x:A ∈ Γ1 ⇒ x:A ∈ Γ2].

The notion of (one-step) β-reduction is easily extended to contexts: A →β B implies Γ1, x:A, Γ2 →β Γ1, x:B, Γ2.
1.2 Correctness
We define a relation, ⊢, called correctness, as follows.

Definition 3 (Correctness). The relation ⊢ ⊆ C × T × T is the smallest relation satisfying the following rules.

Srt: ⟨⟩ ⊢ s1 : s2, provided (s1, s2) ∈ A
Var: if Γ ⊢ A : s, then Γ, x:A ⊢ x : A
Wk:  if Γ ⊢ b : B and Γ ⊢ A : s, then Γ, x:A ⊢ b : B, provided b ∈ S ∪ V
Pi:  if Γ ⊢ A : s1 and Γ, x:A ⊢ B : s2, then Γ ⊢ Πx:A.B : s3, provided (s1, s2, s3) ∈ R
Lda: if Γ ⊢ A : s1, Γ, x:A ⊢ b : B and Γ, x:A ⊢ B : s2, then Γ ⊢ λx:A.b : Πx:A.B, provided (s1, s2, s3) ∈ R
App: if Γ ⊢ a : Πx:B.A and Γ ⊢ b : B, then Γ ⊢ a b : A[x := b]
Cnv: if Γ ⊢ a : A, Γ ⊢ B : s and A ≃ B, then Γ ⊢ a : B
The cognoscenti will notice that the rule Wk in the definition above is not the usual one, as b is restricted to variables and sorts. It is easy to see from the lemmas below that the relation defined above is equivalent to the usual PTS system. In fact the presentation above gives a better development of the basic metatheory, because the generation lemma does not depend on weakening. The following properties of PTS can be proved along the lines of [GN91] or [Bar92]. Most proofs are by induction on the structure of ⊢-derivations.
Lemma 4 (Free Variables). If Γ ⊢ a : A then FV(a) ∪ FV(A) ⊆ dom(Γ).

Lemma 5 (Start). Suppose Γ ⊢ a : A.
i  (s1, s2) ∈ A  ⟺  Γ ⊢ s1 : s2.
ii If x:B ∈ Γ then Γ ⊢ x : B.

Lemma 6 (Generation).
i   If Γ ⊢ s : A then ∃s′ [A ≃ s′ and (s, s′) ∈ A].
ii  If Γ ⊢ x : A then ∃A′ [A ≃ A′ and x:A′ ∈ Γ].
iii If Γ ⊢ Πx:C.D : A then ∃s1, s2, s3 [A ≃ s3, (s1, s2, s3) ∈ R, Γ ⊢ C : s1 and Γ, x:C ⊢ D : s2].
iv  If Γ ⊢ λx:C.d : A then ∃s1, s2, s3, D [A ≃ Πx:C.D, (s1, s2, s3) ∈ R, Γ ⊢ C : s1, Γ, x:C ⊢ d : D and Γ, x:C ⊢ D : s2].
v   If Γ ⊢ c d : A then ∃x, C, D [A ≃ C[x := d], Γ ⊢ c : Πx:D.C and Γ ⊢ d : D].
Lemma 7 (Weakening). If Γ1 ⊆ Γ2, Γ1 ⊢ a : A and Γ2 ⊢ b : B then Γ2 ⊢ a : A.

Lemma 8 (Substitution). If Γ1, x:A, Γ2 ⊢ b : B and Γ1 ⊢ a : A then Γ1, Γ2[x := a] ⊢ b[x := a] : B[x := a].

Lemma 9 (Correctness of Types). If Γ ⊢ a : A then either A ∈ S or ∃s ∈ S [Γ ⊢ A : s].

Lemma 10 (Closure under β-reduction; Subject Reduction). If Γ ⊢ a : A, Γ ↠ Γ′, a ↠ a′ and A ↠ A′ then Γ′ ⊢ a′ : A′.

1.3 Syntax Directed Systems

It is our purpose to describe algorithms which, for given Γ, a and A, construct a derivation of Γ ⊢ a : A. It is known that this problem is undecidable for some PTS, e.g. λ∗, so a semi-algorithm is all we can hope for. There is a trivial such semi-algorithm: enumerate all possible derivations in turn. However we want an efficient semi-algorithm, and we want to know something about when the problem is decidable. In [vBJ93] it is shown that if a PTS is normalizing (i.e. all well-typed terms have a normal form) and has a finite set of sorts S, then the problem is decidable, but the given algorithm computes the normal form of types, so while much better than the trivial semi-algorithm above, it is still infeasible.

Consider the rules for correctness, definition 3. There is one rule for deriving the type of a sort, Srt; one rule for the type of a variable, Var; and so on for Pi, Lda and App. Thus it is natural to construct a derivation of Γ ⊢ a : A by looking at the shape of a: if a is a sort, use Srt, and so on. This is not quite right, because Srt and Var also specify the shape of the context in their conclusion; e.g. Srt only works on the empty context. But this is what rule Wk is for; it is applicable exactly when we want to use Srt or Var, but cannot because the context is of the wrong shape. (This is one reason we prefer our restricted weakening rule to the general one in [Bar92].) This improved plan, to build a derivation of Γ ⊢ a : A guided by the shape of Γ and a (which we call the subject of the judgement), still has one flaw: the rule Cnv may be used at any point in a derivation without changing the shape of the subject. Putting it the other way around, you cannot decide when to use Cnv in a derivation by looking at the shape of the subject. A system which does not suffer from such a drawback will be called syntax directed. The idea to use a syntax directed presentation for type checking is found in [Mar72] and [Hue89]. We define this notion somewhat informally.
Definition 11 (syntax directed). A set of rules for a relation ⊢ is called nearly syntax directed if for every Γ, a there is at most one rule with a conclusion Γ ⊢ a : A. A set of rules for a relation ⊢ is called syntax directed if for every Γ, a there is at most one rule with a conclusion Γ ⊢ a : A and this rule (if present) produces exactly one type A for a.

The relation defined by a syntax directed set of rules is necessarily the graph of a partial function Γ, a ↦ A. Whether or not the rules allow us to decide this relation depends on the side conditions of the rules, that is, those "premisses" that are not the relation being defined. Similarly, how efficiently we can compute the partial function defined by the rules depends on how efficiently we can compute the side conditions.

Let us assume we have a syntax directed set of rules for a relation equivalent to the correctness relation ⊢. Given an algorithm to compute the corresponding partial function Γ, a ↦ A, which we may call a type-synthesis algorithm, how are we to solve our original problem of typechecking, that is, given Γ, a and A, to construct a derivation of Γ ⊢ a : A? We use our algorithm to compute some type A′ for a, and some type B for A. Provided that B reduces to some sort s (otherwise A was not a possible type), it now suffices to test A and A′ for convertibility, and then appeal to the Cnv rule.

In order to produce a syntax directed system that is equivalent to the correctness relation ⊢, we consider how the rule Cnv is used in derivations, and are led to propose a nearly syntax directed system of rules defining a relation ⊢nsd. In order to define it, we first introduce some notation.

Notation 12. We write Γ ⊢nsd a :↠ A for Γ ⊢nsd a : A′ and A′ ↠ A. Similar notations eliding the names of intermediate terms will be used in the rest of the paper.

Definition 13 (⊢nsd). The relation ⊢nsd ⊆ C × T × T is the smallest relation satisfying the following rules.

Srt-nsd: ⟨⟩ ⊢nsd s1 : s2, provided (s1, s2) ∈ A
Var-nsd: if Γ ⊢nsd A :↠ s, then Γ, x:A ⊢nsd x : A
Wk-nsd:  if Γ ⊢nsd b : B and Γ ⊢nsd A :↠ s, then Γ, x:A ⊢nsd b : B, provided b ∈ S ∪ V
Pi-nsd:  if Γ ⊢nsd A :↠ s1 and Γ, x:A ⊢nsd B :↠ s2, then Γ ⊢nsd Πx:A.B : s3, provided (s1, s2, s3) ∈ R
Lda-nsd: if Γ ⊢nsd A :↠ s1, Γ, x:A ⊢nsd b : B and Γ, x:A ⊢nsd B :↠ s2, then Γ ⊢nsd λx:A.b : Πx:A.B, provided (s1, s2, s3) ∈ R
App-nsd: if Γ ⊢nsd a :↠ Πx:B.A and Γ ⊢nsd b :↠ B, then Γ ⊢nsd a b : A[x := b]
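As an aside, the typechecking recipe described above (synthesize a type for the subject and for the candidate type, then compare) can be sketched as follows, using the Term type from the earlier sketch. The synthesis function and the normaliser are assumptions of the sketch, not part of the paper.

```haskell
-- Contexts as association lists, most recently bound variable first.
type Cxt = [(Var, Term)]

-- 'check' is parameterised over a type-synthesis function and a normaliser,
-- both assumed to exist for the PTS at hand; comparing normal forms is a
-- naive convertibility test (and it glosses over alpha-conversion), but it
-- shows the shape of the recipe: synthesise, check the candidate type is
-- itself typable by a sort, then compare.
check :: (Cxt -> Term -> Maybe Term)   -- synth: the partial function Gamma, a |-> A
      -> (Term -> Term)                -- nf: beta-normal form (assumed to exist)
      -> Cxt -> Term -> Term -> Bool
check synth nf gamma a ty =
  case (synth gamma a, synth gamma ty) of
    (Just a', Just tyTy) -> isSort (nf tyTy) && nf a' == nf ty
    _                    -> False
  where
    isSort (Srt _) = True
    isSort _       = False
```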
For this system we can prove soundness.
Lemma 14 (Soundness of ⊢nsd). If Γ ⊢nsd a : A then Γ ⊢ a : A.

Proof. By induction on Γ ⊢nsd a : A, using correctness of types (lemma 9) and closure (lemma 10) for ⊢. ∎
Conversely we would like to have Completeness of ⊢nsd: if Γ ⊢ a : A then ∃A′ ∈ T [A ≃ A′ and Γ ⊢nsd a : A′]. However we have not been able to prove this. In fact the rule Lda-nsd turns out to be an obstacle, both in proving this property directly and in proving some form of closure or correctness of types for ⊢nsd.
1.4 Expansion Postponement

In order to analyse the situation, split Cnv into two rules, one for expansion and one for reduction, obtaining a new system with correctness relation ⊢er.

Definition 15 (⊢er). The relation ⊢er ⊆ C × T × T is the smallest relation satisfying the ordinary PTS rules, but having instead of Cnv the following two rules.

Red-er: if Γ ⊢er a : A and A ↠ B, then Γ ⊢er a : B
Exp-er: if Γ ⊢er a : A, Γ ⊢er B : s and B ↠ A, then Γ ⊢er a : B

Lemma 16 (Equivalence of ⊢ and ⊢er). Γ ⊢ a : A  ⟺  Γ ⊢er a : A.

Proof. We treat the two directions.
(⇒) By induction on Γ ⊢ a : A. The interesting case is the use of Cnv: Γ ⊢ a : B as a consequence of Γ ⊢ a : A, Γ ⊢ B : s and A ≃ B. The induction hypothesis gives Γ ⊢er a : A and Γ ⊢er B : s. By Church-Rosser A and B have a common reduct C, so Γ ⊢er a : C by Red-er and Γ ⊢er a : B by Exp-er.
(⇐) By induction on Γ ⊢er a : A, using closure for ⊢ in the case Red-er. ∎
In the spirit of our program to remove non-syntax-directed rules, we ask whether Exp-er has any use in derivations (the question was proposed in this form by Henk Barendregt); i.e. we consider a new system, ⊢r, which does not have the expansion rule.

Definition 17 (⊢r). The relation ⊢r ⊆ C × T × T is the smallest relation satisfying the ordinary PTS rules, but having instead of the rule Cnv the following rule.

Red-r: if Γ ⊢r a : A and A ↠ B, then Γ ⊢r a : B

Observe that ⊢r is equivalent to ⊢nsd.
Lemma 18 (Equivalence of ⊢r and ⊢nsd).
i  If Γ ⊢nsd a : A then Γ ⊢r a : A.
ii If Γ ⊢r a : A then ∃A′ [A′ ↠ A and Γ ⊢nsd a : A′].
The proof is straightforward; in fact we constructed ⊢nsd from ⊢r by moving uses of Red-r to the end of derivations, i.e. by permuting Red-r downward wherever possible through the premisses of other rules. Clearly ⊢r is contained in ⊢, i.e. ⊢r is sound for ⊢. We would also like to prove that ⊢r is complete for ⊢ in the following sense.

Expansion Postponement: if Γ ⊢ a : A then ∃A′ [A ↠ A′ and Γ ⊢r a : A′].

All attempts to prove this have failed to date. The reason is that no form of a closure lemma has been proved for ⊢r, and this is due to the failure of inductive proofs on the third premiss of the rule Lda-r. (This is the crux of the difficulty in proving ⊢nsd complete for ⊢.) Informally we might say that the difficulty is that B moves from right of the colon in the second premiss, where it is an "output" of the subderivation Γ, x:A ⊢r b : B, to left of the colon in the third premiss, where it is an "input" to the subderivation Γ, x:A ⊢r B : s2. We don't have enough structural information on outputs to reason about them as inputs. In proving closure for ⊢ we use (the expansion part of) Cnv to adjust the shape of outputs, but this is not available in ⊢r. Closure is a very delicate property of PTS. At this moment the authors feel that they cannot honestly call expansion postponement for arbitrary PTS a conjecture.

Expansion postponement is an intensional concept. It concerns a set of derivation rules for some relation, rather than the relation itself. See the remark in section 3.1 for a frustrating illustration of this point. We now have, somewhat inexactly,

⊢nsd = ⊢r ⊆ ⊢er = ⊢.

Expansion postponement is the property that ⊢r ⊇ ⊢er. Thus, if we assume that expansion postponement holds for some specific PTS, then ⊢nsd is a nearly syntax directed presentation of the ⊢ relation for that PTS.

Remark. Since we cannot now prove that ⊢r has subject reduction or predicate reduction, it is interesting to define a relation ⊢R, similar to ⊢r, having the same rules as ⊢ except for Cnv, which is replaced by

Red-R: if Γ ⊢R a : A, Γ ⊢R B : s and A ↠ B, then Γ ⊢R a : B

There is an expansion postponement problem for ⊢R, which we also cannot prove or disprove. Clearly ⊢R ⊆ ⊢r, so ⊢R is "even worse" than ⊢r. However ⊢R is easily seen to have the correctness-of-types property, which we cannot prove for ⊢r. In fact, we can show that correctness-of-types for ⊢r implies r-expansion-postponement for all functional PTS (functional PTS are defined in section 2.2), but this has not helped in the expansion postponement problem for ⊢R.
1.5 Plan for this paper

There are two obstructions to deriving syntax directed presentations of arbitrary PTS, namely the possible non-determinism of the side conditions, and, more seriously, the third premiss of the Lda rule. The purpose of this premiss is to guarantee correctness of the type derived for a lambda term using the Lda rule, that is, to make Correctness of Types (lemma 9) true. In this paper we consider two different approaches for removing this troublesome premiss while preserving some form of equivalence with ⊢. We begin in section 2 with further introductory and motivating observations.
Our first approach, in section 3, is to consider a subclass of PTS, called semi-full, which have the property that the first two premisses of the Lda rule imply the third premiss. This allows a nearly syntax directed presentation for this class, which fails to be syntax directed only because of possible non-functionality of the relations A and R. We introduce a technique of schematization to make this presentation syntax directed, and to use it as a typechecking algorithm. This section is a warm-up for the use of schematization in later sections. Special subclasses of PTS such as semi-full are so redundant that the third premiss of the Lda rule is unnecessary. Even for the general PTS this premiss is more restrictive than necessary. In the rest of the paper our second, more general approach looks more deeply at the presentation of ⊢ in order to take advantage of this redundancy. In section 4 we discuss two closely connected typing relations, ⊢o and ⊢tp, which are more liberal than ⊢. Alone, the system ⊢tp is too weak to have any reasonable properties (for example, closure obviously fails), but corollaries 54 and 55 show that for terms already known to be well-typed, ⊢tp suffices for correct typechecking. As it stands, this last statement is somewhat inexact, but for now we postpone the precise account of the delicate interaction between ⊢, ⊢o and ⊢tp. These liberal relations are used in section 5 to give a nearly syntax directed presentation of the general PTS, in particular replacing the troublesome third premiss of the Lda rule with an appeal to ⊢tp. This presentation allows arguments about PTS that we do not know how to carry out in the standard presentation of ⊢ given above; for example we prove strengthening for arbitrary PTS. Sections 4 and 5 are the core of the new ideas in this paper, and can be read on their own by a knowledgeable reader. For non-functional PTS, the Lda rule retains an essential non-determinism, as terms may acquire more types by reduction (see example 1 in section 2.2 below), but for functional PTS the nearly syntax directed presentation of section 5 can easily be made syntax directed, as explained in section 6, and can then be seen as an efficient semi-decision procedure for all functional PTS, and as an efficient decision procedure for the decidable ones. In section 7 we give a syntax directed system that is related to the general PTS, and discuss how to make it into an efficient semi-decision procedure.
A machine-checked presentation. From an early draft of this paper, the second and third authors worked to give a completely machine-checked presentation in the LEGO system of all the results of this paper. We have achieved this objective [MP94], except for considerations of schematic terms and judgements (see subsection 3.2 below). For example, the strengthening theorem (63) is formally checked. Further extensions of this work, for example to systems with cumulativity rules for the sorts, may be found in the third author's forthcoming Ph.D. thesis [Pol94].
2 Preliminary Observations

This section is still introductory. We address several points that are used in future sections, but whose discussion there would be lost in the fray of other matters.
2.1 Making the Application Rule Syntax Directed
Remembering that, lacking a proof of Expansion Postponement, ⊢nsd is not necessarily complete for ⊢, we nonetheless examine the App-nsd rule in more detail, as motivation for following sections.

App-nsd: if Γ ⊢nsd a :↠ Πx:B.A and Γ ⊢nsd b :↠ B, then Γ ⊢nsd a b : A[x := b]

It fails to be syntax directed because we haven't specified when to stop reducing in the left premiss. In particular A, which appears to the right of the colon in the conclusion, is not determined. The intention is to check that a has some functional type (i.e. some Π-type), and that the type of b matches the domain of a's functional type. For this it is sufficient to do weak-head reduction (denoted by ↠wh) on the type of a, since every Π-type is a weak-head normal form. We are led to an alternative, syntax directed rule

if Γ ⊢nsd a :↠wh Πx:B.A and Γ ⊢nsd b : B′, with B ≃ B′, then Γ ⊢nsd a b : A[x := b]

in which the type of a b is uniquely determined by the type of a. Rules of this shape will be used in several following sections.
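For illustration, weak-head reduction can be implemented by contracting redexes at the head only; a minimal sketch over the Term type of the earlier sketches follows. The naive substitution is an assumption of the sketch and ignores variable capture, as flagged in the comments.

```haskell
-- Naive substitution b[x := a]: it ignores variable capture, which is safe
-- only under a freshness convention (cf. Convention 2); de Bruijn indices
-- would avoid the issue in a real checker.
subst :: Var -> Term -> Term -> Term
subst x a = go
  where
    go (V y)        = if y == x then a else V y
    go (Srt s)      = Srt s
    go (App f b)    = App (go f) (go b)
    go (Pi  y ty b) = Pi  y (go ty) (if y == x then b else go b)
    go (Lam y ty b) = Lam y (go ty) (if y == x then b else go b)

-- Weak-head reduction: contract beta-redexes at the head only.  Every
-- Pi-type is already a weak-head normal form, so this is all the refined
-- application rule needs.  (For the systems of section 4 one would also
-- contract head pi-redexes (Pi x:A. B) a in the same way.)
whnf :: Term -> Term
whnf (App f a) =
  case whnf f of
    Lam x _ b -> whnf (subst x a b)   -- head beta-contraction
    f'        -> App f' a             -- head is not an abstraction: stop
whnf t = t
```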
2.2 Functional Pure Type Systems

We digress to define a well-behaved class of PTS that occur commonly in practice.

Definition 19 (Functional PTS). A PTS is called functional iff
– (s1, s) ∈ A and (s1, s′) ∈ A implies s = s′, and
– (s1, s2, s) ∈ R and (s1, s2, s′) ∈ R implies s = s′.

For a functional PTS, A and R are the graphs of partial functions from S to S and from S × S to S respectively, but we do not necessarily have procedures to compute these functions. It is well known that for functional PTS we have uniqueness of types (cf. [GN91] or [Bar92]).

Lemma 20 (Uniqueness of Types). For functional PTS: if Γ ⊢ a : A0 and Γ ⊢ a : A1 then A0 ≃ A1.
Subject Expansion. By the Subject Reduction Lemma (lemma 10) we know that a term does not lose any types by reduction. Might a well-typed term gain types by reduction? For functional PTS this cannot happen.

Lemma 21 (Subject Expansion). For PTS with Uniqueness of Types: if Γ ⊢ a : A, b ↠ a, and Γ ⊢ b : B, then Γ ⊢ b : A.

The condition Γ ⊢ b : B in this lemma, that b has some type, is necessary to rule out untypable expansions.

Proof. By closure (lemma 10) Γ ⊢ a : B. By uniqueness of types A ≃ B. By correctness of types (lemma 9) A ∈ S or Γ ⊢ A : s for some s. In the first case B ↠ A, and we are done by closure. In the second case we are done by the rule Cnv. ∎

The general PTS does not have the subject expansion property:

Example 1. Consider the non-functional PTS

S = {∗, △, ▽, □},  A = {(∗, △), (∗, ▽), (△, □)},  R = {(□, □, □)}.

Let a = (λx:△.x) ∗. We have a ↠ ∗, with ⊢ ∗ : △ and ⊢ ∗ : ▽. Consider the types of a in the empty context. First notice that ⊢ a : △: by Srt, ⊢ △ : □; by Var, x:△ ⊢ x : △; by Wk, x:△ ⊢ △ : □; so by Lda, using (□, □, □) ∈ R, ⊢ λx:△.x : Πx:△.△; finally, with ⊢ ∗ : △, App gives ⊢ (λx:△.x) ∗ : △.

Now we claim that ⊢ a : ▽ is not derivable, so the well-typed term a gains types by reduction to ∗. To prove this claim we use the generation lemma (lemma 6). In general, if ⊢ a : X for some X, then there exist Y, Z with ⊢ λx:△.x : Πx:Y.Z and X ≃ Z[x := ∗]. By the generation lemma again, x:△ ⊢ x : W, x:△ ⊢ W : s, and Z ≃ W. By the generation lemma again, W ≃ △, hence ⊬ a : ▽. Notice also that this argument never appeals to "there is no rule of a certain shape"; even if we expand R to S × S × S, every type of a converts with △.

Examining this example more closely, observe that a variable can have only one type (up to conversion), while a sort can have several types in non-functional PTS (see [vBJ93] for a deep analysis of this point). In the example, we use reduction to replace a variable by a sort, increasing the types derivable for a term. However, this is not the only way that subject expansion can fail: consider the term b = (λx:△.∗) ∗. We leave it to the reader to check that again b ↠ ∗ and ⊢ b : △ but ⊬ b : ▽. In this case the reason is that ⊢ △ : □, but ▽ has no type at all. Must this process of obtaining more types by reduction eventually stop, or are there (non-functional) PTS with infinitely many sorts, and (non-normalizing) terms in these PTS that keep acquiring more and more types by reduction?

As an aside, notice that this PTS is strongly normalizing. It can be mapped into three levels of a strongly normalizing predicative type hierarchy (for definiteness we mention ECC [Luo90], but much weaker systems will do) by the PTS-morphism ([Geu, Geu93])

∗ ↦ Type(0),  △ ↦ Type(1),  ▽ ↦ Type(1),  □ ↦ Type(2).

Since this mapping preserves sorts, axioms and rules, any well-typed term in the system above has a well-typed image, and therefore strongly normalizes. Further, since S is finite, this system has decidable typechecking [vBJ93]. ∎

It is quite difficult to think of a PTS of independent interest that is not functional!
2.3 Side Conditions and Decidability of Typechecking
Remember that we assumed A and R are decidable. Clearly, however, the set

Ȧ  ≜  { s ∈ S | ∃s′ ∈ S [(s, s′) ∈ A] }

may not be decidable. (This is true even for functional PTS.) The set of topsorts, S \ Ȧ, is interesting in its own right; see [Ber90] or [Pol94]. Furthermore, if Ȧ is not decidable, then the PTS does not have decidable typechecking, because

x:s ⊢ x : s  ⟺  s ∈ Ȧ.

Similarly, if the relation

Ṙ  ≜  { (s1, s2) ∈ S × S | ∃s3 ∈ S [(s1, s2, s3) ∈ R] }

is not decidable, then the PTS may not have decidable typechecking, but since some Π-rules may not be usable the situation is not as clear. In a functional PTS, A (respectively R) is the graph of a partial function. If Ȧ (respectively Ṙ) is decidable, then we can compute that function: use decidability of Ȧ (respectively Ṙ) to decide if the function is defined on a particular input, and if so use decidability of A (respectively R) to search the denumerable set S for the function value. As A and R are arbitrary given relations, there is not much interesting to say about these side conditions.
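For a functional PTS whose sorts are given by an enumeration, the search described above might look as follows; the enumeration and both decision procedures are assumptions of this sketch.

```haskell
-- For a functional PTS whose sorts are given by an enumeration, compute the
-- partial function behind A.  'axDom s' decides membership of s in the set
-- A-dot (s has some type) and 'ax' decides the axiom relation itself; both
-- are assumptions of this sketch.
axFun :: [Sort]                 -- an enumeration of S
      -> (Sort -> Bool)         -- decides s in A-dot
      -> (Sort -> Sort -> Bool) -- decides (s1,s2) in A
      -> Sort -> Maybe Sort
axFun sorts axDom ax s1
  | not (axDom s1) = Nothing    -- the partial function is undefined at s1
  | otherwise      = Just (head [ s2 | s2 <- sorts, ax s1 s2 ])
  -- the search terminates: axDom guarantees a witness exists, and
  -- functionality guarantees it is the unique value of the function
```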
2.4 An Optimization: Valid Contexts

There is one easy optimization of the definition of PTS which will shorten derivations, and therefore also the checking algorithms which could produce such derivations. In the premisses of the rules Wk, Pi, Lda, App and Cnv the context Γ occurs more than once, and in order to construct a complete derivation its validity must be checked in each subderivation in which it appears. It is much more efficient to assume that we start with a valid context, and only check, when rules extend the context (i.e. the right premiss of Pi and the two right premisses of Lda), that they maintain validity. This is also more in keeping with the implementations which are actually used, where we work in a "current context" of mathematical assumptions. (In practice, the context also contains definitions, but in this paper we will not discuss definitions.) We formalize this optimization in the following definition. Γ ⊢vtyp a : A should be interpreted as "a has type A relative to the (possibly incorrect) context Γ", and Γ ⊢vcxt as "Γ is a valid (or correct) context".

Definition 22 (⊢vtyp and ⊢vcxt). The relation ⊢vtyp ⊆ C × T × T is the smallest relation satisfying the following rules.

Srt-vtyp: Γ ⊢vtyp s1 : s2, provided (s1, s2) ∈ A
Var-vtyp: Γ ⊢vtyp x : A, provided x:A ∈ Γ
Pi-vtyp:  if Γ ⊢vtyp A : s1 and Γ, x:A ⊢vtyp B : s2, then Γ ⊢vtyp Πx:A.B : s3, provided (s1, s2, s3) ∈ R and x ∉ dom(Γ)
Lda-vtyp: if Γ ⊢vtyp A : s1, Γ, x:A ⊢vtyp b : B and Γ, x:A ⊢vtyp B : s2, then Γ ⊢vtyp λx:A.b : Πx:A.B, provided (s1, s2, s3) ∈ R and x ∉ dom(Γ)
App-vtyp: if Γ ⊢vtyp a : Πx:B.A and Γ ⊢vtyp b : B, then Γ ⊢vtyp a b : A[x := b]
Cnv-vtyp: if Γ ⊢vtyp a : A, Γ ⊢vtyp B : s and A ≃ B, then Γ ⊢vtyp a : B

The predicate ⊢vcxt is the smallest predicate on C satisfying the following rules.

Nil-vcxt:  ⟨⟩ ⊢vcxt
Cons-vcxt: if Γ ⊢vcxt and Γ ⊢vtyp A : s, then Γ, x:A ⊢vcxt, provided x ∉ dom(Γ)

Lemma 23 (Equivalence of ⊢ and ⊢vtyp). Γ ⊢ a : A  ⟺  Γ ⊢vcxt and Γ ⊢vtyp a : A.

In order to prove this, first observe that weakening is trivial for ⊢vtyp. Also it is easy to derive generation properties (compare lemma 6) for sorts and variables.

Lemma 24 (Weakening and Generation for ⊢vtyp).
i   If Γ1 ⊢vtyp a : A and Γ1 ⊆ Γ2 then Γ2 ⊢vtyp a : A.
ii  If Γ ⊢vtyp x : A then x:A′ ∈ Γ, where either A = A′ or [Γ ⊢vtyp A : s and A ≃ A′].
iii If Γ ⊢vtyp s : A then (s, s′) ∈ A and either A = s′ or [Γ ⊢vtyp A : s1 and A ≃ s′].

Proof of lemma 23. (⇒) By induction on Γ ⊢ a : A, using 24 for the case of Wk. (⇐) By induction on the lexicographic order: first the sum of the lengths of the derivations of Γ ⊢vcxt and Γ ⊢vtyp a : A, second the length of the derivation of Γ ⊢vtyp a : A. ∎

In what follows we consider many different rule systems related to PTS. These systems could be based on the optimized presentation ⊢vtyp, or could be optimized by this technique as they come up. In fact, for simplicity, we will start from the basic PTS relation, ⊢, and will not mention this optimization again, leaving its application to the reader.
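The shape of this optimization, in terms of a hypothetical synthesis function, is roughly the following sketch: validity of the ambient context is established once, and the synthesis function it is handed is trusted not to re-validate it.

```haskell
-- The shape of the 'vcxt'/'vtyp' split: 'validCxt' checks a context once;
-- the synthesis function 'synthIn' passed to it (an assumption of this
-- sketch) types a term relative to a context that it does not re-validate.
-- Contexts grow at the head of the list, as in the earlier sketches.
validCxt :: (Cxt -> Term -> Maybe Term) -> Cxt -> Bool
validCxt _       []            = True
validCxt synthIn ((x, a) : g)  =
  x `notElem` map fst g &&          -- the side condition x not in dom(Gamma)
  validCxt synthIn g &&
  case synthIn g a of
    Just t  -> isSort (whnf t)      -- the new type A must itself have a sort
    Nothing -> False
  where
    isSort (Srt _) = True
    isSort _       = False
```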
3 Semi-full Pure Type Systems

A PTS is called full iff for all s1, s2 ∈ S there exists s3 ∈ S with (s1, s2, s3) ∈ R. In full PTS the troublesome third premiss of the Lda rule can be omitted. Let us focus on the Lda rule:

Lda: if Γ ⊢ A : s1, Γ, x:A ⊢ b : B and Γ, x:A ⊢ B : s2, then Γ ⊢ λx:A.b : Πx:A.B, provided (s1, s2, s3) ∈ R

The purpose of the premisses Γ ⊢ A : s1 and Γ, x:A ⊢ B : s2 is to guarantee that the type Πx:A.B will be well-formed. But we know from the premiss Γ, x:A ⊢ b : B that Γ ⊢ A : sA for some sA, and by correctness of types we have either B ∈ S or Γ, x:A ⊢ B : sB for some sB. In this latter case, for full PTS, we conclude there exists s with (sA, sB, s) ∈ R, so Πx:A.B is well formed, and it seems sound to replace the right premiss of the Lda rule by the requirement that B is not a topsort or rather, making a positive statement, that B ∈ S implies (B, sB) ∈ A for some sB. We can generalize this idea somewhat beyond full PTS.

Definition 25 (Semi-Full). A PTS is semi-full iff for all s1 ∈ S:

∃s2, s3 [(s1, s2, s3) ∈ R]  ⟹  ∀s2 ∃s3 [(s1, s2, s3) ∈ R].

While the Pure Calculus of Constructions, λPω, and various extensions with type universes are full (e.g. ECC [Luo90]), the Edinburgh Logical Framework, λP, is semi-full. In [HHP92] it is shown that λP is decidable by giving an algorithm which computes the normal form of types; a very expensive computation in practice. We will show that the algorithm which is actually used in LEGO is both sound and complete. Notice that λPω and λP are the only semi-full systems in the λ-cube.
3.1 A Nearly Syntax Directed system for Semi-full PTS's
To begin, define a correctness relation ⊢sf which, for a semi-full PTS, is equivalent to the ordinary relation ⊢.

Definition 26 (⊢sf). The relation ⊢sf ⊆ C × T × T is the smallest relation satisfying the ordinary PTS rules, but having instead of the rule Lda the following two rules.

Lda1-sf: if Γ ⊢sf A : s1 and Γ, x:A ⊢sf b : B, then Γ ⊢sf λx:A.b : Πx:A.B, provided (s1, s2, s3) ∈ R and B ∉ S
Lda2-sf: if Γ ⊢sf A : s1 and Γ, x:A ⊢sf b : sb, then Γ ⊢sf λx:A.b : Πx:A.sb, provided (s1, s2, s3) ∈ R and (sb, s4) ∈ A
Lemma 27 (Equivalence of ⊢sf and ⊢). For semi-full PTS: Γ ⊢ a : A  ⟺  Γ ⊢sf a : A.

Proof. We do both directions.
(⇒) Induction on Γ ⊢ a : A. The interesting case is the Lda rule: Γ ⊢ λx:A.b : Πx:A.B as a consequence of Γ ⊢ A : s1, Γ, x:A ⊢ b : B and Γ, x:A ⊢ B : s2, for some s1, s2, s3 such that (s1, s2, s3) ∈ R. By the induction hypothesis Γ ⊢sf A : s1, Γ, x:A ⊢sf b : B and Γ, x:A ⊢sf B : s2. If B ∉ S we are done by Lda1-sf. If B ∈ S it follows that (B, s2) ∈ A by the generation lemma for ⊢sf, so use Lda2-sf.
(⇐) Induction on the derivation of Γ ⊢sf a : A. The interesting cases are Lda1-sf and Lda2-sf.
Lda1-sf: Γ ⊢sf λx:A.b : Πx:A.B because of Γ ⊢sf A : s1, Γ, x:A ⊢sf b : B and (s1, s2, s3) ∈ R, while B ∉ S. By the induction hypothesis Γ ⊢ A : s1 and Γ, x:A ⊢ b : B. It follows by correctness of types that B ∈ S or Γ, x:A ⊢ B : sB. As the first case does not apply, and as our system is semi-full, we have (s1, sB, s′) ∈ R. Hence Γ ⊢ λx:A.b : Πx:A.B by Lda.
Lda2-sf: Γ ⊢sf λx:A.b : Πx:A.B because Γ ⊢sf A : s1, Γ, x:A ⊢sf b : B and (s1, s2, s3) ∈ R, while B ∈ S and (B, s4) ∈ A. By the induction hypothesis Γ ⊢ A : s1 and Γ, x:A ⊢ b : B. Also ⟨⟩ ⊢ B : s4 by Srt, so Γ, x:A ⊢ B : s4 by weakening, and because our system is semi-full we have (s1, s4, s′) ∈ R; hence again Γ ⊢ λx:A.b : Πx:A.B by Lda. ∎

Remark on Expansion Postponement. Consider two variants of ⊢sf (see section 1.4 and the remark there), replacing Cnv by either of the rules

Red-sf-r: if Γ ⊢sf-r a : A and A ↠ B, then Γ ⊢sf-r a : B
Red-sf-R: if Γ ⊢sf-R a : A, Γ ⊢sf-R B : s and A ↠ B, then Γ ⊢sf-R a : B
We have, up to conversion, the following relationships (for semi-full PTS):

⊢R ⊆ ⊢r ⊆ ⊢   and   ⊢sf-R ⊆ ⊢sf-r = ⊢sf,   with   ⊢R = ⊢sf-R,   ⊢r ⊆ ⊢sf-r,   ⊢ = ⊢sf.

The rightmost vertical equality (⊢ = ⊢sf) is lemma 27 and, since ⊢R has correctness of types, the same proof gives the leftmost vertical equality (⊢R = ⊢sf-R). However ⊢r is not known to have correctness of types, so the middle of the three vertical relations remains an inequality. The rightmost equality on the bottom row (⊢sf-r = ⊢sf) is by straightforward induction: this is what we gain by removing the right premiss of the lambda rule. As you see, we cannot yet conclude that every ⊢-derivation can be mimicked up to conversion by a ⊢r-derivation, or by a ⊢R-derivation, even when restricted to semi-full PTS.

As Lda1-sf and Lda2-sf do not have the troublesome third premiss, it is now straightforward to define a nearly syntax directed system which is sound and complete for ⊢sf.

Definition 28 (⊢sfnsd). The relation ⊢sfnsd ⊆ C × T × T is the smallest relation satisfying the following rules.
Srt-sfnsd: ⟨⟩ ⊢sfnsd s1 : s2, provided (s1, s2) ∈ A
Var-sfnsd: if Γ ⊢sfnsd A :↠ s, then Γ, x:A ⊢sfnsd x : A
Wk-sfnsd:  if Γ ⊢sfnsd b : B and Γ ⊢sfnsd A :↠ s, then Γ, x:A ⊢sfnsd b : B, provided b ∈ S ∪ V
Pi-sfnsd:  if Γ ⊢sfnsd A :↠ s1 and Γ, x:A ⊢sfnsd B :↠ s2, then Γ ⊢sfnsd Πx:A.B : s3, provided (s1, s2, s3) ∈ R
Lda1-sfnsd: if Γ ⊢sfnsd A :↠ s1 and Γ, x:A ⊢sfnsd b : B, then Γ ⊢sfnsd λx:A.b : Πx:A.B, provided (s1, s2, s3) ∈ R and B ∉ S
Lda2-sfnsd: if Γ ⊢sfnsd A :↠ s1 and Γ, x:A ⊢sfnsd b :↠ sb, then Γ ⊢sfnsd λx:A.b : Πx:A.sb, provided (s1, s2, s3) ∈ R and (sb, s4) ∈ A
App-sfnsd: if Γ ⊢sfnsd a :↠wh Πx:B0.A and Γ ⊢sfnsd b : B1, then Γ ⊢sfnsd a b : A[x := b], provided B0 ≃ B1

Lemma 29 (Equivalence of ⊢sf and ⊢sfnsd). For semi-full PTS:
i  If Γ ⊢sfnsd a : A then Γ ⊢sf a : A.
ii If Γ ⊢sf a : A then ∃A′ [A ≃ A′ and Γ ⊢sfnsd a : A′].
Proof. We prove both parts.
i Induction on Γ ⊢sfnsd a : A, using the fact that closure holds for ⊢sf by equivalence (lemma 27) and closure for ⊢ (lemma 10). We treat the case of App-sfnsd: Γ ⊢sfnsd a b : A[x := b] from Γ ⊢sfnsd a : A0, A0 ↠wh Πx:B0.A, Γ ⊢sfnsd b : B1 and B0 ≃ B1. By induction Γ ⊢sf a : A0 and Γ ⊢sf b : B1. Take a common reduct B of B0 and B1. It follows by closure for ⊢sf that Γ ⊢sf a : Πx:B.A and Γ ⊢sf b : B, and therefore Γ ⊢sf a b : A[x := b] by App.
ii Induction on Γ ⊢sf a : A, using correctness of types for ⊢sf, which is again a consequence of equivalence (lemma 27) and correctness of types for ⊢ (lemma 9). Again we do only one case, Lda1-sf: Γ ⊢sf λx:A.b : Πx:A.B from Γ ⊢sf A : s1, Γ, x:A ⊢sf b : B and (s1, s2, s3) ∈ R, while B ∉ S. By the induction hypothesis Γ ⊢sfnsd A : A′ ≃ s1 and Γ, x:A ⊢sfnsd b : B0 ≃ B. It remains to show that B0 ∈ S implies (B0, s) ∈ A for some s. So suppose B0 ∈ S, hence B ↠ B0. By correctness of types Γ, x:A ⊢sf B : sB, hence Γ, x:A ⊢sf B0 : sB by closure, so by generation (B0, sB) ∈ A and we are done. ∎
3.2 A Syntax Directed system for Semi-full PTS's
The only failure of syntax-directedness in the rules of system ⊢sfnsd is due to possible non-functionality of the PTS. For example, if (s1, s2) ∈ A and (s1, s2′) ∈ A, the rule Srt-sfnsd can be used to derive the judgements ⟨⟩ ⊢sfnsd s1 : s2 and ⟨⟩ ⊢sfnsd s1 : s2′. Similarly Pi-sfnsd, Lda1-sfnsd and Lda2-sfnsd may be non-syntax-directed due to non-functionality of R. For functional PTS ⊢sfnsd is already syntax directed, and may be used directly for typechecking. For non-functional semi-full PTS we will remove this non-determinism by a method suggested in [Hue87, HP91, Pol92], and for this purpose we first introduce some technical machinery.
Schematic Terms. Introduce a new set Φ of sort variables, disjoint from V and S; sort variables will be denoted by α, α1, α2, ... The symbols σ, τ, η, ... will range over Φ ∪ S.

Definition 30 (Schematic Terms). The set TΦ of schematic terms is the smallest set satisfying:
– Φ ⊆ TΦ,
– if X ∈ TΦ and b ∈ T then X b ∈ TΦ,
– if A ∈ T, X ∈ TΦ and x ∈ V then Πx:A.X ∈ TΦ.
We use X, Y, Z to range over T ∪ TΦ.

In this section we will only need schematic terms of the form Πx1:A1. ... Πxn:An.α, but in section 7 the more general notion will be used. If X ∈ T ∪ TΦ we denote by SV(X) the sort variables of X. Substitution and β- and π-reduction are easily extended to TΦ. A sort assignment ρ is a partial function from Φ to S. If X ∈ TΦ and SV(X) ⊆ dom(ρ) then ρX ∈ T is defined in the obvious way. Substitution of pseudoterms and reduction are preserved by assignments, i.e. X ↠ Y iff ρX ↠ ρY, for any assignment ρ.

A constraint is a finite set of formulas of the form (σ, τ) ∈ A, (σ, τ, η) ∈ R and σ = τ. The set of all constraints is denoted by Cnstr. If C is a constraint then SV(C) denotes the set of sort variables occurring in C. A sort assignment ρ is said to satisfy a constraint C, written ρ ⊨ C, iff SV(C) ⊆ dom(ρ) and each of the propositions in ρC is true. A constraint, C, is said to be consistent or satisfiable, written C con, if there is a sort assignment satisfying it. We say that a PTS has decidable constraints if for every constraint C it is decidable whether or not C is satisfiable. If S is finite then this condition is clearly fulfilled, but PTS with S infinite may also have this property, as the following example shows.

Example 2. Consider the PTS with infinitely many stratified universes

S = { Typei | i ∈ ℕ },  A = { (Typei, Typej) | i < j },  R = { (Typei, Typej, Typek) | i ≤ k, j ≤ k }.

The constraints of this theory only contain formulas of the form σ < τ and σ ≤ τ (σ = τ being expressible by the pair σ ≤ τ and τ ≤ σ). All such sets are decidable (in time polynomial in both the number of constraints and the number of variables and constants) by checking acyclicity of a directed graph whose nodes are variables and constants and whose edges are the relations < and ≤. This result is due to Chan [Cha77], and its application to solving the constraints of typechecking was suggested by Huet [Hue87] and detailed in [HP91]. ∎
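A minimal sketch of this kind of constraint check (ours, and deliberately simplified): treat each formula as an edge, and report unsatisfiability exactly when some strict edge lies on a cycle; cycles of non-strict edges alone are harmless. Occurrences of the constants Typei would additionally be linked by the corresponding strict edges before running the check, and equalities become pairs of non-strict edges, as in the text.

```haskell
-- A simplified sketch of the constraint check: each formula becomes an edge
-- (strict '<' or non-strict '<='); the set is unsatisfiable exactly when
-- some strict edge lies on a cycle.
data Rel = Lt | Le deriving (Eq, Show)
type Edge n = (n, Rel, n)

satisfiable :: Eq n => [Edge n] -> Bool
satisfiable es = and [ not (reachable v u) | (u, Lt, v) <- es ]
  where
    succs n = [ w | (m, _, w) <- es, m == n ]
    -- simple depth-first reachability with an explicit visited list
    reachable from to = go [from] []
      where
        go []       _    = False
        go (n : ns) seen
          | n == to       = True
          | n `elem` seen = go ns seen
          | otherwise     = go (succs n ++ ns) (n : seen)
```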
Our purpose will be to compute, for any pair Γ, a, a term X ∈ T ∪ TΦ and a constraint C (we denote this by Γ ⊢sfsd a : X | C), such that
i  if Γ ⊢sfsd a : X | C and ρ ⊨ C then Γ ⊢ a : ρX;
ii if Γ ⊢ a : A then Γ ⊢sfsd a : X | C and ∃ρ [ρ ⊨ C and ρX ≃ A].

The tools developed so far will enable us to make the rules Srt-sfnsd and Pi-sfnsd syntax directed. Let us turn our attention now to the rule App-sfnsd:

App-sfnsd: if Γ ⊢sfnsd a :↠wh Πx:B0.A and Γ ⊢sfnsd b : B1, then Γ ⊢sfnsd a b : A[x := b], provided B0 ≃ B1

We have already made the reduction of the type of a to Πx:B0.A deterministic by requiring weak-head reduction; now consider the convertibility of B0 and B1. B1 could be a schematic term, so we must define the constraints under which two terms X, Y ∈ T ∪ TΦ are convertible, in order to give the application rule of ⊢sfsd.

Definition 31 (Schematic Equality and Schematic Conversion). Schematic equality is the smallest ternary relation, written X = Y | C with X, Y ∈ T ∪ TΦ and C ∈ Cnstr, satisfying
– σ = τ | {σ = τ},
– if X = Y | C then Πx:A.X = Πx:A.Y | C and X a = Y a | C,
– if X = Y then X = Y | ∅.
Schematic conversion is the relation X ≃ Y | C on (T ∪ TΦ) × (T ∪ TΦ) × Cnstr defined by

X ≃ Y | C  ≜  ∃X0, Y0 [X ↠ X0, Y ↠ Y0 and X0 = Y0 | C].

Clearly for normalizing X and Y the relation ∃C [X ≃ Y | C] is decidable, and for arbitrary X and Y this relation is semi-decidable.

Lemma 32 (Properties of Schematic Conversion).
– If X ≃ Y | C and ρ ⊨ C then ρX ≃ ρY.
– If ρX ≃ ρY then ∃C [X ≃ Y | C and ρ ⊨ C].
The syntax directed system
Notation 33. We extend notation 12 and write Γ ⊢sfsd a :↠ X | C for Γ ⊢sfsd a : X0 | C and X0 ↠ X.
Definition 34 (⊢sfsd). The relation ⊢sfsd ⊆ C × T × (T ∪ TΦ) × Cnstr is the smallest relation satisfying the following rules.

Srt-sfsd: ⟨⟩ ⊢sfsd s : α | {(s, α) ∈ A}
Var-sfsd: if Γ ⊢sfsd A :↠ σ | C, then Γ, x:A ⊢sfsd x : A | C
Wk-sfsd:  if Γ ⊢sfsd b : X | C and Γ ⊢sfsd A :↠ σ | D, then Γ, x:A ⊢sfsd b : X | C ∪ D, provided b ∈ S ∪ V
Pi-sfsd:  if Γ ⊢sfsd A :↠ σ | C and Γ, x:A ⊢sfsd B :↠ τ | D, then Γ ⊢sfsd Πx:A.B : α | C ∪ D ∪ {(σ, τ, α) ∈ R}
Lda1-sfsd: if Γ ⊢sfsd A :↠ σ | C and Γ, x:A ⊢sfsd b : X | D, then Γ ⊢sfsd λx:A.b : Πx:A.X | C ∪ D ∪ {(σ, α2, α3) ∈ R}, provided X ∉ Φ ∪ S
Lda2-sfsd: if Γ ⊢sfsd A :↠ σ | C and Γ, x:A ⊢sfsd b :↠ τ | D, then Γ ⊢sfsd λx:A.b : Πx:A.τ | C ∪ D ∪ {(σ, α2, α3) ∈ R} ∪ {(τ, α4) ∈ A}
App-sfsd: if Γ ⊢sfsd a :↠wh Πx:B.X | C and Γ ⊢sfsd b : Y | D, then Γ ⊢sfsd a b : X[x := b] | C ∪ D ∪ E, provided Y ≃ B | E
In the definition we assume all the sort variables which are introduced to be fresh. For example, in rule Pi-sfsd we assume α to be a fresh variable, and also SV(C) and SV(D) should be disjoint. Similarly in rule App-sfsd SV(C) and SV(D) will be disjoint, but SV(D) and SV(E) might intersect. Note that for any Γ and a there is (up to the names of sort variables) at most one derivable judgement Γ ⊢sfsd a : X | C, and at most one derivation of such a judgement. Further, derivations in ⊢sfsd depend on the language of the PTS, but are independent of the axioms, A, and Π-rules, R, of the PTS; it is in the satisfaction of constraints that a particular PTS is observed.
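To convey the flavour of constraint generation (our sketch, not the paper's code), here are just the sort and product cases of such a synthesizer: fresh sort variables are drawn from a counter threaded through synthesis, and each rule contributes its side condition to the accumulated constraint set. The full synthesizer and the treatment of contexts are assumed, not shown.

```haskell
-- Schematic sorts and constraint formulas, in the spirit of Definition 34.
data SSort = SS Sort | SVar Int deriving (Eq, Show)

data Formula = InA SSort SSort | InR SSort SSort SSort | EqS SSort SSort
  deriving (Eq, Show)

type Supply = Int

fresh :: Supply -> (SSort, Supply)
fresh n = (SVar n, n + 1)

-- Only the sort and product cases of a constraint-generating synthesizer;
-- each returns a schematic sort together with the constraints it generated.
synthSort :: Sort -> Supply -> ((SSort, [Formula]), Supply)
synthSort s n = let (alpha, n') = fresh n
                in ((alpha, [InA (SS s) alpha]), n')

synthPi :: (SSort, [Formula])     -- result for the domain A (assumed to reduce to a sort)
        -> (SSort, [Formula])     -- result for the codomain B (likewise)
        -> Supply -> ((SSort, [Formula]), Supply)
synthPi (sA, cA) (sB, cB) n =
  let (gamma, n') = fresh n
  in ((gamma, cA ++ cB ++ [InR sA sB gamma]), n')
```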
Lemma 35 (Equivalence of ⊢sfnsd and ⊢sfsd).
i  If Γ ⊢sfsd a : X | C and ρ ⊨ C then Γ ⊢sfnsd a : ρX.
ii If Γ ⊢sfnsd a : A then ∃X, C, ρ [ρ ⊨ C, A = ρX and Γ ⊢sfsd a : X | C].
Proof.
i Induction on Γ ⊢sfsd a : X | C. We treat App-sfsd: Γ ⊢sfsd a b : X[x := b] | C ∪ D ∪ E from Γ ⊢sfsd a : X′ | C with X′ ↠wh Πx:B.X, Γ ⊢sfsd b : Y | D, and Y ≃ B | E. Now suppose ρ ⊨ C ∪ D ∪ E. By induction Γ ⊢sfnsd a : ρX′ ↠wh Πx:ρB.ρX and Γ ⊢sfnsd b : ρY. By lemma 32, ρY ≃ ρB, so Γ ⊢sfnsd a b : (ρX)[x := b] = ρ(X[x := b]).

ii Induction on Γ ⊢sfnsd a : A. We consider Lda1-sfnsd: Γ ⊢sfnsd λx:A.b : Πx:A.B from Γ ⊢sfnsd A : A0 ↠ s1, Γ, x:A ⊢sfnsd b : B, (s1, s2, s3) ∈ R and B ∉ S. By induction there are X, C, ρC with ρC ⊨ C, ρC X = A0 and Γ ⊢sfsd A : X | C; since A0 ↠ s1, we have X ↠ σ for some σ with ρC σ = s1. Similarly there are Y, D, ρD with ρD ⊨ D, ρD Y = B and Γ, x:A ⊢sfsd b : Y | D, where we may assume SV(C) disjoint from SV(D) (changing the names of sort variables if necessary). Since ρD Y = B ∉ S, we have Y ∉ Φ ∪ S, and by Lda1-sfsd

Γ ⊢sfsd λx:A.b : Πx:A.Y | C ∪ D ∪ {(σ, α2, α3) ∈ R}.

We have ρC σ = s1, so take ρ = ρC ∪ ρD ∪ {α2 ↦ s2, α3 ↦ s3}; then ρ ⊨ C ∪ D ∪ {(σ, α2, α3) ∈ R} as required. ∎
An algorithm for typechecking normalizing semi-full PTS with decidable constraints. We cannot use ⊢sfsd to decide ⊢, because ⊢sfsd does not incrementally check consistency of constraints, so we may try to normalize a schematic term that has no well-typed instance. We now fix this deficiency.

Definition 36 (⊢sfa). The relation ⊢sfa ⊆ C × T × (T ∪ TΦ) × Cnstr is the smallest relation satisfying the following rules.

Srt-sfa: ⟨⟩ ⊢sfa s : α | {(s, α) ∈ A}
Var-sfa: if Γ ⊢sfa A : X | C, C con and X ↠ σ, then Γ, x:A ⊢sfa x : A | C
Wk-sfa:  if Γ ⊢sfa b : X | C, Γ ⊢sfa A : Y | D, D con and Y ↠ σ, then Γ, x:A ⊢sfa b : X | C ∪ D, provided b ∈ S ∪ V
Pi-sfa:  if Γ ⊢sfa A : X | C, C con, X ↠ σ, Γ, x:A ⊢sfa B : Y | D, D con and Y ↠ τ, then Γ ⊢sfa Πx:A.B : α | C ∪ D ∪ {(σ, τ, α) ∈ R}
Lda1-sfa: if Γ ⊢sfa A : Y | C, C con, Y ↠ σ and Γ, x:A ⊢sfa b : X | D, then Γ ⊢sfa λx:A.b : Πx:A.X | C ∪ D ∪ {(σ, α2, α3) ∈ R}, provided X ∉ Φ ∪ S
Lda2-sfa: if Γ ⊢sfa A : Y | C, C con, Y ↠ σ and Γ, x:A ⊢sfa b :↠ τ | D, then Γ ⊢sfa λx:A.b : Πx:A.τ | C ∪ D ∪ {(σ, α2, α3) ∈ R} ∪ {(τ, α4) ∈ A}
App-sfa: if Γ ⊢sfa a : Z | C, C con, Z ↠wh Πx:B.X, Γ ⊢sfa b : Y | D, D con and Y ≃ B | E, then Γ ⊢sfa a b : X[x := b] | C ∪ D ∪ E
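The extra guard of the ⊢sfa rules can be summarised in one line: before reducing a schematic type (to expose a sort or a Π), first confirm that the constraints gathered so far are satisfiable, otherwise the reduction may diverge on a term with no well-typed instance. A minimal sketch, reusing the Formula type above and assuming a decision procedure for consistency:

```haskell
-- 'consistent' stands for the decision procedure assumed by the phrase
-- "decidable constraints"; 'reduce' is a reduction strategy such as whnf.
guardedReduce :: ([Formula] -> Bool)  -- consistency check, assumed decidable
              -> (Term -> Term)       -- a reduction strategy, e.g. whnf
              -> [Formula] -> Term -> Maybe Term
guardedReduce consistent reduce cs t
  | consistent cs = Just (reduce t)
  | otherwise     = Nothing
```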
4 The typing relations ⊢o and ⊢tp

We now begin a deeper study of the general PTS. In section 1.3 we noticed that the third premiss of the Lda rule is a serious obstacle to a syntax directed characterization of ⊢. In section 3 we showed that for a special class of PTS that premiss could be replaced by innocuous side conditions, and proceeded to characterize that class by a syntax directed presentation, as advertised. Now we show that in general the troublesome third premiss asks for too much checking: in the presence of the first two premisses of the Lda rule we can replace the third premiss by a more liberal judgement. For this purpose we introduce two liberal typing judgements, ⊢o and ⊢tp, which are closely related to the typing operators in [vBJ93]. We remind the reader here of the extensions to the reduction and conversion relations arising from our π-contractions (Πx:A.B) a →π B[x := a].
4.1 Well-formed Contexts

We start with a very weak notion of well-formedness for contexts.

Definition 37. ⊢wfcxt is the smallest predicate on C satisfying the following rules.

Empty-wfcxt:  ⟨⟩ ⊢wfcxt
Extend-wfcxt: if Γ ⊢wfcxt and FV(A) ⊆ dom(Γ), then Γ, x:A ⊢wfcxt

Recall convention 2. It is easily seen that Γ ⊢ a : A implies Γ ⊢wfcxt.
4.2 A Nearly Syntax Directed Relation, ⊢o
We define a relation, ⊢o, closely connected with ⊢, which is nearly syntax directed. We will prove (lemma 39) that ⊢o is complete for ⊢. ⊢o is too liberal to be sound for ⊢, but if Γ ⊢ a : A for some A, and if also Γ ⊢o a : B, then B is, in a sense, convertible to a type for a. This is expressed in our Key Theorem 46.

Definition 38 (⊢o). The relation ⊢o ⊆ C × T × T is the smallest relation satisfying the following rules.

Srt-o: if Γ ⊢wfcxt, then Γ ⊢o s1 : s2, provided (s1, s2) ∈ A
Var-o: if Γ ⊢wfcxt, then Γ ⊢o x : A, provided x:A ∈ Γ
Pi-o:  if Γ ⊢o A :↠ s1 and Γ, x:A ⊢o B :↠ s2, then Γ ⊢o Πx:A.B : s3, provided (s1, s2, s3) ∈ R
Lda-o: if Γ, x:A ⊢o b : B, then Γ ⊢o λx:A.b : Πx:A.B
App-o: if Γ ⊢o a : A and Γ ⊢o b :↠ B, where A ↠ Πx:B.A′, then Γ ⊢o a b : A b
Notice that the rule App-o may cause π-application, and therefore create π-redexes. Also, the rules for ⊢o are nearly syntax directed, and consequently it is easy to see that ⊢o has a strong generation lemma (compare with lemma 6) which we will use freely without further comment.
Lemma 39 (Completeness of ⊢o). If Γ ⊢ a : A then ∃A′ [A′ ≃ A and Γ ⊢o a : A′].

Proof. By induction on Γ ⊢ a : A. We consider App: Γ ⊢ a b : A[x := b] because Γ ⊢ a : Πx:B.A and Γ ⊢ b : B. By induction hypothesis we have Γ ⊢o a : A0 ≃ Πx:B.A and Γ ⊢o b : B0 ≃ B. Let A1, B1 be such that A0 ↠ Πx:B1.A1 and B0 ↠ B1. Then Γ ⊢o a b : A0 b, where A0 b ≃ (Πx:B.A) b →π A[x := b], as required. ∎

Now we derive properties of ⊢o leading to a weak closure (subject reduction) theorem.

Lemma 40 (Free variables for ⊢o). If Γ ⊢o a : A then Γ ⊢wfcxt and FV(a) ∪ FV(A) ⊆ dom(Γ).

Lemma 41 (Weakening for ⊢o). If Γ1 ⊢o a : A, Γ1 ⊆ Γ2 and Γ2 ⊢wfcxt, then Γ2 ⊢o a : A.

Lemma 42 (Approximate substitution for ⊢o). If Γ1, x:A, Γ2 ⊢o b : B, Γ1 ⊢o a : A′ and A′ ≃ A, then Γ1, Γ2[x := a] ⊢o b[x := a] : B′[x := a] for some B′ ≃ B.

Proof. Use induction on Γ1, x:A, Γ2 ⊢o b : B, noticing that, since FV(a) ⊆ dom(Γ1), we have Γ1, Γ2[x := a] ⊢wfcxt. We consider some cases.
Var-o: Γ1, x:A, Γ2 ⊢o y : B because y:B ∈ Γ1, x:A, Γ2. We distinguish the cases y = x and y ≠ x. For y = x, Γ1, Γ2[x := a] ⊢o a : A′ by weakening. Also a = y[x := a], and A′ = A′[x := a] because x ∉ FV(A′) by the free variables lemma. For y ≠ x we have either y:B ∈ Γ1 or y:B ∈ Γ2. In the first case we use the well-formedness of contexts to show that x ∉ FV(B); in the second case we are done immediately.
App-o: Γ1, x:A, Γ2 ⊢o c d : C d because Γ1, x:A, Γ2 ⊢o c : C ↠ Πy:D0.C0 and Γ1, x:A, Γ2 ⊢o d : D ↠ D0. By induction hypothesis Γ1, Γ2[x := a] ⊢o c[x := a] : C1[x := a] and Γ1, Γ2[x := a] ⊢o d[x := a] : D1[x := a], where C1 ≃ C and D1 ≃ D. Let C2, D2 be such that C1 ↠ Πy:D2.C2 and D1 ↠ D2. Then C1[x := a] ↠ Πy:D2[x := a].C2[x := a] and D1[x := a] ↠ D2[x := a]. It follows that Γ1, Γ2[x := a] ⊢o (c d)[x := a] : (C1 d)[x := a], where C1 d ≃ C d. ∎
Lemma 43 (Reduction of contexts). If Γ1 ⊢o a : A1 and Γ1 ↠ Γ2, then Γ2 ⊢o a : A2 for some A2 with A1 ↠ A2.

Proof. Use induction on Γ1 ⊢o a : A1, noticing that since Γ1 ⊢wfcxt we have Γ2 ⊢wfcxt. Consider the rule App-o: Γ1 ⊢o a b : A b as a consequence of Γ1 ⊢o a : A ↠ Πx:B0.A0 and Γ1 ⊢o b : B ↠ B0. By the induction hypothesis we have Γ2 ⊢o a : A1 and Γ2 ⊢o b : B1, where A ↠ A1 and B ↠ B1. Let A2, B2 be such that A1 ↠ Πx:B2.A2 and B1 ↠ B2. Then Γ2 ⊢o a b : A1 b, where A b ↠ A1 b. ∎

Lemma 44 (One step Closure). If Γ ⊢o a : A and a → b, then Γ ⊢o b : B for some B ≃ A.

Proof. Induction on Γ ⊢o a : A. The interesting case is the rule App-o, where we have Γ ⊢o a b : A b as a consequence of Γ ⊢o a : A ↠ Πx:B0.A0 and Γ ⊢o b : B ↠ B0. We distinguish cases.
– a = λx:B1.a0 and a b → a0[x := b]. As Γ ⊢o a : A we have Γ, x:B1 ⊢o a0 : A1 and A = Πx:B1.A1. (⊢o has a stronger generation lemma than ⊢.) Hence B ≃ B1, and by the approximate substitution lemma (42), Γ ⊢o a0[x := b] : A2[x := b] where A2 ≃ A1. It follows that A b = (Πx:B1.A1) b →π A1[x := b] ≃ A2[x := b], so we are done.
– a → c, so a b → c b. We have by the induction hypothesis Γ ⊢o c : C where C ≃ A. It follows that Γ ⊢o c b : C b, where C b ≃ A b.
– b → c, so a b → a c. By the induction hypothesis Γ ⊢o c : C where C ≃ B. It follows that Γ ⊢o a c : A c, where A c ≃ A b. ∎

Now we can easily prove the main result of the current subsection, analogous to lemma 10.

Lemma 45 (Weak Closure for ⊢o). If Γ ⊢o a : A, Γ ↠ Γ′ and a ↠ a′, then Γ′ ⊢o a′ : A′ for some A′ ≃ A.
How is it possible to have proved a closure lemma for ⊢o without having a type correctness lemma such as lemma 9? (See [GN91, Bar92, MP93] for discussion of the proof of closure for ⊢.) The central point is that ⊢o, having no conversion rule, has a stronger generation lemma than ⊢, as noted in the proof of 44 above. To take advantage of this, we use an approximate substitution lemma: compare 8 with 42. In the case of ⊢o it is also possible to separate context reduction from term reduction, while for ⊢ these must be proved by simultaneous induction; this latter point, however, is inessential.
4.3 The Key Theorem
In lemma 39 we proved completeness of ⊢o for ⊢. In this section we will show that ⊢o is sound, in the weak sense that, if a is some term already well-typed in ⊢, then ⊢o gives it types "correct up to βπ-conversion". The precise statement is as follows:
Theorem 46 (Key Theorem for ⊢o). If Γ ⊢ a : A0 and Γ ⊢o a : A1, then either A1 ≃ A0 or

A1 ↠ Πx1:C1. ... Πxn:Cn.s0,  a ↠ λx1:C1. ... λxn:Cn.a0  and  Γ, x1:C1, ..., xn:Cn ⊢ a0 : s0.

Notice that the second alternative is phrased to allow for the fact that the abstractions Πx1:C1. ... Πxn:Cn.s0 and λx1:C1. ... λxn:Cn.a0 may not be well-formed in the absence of sufficient rule instances in R. The reader should compare the statement of this theorem with that of Theorem 5.5 of [vBJ93].

Proof. By induction on Γ ⊢ a : A0. We discuss some interesting cases.

Pi: Γ ⊢ Πx:A.B : s3 from Γ ⊢ A : s1, Γ, x:A ⊢ B : s2 and (s1, s2, s3) ∈ R. Suppose Γ ⊢o Πx:A.B : s. Then Γ ⊢o A : A1 ↠ sA and Γ, x:A ⊢o B : B1 ↠ sB, where (sA, sB, s) ∈ R. Define C such that A ↠ C and Γ ⊢ C : sA as follows:
– if sA = s1 we take C = A;
– otherwise, by the induction hypothesis A ↠ C0 such that Γ ⊢ C0 : sA, and we take C = C0.
Similarly define a reduct D of B such that Γ, x:A ⊢ D : sB. It follows by closure that Γ, x:C ⊢ D : sB, and hence Γ ⊢ Πx:C.D : s.

Lda: Γ ⊢ λx:A.b : Πx:A.B from Γ ⊢ A : s1, Γ, x:A ⊢ b : B, Γ, x:A ⊢ B : s2 and (s1, s2, s3) ∈ R. Suppose Γ ⊢o λx:A.b : C. Then we have Γ, x:A ⊢o b : B0 and C = Πx:A.B0. Now by induction either B ≃ B0 or

B0 ↠ Πx1:C1. ... Πxn:Cn.s0,  b ↠ λx1:C1. ... λxn:Cn.b0  and  Γ, x:A, x1:C1, ..., xn:Cn ⊢ b0 : s0.

In both cases we are done.

App: Γ ⊢ a b : A[x := b] from Γ ⊢ a : Πx:B.A and Γ ⊢ b : B. Suppose Γ ⊢o a b : C. Then

Γ ⊢o a : A1 ↠ Πx:B0.A0,  Γ ⊢o b : B1 ↠ B0  and  C = A1 b.

By the left induction hypothesis we have either A1 ≃ Πx:B.A or

A1 ↠ Πx0:C0. ... Πxn:Cn.s0,  a ↠ λx0:C0. ... λxn:Cn.a0  and  Γ, x0:C0, ..., xn:Cn ⊢ a0 : s0.

(Notice n ≥ 0, because A1 ≃ Πx:B0.A0 cannot reduce to a sort.) In the first case we have

C = A1 b ≃ (Πx:B.A) b ≃ A[x := b],

so we are done. In the second case we have by closure for ⊢ (lemma 10) that

Γ ⊢ λx0:C0. ... λxn:Cn.a0 : Πx:B.A.

By generation for ⊢ (lemma 6), C0 ≃ B and Γ ⊢ C0 : s for some s ∈ S; hence Γ ⊢ b : C0 by Cnv. Therefore we have by substitution for ⊢ (lemma 8) that

Γ, x1:C1[x0 := b], ..., xn:Cn[x0 := b] ⊢ a0[x0 := b] : s0.

Also

C = A1 b ↠ (Πx0:C0. ... Πxn:Cn.s0) b →π Πx1:C1[x0 := b]. ... Πxn:Cn[x0 := b].s0

and

a b ↠ (λx0:C0. ... λxn:Cn.a0) b →β λx1:C1[x0 := b]. ... λxn:Cn[x0 := b].a0[x0 := b]

as required. ∎
Corollary 47. If Γ ⊢ a : A and Γ ⊢o a :↠ s, then a ↠ a′ where Γ ⊢ a′ : s.

Proof. By the Key Theorem we have either A ≃ s or a ↠ a′ where Γ ⊢ a′ : s. In the first case notice that A, occurring in a ⊢-judgement, contains no π-redexes, so A ↠β s, and we are done by closure (lemma 10). ∎

The following example shows that in general we cannot expect more with respect to soundness: given Γ ⊢ a : A and Γ ⊢o a :↠ s, reduction of a may be necessary to obtain a term of type s.

Example 3. Recall example 1: for the term b = (λx:△.∗) ∗ we have ⊢ b : △ and ⊢o b : (Πx:△.▽) ∗ ↠π ▽, but ⊬ b : ▽. The best we can say is that b ↠ ∗ and ⊢ ∗ : ▽. ∎

Obviously this is due to our liberal Lda-o rule.
4.4 The relation ⊢tp
We now introduce a relation ⊢tp, similar to ⊢o but with no correctness check for application. ⊢tp is more efficient for type checking, but it lacks the beautiful properties of ⊢o. In fact it is easy to see that weak closure does not hold for ⊢tp. We will show that ⊢tp-judgements "lift" to ⊢o-judgements in a sense that is sufficient to prove the Key Theorem for ⊢tp.

Definition 48 (⊢tp). The relation ⊢tp ⊆ C × T × T is the smallest relation satisfying the rules for ⊢o, but having instead of the rule App-o:

App-tp: if Γ ⊢tp a : A, then Γ ⊢tp a b : A b

Clearly ⊢o is contained in ⊢tp, hence, from lemma 39, we have:

Lemma 49 (Completeness of ⊢tp). If Γ ⊢ a : A then ∃A′ [A′ ≃ A and Γ ⊢tp a : A′].
In order to prove the Key Theorem for ⊢tp we borrow a technical notion from [vD80]:

Definition 50 (Similarity). The relation ∼ ⊆ T × T is the smallest relation satisfying:
– a ∼ a,
– s1 ∼ s2,
– if A1 ∼ A2 then Πx:B.A1 ∼ Πx:B.A2,
– if A1 ∼ A2 then A1 b ∼ A2 b.

Lemma 51 (Properties of ∼). If A ∼ B then
i   A[x := c] ∼ B[x := c];
ii  if A → A′ then B → B′ for some B′ with A′ ∼ B′;
iii if A ↠ A′ then B ↠ B′ for some B′ with A′ ∼ B′;
iv  if A = Πx:C.A′ then B = Πx:C.B′ for some B′ with A′ ∼ B′.

Proof. i is proved by induction on A ∼ B. ii is proved similarly, using i. iii follows immediately from ii. iv is a simple generation property of ∼. ∎
Lemma 52 (⊢tp uniqueness of types). If Γ ⊢tp a : A0 and Γ ⊢tp a : A1 then A0 ∼ A1. In particular, if Γ ⊢o a : A0 and Γ ⊢o a : A1 then A0 ∼ A1.

Proof. By induction on the structure of a, using the generation lemma for ⊢tp. ∎
Lemma 53 (Lifting ⊢tp to ⊢o). If Γ ⊢o a : A0 and Γ ⊢tp a : A1 then Γ ⊢o a : A1.

Proof. Induction on Γ ⊢o a : A0. Obviously the only interesting case is App-o: Γ ⊢o a b : A b as a consequence of Γ ⊢o a : A ↠ Πx:B2.A2 and Γ ⊢o b : B ↠ B2. As Γ ⊢tp a b : A1, it follows that Γ ⊢tp a : A3 where A1 = A3 b. By induction we have Γ ⊢o a : A3. Hence A3 ∼ A by lemma 52 and, by 51(iii), A3 ↠ C where Πx:B2.A2 ∼ C. Now C = Πx:B2.C2 by 51(iv), and Γ ⊢o a b : A3 b = A1 as required. ∎

As a corollary we have
Theorem 54 (Key Theorem for ⊢tp). If Γ ⊢ a : A0 and Γ ⊢tp a : A1, then either A1 ≃ A0 or

A1 ↠ Πx1:C1. ... Πxn:Cn.s0,  a ↠ λx1:C1. ... λxn:Cn.a0  and  Γ, x1:C1, ..., xn:Cn ⊢ a0 : s0.

Proof. By completeness for ⊢o (39) we have Γ ⊢o a : A′ for some A′ ≃ A0, and hence, by the previous lemma, Γ ⊢o a : A1; so we can apply the Key Theorem for ⊢o. ∎

Corollary 55. If Γ ⊢ a : A and Γ ⊢tp a :↠ s, then a ↠ a′ where Γ ⊢ a′ : s.
4.5 A First Application of ⊢tp

⊢o is more liberal than ⊢, and ⊢tp is still more liberal. In the Introduction we hinted that ⊢ checks too much to be used in the third premiss of Lda, and in the next section we will be precise about using ⊢tp for this purpose. Another application where, intuitively, ⊢ does too much checking is computing the type of a subterm of a term already known to be well typed. In applications, this problem arises when implementing a unification algorithm for a PTS, for the variables of unification carry types, and before instantiating a variable with some subterm we must check that the subterm has the correct type. For this purpose, ⊢tp is more efficient than a correct typechecking algorithm, and this is so even for well-behaved PTS such as the Calculus of Constructions, which is full, functional and normalizing. Assume we have some judgement Γ ⊢ a : A, and b is a "locally closed" subterm of a, i.e. FV(b) ⊆ dom(Γ). Then it is easy to see that Γ ⊢ b : B for some B, and our goal is to compute B. By completeness for ⊢tp, Γ ⊢tp b : B0 ≃ B.
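For a functional PTS, such a cheap synthesiser following the ⊢tp rules might be sketched as follows, reusing the Term, Cxt and subst definitions of the earlier sketches. The partial functions for the axioms and rules are assumptions, and the well-formed-context side conditions are elided; this is an illustration of the shape of ⊢tp, not the paper's algorithm.

```haskell
-- A cheap synthesiser following the 'tp' rules, for a *functional* PTS whose
-- axioms and rules are given as partial functions (assumptions of this
-- sketch).  Note the App case: the result is the pi-redex  A b  and the
-- argument b is not examined at all; that is exactly the liberality of 'tp'.
synthTp :: (Sort -> Maybe Sort)          -- the partial function behind A
        -> (Sort -> Sort -> Maybe Sort)  -- the partial function behind R
        -> Cxt -> Term -> Maybe Term
synthTp axF ruleF = go
  where
    go g (Srt s)      = Srt <$> axF s
    go g (V x)        = lookup x g
    go g (App a b)    = App <$> go g a <*> pure b          -- type is  A b
    go g (Lam x ty b) = Pi x ty <$> go ((x, ty) : g) b
    go g (Pi  x a b)  = do
      s1 <- sortOf =<< go g a
      s2 <- sortOf =<< go ((x, a) : g) b
      Srt <$> ruleF s1 s2
    -- weak-head reduction extended with the pi-contraction of this section:
    -- (Pi x:A. B) a  ->  B[x := a]
    whnfPi (App f a) = case whnfPi f of
                         Lam x _ b -> whnfPi (subst x a b)
                         Pi  x _ b -> whnfPi (subst x a b)
                         f'        -> App f' a
    whnfPi t = t
    sortOf t = case whnfPi t of { Srt s -> Just s; _ -> Nothing }
```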
5 Application to Pure Type Systems

We present a nearly syntax directed system for arbitrary PTS which is, in a sense to be made precise, equivalent to the original system. We start by giving the definition, in which we replace the third premiss of the Lda rule by an appeal to ⊢tp, motivated by corollary 55 above.

Definition 56 (⊢nsdtp). The relation ⊢nsdtp ⊆ C × T × T is the smallest relation satisfying the following rules.
Srt-nsdtp: ⟨⟩ ⊢nsdtp s1 : s2, provided (s1, s2) ∈ A
Var-nsdtp: if Γ ⊢nsdtp A :↠ s, then Γ, x:A ⊢nsdtp x : A
Wk-nsdtp:  if Γ ⊢nsdtp b : B and Γ ⊢nsdtp A :↠ s, then Γ, x:A ⊢nsdtp b : B, provided b ∈ S ∪ V
Pi-nsdtp:  if Γ ⊢nsdtp A :↠ s1 and Γ, x:A ⊢nsdtp B :↠ s2, then Γ ⊢nsdtp Πx:A.B : s3, provided (s1, s2, s3) ∈ R
Lda-nsdtp: if Γ ⊢nsdtp A :↠ s1, Γ, x:A ⊢nsdtp b :↠ B and Γ, x:A ⊢tp B :↠ s2, then Γ ⊢nsdtp λx:A.b : Πx:A.B, provided (s1, s2, s3) ∈ R
App-nsdtp: if Γ ⊢nsdtp a :↠wh Πx:B1.A and Γ ⊢nsdtp b : B2, then Γ ⊢nsdtp a b : A[x := b], provided B1 ≃ B2
Informally, notice that ⊢nsdtp is better than ⊢nsd from an algorithmic viewpoint, as the occurrence of ⊢tp in Lda-nsdtp is cheaper to compute than the corresponding occurrence of ⊢nsd in Lda-nsd. That is, our inability to prove Expansion Postponement may be costing us simplicity, but it is not costing efficiency.

Scanning the rules of ⊢nsdtp we see that they are not fully syntax directed. First there are the rules Srt-nsdtp and Pi-nsdtp, which may give various types to a term. We have seen how to solve such difficulties by introducing schematic terms. This requires the extension of ⊢tp to schematic terms, which seems to pose no problems.

Second there is the rule Lda-nsdtp

Lda-nsdtp: if Γ ⊢nsdtp A :↠ s1, Γ, x:A ⊢nsdtp b :↠ B and Γ, x:A ⊢tp B :↠ s2, then Γ ⊢nsdtp λx:A.b : Πx:A.B, provided (s1, s2, s3) ∈ R

which fails to be syntax directed because the reduction strategy for the type of b is not fixed. In section 2.1 we repaired the non-syntax-directed aspect of rule App-nsd, replacing reduction by weak-head reduction to some Π-type, as in App-nsdtp. But a Π-type is a weak-head normal form, and if weak-head reduction fails to terminate in the first premiss of App-nsdtp then the term in question has no type. When should we stop reducing in the second premiss of Lda-nsdtp? We do not in general know if the type of b has a normal form. In section 6 below we see that, for functional PTS, no reduction is required in this premiss, but for general PTS this is not complete (example 4).

To begin the theory of ⊢nsdtp, observe:
To begin the theory of ⊢nsdtp, observe:
Lemma 57 Completeness of ⊢o for ⊢nsdtp. If Γ ⊢nsdtp a : A then
(i) ∃A0 [A0 ≃ A and Γ ⊢o a : A0];
(ii) A ∈ S or Γ ⊢o A : s.
Proof. By induction on Γ ⊢nsdtp a : A. The interesting case is Lda-nsdtp:

Γ ⊢nsdtp A :↠ s1    Γ, x:A ⊢nsdtp b : B′ ↠ B    Γ, x:A ⊢tp B : D ↠ s2
────────────────────────────────────────────────────────────────────────   ⟨s1, s2, s3⟩ ∈ R
Γ ⊢nsdtp λx:A.b : Πx:A.B

By induction we have Γ ⊢o A : s1 and Γ, x:A ⊢o b : B″ ≃ B′, where B′ ∈ S or Γ, x:A ⊢o B′ : sB′. By Lda-o we have Γ ⊢o λx:A.b : Πx:A.B″ ≃ Πx:A.B. To finish, we claim Γ ⊢o Πx:A.B : s3. To show this claim using Pi-o and the left induction hypothesis, it remains to show Γ, x:A ⊢o B : s2. Consider the two cases B′ ∈ S or Γ, x:A ⊢o B′ : sB′. In the first case B ∈ S, so it must be that ⟨B, D⟩ ∈ A (a generation property of ⊢tp); but then Γ, x:A ⊢o B : D and D = s2. In the second case B′ has some type in ⊢o, and by lemma 45 (weak closure for ⊢o) B also has some type in ⊢o. Now we are done by lemma 53 (lifting ⊢tp to ⊢o) and the side condition of our original instance of Lda-nsdtp. ∎
Now we prove that ⊢nsdtp is complete and sound with respect to ⊢.
Lemma 58 Completeness of ⊢nsdtp. If Γ ⊢ a : A then ∃A0 [A0 ≃ A and Γ ⊢nsdtp a : A0].
Proof. By induction on Γ ⊢ a : A. We give two cases.
Lda. We have

Γ ⊢ A : s1    Γ, x:A ⊢ b : B    Γ, x:A ⊢ B : s2
──────────────────────────────────────────────────   ⟨s1, s2, s3⟩ ∈ R
Γ ⊢ λx:A.b : Πx:A.B

By induction Γ ⊢nsdtp A :↠ s1 and Γ, x:A ⊢nsdtp b : B0 ≃ B. Also Γ, x:A ⊢o B : s2 by completeness for ⊢o. Taking a common reduct B1 of B and B0 we have Γ, x:A ⊢o B1 : s2 by closure for ⊢o, so also Γ, x:A ⊢tp B1 : s2. Moreover Γ, x:A ⊢nsdtp b :↠ B1, and therefore by Lda-nsdtp Γ ⊢nsdtp λx:A.b : Πx:A.B1 ≃ Πx:A.B.
App. We have Γ ⊢ a b : A[x := b] as a consequence of Γ ⊢ a : Πx:B.A and Γ ⊢ b : B. By the induction hypothesis Γ ⊢nsdtp a : A0 and Γ ⊢nsdtp b : B0 where A0 ≃ Πx:B.A and B0 ≃ B. It follows that A0 ↠wh Πx:B1.A1 where A1 ≃ A and B1 ≃ B. Therefore B0 ≃ B1, so Γ ⊢nsdtp a b : A1[x := b] where A[x := b] ≃ A1[x := b]. ∎
Lemma 59 Soundness of ⊢nsdtp. If Γ ⊢nsdtp a : A then ∃A0 [A ↠ A0, Γ ⊢ a : A0, and [A ∈ S or ∃s0 Γ ⊢ A0 : s0]].
Proof. By induction on Γ ⊢nsdtp a : A. Again we treat the lambda rule and the application rule. Lda-nsdtp
? `nsdtp A : A0 s1 ?; x:A `nsdtp b : B0 B ?; x:A `tp B : s2 hs1; s2; s3i 2 R ? `nsdtp x:A:b : x:A:B It follows by the induction hypothesis that ? ` A : A1 and ?; x:A ` b : B1 , where A0 A1 and B0 B1 . Therefore A1 s1 and hence ? ` A : s1 by closure. Also either B0 2 S or ?; x:A `o B0 : B2 s by lemma 57(ii). In the rst case B = B0 = B1 , hence hB; s2i 2 A and therefore ?; x:A ` B : s2 . Hence we have ? ` x:A:b : x:A:B and ? ` x:A:B : s3 . In the second case we have by closure for `o that ?; x:A `o B : B3 ' B2 . It follows by lemma 53 that ?; x:A `o B : s2 . Taking a common -reduct B4 of B and B1 we have by closure for ` that ?; x:A ` b : B4 and by closure for `o that ?; x:A `o B4 : s2 . By correctness of types for ` either B4 2 S or ?; x:A ` B4 : s0 for some s0 . In the rst case we have hB4 ; s2i 2 A so ?; x:A ` B4 : s2 , hence ? ` x:A:b : x:A:B4 by Lda and ? ` x:A:B4 : s3 by Pi, while B B4 . In the second case we have by the corollary to the Key Theorem for `o (47) that ?; x:A ` B5 : s2 for some reduct B5 of B4 , and therefore as before ? ` x:A:b : x:A:B5 by Lda and ? ` x:A:B5 : s3 by Pi, while B B5 . App-nsdtp We have ? `nsdtp a : A0 wh x:B:A ? `nsdtp b : B0 B0 ' B ? `nsdtp a b : A[x := b] By induction hypothesis ? ` a : A1 and ? ` b : B1 where A0 A1 and B0 B1 . Take a common reduct x:B2 :A2 of A1 and x:B:A and note that B2 ' B1 . Taking again a common reduct B3 we have by closure ? ` a : x:B3 :A2 and ? ` b : B3 , so ? ` a b : A2 [x := b] while A A2 . Moreover we have by correctness of types for ` that ? ` x:B3 :A2 : s for some s, hence ?; x:B3 ` A2 : s0 by generation and ? ` A2 [x := b] : s0 by the substitution lemma. ut As a consequence we can characterize ` completely in terms of `nsdtp .
Corollary 60 ⊢nsdtp characterizes ⊢.
Γ ⊢ a : A  ⟺  ∃A0 [A0 ≃ A and Γ ⊢nsdtp a : A0] and [A ∈ S or ∃s0 Γ ⊢nsdtp A : s0]
Proof. (⟹) Suppose Γ ⊢ a : A; then the first statement is completeness of ⊢nsdtp. Also we have by correctness of types that either A ∈ S or Γ ⊢ A : s0. In the first case we are done, and in the second case we apply once more completeness of ⊢nsdtp.
(⟸) Now suppose Γ ⊢nsdtp a : A0 ≃ A. Then we have by soundness of ⊢nsdtp that Γ ⊢ a : A1, where A0 ↠ A1. Now if A ∈ S we have A1 ↠ A, so Γ ⊢ a : A by closure. And if Γ ⊢nsdtp A : A2 ↠ s0, then again by soundness of ⊢nsdtp we see Γ ⊢ A : A3 where A3 ↠ s0, hence Γ ⊢ A : s0; and Γ ⊢ a : A follows by Cnv. ∎
As corollaries we have also:
Lemma 61 Weak Closure for ⊢nsdtp. If Γ ⊢nsdtp a : A, Γ ↠ Γ′ and a ↠ a′ then Γ′ ⊢nsdtp a′ : A′ where A ≃ A′.
Lemma 62 Weak Correctness of Types for ⊢nsdtp. If Γ ⊢nsdtp a : A then either A ∈ S or ∃A0, s0 [A ↠ A0 and Γ ⊢nsdtp A0 : s0].
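Corollary 60, read algorithmically, says that a checker for ⊢ can be assembled from any synthesiser realising ⊢nsdtp together with a conversion test. The fragment below is a sketch of that assembly only: the synthesiser, the conversion test and the sort test are parameters rather than definitions, and the second conjunct of the corollary is approximated by asking whether the synthesised type of A is itself a sort, which glosses over the reduction the nondeterministic system allows at that point.

    -- Hypothetical driver reflecting Corollary 60: Γ ⊢ a : A holds iff some
    -- ⊢nsdtp-type of a is convertible with A, and A is a sort or itself has a sort.
    checkJudgement
      :: (ctx -> term -> Maybe term)   -- a synthesiser for the nearly syntax directed system
      -> (term -> term -> Bool)        -- β-convertibility
      -> (term -> Bool)                -- membership in the set of sorts S
      -> ctx -> term -> term -> Bool
    checkJudgement synth conv isSort ctx a ty =
      case synth ctx a of
        Nothing  -> False
        Just ty' -> conv ty ty'
                    && (isSort ty || maybe False isSort (synth ctx ty))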
5.1 Strengthening
Strengthening is the property that a type assignment to a variable which is never used may be dropped from the context. Formally, it is the proposition:
Theorem 63 Strengthening for ⊢. If Γ1, x:A, Γ2 ⊢ b : B and x ∉ FV(Γ2) ∪ FV(b) then Γ1, Γ2 ⊢ b : B0 where B ↠ B0.
This formulation was first stated and proved for Constructions-like calculi by Luo [Luo90]. A proof for functional PTS appears in [GN91]; strengthening for arbitrary PTS was proved in [vBJ93]. Our results above have strengthening for arbitrary PTS as an easy consequence.
Lemma 64 Strengthening for ⊢tp. If Γ1 ⊢tp b : B, Γ2 ⊢wfcxt, Γ2 ⊑ Γ1 and FV(b) ⊆ dom(Γ2) then Γ2 ⊢tp b : B.
As usual, we have
Lemma 65 Free Variables for ⊢nsdtp. If Γ ⊢nsdtp a : A then FV(a) ∪ FV(A) ⊆ dom(Γ).
More interestingly, because ⊢nsdtp has no conversion rule, we have:
Lemma 66. If Γ1, x:A, Γ2 ⊢nsdtp b : B and x ∉ FV(Γ2) ∪ FV(b) then x ∉ FV(B).
Lemma 67 Strengthening for ⊢nsdtp. If Γ1, x:A, Γ2 ⊢nsdtp b : B and x ∉ FV(Γ2) ∪ FV(b) then Γ1, Γ2 ⊢nsdtp b : B.
The proofs of these lemmas are by easy induction. They give us strengthening for ⊢.
Proof of Theorem 63. Assume Γ1, x:A, Γ2 ⊢ b : B. By completeness (58), strengthening for ⊢nsdtp (67), and soundness (59), Γ1, Γ2 ⊢ b : B1, where B ≃ B1. Taking a common reduct B0 of B and B1 we have by closure Γ1, Γ2 ⊢ b : B0. ∎
6 Functional Pure Type Systems
We observed that ⊢nsdtp fails to be syntax directed only because of the axiom side condition of Srt-nsdtp, the R side condition of Lda-nsdtp and the non-deterministic reduction of the type of b in the second premiss of Lda-nsdtp. In the case of functional PTS the side conditions mentioned are in fact single valued, so the only remaining problem is the reduction in the second premiss of Lda-nsdtp. We have also seen that, although terms really can get more types by reduction (example 3), this does not happen for functional PTS (lemma 21). Thus for functional PTS we might hope to remove reduction in the right premiss of Lda-nsdtp, and in fact this is the case. For the most part this section parallels the development of section 5, but the following result shows what the difference is: for functional PTS, the relations ⊢tp and ⊢o are functions.
Lemma 68 Uniqueness of types for ⊢tp. For functional PTS: if Γ ⊢tp a : A0 and Γ ⊢tp a : A1 then A0 = A1. In particular, if Γ ⊢o a : A0 and Γ ⊢tp a : A1 then A0 = A1.
Now we define a syntax directed system ⊢f for functional PTS's which is similar to ⊢nsdtp but has a syntax directed Lda rule. For the rest of this section we assume that the PTS under discussion is functional.
Definition 69 ⊢f. The relation ⊢f ⊆ C × T × T is the smallest relation satisfying the rules of the system ⊢nsdtp, defined in definition 56, but having instead of Lda-nsdtp the rule

Lda-f    Γ ⊢f A :↠ s1    Γ, x:A ⊢f b : B    Γ, x:A ⊢tp B :↠ s2
         ───────────────────────────────────────────────────────   ⟨s1, s2, s3⟩ ∈ R
         Γ ⊢f λx:A.b : Πx:A.B

Trivially ⊢f ⊆ ⊢nsdtp, so from lemma 57 we have:
Lemma 70 Completeness of ⊢o for ⊢f. If Γ ⊢f a : A then
(i) ∃A0 [A0 ≃ A and Γ ⊢o a : A0];
(ii) A ∈ S or Γ ⊢o A : s.
We prove that ⊢f is complete and sound with respect to ⊢; in fact we even have ⊢f ⊆ ⊢.
Lemma 71 Completeness of ⊢f. If Γ ⊢ a : A then ∃A0 [A0 ≃ A and Γ ⊢f a : A0].
Proof. By induction on Γ ⊢ a : A. All cases are as in the proof of completeness for ⊢nsdtp (lemma 58) except Lda: we have Γ ⊢ λx:A.b : Πx:A.B because Γ ⊢ A : s1, Γ, x:A ⊢ b : B and Γ, x:A ⊢ B : s2, where ⟨s1, s2, s3⟩ ∈ R. It follows by induction that Γ ⊢f A :↠ s1 and Γ, x:A ⊢f b : B0 where B0 ≃ B. Also Γ, x:A ⊢o B : s2 by completeness of ⊢o with respect to ⊢ (lemma 39). Either B0 ∈ S or Γ, x:A ⊢o B0 : s for some s, by completeness of ⊢o with respect to ⊢f (lemma 70). If B0 ∈ S then B ↠ B0, hence Γ, x:A ⊢o B0 : s2 by closure for ⊢o. It follows that Γ ⊢f λx:A.b : Πx:A.B0. Alternatively, assume Γ, x:A ⊢o B0 : s. Taking a common reduct B2 of B and B0 we have Γ, x:A ⊢o B2 : s2 and Γ, x:A ⊢o B2 : s, both by closure for ⊢o. It follows by functionality that s = s2, and therefore again Γ ⊢f λx:A.b : Πx:A.B0. ∎
Lemma 72 Soundness for ⊢f. If Γ ⊢f a : A then Γ ⊢ a : A.
Proof. By induction on Γ ⊢f a : A. We treat the Lda-f rule: Γ ⊢f λx:A.b : Πx:A.B as a consequence of Γ ⊢f A : A0 ↠ s1 and Γ, x:A ⊢f b : B, where Γ, x:A ⊢tp B :↠ s2 and ⟨s1, s2, s3⟩ ∈ R. It follows by induction that Γ ⊢ A : A0 and Γ, x:A ⊢ b : B. Hence Γ ⊢ A : s1 by closure. Also by correctness of types (lemma 9) either B ∈ S or Γ, x:A ⊢ B : s for some s. If B ∈ S then ⟨B, s2⟩ ∈ A (because of Γ, x:A ⊢tp B :↠ s2) and therefore Γ, x:A ⊢ B : s2. It follows that Γ ⊢ λx:A.b : Πx:A.B by Lda. Alternatively assume Γ, x:A ⊢ B : s; we have by completeness for ⊢o (lemma 39) that Γ, x:A ⊢o B : s. It follows by functionality (lemma 68) that s = s2, and therefore again Γ ⊢ λx:A.b : Πx:A.B by Lda. ∎
As corollaries we have:
Lemma 73 Weak Closure for ⊢f. If Γ ⊢f a : A, Γ ↠ Γ′ and a ↠ a′ then Γ′ ⊢f a′ : A′ where A ≃ A′.
Lemma 74 Correctness of Types for ⊢f. If Γ ⊢f a : A then either A ∈ S or ∃s Γ ⊢f A : s.
Proof. Assume Γ ⊢f a : A; then by soundness for ⊢f we have Γ ⊢ a : A, hence by correctness of types for ⊢ either A ∈ S or Γ ⊢ A : s for some s. In the latter case Γ ⊢f A : s by completeness for ⊢f. ∎
An important consequence is the following theorem.
Theorem 75 Decidability for normalizing functional PTS. If a PTS is functional and normalizing, and ∃s ∈ S [⟨s1, s⟩ ∈ A] and ∃s ∈ S [⟨s1, s2, s⟩ ∈ R] are decidable relations, then the relation ⊢ is decidable.
Proof. It follows from 72 that Γ ⊢f a : A implies that a and A are normalizing. Therefore all the side conditions in the rules for ⊢f are decidable. ∎
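The hypotheses of Theorem 75 are exactly what makes the conversion tests in the application rule effective: in a normalizing system, two well typed terms are convertible iff their β-normal forms coincide. The following sketch of that test repeats the Term and subst definitions of the earlier sketches and makes the usual illustrative assumptions (named variables chosen canonically, so that syntactic equality of normal forms stands in for α-equivalence); nf terminates only on normalizing terms, which is all the theorem needs.

    data Term = Srt Int | Var String | App Term Term
              | Lam String Term Term | Pi String Term Term
      deriving (Show, Eq)

    subst :: String -> Term -> Term -> Term
    subst x s t = case t of
      Var y | y == x     -> s
      Var _              -> t
      Srt _              -> t
      App a b            -> App (subst x s a) (subst x s b)
      Lam y a b | y == x -> Lam y (subst x s a) b
      Lam y a b          -> Lam y (subst x s a) (subst x s b)
      Pi  y a b | y == x -> Pi  y (subst x s a) b
      Pi  y a b          -> Pi  y (subst x s a) (subst x s b)

    -- Normal-order reduction to β-normal form; loops on non-normalizing terms.
    nf :: Term -> Term
    nf t = case t of
      App a b   -> case nf a of
                     Lam x _ body -> nf (subst x b body)
                     a'           -> App a' (nf b)
      Lam x a b -> Lam x (nf a) (nf b)
      Pi  x a b -> Pi  x (nf a) (nf b)
      _         -> t

    -- A decidable conversion test suitable for the side conditions of ⊢f.
    conv :: Term -> Term -> Bool
    conv a b = nf a == nf b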
6.1 Incompleteness of ⊢f
We promised to show that `f is in fact incomplete for the general PTS , i.e. that Lda-nsdtp is essentially non-syntax-directed. Example 4. Consider the following PTS , extending example 3: S = f?; 4; 5; g A = fh?; 4i; h?; 5i; h4; ig R = fh; ; i; h4; 5; ig
As before let a = ( x: 4 :x ) ? . We know that ` a : 4 , and now verify that in fact `f a : 4 , hence also `nsdtp a : 4 `f 4 : x:4 `wfcxt `f 4 : x:4 `f x : 4 x:4 `tp 4 : h; ; i 2 R `f x: 4 :x : x: 4 :4 `f ? : 4 `f (x: 4 :x) ? : 4 Now we verify y :a `nsdtp z : ? :y : z : ? :? y:a; z:? `wfcxt y:a `nsdtp ? : 4 y:a; z:? `nsdtp y : a ? y:a; z:? `tp ? : 5 h4; 5; i 2 R y:a `nsdtp z : ? :y : z: ? :? Obviously (because `f is nearly syntax directed) the only possible derivation in `f of a type for z : ? :y in the context y :a has shape y:a;z :?; x:4 `tp x : 4 y:a;z :? `tp x: 4 :x : x: 4 :4 y:a `f ? : X y:a; z :? `f y : a y:a;z :? `tp a : ( x: 4 :4) ? 4 h; 4; ?i 2 R y:a `f z : ? :y : z : ? :a As there is no rule h; 4; ?i 2 R, `f does not derive any type for z : ? :y in the context y :a . We see that `f is strictly weaker than `nsdtp , and is incomplete for non-functional PTS . ut
7 A syntax directed system for arbitrary PTS's
Before defining a syntax directed system we must decide on the lambda rule: given the pseudoterm λx:A.b, how far will we reduce B, the type of b, in order to find its sorts? To make the reduction path unique, we reduce B using complete developments.
Definition 76 Complete development. The relation ⇒1 ⊆ T × T is the smallest relation satisfying the following rules.
- a ⇒1 a if a ∈ V ∪ S.
- If A1 ⇒1 A2 and B1 ⇒1 B2 then Πx:A1.B1 ⇒1 Πx:A2.B2.
- If A ⇒1 B and a ⇒1 b then λx:A.a ⇒1 λx:B.b.
- If a ⇒1 a′ and b ⇒1 b′ then (λx:A.a) b ⇒1 a′[x := b′].
- If a ⇒1 a′, b ⇒1 b′ and a ≠ λx:A.a1 then a b ⇒1 a′ b′.
Clearly ⇒1 is a function, ⇒1 ⊆ ↠, and if a ↠ b, a ⇒1 a1 and b ⇒1 b1 then a1 ↠ b1. Note that a ⇒1 a iff a is normal. We will denote by ⇒n the result of applying ⇒1 n times, and have by induction: if a ↠ b, a ⇒n an and b ⇒n bn then an ↠ bn. The complete development strategy has the advantage of simplicity, and is moreover a cofinal reduction strategy, in the sense that if a ↠ b, then for some n and c, a ⇒n c with b ↠ c. So by closure (lemma 10), we know that if some reduct of B has sort s, it suffices to consider complete developments of B in order to compute s. This behaviour should be contrasted with non-cofinal strategies, such as leftmost-outermost reduction; in the case of non-normalising systems such strategies may not capture all possible sorts for B.
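The relation ⇒1 is directly computable; the sketch below follows the five clauses of Definition 76, under the same illustrative assumptions as the earlier sketches (the same Term type and naive substitution, repeated here so the fragment stands alone).

    data Term = Srt Int | Var String | App Term Term
              | Lam String Term Term | Pi String Term Term
      deriving (Show, Eq)

    subst :: String -> Term -> Term -> Term
    subst x s t = case t of
      Var y | y == x     -> s
      Var _              -> t
      Srt _              -> t
      App a b            -> App (subst x s a) (subst x s b)
      Lam y a b | y == x -> Lam y (subst x s a) b
      Lam y a b          -> Lam y (subst x s a) (subst x s b)
      Pi  y a b | y == x -> Pi  y (subst x s a) b
      Pi  y a b          -> Pi  y (subst x s a) (subst x s b)

    -- One complete development: contract exactly the β-redexes already present
    -- in the term; redexes created by the contraction itself are left alone.
    cd :: Term -> Term
    cd t = case t of
      Var _             -> t
      Srt _             -> t
      Pi  x a b         -> Pi  x (cd a) (cd b)
      Lam x a b         -> Lam x (cd a) (cd b)
      App (Lam x _ b) a -> subst x (cd a) (cd b)   -- a redex of the original term
      App a b           -> App (cd a) (cd b)

    -- n complete developments: the ⇒n of the text.
    cdN :: Int -> Term -> Term
    cdN n t = iterate cd t !! n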
Now for every n ∈ ℕ we define a relation, ⊢nsd-n, by a nearly syntax directed set of rules whose only non syntax directedness (in rules Srt-nsd-n and Pi-nsd-n) is a consequence of the non-functionality of the PTS. In general ⊢nsd-n is not equivalent to ⊢ for any particular n, but by using unbounded n we will derive a semi-algorithm for typechecking an arbitrary PTS.
Definition 77 ⊢nsd-n. For each n ∈ ℕ the relation ⊢nsd-n ⊆ C × T × T is the smallest relation satisfying the rules of the relation ⊢nsdtp, but with the Lda rule replaced by:

Lda-nsd-n    Γ ⊢nsd-n A :↠ s1    Γ, x:A ⊢nsd-n b :⇒n B    Γ, x:A ⊢tp B :↠ s2
             ─────────────────────────────────────────────────────────────────   ⟨s1, s2, s3⟩ ∈ R
             Γ ⊢nsd-n λx:A.b : Πx:A.B

⊢nsd-n is sound and complete with respect to ⊢nsdtp.
Lemma 78 Soundness of ⊢nsd-n with respect to ⊢nsdtp. If Γ ⊢nsd-n a : A then Γ ⊢nsdtp a : A.
For proving completeness we need some properties of ⊢nsd-n.
Lemma 79 Completeness of ⊢o for ⊢nsd-n. If Γ ⊢nsd-n a : A then
(i) ∃A0 [A0 ≃ A and Γ ⊢o a : A0];
(ii) A ∈ S or ∃s [Γ ⊢o A : s].
Lemma 80 Monotonicity of ⊢nsd-n. If Γ ⊢nsd-m a : A and n ≥ m then ∃B [A ↠ B and Γ ⊢nsd-n a : B].
Proof. By induction on Γ ⊢nsd-m a : A. We treat some cases.
Lda-nsd-m. We have Γ ⊢nsd-m λx:A.c : Πx:A.C as a consequence of
- Γ ⊢nsd-m A : A0, where A0 ↠ s1,
- Γ, x:A ⊢nsd-m c : C0, where C0 ⇒m C, and
- Γ, x:A ⊢tp C :↠ s2, where ⟨s1, s2, s3⟩ ∈ R.
The induction hypothesis gives us: first, Γ ⊢nsd-n A : A1 with A0 ↠ A1 (and hence A1 ↠ s1); second, Γ, x:A ⊢nsd-n c : C1 with C0 ↠ C1. Now define D by C1 ⇒n D; then we have by the properties of ⇒n that C ↠ D. As ⊢o ⊆ ⊢tp, it follows by lemmas 79 and 45 that Γ, x:A ⊢tp D :↠ s2. Therefore Γ ⊢nsd-n λx:A.c : Πx:A.D.
App-nsd-m. We have Γ ⊢nsd-m a b : A[x := b] because of Γ ⊢nsd-m a : A0, where A0 ↠wh Πx:B1.A, and Γ ⊢nsd-m b : B2, while B1 ≃ B2. By induction, first Γ ⊢nsd-n a : A1 with A0 ↠ A1 (and hence A1 ↠wh Πx:B3.B with A ↠ B and B1 ↠ B3); secondly Γ ⊢nsd-n b : B4 with B2 ↠ B4. It follows that B3 ≃ B4 and hence Γ ⊢nsd-n a b : B[x := b]. ∎
Now we can prove completeness.
Lemma 81 Completeness of ⊢nsd-n with respect to ⊢nsdtp. If Γ ⊢nsdtp a : A then ∃n, A0 [A ↠ A0 and Γ ⊢nsd-n a : A0].
Proof. Induction on ? `nsdtp a : A . We select interesting cases. Wk-nsdtp We have ?; x:A `nsd b : B as a consequence of ? `nsd b : B , ? `nsd A : A0 where A0 s and b 2 S [ V . By induction hypothesis we have ? `nsd?k b : B0 where B B0 and ? `nsd?m A : A1 where A0 A1 . Take n = max (k; m) and, by 80, ? `nsd?n b : B1 where B0 B1 and ? `nsd?n A : A2 where A1 A2 . It follows that A2 s and hence ?; x:A `nsd?n b : B1 . Lda-nsdtp We have ? `nsdtp x:A:b : x:A:B as a consequence of ? `nsdtp A : A0 where A0 s1 and ?; x:A `nsdtp b : B0 where B0 B , in k steps, say. And also we know ?; x:A `tp B : D where D s2 and hs1 ; s2; s3i 2 R. By the induction hypothesis we have ? `nsd?l A : A1 where A0 A1 and ?; x:A `nsd?m b : B1 where B0 B1 . Take n = max (k; l; m), getting, by lemma 80, ? `nsd?n A : A2 where A1 A2 and ?; x:A `nsdn ?n b : B2 where B1 B2 . First observe that A1 s1 . Next de ne C0 and C2 by B0 ) C0 and B2 )n C2 . Then we have B C0 C2 by the properties of )n . Also, by 57(ii), either B0 2 S or ? `o B0 : D0 where D0 s 2 S . If B0 2 S then B = B0 = C2 and hence ?; x:A `tp C2 : s2 . And if ? `o B0 : D0 then by 45 we have ?; x:A `o B : D1 and, it follows by 53 that ?; x:A `o B : D and hence again by 45 ?; x:A `o C2 : D2 where D2 s2 . Therefore ? `nsd?n x:A:b : x:A:C2 . App-nsdtp We have ? `nsdtp a : A0 wh x:B1:A ? `nsdtp b : B2 B1 ' B2 ? `nsdtp a b : A[x := b] By induction hypothesis ? `nsd?k a : A1 ; A0 A1 ; ? `nsd?m b : B3 ; and B2 B3: Take n = max (k; m) and, by lemma 80, ? `nsd?n a : A2 where A1 A2 , and ? `nsd?n b : B4 where B3 B4 . It follows that A2 wh x:B0 :A3 and A A3 . Hence ? `nsd?n a b : A3[x := b] where A[x := b] A3 [x := b]. ut
Now we prove that ⊢ and ⊢nsd-n are, in a sense, equivalent.
Lemma 82 Equivalence of ⊢ and ⊢nsd-n.
Γ ⊢ a : A  ⟺  (i) ∃n, A0 [A ≃ A0 and Γ ⊢nsd-n a : A0] and (ii) either A ∈ S or ∃n, s [Γ ⊢nsd-n A : s].
Proof. (⟹) Suppose Γ ⊢ a : A; then by 60 we have that Γ ⊢nsdtp a : A0 where A0 ≃ A, and also that either A ∈ S or Γ ⊢nsdtp A : s. It follows from 81 that for some n ∈ ℕ there exists A1 with Γ ⊢nsd-n a : A1 and A ≃ A1. And also either A ∈ S or, by the same argument, Γ ⊢nsd-n A : s.
(⟸) Now suppose that A ≃ A0 and Γ ⊢nsd-n a : A0, and that either A ∈ S or Γ ⊢nsd-m A : s. Then we have by 78 that Γ ⊢nsdtp a : A0 and either A ∈ S or Γ ⊢nsdtp A : s, and again by 60, Γ ⊢ a : A. ∎
Monotonicity ensures that any strictly increasing sequence nj in ℕ gives rise to a nearly syntax directed search for possible types via the systems ⊢nsd-nj. Applying the methods of section 3.2 we will define a syntax directed set of rules for arbitrary PTS. But as the types in this system will be schematic terms, we have to extend our relations
⊢o and ⊢tp to schematic terms. In order to do this we extend the notion of =-convertibility of schematic terms modulo a constraint, as defined in 31, to β-convertibility:
X ≃ Y | C  ⟺def  ∃X0, Y0 [X ↠ X0, Y ↠ Y0 and X0 = Y0 | C]
Just as in the case of β-reduction we have
Lemma 83 Equivalence of schematic conversion and conversion.
(i) If X ≃ Y | C and σ ⊨ C then σX ≃ σY.
(ii) If σX ≃ σY then ∃C [X ≃ Y | C and σ ⊨ C].
(iii) If σ1X ≃ σ2Y then ∃C [X ≃ Y | C].
Now we can define the schematized version of ⊢o.
Definition 84 ⊢o+. The relation ⊢o+, relating a context, a term or schematic term, a term or schematic term, and a constraint, is the smallest relation satisfying the following rules.
Srt-o+     Γ ⊢wfcxt
           ────────────────────────────────                s a sort or a sort variable
           Γ ⊢o+ s : α | {⟨s, α⟩ ∈ A}

Var-o+     Γ ⊢wfcxt
           ──────────────────────                          x:A ∈ Γ
           Γ ⊢o+ x : A | ∅

Pi-o+      Γ ⊢o+ A :↠ α | C    Γ, x:A ⊢o+ X :↠ β | D
           ─────────────────────────────────────────────
           Γ ⊢o+ Πx:A.X : γ | C ∪ D ∪ {⟨α, β, γ⟩ ∈ R}

Lda-o+     Γ, x:A ⊢o+ b : X | C
           ──────────────────────────────
           Γ ⊢o+ λx:A.b : Πx:A.X | C

App-o+     Γ ⊢o+ X : Y | C    Γ ⊢o+ b : Z | D
           ──────────────────────────────────────           Y ↠ Πx:B.Y0,  Z ≃ B | E
           Γ ⊢o+ X b : Y b | C ∪ D ∪ E
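To fix intuitions, here is one plausible, purely illustrative shape for the constraints that the schematic systems thread through their judgements; the actual constraint language is the one fixed earlier in the paper (definition 31), and none of the names below are the paper's. Sort variables stand for as yet undetermined sorts, and a sort assignment satisfies a constraint when, after substitution, its membership atoms lie in A and R and its equations hold.

    data SSort = S Int               -- an ordinary sort
               | SV String           -- a sort variable
      deriving (Show, Eq)

    data Atom  = InA SSort SSort          -- ⟨α, β⟩ ∈ A
               | InR SSort SSort SSort    -- ⟨α, β, γ⟩ ∈ R
               | Eq  SSort SSort          -- an equation produced by a conversion test
      deriving (Show, Eq)

    type Cnstr = [Atom]

    type Assignment = String -> Maybe Int   -- a sort assignment σ

    -- σ ⊨ C, relative to decidable tests for the axiom and rule relations.
    satisfies :: (Int -> Int -> Bool)
              -> (Int -> Int -> Int -> Bool)
              -> Assignment -> Cnstr -> Bool
    satisfies inA inR sigma = all ok
      where
        val (S s)  = Just s
        val (SV v) = sigma v
        ok (InA a b)   = maybe False id (inA <$> val a <*> val b)
        ok (InR a b c) = maybe False id (inR <$> val a <*> val b <*> val c)
        ok (Eq a b)    = val a /= Nothing && val a == val b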
Lemma 85 Equivalence of `o and `o+ . i If ? `o+ X : Y j C and j= C then ? `o X : Y ii If ? `o X : A then 9Y; C ; 1 [1 ; 1 j= C ; A = 1Y and ? `o+ X : Y j C ] Proof. i Induction on ? `o+ X : Y j C . We treat App-o+: ? `o+ X b : Y b j C [ D [ E because ? `o+ X : Y j C and ? `o+ b : Z j D , where Y x:B:Y0 and Z ' B j E . Now suppose j= C [ D [ E . By induction ? `o (X ) : (Y ) and ? `o (b) : (Z ). Moreover by 83(i), Z ' B and hence Y ( x:B:Y0 ) = x:B:Y0 ' x:Z:Y0 . Therefore ? `o (X ) (b) : (Y ) (b) while (X ) (b) = (X b) and (Y ) (b) = (Y b). ii Induction on ? `o X : A . We consider some cases. Srt-o X 2 S [ and A = s 2 S , with hX; si 2 A . It follows that ? `o+ X : j fhX; i 2 Ag and de ning 1 = [ fh; sig we are done.
Lda-o As there are no schematic -terms we have ? `o x:A:b : x:A:B as a consequence of ?; x:A `o b : B . By induction hypothesis ?; x:A `o+ b : X j C and we have a sort assignment satisfying C such that B = X . We conclude that ? `o+ x:A:b : x:A:X j C and also x:A:B = x:A:(X ) = (x:A:X ).
App-o ? `o (X ) b : A b from ? `o (X ) : A , ? `o b : B and A ' x:B:A0 . By induction hypothesis ? `o+ X : Y j C and ? `o+ b : Z j D where we may assume SV (C ) and SV (D) to be disjoint. Also we have a sort assignments 1 and 2 respectively satisfying C and D such that A = 1Y and B = 2 Z . Taking 3 = 1 [ 2 , we have 3 Y ' x:B:A0 , hence Y x:B0 :Y0 where B0 ' B = 3Z . Now, as 3B0 = B0 it follows by 83(ii) that B0 ' Z j E for some constraint E which is satis ed by 3 . Hence ? `o+ X b : Y b j C [ D [ E , 3 j= C [ D [ E and A b = 3 (Y ) b = 3(Y b). ut We want to show that the substitution lemma and the closure lemma of `o carry over to `o+ . In order to avoid reasoning about schematic derivations to prove these, we introduce a technical device. De ne a new PTS , fS0; V ; A0; R0g , the completely full PTS derived as follows from the PTS fS ; V ; A; Rg under consideration: S0 = S A0 = f hs1; s2i j s1; s2 2 S g R0 = f hs1; s2; s3i j s1; s2; s3 2 S g The possible constraints in the two systems are identical, but in our new PTS every formula of the form h; i 2 A or h; ; i 2 R is satis ed by every sort assignment; only the equations in a constraint are important for satis ability. Further, consider the assignment, 0 , that maps every 2 to some one, arbitrary, s0 2 S , and hence satis es every constraint.
Lemma 86 Approximate substitution for `o+ . If ?1; x:A; ?2 `o+ b : Y j D; ?1 `o+ a : X j C and X ' A j E then 9Y0 ; D0; E0 [?1; ?2[x := a] `o+ b[x := a] : Y0 j D0 and Y0 ' Y [x := a] j E0]: Proof. Suppose ?1; x:A; ?2 `o+ b : Y j D , ?1 `o+ a : X j C and X ' A j E . As in our new system 0 j= C[D[E , it follows from 85(i) that in this system ?1; x:A; ?2 `o b : 0 Y , ?1 `o a : 0X and 0 X ' A. Hence we have by 42 that ?1; ?2[x := a] `o b[x := a] : B0 [x := a] where B0 ' 0 Y , and by 85(ii) there is Y0 , D0 and such that ?1 ; ?2[x := a] `o+ b[x := a] : Y0 j D0 and Y0 = B0 [x := a]. But Y0 = B0 [x := a] ' (0 Y )[x := a] = 0 (Y [x := a]) and it follows from 83 that Y0 ' Y [x := a] j E0 for some constraint E0 . ut Lemma 87 Weak Closure for `o+ . If ? `o+ a : X j C; ? ?0 and a a0 then 9X0; C0; D0 [?0 `o+ a0 : X0 j C0 and X0 ' X j D0]: Proof. Suppose ? `o+ a : X j C , ? ?0 and a a0 . As 0 j= C in the new system we have by 85 that ? `o a : 0 X , and hence by closure for `o (45) that ?0 `o a0 : A0 where A0 ' 0X . Again by 85 there is X0 , C0 and such that ?0 `o+ a0 : X0 j C0 where X0 = A0 . It follows from 83 that X0 = A0 ' 0X and hence X0 ' X j D0 for some D0 . ut Now we de ne the relation `tp+ . De nition 88 `tp+ . The relation `tp+ C (T [ T ) (T [ T ) Cnstr is the smallest relation satisfying the rules for `o+ , but having instead of the rule App-o+ the following rule for application. App-tp+
Γ ⊢tp+ a : X | C
──────────────────────
Γ ⊢tp+ a b : X b | C
Clearly ⊢o+ ⊆ ⊢tp+, or more precisely: if Γ ⊢o+ a : X | C then Γ ⊢tp+ a : X | D for some D. As the systems are syntax directed we have also: if Γ ⊢tp+ a : X | D and Γ ⊢o+ a : Y | C then X = Y.
Lemma 89 Equivalence of ⊢tp and ⊢tp+.
(i) If Γ ⊢tp+ X : Y | C and σ ⊨ C then Γ ⊢tp σX : σY.
(ii) If Γ ⊢tp σX : A then ∃Y, C, σ1 [σ ⊆ σ1, σ1 ⊨ C, A = σ1Y and Γ ⊢tp+ X : Y | C].
The proof is similar to the proof of 85. We are ready to define a syntax directed set of rules for arbitrary PTS.
Definition 90 ⊢sd-n. For each n ∈ ℕ the relation ⊢sd-n, relating a context, a term, a term or schematic term, and a constraint, is the smallest relation satisfying the following rules:

Srt-sd-n    ⊢sd-n s : α | {⟨s, α⟩ ∈ A}

Var-sd-n    Γ ⊢sd-n A :↠ α | C
            ───────────────────────────
            Γ, x:A ⊢sd-n x : A | C

Wk-sd-n     Γ ⊢sd-n b : X | C    Γ ⊢sd-n A :↠ α | D
            ────────────────────────────────────────            b ∈ S ∪ V
            Γ, x:A ⊢sd-n b : X | C ∪ D

Pi-sd-n     Γ ⊢sd-n A :↠ α | C    Γ, x:A ⊢sd-n B :↠ β | D
            ───────────────────────────────────────────────
            Γ ⊢sd-n Πx:A.B : γ | C ∪ D ∪ {⟨α, β, γ⟩ ∈ R}

Lda-sd-n    Γ ⊢sd-n A :↠ α | C    Γ, x:A ⊢sd-n b :⇒n X | D    Γ, x:A ⊢tp+ X :↠ β | E
            ─────────────────────────────────────────────────────────────────────────────
            Γ ⊢sd-n λx:A.b : Πx:A.X | C ∪ D ∪ E ∪ {⟨α, β, γ⟩ ∈ R}

App-sd-n    Γ ⊢sd-n a :↠wh Πx:B.X | C    Γ ⊢sd-n b : Y | D
            ────────────────────────────────────────────────            Y ≃ B | E
            Γ ⊢sd-n a b : X[x := b] | C ∪ D ∪ E

We prove soundness and completeness of ⊢sd-n with respect to ⊢nsd-n.
Lemma 91 Soundness of `sd?n with respect to `nsd?n . If ? `sd?n a : X j C and j= C then ? `nsd?n a : X: Proof. Induction on ? `sd?n a : A . Srt-sd-n `sd?n s : j fhs; ig 2 A If j= fhs; i 2 Ag then = s0 where hs; s0i 2 A . Hence `nsd?n s : s0 . Var-sd-n ?; x:A `sd?n x : A j C because ? `sd?n A : j C . Suppose j= C , then by induction hypothesis ? `nsd?n A : . As 2 S it follows that ?; x:A `nsd?n x : A . Wk-sd-n ?; x:A `sd?n b : X j C [ D because ? `sd?n b : X j C , ? `sd?n A : j D and b 2 S [ V . Suppose j= C [ D , then by induction hypothesis ? `nsd?n b : X and ? `nsd?n A : . Again 2 S , so ?; x:A `nsd?n b : X .
Pi-sd-n ? `sd?n x:A:B : j C [ D [ fh; ; i 2 Rg because ? `sd?n A : j C and ?; x:A `sd?n B : j D : Suppose j= C [ D [ fh; ; i 2 Rg , then by induction hypothesis ? `nsd?n A : and ?; x:A `nsd?n B : . As earlier we conclude ? `nsd?n x:A:B : because h; ; i 2 R. Lda-sd-n ? `sd?n x:A:b : x:A:X j C [ D [ E [ fh; ; i 2 Rg because ? `sd?n A : j C ; ?; x:A `sd?n b :)n X j D and ?; x:A `tp+ X : j E : Suppose j= C [n D [ E [ fh; ; i 2 Rg . By induction hypothesis ? `nsd?n A : and ?; x:A `nsd?n b :) X . Also by 89 we have ?; x:A `tp X : and it follows that ? `nsd?n x:A:b : x:A: X because h ; ; i 2 R . App-sd-n ? `sd?n a b : X [x := b] j C [ D [ E because ? `sd?n a :wh x:B:X j C ; ? `sd?n b : Y j D and quadY ' B j E : Suppose j= C [ D [ E , then by induction hypothesis ? `nsd?n a :wh x:B: X and ? `nsd?n b : Y : Also we have Y ' B by 32. Hence ? `nsd?n a b : (X [x := b]) because ( X )[x := b] = (X [x := b]). ut
Lemma 92 Completeness of `sd?n with respect to `nsd?n . If ? `nsd?n a : A then 9X 2 T [ T 9C 2 Cnstr 9 j= C [A = X and ? `sd?n a : X j C ]: Proof. Induction on ? `nsd?n a : A . Srt-nsd-n We have `nsd?n s1 : s2 because hs1; s2 i 2 A . Hence `sd?n s1 : j fhs1; i 2 Ag where is a fresh sort variable. De ning = s2 we have j= fhs1 ; i 2 Ag and we are done. Var-nsd-n We have ?; x:A `nsd?n x : A as a consequence of ? `nsd?n A : A0 where A0 s . By the induction hypothesis ? `sd?n A : X0 j C and we have j= C and X0 = A0 . Now as A0 s , also X0 where = s . It follows that ?; x:A `sd?n x : A j C . Wk-nsd-n ?; x:A `nsd?n b : B because ? `nsd?n b : B and ? `nsd?n A : A0 where A0 s , and b 2 S [ V . Induction gives ? `sd?n b : Y j C and ? `sd?n A : X0 j D , where we assume SV(C ) and SV (D) to be disjoint. Also we have 1 satisfying C and 2 satisfying D with 1 Y = B and 2 X0 = A0 . As A0 s it follows that X0 . So, taking = 1 [ 2 we have ?; x:A `sd?n b : Y j C [ D where j= C [ D and Y = B . Pi-nsd-n ? `nsd?n x:A:B : s3 as a consequence of ? `nsd?n A : A0 and ?; x:A `nsd?n B : B0 where A0 s1 , B0 s2 and hs1 ; s2; s3i 2 R . As before we have ? `sd?n A : X0 j C and ?; x:A `sd?n B : Y0 j D where we assume SV(C ) and SV(D) disjoint, 1 and 2 satisfying C and D respectively, and 1 X0 = A0 , 2 Y0 = B0 . Hence again X0 and Y0 , where 1 = s1 and 2 = s2 . It follows that ? `sd?n x:A:B : j C [ D [ fh; ; i 2 Rg . So taking = 1 [ 2 [ fh; s3ig we have j= C [ D [ fh; ; i 2 Rg and = s3 .
Lda-nsd-n ? `nsd?n x:A:b : x:A:B because ? `nsd?n A : A0 where A0 s1 , ?; x:A `nsd?n b : B0 where B0 )n B , ?; x:A `tp B : s2 and hs1; s2; s3i 2 R. By induction hypothesis we have ? `sd?n A : X0 j C and ?; x:A `sd?n b : Y0 j D . We observe that SV (C ) and SV (D) might be chosen to be disjoint. Also we have 1 and 2 satisfying C and D respectively, where 1 X0 = A0 and 2 Y0n = B0 . We de ne 3 =n 1 [ 2 . Then we have 3 = s1 and 3 Y0 = B0 . And because B0 ) B we have also Y0 ) Y with 3 Y = B . Hence we have ?; x:A `tp 3 Y : s2 and therefore by 89 ? `tp+ Y : j E and there is an extension 4 of 3 with 4 j= E and 4 = s2 . It follows that ? `sd?n x:A:b : x:A:Y j C [ D [ E [ fh; ; i 2 Rg : Also if we take = 4 [ fh; s3ig we have Y = B , so (x:A:Y ) = x:A:B and j= C [ D [ E [ fh; ; i 2 Rg: App-nsd-n We have ? `nsd?n a b : A[x := b] because ? `nsd?n a : A0 , and ? `nsd?n b : B2 where A0 wh x:B1:A and B1 ' B2 . By induction hypothesis ? `sd?n a : X0 j C and ? `sd?n b : Y2 j D and as before we consider SV (C ) and SV (C ) to be disjoint, and we have also 1 and 2 satisfying C and D respectively, where 1 X0 = A0 and 2 Y2 = B2 . It follows that X0 wh x:B1 :X and as 2 Y2 ' 2 B1 we have also Y2 ' B1 j E where 2 j= E . Hence we have ? `sd?n a b : X [x := b] j C [ D [ E and taking = 1 [ 2 we have also X = A and j= C [ D [ E . ut For proving our second completeness result we need the following property of `sd?n .
Lemma 93 Weak Completeness of `o+ for `sd?n . If ? `sd?n a : X j C then i 9Y; D Y ' X and ? `o+ a : Y j D ii 9; D ? `o+ X : j D Now we prove a second completeness lemma for `sd?n . Lemma 94 Weak Completeness of `sd?n with respect to `nsdtp . If ? `nsdtp a : A then 8n 9X; C ; D [A ' X j C and ? `sd?n a : X j D]: Proof. Induction on ? `nsdtp a : A . We select interesting cases. Wk-nsdtp We have ?; x:A `nsdtp b : B as a consequence of ? `nsdtp b : B , ? `nsdtp A : A0 where A0 s and b 2 S [ V . By induction hypothesis we have ? `sd?n b : X j D1 where B ' X j C1 and ? `sd?n A : Y j D2 where A0 ' Y j C2 . Now as A0 s it follows that Y and hence ?; x:A `sdtp?n b : X j D1 [ D2 . Lda-nsdtp We have ? `nsdtp x:A:b : x:A:B as a consequence of ? `nsdtp A : A0 where A0 s1 and ?; x:A `nsdtp b : B0 where B0 B . And also we know ?; x:A `tp B : C where C s2 and hs1; s2; s3i 2 R. By the induction hypothesis we have: ? `sd?n A : X j D1 where A0 ' X j C1 , and ?; x:A `sd?n b : Y j D2 where B0 ' Y j C2 . We observe that X . Also n we have by 93 that ?; x:A `o+ Y : j E . Now we de ne Yn by Y ) Yn . Then also ?; x:A `o+ Yn : j E1 by 87, and therefore also ?; x:A `tp+ Yn : j E2 . It follows that ? `sdtp?n x:A:b : x:A:Yn j D1 [ D2 [ E 2 [ fh; ; i 2 Rg by Lda-sd-n , and also x:A:Yn ' x:A:B j C2 , because B0 ' Y j C2 , B ' B0 and Yn ' Y .
App-nsdtp We have ? `nsdtp a b : A[x := b] as a consequence of ? `nsdtp a : A0 where A0 wh x:B:A and ? `nsdtp b : B0 where B0 ' B . By the induction hypothesis ? `sd?n a : X j D1 where A0 ' X j C1 and ? `sd?n b : Y j D2 where B0 ' Y j C2 . It follows that X wh x:B1 :X1 where B ' B1 and therefore also Y ' B0 j C2 . Hence ? `sd?n a b : X [x := b] j D1 [ D2 [ C2 by App-sd-n. ut It is the place to collect results. We will characterize ` in terms of the relations `sd?n . As the latter have syntax directed presentations there is an obvious way to construct checking algorithms for them, and hence we have an (ecient) algorithm for typechecking ` .
Lemma 95 Soundness of ⊢sd-n for ⊢. If Γ ⊢sd-n a : X | C, Γ ⊢sd-n A : α | D, X ≃ A | E and σ ⊨ C ∪ D ∪ E then Γ ⊢ a : A.
Proof. We have by 91 that Γ ⊢nsd-n a : σX and Γ ⊢nsd-n A : σα. Hence we have by 78 that Γ ⊢nsdtp a : σX and Γ ⊢nsdtp A : σα. And therefore by 14, Γ ⊢ a : σX and Γ ⊢ A : A0 where A0 ↠ σα. Now σα = s ∈ S and it follows by closure that Γ ⊢ A : s. And as σ ⊨ E we have σX ≃ A, so Γ ⊢ a : A by Cnv. ∎
The next lemma states that for all n, ⊢sd-n succeeds on every well typed subject; the rub is that, if n is not large enough, the constraints may not be satisfiable.
Lemma 96 Weak Completeness of ⊢sd-n for ⊢. If Γ ⊢ a : A then ∀n ∃X, α, C, D, E [Γ ⊢sd-n a : X | C, A ≃ X | E and [A ∈ S or Γ ⊢sd-n A : α | D]].
Proof. If Γ ⊢ a : A then we have by 58 that Γ ⊢nsdtp a : A0 where A0 ≃ A. It follows by lemma 94 that Γ ⊢sd-n a : X | C where A0 ≃ X | E, and hence also A ≃ X | E. We also have either A ∈ S or Γ ⊢ A : s. In the first case we are done. In the second case we repeat our argument, getting Γ ⊢sd-n A : Y | D with s ≃ Y | E1, and therefore Y is a sort or a sort variable. ∎
Lemma 97 Completeness of ⊢sd-n for ⊢. If Γ ⊢ a : A then ∃n, X, α, C, D, σ [σ ⊨ C ∪ D, Γ ⊢sd-n a : X | C, A ≃ σX and [A ∈ S or Γ ⊢sd-n A : α | D]].
The proofs are similar to the proofs of the corresponding lemmas for ⊢sd-n, and use the fact that all terms and all schematic terms considered will normalize, as they are β-convertible to terms of the underlying PTS.
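Lemmas 96 and 97 suggest the obvious semi-decision procedure: iterative deepening on the bound n, testing the constraint produced at each stage for satisfiability. The sketch below shows only that driver, with the per-bound checker and the satisfiability test as parameters (hypothetical names, not the paper's); on an input that is not well typed the loop simply fails to terminate, which is all the completeness result entitles us to expect.

    import Data.Maybe (listToMaybe)

    searchType
      :: (Int -> ctx -> term -> Maybe (sty, cnstr))  -- run ⊢sd-n: a schematic type and its constraint
      -> (cnstr -> Bool)                             -- satisfiability of a constraint
      -> ctx -> term -> Maybe (Int, sty, cnstr)
    searchType runSd satisfiable ctx a =
      listToMaybe
        [ (n, ty, c)
        | n <- [0 ..]
        , Just (ty, c) <- [runSd n ctx a]
        , satisfiable c
        ]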
8 Conclusion
We have presented efficient syntax directed presentations of two subclasses of PTS:
- the semi-full systems, via the ⊢sdsf relation;
- the functional systems, via the ⊢f relation.
The only remaining defect in these presentations lies in the possible failure of tests for conversion in the application rule. Thus for normalizing functional and semi-full systems, everything has been said. For non-functional systems the situation is less clear. We know of no a priori bound on the amount of reduction necessary to correctly type λ-abstractions, so we must be content with the collective completeness of the family of syntax directed systems ⊢sd-n. We have made little impact on the Expansion Postponement problem, which we leave as future work. We can, however, bask in the relative peace of mind gained from the machine-checked presentation of most (i.e. those not concerning schematic judgements) of the above results.
References [Bar91] Henk Barendregt. Introduction to Generalised Type Sytems. J. Functional Programming, 1(2):125{154, April 1991. [Bar92] Henk Barendregt. Lambda calculi with types. In Abramsky, Gabbai, and Maibaum, editors, Handbook of Logic in Computer Science, volume II. Oxford University Press, 1992. [Ber90] Stefano Berardi. Type Dependence and Constructive Mathematics. PhD thesis, Dipartimento di Informatica, Torino, Italy, 1990. [CH88] Thierry Coquand and Gerard Huet. The calculus of constructions. Information and Computation, 76(2/3):95{120, February/March 1988. [Cha77] Tat-Hung Chan. An algorithm for checking PL/CV arithmetical inferences. Technical Report 77{236, Computer Science Department, Cornell University, Ithaca, New York, 1977. [Geu] Herman Geuvers. The calculus of constructions and higher order logic. In preparation. [Geu93] Herman Geuvers. Logics and Type Systems. PhD thesis, Department of Mathematics and Computer Science, University of Nijmegen, 1993. [GN91] Herman Geuvers and Mark-Jan Nederhof. A modular proof of strong normalization for the calculus of constructions. Journal of Functional Programming, 1(2):155{189, April 1991. [Hel91] Leen Helmink. Goal directed proof construction in type theory. In Logical Frameworks. Cambridge University Press, 1991. [HHP87] Robert Harper, Furio Honsell, and Gordon Plotkin. A framework for de ning logics. In Proceedings of the Symposium on Logic in Computer Science, pages 194{204, Ithaca, New York, June 1987. [HHP92] Robert Harper, Furio Honsell, and Gordon Plotkin. A framework for de ning logics. Journal of the ACM, 40(1):143{184, 1992. Preliminary version in LICS'87. [HP91] Robert Harper and Robert Pollack. Type checking with universes. Theoretical Computer Science, 89:107{ 136, 1991. [Hue87] Gerard Huet. Extending the calculus of constructions with Type:Type. Unpublished manuscript, April 1987. [Hue89] Gerard Huet. The constructive engine. In R. Narasimhan, editor, A Perspective in Theoretical Computer Science. World Scienti c Publishing, 1989. Commemorative Volume for Gift Siromoney. [LP92] Zhaohui Luo and Robert Pollack. LEGO proof development system: User's manual. Technical Report ECS-LFCS-92-211, LFCS, Computer Science Dept., University of Edinburgh, The King's Buildings, Edinburgh EH9 3JZ, May 1992. Updated version. See http://www.dcs.ed.ac.uk/packages/lego/ [Luo90] Zhaohui Luo. An Extended Calculus of Constructions. PhD thesis, Department of Computer Science, University of Edinburgh, June 1990. [Mar72] Per Martin-Lof. An intuitionistic theory of types. Technical report, University of Stockholm, 1972. [MP93] James McKinna and Robert Pollack. Pure Type Sytems formalized. In M.Bezem and J.F.Groote, editors, Proceedings of the International Conference on Typed Lambda Calculi and Applications, TLCA'93, Utrecht, pages 289{305. Springer-Verlag, LNCS 664, March 1993. [MP94] James McKinna and Robert Pollack. Pure Type Sytems formalized. Available by anonymous ftp from ftp.dcs.ed.ac.uk, directory export/lego, le PTSproofs.tar.Z, 1994. [Pfe89] Frank Pfenning. Elf: A language for logic de nition and veri ed metaprogramming. In Proceedings of the Fourth Annual Symposium on Logic in Computer Science, Asilomar, California, June 1989. [Pol92] R. Pollack. Typechecking in Pure Type Sytems. In Informal Proceedings of the 1992 Workshop on Types for Proofs and Programs, B astad, Sweden, pages 271{288, June 1992. Available by ftp.
[Pol94] Robert Pollack. The Theory of LEGO: A Proof Checker for the Extended Calculus of Constructions. PhD thesis, University of Edinburgh, 1994. Available by anonymous ftp from ftp.cs.chalmers.se in directory pub/users/pollack. [vBJ93] L.S. van Benthem Jutting. Typing in Pure Type Systems. Information and Computation, 105(1):30-41, July 1993. [vD80] D. T. van Daalen. The Language Theory of Automath. PhD thesis, Technische Hogeschool Eindhoven, 1980.