Type theory and programming languages

Confluence (rewriting system)

Multiple paths between a source and target term via rewrite rules.

As in the Church-Rosser theorem (ie, the order in which reduction rules are applied doesn't matter in the lambda calculus: if a term reduces in two different ways, both results can be reduced further to a common term).
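A tiny illustration in Haskell (the name confluenceDemo is mine): two reduction orders for (λx. x + 1) (2 * 3) meet at the same value.

```haskell
-- Reduce the argument first: (λx. x + 1) 6  ->  6 + 1  ->  7
-- Substitute first:          (2 * 3) + 1    ->  6 + 1  ->  7
-- Either way, the same normal form is reached.
confluenceDemo :: Int
confluenceDemo = (\x -> x + 1) (2 * 3)
```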

Weak vs strong normalization (rewriting system)

The following rewriting system is weakly normalizing:

1) S -> S S
2) S -> A

because there exists at least one reduction path that reaches a normal form:

S -> S S       (by 1)
  -> A A       (by 2)

But it is not strongly normalizing, because the rewriting can go on forever. For example:

S -> S S 
  -> (S S) S     (by 1)
  -> ...

Strong normalization: no matter in which order the rewrite rules are applied, a normal form is reached within a finite number of rewrites.
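The system above can be modelled in Haskell as strings over {S, A} (the names applyRule2 and isNormalForm are mine): one pass of rule 2 over every S witnesses weak normalization, while repeating rule 1 forever shows it is not strong.

```haskell
-- Rules over strings of symbols:
--   1) S -> S S   (grows the string; repeating it never terminates)
--   2) S -> A     (turns an S into the normal symbol A)
data Sym = S | A deriving (Eq, Show)

-- One pass of rule 2 over the whole string reaches a normal form.
applyRule2 :: [Sym] -> [Sym]
applyRule2 = map (const A)   -- every S becomes A; A stays A

-- A string is in normal form when no rule applies, ie no S remains.
isNormalForm :: [Sym] -> Bool
isNormalForm = all (== A)
```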

The untyped lambda calculus is not normalizing, due to the famous divergent terms. It is not even weakly normalizing: given (λx. x x) (λx. x x), the only applicable rule is β-reduction, which yields the same term back, so reduction never terminates.

On the other hand, simply typed lambda calculus is strongly normalizing.
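As a sketch of why typing matters: Haskell rejects λx. x x directly, but a recursive newtype (Rec, selfApply and omega are my names) smuggles self-application back in, and omega then loops forever like the untyped term.

```haskell
-- A type that contains functions from itself: this is exactly the
-- recursion that simple types forbid.
newtype Rec a = Rec { unRec :: Rec a -> a }

-- The typed counterpart of  λx. x x
selfApply :: Rec a -> a
selfApply x = unRec x x

-- The counterpart of  (λx. x x) (λx. x x);  evaluating it diverges.
omega :: a
omega = selfApply (Rec selfApply)
```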

Point-free functions

Eg: A function to find the factorial of a number.

A 'normal' style function:

-- Haskell
fact :: Int -> Int
fact 0 = 1
fact n = n * fact (n-1)

λ> fact 5
120

A point-free style function:

fact :: Int -> Int
fact = foldl (*) 1 . enumFromTo 1

λ> fact 5
120
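An even shorter point-free variant of the same function (fact' is my name), using product instead of an explicit fold:

```haskell
-- product xs = foldl (*) 1 xs, so this is the same pipeline as above.
fact' :: Int -> Int
fact' = product . enumFromTo 1
```

Note that fact' 0 = product [] = 1, matching the base case of the recursive version.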

Weak head normal form

Weak head normal form means: evaluate the expression just until its outermost part is a constructor (or a lambda).

Eg:

Expression              WHNF?
8 + Just (5 + 5)        No  (outermost is an application of +)
4 + 5                   No  (outermost is an application of +)
Just (5 + 5)            Yes (outermost is the constructor Just)

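A sketch of how little WHNF forces, using seq (whnfDemo is my name): seq evaluates its first argument only to WHNF, so exposing the Just constructor is enough and the error thunk inside is never touched.

```haskell
-- seq forces its first argument only to WHNF: once the Just
-- constructor is exposed, evaluation stops, so the error inside
-- is never evaluated.
whnfDemo :: String
whnfDemo = Just (error "never forced") `seq` "stopped at the Just constructor"
```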
Bifunctor

Haskell got it in Data.Bifunctor:

class Bifunctor p where
  bimap :: (a -> b) -> (c -> d) -> p a c -> p b d
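For example, pairs and Either both have Bifunctor instances in base (pairDemo and eitherDemo are my names):

```haskell
import Data.Bifunctor (bimap, second)

-- bimap maps over both components of a pair at once.
pairDemo :: (Int, String)
pairDemo = bimap (+ 1) show (41 :: Int, 42 :: Int)

-- second maps only the Right side of an Either.
eitherDemo :: Either String Int
eitherDemo = second (* 2) (Right 21 :: Either String Int)
```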

Incompleteness of subtyping in Java

Array types should be invariant (arrays are mutable), but the Java compiler enforces only the covariant part during its type checking phase. Java has got

   S <: T 
------------
 [S] <: [T]

Covariance makes reads safe, but writes are not; the check that a stored value matches the array's actual element type is deferred to run time.

So a program like this:

public class Foo {
  public static void main(String[] args) {
    Object[] oarr = "hi,hello".split(",");
    oarr[0] = 42;
  }
}

will compile fine, but results in a runtime error:

Exception in thread "main" java.lang.ArrayStoreException: java.lang.Integer
        at Foo.main(Foo.java:5)

This is a design choice that many consider to have been a mistake.

Type inference

Principal type

Most general instance of a type: the one that 'makes the smallest commitment about the values of type variables that yields a well-typed term'. Eg:

a = λf:Y. λx:X. f x

a by itself is not well-typed, since f of type Y is applied to x of type X but Y is not (yet) a function type.

But if we have a substitution like:
σ = {Y↦X→X, X↦X}

σ a = λf:X→X. λx:X. f x

which is well-typed.

This is the most general instance of a.

Notation:

Principal unifier = most general unifier (mgu)
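A minimal sketch of types and substitution application matching the example above (the representation and names Ty, Subst, applySubst are mine):

```haskell
-- Types are either variables or arrows.
data Ty = TVar String | TArr Ty Ty deriving (Eq, Show)

-- A substitution maps type-variable names to types.
type Subst = [(String, Ty)]

-- Apply a substitution, leaving unmapped variables alone.
applySubst :: Subst -> Ty -> Ty
applySubst s (TVar a)   = maybe (TVar a) id (lookup a s)
applySubst s (TArr t u) = TArr (applySubst s t) (applySubst s u)

-- σ = {Y ↦ X→X} from the example (X ↦ X is the identity, so omitted).
sigma :: Subst
sigma = [("Y", TArr (TVar "X") (TVar "X"))]
```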

First-class values

With first-class values in a programming language, both of the following are possible: passing them as arguments to functions, and returning them as results from functions.
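A sketch with functions as the first-class values (twice and addTen are my names):

```haskell
-- twice takes a function as an argument and returns a function.
twice :: (a -> a) -> (a -> a)
twice f = f . f

-- addTen is a function obtained by calling another function.
addTen :: Int -> Int
addTen = twice (+ 5)
```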

Logical relations

Strong normalization of STLC cannot be proved by a straightforward induction on terms, but it can be proved with logical relations (Tait's method).

W-types and M-types

W-types are types of well-founded trees (the inductive reading); M-types are their coinductive duals, allowing infinite trees.

Kind

A kind is a type of types.

From https://serokell.io/blog/kinds-and-hkts-in-haskell

Kinds show the arity of a type (if we think of that type as a function).

Concrete types are like values and polymorphic types are like functions.

Higher-kinded types

Can be thought of as a type-level 'function': its kind contains an arrow, taking one kind to another.
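A sketch (Wrap and wrappedLength are my names): in GHCi, :kind Maybe gives * -> *, and a parameter of that kind makes a type higher-kinded, since it abstracts over a type constructor rather than a concrete type.

```haskell
{-# LANGUAGE KindSignatures #-}
import Data.Kind (Type)

-- f has kind Type -> Type (eg Maybe or []), so Wrap is
-- higher-kinded: it takes a type constructor as an argument.
newtype Wrap (f :: Type -> Type) a = Wrap (f a)

-- Instantiating f with the list constructor [].
wrappedLength :: Wrap [] Int -> Int
wrappedLength (Wrap xs) = length xs
```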

Misc

Types, Categories, Topologies, Knots, etc

From an article.

Types

Categories

Topology

Homotopy

Topological space

Topology

Homeomorphism:

Neighbourhood of a point:

Knot

Tangles

String diagrams

Misc

Spaces: