Is dependency analysis required in order to type a program?
Problem
I have seen it stated in various places that, in order to allow an "increase in polymorphism," dependency analysis should be performed and type inference run on each declaration group in topological order. This seems to be treated as a well-known fact... except that nobody provides any reference or example.
In fact I don't understand what is really meant by this statement. Does it mean that without dependency analysis I wouldn't be able to type some programs (i.e. some valid programs would fail type-checking), or only that I would infer types that are more specific than they really are?
I'm particularly interested in the consequences for a Haskell-like language/type system. In particular, during kind inference, is dependency analysis actually necessary to avoid rejecting valid programs? Can you provide an example that wouldn't be well-kinded if checked without performing dependency analysis?
Since kinds are monomorphic I struggle to understand why dependency analysis should matter in this particular case.
Solution
Any use of a binding anywhere may place constraints on its type. So, in the absence of explicit type signatures (i.e. when doing type inference), inference must be performed in a way that doesn't allow a specific usage of a binding at a particular type to "infect" the most general type it may have. This involves a dependency analysis. Consider
id = \x -> x
foo = id 1
Now if we typed foo first, we would unify the first argument of id with Int and thus conclude that id has type Int -> Int. Instead, we must first type id, generalize its signature to the polymorphic type forall a. a -> a, and then instantiate that at Int when we apply it and attempt to unify the argument type with Int. Hence, dependencies and topological sort!
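The effect of this generalization can be observed directly: because the identity function is generalized before its call sites are typed, each call site may instantiate it at a different type. A minimal sketch (idFn is a renaming chosen here to avoid shadowing the Prelude's id):

```haskell
-- Sketch: idFn is generalized to forall a. a -> a before foo and bar
-- are typed, so it can be instantiated at Int and at Bool in the same
-- program. (Named idFn to avoid clashing with the Prelude's id.)
idFn :: a -> a
idFn = \x -> x

foo :: Int
foo = idFn 1

bar :: Bool
bar = idFn True

main :: IO ()
main = print (foo, bar)  -- prints (1,True)
```

Had foo been typed before id was generalized, id would have been pinned at Int -> Int and the use at Bool would have been rejected.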
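The dependency analysis itself amounts to computing strongly connected components of the call graph and sorting them topologically; mutually recursive bindings form a single group that must be inferred together. A minimal sketch using Data.Graph from the containers package (the bindings list is a hypothetical program, each name paired with the names its right-hand side mentions):

```haskell
import Data.Graph (flattenSCC, stronglyConnComp)

-- Hypothetical program: each binding paired with the names it mentions.
bindings :: [(String, [String])]
bindings =
  [ ("foo",   ["id"])     -- foo = id 1
  , ("id",    [])         -- id  = \x -> x
  , ("even'", ["odd'"])   -- a mutually recursive pair: one group
  , ("odd'",  ["even'"])
  ]

-- Strongly connected components in dependency order: each group is
-- inferred and generalized before any group that depends on it.
inferenceOrder :: [[String]]
inferenceOrder =
  map flattenSCC (stronglyConnComp [ (n, n, ds) | (n, ds) <- bindings ])

main :: IO ()
main = print inferenceOrder
```

Here the group containing id comes before the group containing foo, so id is generalized before foo is typed; even' and odd' land in one cyclic group and are inferred simultaneously.
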
Context
StackExchange Computer Science Q#24343, answer score: 2