Apologies, I dabble in Haskell in a minor way, once every 2 years, and then I can't get my head around it.

It's best to just work through the example.

> {-# LANGUAGE MultiParamTypeClasses #-}
> {-# LANGUAGE ScopedTypeVariables #-}
> {-# LANGUAGE AllowAmbiguousTypes #-}
> {-# LANGUAGE FlexibleInstances #-}
> {-# LANGUAGE FlexibleContexts #-}

> class Foo a b where
>     foo :: a -> b
> class Bar a b where
>     bar :: a -> b

> data A a = A a
> data B b = B b

> instance Foo (A a) a where
>   foo (A a) = a

> instance Bar (B b) b where
>   bar (B b) = b

so, what I THINK I'm doing is creating two data types and then defining two functions based on the membership of the datatype in the class....

so this works....

> f1 x = foo x
> f2 x = bar x

> x1 :: String
> x1 = f1 $ A "1"
> x2 :: String
> x2 = f2 $ B "1"


worry #1...if I remove the type declarations of x1 and x2, ghc complains

• Ambiguous type variable ‘b1’ arising from a use of ‘f1’
      prevents the constraint ‘(Foo (A [Char]) b1)’ from being solved.
      Relevant bindings include x1 :: b1 (bound at catdog.lhs:27:3)
      Probable fix: use a type annotation to specify what ‘b1’ should be.
      These potential instance exist:
        instance Foo (A a) a -- Defined at catdog.lhs:17:12
    • In the expression: f1 $ A "1"
      In an equation for ‘x1’: x1 = f1 $ A "1"
27 | > x1 = f1 $ A "1"
   |        ^^^^^^^^^^

which is a worry, isn't it obvious?...and this sort of thing will happen again.....

if I write

> f x = bar (foo x)

to me a perfectly reasonable thing...ghc agrees!

I ask it for the type...I get

f :: (Bar a1 b, Foo a2 a1) => a2 -> b

I can buy that.

like a good programmer I paste the type in

> f :: (Bar a1 b, Foo a2 a1) => a2 -> b
> f x = bar (foo x)

and "BOOM"...

• Could not deduce (Foo a2 a0) arising from a use of ‘foo’
      from the context: (Bar a1 b, Foo a2 a1)
        bound by the type signature for:
                   f :: forall a1 b a2. (Bar a1 b, Foo a2 a1) => a2 -> b
        at catdog.lhs:32:3-39
      The type variable ‘a0’ is ambiguous
      Relevant bindings include
        x :: a2 (bound at catdog.lhs:33:5)
        f :: a2 -> b (bound at catdog.lhs:33:3)
      These potential instance exist:
        instance Foo (A a) a -- Defined at catdog.lhs:17:12
    • In the first argument of ‘bar’, namely ‘(foo x)’
      In the expression: bar (foo x)
      In an equation for ‘f’: f x = bar (foo x)
33 | > f x = bar (foo x)
   |              ^^^^^

so ghc is telling me that the very type it inferred without a type declaration, it's now not sure about!

now...there is usually one cog in my head, turned backwards from using Scala or F# or some other OO-style type system, that I have to turn the other way around....am I going mad?

1 Answer

bradrn

Let's look at your example again:

> class Foo a b where
>     foo :: a -> b

The problem here is that a given a could correspond to multiple different bs; you could have, for example, both instance Foo Int Char and instance Foo Int Bool. This causes the type inference problems you've been seeing.
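To make that concrete, here's a minimal standalone sketch (the class Conv and its instances are hypothetical, made up for illustration): two instances share the same source type, so only the result type can select between them.

```haskell
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE FlexibleInstances #-}

class Conv a b where
  conv :: a -> b

-- Two instances with the same 'a' but different 'b':
instance Conv Int Char where
  conv _ = 'x'

instance Conv Int Bool where
  conv n = n > 0

-- 'conv (5 :: Int)' on its own is ambiguous; annotating the
-- result type is what picks the instance:
asChar :: Char
asChar = conv (5 :: Int)   -- selects Conv Int Char

asBool :: Bool
asBool = conv (5 :: Int)   -- selects Conv Int Bool
```

Remove either annotation and GHC reports exactly the kind of "Ambiguous type variable" error the question shows.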

> f1 x = foo x
> f2 x = bar x

> x1 :: String
> x1 = f1 $ A "1"
> x2 :: String
> x2 = f2 $ B "1"

Here, for example, you've specified both a and b, so GHC knows you're talking about instance Foo (A String) String. When you remove the signatures, GHC doesn't know what b to use. You haven't defined any other instances, and you don't intend to, but GHC doesn't know that.
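As a side note (a sketch, not from the answer; the bindings y1 and y2 are hypothetical): a top-level signature isn't the only way to fix b. An inline annotation on the expression works, and with TypeApplications you can pin the class variables explicitly.

```haskell
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE TypeApplications #-}

class Foo a b where
  foo :: a -> b

data A a = A a

instance Foo (A a) a where
  foo (A a) = a

-- An inline annotation on the result fixes 'b' without a
-- top-level signature:
y1 = foo (A "1") :: String

-- TypeApplications pins 'a' and 'b' in the order the class
-- declares them:
y2 = foo @(A String) @String (A "1")
```

Either way, both class variables end up determined, so instance resolution succeeds.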

So how do we fix this problem? We use {-# LANGUAGE FunctionalDependencies #-}:

> class Foo a b | a -> b where
>     foo :: a -> b

Basically, this tells GHC that for any given a, there is only one corresponding b. This should resolve your problems (although admittedly I haven't tested this yet). For more on functional dependencies, see this SO answer, which also addresses your bar (foo x) problem.
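Putting the whole example together with fundeps (a self-contained sketch reusing the question's names, so it won't load alongside the original definitions): the annotations can now be dropped, and the composed f typechecks with an explicit signature too.

```haskell
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE FunctionalDependencies #-}
{-# LANGUAGE FlexibleInstances #-}

class Foo a b | a -> b where
  foo :: a -> b

class Bar a b | a -> b where
  bar :: a -> b

data A a = A a
data B b = B b

instance Foo (A a) a where
  foo (A a) = a

instance Bar (B b) b where
  bar (B b) = b

-- With the fundeps, no type annotations are needed:
x1 = foo (A "1")
x2 = bar (B "1")

-- The composition now accepts an explicit signature, because
-- the fundeps force the intermediate type 'c' to be unique:
f :: (Foo a c, Bar c b) => a -> b
f x = bar (foo x)

x3 = f (A (B "1"))   -- foo unwraps A, bar unwraps B
```

The key difference from the question's pasted signature is that a -> b in each class tells GHC the intermediate type is fully determined by the argument, so the "Could not deduce" ambiguity can't arise.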