When is Ap's Num instance lawful?


Consider this type:

{-# LANGUAGE GeneralizedNewtypeDeriving #-}

newtype Ap f a = Ap (f a)
  deriving (Functor, Applicative)  -- newtype-derived, so liftA2 and pure below work at Ap f
instance (Applicative f, Num a) => Num (Ap f a) where
  (+)         = liftA2 (+)
  (*)         = liftA2 (*)
  negate      = fmap negate
  fromInteger = pure . fromInteger
  abs         = fmap abs
  signum      = fmap signum

What needs to be true of f for that instance to be lawful? Clearly, being a lawful Applicative isn't sufficient on its own (e.g., x + negate x differs from fromInteger 0 when x is Ap Nothing, even though Maybe is a lawful Applicative). My guess is that it's both necessary and sufficient for f to be a representable functor. I'm not sure how to go about proving this, though (or how to find a counterexample if it's false).
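For concreteness, the Maybe counterexample can be checked directly. A minimal, self-contained sketch (Ap is defined locally rather than imported from Data.Monoid, and unAp is a hypothetical unwrapper added for printing):

```haskell
import Control.Applicative (liftA2)

newtype Ap f a = Ap (f a)

unAp :: Ap f a -> f a
unAp (Ap x) = x

-- same instance as above, written with explicit unwrapping
instance (Applicative f, Num a) => Num (Ap f a) where
  Ap x + Ap y   = Ap (liftA2 (+) x y)
  Ap x * Ap y   = Ap (liftA2 (*) x y)
  negate (Ap x) = Ap (fmap negate x)
  fromInteger   = Ap . pure . fromInteger
  abs (Ap x)    = Ap (fmap abs x)
  signum (Ap x) = Ap (fmap signum x)

main :: IO ()
main = do
  let x = Ap Nothing :: Ap Maybe Integer
  print (unAp (x + negate x))                       -- Nothing
  print (unAp (fromInteger 0 :: Ap Maybe Integer))  -- Just 0
```

The two printed values differ, so the additive inverse law fails for Ap Maybe.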

For the sake of reference, here are the laws suggested by the Num documentation:

-- Associativity of (+)
(x + y) + z = x + (y + z)

-- Commutativity of (+)
x + y = y + x

-- fromInteger 0 is the additive identity
x + fromInteger 0 = x

-- negate gives the additive inverse
x + negate x = fromInteger 0

-- Associativity of (*)
(x * y) * z = x * (y * z)

-- fromInteger 1 is the multiplicative identity
x * fromInteger 1 = x
fromInteger 1 * x = x

-- Distributivity of (*) with respect to (+)
a * (b + c) = (a * b) + (a * c)
(b + c) * a = (b * a) + (c * a)
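As a quick sanity check, these laws can be tested pointwise over a finite sample. A throwaway sketch (checkNumLaws is a hypothetical helper; passing on a sample is evidence, not a proof):

```haskell
-- check every documented Num law at all combinations of sample points
checkNumLaws :: (Num a, Eq a) => [a] -> Bool
checkNumLaws xs = and $
     [ (x + y) + z == x + (y + z)                         | x <- xs, y <- xs, z <- xs ]
  ++ [ x + y == y + x                                     | x <- xs, y <- xs ]
  ++ [ x + fromInteger 0 == x                             | x <- xs ]
  ++ [ x + negate x == fromInteger 0                      | x <- xs ]
  ++ [ (x * y) * z == x * (y * z)                         | x <- xs, y <- xs, z <- xs ]
  ++ [ x * fromInteger 1 == x && fromInteger 1 * x == x   | x <- xs ]
  ++ [ x * (y + z) == (x * y) + (x * z)                   | x <- xs, y <- xs, z <- xs ]
  ++ [ (y + z) * x == (y * x) + (z * x)                   | x <- xs, y <- xs, z <- xs ]

main :: IO ()
main = print (checkNumLaws [-3 .. 3 :: Integer])  -- True
```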

There are 2 answers

Answer from leftaroundabout

This structure is called an algebra, which is a vector space equipped with a product. It is a fairly well known pattern to implement vector spaces as representable functors. This is what the linear library is all based around.

An algebra is trivially a ring (by forgetting about the scalars), and a ring is about the only thing everybody can agree on as to what the Num class actually describes. In that sense, yes, Representable is what's needed. Specifically, what you're exploiting is the ability to replace abstract vectors with their basis expansions, i.e. you're essentially reducing everything to

instance Num a => Num (r -> a) where
  fromInteger = const . fromInteger
  f + g = \e -> f e + g e
  f * g = \e -> f e * g e
  negate f = negate . f
  abs f = abs . f
  signum f = signum . f

which can then be used with r ~ Rep f, because the type Rep f -> a is isomorphic to f a.
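To see that reduction in action, here is the instance above instantiated at r ~ Bool, which represents pairs (a sketch; the instance is repeated to make the block self-contained, and f and g are arbitrary example vectors):

```haskell
-- pointwise Num on functions, as in the answer above
instance Num a => Num (r -> a) where
  fromInteger = const . fromInteger
  f + g    = \e -> f e + g e
  f * g    = \e -> f e * g e
  negate f = negate . f
  abs f    = abs . f
  signum f = signum . f

-- a function Bool -> Int represents a pair of Ints: (value at False, value at True)
f, g :: Bool -> Int
f b = if b then 3 else 5     -- represents (5, 3)
g b = if b then 10 else 20   -- represents (20, 10)

main :: IO ()
main = print ((f + g) False, (f + g) True)  -- (25,13): componentwise addition
```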

I personally am not a fan of any of this. In particular the Num instances of V2 etc. are a pain in the backside, because e.g. 1 :: V2 Double ≡ V2 1 1 doesn't make any sense whatsoever geometrically if you're dealing with e.g. diagrams, which are supposed to be rotation-agnostic. See, any finite-dimensional vector space can be realised with representable functors, but this entails picking an arbitrary basis. But this is a bit like picking an arbitrary character encoding for a text editor and then hard-baking it in a way that's fully exposed in the public interface. It's bad design. Just because you can doesn't mean you should.

Furthermore, many practically useful vector spaces cannot sensibly be implemented with representable functors, or that implementation may be very impractical. I'm talking about sparse vectors here, or efficient unboxed and/or distributed storage, or (a personal favourite) infinite-dimensional function spaces.

Therefore, what I recommend is to keep the Num class to things that are unequivocally numbers, and call everything else explicitly by what you actually want. My free-vector-spaces and linearmap-category packages attempt to do that in a principled and powerful way, building upon the simple and safe classes from vector-space.

Answer from duplode

What needs to be true of f for that instance to be lawful?

To sketch an argument, we can begin by focusing on the effects monoid (Ap f (), if you will) which lies under the applicative. For left distributivity to hold, we need the effects monoid to be left self-distributive (that is, u <> (v <> w) = (u <> v) <> (u <> w)), and analogously for right distributivity. Furthermore, a monoid that is both left and right self-distributive must be both commutative and idempotent. That is a hefty restriction already, though it is not enough to rule out Maybe (which is an idempotent, commutative applicative). That being so, it makes sense to follow your hint and look at the additive inverse law:

x + negate x = fromInteger 0
-- Substitute the instance implementations:
liftA2 (+) x (negate <$> x) = pure (fromInteger 0)
-- (&*&) = liftA2 (,)
(\(a, b) -> a + negate b) <$> (x &*& x) = pure (fromInteger 0)

In particular, using (() <$) on both sides gives us:

() <$ (x &*& x) = pure ()

Going back to the effects monoid point of view, that amounts to u <> u = mempty. Since we had established earlier that the effects monoid must be idempotent, u <> u = u. Therefore, u = mempty; that is, there is just one possible Ap f ().
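Both steps can be checked concretely for Maybe: its effects monoid (Maybe () under (*>)) is indeed commutative and idempotent, yet () <$ (x &*& x) = pure () fails at Nothing. A sketch (op is a hypothetical name for the monoid operation, and (&*&) is as defined above):

```haskell
import Control.Applicative (liftA2)

(&*&) :: Applicative f => f a -> f b -> f (a, b)
(&*&) = liftA2 (,)

-- the effects monoid of Maybe: carrier Maybe (), operation (*>), identity pure ()
op :: Maybe () -> Maybe () -> Maybe ()
op = (*>)

vals :: [Maybe ()]
vals = [Nothing, Just ()]

main :: IO ()
main = do
  print (and [op u v == op v u | u <- vals, v <- vals])  -- True: commutative
  print (and [op u u == u | u <- vals])                  -- True: idempotent
  let x = Nothing :: Maybe Integer
  print (() <$ (x &*& x))  -- Nothing, which differs from pure () = Just ()
```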

As for the other laws, identity and associativity should follow from the applicative laws, while commutativity of (+) presumably requires f to be a commutative applicative...

liftA2 g u v = liftA2 (flip g) v u

... which is a bit stronger than merely requiring the commutativity of Ap f () that I mentioned before (see Select for an illustration of that).
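This stronger condition can also be tested at particular arguments; Maybe passes while the list applicative does not. A sketch (commutesOn is a hypothetical helper checking the equation at given inputs, not a proof over all of them):

```haskell
import Control.Applicative (liftA2)

-- does liftA2 g u v = liftA2 (flip g) v u hold at these particular arguments?
commutesOn :: (Applicative f, Eq (f c)) => (a -> b -> c) -> f a -> f b -> Bool
commutesOn g u v = liftA2 g u v == liftA2 (flip g) v u

main :: IO ()
main = do
  print (commutesOn (,) (Just (1 :: Int)) (Just 2))  -- True
  print (commutesOn (,) [1, 2 :: Int] [3, 4])
  -- False: [(1,3),(1,4),(2,3),(2,4)] vs [(1,3),(2,3),(1,4),(2,4)]
```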

For the Num (Ap f a) instance to be lawful, therefore, it seems we need at least that f is a commutative applicative, and that there is just one f () value. While the argument isn't quite watertight just yet, those requirements are indeed strongly reminiscent of representable functors.