I did bump into this question while searching for this topic, but it seems to be outdated.
According to https://blogs.mathworks.com/loren/2016/10/24/matlab-arithmetic-expands-in-r2016b, implicit expansion was introduced in R2016b, but I can still find reference code in papers that uses bsxfun for arithmetic expansion. So I assume there are some circumstances that make bsxfun preferable to other methods.
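For context, bsxfun(@plus, a, r) and a + r compute the same result when the singleton dimensions line up; a minimal sketch (variable sizes chosen to mirror the benchmark below):

a = ones(10,1);            % 10x1 column vector
r = rand(1,5);             % 1x5 row vector
x = bsxfun(@plus, a, r);   % explicit singleton expansion, works on any release
y = a + r;                 % implicit expansion, R2016b and later
isequal(x, y)              % should return true; both are 10x5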
I compared the speed of bsxfun, repmat, and implicit expansion (adapting the code by Jonas from the linked question).
The plot below shows the comparison of calculation times measured with tic/toc:

The plot shows that implicit expansion is clearly faster than bsxfun or repmat. Is there any reason to use bsxfun nowadays?
Here is the code I used to compare the speed:
n = 300;
k = 100;             %# k=100 for the second graph
a = ones(10,1);
rr = zeros(n,1);     % repmat timings
bb = zeros(n,1);     % bsxfun timings
gg = zeros(n,1);     % implicit expansion timings
ntt = 100;           % repetitions per array size
tt = zeros(ntt,1);
for i = 1:n
    r = rand(1,i*k);
    % bsxfun
    for it = 1:ntt
        tic
        x = bsxfun(@plus,a,r);
        tt(it) = toc;
    end
    bb(i) = median(tt);
    % repmat
    for it = 1:ntt
        tic
        y = repmat(a,1,i*k) + repmat(r,10,1);
        tt(it) = toc;
    end
    rr(i) = median(tt);
    % implicit expansion
    for it = 1:ntt
        tic
        z = a + r;
        tt(it) = toc;
    end
    gg(i) = median(tt);
end
figure;
plot(bb,'b')
hold on
plot(rr,'r')
plot(gg,'g')
legend({'bsxfun','repmat','implicit'})
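As an aside, timeit could be used instead of tic/toc to reduce measurement noise; a sketch of the same measurements for a single array size (3000 elements, chosen just for illustration):

% timeit handles warm-up and repetition internally, so no manual
% median over runs is needed.
a = ones(10,1);
r = rand(1,3000);
t_bsx  = timeit(@() bsxfun(@plus, a, r));
t_rep  = timeit(@() repmat(a,1,numel(r)) + repmat(r,10,1));
t_impl = timeit(@() a + r);
fprintf('bsxfun %.2g s, repmat %.2g s, implicit %.2g s\n', t_bsx, t_rep, t_impl);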
All bsxfun does is Binary Singleton eXpansion. It's more typing than the, now usual, implicit expansion. I'd guess The MathWorks kept bsxfun around for backwards compatibility, but no longer works on it; it might even internally just map to implicit expansion. The documentation on bsxfun itself now recommends replacing most uses of it with direct calls to the functions and operators that support implicit expansion.

Additionally, implicit expansion seems to have internal optimisations beyond what bsxfun does, see this question of mine. More helpful links can be found in this answer by nirvana-msu, amongst others to blogs by MathWorks employees discussing this.
So I'd say that the only reason to use bsxfun instead of implicit expansion would be if you need to run the code on a pre-R2016b version of MATLAB.
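For instance, if the code also has to run on older releases, the bsxfun call is the drop-in equivalent of the implicit-expansion expression; a small sketch (mean-centring the columns of a matrix X, chosen just for illustration):

X = rand(5,3);
Xc_old = bsxfun(@minus, X, mean(X,1));   % runs on any MATLAB release
Xc_new = X - mean(X,1);                  % implicit expansion, R2016b and later
isequal(Xc_old, Xc_new)                  % should return true (logical 1)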