
Re: functional languages ill-suited for large programs?



Quoting "Perry E. Metzger" <perry@piermont.com>:
> Jim Blandy <jimb@redhat.com> writes:
> > I can think of one case where pure functional languages make nice
> > interfaces difficult.
> > 
> > Suppose you've got a function that takes some time to compute, but you
> > know that once you've applied it to one value, you're likely to apply
> > it to that same value again soon.  In a stateful language, you can
> > wrap a cache around your function without changing its type.  But in a
> > pure functional language, each call to the cached version of the
> > function must take the cache (or something containing the cache) as an
> > explicit parameter, and return an updated version; the caller must
> > thread these values along from each call to the next.
> 
> That's not necessarily the case. The optimizer for a pure functional
> language will know that a function is pure and can handle
> memoization/caching behind your back. That requires more work on the
> part of the compiler, but there is no reason that it can't be done.
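To make Jim's point concrete, here is a rough sketch of such a cache wrapper in an impure functional language (I use OCaml here purely for illustration; the names are mine, not from the original posts). The cache is hidden inside a closure, so the wrapped function keeps its plain type 'a -> 'b; a pure version would instead have to take the cache as an argument and return the updated cache along with the result.

  (* Sketch only: wrap a cache around any function without changing its
     type.  Usage would be e.g.  let cached_f = memoize expensive_f  *)
  let memoize (f : 'a -> 'b) : 'a -> 'b =
    let cache = Hashtbl.create 16 in
    fun x ->
      match Hashtbl.find_opt cache x with
      | Some y -> y
      | None ->
          let y = f x in
          Hashtbl.add cache x y;
          y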

I disagree with the claim that the compiler can do this behind your back. It
seems to me that a compiler simply cannot, *in general*, make such
optimizations, let alone guarantee them.

Consider the simplistic exponential recursive fibonacci function. If the
compiler did automatic caching of results, the function would become linear
in time. Unless the optimizer were rather clever, the function would also
take linear space. Although automatically optimizing the exponential
recursive fibonacci algorithm into a linear-time, constant-space iterative
algorithm is probably possible, I'm not sure how easy it is or whether such
optimizations generalize to less contrived situations.
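To illustrate what I mean, here is the kind of caching in question, written
out by hand in OCaml (my sketch, not something any compiler is claimed to
produce):

  (* Naive fibonacci: exponential time, because fib (n - 1) and
     fib (n - 2) recompute the same subproblems over and over. *)
  let rec fib n =
    if n < 2 then n else fib (n - 1) + fib (n - 2)

  (* Cached fibonacci: linear time, but the cache (and the recursion
     depth) also grow linearly, so the space cost is linear as well. *)
  let fib_memo =
    let cache = Hashtbl.create 16 in
    let rec go n =
      if n < 2 then n
      else
        match Hashtbl.find_opt cache n with
        | Some r -> r
        | None ->
            let r = go (n - 1) + go (n - 2) in
            Hashtbl.add cache n r;
            r
    in
    go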

Before I can agree with the argument that "a compiler can do it", I need to
see a proof, in the form of a publication or preferably an existing
compiler, that shows that such optimizations are *in general* possible - not
just in toy examples that are of no practical value.

The reason I stress *in general* is that in some situations you need that
kind of guarantee, because otherwise the algorithm would not be usable (just
consider the fibonacci example). The *fact* is that caching does not merely
change the constant factors of an algorithm - it can and does change its
asymptotic complexity bounds, in both time and space, just as tail call
optimization turns linear-space recursion into constant-space iteration. If
you need the guarantee and the *language specification* does not provide it,
a competent engineer simply would not trust the compiler to perform the
optimization, but would implement it manually.
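For the fibonacci case, that manual implementation might look something like
the following rough OCaml sketch: an iterative, tail-recursive version that
runs in linear time and, assuming tail calls do not grow the stack, constant
space.

  (* The optimization done by hand: iterate from fib 0 = 0, fib 1 = 1,
     carrying only the two most recent values. *)
  let fib_iter n =
    let rec loop i a b =
      if i = n then a else loop (i + 1) b (a + b)
    in
    loop 0 0 1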

Regards,
  Vesa Karvonen