RE: Java interface natural history was RE: "static" declaration
Sorry for the delayed reply on this, but I did intend to respond.
Daniel Weinreb wrote:
> Um, often there wasn't any implementation to inherit, because the
> interfaces were pretty simple, having few methods, and the
> implementations of those methods didn't need to do any sharing
> that requires inheritance (only sharing that requires subroutines).
I think I misunderstood the original message, and thought you were making a
broader claim.
> >What I'm saying is that either of the above constitutes a limitation in
> >design
>
> Wait, wait, wait. What do you mean, a limitation in design? We didn't
> set any limits on design.
Perhaps not in that system. But the issues I'm referring to can constrain
designs in many systems.
Certainly, many programs don't require the capabilities I'm talking about,
and many others that do can nevertheless get away without them. But I think
there's a danger here, based on the way that languages tend to limit
thought. Something that's factored the Right Way for a language with single
implementation inheritance, and no explicit support for multiple interfaces,
might benefit enough from refactoring into a multiple interface design that
the original design would stop looking like the Right Way - I've come across
at least a few such cases.
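For a contrived sketch of the kind of refactoring I mean (all the names here
are hypothetical, not taken from any real system - and I'm using Java, which
happens to make the second factoring cheap to say):

  // Factored the "Right Way" for single implementation inheritance:
  // anything that can be read from or written to must extend this one
  // base class, whether or not it needs both halves.
  abstract class Stream {
      abstract int read();
      abstract void write(int b);
  }

  // Refactored into multiple interfaces: a type can be a source, a
  // sink, or both, independently of where it sits in the class
  // hierarchy.
  interface Source {
      int read();
  }

  interface Sink {
      void write(int b);
  }

  class Connection implements Source, Sink {
      public int read()        { /* ... */ return -1; }
      public void write(int b) { /* ... */ }
  }

  class AuditLog implements Sink {   // write-only; no dummy read()
      public void write(int b) { /* ... */ }
  }

In a language whose main tool is single implementation inheritance, the first
factoring tends to look natural; the second only starts to look like the
Right Way once the language makes it easy to express.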
> >It sounds as though you may be claiming that good designs don't
> >need to run into this issue,
>
> Certainly they don't. Suppose you are tasked with writing a subroutine
> that takes the square root of a floating point number. Well, I bet it
> doesn't need to run into the issue of multiple inheritance, right?
I have two responses to this. First, of course, I'm not claiming that all
systems need multiple interface or implementation inheritance down to the
most granular level, necessarily (although I could easily be convinced to
stretch that far; see below). I'm talking about the kinds of systems I've
run into in real life, though. There are plenty of simple systems out there
for which a PHP web page is a perfectly practical solution - I'm not talking
about those cases.
My second response is that in fact, the task you've set might well run into
a requirement for multiple interfaces, depending on the language you're
using. Most languages simply punt on the issue of interfaces to basic
datatypes, and glom together all possible operations on their built-in
datatypes into a single implicit interface. That's not necessarily the best
solution - it may simply be a reflection of how primitive and
poorly-factored mainstream languages currently are, even at the most basic
levels. A language that supported an Arithmetic interface on its various
number types, plus some other orthogonal interfaces (Printable? I could
come up with something more convincing if pressed), might then use multiple
interfaces to implement your square root problem.
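To make that concrete, here's a minimal sketch in Java of the kind of
factoring I have in mind. Everything in it is hypothetical - Java's actual
numeric types expose nothing like this, and the names (Arithmetic, Printable,
Real, sqrt) are mine - but it shows a square root routine written against
interfaces rather than against any particular built-in datatype:

  // Hypothetical: Java doesn't factor its numbers this way.
  interface Arithmetic {
      Arithmetic add(Arithmetic other);
      Arithmetic subtract(Arithmetic other);
      Arithmetic multiply(Arithmetic other);
      Arithmetic divide(Arithmetic other);
      Arithmetic halve();   // convenience, so sqrt needs no literal "2"
  }

  // An orthogonal interface; a number type may or may not support it.
  interface Printable {
      String print();
  }

  // One concrete number type that happens to implement both interfaces.
  class Real implements Arithmetic, Printable {
      private final double value;
      Real(double value) { this.value = value; }

      // The cast assumes both operands are Reals - fine for a sketch,
      // not for a real numeric tower.
      private static double val(Arithmetic a) { return ((Real) a).value; }

      public Arithmetic add(Arithmetic o) { return new Real(value + val(o)); }
      public Arithmetic subtract(Arithmetic o) { return new Real(value - val(o)); }
      public Arithmetic multiply(Arithmetic o) { return new Real(value * val(o)); }
      public Arithmetic divide(Arithmetic o) { return new Real(value / val(o)); }
      public Arithmetic halve() { return new Real(value / 2.0); }
      public String print() { return Double.toString(value); }
  }

  public class SqrtDemo {
      // Newton's method, written purely against the Arithmetic interface;
      // nothing here depends on which concrete number type is underneath.
      static Arithmetic sqrt(Arithmetic x, Arithmetic guess, int iterations) {
          for (int i = 0; i < iterations; i++) {
              guess = guess.add(x.divide(guess)).halve();
          }
          return guess;
      }

      public static void main(String[] args) {
          Arithmetic root = sqrt(new Real(9.0), new Real(1.0), 10);
          System.out.println(((Printable) root).print());  // approximately 3.0
      }
  }

The sqrt routine only needs the Arithmetic half; printing the result is a
separate concern, reached through a separate interface.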
So perhaps mainstream languages suffer from this issue at their very core.
There's often more than one implicit interface, where the language just
offers one; and languages may not support polymorphic access to otherwise
common interfaces on different concrete datatypes.
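To push the sketch above one step further (still entirely hypothetical, and
reusing its Arithmetic and Printable interfaces and its sqrt routine):
nothing stops a second concrete number type from sitting behind the same
interfaces, at which point the square root routine works on it unchanged.

  import java.math.BigInteger;

  // A second concrete number type behind the same hypothetical
  // interfaces. Newton's iteration never leaves the rationals, so the
  // sqrt routine above works on it as-is.
  class Rational implements Arithmetic, Printable {
      private final BigInteger num, den;

      Rational(BigInteger num, BigInteger den) {
          BigInteger g = num.gcd(den);   // keep fractions reduced
          this.num = num.divide(g);
          this.den = den.divide(g);
      }
      Rational(long num, long den) {
          this(BigInteger.valueOf(num), BigInteger.valueOf(den));
      }

      // Same sketch-level assumption as before: both operands share a
      // concrete type.
      private static Rational r(Arithmetic a) { return (Rational) a; }

      public Arithmetic add(Arithmetic o) {
          return new Rational(num.multiply(r(o).den).add(r(o).num.multiply(den)),
                              den.multiply(r(o).den));
      }
      public Arithmetic subtract(Arithmetic o) {
          return new Rational(num.multiply(r(o).den).subtract(r(o).num.multiply(den)),
                              den.multiply(r(o).den));
      }
      public Arithmetic multiply(Arithmetic o) {
          return new Rational(num.multiply(r(o).num), den.multiply(r(o).den));
      }
      public Arithmetic divide(Arithmetic o) {
          return new Rational(num.multiply(r(o).den), den.multiply(r(o).num));
      }
      public Arithmetic halve() {
          return new Rational(num, den.multiply(BigInteger.valueOf(2)));
      }
      public String print() { return num + "/" + den; }
  }

Feeding new Rational(9, 1) and an initial guess of new Rational(1, 1) through
the same sqrt yields successively better rational approximations to 3, and
sqrt never knows or cares which concrete type it's handling - which is
roughly the kind of polymorphic access I'm saying mainstream languages don't
give you for their built-in datatypes.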
> There's no one boundary; it's a whole landscape of possibilities,
> starting mundane and getting more and more powerful and sophisticated
> and exotic.
I agree. But, although multiple interfaces are particularly applicable to
more complex systems, part of what I'm saying is that the lack of explicit
support for them in languages tends to lead to imprecise thinking about
designs, and may negatively impact the designs even of systems that might
not appear at first glance to need such techniques.
In designing, decomposing things to the max is almost never a bad thing - at
least identify distinctions between entities, even if you can't or don't
implement those distinctions explicitly. But when languages don't allow a
certain type of distinction to be made, it can have negative effects on
programs, designs, and the thinking of designers and programmers.
Have I managed to overstate the case yet? Someone once told me you can't
win arguments without exaggerating... ;)
Anton