
how is polymorphism used?



I'm curious about your experience with polymorphism and dynamic
typing.  In your experience, how much of the code you write falls
under each of these categories:

1. The code is written as a general purpose utility module that is
used with many different types.  Something like a tree structure in
ML that requires polymorphic operators or higher-order functions to
do its work.  In this case, how many unique types are involved?  In
other words, if you wrote non-polymorphic versions of the same code,
how many versions would you have to write?
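(As a concrete illustration of this category: here is a minimal sketch
in Python rather than ML, using `typing.Generic` and a higher-order
`tree_map`, with all names invented for the example.)

```python
from dataclasses import dataclass
from typing import Callable, Generic, Optional, TypeVar

A = TypeVar("A")
B = TypeVar("B")

@dataclass
class Node(Generic[A]):
    """A binary tree node holding values of some element type A."""
    value: A
    left: "Optional[Node[A]]" = None
    right: "Optional[Node[A]]" = None

def tree_map(f: Callable[[A], B], t: Optional[Node[A]]) -> Optional[Node[B]]:
    """Apply f to every value in the tree, preserving its shape."""
    if t is None:
        return None
    return Node(f(t.value), tree_map(f, t.left), tree_map(f, t.right))

# One definition serves many element types; without polymorphism you
# would need a separate tree_map per type actually used:
ints = Node(1, Node(2), Node(3))
doubled = tree_map(lambda x: x * 2, ints)   # a Node[int] tree: 2, 4, 6
labels = tree_map(str, ints)                # a Node[str] tree: "1", "2", "3"
```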

2. The code is written polymorphically, but is really only used in a
single way.  This is basically a special case of #1, but an
important one.  How often do we work to be generic but not realize
any benefits?

3. The code works with different types, but is just generic, not
polymorphic.  A queue would fall under this category.  It might work
on any type, but it doesn't do anything different when used with
different types.
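(To make the queue example concrete, a minimal sketch in Python —
the container parameterizes over its element type but never calls
anything on the elements themselves; the class name is invented for
the example.)

```python
from collections import deque
from typing import Deque, Generic, TypeVar

T = TypeVar("T")

class Queue(Generic[T]):
    """Generic but not behaviorally polymorphic: it stores values of
    any one type, yet does nothing different depending on that type."""
    def __init__(self) -> None:
        self._items: Deque[T] = deque()

    def enqueue(self, item: T) -> None:
        self._items.append(item)

    def dequeue(self) -> T:
        return self._items.popleft()

q: Queue[str] = Queue()
q.enqueue("a")
q.enqueue("b")
first = q.dequeue()   # "a" — FIFO order, regardless of element type
```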

4. The code is written with specific types in mind and will only
work with those types.
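(And for contrast, a sketch of this last category: code tied to one
concrete domain type, where no other type could meaningfully be
substituted.  The `Invoice` type and field names are hypothetical.)

```python
from dataclasses import dataclass

@dataclass
class Invoice:          # hypothetical domain type
    subtotal: float
    tax_rate: float

def invoice_total(inv: Invoice) -> float:
    """Written with one specific type in mind; passing anything other
    than an Invoice would be a mistake, not a feature."""
    return inv.subtotal * (1.0 + inv.tax_rate)

total = invoice_total(Invoice(subtotal=100.0, tax_rate=0.05))
```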

These aren't well-defined categories, but I'd just like to get a
better feel for how polymorphic most code really is.  In dynamically
typed languages, are we usually just making things more convenient
most of the time (not having to specify types) or does a significant
amount of code actually take advantage of the ability to pass
arbitrary types around?

My first guess would be that in most (not all) applications there
are a few generic utility classes/modules/functions and the rest of
the code is pretty tied to certain types, even if the language
doesn't force this.  In other words, you could rewrite most of the
code in a statically typed language without really changing much.

What does your experience suggest?

- Russ