
Re: learning languages [Was: Re: Y Store now C++]

On Fri, Mar 14, 2003 at 10:34:14AM -0600, Trevis Rothwell wrote:
> Based on your experience, how hard is it, really, for a programmer
> to learn a new language?

In my experience, that's a question that cannot be answered in the
large.  Each segment of the bell curve will have a different average.

Hackers, such as those who use Common Lisp to define mini-languages,
seem to define and learn new languages as a standard part of problem
solving.  People on the opposite end of the bell curve learned one
language with great difficulty, and it would be very difficult for
them to learn a second language.  I'd also assert that many of
these people haven't "mastered" a programming language so much as
they've learned to become functional with one.

The difference between professionals and amateurs in programming
is much like the difference between carpenters and homeowners when
it comes to hammering a nail into a wall.  A carpenter can nail
something into a stud with two or three good, strong thwaps.  A
homeowner will tend to grip the hammer in the wrong spot, take a
dozen or so middling little taps and probably hit his thumb once
or twice.

At the end of the day, if you just need to hang a picture hook, it
doesn't really matter all that much (unless you're the one who's
hammering your thumb over and over again).  But for something more
complex, like fastening a wooden crate or building a house, the
differences are much more meaningful.

Sadly, managers tend to optimize their systems for a large number of
homeowners instead of hiring a couple of carpenters.  Those who do hire
carpenters tend not to care, because the quality of the work speaks for
itself.
> It seems to me that there is a lot of "conceptual crossover", and
> once you understand the underlying ideas (polymorphism, closures,
> whatever) learning a new *language* is fairly trivial in
> comparison.

The effort required to learn your (n+1)st language is proportional to
the effort required to learn your nth language: roughly
effort(n+1) = 0.75 * effort(n), or so I'd expect.
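As a back-of-the-envelope sketch (the 0.75 factor is my guess, not a
measurement), that rule means learning effort decays geometrically with
each additional language:

```python
# Sketch of the "each new language takes ~75% of the effort of the
# previous one" guess.  The first-language effort of 100 is an
# arbitrary unit, not a claim about real hours.
def learning_effort(n, first=100.0, factor=0.75):
    """Relative effort to learn the n-th language (1-indexed)."""
    return first * factor ** (n - 1)

for n in range(1, 6):
    print(f"language {n}: effort ~ {learning_effort(n):.1f}")
```

By the fifth language, the effort has fallen to under a third of the
first, which matches the intuition that polyglots pick up new languages
almost casually.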

In the modern era of computational monocultures, the need to learn a
dozen or so languages has faded.  As a result, the trend is for the
average programmer to still expend a considerable amount of effort to
learn a new language.

As recently as a decade ago, it was still common for a "programmer"
to start a new job using a new computing environment.  At school,
we used mostly Macs and SunOS boxes.  One job I had in school was
working on a system running on Primos written in a mixture of
FORTRAN IV, Fortran 77, Pick, DCL and I forget what else.  That
company hired entry-level programmers (still in school) with the
expectation that a good programmer can learn a new language in a few
days or weeks.  (At the time, almost all of the new hires had been
working exclusively in Pascal.)  That company had about five years
of experience proving that assertion correct.  Indeed, it was par
for the course through the 1980s and only stopped being true in the
mid-to-late 1990s.

Today, the IT world has standardized on two primary platforms:
Windows and *NIX.  That has reduced the pressure to learn new
languages and pick up new skills, and increased the cost for average
programmers to design/learn/build their n+1st language.


PS: Rob Pike presented a similar perspective in 2000: