
Re: Industry versus academia



Michael Vanier wrote:

>This is a good point.  I have met some hackers who are very talented but
>who can't imagine programming in anything but C.  It's just "real
>programming" to them, and any language that doesn't offer fine-grained
>bit-level control over the machine is simply bogus in their eyes. 
>
Yes, that's a really good point.  And here's where I think academia can make a big difference, the caveat being that the difference takes a long time to show up: by teaching students about cool new technology, you eventually end up with people in industry (who used to be students) who are familiar with those cool ideas (which by then are not so new, since time has passed).

For example, when Gosling and company came out with Java, a lot of people took a look at its automatic freeing of memory (garbage collection) and, instead of recoiling and saying "no can do, GC is far too weird and slow", they accepted it.  I think that would not have happened twenty years earlier, and probably not ten years earlier.  Good new ideas do make it out there, but it's a painfully slow process.
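
To make "automatic freeing of memory" concrete, here is a tiny, purely illustrative sketch in Python (the Node class and the count are made up); the point is just that you allocate and drop references and the runtime reclaims the garbage, where in C you would have to free every node yourself:

    class Node:
        def __init__(self, payload):
            self.payload = payload
            self.next = None

    def build_and_discard(n):
        head = None
        for i in range(n):
            node = Node(i)
            node.next = head
            head = node
        # When this returns, nothing references the list any more,
        # so the collector is free to reclaim all n nodes.

    build_and_discard(100000)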

Of course the fact that adoption is slow isn't all bad.  It did take a while for GCs to get as good as they are now.  Many, many years of hard work went into developing better and better collectors.  Lots of GCs were implemented and tested in very real circumstances, then new ones were tried out and tested the same way, and all of that took a great deal of time, effort, and cleverness.  Finally we had really studly GCs that performed quite well, and at that point people were willing to say, OK, we take this seriously.

Same for JIT compilers.  The first one I knew about, and as far as I know the first one that fits more or less the current concept of a JIT compiler, was L. Peter Deutsch's dynamic translator for Smalltalk, done at Xerox PARC in the early 1980s.  He published it in the open literature (the Deutsch-Schiffman paper on efficient Smalltalk-80 implementation, at POPL I believe), and eventually other people tried the technique, some well and some not so well.  With the widespread use and implementation of Java lately there has been a lot more work on JITs, and finally these things are getting pretty darn good.
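
The core idea, roughly, is to translate a piece of the program into a faster form the first time it runs, cache the result, and reuse it on later calls.  Here is a toy, purely illustrative sketch in Python; the names are made up, and the "compilation" step builds a Python function where a real JIT would emit machine code:

    _compiled_cache = {}

    def jit_eval(expr, x):
        # Translate the expression the first time we see it, then
        # reuse the compiled form on every later call.
        fn = _compiled_cache.get(expr)
        if fn is None:
            fn = eval("lambda x: " + expr)   # stand-in for real code generation
            _compiled_cache[expr] = fn
        return fn(x)

    print(jit_eval("x * x + 1", 10))   # first call: translate, then run -> 101
    print(jit_eval("x * x + 1", 20))   # later call: reuse the cached version -> 401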

Other good ideas are being refined even as we speak.  I learned at the two LL workshops about cool, innovative, high-performance implementation techniques for language features that were once considered innovative in the academic world and are now pretty mature by academic standards, but still ahead of their time by industry standards; full continuations, for example.  Someday there will be enough people out there who grok and appreciate continuations that, the next time the planets line up right for a new language to emerge, maybe one of those people will be one of the cooks standing around the pot and will pour in the right continuation magic.  It's very hard to say where or when that might happen, but the chances are higher that it will happen, and happen sooner, if more people are exposed to those ideas.  At MIT, for example, every computer science major gets a walloping dose of continuations as part of the required curriculum.
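
For anyone who hasn't run into them: a continuation packages up "the rest of the computation" as a value you can hold onto and invoke.  Scheme exposes that directly through call/cc; the closest I can get in a short, purely illustrative Python sketch (all the names here are made up) is continuation-passing style, where every function takes an explicit "what to do next" argument:

    def square_cps(x, k):
        k(x * x)          # instead of returning, hand the result to k

    def add_cps(a, b, k):
        k(a + b)

    def pythagoras_cps(a, b, k):
        # The nested lambdas are the continuations: each one says what
        # the rest of the computation does with the value it receives.
        square_cps(a, lambda a2:
            square_cps(b, lambda b2:
                add_cps(a2, b2, k)))

    pythagoras_cps(3, 4, print)   # prints 25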

>Another point occurred to me: people in industry often have to learn a lot
>of stuff, but it's stuff that's directly related to their jobs (particular
>development environments, tools, etc.).  Generally, they are required to
>learn these things as part of their jobs, since decisions about what
>technologies to use are made by the project manager.  Perhaps they're
>sufficiently busy learning these technologies that they have no time to
>learn less obviously practical skills (like new programming languages).  To
>an outside observer, this might appear to be a lack of interest in learning
>(say) new languages, when it's really just a lack of time.
>
Absolutely.  As a working programmer/architect guy, I am learning new stuff all the time, but there is so much to learn that I have to pretty much limit myself to the stuff I absolutely need to know.  Well, maybe I reserve 10% for stuff that I just think is cool and might want to know someday, but not much more than that.  And I try to avoid learning specific technologies that might well be obsolete before I ever need to actually use them.  But learning new technologies takes very real time, and time is at a premium.

-- Dan
