RE: Industry versus academia
Eli Collins wrote:
> Buffer overflow bugs are certainly an important issue, but if you
> believe that there's a 100x factor in productivity between your
> programmers (if you believe _The Mythical Man Month_) then buffer
> overflow is probably a secondary issue.
Buffer overflows were just a single example with some fairly obvious
consequences. To that, you can add things like broken type systems,
inability to enforce abstraction boundaries, a resulting inability to raise
the abstraction level significantly, etc. In the end, a 100x productivity
factor *due to the tools* is probably way conservative, once you take into
account debugging and long-term maintenance costs - it's just that the costs
in question have been hidden by blindness.
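To make the hidden cost concrete, here's a minimal sketch (my own hypothetical example, not from the original exchange): in C, an out-of-bounds write silently scribbles over adjacent memory, and the cost surfaces later as debugging and security cleanup; a bounds-checked language turns the same mistake into an immediate, diagnosable error at the exact point of the bug.

```java
public class BoundsCheckDemo {
    public static void main(String[] args) {
        int[] buf = new int[8];
        try {
            // In C, the equivalent of buf[8] = 42 would silently write past
            // the end of the array -- the classic buffer overflow. Java
            // checks every index and fails loudly at the point of the mistake.
            buf[8] = 42;
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("caught out-of-bounds write: " + e.getMessage());
        }
    }
}
```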
> With static analysis tools like SPARK and non-executable
> stacks (recently added to OpenBSD) problems like buffer overflow are
> becoming _less_ (still important!) of an issue for application developers
So to paraphrase, tools that patch the easiest and most obvious of these
flaws are now available, and are even starting to infect operating systems.
Forgive me if I don't consider this real progress. The costs have already
been paid, are still being paid, and at best, these tools will only reduce
them going forward.
> meanwhile the problems Brooks pointed out long ago are still here.
Yes, but this is a little like the argument "don't go to space, there are
people starving in Africa", (although I should probably have picked a less
arguable Good Thing than space exploration). If there are problems that are
truly hard to solve, or unsolvable, that doesn't mean we shouldn't address
ones that are more easily solvable.
> I agree with PG on java (http://www.paulgraham.com/javacover.html).
> From my experience, Java adds enough expense and complexity to
> compensate for its features. You can't get a buffer overflow but
> the Oracle driver makes the JVM core dump every 2 weeks. I'll get
> flamed for this but I think Ada95 is a better alternative to C++.
> I'm certainly glad the Boeing 777 flight control is not coded in
> Java or C++.
I won't argue with any of that - but my point about Java is that it is
demonstrably better than C++ for a large class of business applications.
Some people used to write accounting systems in C++ (probably still do).
The people now writing them in Java are getting further quicker. That's not
to say there aren't better languages for the job, but if you're making the
safe mainstream choice for business reasons, Java is an improvement over C++
for that large class of applications. As such, it's an example of "how
important language/tool issues are to businesses", which was the question at hand.
> No question that academic advances help, I just think we're at a point
> where tools are no longer the #1 issue.
Sorry, but I think that's shortsighted. First, I don't think it's about #1
issues - there are very few truly #1 issues. More importantly, mainstream
tools are currently at a local maximum, at best (or minimum, for the less
generous). The tools appear good enough, and are rationalized as such,
because real fixes to the problems require too great a change to the status
quo. So when it comes to the *overall* efficiency of software development,
including long-term cost of ownership/maintenance, tools are probably in the
top 3, if you want to characterize it in those terms. But, to make a big
enough difference, existing tools need more than incremental changes. Which
means it's not obvious how to get there from here. So a valid point I'd
take from what you're saying is that the ROI on improving tools is
uncertain - which is undeniable. That doesn't mean it shouldn't be pursued.
> BTW Lisp solved these problems almost half a century earlier.
You mean the first versions on the IBM 704? :) Lisp raised the bar to a
level that the mainstream languages still can't match, but it has hardly
"solved" all of these problems. For example, it hasn't solved issues of
integrating many of its features in a statically type-checked context, which
is one of the major issues Java faces. (And I don't consider throwing out
static checks to be a good enough answer.)
> Java generics are a great example of how
> long it takes to get a feature from academia to industry; not only that,
> they are pretty poor. Try overloading a generic method: oops, you can't,
> because it's mostly syntactic rewriting to avoid the nasty casting issues.
> I'll move on, the why many smart people don't like java thread is too
> often repeated.
I agree with all of this. But despite the current "Pinky and the
Brain"-style take-over-the-language-world tone of other messages in this
thread (no offense to anyone, it's fun :), none of our favorite "advanced
languages" is about to take over the world. So what we're left with, most
of the time, is the usual glacially slow migration of features from
academia, and other sources, e.g. the lightweight/scripting languages. But
this migration also has a beneficial impact on businesses, which is what I
was trying to point out, regardless of the disadvantages of the languages involved.
BTW, the "nasty casting issues" in Java are a result of its type system - an
area where the academic languages do much better. This is a big example of
a "feature" that would be important to businesses, if it could be migrated.
And it's a pretty important one! Unfortunately, grafted solutions aren't
likely to fully solve the problem.
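To make both complaints concrete, a hedged sketch (mine, not from the thread): pre-generics Java collections force casts that only fail at runtime, and the generics grafted on later are erased, so `List<String>` and `List<Integer>` share one runtime class — which is exactly why overloading a method on those two types is rejected at compile time.

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    @SuppressWarnings({"rawtypes", "unchecked"})
    public static void main(String[] args) {
        // Pre-generics style: a raw List accepts anything...
        List raw = new ArrayList();
        raw.add(Integer.valueOf(42));
        try {
            // ...and the obligatory cast blows up only at runtime.
            String s = (String) raw.get(0);
        } catch (ClassCastException e) {
            System.out.println("caught ClassCastException");
        }

        // Erasure: the type parameters vanish after compilation, so both
        // lists share a single runtime class. This is also why
        //   void f(List<String> l)  and  void f(List<Integer> l)
        // cannot coexist as overloads: they erase to the same signature.
        System.out.println(new ArrayList<String>().getClass()
                           == new ArrayList<Integer>().getClass());  // true
    }
}
```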
> Notice that most langs discussed at LL1 were dynamically typed!
That's all but inherent in the category description of "lightweight"
languages, given most existing type systems.
> The java/tools buy and develop our way out of the problem perspective
> seems to be the status quo from my POV.
It is the current status quo. But I raised Java as an example of how the
status quo has changed between C/C++ and Java, due to language/tool issues.
You can't simply dismiss Java if you're asking questions about the impact of
tools on businesses.
> I don't accept that most programmers have to be mediocre.
> Just out of curiosity, do most people on this list accept
> as fact that most programmers have to be mediocre?
I don't want to get bogged down in terminology - I was working off Matt
Curtin's definition, which I agreed with:
"I did, however, write about mediocre programmers, and I do not apologize
for my comments. The fact is that people engaged in any endeavor are, on
average, mediocre. (As far as I'm concerned, the real
problem is not mediocrity, per se, but the level of incompetence at which
the present mediocrity presently rests. People following the traditional
career path now present in software simply do not have
enough time to get good enough to overcome this.)"
If you want to phrase that even less contentiously, change "mediocre" to
"average", i.e. average programmers are, well, average; and the current
average is below where it should/ought to/needs to be, or at least below
where some of us wish it were.
[I should note that a lot depends on the circles you move in - for example,
programmers in scientific companies and software companies are not the same
as those in financial services companies, which are not the same as
manufacturing companies, etc. In larger companies, the quality of talent
can vary wildly by department.]
Anyway, assuming you accept the complaint, what is it that keeps the average
programmer from doing better? The Sapir-Whorf hypothesis is an old favorite
in programming language discussions, and it applies here - language dictates,
or at least influences and constrains, thought. We have large numbers of
programmers who think in Java, VB, etc. That's a problem! If you want
programmers to be less mediocre, you have to give them a language that they
can think less mediocre thoughts in!
At the very least, good languages encourage thinking about problems in
better ways - even if not everyone can be a superstar programmer, at least
let the tools guide them in the right directions.
The only people really capable of rising above the limitations of the
mainstream languages are those who've been educated in other languages.
It's as though we did all of our business in a pidgin language, but
communicated with friends and family in a more erudite and sophisticated
dialect. Now imagine that some of our business partners only speak pidgin.
Isn't that a problem?
Actually, this strategy worked well as a tool of colonial control for the
colonizers of the Polynesian islands. If we want to control programmers,
keep them in their place, and restrict and limit them, we can do it by
forcing them to use pidgin languages, and ideally discourage them from
learning other languages.
I don't seriously mean to make this political, but I do mean to be
provocative, because I think there's plenty to be provocative about. We
can't give up on improved tools just because the tools we're "allowed" to
use are so bad and dead-ended that the situation seems hopeless. You may
not agree with that characterization, but that's why I'm writing this.
> I think these issues are the hardest to address, which is why they're
> still here! If these problems (productivity/code quality/correctness/etc)
> are so easy to address then why are they still here? Effective management
> and group work are certainly as hard and as complicated as PL tool issues.
Maybe, but until the first AIs are produced, we're unlikely to be able to
automate effective management; and although group work can benefit from some
kind of automation, and has been, it's a fundamentally human activity. I
don't really understand your point, and don't see how it connects to the
question of trying to improve the mainstream toolset.
> We might be talking about two sides of the same coin, for
> example when the SQL Server worm caused some of MSs servers to fail you
> could blame it on (a) the C codebase or (b) the MS development
> organization (why don't they autoupdate critical patches?). Either is
> valid in my opinion.
Explanation (a) points at a root cause which, if fixed, would have taken the
problem out of the domain of human responsibility, and thus solved it forever.
"Take the humans out of the loop." :) Unless we want to create a society of
unionized sysadmins who work 24x7 keeping systems patched and in sync.
BTW, MS autoupdating critical patches (for its customers) is not the path to
robustness, trust me! I envision Bart Simpson writing this on the
blackboard: "New service packs crash servers. New service packs crash
servers. New service packs..."
> I think its time to focus on some other issues which might
> get more bang for the buck.
Like what exactly? Are you talking about other kinds of software that needs
to be developed, or should we just give up on software and focus on human
resources? Might not be a bad idea...
> > But another goal is helping code which uses high-level abstractions to
> > perform well. It's the high-level abstraction features, and
> > the ability to use them without paying an unacceptable performance
> > cost, that's the real
> Funny you should mention this. In my graduate compilers course our
> semester long group project is to write a Python compiler which generates
> x86 machine code. I couldn't agree more here!
So, aren't you being a little schizophrenic? (Join the club.) If we need
faster/better tools that support high-level abstractions, shouldn't we be
working on them?
In developing your Python compiler, have you noticed places where the design
of the language led to unavoidable performance issues, which could be
improved by a different design? Dynamic dispatch is an interesting topic in
this area - arguably, doing dynamic dispatch all the time is admitting the
failure of your type system.
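For what it's worth, here's a minimal sketch (a hypothetical example of mine, in Java for concreteness) of what "dynamic dispatch all the time" means: the call site can't know which method body runs until runtime, so a compiler without static type information must emit a lookup on every call - which is roughly what a fully dynamic language pays on every method call and attribute access.

```java
public class DispatchDemo {
    static abstract class Shape { abstract double area(); }

    static class Square extends Shape {
        final double side;
        Square(double side) { this.side = side; }
        double area() { return side * side; }
    }

    static class Circle extends Shape {
        final double r;
        Circle(double r) { this.r = r; }
        double area() { return Math.PI * r * r; }
    }

    public static void main(String[] args) {
        // The static type of each element is just Shape; which area() body
        // runs is decided per object, at runtime, via a vtable lookup.
        Shape[] shapes = { new Square(2), new Circle(1) };
        for (Shape s : shapes) {
            System.out.println(s.area());
        }
        // A compiler with precise static types can often devirtualize such
        // calls; a fully dynamic language must do the moral equivalent of
        // this lookup everywhere, all the time.
    }
}
```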
> I'm all for academic advances in industry. I would love to see ML replace
> Java/C# as the language your pointy headed boss demands.
I don't actually think that's the answer. There's work to do on both sides.
I think that's what this list is supposed to be about. :)