
Re: Closures



Jeff Dalton wrote:

>
>
> What if I have
>
>   def f(x):
>      print z  # is z local or global?
>      z = 2
>      print z  # presumably z is local here at least
>

Both are local, but at the first print z is still unbound, so you get an error at
runtime (an UnboundLocalError in recent interpreters).

Clearly the compiler could get this...
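
For the record, here is a minimal sketch of what that function does in a recent
interpreter (the call f(1) is mine, added just to trigger the error):

    def f(x):
        print z   # z is compiled as a local of f, but has no value yet here
        z = 2
        print z

    f(1)
    # -> UnboundLocalError: local variable 'z' referenced before assignment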

>
> > if we put def f(...) in:
> >   def g():
> >      z = 3
> >      def f(x):
> >        ....
> > then should z = 2 in f introduce a new local variable or rebind the
> > z in in g?
>
> Ordinarily, I'd expect it to be the same as if you wrote
>
>    z = 3
>    ...
>    z = 2
>
> both in g.  That is, both assignments would change the value of
> the same variable.
>

Yes and no. Keep in mind that closures were added to Python only recently; before
that, the code above simply produced one z local to f and another z local to g.

The rule that looks un-ordinary to a Lisper avoids breaking even more code with
that change, and it avoids the need for new syntax to introduce locals.
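
A minimal sketch of the current behaviour, reusing g, f and z from the quoted
code: the assignment inside f introduces a new local and leaves g's z alone.

    def g():
        z = 3
        def f(x):
            z = 2       # new local z inside f; g's binding is untouched
            return z
        print f(0)      # prints 2
        print z         # still prints 3

    g()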


> [Java case]
>
> Moreover, they didn't invent some "weird" different semantics,
> just said the variables had to be "final".

The "weird" part (though that's a matter of taste) was pre-existing, namely the
absence of explicit declarations for locals.
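
In other words, nothing in Python ever declares a local explicitly: assigning to a
name is what makes it local, and the only declaration-like statement is "global".
A small sketch (the name counter is just for illustration):

    counter = 0

    def bump_local():
        counter = 1       # assignment silently makes counter a local here

    def bump_global():
        global counter    # the one explicit "declaration" Python offers
        counter = 1

    bump_local()
    print counter         # still 0
    bump_global()
    print counter         # now 1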


> > I think on the tradition of C and C++ Java is
> > a bit scared about not explicit performance costs (Lisp absolutely not [1])
>
> Though I don't know exactly what in Steele and Gabriel's history you
> have in mind,  I don't think that can quite be true.
>

Here's the passage:
"This situation [performance difference between Interlisp-D and -10] of unexpected
performance is prevalent with Lisp. One can argue that programmers produce
efficient code in a language only when they understand the implementation. With C,
the implementation is straightforward because C operations are in close
correspondence to the machine operations on a Von Neumann architecture computer.
With Lisp, the implementation is not straightforward, but depends on a complex set
of implementation techniques and choices. A programmer would need to be familiar
not only with the techniques selected, but the performance ramifications of using
those techniques. It is little wonder that good Lisp programmers are harder to
find than good C programmers."


> but I think there was also (for various reasons) the view
> that the shared-assignment semantics was simply the right semantics
> in a language that had assignment and nested procedure definitions.
>

It has a kind of minimality (with respect to exceptions and special-casing). But
the Python status quo provided a rather peculiar context of its own.


>
> > From this point of view it's also more clear that the space of possibilities
> > (what is needed vs. nice, what is not )is a bit larger when you already have
> > both objects and first-class functions.
>
> But consider Dylan.  It was always going to have both objects and
> first-class functions.  Yet I doubt the Dylan designers ever thought
> "since we have objects, we should have some other semantics for
> assignment in our first-class functions."

Starting from CLOS and Scheme it was indeed the right thing. My point was simply
that you can then choose without affecting expressiveness too much, but other
factors can come into play.

>
> Or - going in another direction - consider T (and OakLisp) where they
> try to stick close to the model of first-class functions as objects.
> If a language is going to have both first-class functions and objects,
> why not try to unify them?

Because unless one has a Lisper's or a hardcore-CS mindset, functions and objects
feel like different things. In some programming cultural traditions, on the other
hand, this can make perfect sense.
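
For what it's worth, Python already half-unifies the two notions: functions are
ordinary objects, and any instance with a __call__ method can be used like a
function. A small sketch (the names Adder and add3 are made up):

    class Adder:
        def __init__(self, n):
            self.n = n
        def __call__(self, x):      # instances become callable, function-like
            return x + self.n

    add3 = Adder(3)
    print add3(4)                   # prints 7, just like a function call

    def plain(x):
        return x + 3
    plain.note = "functions are objects too"   # functions carry attributes
    print plain.note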