Re: pretty-lambdas
At 11:18 AM -0500 12/4/02, Sundar Narasimhan wrote:
> At 9:46 AM -0500 12/4/02, Sundar Narasimhan wrote:
> > ^ (x y) (+ x y)
> >So now I wonder idly.. how many other languages can do something like
> >this (given this discussion started out re. macros after all :) --
> >i.e. modify the base syntax for introducing something so basic.
> >
> >Python, Perl, Ruby, Smalltalk? Perhaps?
> >
>
> I don't know how many languages let you modify the syntax
> at the level of the lexer.
>
> I'm not convinced that this is actually a good idea, as it
> happens, but it is undeniably useful to be able to define
> a special "read table" for reading custom data files. One
> could reasonably argue that this is properly the domain of
> lexing and parsing libraries, without having to argue as well
> that macros are a poor idea.
>
> Common Lisp also has these things called "symbol macros"
> where a random symbol in any old position can be magically
> macro-expanded. I'm pretty sure this is a bad idea. The
> problem with both of these things is that they can cause
> the "deep" syntax of the language to silently change with
> no cue to the reader that something magic is happening.
> This is not quite as true of ordinary Lisp macros. (I
> say "not quite" because ordinary macros quite often change
> evaluation semantics.)
I agree with all you say below, but...
- Being able to hack Lisp's read table isn't a particularly
good way to do most of what you want. It's true that a
wizard could do it, but I would argue that there are better
ways to do it than hacking 'read' by tweaking read tables.
- Common Lisp's symbol macros don't address this.
- Really, Lisp macros don't do what you want, either. You're
still stuck with the "extra" "fixed structure" that you
find to be a hindrance for your average students. That is,
Lisp-like macro systems still require that you take in
and spit out things that, at a deep level, are Lisp (that
is, obey the basic s-expression syntax that accounts for
(, ), dot, whitespace, symbols, etc).
I think that what you are looking for is not necessarily addressed
by instruments such as macros, but I've thought about this for
all of 3 minutes.
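For concreteness, the quoted `^ (x y) (+ x y)` notation is roughly what a Common Lisp read-table hack looks like. This is a sketch, not a hardened implementation (no error handling, and it clobbers the current readtable rather than copying it):

```lisp
;; Sketch: make #\^ a macro character that reads a lambda-list and a
;; body form, and yields an ordinary LAMBDA expression.
(set-macro-character #\^
  (lambda (stream char)
    (declare (ignore char))
    (let ((args (read stream t nil t))    ; e.g. (x y)
          (body (read stream t nil t)))   ; e.g. (+ x y)
      `(lambda ,args ,body))))

;; After this, the reader turns
;;   ^ (x y) (+ x y)
;; into
;;   (lambda (x y) (+ x y))
;; so, for example:
;;   (funcall ^ (x y) (+ x y) 1 2)
;; evaluates to 3.
```

Note that this is exactly the kind of silent "deep syntax" change at issue: nothing at the use site tells a reader of the code that `^` is special.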
>Actually -- I tend to shy away from "good" vs. "bad" judgements when
>such facilities are concerned since these, in my mind at least, are
>in the gray area of "trade-offs".
"Good" = "benefit to risk ratio is high"
"Bad" = "benefit to risk ratio is low"
In my experience, reader macros are a close call, but if I were
implementing a new Lisp, I would leave them in. Symbol macros
have a very low benefit to risk ratio, in my experience, so I
would leave them out.
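To make the symbol-macro risk concrete, here is a minimal Common Lisp sketch (the names are purely illustrative):

```lisp
;; A symbol macro makes a bare symbol expand at macroexpansion time.
(defvar *entries* '(1 2 3))

(define-symbol-macro current-total (reduce #'+ *entries*))

;; Anywhere CURRENT-TOTAL appears as a variable reference, it is
;; silently rewritten to (reduce #'+ *entries*):
current-total               ; evaluates to 6
(setq *entries* '(10 20))
current-total               ; now evaluates to 30
```

Nothing at the use site distinguishes `current-total` from an ordinary variable, which is the "no cue to the reader that something magic is happening" problem from the quoted text.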
>Just because programmers design languages does NOT mean that the
>syntax should necessarily be governed by them. I'll illustrate what I
>mean with a slight digression.
>
>If you look at the entire "business rules" segment -- look at Ilog
>JRules for a popular example, you will see that its key successes
>have come from the fact that it lets you attach "pseudo-english"-like
>descriptions to a business-object model, that end up controlling a
>rete-engine written in Java underneath. It uses reflection heavily --
>but makes a completely different argument about separation of levels
>than people have made on this group about using reflection
>facilities. I could say the same about Matlab/Mathematica, AutoCad and
>other niche products.
>
>I'll give you another example -- whenever I attempt to teach
>programming to kids or novices -- I find I have a *huge* conceptual
>hurdle to jump. They just don't understand why one has to "bother"
>with all this extra (, {, $, # stuff. I've tried Logo and other so
>called "educational" languages, but after a few years bashing my head
>against my average student (note I don't worry about the stars here
>:), I'm more than convinced that it is the "fixed structure" that
>machines expect (that is sometimes oh so drastically different from
>what kids are taught in their other subjects and walks of life) that
>is at the root of the problem. Making things more "flexible", so that
>different syntaxes can be supported .. could potentially impact
>this.. because I can see how perhaps I can write programs that start
>with "minimal structure to do simple things", and then gradually ramp
>up to more complicated things. Right now we have a situation where
>.. because it's an "all or nothing" game, we tend to lose 70% of the
>class right at the beginning (of course they all then go on to become
>the pointy headed bosses of the others that do master the
>syntax/structure :)
>
>So.. I think there *are* benefits to putting in facilities at the
>language level to allow such drastic redefinitions of syntax. i.e. the
>reader really doesn't want to see the machine-level, or programming
>language level semantics sometimes.
Just from watching the problems of doing a good macro system in
Dylan, I would have to say that the practical problems of allowing
"drastic redefinitions" of syntax in the language are pretty daunting.
Or in a Japanese-like postfix syntax, "Problems the macro system
good Dylan in watching regarding, problems practical syntax's
'redefinitions drastic' of language in regarding pretty daunting are."
I would be interested to see a parser that can be extended, on the
fly, from handling the first syntax to additionally handling the
second one. (That's not meant to be sarcastic -- I really would
be interested in seeing how drastic a change in syntax can be.)
>I think we programmers automatically make the leap from that -- "if
>you can't see what the program counter might do here, how the heck am
>I going to debug it".. Exactly the right question to ask - and tease
>apart where the tool support, language facility, abstraction
>boundaries between runtime/execution/use lie. Doesn't mean it's bad
>.. it just 'appears' bad because some people don't quite know how to
>deal w/ that perhaps :)