
Clear Choices (was: Re: Vectors as functions)



Shriram,

> Unfortunately, this kind of choice is very hard to provide.  DrScheme
> mainly provides a series of syntactic analyses (the constrained
> versions of Scheme have additional checks).  It would be possible to
> exploit these constraints, but because features interact in all sorts
> of subtle ways, it's very difficult to get much meaningful reuse
> without having first exhaustively worked out what will and won't
> interfere.
> 

    I've been grappling with this very issue in my design work on Clear.
    
[The name Clear, chosen to distinguish a multi-paradigm, end-user-
accessible programming language, is subject to change pending a
trademark registration analysis. It is truly frightening how many
trademarks are registered for seemingly every word in the English
language, from animal names and beverages all the way down to obscure
mythological figures and ancient literary references. If anyone here
is using the name Clear, or knows someone who is, and would like me to
make an alternate selection, do let me know as soon as possible!]

    I think the key may be to introduce a new syntactic construct, an
explicit Semantic Block (or Dialect, in linguistic terms), to indicate
which syntactic analysis and which set of potential optimizations to
apply within a given region of code, subject to appropriate
interaction constraints. I think that is where you are going with your
comment that:

> In short, this boils down to a semantic reuse problem.  It's a hard
> problem in that I know very few techniques that are a substitute for
> brute-force human labor.  On the other hand, if you're willing to do
> the hard work manually, it's an easy problem -- just implement the
> language x operation grid:
> 
>                             parser  compiler  optimizer  verifier ...
>   first-order functions
>   + higher-order functions
>   + exceptions
>   + continuations
>         :
>         :
> 

    Have you run across any papers formalizing this grid notion?
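
    In the meantime, here is roughly how I picture such a grid as a
first-class value in PLT Scheme. Every name below is my own invention
for illustration; none of this is actual DrScheme machinery:

    ;; The language x operation grid as an association list mapping
    ;; (language-level . tool-pass) keys to stub implementations.
    (define grid
      (list
       (cons '(first-order  . parser)    (lambda (src) 'parsed-stub))
       (cons '(first-order  . compiler)  (lambda (ast) 'compiled-stub))
       (cons '(higher-order . parser)    (lambda (src) 'parsed-stub))
       (cons '(higher-order . optimizer) (lambda (ast) ast))))

    ;; lookup-pass : symbol symbol -> (union procedure #f)
    ;; Fetch the implementation of a pass for a level, or #f if that
    ;; cell of the grid has not been filled in by hand yet.
    (define (lookup-pass level pass)
      (cond [(assoc (cons level pass) grid) => cdr]
            [else #f]))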
    
> The more general you make your starting point, the harder you would
> have to work to get something usable, because the default version of
> the tool will have to handle the language in all its generality, and
> just recognizing when what you have is an instance of a restricted
> subset is often quite difficult.  If you've tried to write an
> optimizing compiler, or even just a useful program analysis engine,
> you will know exactly what I'm getting at.
> 
> Shriram

    It seems to me that we can reasonably punt this recognition
problem to the programmer by having him or her use the Semantic Block
construct, so that a lower language level can be nested within a
higher one (and vice versa; think of creating inline teachpacks, which
are presumably defined with more advanced features than those
available to their users). A rough sketch appears below.
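
    To make that concrete, here is the shape I have in mind. The
semantic-block form does not exist today; the placeholder macro below
merely lets the sketch load, where a real implementation would switch
the analyses applied to its body:

    ;; Placeholder: expands to its body unchanged. A real version
    ;; would instead select the syntactic analysis for the named
    ;; level.
    (define-syntax semantic-block
      (syntax-rules ()
        [(_ level expr ...) (begin expr ...)]))

    ;; Full language: closures and mutation are available here.
    (define (make-counter)
      (let ([n 0])
        (lambda () (set! n (+ n 1)) n)))

    ;; Hypothetical: this region would be checked as Beginner-level
    ;; code, where only first-order definitions are legal.
    (semantic-block beginner
      (define (area-of-disk r)
        (* 3.14 (* r r))))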

    PLT's module mechanism already goes a fair way in this direction
with its provisions for using alternate languages. Imagine extending
it so that a programmer working at an intermediate level, with lambda
plus mutation, could augment a definition with a "beginner level"
qualifier, allowing the environment to apply a more restrictive
analysis to that chunk of code.
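
    The first module below is legal PLT Scheme today (the language
position names the Beginning Student language that ships with
DrScheme); the second, commented out, shows the imagined at-level
qualifier, which is pure invention on my part:

    ;; Real syntax: this module's body is elaborated and checked
    ;; under the Beginning Student language, not full Scheme.
    (module disk-area (lib "htdp-beginner.ss" "lang")
      (define (area-of-disk r)
        (* 3.14 (* r r))))

    ;; Hypothetical extension: an intermediate-level module marks one
    ;; definition for the more restrictive beginner analysis.
    ;; (module utilities (lib "htdp-intermediate.ss" "lang")
    ;;   (at-level beginner
    ;;     (define (double x) (* 2 x))))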

    This also lets us explicitly indicate a programmer's intended
choice of Style, keeping it orthogonal to the choice of available
language features. For example, the environment would not have to
infer from erroneous source code that a programmer working in the full
language was *intending* to use CPS or tail recursion for a given
function.
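
    A throwaway sketch of what such a declaration might look like;
define/style and the style names are, again, my own invention, and the
placeholder macro simply discards the annotation:

    ;; Placeholder: records nothing and passes the definition through
    ;; unchanged. A smarter environment could verify that the body
    ;; really matches the declared style.
    (define-syntax define/style
      (syntax-rules ()
        [(_ style (name . args) body ...)
         (define (name . args) body ...)]))

    ;; The intent is now explicit rather than inferred from the code.
    (define/style tail-recursive (fact n acc)
      (if (zero? n) acc (fact (- n 1) (* n acc))))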


Warmest Regards,

Peter




_________________________________________________________________

Peter J. Wasilko, Esq.
     J.D., LL.M.               

Executive Director, The Institute for End User Computing, Inc.

Visit us on the web at: http://www.ieuc.org

_________________________________________________________________

It's time to abandon brittle architectures with poorly factored
interfaces, gratuitous complexity, and kludged designs dominated
by sacrifices on the altar of backwards compatibility.

Such artifacts are vulnerable to cyber-attack, weigh down the
economy, costing trillions of dollars in lost productivity, and
suffer from an impoverished conceptual model that lacks the
integration and elegance needed to empower end users to
get the most from advanced applications in the future.

_________________________________________________________________
The Institute for End User Computing --- Pursuing Secure, Simple, 
   Supple, & Sophisticated Systems to Unlock Our Human Potential
_________________________________________________________________

* The Institute is incorporated under New York State's
   Not-For-Profit Corporation Law