Integers, dynamic isomorphisms, etc.
The lame, particularly stupid, but easy to reason about
(and easy to prove that it's lame) implementation for integers is as follows:
data Unary_Integer = Zero | Succ Unary_Integer
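For instance (a sketch; `toInt` and `add` are names of my own choosing), the two basic operations make both the ease of reasoning and the lameness obvious:

```haskell
data Unary_Integer = Zero | Succ Unary_Integer

-- Convert to a machine integer: costs O(n) time, like everything here.
toInt :: Unary_Integer -> Int
toInt Zero     = 0
toInt (Succ n) = 1 + toInt n

-- Addition by structural recursion on the first argument;
-- associativity is a two-line proof by induction.
add :: Unary_Integer -> Unary_Integer -> Unary_Integer
add Zero     m = m
add (Succ n) m = Succ (add n m)
```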
A great improvement that better matches the way we actually think
about integers is as follows:
data Bit = Zero | One
data Little_Endian_Two_s_Complement_Integer =
    Sign Bit | LETCI_Cons Bit Little_Endian_Two_s_Complement_Integer
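As a sketch of how such a value denotes an integer (abbreviating the type name to LETCI for brevity; `value` is my own name), the terminating sign bit is read as repeating forever, two's-complement style:

```haskell
data Bit = Zero | One

data LETCI = Sign Bit | LETCI_Cons Bit LETCI

bit :: Bit -> Integer
bit Zero = 0
bit One  = 1

-- The trailing Sign bit stands for infinitely many copies of itself,
-- so `Sign One` alone already denotes -1 (binary ...1111).
value :: LETCI -> Integer
value (Sign Zero)      = 0
value (Sign One)       = -1
value (LETCI_Cons b r) = bit b + 2 * value r
```

So `value (LETCI_Cons Zero (Sign One))` is -2, i.e. binary ...1110.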
But why be so low-level as to choose endianness or radix base?
You can parametrize over the implementation of Array and Digit,
with a proper type class constraint.
data Sign = Plus | Minus
data N_Complement_Integer digit array = NCI Sign (array digit)
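One way such a constraint could look (a sketch; the class and its methods are my own invention, and I take the container to be any Foldable rather than a dedicated Array class):

```haskell
data Sign = Plus | Minus

-- Any digit type that knows its radix and its own value...
class Digit d where
  radix      :: d -> Integer
  digitValue :: d -> Integer

-- ...can be read big-endian out of any container, sign-magnitude style.
value :: (Foldable t, Digit d) => Sign -> t d -> Integer
value s ds = applySign (foldl step 0 ds)
  where
    step acc d  = acc * radix d + digitValue d
    applySign n = case s of Plus -> n; Minus -> negate n

-- Booleans as binary digits, just to have an instance to play with.
instance Digit Bool where
  radix _          = 2
  digitValue False = 0
  digitValue True  = 1
```

For example, `value Plus [True, False, True]` reads the bits 101 big-endian and gives 5.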
Or, instead of the usual representation, you can have a different, skewed
representation for integers that allows for constant amortized-time
addition and subtraction, etc.
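One such representation is skew binary numbers (popularized by Okasaki), where incrementing is even worst-case O(1); a sketch, storing only the weights of the nonzero digits:

```haskell
-- A number is the list of weights (each of the form 2^k - 1) of its
-- nonzero skew-binary digits, smallest first; only the smallest
-- weight may appear twice.
type Skew = [Integer]

-- O(1) increment: two equal lowest weights w merge into 2w+1;
-- otherwise prepend a new weight-1 digit.
inc :: Skew -> Skew
inc (w1 : w2 : rest) | w1 == w2 = (1 + w1 + w2) : rest
inc ws                          = 1 : ws

-- The denoted integer is just the sum of the weights.
value :: Skew -> Integer
value = sum

fromInteger' :: Integer -> Skew
fromInteger' n = iterate inc [] !! fromInteger n
```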
And of course, you can pick a very abstract view on Integers,
with lots of operations (addition, factorial, Bernoulli numbers, etc.),
just like Mathematica gives you. Why the hell should Integers
be a *poor* type, described in low-level terms of constructors?
Why not have a wide choice of constructors and deconstructors?
One day, I want to deconstruct my integers in unary style,
another day, I want to recurse little-endian or big-endian style, etc.
Why can't I do that with the *same* type of integers?
Why should there be a one "canonical" way to construct and match integers?
No sane mathematician does that. Why should computer scientists?
Mathematicians always consider types up to isomorphism.
Why don't computer scientists?
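In fact, GHC's pattern synonyms already gesture in this direction: the same efficiently represented Integer can be deconstructed unary-style (a sketch; Zero and Succ here are synonyms I define, not new constructors):

```haskell
{-# LANGUAGE PatternSynonyms, ViewPatterns #-}

pattern Zero :: Integer
pattern Zero = 0

-- Matching `Succ n` on a positive integer binds n to its predecessor;
-- constructing `Succ n` just adds one.
pattern Succ :: Integer -> Integer
pattern Succ n <- ((\x -> if x > 0 then Just (x - 1) else Nothing) -> Just n)
  where Succ n = n + 1

-- Unary-style recursion over the ordinary Integer type:
factorial :: Integer -> Integer
factorial Zero     = 1
factorial (Succ n) = Succ n * factorial n
```

A second pair of synonyms could expose the same Integer little-endian or big-endian; the type stays one and the same.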
My bet is that the real progress in the next 50 years
will be that programming languages will allow reasoning up to isomorphism.
And not just a static set of isomorphisms,
cast in stone by the language specification.
No, a dynamic set of isomorphisms, one that grows with the code base,
that grows with the "artificial intelligence"
in the programming environment.
Once the programming environment can match something as being
"an Integer", and has a database of good implementations of integers,
depending on the usage constraints, then it can optimize away
inefficiencies that stem from picking bogus implementations of integers
(efficiency-wise -- such implementations might ease reasoning in another way),
so that programmers can focus on *meaning* without having to pick
a representation.
And no, I disagree that this means a one-size-fits-all datatype
with a highly optimized implementation with a static set of dynamic
specializations and invalidations. I think that this means an
extensible meta-programming environment that can search your code
for abstract programming patterns, analyze them, match them against
its database, optimize, extend its database with the products of its analyses,
do more extensive searches while you're sleeping, propose solutions
when you wake up in the morning, get all happy when it finds a simple
and elegant solution that isn't in its database yet, etc.
In any case, such is the direction I've always dreamed of
for my Tunes project.
[ François-René ÐVB Rideau | Reflection&Cybernethics | http://fare.tunes.org ]
[ TUNES project for a Free Reflective Computing System | http://tunes.org ]
The meta-Turing test counts a thing as intelligent if it seeks to apply
Turing tests to objects of its own creation.
-- Lew Mammel, Jr.