
Re: MISC-d Digest V99 #71 and Plectics


Dear MISC readers:
I had said:
>> >Chuck also felt that ANS Forth was missing out on some
>> >good innovation like @+ which could help write faster code without
>> >needing super-optimizers.  
>>
>> That is true, he did make that point.  As he said, if you use it,
>> it will significantly change the style of your code.
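
As a rough illustration of what that style change looks like (my own
sketch here, not Chuck's code; only A! and @+ are the Machine Forth
words, and the loop is written with ordinary ANS words for
familiarity), compare summing an array both ways:

    \ Illustrative sketch only.  SUM-ANS keeps the running address on
    \ the data stack; SUM-MF parks it in the address register with A!
    \ and uses @+ ( -- x ) to fetch through A and then advance A.

    : SUM-ANS  ( addr n -- sum )
        0 SWAP 0 ?DO             \ addr sum
            OVER @ +             \ add the cell at addr
            SWAP CELL+ SWAP      \ step addr to the next cell
        LOOP NIP ;

    : SUM-MF   ( addr n -- sum )
        SWAP A!                  \ point A at the first cell
        0 SWAP 0 ?DO  @+ +  LOOP ;

The second version never juggles the address on the data stack, and
that is the kind of difference that ripples through the style of the
surrounding code.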

Stephen Pelc wrote:
>The arguments are *NOT* about whether @+ or the use of an 
>address register are technically good/useful. I and others such 
>as Phil Koopman have discussed and implemented (especially in 
>silicon designs) the use of address registers in stack machine 
>VMs.

I assume you are referring to the comparison of Machine Forth
to ANS Forth.  If so, I think the address register issue is a
side issue either way.  It is interesting, but by far the
biggest issue is the whole idea that Chuck thinks a Forth
should be about 1K of code, and that this is in complete
contrast to the meta-mega-Forth specified in ANS.

Perhaps they contrast more by perception than by use.  Chuck or
anyone else who is trained can write good code in these systems,
but mostly it involves avoiding a lot of stuff.  Chuck refers to
many things specified in the ANS Standard as abominations.

>The discussion is really about whether the address register(s)
>needs to be exposed in the Forth VM (not necessarily the same
>as the stack machine VM). 

I see that as a small side issue again.  You can add it in high
level code (VARIABLE A etc.) to write simple code that ports cleanly
to a simpler Machine Forth.
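
For instance, a minimal sketch in plain ANS Forth might look like this
(the names A, A!, A@, @+ and !+ follow Machine Forth usage, but these
definitions are mine, not Chuck's):

    VARIABLE A                    \ the emulated address register

    : A!   ( addr -- )  A ! ;     \ load A
    : A@   ( -- addr )  A @ ;     \ read A
    : @+   ( -- x )   A @ @   1 CELLS A +! ;   \ fetch via A, advance A
    : !+   ( x -- )   A @ !   1 CELLS A +! ;   \ store via A, advance A

With these few lines the SUM-MF sketch above runs on any ANS system,
and the same source moves unchanged to a Machine Forth where A! and
@+ are primitives.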

>This affects the complexity of the 
>compiler, but I would argue that if I can produce better code 
>generation, this is *much* less complex than having to produce a 
>new piece of silicon.

It only adds a word or two to a Machine Forth compiler, although it
depends on how complex the compiler is.  New silicon, as in an FPGA, may
be less complex than some compilers, but custom VLSI adds more levels
of complexity.  Software is still a lot softer, you are right.

>> I think the ANS standards team has a
>> pretty obvious bias to declare that Forth was frozen fifteen years ago and
>> that they were made keepers of the crystal.
>Complete and utter rubbish. I note that since ANS, both MPE and 
>Forth Inc have produced optimising compilers that deal with ANS 
>source code. 

Yes I know MPE, Forth Inc., myself, and lots of others have produced
various compilers that generate good code from ANS source.  After
writing optimizing compilers that could generate optimal code I
found I was more productive with the simpler Machine Forth compiler.
But I have never argued that the ANS Forth specification precludes
a decent implementation of Forth.

The point I was making was that, as Chuck has said, the ANS Forth
specification documents what was pretty much standard practice
twenty years ago.  Yes, you can apply a "modern" compiler that
does native code compiling with inlining and other optimizations,
I agree.  I still see that as a side issue compared to asking
whether that native code inlining compiler is a couple of hundred
words.  I also like to ask whether the GUI is a mega interface to an
external mega GUI or a few K of code in the system itself.

When Chuck asked in c.l.f what he should talk about in his
presentation to SVFig, it got some interesting responses.  The
keepers of the crystal referred to the discussion as promoting
the fad of the week, etc.

>What ANS Forth (or any other standards committee) 
>does do is to document common practice. An effect of this is to 
>make it clear what code is portable and what is not. This is 
>very different from restricting change.

I know, but documenting and promoting the common practice of fifteen
or twenty years ago into a standards document and arguing about the
fiddly bits for decades is not restricting change directly.  It does,
however, divert energy that might otherwise have gone into something
a little less philosophical than endless debate over obscure
theoretical holes in the standard.  But that is just my opinion.

>From: "Lloyd R. Prentice" <pai@tiac.net>

>Wish I was smart enough to contribute to the effort.

I don't think that is the issue.  Not everyone has to write all the
code or design all the transistors.  Perhaps the idea that there is
no way to contribute to the effort is part of the problem.

>available, would go a long way toward addressing the question of whether
>MISC is merely the highly personal and idiosyncratic vision of a few
>extraordinarily bright people, or is truly applicable to broad
>mainstream markets. 

I think it is merely a matter of degree.  I know when Chuck and I don't
get paid for doing work we don't spend as much time doing it.  I know
that neither Chuck nor I spend a large percentage of our time doing
technical work.  So if only so much work gets done, and it doesn't get
documented etc., then it will certainly be the former rather
than the latter.

>The MISC Manifesto

I like the MISC Manifesto to some extent.  But complexity is real and
beautiful, just not wasteful complexity.  Murray Gell-Mann describes
his Plectics: 'To refer to the subject on which some of us now work as
"complexity" seems to me to distort the nature of what we do, because
the simplicity of the underlying rules is a critical feature of the
whole enterprise.   Therefore what I like to say is that the subject
consists of the study of simplicity, complexity of various kinds,
and complex adaptive systems, with some consideration of complex
nonadaptive systems as well.'

Plectics comes from plexus, twisted or braided, through complex,
meaning braided together, like Forth. :-) Likewise plicare, to fold,
gives simplex, once folded, and so simple.  The important terms for
Dr. Gell-Mann are "effective complexity" and "potential complexity".

Likewise, the minimalism of our underlying computer hardware and
software structures lets us get the maximum useful complexity from
a system with finite resources.  All systems have finite resources!
I am interested in complex adaptive systems and for me MISC is a
useful tool to study them.

>1. Complexity is the enemy of performance, reliability, creativity and
>personal satisfaction.

Complexity is essential for the things I am interested in.  A human with
a 100 I.Q. may well equate to using 1% of the 200,000,000,000 neurons,
but that is still pretty complicated.  You might say that memory is cheap. ;-)

But seriously on a human scale the complexity of the universe is
undeniable.  Dealing with complexity is a useful goal.  Unneeded,
crippling complexity is the enemy. 

>2. Conventional computing technology is crippled by over elaboration and
>unbridled complexity.

Spiraling bloat.  

>3. In analysis we strive for the simplest possible solution.

you bet, 0 code, 0 hardware.

>4. In hardware we strive for ever smaller dies, simpler systems.

you bet, 0 size, 0 cost, 0 power, 0 bugs.

>5. In software we strive for the minimal set of simple, tight logical
>building blocks that we can easily rearrange in simple patterns to solve
>every programming problem at hand.

you bet.

>6. Less is inevitably Moore.

Leave off #6.  The rest is ok, but #1 needs "Unneeded."