
New The awakening begins
Tim Bray, XML perpetrator, began to make amends and restitution when he wrote that maybe XML is a bit too complicated for programmers to seriously care about.

Now, he begins to catch on to the idea that strong typing is not all that good at producing solid code. [link|http://www.tbray.org/ongoing/When/200x/2003/05/08/FutureLanguage|http://www.tbray.org...08/FutureLanguage]

Even Robert C. Martin (for whom I have great respect - despite his rabid C++ leanings, he is a keen observer of factors contributing to success and failure in software development) is catching on as well. His eyes are beginning to [link|http://www.artima.com/weblogs/viewpost.jsp?thread=4639|open].

Finally, Bray points to a nice post on [link|http://mindview.net/WebLog/log-0025|test driven design] by Bruce Eckel.

Coupled with the recent InfoWeek article on Smalltalk, I'm starting to ask:

Is the world waking up?
Is the Apocalypse nigh?
Can I buy an extended warranty on my life?
Who am I going to argue with if everyone starts acting sensibly?

It's all very vexing.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Quotes from Uncle Bob
I've been a statically typed bigot for quite a few years. I learned my lesson the hard way while using C. Too many systems crashed in the field due to silly typing errors. When C++ came out, I was an avid adopter, and rabid enforcer of strong typing.
Emphasis mine. C++ static typing was a good thing as compared to C, because C sucked so bad at it.

About two years ago I noticed something. I was depending less and less on the type system for safety. My unit tests were preventing me from making type errors. The more I depended upon the unit tests, the less I depended upon the type safety of Java or C++ (my languages of choice).

I thought an experiment was in order. So I tried writing some applications in Python, and then Ruby (well known dynamically typed languages). I was not entirely surprised when I found that type issues simply never arose. My unit tests kept my code on the straight and narrow. I simply didn't need the static type checking that I had depended upon for so many years.
Emphasis mine. Gee. Big surprise. What have we been arguing about with Mr. Burns here? :-)
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
New Maybe it's just me...
but I'm really scratching my head at the huge fuss made over statically vs. dynamically typed languages.

Since the languages we're talking about are OO, shouldn't any properly designed system (imo) be set up so that the programmer doesn't have to worry about types?

Example: In Java, if I get back an SQL object, I don't have to know the type (Integer, Float, etc.) to shove it back to the client; I just call getString() on it and the object handles it for me.
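
(A minimal Java sketch of that idea, assuming the "SQL object" here is a JDBC ResultSet; the connection URL, table and column names below are made up purely for illustration.)

import java.sql.*;

public class GetStringDemo {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection URL and query - illustration only.
        Connection conn = DriverManager.getConnection("jdbc:somedb://localhost/demo");
        ResultSet rs = conn.createStatement().executeQuery("SELECT amount FROM orders");
        while (rs.next()) {
            // getString() works whether the column is INTEGER, FLOAT or VARCHAR;
            // the driver converts the value to a String for us.
            System.out.println(rs.getString("amount"));
        }
        conn.close();
    }
}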

(note: I'll admit that Java and C++ both fail at this - mostly because they were derived from C, which has primitive datatypes and is statically typed. But in both cases, what they've added in, their new stuff, is set up so that the programmer really doesn't have to worry about the type.)

Frankly, the further I get into OO, the more I'm beginning to see that primitive datatypes are a bad idea. They're a pain in the neck, get in the way, and cause problems.
New Static languages make the code brittle ...
and a pain in the neck to write. You end up writing a lot of code just to satisfy the compiler. It also locks everything in, making it very hard to evolve the system. Paul Graham expresses it really well in [link|http://www.paulgraham.com/hp.html|Hackers and Painters]:

"A programming language is for thinking of programs, not for expressing programs you've already thought of. It should be a pencil, not a pen. Static typing would be a fine idea if people actually did write programs the way they taught me to in college. But that's not how any of the hackers I know write programs. We need a language that lets us scribble and smudge and smear, not a language where you have to sit with a teacup of types balanced on your knee and make polite conversation with a strict old aunt of a compiler."
Edited by bluke May 12, 2003, 04:05:02 AM EDT
New History revisionism - beware !!! (IMHO)

The true issue over why Java (Strong typing) became popular over weakly typed languages was that it was the *only* way security would work for connected code over the Internet as it was 3-5 years ago. The 'sandbox' security of Java enabled objects to be 'tightly' validated for type whereas Smalltalk et al would merely pass an object that *could be* tampered with in transit (does not understand).

This is a different issue to compile time awareness of an object's type.

VPNs are eliminating the need for validating objects (using strong typing) over WANs.

Weakly typed languages are a damned sight more flexible and have more to offer in the long term in terms of coding and testing.


Cheers

Doug Marker
New Re: History revisionism - beware !!! (IMHO)
dmarker: The true issue over why Java (Strong typing) became popular over weakly typed languages was that it was the *only* way security would work for connected code over the Internet as it was 3-5 years ago. [...]

I'm not sure that is entirely correct. Glenn Vanderburg writes (see link below) that at the time of Java's introduction, there were several competing "safe code" models for mobile code. All used very different security models and only one (Java) was strongly typed.

[link|http://www.vanderburg.org/cgi-bin/glv/blosxom/2003/05/17#Software/Languages/static_vs_dynamic|http://www.vanderbur...static_vs_dynamic]

Unfortunately Glenn gives very little detail about any of them.
--
-- Jim Weirich jweirich@one.net [link|http://w3.one.net/~jweirich|http://w3.one.net/~jweirich]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
New Another issue was the potential popularity of a lang

>>
When Java first hit the streets, mobile code was a hot topic. General Magic was promoting their Magic Cap environment, featuring mobile code ("agents") heavily, and powered by a language called Telescript. Nathaniel Borenstein was researching "active mail", sending active invitations and the like via email using a dialect of Tcl called Safe-Tcl. Someone (I can't remember who at the moment) was developing roughly equivalent functionality in Perl (the Safe.pm module). Luca Cardelli at DEC was developing a beautiful and novel little language called Obliq.

All of those languages supported secure mobile code, and all of them used very different security models. My memory of Telescript is fuzzy, but I know for a fact that Java is the only one of the rest that is statically typed. And I remember from my evaluation at the time that Java and Telescript had the two most complex security models (and complexity is not a good thing in a security model).
<<

The above extract from the link points out some other work that was being done but one also needs to take into account the political dynamics of the day ...

- MS was creaming the market & competition with its classic tactics

- IBM decided to get behind Sun's Java (Oak) as a counterbalance to MS propaganda about DCOM (and DCOM & Active-X's inherent weakness for net use)

- Smalltalk was there but had completely failed to garner popular support. This went beyond the effort of the various ST tool vendors

- Java won because it was so familiar to the multitudes of C & C++ programmers. Java was very shallow when it 1st came out (effectively in ver 1.0.3). Over 5 years, Sun, IBM, HP & others developed the bulk of what is J2SE & J2EE today. That Java has many flaws didn't stop it carving out an empire that tipped MS temporarily on its ear. MS had to go back to the drawing board & invent .NET & C#.

- The other languages that were or were about to enter the scene had no substantial backers that would enable them to do any better than Smalltalk in garnering market share; some appear to have been stillborn.

What ST vs Java taught me was that elegance (ST) does not dictate victory. Also I believe there is much confusion (due to time passing) over the issue of Java's strong typing. It was the strong typing and thus tiny code packets that enabled Java's applet concept to work well over WANs & appear secure (just a pity that MS succeeded in poisoning Applets with their 'dirty' Java VM back in MSIE 2 & 3).

Cheers

Doug Marker


New Re: Another issue was the potential popularity of a lang
dmarker: The true issue over why Java (Strong typing) became popular over weakly typed languages was that it was the *only* way security would work for connected code over the Internet as it was 3-5 years ago.

dmarker: Java won because it was so familiar to the multitudes of C & C++ programmers. [...] The other languages [...] had no substantial backers

I wouldn't disagree with your second statement. Java was definitely a strongly marketed language[1]. And I wouldn't disagree that the perception that strong typing is necessary for secure mobile code had some (small) effect. However, I'm not sure that the first statement is accurate as stated, especially with the emphasis on the word "only".

Footnotes:
[1] I recently flipped through a first generation Java book on my bookshelf. I had forgotten how much of the book was just propaganda (and inaccurate at that!) without much technical content.
--
-- Jim Weirich jweirich@one.net [link|http://onestepback.org|http://onestepback.org]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
New Gee...I thought it was a friendly discussion...
... the kind that might be helped by a brew or two... ;-)

Actually, I have found the same things. However, I don't think (just yet) that unit testing is the antithesis of strong typing... or that one obviates the other. Especially when you're writing code in a team environment (read: several different levels of expertise), where someone wants to use (or, more often, abuse) your base class for something it wasn't exactly designed to do (like compare Strings to Locomotives).

StrongTyping != evil for all values of StrongTyping (or evil, for that matter)....
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New Re: Gee...I thought it was a friendly discussion...
jb4 StrongTyping != evil for all values of StrongTyping

Actually, I like to use the term Manifest typing rather than strong or static typing. Some static but non-manifest type systems (such as the one in Haskell) look rather promising.

There's a lot of confusion regarding static, strong, weak, dynamic (etc.) typing. Here are my general definitions ...

Strong: No type errors can happen unnoticed.
Weak: Type errors can happen unnoticed.
Static: The type of an object can be determined at compile time.
Dynamic: The type of an object cannot (in general) be known at compile time.
Manifest: Every variable must be explicitly declared.

Ruby, Python, Smalltalk and brethren have Strong, Dynamic, non-manifest typing.

Haskell has Strong, Static, non-manifest typing.

Eiffel has Strong, Static and Manifest typing.

Java has Strong, Static, Manifest typing, with a tendency to use dynamic typing in collections (and therefore suffers from the drawbacks of both manifest and dynamic typing).
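
(A small sketch of that last point, purely illustrative: the declarations below are manifest and static, but what comes out of a pre-generics collection is effectively dynamically typed, so the type error only shows up at runtime.)

import java.util.*;

public class CollectionTyping {
    public static void main(String[] args) {
        List items = new ArrayList();   // manifest: the types are spelled out
        items.add("a string");          // but the List itself only knows about Object
        items.add(new Integer(42));

        // The compiler happily accepts the cast; the type error surfaces only
        // at runtime, as a ClassCastException on the second element.
        for (Iterator it = items.iterator(); it.hasNext(); ) {
            String s = (String) it.next();
            System.out.println(s.length());
        }
    }
}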

C is also manifest and static, but with a lot of weak type holes (where you can accidentally circumvent the type system).

C++ has manifest static typing and has closed many of the accidental type holes in C, but still allows the programmer to deliberately circumvent the type system.

FORTH would be dynamic, non-manifest and VERY weakly typed.

Perl is a tough one. On one hand, I would expect it to be similar to Python or Ruby. But if you consider the types in Perl to be scalar, list and hash, then Perl would have static and manifest typing.

There's a good article on strong typing at [link|http://perl.plover.com/yak/typing/|http://perl.plover.com/yak/typing/].

Have fun.
--
-- Jim Weirich jweirich@one.net [link|http://w3.one.net/~jweirich|http://w3.one.net/~jweirich]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
New Manifest typing....a la Fortran.
I remember Fortran 4 (Lincoln Pre-processor)... every damn spelling mistake became a variable.
New Thanks, Jim. Nicely put.
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New Re: Ditto - Thanks, Jim.
New Next experiment: try it without OO
________________
oop.ismad.com
New Java going in the other direction
In JDK 1.5 they are adding generics (in a really stupid way: basically the compiler just inserts the casts for you into the byte code, and at runtime all the extra type information is lost), which will allow you to place even more type constraints on Collections, etc. Paul Graham, in [link|http://www.paulgraham.com/hundred.html|The Hundred Year Language], claims that Java, like Cobol, while being very popular, is an evolutionary dead-end. I hope that he is right.
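
(To make the "compiler just inserts the casts" point concrete - a minimal sketch, assuming the erasure-based design described above; the class and variable names are made up.)

import java.util.*;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String>  strings  = new ArrayList<String>();
        List<Integer> integers = new ArrayList<Integer>();

        // The type arguments are erased: both lists share one runtime class.
        System.out.println(strings.getClass() == integers.getClass());  // true

        // Under the covers, a get() on a List<String> compiles to a plain
        // Object-returning get() plus a cast, roughly like this:
        strings.add("hello");
        String s = (String) ((List) strings).get(0);
        System.out.println(s);
    }
}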

"I think that, like species, languages will form evolutionary trees, with dead-ends branching off all over. We can see this happening already. Cobol, for all its sometime popularity, does not seem to have any intellectual descendants. It is an evolutionary dead-end-- a Neanderthal language.

I predict a similar fate for Java. People sometimes send me mail saying, "How can you say that Java won't turn out to be a successful language? It's already a successful language." And I admit that it is, if you measure success by shelf space taken up by books on it (particularly individual books on it), or by the number of undergrads who believe they have to learn it to get a job. When I say Java won't turn out to be a successful language, I mean something more specific: that Java will turn out to be an evolutionary dead-end, like Cobol."

New Re: Java going in the other direction
The article at [link|http://java.sun.com/features/2003/05/bloch_qa.html|http://java.sun.com/.../05/bloch_qa.html] gives examples of the new features for Java 1.5. I think the new features are an improvement (given what you are starting with) and will make Java programming easier. Generics, autoboxing and the new iterator syntax in particular address a number of annoyances in the language.

However ...

On a lark, I converted his examples to Ruby, and in every case (except the enum example), the Ruby code was still smaller and more direct and expressive than the new JDK 1.5 examples. (I know most here are Python programmers ... I'm sure doing the same in Python would give similar results).

Although JDK 1.5 is an improvement, it still doesn't match the flexibility and expressiveness of a good, dynamic language.
--
-- Jim Weirich jweirich@one.net [link|http://w3.one.net/~jweirich|http://w3.one.net/~jweirich]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
New Smalltalk also
For example, the frequency table example can be expressed in one line in Smalltalk:

Collection>>#frequency
^self asBag

Doesn't get much better than that.
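
(For contrast, a rough sketch of the same frequency count in JDK 1.5 syntax - an approximation of the article's example, not a quote from it.)

import java.util.*;

public class Frequency {
    public static void main(String[] args) {
        Map<String, Integer> freq = new TreeMap<String, Integer>();
        for (String word : args) {
            Integer count = freq.get(word);               // null if the word isn't there yet
            freq.put(word, count == null ? 1 : count + 1);
        }
        System.out.println(freq);
    }
}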

The improvements are sugar coating. All the improvements are compiler tricks to hide the underlying deficiencies of the language. Generics are just the compiler adding in a bunch of casts; at runtime you lose all the generic information.
New Speaking of autoboxing
If you auto-unbox null, should you get zero or a NullPointerException?

If I'm not mistaken, I think the current plans are to return 0.
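
(The case in question, as a minimal sketch in the proposed JDK 1.5 syntax; the names are made up.)

public class UnboxNull {
    public static void main(String[] args) {
        Integer boxed = null;    // the boxed value is "missing"
        int i = boxed;           // auto-unbox a null reference: zero, or NullPointerException?
        System.out.println(i);
    }
}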
New gasp
-drl
New This is what happens when the foundation sucks
You can try and beautify things, but you end up with major compromises. The introduction of primitive types into Java from the start was a huge mistake. It is a classic case of a premature optimization. The creators of Java were sure that primitives as Objects would cause performance problems and therefore went with primitives. They ignored existing languages which dealt with these issues such as Smalltalk, where the VM deals with the issue, or statically typed languages like ML or even ongoing research which produced PolyJ.
New Oh My!
How generous of them! The performance of Java is SO good due to their sacrifices on behalf of us!
-drl
New According to Joshua Bloch it hasn't been decided yet
[link|http://java.sun.com/features/2003/05/bloch_qa.html|New Language Features for Ease of Development]

"One thing worth noting: this program assumes that when you auto-unbox null, you get zero. It's still an open issue whether this will be the case. The alternative is to throw NullPointerException. Both alternatives have their advantages. Unboxing null to zero beautifies applications like the one above, but it can also sweep real errors under the rug. If anyone has any strong opinions, or better yet, convincing arguments on this issue, please pass them along to the JSR-201 expert group."
New This is just stupid
Null is only zero because historically C used zero to represent null.

Since null is meant to be a special value of object reference, why is it not just an object?

I suspect it's simply because the J-heads are designing from inside the box.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New I think you missed the point
The question is not how to represent null, but: when extracting an integer from its boxed container, if the boxed container is missing (the reference is null), should the result be zero (a reasonable default) or should it cause an exception (also a reasonable expectation)?

This has nothing to do with the historical representation of a null pointer as zero.

Now, granted, the whole boxed primitive idea is a kludge to work around a lack of foresight in the original language design.

(an aside: I wonder how much performance will be lost through this autoboxing technique, especially compared with eliminating primitives altogether. I suspect (without proof) that the no-primitives technique done right would have been faster overall).
--
-- Jim Weirich jweirich@one.net [link|http://w3.one.net/~jweirich|http://w3.one.net/~jweirich]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
New OK, maybe so
Because now I'm trying to figure out how you put a null integer into the container in the first place. My guess is that if autoboxing is implemented, nobody will bother with using the wrapper classes directly (since they can't even be used for arithmetic now anyhow).

IOW, you are never going to see this in brand new 1.5 code:

Integer i = new Integer(5);
List list = new ArrayList();
list.add(i);

when you can do:

List<Integer> list = new ArrayList<Integer>();
list.add(5);

So how do you put a null value into this list?

Is the problem this:

List<Object> list = new ArrayList<Object>();
list.add(new Integer(5)); // do you need to do this or should the autoboxing make it an Integer?

then the argument is

int i = list.get(0);

which should require a cast anyhow since the list type is Object.
Hmmmm. I'm not seeing it.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Re: OK, maybe so
ToddBlanchard: [...] figure out how you put a null integer into the container in the first place.

How about (stealing the example from the article) ...
Map<String, Integer> m = new TreeMap<String, Integer>();
int i = m.get("not_yet_in_map");
Should "i" be zero, or should a NullPointerException be thrown? (Remember that get returns null if the key doesn't exist.)
--
-- Jim Weirich jweirich@one.net [link|http://w3.one.net/~jweirich|http://w3.one.net/~jweirich]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
New Well in this case
I think a KeyNotFoundException should be thrown.

This is just another case where Smalltalk's collection class protocols are vastly superior to J-crap.

i := dict at: 'not_there' ifAbsent: [ 0 ].

For java I would propose something similar on Map.

map.getValueIfAbsent("key_not_found", 5);

failure to use this should throw a KeyNotFoundException.
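
(A sketch of what such a helper might look like as a static method - hypothetical, not an actual java.util API; the names are borrowed from the proposal above.)

import java.util.*;

public class MapDefaults {
    // Hypothetical stand-in for the proposed Map.getValueIfAbsent():
    // hand back the mapped value, or the caller-supplied default when the key is absent.
    static <K, V> V getValueIfAbsent(Map<K, V> map, K key, V dflt) {
        return map.containsKey(key) ? map.get(key) : dflt;
    }

    public static void main(String[] args) {
        Map<String, Integer> m = new TreeMap<String, Integer>();
        System.out.println(getValueIfAbsent(m, "not_yet_in_map", new Integer(0)));  // prints 0, no unboxing of null
    }
}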



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Perhaps ... but ...
Todd: For java I would propose [...] map.getValueIfAbsent("key_not_found", 5);

That's an interesting proposal ... but it still doesn't solve the basic question of unboxing nulls. But ... let's let the J-Heads worry about that.

Ruby handles missing hash values a bit differently. If the key is not in the hash, you get a default value (which is typically nil, but can be something else).

   h = Hash.new
   h['not_there']  #=>  nil

   g = Hash.new { 0 }
   g['not_there']  #=>  0

   j = Hash.new { fail "key not found" }
   j['not_there']  #=> Exception

--
-- Jim Weirich jweirich@one.net [link|http://w3.one.net/~jweirich|http://w3.one.net/~jweirich]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
New My point was
it should be impossible to "box" null (and I think it probably is). So the problem ought not to exist. So the problem isn't how you unbox null, it's "how do you provide a mechanism for signaling and handling this case".

I have found the requirement to provide a default "not found" value to be a useful bit of flypaper.

OTOH, Java currently returns null from maps where the key is missing, so I suppose they will decide that backwards compatibility is more important than better behavior.




"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Not J-heads.
This is how Python, Perl, etc. work.

If you want to increment a value in a hash, and it hasn't been set yet, scripting languages assume it to be 0. This is extremely convenient for such things. However, there is an assumption about the nature of null being made.
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
New Smalltalk as usual is consistent
nil is the Singleton instance of UndefinedObject, and all Objects are initialized to it. This simplifies many things and is very powerful; for example, when trying to print something, you don't need code that explicitly checks for nil, as UndefinedObject understands how to print itself. This also allows you to add all kinds of behaviour to nil objects if needed.
New Set Theory
In set theory, there is a gigantic difference between "nothing" (empty set) and "non-existence". Computer development idioms should have a way of dealing with this difference. The only tool I know that does is APL.
-drl
New Re: Set Theory
Set foo = null;          // nonexistence
Set foo = new HashSet(); // empty set

Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
New Here we go
I'm sure you'll argue, but what the fuck... you seem determined to be contrary, as if everything I say is a direct challenge to you personally.

Your example is exactly what I DIDN'T mean. Someone in J-world invented a piss-poor mimic of set theoretic ideas - so what? There can still be null pointers flying around and they can still cause destruction. In any case I think it's in a utility class definition so it's not natively part of the language, is it?

APL has no types. It has arrays. An array can be empty but still exist. THAT is a real "null". But, "null" has a size - 0. It's the size of null that is 0, NOT null itself.

Is that clear? I have the greatest respect for your skill but I'm damn sick of you and Peter just dismissing what I say as if I were some pimply ignorant teenager.

If you find a REAL example of an idiom that is built around sets other than APL, let me know.

(BTW historically APL was invented as a way to describe algorithms, not as a language - that is probably why it is such a perfect language.)
-drl
New Re: Here we go
You asked. I provided. Go take a flying leap if you didn't want a response. I'm sick of your crusty, argumentative "I'm the authority" diatribes around here.

You originally said:
In set theory, there is a gigantic difference between "nothing" (empty set) and "non-existence".
No kidding, Sherlock. I showed you how that difference is represented in Java.

An array can be empty but still exist.

Set foo = new HashSet();
System.out.println(foo.size()); // prints 0
This is your "nothing" or "empty set".

Your "non-existence" in Java is represented by a null. The terminology may not be the same, but the same functionality is there.

I'm damn sick of you and Peter just dismissing what I say as if I were some pimply ignorant teenager.
Then quit writing stupid ignorant blather as if you were some pimply ignorant teenager. Or learn to express yourself better (another sign of adulthood). Or STFU. Your choice. This is my last response to you until you grow up, ace.
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
New Amazing
When you declared Set s=null, you made a set and assigned it a value. The fact that you made a set means that it exists.

This is COMPLETELY DIFFERENT than the way an APL array can be empty - it's not even remotely the same thing. If something is non-existent in APL and you try to refer to it, the interpreter stops (other things being default equal). That would never happen in any case, because one just doesn't make that kind of error.

As far as Conrad's SQL statement, this isn't even worth commenting on, because SQL is not a development idiom.

If you were me, you'd be frustrated talking to walls as well.
-drl
New Re: Amazing
If something is non-existent in APL and you try to refer to it, the interpreter stops (other things being default equal).
If you have a null object in Java, and you try to use it, the interpreter stops.

Not the same != can't be used the same. I'm really not catching what you're on about here. "If something is non-existent". How do you represent something that is non-existent in APL then?
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
New Re: Amazing
It's a very subtle and difficult point.

You can't model non-existence with something that exists and has a special value, no matter what it is. So "Set s=null" is emptiness, not non-existence. Strong typing means that everything you refer to HAS to exist. Set s = new HashSet() is not emptiness, because s now has an identifiable property. It's like an empty glass of water - s is the glass, not the water.

In APL, an empty array is in an intermediate area - it exists but has no properties. BY DEFINITION, size 0 = empty. So you can safely deal with emptiness because you're in no danger of causing an NPE. Here's an example:

[link|http://www.csm.astate.edu/~rossa/cs3543/apl.html|http://www.csm.astat...a/cs3543/apl.html]

I have to run but will get back to this interesting topic. Please, let's bury the hatchet - IN CONRAD!
-drl
New Wow. My first exposure to APL
and Brainf*ck seems to be not too bad at all.

But you are right. APL's empty is closer to Smalltalk's nil than Java's null. Correct me somebody, but I think Smalltalk's nil can be made to behave exactly like APL's empty if so desired. E.g. 0*nil would now yield a doesNotUnderstand exception, but you can handle some messages in UndefinedObject and in Number to make it evaluate to nil.
--

Less Is More. In my book, About Face, I introduce over 50 powerful design axioms. This is one of them.

--Alan Cooper. The Inmates Are Running the Asylum
New Same as in Objective C
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
New Re: Amazing
deSitter: It's a very subtle and difficult point.

It's even more subtle than that. It's not merely empty vs non-empty: APL doesn't distinguish between a single number and an array of length one.* In other words, the number 1 can be treated as a number, or as an array of a single element.

Now this makes for some convenient shortcuts in programming (much like returning a zero when unboxing a null pointer), but I don't think it is mathematically very accurate. In Math**, a set of things and a thing are not the same thing***.

Footnotes:

* Assuming I correctly remember what little APL I ever knew.

** Assuming I correctly remember what little set theory I ever knew.

*** Do I get extra points for using the word thing(s) three times in one sentence?
--
-- Jim Weirich jweirich@one.net [link|http://w3.one.net/~jweirich|http://w3.one.net/~jweirich]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
New Heh.
*** Do I get extra points for using the word thing(s) three times in one sentence?


Only if you publish it as "Thing Theory". :D

Many fears are born of stupidity and ignorance -
Which you should be feeding with rumour and generalisation.
BOfH, 2002 "Episode" 10
New ROFL
-drl
New No
>>>>>>>>>>>>>>
When you declared Set s=null, you made a set and assigned it a value. The fact that you made a set means that it exists.
<<<<<<<<<<<<<<

No.

When I do the above, I've declared a variable (a nest, a hook) that can be assigned a set (set can be placed in the nest, hung on a hook). I did not create any sets. Not even 0-sized sets. "null" is not a set.
--

He walks around, talking to himself. On the phone.
New Hey Ross, it's only a model.
Like sets are. Remember your Goedel. None of this matters. ;-)

But I am curious, how do you represent in code, in any language, the complement of "the set of all sets"? ;0)

And, set a property to be "Cardinality".
bcnu,
Mikem

The soul and substance of what customarily ranks as patriotism is moral cowardice and always has been...We have thrown away the most valuable asset we had-- the individual's right to oppose both flag and country when he (just he, by himself) believed them to be in the wrong. We have thrown it away; and with it all that was really respectable about that grotesque and laughable word, Patriotism.

- Mark Twain, "Monarchical and Republican Patriotism"
New Hey, watch this!
This is me not biting on the flagrant troll.
===

Implicitly condoning stupidity since 2001.
New Unlike DrooK, I'll bite: Ever heard of SQL, ya nitwit?!?
New See comment above, applies here as well
Your problem is the same as the two above you - your math background sucks.

That's it, I will no longer converse with jerks.

-drl
New Better stop talking to yourself then.
Your problem is the same as the two above you - your math background sucks.
Bachelor of Science, Comp Sci, 1992. Bachelor of Science, Algorithmic Mathematics, 1992. A good portion of that was set theory.

Your problem is that your programming background sucks. You also seem to have difficulty realizing that math terms are often not the same as the equivalent comp sci term. Deal with it. Or when you start talking about nulls and sets, make it clear you are talking about math, and not programming. They aren't the same. Or if that's the total point, say so.
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
Edited by admin May 13, 2003, 04:51:56 PM EDT
New Your problem is the same you had a year (or was it two?) ago
If anyone is "convers[ing] with jerks" here, it isn't you -- it is *you* who are being a total asshole, again.

The only question now is: Will it take you months and months to stop, this time too?

Marlowe and Norm sure aren't the only two manic-depressives here.


   [link|mailto:MyUserId@MyISP.CountryCode|Christian R. Conrad]
(I live in Finland, and my e-mail in-box is at the Saunalahti company.)
Your lies are of Microsoftian Scale and boring to boot. Your 'depression' may be the closest you ever come to recognizing truth: you have no 'inferiority complex', you are inferior - and something inside you recognizes this. - [link|http://z.iwethey.org/forums/render/content/show?contentid=71575|Ashton Brown]
New I remeber Pascal in the very same way
Late '70s, Pascal was all the rage. They were structuring entire CS curricula around teaching it... in fact, it was "the only language you'd ever need".

Now, how much Pascal do you see these days? El zippo1. Why? Because it was a pedantic exercise, and proved to be too limited in the real world to solve the kinds of problems that needed to be solved.

It may prove that Java will be the same, for much the same reason. And while I do see merit (to a degree; again, experience is limited... there aren't many Objective C or Python compilers available in the embedded world... ;-) ) in dynamically typed languages, it is a testament to C and C++ that they haven't succumbed to the kind of Darwinism that has affected everything from Algol to Cobol to Pascal to... Java?

(Yeah, and SQL still lives, too! Well, Darwinism isn't perfect....)

1Yes, I know that Sir Cyclic's darling, Delphi, is basically Object Pascal. Don't want to offend CRC.... but Delphi resembles Jensen/Wirth Pascal even less than C++ resembles C, so I'd maintain that Delphi is the result of Darwinism... the organism adapted rather than became extinct.
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New Just had this conversation
...talking with a friend who does firmware for flow controllers, and is probably soon going back to Imagineering at Disney. The conversation came about because of the last gems from Todd and Scott on typing et al. :) It came down to both of us agreeing: if you need the benefits of C, use C, not C++; if you don't, use Python, etc. Java and C++ were nice training wheels. All of which is to say, I don't see C "becoming extinct" anytime soon in its role as "heavy-hitting macro language" for assembly. But C++ is an adaptation that is slowly losing the Darwinian struggle.

Many fears are born of stupidity and ignorance -
Which you should be feeding with rumour and generalisation.
BOfH, 2002 "Episode" 10
New Freep said the same thing
back on the ezboards.

He said that basically, from the Smalltalker's view, you need Smalltalk, and for stuff where Smalltalk doesn't fit (speed, size, whatever), you need C.

Some of the Squeakers are now emitting custom machine code sections from Smalltalk source (similar to the Java JIT/Hotspot stuff).

Which can mean that you don't really need C much longer either.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Still waiting for ...
... Smalltalk, Python, Objective C, ruby, Squeak, etc compilers for the embedded arena.

Know of any that are of the quality of the Paradigm or IAR C/C++ compilers?
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New Depends on constraints
But for embedded - you just need C. C++ doesn't add anything interesting.

What's your target platform?

There are squeak implementations for various ARM's and similar tiny gadgets btw.

What kind of embedded software are you working on?



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Platforms:
80x86 (in embedded applications)
68HC1x
MCS-51 series (8051, 8091, etc.)
680x0 (again, in embedded applications)

And, as you can probably guess, I strongly disagree that "C++ doesn't add anything interesting." Besides polymorphism, which is extremely "interesting", it adds substantially better typing and several enhancements over C95 (including, but not limited to, my all-time favorite: the bool type).
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New Don't even get me started
on the idiocy of the bool type.

Its implementation and promotion rules are just dumb.

Repeat after me - there is no bool type, there is no bool type, there is no bool type.

Every idiot C programmer that ever typedef'd a bool (or worse, #define'd TRUE or FALSE) should burn in hell.

0 is false.
!false is true.

Live with it.

As for polymorphism - you can make your own in C with about the same amount of work.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New I'll get you started, alright!
Because, with all due respect, in this particular case, you're quite full of shit.

The bool type (or more accurately, its absence) was the single biggest omission of C, vastly surpassing its other glaring omissions and shortcuts.

Wasn't it you who was whining about how it takes 4 bytes to represent a bool in somebody's POS implementation of C++? Well, chucko, guess how many bytes it takes to represent the non-bool in every implementation of C1....

(Hint: 3 more than it takes in all my C++ compiler's implementations...and that even includes..(gasp!) Micros~1's implementation!)

And look, you're so blinded by your dogmatic disdain for C++ (or, perhaps, even ignorance of it), that you contradict yourself:
0 is false.
!false is true.


Congratulations! You've just explained the definition of the ANSI C++ bool type! Not bad!

Live with it.

I can, and I do daily. Can you? (from your post, I rather doubt it.)

Repeat after me - there is a bool type, there is a bool type, there is a bool type. In C++. And in C (ref. X3J11 C99 standard).

Live with it!

1 Prior to C99, that is.... :-)
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New No I'm not
C's mission in life is to provide portable assembler.

That's it. At that particular job, C is a masterwork. (And I agree with Dennis Ritchie's opinion of C99 - it strays from the mission too far).

If the processor supported bool, then I'd agree with you - there ought to be a bool. But the processor doesn't. It supports bytes, shorts, longs, floats, and doubles. If there's a glaring problem with C, it's the unfortunate naming of byte as char.

Instead, we have a kludge of syntactic sugar on some indeterminate integer type.

Examples of idiocy:

bool b = 0; // fine its false
bool b = false; // fine

cout << b << endl; // prints 0 - why not false? You can't change it either, because ostream::operator<<(bool b) is defined as a member function of ostream.

bool b = true; // fine
bool b = 5; cout << b << endl; // prints 1?!?!

b = false;
cout << b++ << ' ' << b++ << ' ' << b++ << ' ' << b++ << endl;

this prints 0 1 1 1

interestingly, b-- won't compile; the operator is not permitted. So if you were going to use it as a use counter, bool offers surprising behavior.

Bottom line - bool is a quirky POS that you don't really need if you just accept that you can do your conditional branching using any convenient integer type.

So don't try to tell me that this is a good idea. It's not. It's pretentiousness at its very worst.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New The problem is, you're trying to treat a bool as a number
after complaining mightily (and accurately) that it is not.

Your problem is here:
bool b = 5;

What you've really done is:
bool b = static_cast<bool>(5);

One could argue that trying to cast a value to a bool is a non sequitur, and one would be right. However, since that one was not on the ANSI committee, and since the overarching goal of the committee was not to break existing code (no matter how screwed up that existing code might be), the committee compromised and decided that a standard conversion from an integer type to bool shall exist, and that any non-zero integer value cast to bool (either implicitly via the standard conversion, or explicitly via a casting operator) shall be considered to be the value 'true' (again, being consistent with C -- can't break existing, screwed-up code...).

Bottom line - bool is a quirky POS that you don't really need if you just accept that you can do your conditional branching using any convenient integer type.

Now you've really gone off the deep end, and are again contradicting yourself. If, indeed, "C's mission in life is to provide portable assembler [...]" (an assertion I don't accept, BTW; but for the moment let's let that stand), where in the hell does it follow that using the presence or absence of a "convenient integer type" is sufficient (or even necessary) for conditional branching?

Conditional branching is the result of conformance or non-conformance to a predetermined condition set by the programmer at the decision point. So the choice to follow an execution path is based on the evaluation of the presence or absence of a condition (which in assembler can be the presence/absence of a non-zero value, but can just as easily be the presence/absence of a carry, a flag value, the sign bit, etc.). Your "portable assembler" does not give us such granularity; it has quite arbitrarily given us the presence/absence of a non-zero value to make decisions on. But what is the non-zero value that represents x > y? You can of course justify some mathematical hash that reduces itself to a zero/non-zero value, but in reality, there is none. What we really have is a boolean decision: is the relation true or not?

The committee, rightly identifying that, abstracted the decision to a yes/no result, then created a type to represent that decision (which has the happy side effect of being available to the programmer for his/her other uses -- and abuses -- as well; abuses such as trying to set a boolean variable to the value of 5, for example). It beats hell out of the various kludges that C had to come up with to approximate that abstraction. It's no accident that the type of the relational operators in C++ is bool (not int); and, you know what? They're right... a relational operator expresses a boolean, not an integral, relation!

Now, for someone who jumps all down my throat wailing the praises of Objective C, Smalltalk and other so-called "pure" OO languages, to make a statement like the blockquoted one above is so oxymoronic. Makes me wonder whether Bryce hijacked your login....
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New No, I'm trying to branch on a condition
And it actually amounts to jump-if-zero. Which is fine. I can write very interesting programs with that structure. The fact that relational operators evaluate to zero on false and (as a convention) one on not-false makes this mechanism easier to use and follow.

There's nothing oxymoronic about my position on this.

C isn't a high level language. It's a very low level language. It's not for writing applications (although it gets used for that). It's for writing maximally efficient, processor-independent code. If you bother to read the K&R or study your computer history, you'll find that C was specifically created to make it easy to produce portable code while remaining close to the machine.

As I'm in the midst of moving, all of my K&Rs (I own several because it's just not good to be without one) are in boxes or I'd quote it at you.

The important thing to remember is that C actually has no concept of "true" and "false". Trying to force fit it into the language hasn't done anybody any favors. The bool whiners are also losers that write shit like:

if(condition == true) // this is just stupid

rather than

if(condition)

Furthermore, the C convention fits nicely with pointers, jump-if-zero is the correct semantic.

char* foo = "Hi there";

if(foo == NULL) // is totally stupid
if(foo) ....// perfectly reasonable given the framework of the language

Keep in mind that I don't think every language ought to be like every other language. I like my C primitive and I like my application development languages rich and convenient - for very different reasons.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New 21st Century Schitzoid Man
OK, first you say you're trying to branch on a condition (the title of your post), then you say stuff like:
Furthermore, the C convention fits nicely with pointers, jump-if-zero is the correct semantic.

char* foo = "Hi there";

if(foo == NULL) // is totaly stupid
if(foo) ....// perfectly reasonable given the framework of the language


So which is it??? If you're really trying to branch on a condition, then the line:
if (foo)    // where foo is a pointer
is completely meaningless! Where is the condition you're trying to branch on?!? I don't see it, and neither do you. What you're doing is being lazy; failing to bother to fully express the condition you're ostensibly trying to branch on. The pointer foo is not a condition, it's a fscking pointer awready. It has no more truth or falsehood than does a glass of water. What you're really asking is whether foo is NULL or not. That has truthness. And therefore the correct way to determine that truthness is to ask it, thus:
if (foo == NULL)


Now I will agree that:
if(condition == true) // this is just stupid

rather than

if(condition)
is "just stupid"... so long as 'condition' is truly a conditional expression (or an object of type bool, or one that has a cast operator to type bool... ;-) ); if it is a numeric expression, then both are equally stupid, which you would have to agree with if what you're really trying to do is "branch on a condition".
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New You are fighting the language
There is no boolean in C.

There is only jump-if-zero

if(something-that-might-be-zero) // implies jump past statement if zero
{

} // statement ends here so here is where we jump

The idea of boolean in C is something you have invented in your head. It is not in the language and does not need to be there.

If you're really trying to branch on a condition, then the line:

if (foo) // where foo is a pointer

is completely meaningless! Where is the condition you're trying to branch on?!? I don't see it, and neither do you.


Of course I see it. The condition being branched on is the zero-ness of foo. If foo evaluates to zero, then we skip past the next statement. What is so hard about that?

What you're doing is being lazy; failing to bother to fully express the condition you're ostensibly trying to branch on. The pointer foo is not a condition, it's a fscking pointer awready. It has no more truth or falsehood than does a glass of water. What you're really asking is whether foo is NULL or not.


No, I'm asking if it's zero - which is the same as null in C.

That has truthness. And therefore the correct way to determine that truthness is to ask it, thus:

if (foo == NULL)


I'd say that's bad form. Much better to just write if(foo)...

There is no true or false - there is only zero and non-zero in C. The if statement jumps past the next statement on zero. This is the correct way to think in C. The rest is fantasy made up by people who confuse logic and programming. They are not the same.





"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Tell you what...
I can see this descending into the type of circular non-reasoning that tends to pervade these fora, so I'll simply stop.

It's true, at the most concrete level, that C has no concept of booleanism. It's false that C, at the level of being a translator of abstract ideas into a working program, has no concept of booleanism. You have chosen to argue the former position, and from that position, you are right. I have chosen to argue from the latter position, and from that position I am right.

We can't solve this impasse, so let's stop trying, OK?

(I'll even let you get in the last word...)
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New Can I put my oar in?
The way C handles "true" and "false" tends to drive me up the wall. I will write if( condition == 0 ) because that's usually what I want to know. A language needs *something* to be a boolean type and C's use of 0 and not-0 is not intuitive (to me)!

So what's the alternative? PHP and languages of that ilk use a genuine boolean type: expressions evaluate to a "true" or "false" value which is not an integer and can be assigned and passed around and used in conditionals. Much neater. Another alternative is exceptions: if it succeeds, it returns a value. Otherwise it fails. Many assembly calls on the x86 do this. Icon does this, too.

I avoid programming in C these days.

Wade.

Is it enough to love
Is it enough to breathe
Somebody rip my heart out
And leave me here to bleed
 
Is it enough to die
Somebody save my life
I'd rather be Anything but Ordinary
Please

-- "Anything but Ordinary" by Avril Lavigne.

New Yeah sure
I understand what you are saying.

It doesn't bother me that C has no boolean. The CPU has no boolean either. When I write C, I see hardware level pseudocode. This is the Zen of C.

Working at (almost) the level of the machine to produce specific behaviors in the hardware. IOW, when I'm writing in C I care as much about how something gets done as what gets done.

When I write applications in Smalltalk or Java or whatever, I don't generally care how something gets done as long as it gets done with reasonable efficiency. It's at that level that I'm happy to have a boolean and all the abstraction that comes with it. That's a different Zen.

Apart from all this, I continue to assert that C++'s boolean is the stupidest implementation of boolean ever created and only serves to add lovely plumage to the platypus.






"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Such flowery language toward such a misguided conclusion
Apart from all this, I continue to assert that C++'s boolean is the stupidest implementation of boolean ever created and only serves to add lovely plumage to the platypus.

Put aside your disdain for the name of the language and observe what is really going on for a second.

A C++ object of type bool can have exactly one of exactly 2 values: the value 'true' and the value 'false'. These "values" are not integers; they're not enumerators of type bool, either. They are proper values in and of themselves.

Now how can having a boolean type whose values can only be true and false be "the stupidest implementation of boolean ever created"?

(Hint: It can't....)
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New Yeah right
IFF C++ had a real boolean, the following statements would be compiler errors:

bool b1 = 0;
bool b2 = 1;
bool b3 = 500;

Instead, only the two following assignment statements ought to be valid.

bool proper = true;
bool sane = false;

Don't fucking allow an assignment and then change the value of what was assigned without so much as a by your fucking leave sir.

increment operators ought not to be allowed - or - ought to be paired with decrement operators for symmetry.


A C++ object of type bool can have exactly one of exactly 2 values: the value 'true' and the value 'false'. These "values" are not integers; they're not enumerators of type bool, either. They are proper values in and of themselves.


OK, so why does

cout << "false is " << false << endl;

produce "false is 0" on the output stream? Looks pretty fishy to me.

Sorry, I can't take this kind of shit seriously - not even a little. There's no consistency of behavior in the bool type - it's a really thin veneer on (depending on implementation) char, short, int, or long, and the veneer has more holes than Amish swiss cheese.

Find me a quirkier implementation of bool (in a language that has one) and I might retract my statement.

But I certainly don't know of any.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New (++true == false)
If you got increments and decrements, then I'd think they should work like the not operator (!true = --true).

Just a thought to muck with the standard some more. :-)
New Just add a little gasoline, and stir!_____;-)
Actually, I don't understand why the decrement operator is not defined on a bool. However, if I were the king of X3J16, I'd not support a wrap-around approach to the increment or decrement operators. I'd support:
false++ == true
false-- == false
true--  == false
true++  == true

(with the pre-increment and pre-decrement operators working the same way.)

I'd also not support implicit conversions to or from bool; you'd have to explicitly cast a non-bool to a bool or vice versa. (That should mitigate Todd's objections a teensy bit....)
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New OK, now I see what your problem is
...and you may have a point, but your disdain is misguided.

As I have stated earlier, the statement

bool b = 500;

works because there is an implicit, compiler-supplied conversion from type int (and all its siblings) to type bool. What you are really entering when you type the above line is

bool b = static_cast<bool>(500);

The bool type is fine; it's the implicit conversion you disdain.

Which I find interesting, to say the least, because in the past you have defended C's implicit conversions and promotions. And also because you jump up and down stating that integer values should be treated as boolean entities when placed in an if/while/for conditional clause.

So your objection to the bool type as implemented in C++ paradoxically boils down to the fact that there is an implicit conversion from int to bool, which should be OK in your world. Taken to its (il)logical conclusion, you should also argue that C++ (and therefore C) should also reject such standard things as:
float f = 1;
int   i = 1.0;
char  c = 14;
float f2 = 1.0; // the reason why is left as an exercise for the reader...

The point is that implicit conversions and promotions are just as much a part of C as is the nonsense of treating integers as boolean entities, which you so rabidly defend.
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New I thought you were going to give up on this
bool b = 500;

works because there is an implicit, compiler-supplied conversion from type int (and all its siblings) to type bool. What you are really entering when you type the above line is

bool b = static_cast<bool>(500);

The bool type is fine; it's the implicit conversion you disdain.

Which I find interesting, to say the least, because in the past you have defended C's implicit conversions and promotions.


Actually, what I am really typing is what I really typed - which is stupid and ought not to be allowed. It looks like C++ has been taking lessons from Clippy.

I don't object to C's conversions, because they are (mostly) sensible. When mixed-mode operations are encountered, the types are "promoted" to prevent loss of precision. The promotion rules are simple and comprehensible.

C++ takes an extreme approach to type conversion, which is the source of many, many surprises and bugs (the construction of temporaries and such). In practice, this has made it nearly impossible to write predictable code.

I also find the behavior inconsistent with how enums work. You can do this:

enum Traffic { RED, YELLOW, GREEN };

Traffic aTraffic = (Traffic) 7;

cout << aTraffic << endl; // this will print 7

There's no value clamping here (and thus no inadvertent loss of information).
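A minimal side-by-side sketch of the contrast being described (using the value 3 rather than 7, so the enum conversion stays inside the range the standard defines for this enumeration):

#include <iostream>

enum Traffic { RED, YELLOW, GREEN };

int main() {
    bool    b = static_cast<bool>(3);     // any non-zero value collapses to true
    Traffic t = static_cast<Traffic>(3);  // the value is kept as-is - no clamping
    std::cout << b << ' ' << t << '\n';   // prints "1 3"
}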

Finally, your comparisons with C are falling flat. C is weakly typed. C++ is meant to be strongly typed (which is why we have 5 kinds of cast operator). But it's only *sometimes* strongly typed.

Rule number 1 - never surprise the programmer.

Unfortunately, C++'s rule seems to be *always* surprise the programmer.

(Edit - clippy bit)



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
Edited by tuberculosis May 23, 2003, 11:15:41 AM EDT
New How sensible is this?!?
int      i   = 723.524;
char *   ptr = i;
int      j   = ptr;


I don't object to C's conversions because they are (mostly) sensible. When mixed mode operations are encountered, the types are "promoted" to prevent loss of precision. The promotion rules are simple and comprehensible.


Yeah, that's real "sensible", preserves "precision", and "comprehensible", all right. But it's C, so it's OK, right? Stupid things like this are OK for you if they happen in C, but in C++, they are "stupid and ought not to be allowed."

Pot. Kettle. Black.
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New Not convinced
You keep taking one little aspect per message and - yes - by itself each item isn't *so* bad. But it's cumulative and, as a complete package, the sum is stupider than the parts.

So I'm not going to bother with this anymore. You clearly have your hammer - go pound your nails (and screws, crockery, whatever looks inviting).

Bear in mind that I used to be a (really pedantic language lawyer) expert at your particular hammer. I never use it anymore (since 1997). The drawbacks outweigh the benefits and there are much better tools.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Nor am I
The only thing I keep taking in each one of your messages is the current objection, and showing where said objection is inconsistent.

I can understand how that can be annoying.

Go. Get thee hence. Write fine programs using whatever tool(s) you have available. I'll do the same.
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New You guys should be using Modula-2. :-P (new thread)
Created as new thread #103624 titled [link|/forums/render/content/show?contentid=103624|You guys should be using Modula-2. :-P]
New And an answer to your question.
OK, so why does

cout << "false is " << false << endl;

produce "false is 0" on the output stream? Looks pretty fishy to me.


For two reasons. The first is the implicit conversion thing again: unless told otherwise, the stream class converts objects of type bool to integers before converting them to text. This is a function of the stream class and not of the language (but you already knew that, didn't you...). Second, you want text? Try:

cout << boolalpha << "false is " << false << endl;

In other words, the display of bool values is divorced from their internal representation, just like using any formatter in a printf statement. I can make floats look like integers, too...wanna see?
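For reference, a self-contained version of that comparison:

#include <iostream>

int main() {
    std::cout << "false is " << false << '\n';                     // prints "false is 0"
    std::cout << std::boolalpha << "false is " << false << '\n';   // prints "false is false"
}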

If your objection to bool is based on formatting decisions by an I/O library, that's pretty weak!
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New Wrong answer
the implicit conversion thing again: unless told otherwise, the stream class converts objects of type bool to integers before converting them to text.


OK, why? The stream class has an overload on type bool - so it's not the implicit conversion thing.

More telling - this default tells me that the bool type isn't quite viewed as a non-integer by the implementers of the language and libs. IOW, they don't buy their own BS about bool not being some indeterminate integer type with stupidly defined operators.

If your objection to bool is based on formatting decisions by an I/O library,


No, it's the package. The whole thing is a hack job and a bad one at that. Better to leave it off. What was so broken by not having this abortion in the language?



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Wrong answer back
OK, why? The stream class has an overload on type bool - so it's not the implicit conversion thing.

Todd...first, the stream class has an overload on operator <<, not on the type.

Second, the operator << overload knows that the type of the value it is being passed is bool, and based on the settings of that stream object's attributes (like, for example, the manipulator settings in effect at the time), determines a rendering. Unless told otherwise, it casts it to an int, and displays the number.
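As a rough illustration of that mechanism (the Flag type and its inserter below are invented for the example, not anything in the standard library), an overloaded operator<< can inspect the stream's formatting flags and choose a rendering accordingly:

#include <iostream>

struct Flag { bool value; };

// The inserter consults the stream's boolalpha flag to pick a rendering.
std::ostream& operator<<(std::ostream& os, Flag f) {
    if (os.flags() & std::ios_base::boolalpha)
        return os << (f.value ? "true" : "false");
    return os << static_cast<int>(f.value);
}

int main() {
    Flag f{true};
    std::cout << f << '\n';                      // prints 1
    std::cout << std::boolalpha << f << '\n';    // prints true
}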
More telling - this default tells me that the bool type isn't quite viewed as a non-integer by the implementers of the language and libs. IOW, they don't buy their own BS about bool not being some indeterminate integer type with stupidly defined operators.

Actually (and this answers the rhetorical "OK, why?" from the first citation), the default has much more to do with the rules for outputting locale-neutral text than with any other reason. P.J. and his team have always been very reticent to spontaneously spout text, because there is always the problem of the non-English-speaking audience (which is a larger audience than the English-speaking audience). The fact that there is a native, built-in way to get a locale-specific textual representation of a bool gives the lie to your assertion that "the bool type isn't quite viewed as a non-integer by the implementers of the language and libs." (Oh, and can we finally get past the red herring that the language and the library are both required to make up "C++", or are you now going to argue that SWING and AWT are integral, inseparable parts of Java?)
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New They've turned it into Pascal
I never thought boolean types were of any value at all, other than to complicate things. 0 is false - that much is certain, and that's what you need, a definition of certainty. Note that true can be -1 or 1 - or really anything that is not 0. The most useful definition is true=-1, because it means the biggest unsigned integer - all the bits = 1. Whatever the number of bits, -1 will always be the state with all of them on. What's the point? Boolean really should mean 2s complement arithmetic. It's not a type, it's an algebra.
-drl
New Circular definition.
Ross vents his usual frustration with anything invented after 1969:
I never thought boolean types were of any value at all, other than to complicate things.
That's because you're a fuckwit.


0 is false - that much is certain, and that's what you need, a definition of certainty.
Only if you're a religious nutcase... But I digress.

Back on track: So you only need "a definition of certainty" for FALSE, but NO "definition of certainty" for TRUE? Why is that? Where's the logic in it?


Note that true can be -1 or 1 - or really anything that is not 0.
Yeah, that's SOOO "certain" and un-"complicated".


The most useful definition is true=-1, because it means the biggest unsigned integer - all the bits = 1. Whatever the number of bits, -1 will always be the state with all of them on.
Only if you run it on a twos-complement processor. Or did you think that's somehow a Law Of Nature, or something...?


What's the point?
Good question; personally, I don't think you have one, except hanging out your crankiness.

(Which, in your case, usually means hanging out your crank... And stepping on it.)


Boolean really should mean 2s complement arithmetic. It's not a type, it's an algebra.
Only if you DEFINE it in the idiotic C way (and run it on a twos-complement processor).

Circular definition much, fuckwit?


   [link|mailto:MyUserId@MyISP.CountryCode|Christian R. Conrad]
(I live in Finland, and my e-mail in-box is at the Saunalahti company.)
Your lies are of Microsoftian Scale and boring to boot. Your 'depression' may be the closest you ever come to recognizing truth: you have no 'inferiority complex', you are inferior - and something inside you recognizes this. - [link|http://z.iwethey.org/forums/render/content/show?contentid=71575|Ashton Brown]
New Re: Circular definition.
Because false implies the possibility of true. Otherwise there is no meaning to false. Therefore the Boolean "type" is superfluous. What you need is false and not false. You could turn it around of course, and call 0 "true". This isn't so odd. When a return code is 0, that's good and it means the function worked "truly". Do I need to explain it more?

(PS: I'm listening to Steppenwolf 7 - "Renegade - Foggy Mental Breakdown - Hippo Stomp". That was invented in 1970.)
-drl
New Self-contradiction, and logically inconsistent definition.
Ross exposes the depths of his (and C's) illogic:
Because false implies the possibility of true. Otherwise there is no meaning to false. Therefore the Boolean "type" is superfluous.
So, following that line of (what I will perhaps too charitably call) reasoning: "Because the existence of an integer i implies the existence of the next one, i+1, the integer type is superfluous."

That's funny -- down here on Earth, it's usually accepted that precisely *because* integers behave in one way that is particular to them and not to anything else, that's why you *do* need (or at least, want) a specific type "integer".

(Extending a parallel to the behaviour of true/false values and a boolean type is left as an exercise for the reader with brains bigger than his haemorrhoids.)


What you need is false and not false.
Exactly. And since "not false" _I_S_ true, this means that what you need is false and true.


You could turn it around of course, and call 0 "true". This isn't so odd. When a return code is 0, that's good and it means the function worked "truly".
Actually, while C *doesn't* turn it around in if statements (i.e., 0 is "false" there), in function return codes it *does* work exactly as you say! So on the one hand, 0 is "false", but AT THE SAME TIME it means "worked TRULY". And you claim any *change* to this illogical piece of shit is the problem?!? You need to get your head examined, man!

Alternatively, you (and Todd) could just admit that you don't really give a shit about all the logic and consistency you're *talking* about, but just don't want to accept that anything you learned twenty-five years ago could possibly not have been the ultimate pinnacle of reason and sense you once thought it was. Because that _I_S_ where the real problem is for you two, isn't it?


Do I need to explain it more?
Don't try to be condescending to me, Bubba -- it only works *downwards*.


(PS: I'm listening to Steppenwolf 7 - "Renegade - Foggy Mental Breakdown - Hippo Stomp". That was invented in 1970.)
Yeah, well, "listening" -- but you're probably listening to it with utter disdain.

(BTW, _Der Steppenwolf_ (the one from 1927, that is) is way over-rated, AFAICS.)


   [link|mailto:MyUserId@MyISP.CountryCode|Christian R. Conrad]
(I live in Finland, and my e-mail in-box is at the Saunalahti company.)
Your lies are of Microsoftian Scale and boring to boot. Your 'depression' may be the closest you ever come to recognizing truth: you have no 'inferiority complex', you are inferior - and something inside you recognizes this. - [link|http://z.iwethey.org/forums/render/content/show?contentid=71575|Ashton Brown]
New Can someone start a new thread please?
===

Implicitly condoning stupidity since 2001.
New What for, aren't the long ones the best?
New Comments on supposed idiocy

Examples of idiocy:
[...]
cout << b << endl; // prints 0 - why not false? You can't change it either because ostream::operator<<(bool b) is defined as a member function of ostream.
At least it is consistent. For example ...

 enum Color {RED, GREEN, BLUE};
 cout << GREEN << endl;           // Also prints 1
However, I will agree that the decision to print 0/1 for false/true is ... surprising.

Continuing ...
bool b = true; // fine
bool b = 5; cout << b << endl; // prints 1?!?!
That's right. Bool (despite your declarations to the contrary) is not an integer type.
b = false;
cout << b++ << ' ' << b++ << ' ' << b++ << ' ' << b++ << endl;

this prints 0 1 1 1

interestingly, b-- won't compile; the operator is not permitted. So if you were going to use it as a use counter, bool offers surprising behavior.
Yes. This is a concession to backwards compatibility with existing code. But you know this. No one is claiming this is the correct way to define bools in a green field language.
Bottom line - bool is a quirky POS that you don't really need if you just accept that you can do your conditional branching using any convenient integer type.
This seems an odd statement. Of course you don't need it. You don't need enums either. You don't need strings. There are a lot of things you don't need.

However, bool is a common abstraction. One that programmers often provide themselves. Unfortunately, it is not possible to provide a user defined bool type in C++ that has the following behavior ...
void f(int i) { cout << "Integer version called"; }
void f(bool b) { cout << "Bool version called"; }

int main(int argc, char** argv) {
  f(argc == 0);   // Prints "Bool version called"
}
Before bool was defined as a built in type, the integer version would have been called.
So don't try to tell me that this is a good idea. It's not. It's pretentiousness at its very worst.
Wow, strong language. It seems to me that, given the design constraints (backwards compatibility, the desire to overload on bool), the choices were far from as bad as you paint them.
--
-- Jim Weirich jweirich@one.net [link|http://w3.one.net/~jweirich|http://w3.one.net/~jweirich]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
New Re: Comments on supposed idiocy
It seems to me, given the design constraints (backwards compatibility, desire to overload on bool) the choices were far from as bad as you paint it.


I understand the design forces at work that led to this mess. Incrementing a flag as a one-way latch was a common idiom in a lot of older code.

What I really question is whether the increased complexity is worth it. I don't think it is. They had a choice between not having a bool and leaving the branching consistent with C, or adding a proper bool type and breaking a bunch of code.

The gutless wonderdogs did neither. This is my beef.

I dislike inconsistency and C++ has it in spades. The whole thing is a sign of muzzy-headed thinking and committee dynamics.

Compare C++ to Objective C and the amateurishness of C++'s design shines like the top of the Chrysler Building.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Don't even get me started
on the idiocy of the bool type.

Its implementation and promotion rules are just dumb.

Repeat after me - there is no bool type, there is no bool type, there is no bool type.

Every idiot C programmer that ever typedef'd a bool (or worse, #define'd TRUE or FALSE) should burn in hell.

0 is false.
!false is true.

Live with it.

As for polymorphism - you can make your own in C with about the same amount of work.
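A rough sketch of what "making your own" looks like, written in the common C subset (every name below is invented for the example): a struct of function pointers standing in for a vtable, with dispatch done by hand.

#include <stdio.h>

typedef struct Shape {
    double (*area)(const struct Shape *self);   /* hand-rolled "vtable" entry */
} Shape;

typedef struct {
    Shape  base;      /* the "base class" must come first */
    double radius;
} Circle;

static double circle_area(const Shape *self) {
    const Circle *c = (const Circle *)self;
    return 3.14159265358979 * c->radius * c->radius;
}

int main(void) {
    Circle c = { { circle_area }, 2.0 };
    Shape *s = (Shape *)&c;           /* "upcast" */
    printf("%f\n", s->area(s));       /* dynamic dispatch by hand */
    return 0;
}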



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New You didn't mention types of programs



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Sorry, thot I was clear earlier...
..We're doing embedded processing (and by that, I mean "real" embedded stuff, not Micros~1's bastardization of the term). Realtime data acquisition and control stuff, with a microprocessor of some sort (like the sort I mentioned) running things.

For this, I've got some real good C95 compilers, and some ANSI (and not-so-ANSI) C++ compilers. I haven't seen a C99 compiler in this space yet. (In fact, I haven't seen a C99 compiler in any space yet; I don't think either Micros~1's or Borland's are C99-compliant yet).

No Smalltalk.
No Objective C (which I would expect to be the first).
No Python or Ruby.
Yet...
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New Still doesn't tell me enough
What is your memory budget? Persistent storage? Is there a file system? Do you drive a display? What about a network interface? If you have a couple megabytes, you can likely use your C compiler to bring up squeak and use Smalltalk (one guy has a 400k image working). So just because nobody ported it to your machine doesn't mean you can't use it. The tiny smalltalk guys use a pair of executables - a tiny executive on the device and a big fat dev environment on a regular machine. They push down little clusters of serialized objects to the executive to update the exec.

This is more likely to succeed than porting the ObjectiveC runtime I think.

But its up to you - all you need is a C compiler and you can run whatever you like in there - you just have to port the machine yourself. Most Squeak ports take a week or so but that's for a full blown machine with video, network, file system access, keyboard and mouse. You can skip any of these or implement them in new ways.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Re: Still doesn't tell me enough
What is your memory budget? Persistent storage? Is there a file system? Do you drive a display? What about a network interface?

Whatever the device will support. (1MB for the 80186, 64K for the HJC1x or the 8051...you get the picture).

EPROM or Flash. We partition the total memory address space between ROM and RAM.

Not usually. I did recently get to work on a box that supported QNX as its OS/Kernel, and we used a 486 in protected mode. A "file system" was supported (the flash was made to look like a file system). Such luxuries are rare, however....

Yes, generally an LCD of insufficient size ;-) . sometimes color, often monochrome, never using standard parts (except for that QNX box I mentioned earlier...)

In your (and my) dreams....

So what I hear you saying is that we download a VM-equivalent for Smalltalk, then download class definitions, and away we go?
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New The VM's are all written in very portable C
you have to port them if you want to use them. There's a reason Squeak runs binary equivalent on Windows, Linux, Solaris, Macintosh, Mac OS X, the Sharp Zaurus, and some other little handheld gadgets. The core VM is portable but there are stubs you need to do hardware/filesystem/network/display/input device interfacing.

FWIW, there have been a number of embedded Smalltalk projects over the years.

You might not be able to afford Squeak on your smaller devices. You may be able to afford PocketSmalltalk though. There is some info here: [link|http://www.iutc3.unicaen.fr/serge/62|http://www.iutc3.unicaen.fr/serge/62] or [link|http://www.pocketsmalltalk.com/|http://www.pocketsmalltalk.com/]




"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Re: The VM's are all written in very portable C
I've got a spare 2gig laptop disk that was intended as a testing place for "FORTH OS" so I think I will play around with this.

Has anyone ever tested Smalltalk as an extension of the FORTH VM? Kinda like running win or startx from the command prompt. FORTH is paradoxically fast in some situations, like moving data (probably why it's good for graphics). You could have a completely portable hardware interface in FORTH.

[link|http://www.zetetics.com/bj/papers/moving1.htm|http://www.zetetics....apers/moving1.htm]

How would ST map onto this?
-drl
New Funny you should mention it
Just ran across a thread on Squeak list talking about changing a couple low level operations in the VM to support something more like IDT on message sends to allow any object to be used as a CompiledMethod (the object that represents executable code). Which raised the following comment:


Incidentally, does anyone know of prior art for doing that kind
of jumping from VM to meta-interpreter and back, in Smalltalk or other
similar (OO, bytecode) systems? There's a company doing exactly that to
enable call/cc in Java, and I'm curious if their patent application has
any validity.
------
My first thought was that your question reminded me of indirect-threaded
Forth implementations, which is the classic style of Forth
implementation. In an indirect-threaded Forth, each "subroutine"
(called a "word" in Forth) begins with the address of its interpreter.
Also, consider the similar, but slightly different, direct-threaded
Forth implementation style where every "word" begins with machine code
which is the interpreter for that "word" (it is typically a jump or call
to the interpreter for that class of Forth "word").

Maybe that could be seen as prior art, without needing to squint too
hard? I suppose this goes back to the late 1960s or early 1970s. There
was a nice article in Byte (1980? or early 1980s) with a title something
like "Threaded Interpreters" which went into some detail about
indirect-threading, direct-threading, subroutine-threading, and
token-threading.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Re: Funny you should mention it
Well, the ideal system to me would be:

1) A FORTH VM for hardware abstraction

2) A Smalltalk windowing environment. If the basic Smalltalk ideas
map onto FORTH constructs like code and parameter fields, and
threading, and dictionaries, then it's a done deal to write Smalltalk in FORTH.

3) An APL system coded in Smalltalk. This would be an acid test
because APL HAS to be very fast at moving data, which is what
really takes the most time.
-drl
New Minor modification
It came down to both of us agreeing: if you need the benefits of C, use C, not C++; [...]


Actually, I'd posit that if you need the benefits of C, use C++ as "a better C than C". Its stronger typing and better use of primitive typing (a char is a char, not a dwarf int; a float is a float, not a dwarf double; the bool type, etc.) make it a stronger candidate for C-type stuff. Its detractors never mention that possibility.
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New Just because *you* don't see it...
...don't mean it ain't there.

A lot of vertical market health software is written in Pascal; similarly, an awful lot of the software that drives the road management systems is written in Pascal.

Niche markets, to be sure, but that's one honkin' big pile o'code.


Peter
[link|http://www.debian.org|Shill For Hire]
[link|http://www.kuro5hin.org|There is no K5 Cabal]
[link|http://guildenstern.dyndns.org|Blog]
New In fact..
..Java is just the afterbirth of UCSD Pascal - the "P" system - so it's an even better analogy than one just based on fashions and trends. In fact I think Joy helped create the miserable P-system as well.
-drl
New Heh...
...I was wondering if it was just me who saw that....

Thanks, Ross.
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
New Re: Heh...
The motivation for the P-system was lack of standards in PCs. Those were the days of the Altair and Heathkits and a lot of other very heterogeneous hardware. The reason for it evaporated when systems became standardized. The fact that the Linux kernel runs on everything and is (more or less) easily ported shows that the argument for Java is just as outdated. Even vastly differing types of hardware operate so similarly that the issue of portability is moot.

Not that the idea of a VM environment is wrong - just that the VM should not be a stupid process, but something low-level that can talk directly to hardware.
-drl
New BS
The fact that the Linux kernel runs on everything and is (more or less) easily ported shows that the argument for Java is just as outdated. Even vastly differening types of hardware operate so similarly, that the issue of portability is moot.
Having just spent a few months doing a port from Solaris to Linux, I can safely say that you are full of shit.
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
New BS
We were talking about the Java VM, and systems level things, not an application. Did you ever have an application port that was easy? The variables are too fluid. Systems level porting has a rigid target.

I refuse to let you piss me off any more.
-drl
New When I see you spouting it, I'm going to call you on it.
If that pisses you off, I don't care.

Stop with the hand-waving bullshit ignorant generalizations, and maybe people will stop calling you an idiot.
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
New Fair enough!
-drl
New Do you have a clue why Linux is easily ported?
Apparently not, because it isn't that the hardware is all standardized. At least not according to Linus Torvalds...

Cheers,
Ben
"good ideas and bad code build communities, the other three combinations do not"
- [link|http://archives.real-time.com/pipermail/cocoon-devel/2000-October/003023.html|Stefano Mazzocchi]
New Re: Do you have a clue why Linux is easily ported?
The *methods* (memory management, file systems, networking..) are more or less standardized, as Linus himself said, here:

[link|http://www.oreilly.com/catalog/opensources/book/appa.html|http://www.oreilly.c...es/book/appa.html]

[link|http://www.forwiss.uni-passau.de/archive/linux/personen/interview.html|http://www.forwiss.u...en/interview.html]

[link|http://www.linuxworld.com/linuxworld/lw-1999-03/lw-03-opensources.html|http://www.linuxworl...-opensources.html]

Quote:

"The Linux kernel isn't written to be portable to any architecture. I decided that if a target architecture is fundamentally sane enough, and follows some basic rules then Linux would fundamentally support that kind of model. For example, memory management can be very different from one machine to another. I read up on the 68K, the Sparc, the Alpha, and the PowerPC memory management documents, and found that while there are differences in the details, there was a lot in common in the use of paging, caching, and so on."

Exactly what I claimed.
-drl
New No, that is not quite what you claimed
You claimed that portability is a moot point.

Which is obviously wrong, because plenty of people are out there demonstrating that portability is very far from moot.

What Linus knows is how to achieve portability. As happens with many well-designed solutions, what you have to do makes the problem so transparent, that it is easy to miss that anything was done.

See [link|http://kt.zork.net/kernel-traffic/kt20000501_65.html#5|this] for a longer explanation of how you achieve portability. Then re-read the page that you quoted from. That is what Linus is doing.

Just in case someone missed it, here is how it works. What you do is define a simplified idealized model. Program to that model. For each architecture, supply compatibility macros so that the ugly details of that architecture look like that model. Except in a general outline, the architectures need not work the same way. This approach allows you to hide that fact in a clean way, with the ugly details hidden away nicely in an unobtrusive fashion.
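A toy sketch of that pattern (every name below is invented for the example; none of this is actual kernel code): generic code targets one idealized operation, and each architecture supplies a macro that makes its own mess look like that operation.

#include <cstdio>

#if defined(ARCH_FOO)
  // Pretend architecture FOO flushes its TLB through one special routine.
  #define flush_tlb()  foo_flush_instruction()
#elif defined(ARCH_BAR)
  // Pretend architecture BAR has to go through a firmware call instead.
  #define flush_tlb()  bar_firmware_flush_tlb()
#else
  // Portable fallback: a no-op stub so the generic code still builds.
  #define flush_tlb()  ((void)0)
#endif

// The generic code is written once, against the idealized model only.
void switch_address_space() {
    // ... update page tables ...
    flush_tlb();   // the ugly per-architecture detail hides behind the macro
    std::puts("address space switched");
}

int main() { switch_address_space(); }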

However if you attempted to do the same thing using a different design, then you would very quickly find out that the differences are not minor, and portability is very, very far from being a moot point.

Cheers,
Ben
"good ideas and bad code build communities, the other three combinations do not"
- [link|http://archives.real-time.com/pipermail/cocoon-devel/2000-October/003023.html|Stefano Mazzocchi]
New Well, to me
-drl
New Well, to me "moot" means..
..solved. Something becomes moot once a way to solve the issue has occurred. If you keep arguing when a valid solution is at hand, then the argument is not about the original issue. So any points you make are "moot" - it's argument for its own sake. Isn't that what the word means?

Let's see:


Usage Note: The adjective moot is originally a legal term going back to the mid-16th century. It derives from the noun moot, in its sense of a hypothetical case argued as an exercise by law students. Consequently, a moot question is one that is arguable or open to debate. But in the mid-19th century people also began to look at the hypothetical side of moot as its essential meaning, and they started to use the word to mean "of no significance or relevance." Thus, a moot point, however debatable, is one that has no practical value. A number of critics have objected to this use, but 59 percent of the Usage Panel accepts it in the sentence The nominee himself chastised the White House for failing to do more to support him, but his concerns became moot when a number of Republicans announced that they, too, would oppose the nomination. When using moot one should be sure that the context makes clear which sense is meant.


Yep, that's what it means - argument for its own sake.

So, it turns out we agree, but you didn't understand what I was saying.

BTW I DO agree with everything you said in the above post. Not that it means anything to you.

(edit: KDE3's Klipper apparently has cut/paste issues.)
-drl
Edited by deSitter May 14, 2003, 03:55:04 PM EDT
New Why does your position appear to be shifting?
At first you argued that portability was a moot point because all hardware was pretty much the same.

Now you are arguing that it is a moot point because there is a known strategy for achieving it which works (if you have sufficient knowledge and discipline to apply it properly), despite the fact that the hardware is not really the same.

Other than the fact that you are drawing the same conclusion both times, the two arguments do not actually agree.

Regards,
Ben
"good ideas and bad code build communities, the other three combinations do not"
- [link|http://archives.real-time.com/pipermail/cocoon-devel/2000-October/003023.html|Stefano Mazzocchi]
New Modus operandi
Hand-wave until someone calls you on it, then bullshit your way towards an "agreeable" position.
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
New Re: Modus operandi
Tell me exactly what has shifted in my position between now and when I posted about it years ago. You can search Karsten's archives (I can't). You'll find a thread somewhere where I 1) pointed out the LinuxWorld article, 2) waxed ecstatic that someone knew what he was doing, and 3) generated a similar flamewar, most likely.

-drl
New Re: Why does your position appear to be shifting?
Not at all. My position is exactly as quoted by Linus, in the LinuxWorld article, and in the exchange with the useless academic Tanenbaum, because when I read those back when, they made a large impression on me. He basically made a theory of portability and then implemented it - just as you said. All the talk of portability after this is moot - because it was based on wrong assumptions. "Moot" - for the sake of argument alone, because the reality is otherwise.

Whatever, in any case.
-drl
New Wasn't Pascal written as a teaching tool?
I can't remember where I heard that, but I distinctly remember hearing that it was designed to teach programming concepts and it kind of grew into a "real" language. That was the first language I had formal instruction in. The only one if you discount the Fortran class I took and never used anything from.

Stuff I've written from scratch still looks a lot like what I wrote for that Pascal class: a short (like 15 lines or less) main program at the top, each line of which calls a function or procedure. I like being able to read the first screen worth of a program and know basically what it does.
===

Implicitly condoning stupidity since 2001.
New Yes
see for example [link|http://www.engin.umd.umich.edu/CIS/course.des/cis400/pascal/pascal.html|The Pascal Language Page]

"His principle objectives for Pascal were for the language to be efficent to implement and run, allow for the development of well structured and well organized programs, and to serve as a vehicle for the teaching of the important concepts of computer programming " (emphasis added).
New Re: Wasn't Pascal written as a teaching tool?
drewk [...]a short main program at the top [...]

Wait a minute! I thought the main program in standard Pascal was always at the bottom of the listing.
--
-- Jim Weirich jweirich@one.net [link|http://w3.one.net/~jweirich|http://w3.one.net/~jweirich]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
New Re: Wasn't Pascal written as a teaching tool?
Wait a minute! I thought the main program in standard Pascal was always at the bottom of the listing.
Depends on the status of your love/hate relationship with forward declarations...
-YendorMike

[link|http://www.hope-ride.org/|http://www.hope-ride.org/]
New Forward Declarations
Yendor: Depends on the status of your love/hate relationship with forward declarations...

Hmmm ... as I recall the structure of a Pascal Program, it looks like this ...
  program Prog
    procedure x
      begin
      end
    procedure y
      begin
      end
  begin
    (* main program goes here *)
  end.
The forward declaration could be used to allow procedure x to call procedure y (otherwise mutual recursion is really difficult), but all procedures still have to come before the main (at least in standard Pascal ... I'm sure most people didn't write programs in purely standard Pascal).
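The same idea in C++ terms, for anyone who never met it in Pascal: a forward declaration lets two functions call each other even though one of them is defined later.

#include <iostream>

bool is_odd(unsigned n);   // forward declaration

bool is_even(unsigned n) { return n == 0 ? true  : is_odd(n - 1); }
bool is_odd(unsigned n)  { return n == 0 ? false : is_even(n - 1); }

int main() { std::cout << std::boolalpha << is_even(10) << '\n'; }   // prints true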

Then again, maybe my memory is just deficient. (They say memory is the second thing to go ... I forget what the first is).
--
-- Jim Weirich jweirich@one.net [link|http://w3.one.net/~jweirich|http://w3.one.net/~jweirich]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
New Hmm, was Turbo Pascal different about that?
===

Implicitly condoning stupidity since 2001.
New Been too long
Hmmm ... as I recall the structure of a Pascal Program, it looks like this [...] but all procedures still have to come before the main (at least in standard Pascal ... I'm sure most people didn't write programs in purely standard Pascal).
I won't claim to remember the required structure of a Pascal program, as it's been about 10 years since I've even seen one. I do remember that I first learned about forward declarations in Pascal. *shrug* Perhaps the main *does* need to go at the end. That is where ISTR putting most of my main()s, anyway...
-YendorMike

[link|http://www.hope-ride.org/|http://www.hope-ride.org/]
New Nope, you're right.
Jensen/Wirth Pascal required the "main" at the end, after the local procedures/functions.

J/W Pascal didn't support the concept of an include file either.

Turbo Pascal took care of those...er, "oversights"....

[Edit: clarified what I was referring to]
jb4
"We continue to live in a world where all our know-how is locked into binary files in an unknown format. If our documents are our corporate memory, Microsoft still has us all condemned to Alzheimer's."
Simon Phipps, SUN Microsystems
Edited by jb4 May 14, 2003, 01:38:44 PM EDT
New Not when I learned it
Or not the way I wrote it anyway. Like I said, I like to open a file, read the top, and know what it does.
===

Implicitly condoning stupidity since 2001.
New Same bandaid as C++ templates
More or less. It's actually a little more limited wrt handling primitive types.

But I think we've all seen how effective this solution is (not terribly).




"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Re: The awakening begins
Hi, this post may seem too long or off-topic at first, but I mostly
want to use this typing debate to elaborate on my problems with learning how
to program.
You should know I don't consider myself a programmer yet; I have written almost no programs (though I have of course tested and played with code and examples from the books I read).

What's really creating the crisis/delay/problem/barrier for me is that
I usually try to figure out the big picture of things before I bother with
details. In creating a program this is not always possible; sometimes you
have to start by creating the details, until you have concepts large enough
that you can use them to figure out the big picture, or what a program will finally
look like.
Plus, creating any significant program requires lots of library study and trial-and-error work, which, compared to learning a new language, is very boring and feels like labor!

Anyway, once I decided to learn how to program, I faced the issue of
choosing a language to learn. My first attempt was Java.
Why?
The Java SDK was free, there were several free IDEs (NetBeans, Eclipse, etc...),
plus J2EE looked like an impressive framework...

I bought Learning Java in 21 Days (3 weeks), read the first week and a
few chapters from Thinking in Java by Bruce Eckel, and gave up on Java!!!
The books do an extremely poor job of keeping you excited about the
wonderful skill/ability to create a functional program.

I decided to look for another language... by that time I had gotten into
Linux and got a clear view of GNU, the FSF and the whole thing.

Most of the programmer communities seem to have forgotten what it was like before
they knew this world existed.

I decided to learn about the cult: vim vs emacs, KDE vs GNOME, Python vs Perl...
I do believe that those debates are as serious as it gets.

Comparing software is serious.
We are supposed to learn how to make software;
we should know a good program when we see it,
and one way or another some programs should be "by design" better than others.
Plus, we are talking free software here: we don't just get to see what the
program does, we also get to see how it was built. I don't recall ever seeing
an article comparing the source code and source code design choices of two
programs that do the same thing.
What is good code? How do I know it when I see it!!!!

Some people call these comparisons religious or flame wars...
Of course, if we are going to settle for "vim sux, emacs rules" then yes,
this is shallow, but if we start discussing what's the best UI
(or human interface, as some like to call it) we might really get somewhere..
Plus maybe they have a point: preferring KDE or GNOME can be purely a matter
of taste (unless one offers a basic need/function that the other doesn't),
but on the source code level, which has better code, and why?

Anyway, after reading a dozen lang-X vs lang-X comparisons....
I picked Python.
Perl had a nasty and scary reputation for being write-only.
Python had more free books online (at the time I didn't know how big the perldocs were, probably because I had mainly favored Python since my Win98 days).
And Python also had Zope, which looked impressive. (To this day I haven't learned what Zope really does or how it does it, yet it was mostly the determining factor in my choice to learn Python over other languages.)

I downloaded several books, read some, got bored again !!!

C followed Python. I had already taken two programming courses at uni in which we used C; it is low-level compared to Java and Python,
and it is not object oriented.
It was fun and challenging until I finished learning the semantics.

By that time I had already heard of OCaml and Objective-C,

so I started considering picking up one of them!
For different reasons I picked OCaml, mainly because it was a functional
language, and I thought that I needed to be introduced to that paradigm.

Aahh, the programming paradigm. I was introduced to that topic
by a post made by Ben Tilly on PerlMonks, and I kind of found out
about this forum from his home node @ perlmonks. (Thank you!)

Besides traditional functional programming, OCaml also introduced me to types
and to the notion of comparing classes to modules, not just to types!

All the time I was learning, I was always trying to philosophize about programming and OOP: what are objects? What should a program be like?
(First I design my objects, then I design possible scenarios where those
objects will send messages to each other, or what? And those scenarios will be objects themselves, right? I don't know, but the stuff I read makes it seem that way to me!)

There is a strong barrier for people who want to learn how to program
that many books ignore, and that is: how do I really create a program, divide it into
modules, create the main function/loop?
After I learn how to create functions and classes, now what? How should they play together?
Maybe I am stupid or avoiding the obvious.

Anyway, I do believe that the CS literature (the sum of books written on the topic) is full of irresponsible writers and very poorly written books!

Static typing, good or bad (why is that even an issue today, in 2003)?
Shouldn't the CS literature be mature enough to objectively answer this question!!!

==Types==
Let's move to my attempt to understand and philosophize about types.

Why do many people insist on ignoring the fact
that integers, as a group of acceptable values that have a useful meaning,
are fundamentally different from... let's say structures (or lists or hashtables)?
They are so very different that I don't think we should call them the same
thing! (types)

==C==
int a = 1;
// a can only hold a value in a certain range

int a = 'a';

This would pass, but is 'a' an acceptable value for an integer? No.
But the C compiler doesn't mind, because 'a' is actually a binary value that
can in turn be meaningful as an integer!

But 'a' is not an integer type, not in this world; only in the computer world, for sure!!

==OCaml==
OCaml (which is strictly typed) allows you to do this:
type day = Mon | Tue | Wed | Thu | Fri | Sat | Sun ;;

Now any variable known to be of type day can only have one of those values!

==C==
You can emulate this in C by doing:
typedef enum { Mon , Tue , Wed , Thu , Fri , Sat , Sun } weekdays_t;

But we know that weekdays_t is actually a collection of integers; in OCaml it isn't - it's another type, and we can never treat Mon as 1 in OCaml!!!
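(As an aside not raised in the original post: later C++ - the scoped enums added in C++11 - gets partway toward the OCaml behaviour described here, since the enumerators no longer convert to int silently. A minimal sketch:)

enum class Day { Mon, Tue, Wed, Thu, Fri, Sat, Sun };

int main() {
    Day d = Day::Mon;
    // int n = d;      // does not compile: no implicit conversion to int
    // Day e = 1;      // does not compile: no implicit conversion from int
    int n = static_cast<int>(d);   // the conversion has to be spelled out
    return n;
}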

==Objects==
Now, objects are a structure (that is of special value to us, so in
that sense it is a value)!

{
Number = 1 ;
Character = 'A' ;
String = "something";
}

But of course, we always had structures in C, so what's the fuss??

==Objective-C==
The free book from Apple does a great job of explaining OOP concepts, dynamic typing, binding and more...
especially messaging!

Objective-C says (or so far it does; I am at page 92 in that book) that
data are hidden inside an object, an object has methods,
and you can send an object a message to request a value (function)
or perform a task (procedure). If the object encapsulates a value that you
would like to know, you must create a method that retrieves it and presents it
to you!
I think this goes in line with OCaml when it compared classes to modules,
and with the Objective-C book when it said that classes are a true logical piece
of the program (unlike modules).

<!---
On a side note:
you can use objects to emulate closures!!!
Very interesting; I wonder, doesn't that mean that a language with objects
would be making redundant efforts by adding closures?
---!>

==Back to the main topic==

So where does static typing fit in all that? How can we discuss it in a way better than "static typing suxxxxx, dynamism ruuuulllllllleeeeeeeesssssssssssss",
or "because it lets me write shorter code", or "because it makes writing code easier"?
(What can the average person who wants to be a decent programmer one day learn from that?)

I really don't think I know; I don't think I've got the big picture yet, but I will give it a shot anyway:
C is said to be a static and weakly typed language.
OCaml is static, strict, polymorphic, compiled.
Objective-C is OO, dynamic, compiled.
Python is interpreted, dynamic, OO.

They all work. So it's like <!--- BLANK, WHITE, VOID ---!> for me.
Does strict typing allow for less testing? (If so, isn't that good?)
Wouldn't that mean shorter test sessions and less time to make software,
which can be seen as a benefit?

I read this in the Objective-C book:
NSObject is an abstract class that is not useful by itself,
because it does nothing in particular!

And I think that's what type checking can do for a new programming student:
it helps him construct programs that do something specific and useful, rather than chasing shadows!

Most of those top-notch programmers who say "I don't need static type
checking" are probably skilled enough not to make the mistakes that static typing highlights!
And they enjoy the flexibility of "all structural types are generic!"
A list in Python can hold a combination of arbitrary types.
(I want to ask experienced programmers when that is useful.)

So, if I may, I see that the current popular terminology defines that:
(...) type checking done at compile time is called static
(...) type checking done at run time is called dynamic

And if I may, I would like to create my own terminology, which I think
describes the situation better:
(...) type checking done by the compiler is called automated type checking
(...) type checking done at runtime is called manual type checking

So maybe we have a problem, because manual type checking performed by a human
usually leads to better (as in more flexible) results than that done by
the machine!
This is sad; the solution should not be to recommend manual type checking,
but rather to create better software type checkers.
If the process of verifying which objects are accepted by a method and which objects are rejected is structured enough, we can theoretically create software
that identifies them and reports them (as they are) to the programmer.
I think this should be part of the compiler's job.

This already exists, and is known as type inference (check OCaml).

If certain aspects of the type are irrelevant, then the compiler reports just
that (in OCaml we have the 'a list type, which is a list of anything - that is, of any one kind of thing).
It's good that the compiler points that out to you, right?

===Conclusion===

I don't think the problem is static typing;
I think the problem is how we do static typing,
and more importantly how we seem to fail to identify types.

If a function calls on specific attributes of a type,
for example: obj.walk(),
the type here should be partially defined:
obj should be typed as "any object that has the method walk()".

And this should be made clear in the method or function signature.
Of course, many statically typed languages won't let you do this,
and the solution there, as ToddBlanchard said, is to refactor (I assume
this means redesign) your class hierarchy to have an abstract type
that has just enough attributes to pass through this method successfully.

The dynamic, manual solution won't have this problem, since it allows a JIT kind
of type checking.
But if an object has to pass through this method and unfortunately
generates a run-time type error, the solution here would be to read the method implementation to see which attribute is missing, and add it to the object that must use this method,
which still seems more awkward to me than successful (partial) static typing.
And this means that here you have to completely reveal the implementation
details, whereas in automated type checking the implementation is only partially
revealed.
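(An illustrative aside, not something from the post above: C++ templates do give a form of this "any object that has walk()" typing at compile time, though the requirement lives in the template body rather than in the signature. A minimal sketch, with all type names invented:)

#include <iostream>

struct Robot  { void walk() { std::cout << "Robot walking\n"; } };
struct Person { void walk() { std::cout << "Person walking\n"; } };

// The "partial type" is implicit: T can be anything that has walk().
template <typename T>
void make_it_walk(T& t) {
    t.walk();   // checked at compile time, but only when the template is instantiated
}

int main() {
    Robot r;
    Person p;
    make_it_walk(r);
    make_it_walk(p);
    // make_it_walk(42);   // would fail to compile: int has no walk()
}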

Of course, the key issue here is good type inference.

We are programmers; our job is to find ways to let the machine
do as much of the work as possible.
If the machine can do it, the machine should do it.
TDD + dynamic typing shouldn't be the solution to partial types;
working on type inference techniques and automated (static) type checking is!

Of course, until we have that - and maybe we do (I still don't know all that much about OCaml) -
TDD + a dynamic language could be the best we have now!

But of course, decent advice that stresses the importance of testing will always be good! (And it actually does more than type checking.)

Anyway, I don't know how to continue any further in this debate,
but I just wanted to say how I think, so that maybe others
can highlight to me where I think wrong or where I've gone bogus.

Thanks for reading such a long post :)
And I hope to read long, meaningful posts from all!

Edited by systems May 12, 2003, 10:43:42 AM EDT
Edited by systems May 13, 2003, 10:27:56 PM EDT
Edited by systems May 14, 2003, 09:55:29 AM EDT
New A couple answers
The beef most dynamic language fans have with static typing is the loss in flexibility. It puts up all sorts of unnecessary walls and development often feels like an exercise in digging tunnels for no good reason.

These computer models of types just approximate the real world. The world isn't that well organized. You can take slices of things and they seem organized within the slice, but it's only one perspective.

A typical example:

Birds can fly.
A penguin is a bird.
A penguin can fly.

An error caused by an overly general assumption early on. (Not all birds can fly - but most do).

This kind of error occurs in program development all the time. You make a simplifying assumption early on. You proceed based on that assumption. Eventually, some new information comes along that doesn't fit your original assumption.

Question - when your assumption is proven to be false, what do you do?

In a strongly statically typed language, you have no choice; you must aggressively refactor your class hierarchy to account for these differences. Suddenly, instead of abstract class Bird with operation fly(), you have to create FlyingAnimal and add each of the types of birds that can fly individually. And you've lost the birdness of them. You have gained the ability to add bats, flying fish, and flying squirrels.

You can say that you could solve this with multiple inheritance and divide the types into protocols or interfaces. Then you might have something like:

Penguin : <Bird, FurBearing, Swimmer, LandDweller, Diurnal>
Canary : <Bird, Feathered, Flying, TreeDweller, Diurnal>
Bat : <Mammal, FurBearing, Flying, CaveDweller, Nocturnal>

But this fine level of factoring maybe isn't necessary for your application to work (maybe it's a zoo food distribution system).
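A compressed C++ sketch of the kind of refactoring being described (class names invented for the illustration): once Penguin arrives, fly() has to migrate out of Bird into a separate abstraction, and every flying type gets touched.

#include <iostream>

// First cut: "birds can fly" would have baked fly() into Bird itself.
// After the penguin shows up, flying gets split out:
struct Bird          { virtual ~Bird() {} };
struct FlyingAnimal  { virtual void fly() = 0; virtual ~FlyingAnimal() {} };

struct Canary  : Bird, FlyingAnimal { void fly() override { std::cout << "flap\n"; } };
struct Penguin : Bird               { /* no fly() to lie about */ };

void release(FlyingAnimal& a) { a.fly(); }   // only accepts things that really fly

int main() {
    Canary c;
    release(c);
    // Penguin p; release(p);   // now a compile error instead of a runtime surprise
}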

I'd like to refer you to Bart Kosko's "Fuzzy Thinking" as an interesting (and not too technical) read. Fuzzy people's idea of set membership isn't binary, it's a float - how strongly do you exhibit a certain membership? How Catholic are you? In my younger dating days I came to understand with girls that there's Catholic and Catholic. That's just life.

OK, so assuming we have a dynamically typed system and we make the same bad assumption. We don't have to refactor the entire type hierarchy to account for shifts in the way we look at things. We can implement the flying protocol on anything we like and simply remember to only hand flyers to things that expect things to fly.


A list in Python can hold a combination of arbitrary types
(I want to ask experienced programmers when that is useful)


Well, this is Smalltalk. But consider:

"a collection of stuff with nothing in common?"
stuff := OrderedCollection with: (Refrigerator new) with: (Sofa new) with: (Penguin new) with: (BowlingBall new) with: (Sandwich new).

truck := Truck withCapacity: 2000. "One ton truck"

weight := 0.
truck load: (stuff select: [:item |
    ((weight + (item weight)) < (truck capacity))
        ifTrue: [weight := weight + item weight. true]
        ifFalse: [false]]).

Apparently all that stuff has something in common after all.




"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New ICLRPD
Fuzzy people's idea of set membership isn't binary, it's a float - how strongly do you exhibit a certain membership? How Catholic are you? In my younger dating days I came to understand with girls that there's Catholic and Catholic. That's just life.
Or just:
In my younger dating days I came to understand with girls that there's Catholic and Catholic. That's just life.
===

Implicitly condoning stupidity since 2001.
New Do I C another one...?
Penguin : <Bird, FurBearing, Swimmer, LandDweller, Diurnal>
Penguins are "FurBearing"?!?

Only in Todd's World! :-)


   [link|mailto:MyUserId@MyISP.CountryCode|Christian R. Conrad]
(I live in Finland, and my e-mail in-box is at the Saunalahti company.)
Your lies are of Microsoftian Scale and boring to boot. Your 'depression' may be the closest you ever come to recognizing truth: you have no 'inferiority complex', you are inferior - and something inside you recognizes this. - [link|http://z.iwethey.org/forums/render/content/show?contentid=71575|Ashton Brown]
New Aren't they like seals?
They aren't feathered.



"Packed like lemmings into shiny metal boxes.
Contestants in a suicidal race."
    - Synchronicity II - The Police
New Yes they are.
There are several species of penguins that have feathered crests. Other than that it's just skin and blubber. Seals have hair.
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
New No - they're "almost, but not entirely, unlike" seals.
Toddsko:
They aren't feathered.
Who aren't - the seals? No, that's right.

Penguins, OTOH, being birds, frigging well ARE feathered. (And not just the crests, either; where the heck didya get *that* from, Scott?)

I mean, just look at the blurbs ON THIS SEARCH PAGE ITSELF: [link|http://www.google.com/search?q=penguins+feathers+fur|http://www.google.com/search?q=penguins+feathers+fur] !

Sheesh...


   [link|mailto:MyUserId@MyISP.CountryCode|Christian R. Conrad]
(I live in Finland, and my e-mail in-box is at the Saunalahti company.)
Your lies are of Microsoftian Scale and boring to boot. Your 'depression' may be the closest you ever come to recognizing truth: you have no 'inferiority complex', you are inferior - and something inside you recognizes this. - [link|http://z.iwethey.org/forums/render/content/show?contentid=71575|Ashton Brown]
New NFC.
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
New Does the phrase "Splitting Hairs" come to mind. :-)
New birds are feathered and hairy
anyone who has spent time plucking ducks and geese knows that
thanx,
bill
will work for cash and other incentives [link|http://home.tampabay.rr.com/boxley/resume/Resume.html|skill set]

questions, help? [link|mailto:pappas@catholic.org|email pappas at catholic.org]

Carpe Dieu
New It's all feathers.
The stuff underneath is a type of feather that only develops wisps, instead of wisps and barbed interlocking spines.
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
New Re: It's all feathers.
Could be Conrad is a feather expert. Anki says, "Chrissy, put on the Indian outfit!"

(PS: Aren't feathers just morphed scales anyway?)
-drl
New Re, "PS": Yeah, sure - so, whatchathink HAIRS are?!?
New Never really thought about it..
..but these guys have!

[link|http://www.cmnh.org/dinoarch/1998Jul/msg00323.html|http://www.cmnh.org/...Jul/msg00323.html]
-drl
New Sometimes there aren't right answers
Just trade-offs. And a lot of people with strong opinions. And a lot of people without real opinions, but who are scared that the answers might not be what they have learned.

This situation can last indefinitely, particularly since in different problem domains, different answers can be better.

In fact I don't expect debates about typing to go away in my lifetime...

Cheers,
Ben

PS It is always a good feeling to believe that you have had a positive role in someone else's learning curve. So if I helped you at some point, it was my pleasure. :-)
"good ideas and bad code build communities, the other three combinations do not"
- [link|http://archives.real-time.com/pipermail/cocoon-devel/2000-October/003023.html|Stefano Mazzocchi]
New Down with Determinants! :)
-drl
New Please indicate what you changed in an Edit. Thanks. :-)
     The awakening begins - (tuberculosis) - (140)
         Quotes from Uncle Bob - (admin) - (12)
             Maybe it's just me... - (Simon_Jester) - (5)
                 Static languages make the code brittle ... - (bluke)
                 History revisionism - beware !!! (IMHO) - (dmarker) - (3)
                     Re: History revisionism - beware !!! (IMHO) - (JimWeirich) - (2)
                         Another issue was the potential popularity of a lang - (dmarker) - (1)
                             Re: Another issue was the potential popularity of a lang - (JimWeirich)
             Gee...I thought it was a friendly discussion... - (jb4) - (4)
                 Re: Gee...I thought it was a friendly discussion... - (JimWeirich) - (3)
                     Manifest typing....a la Fortran. - (Simon_Jester)
                     Thanks, Jim. Nicely put. -NT - (jb4) - (1)
                         Re: Ditto - Thanks, Jim. -NT - (dmarker)
             Next experiment: try it without OO -NT - (tablizer)
         Java going in the other direction - (bluke) - (109)
             Re: Java going in the other direction - (JimWeirich) - (34)
                 Smalltalk also - (bluke)
                 Speaking of autoboxing - (ChrisR) - (32)
                     gasp -NT - (deSitter) - (2)
                         This is what happens when the foundation sucks - (bluke) - (1)
                             Oh My! - (deSitter)
                     According to Joshua Bloch it hasn't been decided yet - (bluke) - (28)
                         This is just stupid - (tuberculosis) - (27)
                             I think you missed the point - (JimWeirich) - (5)
                                 OK, maybe so - (tuberculosis) - (4)
                                     Re: OK, maybe so - (JimWeirich) - (3)
                                         Well in this case - (tuberculosis) - (2)
                                             Perhaps ... but ... - (JimWeirich) - (1)
                                                 My point was - (tuberculosis)
                             Not J-heads. - (admin) - (1)
                                 Smalltalk as usual is consistent - (bluke)
                             Set Theory - (deSitter) - (18)
                                 Re: Set Theory - (admin) - (12)
                                     Here we go - (deSitter) - (11)
                                         Re: Here we go - (admin) - (10)
                                             Amazing - (deSitter) - (9)
                                                 Re: Amazing - (admin) - (6)
                                                     Re: Amazing - (deSitter) - (5)
                                                         Wow. My first exposure to APL - (Arkadiy) - (1)
                                                             Same as in Objective C -NT - (admin)
                                                         Re: Amazing - (JimWeirich) - (2)
                                                             Heh. - (tseliot) - (1)
                                                                 ROFL -NT - (deSitter)
                                                 No - (Arkadiy)
                                                 Hey Ross, it's only a model. - (mmoffitt)
                                 Hey, watch this! - (drewk)
                                 Unlike DrooK, I'll bite: Ever heard of SQL, ya nitwit?!? -NT - (CRConrad) - (3)
                                     See comment above, applies here as well - (deSitter) - (2)
                                         Better stop talking to yourself then. - (admin)
                                         Your problem is the same you had a year (or was it two?) ago - (CRConrad)
             I remeber Pascal in the very same way - (jb4) - (72)
                 Just had this conversation - (tseliot) - (45)
                     Freep said the same thing - (tuberculosis) - (43)
                         Still waiting for ... - (jb4) - (42)
                             Depends on constraints - (tuberculosis) - (41)
                                 Platforms: - (jb4) - (40)
                                     Don't even get me started - (tuberculosis) - (30)
                                         I'll get you started, alright! - (jb4) - (29)
                                             No I'm not - (tuberculosis) - (28)
                                                 The problem is, you're trying to treat a bool as a number - (jb4) - (25)
                                                     No, I'm trying to branch on a condition - (tuberculosis) - (24)
                                                         21st Century Schitzoid Man - (jb4) - (23)
                                                             You are fighting the language - (tuberculosis) - (22)
                                                                 Tell you what... - (jb4)
                                                                 Can I put my oar in? - (static) - (20)
                                                                     Yeah sure - (tuberculosis) - (19)
                                                                         Such flowerly language toward such a misguided conclusion - (jb4) - (18)
                                                                             Yeah right - (tuberculosis) - (17)
                                                                                 (++true == false) - (ChrisR) - (1)
                                                                                     Just add a little gasoline, and stir!_____;-) - (jb4)
                                                                                 OK, Now I see wht your problem is - (jb4) - (5)
                                                                                     I thought you were going to give up on this - (tuberculosis) - (4)
                                                                                         How sensible is this?!? - (jb4) - (3)
                                                                                             Not convinced - (tuberculosis) - (2)
                                                                                                 Nor am I - (jb4) - (1)
                                                                                                     You guys should be using Modula-2. :-P (new thread) - (Another Scott)
                                                                                 And an answer to your question. - (jb4) - (8)
                                                                                     Wrong answer - (tuberculosis) - (7)
                                                                                         Wrong answer back - (jb4)
                                                                                         They've turned it into Pascal - (deSitter) - (5)
                                                                                             Circular definition. - (CRConrad) - (4)
                                                                                                 Re: Circular definition. - (deSitter) - (3)
                                                                                                     Self-contradiction, and logically inconsistent definition. - (CRConrad) - (2)
                                                                                                         Can someone start a new thread please? -NT - (drewk) - (1)
                                                                                                             What for, aren't the long ones the best? -NT - (CRConrad)
                                                 Comments on supposed idiocy - (JimWeirich) - (1)
                                                     Re: Comments on supposed idiocy - (tuberculosis)
                                     Don't even get me started - (tuberculosis)
                                     You didn't mention types of programs -NT - (tuberculosis) - (7)
                                         Sorry, thot I was clear earlier... - (jb4) - (6)
                                             Still doesn't tell me enough - (tuberculosis) - (5)
                                                 Re: Still doesn't tell me enough - (jb4) - (4)
                                                     The VM's are all written in very portable C - (tuberculosis) - (3)
                                                         Re: The VM's are all written in very portable C - (deSitter) - (2)
                                                             Funny you should mention it - (tuberculosis) - (1)
                                                                 Re: Funny you should mention it - (deSitter)
                     Minor modification - (jb4)
                 Just because *you* don't see it... - (pwhysall)
                 In fact.. - (deSitter) - (15)
                     Heh... - (jb4) - (14)
                         Re: Heh... - (deSitter) - (13)
                             BS - (admin) - (3)
                                 BS - (deSitter) - (2)
                                     When I see you spouting it, I'm going to call you on it. - (admin) - (1)
                                         Fair enough! -NT - (deSitter)
                             Do you have a clue why Linux is easily ported? - (ben_tilly) - (8)
                                 Re: Do you have a clue why Linux is easily ported? - (deSitter) - (7)
                                     No, that is not quite what you claimed - (ben_tilly) - (6)
                                         Well, to me -NT - (deSitter)
                                         Well, to me "moot" means.. - (deSitter) - (4)
                                             Why does your position appear to be shifting? - (ben_tilly) - (3)
                                                 Modus operandi - (admin) - (1)
                                                     Re: Modus operandi - (deSitter)
                                                 Re: Why does your position appear to be shifting? - (deSitter)
                 Wasn't Pascal written as a teaching tool? - (drewk) - (8)
                     Yes - (bluke)
                     Re: Wasn't Pascal written as a teaching tool? - (JimWeirich) - (6)
                         Re: Wasn't Pascal written as a teaching tool? - (Yendor) - (4)
                             Forward Declarations - (JimWeirich) - (3)
                                 Hmm, was Turbo Pascal different about that? -NT - (drewk)
                                 Been too long - (Yendor)
                                 Nope, you're right. - (jb4)
                         Not when I learned it - (drewk)
             Same bandaid as C++ templates - (tuberculosis)
         Re: The awakening begins - (systems) - (16)
             A couple answers - (tuberculosis) - (12)
                 ICLRPD - (drewk)
                 Do I C another one...? - (CRConrad) - (10)
                     Aren't they like seals? - (tuberculosis) - (9)
                         Yes they are. - (admin)
                         No - they're "almost, but not entirely, unlike" seals. - (CRConrad) - (7)
                             NFC. -NT - (admin) - (1)
                                 Does the phrase "Splitting Hairs" come to mind. :-) -NT - (ChrisR)
                             birds are feathered and hairy - (boxley) - (4)
                                 It's all feathers. - (admin) - (3)
                                     Re: It's all feathers. - (deSitter) - (2)
                                         Re, "PS": Yeah, sure - so, whatchathink HAIRS are?!? -NT - (CRConrad) - (1)
                                             Never really thought about it.. - (deSitter)
             Sometimes there aren't right answers - (ben_tilly) - (1)
                 Down with Determinants! :) -NT - (deSitter)
             Please indicate what you changed in an Edit. Thanks. :-) -NT - (Another Scott)
