IWETHEY v. 0.3.0

Huh?
C++ is not the only language with guaranteed destructor timing. For instance Perl has guaranteed destructor timing. Which [link|http://search.cpan.org/author/TILLY/ReleaseAction-0.03/ReleaseAction.pm|I have been known to use]. Of course guaranteed destruction timing requires reference counting, which is incompatible with true garbage collection, so many other OO languages won't ever have it.

But destruction mechanics are not what was described. What is described is the ability to do something automatically at scope exit. Which you can do with reliable destructors and a variable whose scope ends at scope exit. But you can do that in other ways as well. For instance in Ruby an ensure block will automatically run at scope exit - even if exiting due to an error. Many other OO languages have constructs to achieve the same effect.
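A small Python sketch of the same idea (names hypothetical, not from the thread; Python's try/finally plays the role of Ruby's ensure block here):

```python
def use_resource(events, fail):
    """Do some work, guaranteeing cleanup at scope exit."""
    events.append("acquire")
    try:
        if fail:
            raise RuntimeError("boom")
        events.append("work")
    finally:
        # Runs on every exit from the block -- normal return or
        # exception -- the same effect as Ruby's ensure clause.
        events.append("release")

log = []
use_resource(log, fail=False)
try:
    use_resource(log, fail=True)
except RuntimeError:
    pass
# log is now ["acquire", "work", "release", "acquire", "release"]
```

Note that no destructor is involved at all; the cleanup is tied to the block, not to any object's lifetime.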

Cheers,
Ben
"good ideas and bad code build communities, the other three combinations do not"
- [link|http://archives.real-time.com/pipermail/cocoon-devel/2000-October/003023.html|Stefano Mazzocchi]
Re: Huh?
Ben: [...] For instance Perl has guaranteed destructor timing.

Thanks. I stand corrected. I'm not sure I even realized that Perl had destructors. (I shouldn't have been surprised, however.)

You could argue that Python also guarantees this, but then you would have to take the position that Jython is a different language (rather than an alternative implementation).

Ok, the list of languages with guaranteed destructor/finalizer timing stands at C++, Perl, and C-Python. Any others?
--
-- Jim Weirich jweirich@one.net [link|http://onestepback.org|http://onestepback.org]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
Dunno, but I have a guess...
In order to be able to say, though, I'd have to know: WTF *is* "guaranteed destructor/finalizer timing"???

(I'd have thought that in any sensible language, stuff is "guaranteed" to execute when you bloody well *call* it; neither sooner, nor later...)


   [link|mailto:MyUserId@MyISP.CountryCode|Christian R. Conrad]
(I live in Finland, and my e-mail in-box is at the Saunalahti company.)
Your lies are of Microsoftian Scale and boring to boot. Your 'depression' may be the closest you ever come to recognizing truth: you have no 'inferiority complex', you are inferior - and something inside you recognizes this. - [link|http://z.iwethey.org/forums/render/content/show?contentid=71575|Ashton Brown]
Guaranteed destructor timing explained
Often when an object is finished being used, resources used need to be cleaned up. For instance a network connection should be closed down gracefully, memory should be freed, file locks should be released or temporary files destroyed. For this reason many (most?) OO languages offer the ability to define actions that will be called just prior to object destruction.

But now we run into a question. When will the system realize that objects are done and can be destroyed? In systems which use reference counting, the answer is generally that as soon as you are finished with it, the reference count hits 0 and it is destroyed. In systems which use garbage collection, the answer is that you only finalize things when garbage collection runs. Therefore if you use reference counting it is possible to identify which point in code (generally upon assigning something else to a variable, or exiting a variable's scope) destruction will happen, and rely on that precise behaviour. But with true garbage collection the answer is whenever the system decides that it is time to do housekeeping.
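Under CPython (which reference-counts, as described above) that precise timing is observable; a small sketch, not from the original posts:

```python
class Connection:
    """Toy resource whose finalizer records when it runs."""
    def __init__(self, trace):
        self.trace = trace
        self.trace.append("opened")

    def __del__(self):
        self.trace.append("closed")

trace = []
conn = Connection(trace)
del conn  # last reference gone: under CPython's reference counting,
          # __del__ runs right here, not at some later housekeeping pass
# trace == ["opened", "closed"]
```

Under a tracing collector (Jython, say), "closed" might not appear until whenever the garbage collector happens to run.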

Cheers,
Ben
OK, Jim, add one more: Delphi. (And rename the concept!)
Ben Tilly writes:
Often when an object is finished being used, resources used need to be cleaned up. For instance a network connection should be closed down gracefully, memory should be freed, file locks should be released or temporary files destroyed. For this reason many (most?) OO languages offer the ability to define actions that will be called just prior to object destruction.
Listen, Perl-boy, has anyone ever told you that your condescending way can be mighty irritating at times...? If not, consider it done herewith.

I am a frigging OO programmer, you know; I bloody well know what a destructor is! My question was only and specifically about Jim's expression "guaranteed destructor timing" -- which was used, both in his original post and several subsequent ones (including yours that I'm replying to now) in a confusing and self-contradictory way that showed that it isn't really destructors you're talking about. Thankfully, Scott began to clear this up a little... And I'll spell it out all the way for you, so you won't have to embarrass yourself like this again.


But now we run into a question. When will the system realize that objects are done and can be destroyed?
Ah, but the problem is not only about objects -- and therefore, not only about destructors -- but, as you say above (and judging from my memory of the whole thread; Jim already wrote an explanatory follow-up to someone else on this, IIRC), about making sure that all allocated "RESOURCES" are released. Since many "resources" you use even in OO code (except probably in weirdo environments like Smalltalk) are NOT objects -- OS stuff like memory blocks, file handles and locks, temp files, and network connections -- the problem scope is wider than just "guaranteed destructor timing" (and IIRC, that's what Jim explained/admitted in his second post).

The problem, thus, is really about being able to "guarantee" that some "resource"-releasing piece of code executes in a timely fashion (or at all, for that matter). And that doesn't necessarily have anything to do with destructors, per se; that's what try...finally blocks are for: In a finally block, you can make sure to release "resources", including but not limited to destruction of objects created in the try block; you can also (or, ONLY) release OTHER (i.e, non-object) "resources" allocated in the try block -- and, since the whole try...finally block may well be in a method other than the destructor of the calling object, it doesn't, Q.E.D, necessarily have to do with any destructor at all. So, Jim, rename your concept!


In systems which use reference counting, the answer is generally that as soon as you are finished with it, the reference count hits 0 and it is destroyed. In systems which use garbage collection, the answer is that you only finalize things when garbage collection runs. Therefore if you use reference counting it is possible to identify which point in code (generally upon assigning something else to a variable, or exiting a variable's scope) destruction will happen, and rely on that precise behaviour.
Funny -- how the fuck would C++ get on the list, if these were the only two ways to get rid of objects?!? Standard C++ doesn't, AFAICT, use either reference-count-based or "true" (So, is reference counting the same as lying, or what???) garbage collection; AFAIK, you bloody well have to call "destroy" or "release", or whateverthefuck it's called in C++, to kill any object you created. (Unless it happens to live on the stack, in which case it's "guaranteed" to get killed whenever the stack unwinds past it, like any normal variable in any non-OO language; this unnecessary and confusing dualism is one of the major reasons C++ sucks, BTW.) The fact that Jim did put C++ on "the list" was one of the main reasons I asked WTF he meant; clearly, it can't have been as simple as you are making it out to be, since in that case C++ wouldn't have got onto "the list".

Or, waitaminnit... Perhaps he actually meant that he put it onto "the list" precisely because in standard C++, destructors are "guaranteed" to get called when you call them, only he wasn't for some reason able to bring himself to say so in so many words? That's possible, I suppose... Actually, come to think of it, that's the only logically possible explanation. So, Jim, since Delphi works exactly like (my recollection of) C++ in this respect (and because it has try-except-finally blocks), put Delphi on the list, too. (And probably Ada, Modula-2/3/whatever, Oberon, and all other Pascal-derived OO languages, too.)


But with true garbage collection the answer is whenever the system decides that it is time to do housekeeping.
(Which is one of the major reasons Java sucks, BTW.)


   [link|mailto:MyUserId@MyISP.CountryCode|Christian R. Conrad]
Edited by CRConrad July 17, 2003, 07:54:15 AM EDT
Introducing The Magic Of HTML
Oi. Finn-boi. Listen up.

Your screeds would be markedly easier to read if you did <i>this</i> instead of *this*.



Peter
[link|http://www.debian.org|Shill For Hire]
[link|http://www.kuro5hin.org|There is no K5 Cabal]
[link|http://guildenstern.dyndns.org|Blog]
Yeah, yeah, but not when I'm in a hurry.
Which I tend to become when I'm pissed off.

[Edit - add:] There, Whinge-Boi; happy, now?


   [link|mailto:MyUserId@MyISP.CountryCode|Christian R. Conrad]
Edited by CRConrad July 17, 2003, 07:55:58 AM EDT
Well, slow down, then.
Rush-boi.


Peter
Asterisks don't bother me
________________
oop.ismad.com
It
-drl
It
-drl
STOP SAYING THE WORD!!
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."
Suffice to say...
Suffice to say...<mumble> is one of the words the knights of Ni cannot hear!

Emphasis on the mumble.


I'm gonna go build my own theme park! With Blackjack! And hookers! In fact, forget the park!
was it Nye or Ni?
I never got it quite right.

(Anyway...serves you right for saying "Ni!" to a poor defenseless woman.....)
Ni!
Infidel.


Peter
Sh!


WTF?
-drl
Re: OK, Jim, add one more: Delphi. (And rename the concept!)
CRC: [...]My question was only and specifically about Jim's expression "guaranteed destructor timing"

C++ guarantees that a destructor is run when a local object goes out of scope. Most other OO languages do not offer such guarantees. Ben pointed out that Perl does. I pointed out that C-based Python does, but the Java-based Python does not.

In C++, it is a common idiom to use the guaranteed destructor timing to implement resource management (this is called the Resource Acquisition is Initialization, or RAII idiom). The trick is to allocate a resource in a constructor and release the resource in the destructor or finalizer. Essentially you turn a memory management technique into a general purpose resource management tool.

The problem is that RAII only works when you can guarantee the destructor or finalizer is deterministically called at the end of a scope.
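A sketch of the RAII pattern Jim describes, transliterated into Python (ScopedLock is a hypothetical name; the idiom is deterministic only under a reference-counting implementation such as CPython):

```python
class ScopedLock:
    """RAII-style guard: acquire in the constructor, release in the
    destructor."""
    def __init__(self, events):
        self.events = events
        self.events.append("lock")

    def __del__(self):
        self.events.append("unlock")

def critical_section(events):
    guard = ScopedLock(events)
    events.append("work")
    # guard goes out of scope here; under CPython's reference counting
    # its count hits zero and __del__ releases the lock immediately.
    # Under true GC (e.g. Jython) the release would be delayed -- which
    # is exactly why RAII needs guaranteed destructor timing.

events = []
critical_section(events)
# events == ["lock", "work", "unlock"] under CPython
```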

The original poster was claiming that RAII was a killer argument for OO. I disagreed because: (1) Many[1] OO languages don't support it, and (2) there are usually better ways of handling the problem.

[1] I've changed my language from "most" to "many" because there seems to be a number of languages that could support it. But my point remains that there are many strongly OO languages that don't support RAII, e.g. Smalltalk, Ruby, Jython, Eiffel, CLOS.

P.S. I've included Eiffel in the non-RAII list, but now that I think on it, Eiffel supports stack-based objects ("expanded" objects in Eiffel lingo), so I suppose that finalizers on expanded objects would have guaranteed timing. But I've never seen an Eiffel program that actually takes advantage of it.
--
-- Jim Weirich jweirich@one.net [link|http://onestepback.org|http://onestepback.org]
Ah; the problem seems to be I didn't take into account...
...your apparently myopically limited horizon. :-(

Jim W. writes:
C++ guarantees that a destructor is run when a local object goes out of scope.
What's with this weird concentration on "local" objects? BTW, am I correct in surmising that this means stack-based objects? If so: Don't heap-based ones count, or what?

Instance the first of intellectual myopia -- seeing only "local" objects.


Most other OO languages do not offer such guarantees.
Really? Well, OK, if you say so... I must be missing something, though, so could you please just answer this simple question (that I thought I'd asked already, but perhaps I just implied it, too subtly): What the fuheck is wrong with the destructor of an object just simply being called when you actually CALL it?!? Isn't that "guarantee" enough???


In C++, it is a common idiom to use the guaranteed destructor timing to implement resource management (this is called the Resource Acquisition is Initialization, or RAII idiom). The trick is to allocate a resource in a constructor and release the resource in the destructor or finalizer. Essentially you turn a memory management technique into a general purpose resource management tool.
Ah, there's the rub: Once you C++ people start calling something "an idiom" and invent an acronym for it, you automatically assume that raises the concept to the elevated status of "Only in C++ can you be THIS advanced!"... Or, in other words: Yeah, so what else is new?!? That's so frigging obvious, people do it all the time in Delphi -- only, they didn't go to the trouble of inventing a Scientific-Sounding Acronym (SSA) for it, because it's an OOWODS -- an Obviously Obvious Way Of Doing Stuff.

Instance the second of intellectual myopia -- assuming that obvious concepts "only" exist as such in C++.


The problem is that RAII only works when you can guarantee the destructor or finalizer is deterministically called at the end of a scope.
If "the end of a scope" is all you're interested in, yes... But is it? Why? Should it be -- and again, why? Doesn't your fixation on "end of scope" limit the usefulness of the concept to only "local" objects -- what about "non-local" (i.e, heap-based?) ones?


The original poster was claiming that RAII was a killer argument for OO. I disagreed because: (1) Many[1] OO languages don't support it, and (2) there are usually better ways of handling the problem.
Yeah, sure, it's only one obvious way, among many, to do stuff. But still, iff'n ya gots it, it's nice, so... AFAICS, it's still a somewhat persuasive argument for those OO languages that do have it; and therefore, for at least a significant subset of OOP. (And perhaps one against those that don't?)


[1] I've changed my language from "most" to "many" because there seems to be a number of languages that could support it. But my point remains that there are many strongly OO languages that don't support RAII, e.g. Smalltalk, Ruby, Jython, Eiffel, CLOS.
Do up your fly, Jim, your bias is showing: If they don't support this trivial OO concept, maybe they aren't so "strongly" OO after all, eh? Who said they were? Based on which criteria -- and who got to choose those criteria?


   [link|mailto:MyUserId@MyISP.CountryCode|Christian R. Conrad]
Wow, are we in the same universe?
Its obvious that we are not communicating for some reason, but if you want to drop the vitrol and one-upmanship and continue in a more professional manner, I'm willing to give it a try.

Perhaps some of our confusion lies in the languages we are familier with. I confess that I don't know Delphi, but perhaps you could educate me in the following areas:

  1. Can you allocate objects that are stored on the stack (as opposed to an object that is allocated on the heap, but has a reference on the stack)?

  2. If (1) is YES, are destructors run on stack based objects when the stack frame is deallocated?

  3. Does Delphi provide any automated heap management (either GC or ref-counted), or does it rely on manual reclaimation of heap objects.

And if the above questions don't make sense in the context of Delphi, I would be interested in hearing why. Thanks.

In addition to Delphi, what other OO languages are you familier with? I ask in order to find some common ground for better communication.

To answer some of your questions.

CRC: What's with this weird concentration on "local" objects? BTW, am I correct in surmising that this means stack-based objects?

Yes, a local object in C++ is stack allocated (unless the static keyword is used ... let's not go there).

CRC: If so: Don't heap-based ones count, or what?

From the original Poster: [...] a function that is automatically (or automagically) called at scope exit. Since the lifetime of a heap variable is independent of scope, then they aren't pertinent to the question.

[ oops ... I hit save before I finished this ... I'll continue in separate posting]
--
-- Jim Weirich jweirich@one.net [link|http://onestepback.org|http://onestepback.org]
Edited by JimWeirich July 17, 2003, 02:32:44 PM EDT
(... continuing from the previous message)
Sorry, the mouse slipped before I finished the previous posting. I was trying to address some of your questions ... I'll finish here...

CRC: What the [bleep] is wrong with the destructor of an object just simply being called when you actually CALL it?!?

Depending on the context, explicitly calling a destructor could be a good or bad thing. If the language automatically called destructors, then manually calling it might not be a great idea. But refering to the context of the original poster: automatically called at scope exit, I am assuming the destructor is called automatically. Manually calling the destructor is uninteresting in this context.

CRC: [...] you C++ people [...think...] "Only in C++ can you be THIS advanced!"

I'm not sure where you got the idea that I thought RAII was an advanced concept. It is using C++'s rather primitive memory management facilities for uses beyond memory management. This seems to me to be a bad idea, forgivable in C++ only because C++ has little else to offer in its place. Those that used the RAII idiom in Python will have problems running their code in Jython because the memory management is different. As Ben pointed out, assumptions about the behavior of memory management is causing great discussions about the move in Perl 6 to a true GC approach.
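The failure mode Jim mentions can be demonstrated even inside CPython by building a reference cycle, which defeats reference counting and leaves the finalizer to the cyclic garbage collector (a sketch under the assumption of Python >= 3.4, where cycles with __del__ are collectable):

```python
import gc

finalizer_log = []

class Node:
    def __del__(self):
        finalizer_log.append("finalized")

n = Node()
n.me = n      # reference cycle: the refcount never falls to zero by itself
del n         # RAII-style cleanup silently does NOT happen here...
at_del = list(finalizer_log)
gc.collect()  # ...but only when the collector gets around to the cycle
after_gc = list(finalizer_log)
# at_del == [], after_gc == ["finalized"]
```

Code that banked on destruction at the `del` would already be broken here, without ever leaving CPython.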

CRC: Doesn't your fixation on "end of scope" [...]

The reason for the "end of scope" fixation is merely because that is the context of the original posters statement.

CRC: If they [Smalltalk, Eiffel, Ruby, Jython] don't support this trivial OO concept, maybe they aren't so "strongly" OO after all, eh? Who said they were? Based on which criteria -- and who got to choose those criteria?

I've heard people dismiss C++ as non-OO, but this is the first time I've ever heard anyone claim that Smalltalk was non-OO. My only claim was that these languages are generally accepted as being OO languages, yet don't support a RAII style idiom, then how critical can the RAII idiom be to the OO concept. If you wish to argue against these languages being OO, then I for one would like to see that (but probably in a different thread).

--
-- Jim Weirich jweirich@one.net [link|http://onestepback.org|http://onestepback.org]
Combining replies into a single (bigger) slice of humble pie
Jim W. writes:
It[']s obvious that we are not communicating for some reason, but if you want to drop the vitr[i]ol and one-upmanship [...]
Ah, uhm... I'm sorry. I was cranky, misread you and misremembered the thread, and went out of line in my reply; please forgive me.


[...] and continue in a more professional manner, I'm willing to give it a try.
And I'll be delighted to try and behave in a more civilised manner, if that's good enough for you.[1]


Perhaps some of our confusion lies in the languages we are famili[a]r with. I confess that I don't know Delphi, but perhaps you could educate me in the following areas:
Sure.

Can you allocate objects that are stored on the stack (as opposed to an object that is allocated on the heap, but has a reference on the stack)?
Nope.

If (1) is YES, are destructors run on stack based objects when the stack frame is deallocated?
As Rick M. would say, "Mu".

(Well, come to think of it, there may actually have been some deep-OO hacks in this direction, but I can't remember for sure if that's really what my vague recollection of something I read on the Web years ago was about... If you really, really want to know, google for the phrase "Halvard Vassbotn" and some suitable keywords.)

Does Delphi provide any automated heap management (either GC or ref-counted), or does it rely on manual recl[am]ation of heap objects[?]
The answer to this is slightly messy, but with a strict literal interpretation of the question ("...any..."), the short-version reply boils down to: Yes.

The reason why the longer answer is a bit messy is because Delphi has two ways to refer to objects; i.e, two different kinds of types to declare object variables as: Object classes, or interfaces. (Rather like Java, no?) Straight class-type references have no GC whatsoever; interface references are managed by ref-counted garbage collection. Here's an example from the Delphi 7 Help files:
The Delphi compiler provides most of the IInterface memory management for you by its implementation of interface querying and reference counting. Therefore, if you have an object that lives and dies by its interfaces, you can easily use reference counting by deriving from TInterfacedObject. If you decide to use reference counting, then you must be careful to only hold the object as an interface reference, and to be consistent in your reference counting. For example:

procedure beep(x: ITest);
procedure test_func;
var
  y: ITest;
begin
  y := TTest.Create; // because y is of type ITest, the reference count is one
  beep(y);           // the act of calling the beep function increments the reference count
                     // and then decrements it when it returns
  y.something;       // object is still here with a reference count of one
                     // (Though presumably it'll be gone after the "end;". -- CRC)
end;


This is the cleanest and safest approach to memory management; and if you use TInterfacedObject it is handled automatically.
This, together with the fact that you can refer to the same object either as a class-type or an interface reference, has interesting consequences (on which more below).


And if the above questions don't make sense in the context of Delphi, I would be interested in hearing why. Thanks.
Of course they did, and you're welcome. (How could they not make sense?)


In addition to Delphi, what other OO languages are you famili[a]r with? I ask in order to find some common ground for better communication.
The usual: C++, Java; Visual Basic and JavaScript, to the extent you want to count those as OO. Smatterings of others; mostly just what I've picked up from books, or, more recently, code samples on the 'Net: Ada, Modula-2/3, Eiffel, C#, maybe some others I forget... Oh yeah: Rudimentary Smalltalk and LISP, though only to the extent that I've been able to conclude that they're syntactically, uh... weird. (There, has that at least somewhat allayed your fears that I'm so totally one-track I know nothing but Delphi? :-)


[Jim's answers to some of my earlier questions:]
Yes, a local object in C++ is stack allocated (unless the static keyword is used ... let's not go there).
Why not -- ya think it would just be another occasion for me to say, "That's one of the reasons C++ sucks"? :-)

CRC: If so: Don't heap-based ones count, or what?
From the original Poster: [...] a function that is automatically (or automagically) called at scope exit. Since the lifetime of a heap variable is independent of scope, then they aren't pertinent to the question.
Ah, yes, one of the things I forgot; why I should have re-read the thread... :-(

OTOH, "a function that is automatically (or automagically) called at scope exit" doesn't necessarily have to mean object destruction, does it? (cf Scott's post, and some of my earlier comments, about "finally".) Anyway, from what I understand of Delphi's mixed class / interface model, a scope exit can actually kill a heap-allocated object. Delphi's Help continues:
If you do not follow this rule, your object can unexpectedly disappear, as demonstrated in the following code:

procedure test_func;
var
  x: TTest;
begin
  x := TTest.Create; // no count on the object yet
  beep(x as ITest);  // count is incremented by the act of calling beep
                     // and decremented when it returns
  x.something;       // surprise, the object is gone
end;
And, since all Delphi objects, "local" or not, are allocated on the heap, TInterfacedObject or some other Interface implementation ought to be able to kill a non-"local" object, too. That is, if the "x := TTest.Create;" bit had been outside this function (but executed some time before it was called, preferably... :-) , then after this function returned, "x" would be gone. So, no, apparently: Heap variables "aren't pertinent to the question" only if one insists on limiting oneself to C++ (and similar languages).


[ oops ... I hit save before I finished this ... I'll continue in separate posting]

Sorry, the mouse slipped before I finished the previous posting. I was trying to address some of your questions ... I'll finish here...
Uh, why did you do it that way? I mean, you'd obviously discovered the Edit function here, so why didn't you just use that -- not in order to make me have to eat two separate slices of humble pie, I hope? :-)

CRC: What the [bleep][1] is wrong with the destructor of an object just simply being called when you actually CALL it?!?
Depending on the context, explicitly calling a destructor could be a good or bad thing. If the language automatically called destructors, then manually calling it might not be a great idea. But refer[r]ing to the context of the original poster: automatically called at scope exit, I am assuming the destructor is called automatically. Manually calling the destructor is uninteresting in this context.
Hmm... Yeah, probably.

(OTOH, Delphi's class libraries (the VCL and CLX frameworks) are to quite a considerable extent "semi-automatic" in this respect: The network of "Owner" relationships in, for example, a TForm descendant, make sure that when the top-level object is destroyed (possibly by some Interface-brokered trick as per above) all its "Owned" components are destroyed (in order, and each executing its destructor), too. Quite "manual", in the sense that the calls are explicitly there in the framework source-code -- as opposed to, say, C++ with its spontaneous compiler-black-magic auto-generated creation and destruction of objects -- but "automatic" in the sense that to the user (not end-user; framework user, application programmer), when he's done with something and gets rid of it, all its dependent objects go away too, almost "as if by magic".)

CRC: [...] you C++ people [...think...] "Only in C++ can you be THIS advanced!"
I'm not sure where you got the idea that I thought RAII was an advanced concept.
Well, if everybody thought it was perfectly obvious and trivial, they wouldn't have gone to the trouble of inventing an acronym for it, would they? But, of course I shouldn't have blamed you, personally, for this, and I'm sorry it came out that way.


It is using C++'s rather primitive memory management facilities for uses beyond memory management. This seems to me to be a bad idea, forgivable in C++ only because C++ has little else to offer in its place. Those that used the RAII idiom in Python will have problems running their code in Jython because the memory management is different. As Ben pointed out, assumptions about the behavior of memory management is causing great discussions about the move in Perl 6 to a true GC approach.
Allow me to take this opportunity to say, "That just goes to show the whole idea of garbage collection sucks." Thank you.


CRC: Doesn't your fixation on "end of scope" [...]
The reason for the "end of scope" fixation is merely because that is the context of the original poster[']s statement.
Ah, yes; forgot that; sorry.

(Then again, ever heard of how discussions tend to wander a bit from where they started? Ever thought of letting them do so, and just go with the flow...?)

CRC: If they [Smalltalk, Eiffel, Ruby, Jython] don't support this trivial OO concept, maybe they aren't so "strongly" OO after all, eh? Who said they were? Based on which criteria -- and who got to choose those criteria?
I've heard people dismiss C++ as non-OO, but this is the first time I've ever heard anyone claim that Smalltalk was non-OO. My only claim was that these languages are generally accepted as being OO languages, yet don't support a RAII style idiom, then how critical can the RAII idiom be to the OO concept. If you wish to argue against these languages being OO, then I for one would like to see that (but probably in a different thread).
A: No, that wasn't your "only" claim -- you used the expression "strongly OO", and whether that terminology was your own invention or not, the mere act of using it does imply something more: That you agree that these languages are somehow "more strongly" OO than others. My questions about where this comes from remain unanswered.

B: Who said the RAII idiom was supposed to be "critical to" the OO concept? Judging from Greg's original post, he saw it as a major benefit of using (at least some) OO languages, which AFAICS is another thing altogether.

(B 2: Still and again, if Smalltalk, Eiffel, Ruby, and Jython can't provide even this apparently so obvious and trivial and "primitive" benefit of OO, doesn't that speak at least against them being more "strongly OO" than others, that do provide this benefit?)

C: I didn't say Smalltalk, Eiffel, Ruby, and Jython are non-OO; I only objected to your painting them as somehow "more OO", or "better OO", than, for instance, C++ or Delphi. (And please don't try to hide behind someone else, who[ever] may have coined your "strongly OO" expression -- I've let that slide on the "RAII idiom" thing above, and that should be enough -- your using these expressions is enough of an endorsement of their connotations that you are intellectually obliged to stand up for them.) That's not quite the same thing.

C 2: This constantly pisses me off from the proponents of... Uh, how do I put this in a single phrase... The proponents of what I will have to call, for lack of a better catch-all, "Weird-OO" -- advocates of Smalltalk, Java, Lisp... (and perhaps Eiffel, Ruby, and Jython too) -- languages with weak typing, garbage collection, bass-ackwards syntax (and just generally "weirdo" ["weirdoo"?] concepts like "closures" and whatnot), etc, etc... Whenever one tries to discuss the merits of various OO languages with these guys, the discussion sets out from their pre-conceived reality, where "MyWeirdOOLanguage is more-and-better-OO than YourNonWeirdOne", if it isn't an axiom, at least follows immediately from the axioms "The less typing and structure a language has, the more-and-better-OO it is" and "The more weird and incomprehensible features, and the less natural-language-like syntax, a language has, the more-and-better-OO it is"... One has to fight against these preconceptions for ages, before one gets to non-slanted ground where a discussion could even start on something approaching an even footing... It's a bit like talking to Bryce, only in reverse.

(C 2 b: Yes, you've probably noticed a strong overlap between the set of languages in my preceding paragraph and those you mentioned yourself, above; and therefore might be tempted to suggest that there is already a catch-all phrase for them: "Strong OO". But, that's precisely my point: Why should the discussion be conducted in your [collective or personal "you"; take your pick] prejudicial terms? That's pretty much declaring them "more" or "better" OO; a tacit -- but rather obvious -- acceptance-as-axiom of what we set out to prove or disprove.)

C 3: So, if we ever do get around to starting that other thread (there is a check-box and a combo-box for this very purpose just below the reply-text edit box; you're perfectly free to start it from this) on whether Smalltalk, Eiffel, Ruby, and Jython really are OO, then can we please have that discussion on my terms -- where the proponents of "weird-OO" get to start from the disadvantaged position of defending themselves against my (arbitrary, but no more so than their opposite) accusation of "Ha! How can that even hope to be OO, when it doesn't even have strong typing?"; and only after they've successfully defended their favourite languages against that (and other similar perceived-strikes-against), then they get to try to argue that their candidates are somehow better than mine? If not, then why not?



[1]: I don't know if you followed that discussion a while ago on what kind of place it actually is we have here; my standpoint -- declared, if not in that thread, then earlier (repeatedly, IIRC) -- was that since I hang out here on my time, I am not willing to subscribe to some arbitrary "professional" code of behaviour, whatever that may mean in your (collective "you"; and overwhelmingly American, with all what that entails of political-correctness- or religion-based prudery) vocabulary. IOW, I swear in my free time; if you have a problem with that, too bad... OTOH, I do object to having my words altered by someone else, even in my free time: If you can't bring yourself to even quote someone without inserting silly "[bleep]"-ing in what they say, then please don't even quote me at all. (As a slight revenge, I've taken the liberty of correcting your spelling in places... Does that annoy you? If yes, then maybe you see my point... If not, please try to understand that your "[bleep]"-ing does annoy the fuck out of me, regardless, and then maybe you'll see my point anyway.)


   [link|mailto:MyUserId@MyISP.CountryCode|Christian R. Conrad]
(I live in Finland, and my e-mail in-box is at the Saunalahti company.)
Your lies are of Microsoftian Scale and boring to boot. Your 'depression' may be the closest you ever come to recognizing truth: you have no 'inferiority complex', you are inferior - and something inside you recognizes this. - [link|http://z.iwethey.org/forums/render/content/show?contentid=71575|Ashton Brown]
Lots of Comments ...
Wow, lots of stuff to reply to ...

CRC: Ah, uhm... I'm sorry. [...]

Thanks. We've had good conversations in the past, I'm glad we can continue to do so. And I'll refrain from gratuitous editing of your quotes too.

Thanks for the Delphi answers. Other than knowing that it is an object-oriented Pascal, my knowledge of Delphi is almost nil. I'll probably have some followup questions or comments after I digest the material.

Regarding background, my major languages are C, C++, Java, Ruby, and Eiffel (though my C++ and Eiffel are a bit rusty). Along the way I've picked up some Perl, Forth, Haskell, Modula-2 (no Modula-3), Smalltalk, Python, Tcl, Lisp/Scheme, and (pre-object) Ada. I've probably left something out. I think I once knew FORTRAN, but I won't admit to it any more.

Regarding the word "strong"...

CRC: No, that wasn't your "only" claim -- you used the expression "strongly OO", and whether that terminology was your own invention or not, the mere act of using it does imply something more: That you agree that these languages are somehow "more strongly" OO than others. My questions about where this comes from remain unanswered.

I probably would have done better to leave off the word "strong". My argument can be paraphrased as follows: if we have a list of languages that everyone agrees are OO languages, and none of these languages support feature X, then calling feature X a "benefit of OO" is a bit of a stretch.

The use of the phrase "strongly OO" was meant in the sense that the OO support in these languages is strong enough that no one would call them non-OO languages. "Better" was not implied. Indeed, the object models of these languages are so different that to argue that one is better would most likely exclude the others from the list.

CRC: [...] he saw it as a major benefit of using (at least some) OO languages [...]

Then a better subject line would have been "Great OO Argument Closer for a small number of OO languages".

Regarding Weird-OO (clever catch phrase) Languages ...

CRC: This constantly pisses me off from the proponents of [...] "Weird-OO" -- advocates of Smalltalk, Java, Lisp... (and perhaps Eiffel, Ruby, and Jython too) -- languages with weak typing, garbage collection, bass-ackwards syntax (and [...] "closures" and whatnot), [...]

You are again painting with a broad brush. I tried to include a variety of languages in the list. Some are dynamically typed, some are statically typed (we can argue the definitions of strong/weak typing some other time).

Weird syntax? Eiffel's syntax has strong roots in the Pascal family (via Ada). Ruby is more in line with the Pascal family than with the C, Smalltalk, or Lisp families.

Garbage collection technology is over 40 years old and is a component of many major production languages, both OO and non-OO. Calling it weird-OO is just, well, weird.

Closures are just one tiny step away from nested procedures in Pascal. In fact, scratch that. I think I could argue that nested procedures in Pascal are closures. They just aren't anonymous closures.
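The point can be sketched in Python terms (function names are illustrative): the nested function sees the enclosing local exactly as a Pascal nested procedure sees its outer frame, and returning it as a first-class value is the step Pascal cannot take.

```python
def make_counter(start):
    count = start              # outer local, captured by the closure

    def counter():             # the "nested procedure"
        nonlocal count         # write back to the enclosing frame
        count += 1
        return count

    return counter             # first-class: outlives make_counter's frame

c = make_counter(10)
assert c() == 11
assert c() == 12
```

In Pascal, `counter` could read and write `count` the same way, but only while `make_counter` was still on the stack; the closure's extended lifetime is the real difference, not just anonymity.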

I do believe that you have run up against language evangelists, and I have probably played the role of one in the past (and will again in the future if I feel like it). But I think you are reading way too much into my postings in this thread. I think I've been more or less advocacy-free, just reporting verifiable facts (e.g. RAII works in language X, it doesn't work in language Y, etc.).

CRC: Ha! How can that even hope to be OO, when it doesn't even have strong typing?

In Chapter 2 of Object Oriented Software Construction, Bertrand Meyer lays out the criteria of object orientation. On page 25 he names a criterion, Static Typing: "A well-defined type system should, by enforcing a number of type declaration and compatibility rules, guarantee the run-time type safety of the systems it accepts". It sounds like you and the Eiffel language designer are on the same page.

--
-- Jim Weirich jweirich@one.net [link|http://onestepback.org|http://onestepback.org]
---------------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)
You remind me of the blub paradox
It's from [link|http://www.paulgraham.com/avg.html|Beating The Averages]; your complaints about "weird languages" reminded me of it very strongly.

As for defending the right of dynamically typed languages to call themselves OO, well that is a silly requirement. The phrase "object oriented" was coined (AFAIK) by Alan Kay, and he went on to design the first intentionally object-oriented language, which he called Smalltalk. I say intentionally, because he found a lot of the ideas in an earlier language known as Simula.

Any decent discussion of the history of OO acknowledges this, and goes on to point out that other OO languages such as C++ then borrowed heavily from Smalltalk.

Seriously, can you find me any discussion anywhere where someone with any semblance of a clue refers to Smalltalk as not being really OO? Can you find any reasonable discussions of OO that list strong-typing as needed for the concept? Or are you just trying to define convenient pre-conditions for discussion with no concern for reality?

Cheers,
Ben
"good ideas and bad code build communities, the other three combinations do not"
- [link|http://archives.real-time.com/pipermail/cocoon-devel/2000-October/003023.html|Stefano Mazzocchi]
Speaking of PL History
Here's my collection of [link|http://www.angelfire.com/tx4/cus/people/|Mug Shots]. :-)

As to the question, I would say that Kay has had second thoughts about using the term "Object", much preferring the concept of "Message" these days.
Kay's thinking
I've seen him say something to that effect, because he thinks people have largely missed the point. It's not so much that there are classes and objects, but that there is a messaging system and every object may be thought of as a server. This is markedly different from the function-calling behavior of the so-called "normal" (which are actually rather abnormal from my point of view) languages.
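A hedged Python sketch of the messaging view (class and names invented for illustration): attribute interception turns every unrecognized call into a "message" the object can decide how to serve, loosely mimicking Smalltalk's doesNotUnderstand:.

```python
class Server:
    """An object as a server: messages it doesn't recognize are
    intercepted and handled, here by forwarding to a target."""

    def __init__(self, target):
        self._target = target

    def __getattr__(self, name):
        # Invoked only for "messages" this object has no method for;
        # we forward them, but could just as well log, queue, or refuse.
        def handle(*args, **kwargs):
            return getattr(self._target, name)(*args, **kwargs)
        return handle

p = Server("hello")
assert p.upper() == "HELLO"    # the "upper" message is served by forwarding
```

In a plain function-calling language, the set of callable operations is fixed at compile time; here the receiver decides at message-arrival time, which is the distinction being drawn.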

Strong static typing, ugly Algol-ish syntax, and in many cases a lack of decent garbage collection make for a poor cousin to true OO languages.



Smalltalk is dangerous. It is a drug. My advice to you would be don't try it; it could ruin your life. Once you take the time to learn it (to REALLY learn it) you will see that there is nothing out there (yet) to touch it. Of course, like all drugs, how dangerous it is depends on your character. It may be that once you've got to this stage you'll find it difficult (if not impossible) to "go back" to other languages and, if you are forced to, you might become an embittered character constantly muttering acerbic comments under your breath. Who knows, you may even have to quit the software industry altogether because nothing else lives up to your new expectations.
--AndyBower
Edited by tuberculosis Aug. 21, 2007, 05:40:24 AM EDT
xlnt
-drl
Why are you so upset?
I was not trying to upset you. I was attempting to make sure that I was crystal clear. I also attempted to avoid making premature categorizations; for instance, I had been about to say simply that OO languages define destructors when I realized that, AFAIK, JavaScript does not. How many others are the same? How many stripped-down little languages call themselves OO? I have NFC.

I am sorry that this made you feel talked-down to. That was not my intention...
I am a frigging OO programmer, you know; I bloody well know what a destructor is! My question was only and specifically about Jim's expression "guaranteed destructor timing" -- which was used, both in his original post and several subsequent ones (including yours that I'm replying to now) in a confusing and self-contradictory way that showed that it isn't really destructors you're talking about. Thankfully, Scott began to clear this up a little... And I'll spell it out all the way for you, so you won't have to embarass yourself like this again.

[explanation of alternate strategies for freeing resources at block exit deleted]

Geez, when you decide to make yourself look silly, you don't go halfway, do you? The fact that the very first post by gfolkert was not necessarily talking about destructors was indeed clarified at [link|http://z.iwethey.org/forums/render/content/show?contentid=109997|http://z.iwethey.org...?contentid=109997]. If you read that post, see the techniques, and note who wrote it, you might realize that I am perfectly aware of that. You might also start acknowledging the right person.

Which leads to the common sense advice that before you write a rant, review the thread.

Now there is one thing that Jim's response to you reminded me of. C++ differentiates between stack-based and heap-based objects, and manages them differently. Since all of the systems that I have dealt with extensively keep data in the heap, it slipped my mind that he would include C++ because of how it handles stack-based objects. My bad. (Though many C++ programmers do use automatic reference counting in the form of smart pointers, and that is what I was thinking of in my response.) He happened to walk into an active debate in the Perl world right now: because Perl is going from reference counting (Perl 5) to true garbage collection (Perl 6), there have been interminable debates over the exact value of guaranteed destructor timing.

And answering one of your sarcastic comments, no, reference counting is not the same as true GC. Systems with true GC should be able to collect all garbage - including garbage that refers to itself through circular references. Reference counting fails to ever collect circular references unless you manually adjust for that. (Weak references make that easier.)
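The circular-reference point can be demonstrated in CPython, which layers a tracing collector on top of its reference counting (class and names are illustrative):

```python
import gc

class Node:
    def __init__(self):
        self.other = None

gc.disable()               # leave only reference counting active
a, b = Node(), Node()
a.other, b.other = b, a    # circular reference: each keeps the other alive
del a, b                   # refcounts never reach zero, so nothing is freed here
collected = gc.collect()   # the tracing pass walks the cycle and reclaims it
gc.enable()
assert collected >= 2      # at least the two Node objects were unreachable
```

With only reference counting, the two nodes would leak for the life of the process; the manual `gc.collect()` call stands in for the periodic tracing pass a "true GC" runtime performs on its own.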

As for whether true GC is better or worse than reference counting, people debate that forever. To forestall pointless debate here, allow me to summarize:

  1. Reference counting is easier to understand.
  2. Reference counting offers more deterministic behaviour.
  3. Once implemented, true GC is easier to maintain. (In a large software project with reference counting, it is virtually inevitable that bugs will crop up in the counting, leading to memory leaks. This is far less of an issue with true GC.)
  4. GC handles circular references, which can easily come up in complex data structures; reference counting does not.
  5. Neither offers clearly superior performance. Or rather, each clearly outperforms the other in the right circumstances. GC runs are not cheap, but neither are the cache misses caused by constantly having to update reference counts in random places in memory.
  6. People tend to resist moving away from whichever one they currently use.

But like it or not, true GC is an idea that is working its way into more and more environments as time goes on. Higher-order languages favoured by those who are into "research languages" have long used true GC. See Lisp, Smalltalk, Haskell, and Ocaml for examples. Mainstream languages created by major software vendors in recent years have followed suit. This is true whether you are talking about the Microsoft world (VB, C#, etc), web development (JavaScript) or Sun's world (Java). (Before you get annoyed at my not including Delphi, I didn't include Perl either - and Perl at this point probably has more users than Delphi.) There are some preliminary signs that scripting languages are starting to head that way (Ruby has it, some forms of Python do, the next generation of Perl will). Older mainstream languages and direct derivations thereof (C, C++, Pascal, etc) do not, and seem unlikely to change.

So why the trend? My take is that people who have built enough big systems get sick and tired of bugs in reference counting and eventually go with true GC because it is more reliable with less work. Of course I could be wrong, but that is the biggest argument given for true GC if you look for references on it, and it fits with comments from specific project leaders that I know who have chosen to use it in their projects...

Cheers,
Ben
"good ideas and bad code build communities, the other three combinations do not"
- [link|http://archives.real-time.com/pipermail/cocoon-devel/2000-October/003023.html|Stefano Mazzocchi]
Edited by ben_tilly July 17, 2003, 10:50:14 AM EDT
Caw, caw: My own fault, mostly, mis-remembering the thread.
Ben T. writes:
I was not trying to upset you. I was attempting to make sure that I was crystal clear. [...] I am sorry that this made you feel talked-down to. That was not my intention...
Well, happens all too easily when one tries (too hard?) to be "crystal clear"... How could that not sound, to the suspicious mind, like (you thought) you were talking to the village idiot? :-)

Mainly my own fault, though; got in a cranky mood from misreading you (and Jim). Sorry about that. That's about all the crow I'm going to eat, here -- gotta leave some room for my reply to Jim... But, lest it be forgotten, I am sorry about that, and will try to learn from this lesson.


Geez, when you decide to make yourself look silly, you don't go halfway, do you?
Fuck, no, I don't believe in doing things by halves! :-)


Which leads to the common sense advice that before you write a rant, review the thread.
Yup, thanks; that's the lesson I'll try to learn from this.


And answering one of your sarcastic comments, no, reference counting is not the same as true GC.
Well, I wasn't trying to say the two variants were the same thing, either. What I was objecting to was your naming one variant "true" garbage collection while contrasting it to another...


As for whether true GC is better or worse than reference counting, people debate that forever. To forestall pointless debate here, [...]
...as per the above, by calling one implementation "true" you are perforce implying that the other is "false", and once you start discussing in terms like that, what's the use of having any "debate" in the first place?

(Not that I'd probably join a debate about contrasting garbage collection implementations even if it were conducted in less prejudicial terms; I don't hold with the whole idea.)


But like it or not, true GC is an idea that is working its way into more and more environments as time goes on.
Yeah, but that's the infamous Fly Diet Argument.


   [link|mailto:MyUserId@MyISP.CountryCode|Christian R. Conrad]
(I live in Finland, and my e-mail in-box is at the Saunalahti company.)
Your lies are of Microsoftian Scale and boring to boot. Your 'depression' may be the closest you ever come to recognizing truth: you have no 'inferiority complex', you are inferior - and something inside you recognizes this. - [link|http://z.iwethey.org/forums/render/content/show?contentid=71575|Ashton Brown]
Just responding to the GC point...
...as per the abvove, by calling one implementation "true" you are perforce implying that the other is "false", and once you start discussing in terms like that, what's the use of having any "debate" in the first place?

Real garbage collection algorithms get at anything that is garbage - including circular references. Reference counting does not detect self-referential garbage. It therefore is incomplete.

Therefore, saying that reference counting is not true GC is not meant as a pejorative; it is descriptive. As a GC algorithm, reference counting is incomplete and has unfixable, serious bugs. Of course it does things that real GC algorithms cannot do. Perhaps you don't want real GC? After all, many complex systems haven't felt it necessary...

Cheers,
Ben
"good ideas and bad code build communities, the other three combinations do not"
- [link|http://archives.real-time.com/pipermail/cocoon-devel/2000-October/003023.html|Stefano Mazzocchi]