Jim W. writes:
It[']s obvious that we are not communicating for some reason, but if you want to drop the vitr[i]ol and one-upmanship [...]
Ah, uhm... I'm sorry. I was cranky, misread you and misremembered the thread, and was out of line in my reply; please forgive me.
[...] and continue in a more professional manner, I'm willing to give it a try.
And I'll be delighted to try and behave in a more civilised manner, if that's good enough for you.[1]
Perhaps some of our confusion lies in the languages we are famili[a]r with. I confess that I don't know Delphi, but perhaps you could educate me in the following areas:
Sure.
Can you allocate objects that are stored on the stack (as opposed to an object that is allocated on the heap, but has a reference on the stack)?
Nope.
If (1) is YES, are destructors run on stack based objects when the stack frame is deallocated?
As Rick M. would say, "Mu".
(Well, come to think of it, there may actually have been some deep-OO hacks in this direction, but I can't remember for sure if that's really what my vague recollection of something I read on the Web years ago was about... If you really, really want to know, google for the phrase "Halvard Vassbotn" and some suitable keywords.)
Does Delphi provide any automated heap management (either GC or ref-counted), or does it rely on manual recl[am]ation of heap objects[?]
The answer to this is slightly messy, but with a strict literal interpretation of the question ("... any..."), the short-version reply boils down to: Yes.
The reason why the longer answer is a bit messy is that Delphi has two ways to refer to objects; i.e., two different kinds of types to declare object variables as: object classes, or interfaces. (Rather like Java, no?) Straight class-type references have no GC whatsoever; interface references are managed by ref-counted garbage collection. Here's an example from the Delphi 7 Help files:
The Delphi compiler provides most of the IInterface memory management for you by its implementation of interface querying and reference counting. Therefore, if you have an object that lives and dies by its interfaces, you can easily use reference counting by deriving from TInterfacedObject. If you decide to use reference counting, then you must be careful to only hold the object as an interface reference, and to be consistent in your reference counting. For example:
procedure beep(x: ITest);
procedure test_func;
var
  y: ITest;
begin
  y := TTest.Create; // because y is of type ITest, the reference count is one
  beep(y);           // the act of calling the beep function increments the reference count
                     // and then decrements it when it returns
  y.something;       // object is still here with a reference count of one
                     // (Though presumably it'll be gone after the "end;". -- CRC)
end;
This is the cleanest and safest approach to memory management; and if you use TInterfacedObject it is handled automatically.
This, together with the fact that you can refer to the same object either as a class-type or an interface reference, has interesting consequences (on which more below).
And if the above questions don't make sense in the context of Delphi, I would be interested in hearing why. Thanks.
Of course they did, and you're welcome. (How could they not make sense?)
In addition to Delphi, what other OO languages are you famili[a]r with? I ask in order to find some common ground for better communication.
The usual: C++, Java; Visual Basic and JavaScript, to the extent you want to count those as OO. Smatterings of others; mostly just what I've picked up from books, or, more recently, code samples on the 'Net: Ada, Modula-2/3, Eiffel, C#, maybe some others I forget... Oh yeah: Rudimentary Smalltalk and LISP, though only to the extent that I've been able to conclude that they're syntactically, uh... weird. (There, has that at least somewhat allayed your fears that I'm so totally one-track I know nothing but Delphi? :-)
[Jim's answers to some of my earlier questions:]
Yes, a local object in C++ is stack allocated (unless the static keyword is used ... let's not go there).
Why not -- ya think it would just be another occasion for me to say, "That's one of the reasons C++ sucks"? :-)
CRC: If so: Don't heap-based ones count, or what?
From the original Poster: [...] a function that is automatically (or automagically) called at scope exit. Since the lifetime of a heap variable is independent of scope, then they aren't pertinent to the question.
Ah, yes, one of the things I forgot; that's why I should have re-read the thread... :-(
OTOH, "a function that is automatically (or automagically) called at scope exit" doesn't necessarily have to mean object destruction, does it? (cf Scott's post, and some of my earlier comments, about "finally".) Anyway, from what I understand of Delphi's mixed class / interface model, a scope exit can actually kill a heap-allocated object. Delphi's Help continues:
If you do not follow this rule, your object can unexpectedly disappear, as demonstrated in the following code:
procedure test_func;
var
  x: TTest;
begin
  x := TTest.Create; // no count on the object yet
  beep(x as ITest);  // count is incremented by the act of calling beep
                     // and decremented when it returns
  x.something;       // surprise, the object is gone
end;
And, since all Delphi objects, "local" or not, are allocated on the heap, TInterfacedObject or some other Interface implementation ought to be able to kill a non-"local" object, too. That is, if the "x := TTest.Create;" bit had been outside this function (but executed some time before it was called, preferably... :-), then after this function returned, "x" would be gone. So, no, apparently: Heap variables "aren't pertinent to the question" only if one insists on limiting oneself to C++ (and similar languages).
[ oops ... I hit save before I finished this ... I'll continue in separate posting]
Sorry, the mouse slipped before I finished the previous posting. I was trying to address some of your questions ... I'll finish here...
Uh, why did you do it that way? I mean, you'd obviously discovered the Edit function here, so why didn't you just use that -- not in order to make me have to eat two separate slices of humble pie, I hope? :-)
CRC: What the [bleep][1] is wrong with the destructor of an object just simply being called when you actually CALL it?!?
Depending on the context, explicitly calling a destructor could be a good or bad thing. If the language automatically called destructors, then manually calling it might not be a great idea. But refer[r]ing to the context of the original poster: automatically called at scope exit, I am assuming the destructor is called automatically. Manually calling the destructor is uninteresting in this context.
Hmm... Yeah, probably.
(OTOH, Delphi's class libraries (the VCL and CLX frameworks) are to quite a considerable extent "semi-automatic" in this respect: The network of "Owner" relationships in, for example, a TForm descendant, makes sure that when the top-level object is destroyed (possibly by some Interface-brokered trick as per above) all its "Owned" components are destroyed (in order, and each executing its destructor), too. Quite "manual", in the sense that the calls are explicitly there in the framework source-code -- as opposed to, say, C++ with its spontaneous compiler-black-magic auto-generated creation and destruction of objects -- but "automatic" in the sense that to the user (not end-user; framework user, application programmer), when he's done with something and gets rid of it, all its dependent objects go away too, almost "as if by magic".)
CRC: [...] you C++ people [...think...] "Only in C++ can you be THIS advanced!"
I'm not sure where you got the idea that I thought RAII was an advanced concept.
Well, if everybody thought it was perfectly obvious and trivial, they wouldn't have gone to the trouble of inventing an acronym for it, would they? But, of course I shouldn't have blamed you, personally, for this, and I'm sorry it came out that way.
It is using C++'s rather primitive memory management facilities for purposes beyond memory management. This seems to me to be a bad idea, forgivable in C++ only because C++ has little else to offer in its place. Those who used the RAII idiom in Python will have problems running their code in Jython because the memory management is different. As Ben pointed out, assumptions about the behavior of memory management are causing great discussions about the move in Perl 6 to a true GC approach.
Allow me to take this opportunity to say, "That just goes to show the whole idea of garbage collection sucks." Thank you.
CRC: Doesn't your fixation on "end of scope" [...]
The reason for the "end of scope" fixation is merely because that is the context of the original poster[']s statement.
Ah, yes; forgot that; sorry.
(Then again, ever heard of how discussions tend to wander a bit from where they started? Ever thought of letting them do so, and just going with the flow...?)
CRC: If they [Smalltalk, Eiffel, Ruby, Jython] don't support this trivial OO concept, maybe they aren't so "strongly" OO after all, eh? Who said they were? Based on which criteria -- and who got to choose those criteria?
I've heard people dismiss C++ as non-OO, but this is the first time I've ever heard anyone claim that Smalltalk was non-OO. My only claim was that these languages are generally accepted as OO languages, yet don't support a RAII-style idiom; so how critical can the RAII idiom be to the OO concept? If you wish to argue against these languages being OO, then I for one would like to see that (but probably in a different thread).
A: No, that wasn't your "only" claim -- you used the expression "strongly OO", and whether that terminology was your own invention or not, the mere act of using it does imply something more: That you agree that these languages are somehow "more strongly" OO than others. My questions about where this comes from remain unanswered.
B: Who said the RAII idiom was supposed to be "critical to" the OO concept? Judging from Greg's original post, he saw it as a major benefit of using (at least some) OO languages, which AFAICS is another thing altogether.
(B 2: Still and again, if Smalltalk, Eiffel, Ruby, and Jython can't provide even this apparently so obvious and trivial and "primitive" benefit of OO, doesn't that speak at least against them being more "strongly OO" than others, that do provide this benefit?)
C: I didn't say Smalltalk, Eiffel, Ruby, and Jython are non-OO; I only objected to your painting them as somehow "more OO", or "better OO", than, for instance, C++ or Delphi. (And please don't try to hide behind someone else, who[ever] may have coined your "strongly OO" expression -- I've let that slide on the "RAII idiom" thing above, and that should be enough -- your using these expressions is enough of an endorsement of their connotations that you are intellectually obliged to stand up for them.) That's not quite the same thing.
C 2: This is what constantly pisses me off about the proponents of... Uh, how do I put this in a single phrase... The proponents of what I will have to call, for lack of a better catch-all, "Weird-OO" -- advocates of Smalltalk, Java, Lisp... (and perhaps Eiffel, Ruby, and Jython too) -- languages with weak typing, garbage collection, bass-ackwards syntax (and just generally "weirdo" ["weirdoo"?] concepts like "closures" and whatnot), etc, etc... Whenever one tries to discuss the merits of various OO languages with these guys, the discussion sets out from their pre-conceived reality, where "MyWeirdOOLanguage is more-and-better-OO than YourNonWeirdOne", if it isn't an axiom, at least follows immediately from the axioms "The less typing and structure a language has, the more-and-better-OO it is" and "The more weird and incomprehensible features, and the less natural-language-like syntax, a language has, the more-and-better-OO it is"... One has to fight against these preconceptions for ages before one gets to non-slanted ground where a discussion could even start on something approaching an even footing... It's a bit like talking to Bryce, only in reverse.
(C 2 b: Yes, you've probably noticed a strong overlap between the set of languages in my preceding paragraph and those you mentioned yourself, above; and therefore might be tempted to suggest that there is already a catch-all phrase for them: "Strong OO". But, that's precisely my point: Why should the discussion be conducted in your [collective or personal "you"; take your pick] prejudicial terms? That's pretty much declaring them "more" or "better" OO; a tacit -- but rather obvious -- acceptance-as-axiom of what we set out to prove or disprove.)
C 3: So, if we ever do get around to starting that other thread (there is a check-box and a combo-box for this very purpose just below the reply-text edit box; you're perfectly free to start it from this) on whether Smalltalk, Eiffel, Ruby, and Jython really are OO, then can we please have that discussion on my terms -- where the proponents of "weird-OO" get to start from the disadvantaged position of defending themselves against my (arbitrary, but no more so than their opposite) accusation of "Ha! How can that even hope to be OO, when it doesn't even have strong typing?"; and only after they've successfully defended their favourite languages against that (and other similar perceived-strikes-against), then they get to try to argue that their candidates are somehow better than mine? If not, then why not?
[1]: I don't know if you followed that discussion a while ago on what kind of place it actually is we have here; my standpoint -- declared, if not in that thread, then earlier (repeatedly, IIRC) -- was that since I hang out here on my time, I am not willing to subscribe to some arbitrary "professional" code of behaviour, whatever that may mean in your (collective "you"; and overwhelmingly American, with all that that entails of political-correctness- or religion-based prudery) vocabulary. IOW, I swear in my free time; if you have a problem with that, too bad... OTOH, I do object to having my words altered by someone else, even in my free time: If you can't bring yourself to even quote someone without inserting silly "[bleep]"-ing in what they say, then please don't quote me at all. (As a slight revenge, I've taken the liberty of correcting your spelling in places... Does that annoy you? If yes, then maybe you see my point... If not, please try to understand that your "[bleep]"-ing does annoy the fuck out of me, regardless, and then maybe you'll see my point anyway.)