Post #54,566
10/2/02 7:37:42 PM
|
Re: I didn't invent it
It's an operator, why not call it that? ARG! Mathematicians...
-drl
|
Post #54,572
10/2/02 7:55:24 PM
|
It isn't, necessarily
In Perl it is possible to view it as an operator (though that isn't the grammatical part of speech that it plays), but Ruby has them too, and in Ruby it doesn't act like an operator.
Cheers, Ben
"Career politicians are inherently untrustworthy; if it spends its life buzzing around the outhouse, it's probably a fly." - [link|http://www.nationalinterest.org/issues/58/Mead.html|Walter Mead]
|
Post #54,573
10/2/02 7:58:33 PM
|
Re: It isn't, necessarily
What's wrong with all of us? We're all using scripting languages.
-drl
|
Post #54,574
10/2/02 8:03:32 PM
|
Why is that wrong?
Computers have reached a point where for most of what needs to be done, programmer time is orders of magnitude more valuable than computer cycles. Scripting languages do a better job at that balance than system languages do.
There is also a productivity sweet spot with teams of 2-6 people. So if you can get a development effort down to needing that many people, it is an absurdly big win.
Cheers, Ben
"Career politicians are inherently untrustworthy; if it spends its life buzzing around the outhouse, it's probably a fly." - [link|http://www.nationalinterest.org/issues/58/Mead.html|Walter Mead]
|
Post #54,629
10/3/02 8:37:49 AM
|
Not me
I spent the weekend marrying [link|http://www.fscript.org|FScript] and [link|http://www.toodarkpark.org/computers/objc/|Objective C] with [link|http://www.ghg.net/clips/CLIPS.html|CLIPS].
I need to do a Cocoa UI and a palette for Interface Builder. Then you'll have expert systems capabilities at your fingertips as needed. Just define the rules for your objects and the system enforces them. Also need to define several deftemplates for common system classes.
Java, C#, JavaScript, these languages are all from the dark ages. Who needs em?
I am out of the country for the duration of the Bush administration. Please leave a message and I'll get back to you when democracy returns.
|
Post #55,026
10/5/02 1:55:11 AM
|
You've got life easy
just try dealing with some of the stuff I sometimes have to deal with in industrial automation -- it makes C look good.
For example, I recently had to write a Logo! program in its version of ladder logic, an Aromat IEC 61131 program (I used Structured Text, which looks a bit like Pascal but has some very interesting quirks), and an Animatics SmartMotor program (roughly similar to GW-BASIC). At least all three of them were short.
Python is so much more fun -- and productive. Thankfully, I've been doing some of that too lately, along with a bit of wiring and electrical troubleshooting. I haven't blown anything up yet -- but some of my coworkers tried.
Tony
|
Post #54,589
10/2/02 10:19:07 PM
|
Not idiomatically.
They might be lexical* operators, but they're not necessarily functional operators. The latter is what people usually mean by the term "operator" in the context of a programming language. Like "+" or "and". The semi-colon in PHP is a lexical operator, too.
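One rough way to see the split, sketched in Python rather than PHP (the snippet is just my own illustration, not anything out of a spec):
import operator

# "+" is a functional operator: it names an operation on values,
# so it even has a callable counterpart in the standard library.
assert 2 + 3 == operator.add(2, 3)

# The semicolon is purely lexical/syntactic: it separates statements
# but computes nothing, so there is no callable equivalent for it.
a = 1; b = 2   # legal, but the ";" contributes no value of its own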
Wade.
* That might be the wrong term: I could mean syntactic. Or semantic.
"Ah. One of the difficult questions."
|
Post #54,590
10/2/02 10:26:22 PM
|
In Perl they are dereferencing operators
"Career politicians are inherently untrustworthy; if it spends its life buzzing around the outhouse, it's probably a fly." - [link|http://www.nationalinterest.org/issues/58/Mead.html|Walter Mead]
|
Post #54,613
10/3/02 1:43:33 AM
|
semicolon rant IV
The semi-colon in PHP is a lexical operator, too.
@#^%@$ Semicolons!
I have been hating them for multiple decades, ever since Pascal in college. They are sooooo archaic.
I kind of dig the Python approach: if you want a line to wrap, then use parentheses (if they aren't already there) to leave the statement open at the end of the line. This avoids a "continuation character" like VB's, and most wrappings are due to long function/method parameter lists anyhow.
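For example (a quick Python sketch -- the variable names are made up purely for illustration):
first_value, second_value, third_value = 1, 2, 3

total = (first_value +      # the open paren leaves the statement open,
         second_value +     # so the line can wrap freely with no
         third_value)       # continuation character at all

# the explicit backslash continuation also exists, but reads worse:
total = first_value + \
        second_value + \
        third_value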
I suppose that is kind of LISPish in principle. If LISP was economically viable, I just might dig into it more. It seems to have a lot of cool features. Paul Graham is my current hero.
________________ oop.ismad.com
|
Post #54,631
10/3/02 8:41:17 AM
|
Or you can use periods.
It's what Smalltalk uses. Most human languages too.
dictionary at: 'Greeting' put: 'Hello'.
Seems OK to me.
I am out of the country for the duration of the Bush administration. Please leave a message and I'll get back to you when democracy returns.
|
Post #54,999
10/5/02 12:40:32 AM
|
re: periods
Its what Smalltalk uses. Most human languages too.
Nope. Don't like that either. COBOL had the problem where the presence or absence of a period could change the entire meaning. At least most C-cloned langs give a syntax error instead of a different interpretation if a semi is omitted. COBOL is what happens when you try too hard to "be like English".
But, I prefer to let the end-of-line be the indicator. It is more WYSIWYG imo.
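In Python terms, a tiny sketch of what I mean (leave off a terminator and nothing gets silently reinterpreted -- the newline itself is the terminator):
x = 1              # the newline alone ends the statement
y = 2; z = 3       # a semi is allowed, but only as an optional separator on one line
print(x + y + z)   # prints 6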
It is a personal preference, but that is the way I like it.
________________ oop.ismad.com
|
Post #55,189
10/6/02 6:29:49 PM
|
re: periods
But this is wrong, at least in the UNIX world where everything is a stream (and "UNIX world" means the Internet). The EOLs are delimiters in one context and not in the other. Nothing could be simpler.
-drl
|
Post #55,655
10/8/02 11:40:57 PM
|
Huh?
________________ oop.ismad.com
|
Post #55,757
10/9/02 11:59:35 AM
|
Re: Huh?
When C was invented, the most popular languages, COBOL and FORTRAN, were line-oriented. Typically a program would go into the system via a punched card reader. In FORTRAN, you had columns 7 to 72 to scribble in on a keypunch machine. Everyone saw that having to worry about carriage control at the language level was brain-dead. So a delimiter was needed to tell the compiler that a line was concluded. What would you have picked?
-drl
|
Post #57,386
10/17/02 3:53:17 PM
|
Who is "everyone"?
Everyone saw that having to worry about carriage control at the language level was brain-dead.
I don't see the problem. I am not the only one who dislikes semicolons. Bertrand Meyer made them optional because of complaints about Eiffel's semicolons, for example.
Besides, there were other languages around to serve as sources of possible solutions. LISP does not need semi's, for example.
Just accept the fact that some people don't like semi's.
________________ oop.ismad.com
|
Post #57,406
10/17/02 5:09:06 PM
|
Once More With Feeling
Everything in UNIX is byte-stream oriented. The compiler chews on a byte stream. Because it was developed as a time-share system with terminals (a radical idea at that time), it uses newlines to format byte-streams for the screen. That's a particular delimiter. The C compiler uses another delimiter in its input stream of code - and GASP! someone had the thought that it would be a GOOD THING if humans could SEE them! The semi-colon is very expressive - stop, but keep going.
It's a total, complete, non-issue.
-drl
|
Post #57,751
10/18/02 11:20:43 PM
|
I don't care about the 1970's, I hate them here and now
That is a bunch of malarkey! There are plenty of non-semicolon languages that run just fine on UNIX. If semi's made stuff easier in the 70's, well, this is NOT the 70's. Get modern.
I don't like semi's and work faster without the damned things. That is a FACT. If you like them, that is fine, but don't force them down everybody else's throat.
________________ oop.ismad.com
|
Post #57,775
10/19/02 12:43:55 AM
|
Simple solution.
And that is: don't use languages with semi-colons as statement terminators, if it bothers you so much.
There's no need (or call) for you to insist that everyone subscribe to your view. I happen to intensely dislike identifiers with a leading underscore, but I can't make everyone stop using them. (Fortunately, it's less of an issue for me in PHP because of the leading $ on all variables.)
Wade.
"Ah. One of the difficult questions."
|
Post #57,782
10/19/02 1:54:47 AM
|
The Galileo of Computing!
Sorry.
It's really funny to watch you have a stroke over a stupid delimiter. Maybe we should replace it by a new character just for you - the "catastrophe".
-drl
|
Post #58,195
10/22/02 2:32:44 AM
|
Others complain too
It's really funny to watch you have a stroke over a stupid delimiter.
That happens to be the topic at hand. If you have Bertrand Meyer's OOSC2 you will see that others have found it problematic also. Page 897 partial quote: "The War of the Semicolon [title]....Two clans inhabit the computer world, and the hatred between them is as ferocious as it is ancient...."
It is just one of those annoying things that keep slowing things down. I used Pascal extensively in college, and never did get used to semi's. Perhaps if *all* the langs I ever used had semi's, then it would be a natural habit by now. However, I encounter a mix.
If one keeps losing their car keys once every 2 weeks, then a combination-based lock starts to look pretty good (like the Nissan Maxima).
BTW, some non-semi languages do NOT need VB-style dashes or continuation characters. Python and LISP come to mind.
________________ oop.ismad.com
|
Post #57,407
10/17/02 5:12:06 PM
|
Wrong.
Everyone saw that having to worry about carriage control at the language level was brain-dead.
No, it isn't. cf. Python
Regards,
-scott anderson
"Welcome to Rivendell, Mr. Anderson..."
|
Post #57,414
10/17/02 5:34:53 PM
|
Re: Wrong.
We're talking 1970s, FORTRAN, COBOL, etc. Carriage control was heavily device-oriented and record-oriented, because IBM made record-oriented systems. Byte-stream oriented UNIX was really radical.
Some of us were around in antediluvian times :) Things change.
Here's a history of carriage control:
[link|http://www.ibiblio.org/pub/languages/fortran/ch2-14.html|http://www.ibiblio.o...rtran/ch2-14.html]
-drl
|
Post #57,487
10/18/02 7:25:31 AM
|
This is a general trend
Later-invented systems often find simple solutions that seem so bad that earlier systems never considered doing it that way. Byte-oriented UNIX with lines parsed out versus records is one example. Automatic memory management in new languages is a more familiar one today.
Cheers, Ben
"Career politicians are inherently untrustworthy; if it spends its life buzzing around the outhouse, it's probably a fly." - [link|http://www.nationalinterest.org/issues/58/Mead.html|Walter Mead]
|
Post #57,418
10/17/02 5:41:27 PM
|
Historical trivia that applies...
I seem to recall that the early C compilers actually had an 80-character limit that paralleled a Hollerith card. If you ran over the 80-char line limit, the line was truncated. New compilers have no line limit and span lines without a '\' for a line break. A delimiter is needed. No question. This works. I would rather have a simple syntax than the tyranny of making everything I code intuitively readable to all potential readers -- mean? average? every bum off the street? WHO?
|
Post #57,465
10/17/02 9:59:49 PM
|
Nah, not Kernighan and Ritchie.
K&R: Blanks, tabs, newlines, and comments (collectively called "white space") as described below are ignored except as they serve to separate tokens. Some white space is required to separate otherwise adjacent identifiers, keywords, and constants.
So any C compiler that does otherwise would not be K&R.
Alex
"I have a truly marvelous demonstration of this proposition which this margin is too narrow to contain." -- Pierre de Fermat (1601-1665)
|
Post #57,488
10/18/02 8:13:04 AM
|
Right, Not K&R
I didn't start using C until 1984 or so, and every compiler I saw was mostly K&R compliant. ISO came later. I was told that a lot of the code I was looking at was written like it was because of the 80-character line limitation on the system it came from. I never saw the original system, and the story came to me as probably third-hand word of mouth. I suppose you can say that if it's not K&R, it's not a C compiler. I suppose that the story could also be bullshit, although I did not pass it on with such knowledge. A lot of the information I possess, I have through informal sources, including this forum. I recalled the story and thought that it pertained to the thread. Sorry about that.
Hugh
|
Post #57,489
10/18/02 8:30:21 AM
|
Don't be. Sorry, I mean.
|
Post #57,548
10/18/02 12:37:33 PM
|
Re: Right, Not K&R
No, it's very interesting! Even as late as the late 70s, college computing was done on a Forbin Project style 'frame - ours was a Cyber, one of the fastest computers then made. The hapless student/researcher would code up his program on Hollerith cards and present the deck at the I/O window - later a public card reader was available and you could submit your own deck, eat some horrid dry cookies or chips with your 10th Dr. Pepper of the day, and wait for the I/O window troll to deliver your output.
General availability of time-share terminals only came in the early 80s, IIRC.
PROFS for VM/CMS (Professional Office System or some such), the GroupWise of its day, had "virtual card readers" and "virtual punch files". GASP!
-drl
|
Post #58,034
10/21/02 12:00:00 PM
|
Actually, might it have been a "standard" to fit terminals?
When I first went to work (in the mid-80's) programming C, it was a "standard" that the code would match a certain format - and, except for long strings (back in those days C compilers didn't concatenate adjacent string literals), it was recommended that we use 80-character lines (at worst) and preferably something like 72 columns, for readability on the Wyse75 terminals we had at the time and, I suppose, for the ability to more easily print stuff out on the 80-column line printers we had at the time.
If I remember correctly, you could put those Wyse's into 132-column mode, but you'd probably go blind if you did so. And if you went over 80 columns on any particular terminal, you'd have to monkey with termcap/terminal settings, and it always seemed to take a lot more effort than it was ever worth.
The lawyers would mostly rather be what they are than get out of the way even if the cost was Hammerfall. - Jerry Pournelle
|