Post #4,368
8/9/01 11:35:15 AM
|
Hooey.
We will have AI when we couple brain events to bodily functions just like humans. "Human" brain = machine intellect + your skin, heart, lungs, and bowels (for various meanings of '+').
We can't match a child's intellect in a machine today because we have basically two camps in AI "research":
1) Those who are trying to throw software at the problem: "If we just wrote down all the rules of life...", and
2) Those who are trying to throw hardware at the problem: "If we just had a billion billion bits..."
Neither group is going to get anywhere; a Turing candidate has to have a way to train its mind through its body. Emotion (its bodily sources and feedback effects) is the addition of value to machine-like fact. This is the way humans weight events: why a human will listen to the person screaming, "FIRE!" in the kitchen over the TV screaming "BUY ME!"--your gut knows before your brain does where to focus attention. Current machines have no *inherent* mechanism for this--it must all be coded. And nobody wants to do that much coding.
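The weighting idea above can be sketched in a few lines. This is purely illustrative (the stimulus list, the `affect` field, and the `focus` function are all made up for this post): stimuli carry not just content but a body-supplied "affect" weight, and attention goes to the highest-weighted event rather than the loudest one.

```python
# Toy sketch: each stimulus carries an "affect" weight supplied by the
# body, and attention simply follows the highest weight.
STIMULI = [
    {"source": "TV",      "message": "BUY ME!", "affect": 0.1},
    {"source": "kitchen", "message": "FIRE!",   "affect": 0.95},
]

def focus(stimuli):
    """Attend to the stimulus with the highest affect weight."""
    return max(stimuli, key=lambda s: s["affect"])

print(focus(STIMULI)["source"])  # kitchen
```

The point is that the weights come *from the body*, for free; in current machines every one of them would have to be hand-coded.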
My $2.00.
That's her, officer! That's the woman that programmed me for evil!
|
Post #4,381
8/9/01 12:23:41 PM
|
exactamento how do you code an orgasm?
when you are walking down the street and a smiler of whatever breed turns you on, the brain does a quick inventory/process; then you walk into a parked car, bark your shins, and while you are falling down, the solution to the problem you have been working on all week surfaces. How the heck are you going to program that? thanx, bill
Our bureaucracy and our laws have turned the world into a clean, safe work camp. We are raising a nation of slaves. Chuck Palahniuk
|
Post #4,417
8/9/01 3:15:22 PM
|
Re: exactamento how do you code an orgasm?
Well, they can fake it, like a lot of women do! :)
But you can't fake a thermal runaway.
Alex
Only two things are certain: the universe and human stupidity; and I'm not certain about the universe. -- Albert Einstein (1879-1955)
|
Post #4,420
8/9/01 3:48:50 PM
|
It appears that greed has been coded.
[link|http://www.theregister.co.uk/content/4/20907.html|Robots learn greed is good.]
One of the seven deadly sins at a time.
Alex
Only two things are certain: the universe and human stupidity; and I'm not certain about the universe. -- Albert Einstein (1879-1955)
|
Post #4,426
8/9/01 4:58:36 PM
|
"Lust" should be amusing ....
Jay O'Connor
"Going places unmapped to do things unplanned to people unsuspecting"
|
Post #4,435
8/9/01 7:34:18 PM
|
Microsoft has dibs on implementing that one...
|
Post #4,575
8/10/01 9:31:01 PM
8/10/01 9:31:52 PM
|
P H B-line
>> "We see robots working in the frenzy of the trading pit while humans are elevated to a managerial role," he said. <<
Elevated?????
The only reason computers don't do management is because being honest and accurate is easier to program than manipulation and BS.
________________ oop.ismad.com
Edited by tablizer
Aug. 10, 2001, 09:31:52 PM EDT
|
Post #4,573
8/10/01 9:22:51 PM
|
st....think tank
>> when you are walking down the street and a smiler of what ever breed turns you on the brain does a quick inventory/processes when you walk into a parked car bark your shins and while you are falling down the solution to the problem you have been working on all week surfaces. How the heck are you going to program that? <<
Many of my best ideas come to me on the flusher [1], not after falling. Does that mean R2D2 needs to grow up on the can? (.....Wait, he *is* a can.)
[1] Nicknamed "think tank".
________________ oop.ismad.com
|
Post #5,214
8/14/01 10:30:12 PM
|
Well, I suppose that's an appropriate...
...nickname to indicate where your thoughts should go.
Christian R. Conrad The Man Who Knows Fucking Everything
|
Post #4,930
8/13/01 6:19:15 PM
|
Which is why camp #1 will never get anywhere
That's her, officer! That's the woman that programmed me for evil!
|
Post #4,408
8/9/01 2:47:53 PM
|
Yeah - who cares? Organisms care.
I don't mean "care" like get all gushy about The Children. I mean "care" in the sense that I care about getting that next breath, like I care about dinner when I'm hungry, like my dog cares about getting out the door without a leash.
AI, as currently being developed, doesn't care. Deep Blue may win, but it will take no joy from the victory and no pain from setbacks on the way.
Organisms care. Even pretty basic ones. The iguanas I'm taking care of for the summer really do like bananas. And I've never met a computer that liked - or didn't like - anything.
White guys in suits know best - Pat McCurdy
|
Post #4,574
8/10/01 9:26:36 PM
|
Agony of defeat
>> AI, as currently being developed, doesn't care. Deep Blue may win, but it will take no joy fron the victory and no pain from setbacks on the way. <<
How do you know that? You are not Deep Blue. Perhaps D.B. simply lacked a way to express it.
Anyhow, the reward-and-punishment system is fairly well understood, partly by research into drug addiction. The agony of defeat may simply be a high-ranked message that backtracks to the behavior trigger "path" that produced the defeat-causing behavior.
If you send a suppressor message back to the origin of given behaviors upon loss or defeat, eventually those nodes responsible for the offending behavior will accumulate "bad karma", sort of like Slashdot. (Still does not stop me from anti-OO trolling :-) Aka the "Delta rule" of AI?
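A minimal sketch of that suppressor mechanism, in the spirit of the delta rule (all names, the node labels, and the learning rate here are invented for illustration): each node that fired along the path to a defeat gets a suppressor signal that lowers its weight, roughly `w <- w + lr * (target - output)`.

```python
# Hypothetical "bad karma" accumulation: on defeat, backtrack along the
# trigger path and apply a delta-rule-style update to each node.
LEARNING_RATE = 0.5

weights = {"attack-early": 1.0, "defend-center": 1.0}

def punish(path, weights, lr=LEARNING_RATE):
    """Send a suppressor message back along the behavior-trigger path."""
    for node in path:
        # target = 0 (we wanted no defeat), output = 1 (the node fired),
        # so the update reduces the weight of each offending node.
        weights[node] += lr * (0 - 1)

punish(["attack-early"], weights)
print(weights["attack-early"])  # 0.5
```

Over repeated defeats the offending nodes' weights drain away, so the behavior fades without anyone hand-coding which behavior was at fault.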
Besides, that still does not mean that emotion is needed for real AI. You may be carbon-centric in your thinking here. It might be romantic to think that humans are unique, but don't hold your breath. No "binary bigotry" here. Lt. Riker won't let you bash Data like that.
________________ oop.ismad.com
|
Post #4,860
8/13/01 11:48:51 AM
|
Not bashing Cmdr. Data
Current directions of research do not look likely to ever produce an AI that cares. Fiction examines the possibility - and it may be possible, just not by continuing the way we are doing it now. Organisms - even rather simple ones - care, and they do so presumably by entirely "natural" (as opposed to supernatural) processes, so it would appear that it is possible. But neither a smarter algorithm nor more RAM will do it.
I guess what it comes down to is that the movie AI was mis-titled. The important innovation has nothing to do with intelligence. It has to do with drive.
White guys in suits know best - Pat McCurdy
|
Post #4,861
8/13/01 11:49:28 AM
|
Not bashing Cmdr. Data
Current directions of research do not look likely to ever produce an AI that cares. Fiction examines the possibility - and it may be possible, just not by continuing the way we are doing it now. Organisms - even rather simple ones - care, and they do so presumably by entirely "natural" (as opposed to supernatural) processes, so it would appear that it is possible. But neither a smarter algorithm nor more RAM will do it.
I guess what it comes down to is that the movie AI was mis-titled. The important innovation has nothing to do with intelligence. It has to do with drive.
White guys in suits know best - Pat McCurdy
|
Post #4,862
8/13/01 11:49:28 AM
|
Not bashing Cmdr. Data
Current directions of research do not look likely to ever produce an AI that cares. Fiction examines the possibility - and it may be possible, just not by continuing the way we are doing it now. Organisms - even rather simple ones - care, and they do so presumably by entirely "natural" (as opposed to supernatural) processes, so it would appear that it is possible. But neither a smarter algorithm nor more RAM will do it.
I guess what it comes down to is that the movie AI was mis-titled. The important innovation has nothing to do with intelligence. It has to do with drive.
White guys in suits know best - Pat McCurdy
|
Post #4,900
8/13/01 3:46:35 PM
|
End Loop: Not bashing Cmdr. Data :)
Alex
Only two things are certain: the universe and human stupidity; and I'm not certain about the universe. -- Albert Einstein (1879-1955)
|
Post #4,479
8/10/01 12:25:15 AM
|
Well.. we shall *never* have That!
It ain't ever gonna be a matter of, "nobody wants to do that much coding" IMO. The answer would be 42 after a few hundred centuries. (How could anything not meta-human actually successfully create a black-box replica?)
From what tangible or theoretical 'overview' might one conduct the - especially internal - overview? No one human could even absorb something so trivial as the X-million lines of Windoze code (even if it actually did something besides moving electrons around).
Smart breadmakers, car-operating robots (on very well-controlled smart highways), yada yada.
But *never* a bartender to whom you might say 42 or even, Play it, Sam! and invoke that momentary *emotional* response of not-only deja-vu but.. the entire (holographic is Way too simplistic) distilled feeling of all one's remembered 'associations'. (or even a facsimile of, I hated that movie..)
It's such a fool's errand it fits perfectly the epithet re a student's homework, this work is so poor it's not even wrong.. (But that won't cause a tiny minority to cease pursuit of the chimera - or not dream of beating Michael Jordan and getting girls.)
Pure materialism at its most blatant level - like our current environment?
No half-dollars for Hal here..
Ashton who thinks we couldn't even 'make' a tick though we could play one on digital Tee Vee
|
Post #4,571
8/10/01 9:19:56 PM
|
Body is not a prereq.
>> We can't match a child's intellect in a machine today because we have basically two camps in AI "research":......<<
A 3rd category could be the "magic formula" stance. "If we simply figure out the right linking formula/algorithm." I tend to stand in that camp.
>> Neither group is going to get anywhere; a Turing candidate has to have a way to train its mind through its body. <<
Just because we are used to doing it that way does not mean that is the only way. Helen Keller was missing lots of feedback that most of us take for granted, but was quite sentient. A child with a severed spine can still learn. I will agree that it is tougher, but not a prerequisite.
________________ oop.ismad.com
|
Post #4,931
8/13/01 6:25:19 PM
|
Give me an alternate solution or stfu.
Saying, "there might be another way," without providing another way is ego-surfing at its worst.
"Achieving Artifical Intelligence" is inherently a statement of anthropomorphic expectation. If you want some other sort of intelligence, talk about it in another thread.
Dang, I'm getting tired of shotgun snipers...
That's her, officer! That's the woman that programmed me for evil!
|
Post #5,080
8/14/01 2:34:15 PM
|
"human-like" vs. "smart"
>> "Achieving Artifical Intelligence" is inherently a statement of anthropomorphic expectation. If you want some other sort of intelligence, talk about it in another thread. <<
This is a terminology issue which could probably go on forever, but there are two competing definitions of AI. One is making machines "human-like", and the other is making them "smart". If a machine were Vulcan-like, I would still consider it "smart", but not really "human-like".
>> Saying, "there might be another way," without providing another way is ego-surfing at its worst. <<
The bottom line is that we don't know. It is like the OO proof debate: without proof, all one can say is that they DON'T KNOW whether X is objectively better than Y. (Note that OO being better for your own mind is not the same as being objectively better.)
If you DO know for sure one way or the other, then please present your evidence.
________________ oop.ismad.com
|
Post #5,085
8/14/01 2:43:41 PM
|
On another tack
There might be another way of achieving AI, just as there might be a way of making real strawberry jam without the need for strawberries, sugar, water or pectin; stating this is a completely worthless exercise UNLESS you are willing to discuss the mysterious "other way". Because it's *bleedin' obvious* that there *MIGHT* be another way.
If you can't be arsed to think, at least have the good grace to not parade your intellectual laziness here.
-- Peter Shill For Hire
|