
Until they can explain . . .
. . how slow organic connections can do graphic processing and analysis at a rate that makes the fastest supercomputer look slower than a flatworm - until then they are nowhere.
[link|http://www.aaxnet.com|AAx]
Explanation
Supermassive parallelism and vast data bandwidth.

We simply don't hook billions of slow CPUs together with a hugely efficient interconnect fabric, like the human brain does, cuz we can't.

Yet.
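
A back-of-envelope sketch in C of why that wins (the neuron count and the per-neuron event rate below are loose, commonly quoted ballpark assumptions, not measurements):

    /* Rough comparison: billions of slow units vs. one fast serial unit. */
    #include <stdio.h>

    int main(void)
    {
        double neurons     = 8.6e10; /* ~86 billion neurons (rough)        */
        double firing_rate = 2.0e2;  /* generous ~200 events/sec per unit  */
        double cpu_rate    = 3.0e9;  /* ~3 GHz serial core, one op/cycle   */

        double brain_rate = neurons * firing_rate;    /* ~1.7e13 events/sec */
        printf("brain aggregate: %.1e events/sec\n", brain_rate);
        printf("serial core:     %.1e ops/sec\n", cpu_rate);
        printf("ratio:           %.0fx\n", brain_rate / cpu_rate);
        return 0;
    }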


Peter
[link|http://www.no2id.net/|Don't Let The Terrorists Win]
[link|http://www.kuro5hin.org|There is no K5 Cabal]
[link|http://guildenstern.dyndns.org|Home]
Use P2P for legitimate purposes!
[link|http://kevan.org/brain.cgi?pwhysall|A better terminal emulator]
We can - but it wouldn't run C well
So we don't.

However, Henry Fuchs did just that with the early Pixel-Planes architectures.

[link|http://accad.osu.edu/~waynec/history/lesson5.html|http://accad.osu.edu...tory/lesson5.html]
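
A minimal serial sketch, in C, of the idea (the edge coefficients are invented for illustration): every pixel evaluates the same linear expression F(x, y) = Ax + By + C, so a triangle-edge half-plane test costs one step regardless of pixel count. The loops below stand in for hardware that did all pixels simultaneously, one tiny ALU per pixel.

    /* Pixel-Planes in miniature: one half-plane test over every pixel. */
    #include <stdio.h>

    enum { W = 8, H = 8 };

    int main(void)
    {
        double A = 1.0, B = -1.0, C = 2.0;   /* invented edge coefficients */

        for (int y = 0; y < H; y++) {        /* hardware: all at once      */
            for (int x = 0; x < W; x++)
                putchar(A * x + B * y + C >= 0.0 ? '#' : '.');
            putchar('\n');
        }
        return 0;
    }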



[link|http://www.blackbagops.net|Black Bag Operations Log]

[link|http://www.objectiveclips.com|Artificial Intelligence]

[link|http://www.badpage.info/seaside/html|Scrutinizer]
Would love to see the rough sketches,
ideas for the microcode in the consciousness module, the first 500 routines for the Aha! reflex and maybe.. Vols I and II of the proposed handler for the occasional evanescent impression (and its discretion plug-in natch.)

But what I'd really Really like to see the early flowcharts for, is:
the ineffable overlay.

Anything less? just another adding machine -- something Authoritative -- that could write tickets for proceeding through a Stop sign at 0.0013 kph == Against the Rules, ergo Not-good ... Is-bad == Punish.



(Still... if this work keeps a few restless minds away from further work on the Q-bomb.. wtf - give 'em a few 10-billions/yr for a while; at least until we get past the unpleasantness over ~ Why can't a Wog be more like a Gringo? Y'know?)


Double (or more) standard
When you can show me recognition of the ineffable in more than 25% of the population, then maybe I'll use that as a standard for A.I. Until then you sound like you're looking for something more human than humans.
===

Purveyor of Doc Hope's [link|http://DocHope.com|fresh-baked dog biscuits and pet treats].
[link|http://DocHope.com|http://DocHope.com]
Perhaps we are missing a point here -
which you just made, but didn't notice (?)

Yes - if what you want is YAN robotic machine which can do a few more things sans human supervision: carrion. (They Will - there's VC $$ for eliminating human workfare, in a spread-sheet-defined 'World'.)

Since it's only the (.001-to-1%) who ever accomplish anything notable/useful, while we don't remotely understand any Aspect of their special qualities - why, there's one massive Ignorance-flag for The Designer. (No revelation that many humans go through the complete birth-death cycle by rote, sans a single original thought, as you suggest.) Machine-like behavior; present in every corporate slogan and forming the grammar of that familiar disingenuous patois we have so come to despise. What a legacy!

So which "AI" is AI going to settle for?



Personally, I think that the best our organizations (and their motivation$) could ever produce.. is apt to be of <class=dumbth scale=refined>. The best of our intellects won't be modelling their self-replication any Eon soon IMO. I'll give odds.. or, when floated: sell short.



oTpy

Edited by Ashton Sept. 6, 2006, 06:00:57 PM EDT
No, it'll happen
Idiot parents sometimes raise genius kids ... by accident, I guess. Once we've got the A.I. that can produce the most efficient design of its own successor, then all bets are off and SkyNet is real.

We all rate "self awareness" as somehow indicative of intelligence. But pure dumb chance and lots of trial and error came up with the design for us. Electronic "generations" can operate on a much faster scale. Who's to say what that'll produce?


And yes, I noticed that point.
===

Purveyor of Doc Hope's [link|http://DocHope.com|fresh-baked dog biscuits and pet treats].
[link|http://DocHope.com|http://DocHope.com]
See Charles Stross, Accelerando, Economics 2.0
See also "Dial F for Frankenstein"
Reproduced here in [link|http://www.cybered.co.th/eng/DIAL.html|hideous hypertext.]

cordially,
Die Welt ist alles, was der Fall ist. (The world is everything that is the case.)
I was thinking... (upon re-reading that)
Frankenstein's Monster Sings Christmas Carols while roasting chestnuts on an open fire, accompanied by Roger Whittaker and Vanilla Ice.

Seems about right.
--
[link|mailto:greg@gregfolkert.net|greg],
[link|http://www.iwethey.org/ed_curry|REMEMBER ED CURRY!] @ iwethey
Freedom is not FREE.
Yeah, but 10s of Trillions of US Dollars?
SELECT * FROM scog WHERE ethics > 0;

0 rows returned.
The thing about Frankenstein's monster...
...is that the "monster" was basically good at heart until rejected as an abomination by its own creator and others around it. Even then, it only sought revenge against its own creator.
Hurt me if you must, but let the duckie go!
Ummm, you definitely missed that point.
--
[link|mailto:greg@gregfolkert.net|greg],
[link|http://www.iwethey.org/ed_curry|REMEMBER ED CURRY!] @ iwethey
Freedom is not FREE.
Yeah, but 10s of Trillions of US Dollars?
SELECT * FROM scog WHERE ethics > 0;

0 rows returned.
Naw, I went off at a tangent.
That was just something that always bothered me whenever people would bring up Frankenstein as the ultimate "OMG Technology EVIL!" book.
Hurt me if you must, but let the duckie go!
No... I was thinking about the irony of the ....
AI thingy and the fact of Abby Something trying to sing with grunts and groans and screeches (due to the fire) vs. the Ultimate Evil...

Though with Roger Whittaker and Vanilla Ice it makes a trilogy of Evul from teh MOOSic wurld.

Sheesh, you really do have to get those neck-bolts adjusted. And soon.
--
[link|mailto:greg@gregfolkert.net|greg],
[link|http://www.iwethey.org/ed_curry|REMEMBER ED CURRY!] @ iwethey
Freedom is not FREE.
Yeah, but 10s of Trillions of US Dollars?
SELECT * FROM scog WHERE ethics > 0;

0 rows returned.
Who is the target audience?
That glossary is insane.
===

Purveyor of Doc Hope's [link|http://DocHope.com|fresh-baked dog biscuits and pet treats].
[link|http://DocHope.com|http://DocHope.com]
I expect in the long term the widespread adoption of C . . .
. . will come to be seen as one of the greatest mistakes in the development of computers.

I remember a comment from an interview in Computer Languages ("The magazine for those who program in more than two languages"). The interviewed luminary was asked, "If major development companies are using languages like Modula, why don't we hear more endorsements? All we hear about is C."

The response was, "These companies are reluctant to issue endorsements because they consider their non-use of C to be a competitive trade secret".
[link|http://www.aaxnet.com|AAx]
Was that the LISP guys who did the Yahoo! stores?
===

Purveyor of Doc Hope's [link|http://DocHope.com|fresh-baked dog biscuits and pet treats].
[link|http://DocHope.com|http://DocHope.com]
You are not alone there
At PARC we estimated that we could get about a factor of 5 from special low level (HW+firmware) design. If Moore's Law is doubling every 18 months, then this is almost 4 years of being ahead if you can compete with ordinary silicon (your factor of 8 would be 3 turns of Moore's Law, or about 4.5 to 5 years). The Alto was a big success because it took Chuck Thacker and two technicians only about 3.5 *months* to make the first machine, and it only took another month to move the first Smalltalk over from the NOVA to the Alto. So we were off and running.

If we believe Chuck's estimate that we've lost about a factor of a thousand in efficiency from the poor design choices (in many dimensions) of Intel and Motorola (and Chuck is a very conservative estimator), then this is 10 doublings lost, or about 180 months, or about *15 years* for Moore's Law to catch up to a really good scheme. This is a good argument for trying very different architectures that allow a direct target to be highly efficient VHLL *system* execution.

A small group approaching this should try to do everything with modern CAD and avoid getting messed up with intricate packaging problems of many different types. So I would look at one of the modern processes that allows CPU and memory logic to be on the same die and try to make what is essentially an entire machine on that die (especially with regard to how processing, memories and switching relate). Just how the various parallelisms trade off these days and what is on and off chip would be interesting to explore. A good motivator from some years ago (so it would be done a little differently today) is Henry Fuchs' "pixel planes" architecture for making a renderer as a smart memory system that has a processor for each pixel. Such a system can have a slower clock and still beat the pants off a faster single processor von Neumann type architecture.

-Alan Kay

[link|http://lists.squeakfoundation.org/pipermail/squeak-dev/2003-March/055371.html|http://lists.squeakf...March/055371.html]
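
The arithmetic behind that "about *15 years*" figure, as a tiny C sketch (only the factor of a thousand and the 18-month doubling period come from Kay's note; log2 supplies the rest):

    /* Moore's Law catch-up time for a factor-of-1000 efficiency loss.
       Build: cc catchup.c -lm */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double factor         = 1000.0; /* Thacker's estimated loss    */
        double months_per_dbl = 18.0;   /* assumed doubling period     */

        double doublings = log2(factor);               /* ~9.97        */
        double months    = doublings * months_per_dbl; /* ~179 months  */

        printf("%.2f doublings = %.0f months = %.1f years\n",
               doublings, months, months / 12.0);
        return 0;
    }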



[link|http://www.blackbagops.net|Black Bag Operations Log]

[link|http://www.objectiveclips.com|Artificial Intelligence]

[link|http://www.badpage.info/seaside/html|Scrutinizer]
Specialized processors and distributed computing.
The eye is very, very good at seeing things that we've evolved to see. [link|http://en.wikipedia.org/wiki/Cone_cell|Cones] and [link|http://en.wikipedia.org/wiki/Rod_cell|rods] in the retina have different sensitivities, so a lot of "image processing" is done before the electrical impulses even hit the optic nerve.

Once the impulses reach our brains, specialized regions take over interpreting what we see. While we probably don't have [link|http://en.wikipedia.org/wiki/Grandmother_cell|grandmother neurons], I wouldn't be surprised if a very small number of neurons were involved in such recognition. I don't know about you, but there have been times when I've seen someone far off in a crowd, someone I hadn't seen in years, yet I instantly had the feeling I knew that person even though I couldn't make out their features clearly at a distance. The human brain is staggeringly good at recognizing faces.

Oh, and I'm not sure that the digital camera folks aren't catching up very rapidly with what we're able to do. E.g. [link|http://www.canon.ca/digitalphotography/english/ctech_article.asp?id=208&tid=6|Canon's Digic II Processor].
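
As a toy illustration of that pre-optic-nerve processing (the image and the weights below are invented; real retinal wiring is far richer), a crude center-surround filter in C: uniform regions cancel to zero, and only changes, i.e. edges, survive.

    /* Center-surround: response = 4*center - sum of 4 neighbors.
       Flat areas cancel; the edges of the bright square stand out. */
    #include <stdio.h>

    enum { N = 6 };

    int main(void)
    {
        int img[N][N] = {0};                 /* dark field...             */
        for (int y = 2; y < 4; y++)          /* ...with a bright square   */
            for (int x = 2; x < 4; x++)
                img[y][x] = 9;

        for (int y = 1; y < N - 1; y++) {    /* skip the border           */
            for (int x = 1; x < N - 1; x++) {
                int surround = img[y-1][x] + img[y+1][x]
                             + img[y][x-1] + img[y][x+1];
                printf("%4d", 4 * img[y][x] - surround);
            }
            putchar('\n');
        }
        return 0;
    }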

Cheers,
Scott.