
But who cares about the consumer level chat bots from that perspective?
The industrial ones behind the scenes are the ones with the CPUs and memory and power and access to enormous resources.

I bet Elon grants access to SpaceX and Tesla to his. Think about the military satellite control that his AI will have, along with a vast network of surveillance via the electric fill-up stations combined with the Tesla cameras. It will have access to tunneling equipment via his Boring Company. This thing will be able to do anything.

I am absolutely sure the hallucinations we see at the consumer level are not there, or at least not there enough to matter.

People have hallucinations too. The question is whether they can cope and function using the senses around them and the thoughts in their heads.

Psychotic people will break occasionally and it's obvious. But psychotic computers can have additional resources double-checking everything.

These AIs will self-censor and prune off the hallucinating thought process once there are enough threads running to double-check everything that's going on. Why not run three AIs working on the same problem and have them vote on the solution? Not good enough? Throw some more CPUs at it and make it 100. This problem can be solved via brute force.
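
A minimal sketch of that voting scheme in Python, assuming a hypothetical ask_model() helper that stands in for whatever model API you'd actually call (nothing here names a real service):

    # Brute-force consensus: ask several independent model runs, keep the majority answer.
    from collections import Counter

    def ask_model(problem, run_id):
        # Hypothetical placeholder for one independent model run; wire this to a real API.
        raise NotImplementedError

    def majority_answer(problem, n_runs=3):
        # Collect one answer per run, then take the most common one.
        answers = [ask_model(problem, run_id=i) for i in range(n_runs)]
        answer, votes = Counter(answers).most_common(1)[0]
        # Weak consensus? "Throw more CPUs at it" by calling again with a bigger n_runs.
        return answer, votes / n_runs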

I was thinking about playing with one at home that would be barely on the edge, but I decided not to. Part of that was because it would turn on me someday, and part was that it would become an agent of a truly intelligent overlord somewhere out there.

A while back somebody pointed to a comic that showed an AI interacting with someone. The AI was drawn as an amorphous multi-headed monster with a variety of tentacles; at the end of one tentacle was a puppet, and the puppet was talking to the person.

I think that's an excellent visualization.
Edited by crazy Jan. 7, 2025, 06:15:12 PM EST
Scaling is a huge problem.
Complexity increases quadratically with number of tokens...

https://newsletter.pragmaticengineer.com/p/scaling-chatgpt

Scalability challenge from self-attention

Under the hood, we use the Transformer architecture, a key characteristic of which is that each token is aware of every other token. This approach is known as self-attention. A consequence is that the longer your text is – or context – the more math is needed.

Unfortunately, self-attention scales quadratically. If you want the model to predict the 100th token, it needs to do about 10,000 operations. If you want the model to predict the 1,000th token, it needs to do about 1 million operations.

At first, this sounds like bad news. However, there are clever workarounds we can use to circumvent the quadratic scale problem. Before we get into how we solve it, we need to talk about the infrastructure powering ChatGPT.
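
A quick back-of-the-envelope in Python to make that growth concrete (my own illustration, counting one unit of work per pair of tokens; the numbers are not from the article):

    # Self-attention cost grows roughly with the square of the context length.
    for n in (100, 1_000, 10_000):
        print(f"token {n:>6,}: ~{n * n:,} operations")
    # token    100: ~10,000 operations
    # token  1,000: ~1,000,000 operations
    # token 10,000: ~100,000,000 operations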


They seem to be hitting a wall with this approach.

They're moving monstrous amounts of data around, all over the planet, to do fancy page-level auto-complete and make weird pictures of white women with too many fingers and pointy chins. It's not intelligence.

Yet.

And they may burn up the planet before they get there.

They're setting huge amounts of money on fire, even on their most expensive plan:

[...]

OpenAI isn’t profitable, despite having raised around $20 billion since its founding. The company reportedly expected losses of about $5 billion on revenue of $3.7 billion last year.

Expenditures like staffing, office rent, and AI training infrastructure are to blame. ChatGPT was at one point costing OpenAI an estimated $700,000 per day.

Recently, OpenAI admitted it needs “more capital than it imagined” as it prepares to undergo a corporate restructuring to attract new investments. To reach profitability, OpenAI is said to be considering increasing the price of its various subscription tiers. Altman also hinted in the Bloomberg interview that OpenAI may explore usage-based pricing for certain services.

OpenAI optimistically projects that its revenue will reach $11.6 billion this year and $100 billion in 2029, matching the current annual sales of Nestlé.


That's 4 years from now. I think he's dreaming.

Time will tell, of course.

Best wishes,
Scott.
The Onion staff is crying
Microsoft is paying to restart Three Mile Island to power its AI.
--

Drew
No wonder; I always do that too when peeling them.
Cry no more
There’s a scene in the 1981 film Diva in which a secondary character, the slightly portly hipster Gorodish, dons a facemask and a snorkel to slice onions.

smiling through my tears,
Oh, it's not that bad . . .
. . even with the back of my prep knife, but I bought a simple scaling tool to try next time I buy fish. I'll see if it's any better.
[ Nyuk, Nyuk, Nyuk ] :-D
     The genie really wants out of the bottle - (crazy) - (14)
         what? - (pwhysall) - (1)
             Because I like your perspective - (crazy)
         “I know that you and Frank were planning to disconnect me…” - (rcareaga) - (11)
             As a Pagan . . . - (Andrew Grygus)
             There's lots of entities that could be disconnected. (Many already are -- from reality.) - (CRConrad) - (9)
                 Not the right range IMO - (drook)
                 “the bigger the chance they’ll slip up” - (rcareaga) - (7)
                     But who cares about the consumer level chat bots from that perspective? - (crazy) - (6)
                         Scaling is a huge problem. - (Another Scott) - (5)
                             The Onion staff is crying - (drook) - (2)
                                 No wonder; I always do that too when peeling them. -NT - (CRConrad) - (1)
                                     Cry no more - (rcareaga)
                             Oh, it's not that bad . . . - (Andrew Grygus) - (1)
                                 [ Nyuk, Nyuk, Nyuk ] :-D -NT - (Another Scott)
