IWETHEY v. 0.3.0 | TODO

Welcome to IWETHEY!

See the 3rd page of your link.
That system thermally throttles quickly (it only runs at full power very briefly). AFAIK, all the micro PCs do that.

That's usually incompatible with gaming.

I agree with adminiscott - if you want to game, get a gaming box like a PS5.

If you want to do computer stuff, then think carefully about what you want it for. If you want to do heavy-duty computing, you still need a big box and decent cooling and it probably won't be small (or cheap).

Last November I got J an M2 Max Mac Studio from B&H (she's been a Mac person for ages). It's quiet and has decent horsepower and decent ports. Spendy, of course, but not as ridiculous as some of their boxes. I don't know how they are for running Linux these days (if that's your thing), or if they somehow make running Linux/Unix-like things inside macOS painful.

Happy hunting!

Cheers,
Scott.
Understood.
If I want to run games at any given level, I have to have a certain number of CUDA cores. If I want one of those baby PCs, I have to plug in an external GPU co-processor. I've been reading and watching videos, and now have at least a baseline understanding of the amount of power required to run modern games on a PC. It's a lot.

Same thing for local AI. Downloadable models are rated by the billions of parameters they contain. There are entry-level ones that can run in 20 GB of memory and top-end ones that need 90 GB. So if I want to run a top-end one, I've got to have a huge amount of memory. And then I want those same CUDA cores for it to use.
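As a sanity check on those memory numbers, here's the back-of-the-envelope arithmetic (my own illustration, not any vendor's sizing guide): weight-only memory is roughly parameter count times bytes per parameter, so billions of parameters times bytes-per-parameter gives decimal GB directly. The 70B figure below is a hypothetical example.

```python
def model_weight_gb(params_billions, bytes_per_param):
    """Rough weight-only memory estimate in decimal GB.

    billions of params x bytes per param = GB, since 1e9 bytes = 1 GB.
    Ignores activations, KV cache, and runtime overhead.
    """
    return params_billions * bytes_per_param

# A hypothetical 70B-parameter model at different precisions:
print(model_weight_gb(70.0, 2.0))  # fp16: 140.0 GB
print(model_weight_gb(70.0, 0.5))  # 4-bit quantized: 35.0 GB
```

Which is roughly why small or heavily quantized models fit in ~20 GB while big full-precision ones want 90+.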

I think I'll give it another six months, see where downloadable-AI development goes, and combine that with the possibility of getting a stack of used graphics cards.

I've got too much yard work and pond work and all kinds of around the house work to do before getting distracted with computer fun stuff. But I'll use it as a target reward. I'll accomplish x number of tasks over the summer such as finishing the pond and I'll reward myself with an AI development station for the winter.
Edited by crazy May 22, 2024, 12:42:34 PM EDT
It's always just about ready
No matter what you want to do - gaming, video editing, AI dev - there's always new hardware on the horizon that is so much better that you might as well wait.
--

Drew
New AOOSTAR external card coming.
https://www.notebookcheck.net/AOOSTAR-teases-external-graphics-card-solution-with-unreleased-AMD-Radeon-GPU.841420.0.html

$750.

I can see the utility, in limited circumstances. Otherwise, a mini-tower PC would seem to be better in 99% of circumstances.

And a PS5 would still seem to be better for gaming (unless Sony continues their trickery with demanding subscriptions and so forth - I haven't kept up with that).

Cheers,
Scott.
Looks like I want a Steam account
So then it becomes a matter of building a PC that fulfills the requirements of Steam, and then upping it if I find that affordable.

I initially thought I wanted a tiny PC simply because I'd like to be able to put it on top of my projector. But I can have it off to the side with an HDMI cable going from it. That's fine. Also, a larger PC has a better chance of steady-state airflow that keeps the CPU cool. Many of these mini PCs are great to start off with, but start screaming when you attempt to pull any real work out of them.

Also, I've been wanting to do CUDA programming for 10 years. Seriously. I was being introduced to it at boa, and then I got arrested. So that's always been hanging around in the back of my head. I want to code in Python and toss off many, many CUDA tasks.
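For flavor, the "toss off many tasks and gather the results" shape looks like this in plain Python. This is a CPU stand-in using only the standard library; real CUDA dispatch from Python usually goes through libraries like Numba or CuPy, but the submit-then-collect pattern is similar in spirit. The saxpy "kernel" and the chunk data are my own toy choices.

```python
from concurrent.futures import ThreadPoolExecutor

def saxpy(a, x, y):
    # One "kernel" invocation: a*x + y over one chunk of data.
    return [a * xi + yi for xi, yi in zip(x, y)]

def dispatch_all(chunks, a=2.0, workers=8):
    # Toss off one task per chunk, then gather results in order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(saxpy, a, x, y) for x, y in chunks]
        return [f.result() for f in futures]

chunks = [([1.0, 2.0], [10.0, 20.0]), ([3.0, 4.0], [30.0, 40.0])]
print(dispatch_all(chunks))  # [[12.0, 24.0], [36.0, 48.0]]
```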
Lots of folks are doing amazing things with GPUs.
ScienceDirect.com

[...]

The Neptune implementation of the EM-EMC algorithm can perform parallel calculations using either CPU (multi-core) or GPU (many-core) processors, making maximum use of parallelism in each case to reduce execution times. All simulations were performed with a Supermicro workstation, with dual Intel Xeon Gold 6136 CPUs (2 x 12 = 24 physical cores; 3.0–3.7 GHz clock rate) and an Nvidia Titan RTX GPU (4608 CUDA cores; 1.35–1.77 GHz clock rate). Single precision floating point calculations were used to maximize performance.

[...]

Execution time using the GPU implementation was 2.7 min (4608 cores), compared to 26.2 min using the parallel CPU implementation (24 cores).

[...]
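The quoted numbers invite a quick back-of-the-envelope comparison (my arithmetic, not the paper's): the GPU run is roughly 10x faster overall, even though each individual CUDA core does far less work than a Xeon core; there are just 192 times more of them.

```python
cpu_min, cpu_cores = 26.2, 24     # dual Xeon Gold 6136
gpu_min, gpu_cores = 2.7, 4608    # Titan RTX

speedup = cpu_min / gpu_min         # overall wall-clock speedup
core_ratio = gpu_cores / cpu_cores  # how many more GPU cores
per_core = speedup / core_ratio     # one GPU core vs one CPU core

print(round(speedup, 1))   # ~9.7x faster
print(core_ratio)          # 192.0x more cores
print(round(per_core, 3))  # each GPU core ~0.051 of a CPU core
```

Lots of weak cores beating a few strong ones is the whole trade-off.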


Have fun!

Cheers,
Scott.
Is CUDA still a thing?
Real question. I look at what I was using 10 years ago, and I'm pretty sure that's not what I'd use if I were starting over today.
--

Drew
What else are you thinking of?
I'd say Nvidia owns the world as far as parallel processing via GPUs right now, and the world seems to have standardized on it.

https://developer.nvidia.com/about-cuda#:~:text=CUDA%20extends%20beyond%20the%20popular,you%20can%20accelerate%20your%20applications.

Of course, that's from Nvidia's perspective. I've heard of some projects that put front-end libraries on top and tried to allow seamless farming-out of work to different co-processors, generating CUDA and other code underneath.

Do you have any direction you suggest I look into?
I don't actually know what's current, which sparked the question. Do you?
Without looking it up, I'm guessing that if I went back 10 years I'd have been looking at Rust, or Go, or Ruby on Rails. And I haven't heard anyone talking about those lately.
--

Drew
Bad comparison
To start off with, CUDA cores are the cornerstone of Nvidia graphics cards. They were originally created to render polygons, or do whatever other massive co-processing it takes to keep game screens filled with pretty pictures. They're simpler than regular processors, but Nvidia can pack a shitload of them into a single card. Thousands of them.

So then CUDA programming became whatever style of programming is required to toss off tasks to the CUDA cores on an Nvidia card.
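The mental model is thousands of simple cores all applying the same operation to different array elements at once. You can get a feel for that data-parallel style in Python with NumPy, which hands whole-array operations to optimized vector code instead of a one-element-at-a-time loop (NumPy here is my stand-in for the idea; it runs on the CPU, not on CUDA cores).

```python
import numpy as np

x = np.arange(1_000_000, dtype=np.float32)

# Scalar style: one element at a time, like a single-core loop.
loop_result = [float(xi) * 2.0 + 1.0 for xi in x[:4]]

# Data-parallel style: one operation over the whole array at once,
# which is the shape of work CUDA cores are built for.
vector_result = x * 2.0 + 1.0

print(loop_result)                  # [1.0, 3.0, 5.0, 7.0]
print(vector_result[:4].tolist())   # [1.0, 3.0, 5.0, 7.0]
```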

Ruby is an object-oriented scripting language. Ruby on Rails is a framework, essentially a whole bunch of libraries and conventions, for creating database-driven websites. Or at least that's what I seem to recall from many years ago.

Rust is a fairly recent systems programming language that supposedly compiles to quick machine code while simultaneously allowing for memory safety.
Ah, I was thinking CUDA was a language, not a hardware platform
--

Drew
Wha, you haven't? How?
...if I went back 10 years I'd have been looking at rust, or go, or ruby on rails. And I haven't heard anyone talking about those lately.
In places like Hacker News and on programming YouTube, Rust is pretty much all they talk about. (OK, not all: also Zig, and Python of course, and still a perhaps surprising amount of Go. But yeah, lots of Rust.)
--

   Christian R. Conrad
The Man Who Apparently Still Knows Fucking Everything


Mail: Same username as at the top left of this post, at iki.fi
Yep, Python/Rust/Go/TypeScript are the 4 horsemen now
Unless you're doing Windows programmering, in which case it's C#.
Regards,
-scott
Welcome to Rivendell, Mr. Anderson.
I changed my mind
Why should I bend over backwards to learn s*** that I won't apply in my life? F*** that. I'll get a PS5 and game and enjoy myself.

I assume you have this overarching target goal in your life: kick the kids out of the nest, make sure they're taken care of, and then enjoy yourself on the way out. I've been forced straight to the "enjoy yourself on the way out" part, so I might as well enjoy myself on the way out.
Mostly by not reading about programming languages any more
Which was my point. I have no idea what, if anything, that I used to know about is still around.
--

Drew
You remember the old days when C was used to program operating systems, with a bit of assembler
Not anymore. Rust is where it's at. Think of C, but a little safer.

I learned C from Kernighan and Ritchie. I remember in '77 when my brother Leon was talking to me on the phone. I was in 9th grade. He was a grown-up. He started to describe a computer he was working on at his job. He described Unix to me. He said that if I ever came across Unix, it would be a good idea to go learn it. It was interesting and worth a career.

Three years later, I fell into a technical support job that involved entry-level Motorola-based Tandy Xenix systems, and the first Intel Unix systems. And I was handed Kernighan and Ritchie and told to go learn C while doing tech support.

And a rocketing career was born. I am one of the original old geeks. I'm a gray beard. I'm not part of the original Unix group. I'm not one of those brilliant originalists. I'm a hanger on. But it was fun hanging on.
I was never that deep in the stack
End-user apps only for me. The closest I got to "real programming" was Pascal in school, and I still like the "main loop and functions" construction. Then my first real-world use of computers was Macromedia Dreamweaver and Adobe PageMaker, doing static websites and newspapers.

Then prototyping office automation in MS Access before turning it into a multi-user web app on a SQL back end: Active Server Pages, then PHP. A couple of web-developer gigs on PHP and Postgres/MySQL.

I've done my own Linux server admin when this was all new, but now I just want to click a button to install an app then start creating "content." I'm glad there are people who like doing the back-end stuff, but I'm not one of them any more.
--

Drew
macOS on a Mac Studio is literal actual UNIX
so running unix-y stuff is fine.

I wouldn’t put Linux on a Mac. It’s a ridiculously expensive way to run Linux, and the x86 platform is better supported anyway.