|
Randor wrote: Don't try to combine a gaming machine and a Dev box
What a ridiculous idea. Worked for me.
Not that the GPU gets used much during coding, but it's not in the way either. VS works perfectly fine on a gaming machine. Besides, having two desktops on two writing desks in the living room might be a bit much for her. There's also a server here, running 24 hours a day.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
honey the codewitch wrote: Edit: My preferred would be a 4090, but for fire hazard and cooling issues. I like to build 10-year systems, and a 4090 is my best shot at that.
Even if you keep the core of the system for a decade, improvements in GPU perf/watt are large enough that you'd probably come out ahead by the ~5-year mark buying the same level of performance at 80% lower power. And unless you want your gaming performance to decline from god-tier to potato over the system's lifetime, you'll need at least one, probably two, GPU swapouts at similar performance tiers to the original card.
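As a rough sanity check on that trade-off, here's a back-of-envelope energy comparison. All the numbers are illustrative assumptions (4 hours of load per day, $0.15/kWh, a 450 W card, and a hypothetical mid-life replacement drawing 80% less power), not measurements of any real build:

```python
# Back-of-envelope: keep one 450 W card for 10 years vs. swap at year 5
# for an equal-performance card drawing 80% less power.
# All figures are illustrative assumptions, not measurements.
HOURS_PER_DAY = 4
RATE = 0.15  # $/kWh, assumed

def kwh(watts, years):
    """Energy in kWh for a given sustained draw over a number of years."""
    return watts * HOURS_PER_DAY * 365 * years / 1000

keep = kwh(450, 10)                      # one card for the full decade: 6570 kWh
swap = kwh(450, 5) + kwh(450 * 0.2, 5)   # replacement at the 5-year mark: 3942 kWh

savings_kwh = keep - swap                # 2628 kWh
savings_usd = savings_kwh * RATE         # ~$394 of electricity saved
```

Under these assumptions the electricity savings offset a meaningful chunk of the replacement card's price, which is the crux of the argument above.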
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
|
|
|
|
|
I only really play Fallout 4, though I mod it to the point where it strains my 2080 TI
I will continue to play Fallout 4 until Fallout 5 comes out. At this rate I will probably be dead or blind by the time Fallout 6 launches.
A 4090 should do it for me, I think. Though on reflection, a 3090 TI is a safer bet for my system and I could always swap it out later.
To err is human. Fortune favors the monsters.
|
|
|
|
|
Agree.
Been there, done that, several years ago when high-performance GPUs showed up.
Not sure when water-cooled systems became a necessity,
but if you don't have that option, pump air through an open-air chassis as much as possible.
External fans are an option.
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
jmaida wrote: if one does not have that option, pump air through an open air chassis as much as possible.
My chassis can take liquid, but I don't want it because the case is an open-air design. Given how the radiator mounts, I'd be left wondering whether positive air pressure was still flowing out all the gaps to keep the pet hair and dust out. Right now it works perfectly with four 120mm fans 1cm from the top glass panel: they push air against the glass, so it flows outward toward the edges in all directions and exits through those gaps.
The case is a Thermaltake Level 20 VT. It's by far the nicest, best engineered case I've ever owned. The airflow was thought out very carefully, at least for air cooling, which is what I'm doing, and plan to continue.
To err is human. Fortune favors the monsters.
|
|
|
|
|
honey the codewitch wrote: The airflow was thought out very carefully, at least for air cooling
Have you thought of trying liquid air cooling?
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
I did briefly consider submerging the entire thing in a mineral oil bath and cooling it with a box fan.
Or liquid nitrogen, but that gets messy and I have pets. It could end poorly.
To err is human. Fortune favors the monsters.
|
|
|
|
|
honey the codewitch wrote: I did briefly consider submerging the entire thing in a mineral oil bath and cooling it with a box fan.
The Cray-2 approach...
Java, Basic, who cares - it's all a bunch of tree-hugging hippy cr*p
|
|
|
|
|
Has anyone ever made a Cray-2 simulator that runs on a modern PC? I'd be curious to know how the 1985 Cray-2 hardware would compare to an emulator on a top-range gaming PC of 2022!
I suspect that the result might be like my Alma Mater's Cray-1: After a few years it was thrown out - its processing speed was certainly high enough, but for the major tasks - weather forecasting and FEM - the volume of the raw data to be processed was so immense (for the day) that the CPU was idling, waiting for the input channels to fill up memory. The Cray-1 was replaced by a Cray-2; apparently, the Cray-2 I/O-capacity was a lot higher. I wouldn't be surprised if a modern CPU/GPU can compete in processing power, but given similar weather forecasting / FEM tasks, the bottleneck would be the I/O, just like the Cray-1.
Yes, there are datacenter-oriented versions of the top chips, with loads of PCIe lanes. They would probably come out better than the gaming-oriented chips.
|
|
|
|
|
I think unless you want to go full-on liquid cooling, you'll need to compromise.
From what I understand, you don't need the 4090; just get a 30xx-series card (maybe a Ti one) and the best air-cooling solution you can find (a double-tower cooler), and add RGB fans, because everything is cooler with RGB.
If the usual ambient air temperature is relatively cool, that should be good enough (don't trust me here, I have no clue).
But from what I can see, most people recommend an AIO for the i9-series CPUs.
CI/CD = Continuous Impediment/Continuous Despair
|
|
|
|
|
I'm not going liquid because my Thermaltake Level 20, while it supports liquid, is an open-air chassis design: the glass front and top are air-gapped from the chassis, so I need positive air pressure along the top of the case. Currently I have four 120mm fans about 1cm from the glass, pushing air against it so it flows to the edges of the glass and produces positive pressure around the case's air gaps. With liquid, I'd have to remove those fans to fit the radiator, and between that and my video card, all the air would flow out of only the left side of the case.
It's not so much a cooling issue, the airflow. It's a cat hair issue.
So I stepped down to an i5-13600K since it runs at just over 180 watts instead of 250.
I've heard the 4080 and the 3090 Ti have the same power draw, so I may get a 4080. I've also heard that underclocked to around 300 watts they're super efficient (performance per watt).
Here are my new target specs:
PSU: EVGA Platinum 1000 watt
Mobo: ASUS ROG Strix Z690-G Gaming WiFi
CPU: i5-13600K
RAM: 64GB Corsair Dominator
GPU: 2080 Ti, though I'm probably upgrading to a 3090 Ti or 4080
Storage: 2TB Samsung 990 Pro NVMe system drive; 3x 1TB Samsung 990 Pro in RAID 1, for 20-gigabyte-a-second reads (I think?) as secondary fast storage
Chassis: Thermaltake Level 20 VT
To err is human. Fortune favors the monsters.
|
|
|
|
|
> I think unless you want to go full-on liquid cooling, you'll need to compromise.
These days you can just get a standard AIO system for CPUs (Corsair H150 and co) and be done with it. Far less effort than going all-out, and not really an issue unless you want to start overclocking.
-= Reelix =-
|
|
|
|
|
I would get an AMD processor. They have a new series coming that will not have the X suffix and will use about 65W. That should be rather easy to keep cool. Rumor: AMD Ryzen 7900/7700/7600 non-X specifications + pricing revealed, release date of Q1 2023.[^]
You can also down-clock GPUs and reduce the heat they emit. The site linked to also reviews video cards so they have a lot of info on those too.
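As a rough illustration of why down-clocking helps so much, here's the textbook dynamic-power model (P ∝ V² · f). The scaling factors below are hypothetical round numbers, not measured values for any specific GPU:

```python
# Dynamic power scales roughly with voltage squared times clock: P ~ C * V^2 * f.
# The factors below are hypothetical, not measurements of any particular card.
def relative_power(voltage_scale, clock_scale):
    """Power draw relative to stock after scaling voltage and clock."""
    return voltage_scale ** 2 * clock_scale

# Dropping clocks ~10% often permits ~8% less voltage:
p = relative_power(0.92, 0.90)  # ~0.76, i.e. roughly a quarter less heat
```

The quadratic voltage term is why a modest down-clock (which lets the card run at lower voltage) cuts heat output far more than it cuts performance.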
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
I'm after maximum single-core performance, and no 65W processor will net me that, much less an AMD part, which tends to be better at multithreaded than single-threaded workloads in terms of efficiency and performance.
I'm on a Ryzen 7 4750G right now.
I've considered getting a 4080 and underclocking it, because it's super efficient at 300W but right now I have a 2080 TI and it works great.
To err is human. Fortune favors the monsters.
|
|
|
|
|
|
Note: I have to go with the i9 spec in that review because it doesn't have my i5, but single core perf is close to the same on both.
After reading that review I'm still going with the Intel, as it outperforms the Ryzen on GCC benchmarks and some other single-core-heavy tasks. I'm guessing a lot of it has to do with having more on-die cache.
This is about compile times with GCC, so anything that bests it is my go-to.
The other thing is the 13th gen are new, and they haven't filled out the model line, so I may have some upgrade paths in the future. The Ryzen is a little older, and where it sits in the AMD lineup it's probably near their top end for that generation of chip. I think. I know a lot more about intel's habits in terms of chip development, their tick/tock cycles, etc.
To err is human. Fortune favors the monsters.
|
|
|
|
|
I currently have a 5950X at work; it writes to an NVMe drive and build times are lightning fast. I use VS 2022 with about 100 projects and it is really quick.
Have Fun.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
I've come to appreciate having a tiny box on my desk, so these days I have a NUC sitting right next to me, but I RDP into my big rig, which generates a lot of heat, is loud, big, and--most importantly--in another room, so it doesn't bother me at all.
I have 3 monitors connected to the NUC, including one running at 4K - it's got plenty of horsepower for that.
|
|
|
|
|
I switched over to a single 55" 4k QLED smart TV as my monitor. I hate multimon for many reasons.
My PC is a relatively small, cubish thing: a Thermaltake Level 20 VT. Dimensions (H x W x D): 348 x 330 x 430 mm (13.7 x 13 x 16.9 inches) - two and a half pill bottles tall, for perspective (don't ask).
My PC isn't very loud, even on air, unless I'm pushing the GPU, and currently I'm not anywhere close to having heat issues, because the system is fairly modest compared to my target upgrade.
To err is human. Fortune favors the monsters.
|
|
|
|
|
Right...I was merely making the suggestion to build a beefy system but put it in another room (so the heat/noise isn't an issue for where you work), and then just RDP into it. I've been doing that for a few years now, and I wouldn't allow a loud PC back into my office.
|
|
|
|
|
One good thing: You never need to spend money on buying an electric heater.
|
|
|
|
|
I think I read a day or two ago that you want to stick with ITX. Why is that? Seems to me that limits your options.
Paul Sanders.
If I had more time, I would have written a shorter letter - Blaise Pascal.
Some of my best work is in the undo buffer.
|
|
|
|
|
No. I'm avoiding ITX. My chassis fits MicroATX.
To err is human. Fortune favors the monsters.
|
|
|
|
|
This makes me feel better.
I've been buying the Precision Laptop/Workstations.
They are gamer designs I use for development. I don't play any games, so I sacrifice the GPU a bit.
(as long as I can run 4 monitors, or one 55inch 4k UHD, I am good).
BUT... That said... We've seen overheating issues. I could leave my old machine running for days.
Now I power off every night.
I also have a piece of wood UNDER the back of the machine to increase airflow. One of my devs has his sitting on a laptop cooler (5 fan design) as he doesn't have the A/C options, and the summer months are a problem.
I've noticed it's only getting worse.
FWIW, I agree with building machines for long-term. I shoot for 5yrs. After 2yrs, I usually buy a "cold spare" (A used version of the same computer, off-lease), and use it for testing/validating my backups (cloned drives).
I can't fathom getting 10 yrs from a machine. Hardware (specifically USB) seems to fail before that time.
But the cost of a new machine is on par with the 80 hrs it takes to move my licenses to it. In the last 2 builds, I've slowly started using more VMs for various development environments. I am hoping to get this down to 40 hrs. The part I truly hate is LICENSED software tied to the DRIVE_ID (QuickBooks). So if I upgrade my hard drive, the software doesn't work until I jump through some hoops. And I have like 4 of those. It's just adding time to the process...
Finally comment on heat. One of the devs who was NOT cooling his machine ran into NVME issues where the drives were getting too hot and faulting. He actually assumed the drive was bad, and went through the restore process (swapping back to the previously cloned drive, and then restoring from backup any changed files).
The next day, he checked the drive he had replaced, and it was working fine! Scary.
I read somewhere that some building in Iceland/Greenland uses BTC miners to heat the building.
Basically the BTC is a wash against the energy consumption, and the heat is now beneficial. Probably the most expensive heater ever built. (Not sure if it's a true story or just a plan, FWIW.)
But I feel your heated pain!
|
|
|
|
|
I should have written that I shoot for 10 years. Realistically I end up replacing or upgrading components along the way, like adding more RAM or an NVMe drive (although in this case I'm starting with an NVMe system drive and 3x NVMe in RAID 1 for secondary storage, so I can't get much faster than that with current tech). I designed it such that my read speeds off secondary storage will saturate my PCIe 3.0 bus.
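For perspective on those bus-saturation numbers, here's a quick PCIe 3.0 bandwidth calculation. It assumes each NVMe drive sits on its own x4 link (the actual topology depends on the board and chipset):

```python
# PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding.
GIGATRANSFERS = 8e9   # transfers per second per lane
ENCODING = 128 / 130  # usable bits per transferred bit

def pcie3_gb_per_s(lanes):
    """Theoretical one-way bandwidth in GB/s for a PCIe 3.0 link."""
    return GIGATRANSFERS * ENCODING * lanes / 8 / 1e9

per_drive = pcie3_gb_per_s(4)   # x4 NVMe link: ~3.94 GB/s ceiling per drive
three_drives = 3 * per_drive    # ~11.8 GB/s aggregate across three x4 links
```

So three x4 drives top out just under 12 GB/s on PCIe 3.0 in theory, which is indeed more than enough to saturate the bus long before the drives do.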
To err is human. Fortune favors the monsters.
|
|
|
|
|