I want to get a gaming PC but I know nothing about them, advice thread

ThatZenoGuy

Zealous Evolutionary Nano Organism
It really won't get loud at all. The cooler is massive overkill for the CPU already and the GPU is, uh, a 6700 XT, a 230W card, and it has a fan pointed right at it in that case. It's fine. Worst comes to worst, he can buy an extra fan or two for like 20 bucks.
The parts themselves won't get too hot, but that heat ends up trapped in the case unless there's enough ventilation. Also, in hindsight, if OP doesn't know what computer to buy, he probably has no clue about fan curves.
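For OP's benefit: a "fan curve" is just the mapping from CPU/GPU temperature to fan speed that you set in the BIOS or in fan-control software. Here's a rough Python sketch of the idea, with made-up temperature/speed points purely for illustration, not anything anyone is recommending:

```python
# A "fan curve" maps component temperature to fan speed (duty cycle).
# The points below are made-up illustrative values, not recommended settings.

def fan_duty(temp_c, curve=((30, 20), (50, 35), (70, 60), (85, 100))):
    """Linearly interpolate fan duty cycle (%) for a temperature in degrees C."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # Linear interpolation between the two surrounding points.
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # past the last point: run fans at max

for t in (35, 55, 75, 90):
    print(f"{t}C -> {fan_duty(t):.0f}% fan")
```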
 

Agent23

Not one step back!
You don't need that many case fans for such low-end parts. The CPU barely hits its TDP and the lower-power GPU will be fed directly by the bottom fan, while the CPU's hot air gets exhausted out the back thanks to the fan placed there. It's not like I recommended a 4090 and 13900K with that case or something (though even then, at that point you'd have the money for more fans).

Worst case? Just take off the side panel.
Yeah, since they went below 22nm the TDP is not that bad, but I'd make sure to have at least 2-3 extra fans for intake and exhaust.

What I did was have two intake fans at the lower front and one on the side, and two more at the top and back of the case for exhaust, but the FX series was a frigging furnace since AMD could not switch away from 32nm.
 

Vyor

My influence grows!
The parts themselves won't get too hot, but that heat ends up trapped in the case unless there's enough ventilation. Also, in hindsight, if OP doesn't know what computer to buy, he probably has no clue about fan curves.

There is plenty of ventilation. One in, two out (because PSU fan).

Yeah, since they went below 22nm the TDP is not that bad, but I'd make sure to have at least 2-3 extra fans for intake and exhaust.

What I did was have two intake fans at the lower front and one on the side, and two more at the top and back of the case for exhaust, but the FX series was a frigging furnace since AMD could not switch away from 32nm.

Node doesn't equate to TDP. The 13900K is on a 7nm-class node and pulls over 300W; the 4090 has a 450W TDP on a 5nm-class node.
 

Agent23

Not one step back!
There is plenty of ventilation. One in, two out (because PSU fan).



Node doesn't equate to TDP. The 13900K is on a 7nm-class node and pulls over 300W; the 4090 has a 450W TDP on a 5nm-class node.
Node size impacts energy consumption and efficiency.
And a smaller node mostly means less capacitance to charge and a lower operating voltage, so less energy gets wasted on every switch.
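For what it's worth, the usual back-of-the-envelope relation here is dynamic power ≈ α·C·V²·f (activity factor × switched capacitance × voltage squared × clock frequency). A rough Python sketch with made-up numbers, just to show why capacitance and voltage dominate and why the node alone doesn't fix the TDP:

```python
# Back-of-the-envelope dynamic power: P ~ alpha * C * V^2 * f.
# All numbers below are made up purely to show the scaling, not real chip data.

def dynamic_power_w(alpha, cap_farads, volts, freq_hz):
    return alpha * cap_farads * volts ** 2 * freq_hz

old_node = dynamic_power_w(1.0, 1.5e-8, 1.4, 4.7e9)   # "big node, high voltage" style part
shrunk   = dynamic_power_w(1.0, 0.75e-8, 1.1, 4.7e9)  # smaller node: less C, lower V, same clock
print(f"old-style: {old_node:.0f} W, shrunk: {shrunk:.0f} W")
# Halving C and dropping V from 1.4 to 1.1 cuts dynamic power by roughly 70% here,
# but raising clocks/voltage again or packing in more cores eats that gain back,
# which is why node size alone doesn't pin down TDP.
```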

I know what was required to adequately cool an FX versus a modern Ryzen with the same core count, and in the Ryzen's case somewhat higher clock speeds.

But I will agree that if the architecture changes, like with video cards cramming in more cores and raising frequencies, you will get a higher TDP.

GPUs are probably the only segment that has continued to evolve at a high rate in recent years, since CPU speed and core count have stayed mostly the same for most product lines.

Hell, if pure speed is what you want, I think the old Centurion/FX-9k clocks higher, IIRC at 5GHz or above.

But that abomination required extreme cooling IIRC, and they had to cherry-pick the highest-quality silicon for those units.
 

Vyor

My influence grows!
Node size impacts energy consumption and efficiency.

Architecture matters more. It has always mattered more. The i7-980 was a 6c/12t CPU on 32nm with a TDP of 130W; the FX part on the same 32nm node was the FX-9590, at a 220W TDP. The first was faster.
 

Allanon

Well-known member
My advice is to remember that games are (often needlessly) going to get more and more demanding. So whatever you get, make sure you can upgrade it easily down the line, and keep in mind such computers use LOTS of electricity, so if you are going to work on it, follow all safety precautions.
 

ThatZenoGuy

Zealous Evolutionary Nano Organism
My advice is to remember that games are (often needlessly) going to get more and more demanding. So whatever you get, make sure you can upgrade it easily down the line, and keep in mind such computers use LOTS of electricity, so if you are going to work on it, follow all safety precautions.
My dude, right now games are so unoptimized that even beefy monsters like 4080s can't do ultra at 60fps with all the bells and whistles. Unless OP wants to spend a comical amount of cash, future games are not an option. I've basically given up on 'modern' PC games for this reason.
 

ThatZenoGuy

Zealous Evolutionary Nano Organism
at what resolution?
1440p+
Although it depends on the game in question, and flipping the settings to medium lets you run these games on some pretty average hardware.

The issue is that nowadays games hardly look better than games from 5+ years ago, but require several times the computing power for no mechanical gain.
If my Ryzen 3600 can run a game from the early 2000s with hundreds of NPCs, why do modern games struggle with WORSE AI and fewer NPCs!?
 

Vyor

My influence grows!
1440p+
Although it depends on the game in question, and flipping the settings to medium lets you run these games on some pretty average hardware.

The issue is that nowadays games hardly look better than games from 5+ years ago, but require several times the computing power for no mechanical gain.
If my Ryzen 3600 can run a game from the early 2000s with hundreds of NPCs, why do modern games struggle with WORSE AI and fewer NPCs!?

There are very, very, very, very, very few games a 4080 can't run maxed out at 1440p 60fps.

And you have no idea what you're talking about.
 

ThatZenoGuy

Zealous Evolutionary Nano Organism
There are very, very, very, very, very few games a 4080 can't run maxed out at 1440p 60fps.

And you have no idea what you're talking about.
Mind being less rude? Thank you.

[Attached image: RT_1440p-color.png — 1440p ray tracing benchmark chart]
 

ThatZenoGuy

Zealous Evolutionary Nano Organism
I see a 79 FPS average at max settings with ray tracing; that ain't struggling. Without RT it gets well over 100 FPS.
A state-of-the-art GPU costing something like 2-3 PS5s combined should not be going sub-100fps in a silly Harry Potter game.
Sub-100fps in some monster tech-demo game with absurd effects and dozens of NPCs doing stuff at the same time? Sure. But a Harry Potter game?...
Without ray tracing it'll smash those FPS, of course. But then you have a GPU where you aren't even using all of its hardware, which is equally silly.
Nvidia really needs to sell GTX cards again with hefty discounts or something.
Additionally, if these monster cards are struggling already, what about the next generation of games? What will average hardware do with those?
 

Vyor

My influence grows!
A state-of-the-art GPU costing something like 2-3 PS5s combined should not be going sub-100fps in a silly Harry Potter game.
Sub-100fps in some monster tech-demo game with absurd effects and dozens of NPCs doing stuff at the same time? Sure. But a Harry Potter game?...
Without ray tracing it'll smash those FPS, of course. But then you have a GPU where you aren't even using all of its hardware, which is equally silly.
Nvidia really needs to sell GTX cards again with hefty discounts or something.
Additionally, if these monster cards are struggling already, what about the next generation of games? What will average hardware do with those?

NPCs don't touch the GPU at all; that's all CPU.

And next-gen games will... perform more or less the same.
 

ThatZenoGuy

Zealous Evolutionary Nano Organism
NPCs don't touch the GPU at all; that's all CPU.

And next-gen games will... perform more or less the same.
NPCs certainly do touch the GPU lmao: more polygons, more textures, effects, etc.
Typically NPCs are more detailed than environmental features because you'll be seeing them up close, and they're more elaborate models.
My best guess is that next-gen games will be even more unoptimized messes.
We're approaching a state of gaming where upscaling is required to have decent FPS.
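To put rough numbers on why upscaling helps so much: shading cost scales roughly with the pixels actually rendered, and upscalers render internally below the target resolution. A quick Python sketch using the typical quality/performance scale factors (illustrative only; actual speedups vary per game):

```python
# Upscalers render at a lower internal resolution and reconstruct the final
# image, so GPU shading work scales roughly with the internal pixel count.
# The scale factors below are the common quality/performance ratios.

target_w, target_h = 2560, 1440
modes = {"native": 1.0, "quality (~0.67x)": 0.67, "performance (0.5x)": 0.5}

for name, scale in modes.items():
    w, h = int(target_w * scale), int(target_h * scale)
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels / 1e6:.1f} MP "
          f"({pixels / (target_w * target_h):.0%} of native)")
```

Performance mode renders only about a quarter of the native pixels, which is why it rescues framerates even on mid-range cards.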
 
