I want to get a gaming PC but I know nothing about them, advice thread

Jormungandr

The Midgard Wyrm
Founder
She certainly hasn't let me down so far, and compared to the intel i3 I was using before her, she's a monster. XD
Yeah, you can't go wrong with AMD processors (FX excluded). :) They've always been reliable, and historically, back when they were just starting out and cloning intel processors, they often beat intel at their own game (you could buy an AMD 386/486 that was actually a step above intel's offering in the '90s).

Though, I've got an intel i3-5005U in my old laptop (bought it in Aug '16), and it was surprisingly great at light gaming (such as Warcraft III, Pharaoh/Cleopatra, et cetera), video watching/streaming, and web browsing with a ton of windows open.

The 5005U is basically a beefed-up Celeron in practice -- dual core, 2.0 GHz, 3 MB cache -- but it worked great for basic stuff.

My laptop died a few years back, and since I've shifted back to using my desktop most of the time (another reason why I upgraded to a Ryzen), I'm not sure what to do with it, repairs aside.
 

Vyor

My influence grows!
I was rocking a Phenom II Black Edition with 12 GB of RAM on an OEM Foxconn motherboard for over a decade (2010/11 until last year). It was beginning to show its age though, especially when Adobe pulled their boneheaded "this processor is no longer supported by our apps" move (plenty of businesses still run old processors, even aging Xeons from the second-hand market, in their rigs, and Adobe basically fucked them all over).

The main issue was that they started to require AVX2 because of compiler updates.

Phenom doesn't even support AVX *1* so...
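
If anyone wants to check whether their own chip would have made the cut, here's a rough sketch (Python, Linux only, since it just reads the feature flags the kernel exposes in /proc/cpuinfo) that reports whether the CPU advertises AVX and AVX2:

```python
# Rough sketch (Linux only): check whether the CPU advertises AVX / AVX2
# by reading the feature flags the kernel exposes in /proc/cpuinfo.
# A Phenom II would report neither flag.

def cpu_flags(path="/proc/cpuinfo"):
    """Return the set of CPU feature flags from the first 'flags' line."""
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                # Line looks like: "flags : fpu vme de ... avx avx2 ..."
                return set(line.split(":", 1)[1].split())
    return set()

if __name__ == "__main__":
    flags = cpu_flags()
    for feature in ("avx", "avx2"):
        status = "supported" if feature in flags else "NOT supported"
        print(f"{feature.upper():5}: {status}")
```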
 

Vyor

My influence grows!
especially since the FX successor line could, well, act as makeshift BBQs with how hot they ran.

They actually ran cooler than more or less every other processor, because they hit their thermal shutoff point at 65°C.

For comparison, modern intel shuts off at 110°C and modern AMD shuts off at 105°C.

They also pulled a similar amount of power to the Phenom chips, particularly when OC'd.
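
For anyone curious how much headroom their own chip has before its trip point, here's a minimal sketch using Python and psutil (Linux only; psutil.sensors_temperatures() simply returns an empty dict where it isn't supported) that prints each sensor's current reading next to the critical threshold the driver reports:

```python
# Minimal sketch (Linux, psutil installed): print each temperature sensor's
# current reading alongside the critical threshold the driver reports, to see
# how much headroom a CPU has before its thermal trip point.
import psutil

for chip, readings in psutil.sensors_temperatures().items():
    for r in readings:
        label = r.label or chip
        crit = f"{r.critical:.0f}°C" if r.critical else "n/a"
        print(f"{label:20} current={r.current:.0f}°C critical={crit}")
```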
 

Jormungandr

The Midgard Wyrm
Founder
The main issue was that they started to require AVX2 because of compiler updates.

Phenom doesn't even support AVX *1* so...
I understand the "why", but that didn't help all those people and businesses that were using/reusing old workhorses for their development rigs (e.g. Xeons with a ton of cheap DDR3).

The sheer rage people had for Adobe was pretty vicious at the time.
They actually ran cooler than more or less every other processor, because they hit their thermal shutoff point at 65°C.

For comparison, modern intel shuts off at 110°C and modern AMD shuts off at 105°C.

They also pulled a similar amount of power to the Phenom chips, particularly when OC'd.
Wait, they set it to 65? What fucking moron thought that'd be a good idea?
 

Vyor

My influence grows!
The sheer rage people had for Adobe was pretty vicious at the time.

The rage of idiots always is.

Wait, they set it to 65? What fucking moron thought that'd be a good idea?

Blame the node: the CPU started becoming unstable at temps over that. GlobalFoundries fucked them, and intel fucked them harder. AMD was on a serious time crunch and money crunch when they were designing FX. That it was as "good" as it was is a miracle. They competed well enough with Phenom but weren't much faster per core until you started leveraging AVX.
 

Jormungandr

The Midgard Wyrm
Founder
The rage of idiots always is.



Blame the node: the CPU started becoming unstable at temps over that. GlobalFoundries fucked them, and intel fucked them harder. AMD was on a serious time crunch and money crunch when they were designing FX. That it was as "good" as it was is a miracle. They competed well enough with Phenom but weren't much faster per core until you started leveraging AVX.
FYI, I was one of those "idiots" -- not everyone could afford a new processor just for one program, and not all businesses and freelancers were able to stump up a shit ton of money to replace their rigs on such short, out of the blue notice.

Their rage was justified: Adobe fucked a lot of people over and disrupted businesses and livelihoods.
 

Vyor

My influence grows!
FYI, I was one of those "idiots" -- not everyone could afford a new processor just for one program, and not all businesses and freelancers were able to stump up a shit ton of money to replace their rigs on such short, out of the blue notice.

Their rage was justified: Adobe fucked a lot of people over and disrupted businesses and livelihoods.

Pentiums existed. And celerons.
 

Jormungandr

The Midgard Wyrm
Founder
Pentiums existed. And celerons.
Not everyone uses intel, though until Ryzen they were the best option, and Adobe pulled this out of the blue in 2018/2019 -- using a Pentium or Celeron for any serious graphic design or rendering work? You're nuts.

Xeons could do it simply because they were surprisingly versatile and could brute-force things (especially if you were lucky enough to get a dual-socket motherboard, e.g. ex-server) -- combine that with cheap DDR3 RAM? You had a solid, cost-effective rig.

Adobe should've given six months notice to allow people and businesses to make plans, but they pulled it out of the blue. That's what fucked people over.
 

Vyor

My influence grows!
Not everyone uses intel, though until Ryzen they were the best option, and Adobe pulled this out of the blue in 2018/2019 -- using a Pentium or Celeron for any serious graphic design or rendering work? You're nuts.

They gave 3 years notice when the compiler updated.
 

Stargazer

Well-known member
So I've been looking at PC parts recently, and - holy smokes, graphics card prices are actually reasonable again! At least at the entry- to mid-range performance level. Looking at Microcenter (a great computer store chain in the US that I'm fortunate enough to live near), here are some prices:

Multiple Radeon RX 6600s are at $200
The cheapest Radeon RX 6650 XT, the ASRock Challenger D model, is $250.
The cheapest Radeon RX 6700 XT, also an ASRock Challenger D, is $340. The next real jump in performance from AMD is pretty steep, with a PowerColor Red Dragon Radeon RX 6800 XT at $530.

The closest competition from Nvidia is an Asus TUF Gaming GeForce GTX 1660 Ti for $230, an MSI Ventus GeForce RTX 3050 for $280, and a Zotac Twin Edge RTX 3060 for $330. The next real jump in performance from Nvidia doesn't come until the $410 price point, with an Asus GeForce RTX 3060 Ti - but I would argue it's not worth buying due to low VRAM. In fact, I don't think any Nvidia card is worth buying right now until you get to the brand new GeForce RTX 4070 at $600, but I'll get to that.

(The following is just some general budget PC building advice)

The 1660 Ti is an older gen card with only 6GB of VRAM, and the 3050 loses to the 6600, let alone the 6650 XT. Neither are worth buying versus the less expensive AMD cards.

The 3060 also trails far behind the 6700 XT in most games with traditional rendering. It pulls ahead in games with heavy ray tracing effects, but in such a case both would struggle to have smooth framerates even at 1080p. At least this 3060 model has 12 GB of VRAM, but still, the 6700 XT seems like the much better value for just $10 more.

The 3060 Ti at $410 still can't cleanly beat the 6700 XT, but even worse, it's actually an 8 GB model. In fact, all the 3060 Ti, 3070, and 3070 Ti models in stock at Microcenter have only 8 GB of VRAM. Recent games have been increasingly hitting a performance wall on 8 GB cards, which can even hamper Nvidia's advantage in ray tracing performance. You don't get more VRAM until the brand new RTX 4070.

So, AMD dominates the low to mid range of the market, up to the $350 price point. The price to performance curve isn't completely back to where it was before the chip shortage apocalypse - I feel like 10 years ago, cards with 128 bit memory buses like the RX 6600/6650 XT would have launched at $150 or less. Still, it's a damn sight better than it's been for the last couple years. You can actually build a PC with current gen console-level performance for less than twice the cost of a console. (More on that in a bit).

Between $350-$500 is a bit of a no man's land at the moment. There's almost a $200 gap between the cheapest 6700 XT and the cheapest 6800 XT, and Nvidia's cards in that price range are really not worth it right now due to the limited VRAM. If someone is looking to buy with that budget, it's probably better to wait for AMD and Nvidia to fill out that price range with new Radeon 7000 and Geforce 4000 series cards.

Above that price, all I have to say is, man... Does anyone remember when AMD launched the Radeon HD 7970 as their flagship card at $550 in 2012, and people at the time thought that was kind of overpriced? :p

And below the $200 price point, there is the Radeon RX 6500 XT, but... Don't bother. It continues the storied tradition of low end junk graphics chips that aren't worth even being considered by gamers.

Anyways, one way to look at building a gaming PC on a budget is to ask what level of hardware you need to reach parity with the consoles. It's fairly easy to compare with AMD components, since the consoles use AMD Zen 2 CPUs and RDNA 2 GPUs. I'll compare using the count of "compute units" (CUs) in each chip.

Xbox Series S: 20 CUs
PlayStation 5: 36 CUs
Xbox Series X: 52 CUs

Radeon RX 6600: 28 CUs
Radeon RX 6650 XT: 32 CUs
Radeon RX 6700 XT: 40 CUs
Radeon RX 6800 XT: 72 CUs

So, arguably, the RX 6600 is the minimum you need to keep up with this console generation. It has a comfortable margin of CUs over the Series S. As long as games are made to run on the Series S, the 6600 should be able to play them as well.

The 6650 XT gives a little more wiggle room in this regard, but doesn't quite match the PS5. The 6700 XT though jumps past the PS5. With 4 more CUs than the PS5 (and higher clock speeds), it should be able to match or surpass PS5 level visuals. It's still behind the Xbox Series X in CU count, but still, I think the 6700 XT is the closest thing to the "sweet spot" for console level performance.

The 6800 XT leapfrogs past the Xbox Series X, but definitely at a premium. It costs more than the Xbox on its own! So the 6800 XT may still be worth it, but it's well past the sweet spot.
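
To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python using only the Microcenter prices and CU counts quoted above - nothing official, just arithmetic - showing price per CU and which consoles each card matches or exceeds on CU count:

```python
# Back-of-the-envelope sketch: price per compute unit for the cheapest Radeon
# models listed above, plus which consoles each card matches or exceeds on raw
# CU count. Prices are the Microcenter listings quoted in this post.
gpus = {
    "RX 6600":    {"price": 200, "cus": 28},
    "RX 6650 XT": {"price": 250, "cus": 32},
    "RX 6700 XT": {"price": 340, "cus": 40},
    "RX 6800 XT": {"price": 530, "cus": 72},
}
consoles = {"Xbox Series S": 20, "PlayStation 5": 36, "Xbox Series X": 52}

for name, spec in gpus.items():
    per_cu = spec["price"] / spec["cus"]
    beats = [c for c, cus in consoles.items() if spec["cus"] >= cus]
    print(f"{name:11} ${spec['price']}  {spec['cus']} CUs  "
          f"${per_cu:.2f}/CU  matches or exceeds: {', '.join(beats) or 'none'}")
```

On those numbers the 6600 and 6800 XT actually come out cheapest per CU, with the 6700 XT the most expensive per CU - though CU count obviously isn't the whole story, since clocks, memory bandwidth, and VRAM all matter too.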
 

Scooby Doo

Well-known member
Oh, just an update: I was thinking about Baldur's Gate 3 - does anyone know if you need high-end specs to run it?
 
