Technology CHIP WARS (Nvidia v. Intel v. AMD v. uhhhh errrrr everybody else or whatever...)

bullethead

Part-time fanfic writer
Super Moderator
Staff Member
You'll need a nice and powerful power supply for that 3090 and I hope to god you got one with decent VRAM cooling on the back otherwise it will die in a few months.
Well, he could buy a big aluminum heat sink for ~$10-12 and a big ass thermal pad and solve the problem that way.
 

*THASF*

The Halo and Sonic Fan
Obozny
You'll need a nice and powerful power supply for that 3090 and I hope to god you got one with decent VRAM cooling on the back otherwise it will die in a few months.

I've got a 1000-watt EVGA SuperNOVA. The card is a reference card from Nvidia themselves. Are you telling me Nvidia screwed up the VRAM cooling on a card with a cooler the size of a cinderblock?




Oh my god. :eek:
 

Vyor

My influence grows!
I've got a 1000-watt EVGA SuperNOVA. The card is a reference card from Nvidia themselves. Are you telling me Nvidia screwed up the VRAM cooling on a card with a cooler the size of a cinderblock?

When it reaches 110°C? Yes. Yes they did.

Nvidia reference cards fucking suck. They've always sucked. The most dramatic example you can point to is the damn GTX 590, which had such underpowered VRMs that the damn cards would explode under normal use even when the GPU cores weren't that hot.

Or how about the 980ti's reference VRMs? They were so under-specced for the memory voltage that you could actually see where the larger inductors were supposed to go, and that eventually killed a lot of cards.

Oh, oh! How about the 2080ti's that died because of the reference cooler design? That was pretty recent, you know, the space invader artifacts? Nvidia even admitted that one was their fault.

Nvidia cards have never been good, certainly not when they're the reference models.
 

bullethead

Part-time fanfic writer
Super Moderator
Staff Member
If it doesn't conflict with his CPU cooler.
Most of the heat sinks of that size are pretty low profile, but yeah, they might interfere with stuff like M.2 SSDs.
[Image: RZSr1GW.jpg]


(This isn't a single big one-piece heat sink, but it's the typical height of those things.)
 

Vyor

My influence grows!
Most of the heat sinks of that size are pretty low profile, but yeah, they might interfere with stuff like M.2 SSDs.
[Image: RZSr1GW.jpg]


(This isn't a single big one-piece heat sink, but it's the typical height of those things.)

Biggest issue with that is the 2 layers of thermal pads. Also having to pull heat off of a plastic-covered piece of shit.
 

*THASF*

The Halo and Sonic Fan
Obozny
When it reaches 110°C? Yes. Yes they did.

Nvidia reference cards fucking suck. They've always sucked. The most dramatic example you can point to is the damn GTX 590, which had such underpowered VRMs that the damn cards would explode under normal use even when the GPU cores weren't that hot.

Or how about the 980ti's reference VRMs? They were so under-specced for the memory voltage that you could actually see where the larger inductors were supposed to go, and that eventually killed a lot of cards.

Oh, oh! How about the 2080ti's that died because of the reference cooler design? That was pretty recent, you know, the space invader artifacts? Nvidia even admitted that one was their fault.

Nvidia cards have never been good, certainly not when they're the reference models.

My memory sensor is showing 98°C max on the VRAM while gaming. Oh my god, what the fuck, Nvidia? :eek:

Now I'm seriously staring at my card and thinking of doing the thermal pad mod. Shit, none of this would've been an issue if I'd water-cooled because I would've installed new pads while putting on the waterblock and backplate anyway.
 

Vyor

My influence grows!
My memory sensor is showing 98°C max on the VRAM while gaming. Oh my god, what the fuck, Nvidia? :eek:

Now I'm seriously staring at my card and thinking of doing the thermal pad mod. Shit, none of this would've been an issue if I'd water-cooled because I would've installed new pads while putting on the waterblock and backplate anyway.

And now you know why I will endlessly shit on nvidia over on the SB graphics card thread. Along with things like this:
Alienware to Return Abducted GPU Cores to m15 R5 Laptop With vBIOS Fix

Fuck Nvidia, they're as bad as Apple and Intel combined.
 

Vyor

My influence grows!
Rumor mill of Nvidia looking to source assistance from Intel during the ongoing chip and GPU shortages. Also the Nvidia GT 730 is back baby!


hahahahahahahahahahahaha

Whoever fucking wrote this has 0 clue what the fuck they're talking about.

You want to know the main bottleneck for nvidia? GDDR6(X) and fucking PCBs. Not silicon chips.

Fuck this article, this guy is a fucking idiot. Don't listen to pcgamer, the verge, or similar outlets on anything tech.

Anandtech, Techspot, and GamersNexus are all decent outlets when it comes to tech news. And Tom's hardware.
 

bullethead

Part-time fanfic writer
Super Moderator
Staff Member
Maybe Nvidia will convince Intel to make Pascal chips, then stick shitty RAM on them and expect people to pay Ampere prices for the pleasure.
 

bullethead

Part-time fanfic writer
Super Moderator
Staff Member
They could just ask TSMC to do it. Not like anyone is using 16nm.
Nah, there's probably someone using that node for something. It's not like it's 32nm or some other super obsolete node no one would use for anything if they had a choice.
 

Vyor

My influence grows!
Nah, there's probably someone using that node for something. It's not like it's 32nm or some other super obsolete node no one would use for anything if they had a choice.

According to TSMC earnings reports, more people are using 32nm and 55nm than 16nm right now. 16nm is old enough to be uncompetitive but still new enough to be expensive.
 

Husky_Khan

The Dog Whistler... I mean Whisperer.
Founder
Sotnik
Reuters said:
Nvidia Corp (NVDA.O) may not be able to meet a March 2022 deadline for closing its $40 billion acquisition of British chip technology firm Arm Ltd due to European regulators' reluctance to consider the case until after the summer holidays, people familiar with the matter told Reuters.

Though apparently close to a deal, US, Chinese and European regulators are still scrutinizing the intended transaction before giving it approval.

 

Vyor

My influence grows!
Though apparently close to a deal, US, Chinese and European regulators are still scrutinizing the intended transaction before giving it approval.


Because it would be anti-competitive as fuck.
 

Husky_Khan

The Dog Whistler... I mean Whisperer.
Founder
Sotnik
Thought about posting this in the 'Hate News' thread but... ehhh seems better here.

Toms Hardware said:
The consulting firm estimated that 25% of the graphics cards shipped in the first quarter of 2021 went into the waiting grubby hands of cryptocurrency miners and speculators. That's roughly 700,000 high-end and midrange gaming graphics cards. In monetary terms, we're looking at a hefty sum in the range of $500 million.

Toms Hardware said:
Cryptocurrency miners aren't the only reason for the drastic inflation in graphics card pricing. The pandemic also played a big role in this situation since it forced many factories to temporarily shut down, interrupting supply chains in the process. It's been known that graphics card components, such as GDDR6 memory chips, voltage regulators, capacitors, and other parts, have also gone up in price since the start of the pandemic. Jon Peddie Research measured an increase of up to 70% early in the year.

 

Vyor

My influence grows!
I'm going to be honest, 700k isn't a whole lot when it comes to GPU sales. That's not even 1% of 2018's total sales and only 21% of Q4's total sales (revenue of 2.8 billion dollars); even assuming every GPU sold was 850 dollars (they weren't, most were way cheaper), that's still 3.29 million GPUs sold. Assuming a more sane average selling price of 350 dollars, you raise that to 8 million GPUs, which drops the miners' share down to under 9 percent. That's for a single quarter, and a quarter where demand was really bad.
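If you want to sanity-check that arithmetic yourself, here's a quick back-of-the-envelope sketch. The 700k miner figure and the $2.8B quarterly revenue come from the posts above; the two average selling prices are the same assumptions made in the post, not real market data:

```python
# Back-of-the-envelope: what share of quarterly GPU unit sales
# would 700k cards bought by miners represent?
mining_cards = 700_000      # JPR estimate quoted earlier in the thread
quarterly_revenue = 2.8e9   # Q4 GPU revenue in dollars, as cited above

for asp in (850, 350):      # assumed average selling prices, USD
    units_sold = quarterly_revenue / asp
    miner_share = mining_cards / units_sold
    print(f"ASP ${asp}: {units_sold / 1e6:.2f}M cards sold, miner share {miner_share:.1%}")
```

At an $850 ASP that works out to roughly 3.3 million cards and a ~21% miner share; at $350 it's 8 million cards and under 9%.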
 

Husky_Khan

The Dog Whistler... I mean Whisperer.
Founder
Sotnik
Could this be the beginning of the end of the great graphics card shortage?

Toms Hardware said:
According to ComputerBase, graphics card prices have begun to drop as much as 50% in Europe. Availability has also improved significantly, with sales of most GPU models from both AMD and Nvidia doubling month-over-month. This report comes on the heels of ASRock, a GPU maker, noting that GPU pricing is easing as demand from Chinese cryptocurrency miners wanes.

 
