WMN: t3_32x0t2_t1_cqfi7zn

Type: WMN: non-understanding

Meaning: situated meaning

Context: Online interaction

Corpus: Winning Arguments (ChangeMyView) Corpus

URL: https://convokit.cornell.edu/documentation/winning.html

License:

Dialogue: t3_32x0t2

[TITLE]

CMV: AMD CPUs and GPUs are less practical and more prone to failure than their Intel/Nvidia counterparts.

[NGeX]

After multiple hard failures of AMD products in 3 laptops and one desktop of mine, I am convinced AMD products have a higher failure rate than Intel and Nvidia. I have had a Radeon GPU fully melt to my MB before, and it was not due to lack of air circulation. That whole machine was AMD from top to bottom and it ran nasty hot...even after checking thermal paste and adding additional fans. In my experience, I have not had an LGA1155 or LGA1150 socket CPU overheat...even on a stock CPU fan. I've never had to add additional fans...even with a high powered Nvidia GeForce GTX GPU installed. The machine runs at the same temperature. I also feel like AMD builds are too rigid. Intel and Nvidia builds appear more forgiving and interchangeable. I am studying to be an A+ tech. It would be horrifying to remain so biased when dealing with customers' machines. I'd rather lose this look of disgust when I see an AMD based machine. Please change my view. _____ Edit: Calling my post anecdotal will not change my view. What will is numbers: proof of failure rates, proof of cost efficiency, etc. Yes, my hatred of AMD is likely extremely irrational, but that is just the point. I want to be shown it is, so I can be more open-minded when it comes to PCs. I want to not always say "That GPU is garbage" just because I had multiple bad experiences. I want to tell them it's either garbage or amazing based on cold, hard evidence. Thanks

[Pipstydoo]

I've had both branded CPUs and GPUs and had no problem with either.

[buddylee_moa]

I've had many problems with Nvidia too. GPU tech got so power-hungry and hot for a while there.

[MenacingSailboat]

I'll second what /u/TURBO_FAPPER said and add that my own rig has been running a pair of 1GB 5770's in crossfire for the last five years or so. Never had any problems on the graphics front, except that the setup is getting old and can't run things at max settings anymore.

[penguished]

"I have had a Radeon GPU fully melt to my MB before and it was not due to lack of air circulation." I don't believe this at all. Be wary of viral marketers, reddit. But if you really did have so many problems, it was a run of extremely weird luck, nothing more. C'mon folks. They'd go out of business in a week if their product was melting in computers everywhere.

[NGeX]

I'm not saying that Radeons commonly melt to MB, but it did for me.

[Adezar]

My anecdotal evidence is that I have 6 computers in my house, they have all been AMD based for the past 14+ years and get replaced every 3-5 years. I've never had CPU failure except when I had a complete cooling failure once. As for GPU, I always use Nvidia so can't say.

[mario0318]

Without any technical proof as to why your view is the way it is, I can't see how you could truly change your view. If poor experiences with AMD hardware are all you're going on to make your point, that's usually a bias that will take real-world counter-examples to neutralize, i.e. having several Intel machines fail on you. As a student of A+, look at hardware not by its brand but by its low-level design. I'm sure you know lots of different hardware configurations can have many different results regardless of brand.

[ManRAh]

I had a 900mhz Thunderbird that ran for something like 7 years (edit: it never stopped running, and it's in storage somewhere and would likely still boot just fine). I had multiple Radeon cards that each ran more than 5 years. Only one of the cards actually stopped working, the other was simply outdated. Now I'm running Intel and nVidia simply because at the time they were the most competitive. AMD needs to do some catchup, but the days of low quality are far behind them. I still have friends who refuse to buy anything AMD simply because they *still* believe AMD drivers are buggy piles of shit. You can show them all the reviews and benchmarks you want, but having a rough start apparently haunts you like a ghost. Ignorance and fanboyism.

[DoTheEvolution]

My experience differs, but all of it is desktop:

* My first AMD CPU was probably a Sempron processor. It worked well, and probably still sits somewhere put away or has been thrown out.
* My first AMD GPU was a Sapphire HD3870. That thing was like half the price of the GT 8800 while delivering 90% of the fps the GT 8800 did. The fan died 3 years after, but I replaced it with a home-made solution, and I am not kidding, that card is still usable and was in heavy service until 2014 (from 2008, I believe). Not run by me, but Civ 4/5, Heroes 3/4/5, even Skyrim were all played on it every day (replaced in 2014 by an already-used GTX 560 Ti).
* My current PC has an unlocked Athlon X3 440, clocked from 3 GHz to around 3.6 GHz. Bought in 2010, I believe; it's insane value for the price and I play on it a lot. [Here's the article](http://www.tomshardware.com/reviews/athlon-ii-x3-440-gaming-performance,2619.html) that pushed me towards it. I will very likely be buying Skylake near the end of this year, but really I can't bad-mouth AMD at all.
* I was in charge of setting up and running two crypto mining machines when the hype was high. The guys bought 6 x R9 290, 3 per machine, with Gigabyte UD3H mobos, some Pentium CPU / 4 GB RAM and 1 kW Seasonic Platinum PSUs. These machines ran for a year, nonstop - 24/7, through the summer heat. They all made it through, which I did not expect at all! I have one Sapphire 290 Tri-X from it, the only one with non-reference cooling, and it's a fucking monster performer even when bottlenecked on my crappy Athlon.
* I had a GT 6600 that started to show artifacts in NFS Most Wanted right after the warranty ran out.
* On another machine, a C2D 8400 is solid as a rock as well.

I am not saying I would be buying an AMD CPU today, or that if I could get a GTX 980 instead of an R9 290 I would not go for it in an instant. But quality control and reliability... I have had zero problems with AMD! [STA-CITE]>I am studying to be an A+ tech. It would be horrifying to remain so biased when dealing with customer's machines. 
[END-CITE]When you see an old Athlon X2 running Win7 and it does not feel like shit at all, compared to the old Intel offering from that age, you will appreciate that AMD had some great moments. Also, it would not hurt knowing that during the time when Intel was really a distant second in price/performance, it engaged in serious anticompetition bullshit, like forcing/buying exclusivity for its CPUs, or even hurting AMD performance in its C++ compiler. This all helped Intel maintain the market, while AMD did get a hefty out-of-court settlement.

[jakobx]

I usually buy the gear that has the best price/performance ratio in my budget range, which means I've had plenty of AMD CPUs in the past and Intel CPUs in recent times. None of them died or gave me any problems. I mostly had ATI/AMD cards because Nvidia usually has bad price/performance. I had to replace a couple of cards, but they were still in warranty so no big deal (some ATI and some Nvidia). I did have to throw away a couple of notebooks, though, because of Nvidia's bumpgate fiasco. That's pretty much the only time a CPU or GPU has failed in my experience (if we are ignoring replacing fans). Thankfully the notebooks belonged to people I know and not me. :)

[kraetos]

I've owned a GeForce 420 MX, a Radeon 9700 Pro, a Radeon X1950XT, a GeForce 8800 GTX, a GeForce 580 GTX, a Radeon R9 270, and a GeForce 980 GTX. None of them ever failed on me. What makes your anecdotal evidence any better than mine?

[ComputerSavvy]

To say that one company is better than the other is not a fair comparison, because there are so many factors in play that are outside of the control of AMD, Intel or nVidia.

First, please watch this video to get a basic understanding of how CPUs and GPUs are manufactured: https://www.youtube.com/watch?v=qm67wbB5GmI&hd=1 The manufacturing processes for both types of chips are pretty much identical, and to say that they adhere to very tight tolerances is an understatement.

Now that you know what a wafer is, and that there can be hundreds of dies on a single wafer, let's talk about yield. Yield is the number of useful, functional dies you can extract from a single wafer that meet some or all of the QA tests. Even with the very tight manufacturing tolerances employed, there will be variances in quality and performance from one die to the next. Die 57 may pass all the tests and be marked and sold at its rated speed, while die 58 right next to it may function but not pass all the QA tests at its designed and rated speed, yet pass all the tests at a lower clock rate. That happens more often than you know. Both dies will be placed into chip packaging and marked / sold at different clock rates and at different price points, although they are identical in design.

AMD and nVidia sell their chips to the OEMs, Original Equipment Manufacturers, who then integrate them into various product lines. Once these chips leave the fab, AMD or nVidia have no control over what the OEMs or the end user does with them or how they are handled. Sometimes due to market demand, they may even take a higher end part, down-label it, and sell it as the lower end part, because they can't produce enough of the lower end parts to meet demand. This is why OC'ers can sometimes get big performance gains out of one batch of chips, while the "exact same" video card chipset in a different brand card can't overclock as high with the same settings.

I remember that this happened to the Celeron 300A series: Intel used faster chips and down-marked them to fulfill some of the market demand at the time. The overclockers soon discovered that many of these little 300A's could be overclocked to 506 MHz and run stable, even with a factory fan and heatsink combo, and outperform the much more expensive Pentium II chips of the day. It was a hunt to find the right batch number series on the packaging to get the "right" one.

Overclocking is a lot like running nitrous oxide in a Honda or Toyota engine and getting more horsepower out of it than a stock V-8. It'll work, but for how long? To get a successful OC, it's very common to run the chips at a higher than rated voltage. You're basically gambling with your hardware if you OC it. If you buy a video card or a CPU rated for a particular speed, it should run fine at its rated speed and voltage, but there are some video card manufacturers out there that will overclock the chip at the factory and advertise and sell it as an overclocked board. Some motherboards even have overclocking utilities built right into the BIOS. Remember the yield I mentioned earlier? Not all dies are the same; even coming off the same wafer, they will have different performance characteristics. The end result is a crapshoot if a manufacturer or the end user OC's their chip or video card. The OC'd parts will probably have a higher failure rate, or they may work just fine for years to come. Like I said, it's a crapshoot.

Then there is OEM QA on their own product lines. Brand A video card company may put out a very good quality product, while Brand B, using the same reference board design as Brand A and the same graphics chip, may put out a substandard product for the same money. Was the computer properly maintained and handled? Are the heatsinks completely clogged to the point that you can't even see the fins or the fan can't turn anymore? There are lots of dust contamination photos in r/techsupportgore; I won't link to any.

Little Johnny is so eager to assemble his 1st whizbang computer. He bought all the parts from Newegg and read all the do-it-yourself articles he could find! This is SO easy! Little Johnny rips off the shrink wrap from all the packaging and is curious to see that new $200 video card up close; he's only seen photos on the web. Fingers all over everything while sitting on the bed or carpet, examining all the new parts. Wicked looking heatsink and UV reactive fans on my new bad ass video card! This video card is gonna rock Halflife 3! Static electricity zaps can instantly fry these chips, or they can endure a lesser shock which damages and erodes the delicate circuit pathways inside the chip without completely blowing them out. The initial damage is done, and it can often happen without the person who inflicted the offending shock even knowing it occurred. Total failure can happen months or even years down the road, and the cause of the failure is near impossible to link back to when the original shock damage happened or who did it. "POS AMD/nVidia/Whatever Brand card, I'm never buying one of those again!" Too bad that little Johnny didn't follow proper ESD handling procedures..... All the packaging had this canary yellow sticker with black printing on it, sealing the grey anti-static bag, and it has a wicked triangle with a lightning bolt on it. "That would be a cool logo for my garage band! WOO!"

On top of all these variables, toss in design errors such as the Intel FDIV problem or the nVidia die / substrate issue from 2008: http://www.theinquirer.net/inquirer/news/1004378/why-nvidia-chips-defective

When you add all of this up, it makes it very difficult to say that AMD is better/worse than nVidia. What **really** matters is how they handle a problem when one arises. Do they stick their fingers in their ears, shouting "LA LA LA LA I can't hear you!" over and over and then bury their head in the sand, or do they own up to it and make good on a defective product? That's a very important aspect that needs to be addressed too.
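The speed-binning idea described above (identical dies sorted into different rated speeds by per-die variance) can be sketched with a toy Monte Carlo. All the numbers here, the mean clock, the spread, and the bin thresholds, are invented for illustration, not real fab data:

```python
import random

def bin_dies(n_dies=1000, mean_mhz=3400, sd_mhz=150, seed=42):
    """Simulate die-to-die clock variation across a wafer and sort
    dies into speed bins, mimicking how identical designs end up
    marked and sold at different rated speeds."""
    rng = random.Random(seed)
    bins = {"3.6 GHz": 0, "3.2 GHz": 0, "2.8 GHz": 0, "scrap": 0}
    for _ in range(n_dies):
        max_stable = rng.gauss(mean_mhz, sd_mhz)  # this die's highest stable clock
        if max_stable >= 3600:
            bins["3.6 GHz"] += 1   # passes QA at the top rated speed
        elif max_stable >= 3200:
            bins["3.2 GHz"] += 1   # same design, down-binned
        elif max_stable >= 2800:
            bins["2.8 GHz"] += 1
        else:
            bins["scrap"] += 1     # fails even the lowest grade
    return bins
```

With these made-up numbers most dies land in the middle bin, which is also why market-demand down-labeling (the Celeron 300A story) leaves headroom: a part sold at a lower grade may actually be stable well above its sticker speed.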

[cd66312]

Well, may as well add my personal experience:

* Nvidia 256 - Failed once; the replacement actually still runs (I needed a PC with a parallel port, and my parents happened to still have the PC)
* AMD Radeon 9800 Pro - Ran for 3 years or so, sold on eBay in working condition
* AMD 6800 - Ran for 2 years, gave it to a buddy because I needed something better for Battlefield 2142
* Nvidia 8800GT - Used as my main gaming GPU for ~5 years, then used in my home server till last year, when I sold it on eBay. The guy I sold it to claimed it didn't work.
* Nvidia 460 GT - 3 years, stopped working
* AMD 6850 - Still in use, no problems
* AMD R280x - Still in use, no issues yet

That's all I can remember at the moment; I will add more as they come to me. But in my experience, I've never had an AMD fail.

EDIT: About CPUs and cooling: while it is true AMD runs a bit hotter than Intel, a properly built computer won't have any issues dissipating the heat. I have a Phenom X4 and a 6850 stuffed into an HTPC case without enough spare room to fit a roll of quarters. Absolutely no issues with heat yet, and from the age of those components, you can get an idea of how long they have been running, 24/7 (HTPC after all), with lots of gaming.

[MrF33]

[STA-CITE]> while it is true AMD runs a bit hotter than intel [END-CITE]I'd always heard this wasn't the case; it's just that AMDs run closer to their thermal limit than Intels do. Most people don't recommend letting an AMD get much above 60 C, whereas Intel chips can tool along at 80-85 C much more safely.

[LuminiferousEthan]

Ya, but I don't see how we can change your view here. You are just citing your own experience. I can do the same: I have had 5 AMD CPUs and 6 or 7 AMD GPUs over the last 10 years, and I have never once had a problem with any of them. Ever. And those are only the machines I had for myself. I also build PCs for people and always recommend AMD over Intel for several reasons (one being that Intel CPUs are ridiculously overpriced). And I have never had a customer come back with a hardware failure, after doing this for, again, like 10 years. Meanwhile, the PCs I work with in the office are usually Intels, and we have to replace them fairly often.

[Rubykuby]

Try /r/amd. Lots of people happily using AMD hardware. AMD CPUs have become a little lackluster lately, mostly because they haven't released anything new on that front. Their GPUs are still excellent, though. Absolutely no complaints for my HD 7950.

[TURBO_FAPPER]

Your view is based on anecdotal evidence. I've had a 1 GB AMD (6870) for almost 4 years with thousands of hours of gaming on it, and no problems at all. In fact, I only started using the fan controls about a year ago, too. My work computer has an Nvidia card that had to be replaced after only 5 months. I wouldn't say one is better than the other because of a couple of samples.

[whatanonner]

Fighting anecdotes with anecdotes.

[TURBO_FAPPER]

That's the point of my comment.

[FacelessBureaucrat]

I've been running an AMD Phenom II for six years now, with no issues whatsoever.

[MechanizedAttackTaco]

To throw my anecdote into the ring, every Nvidia graphics card I've owned has died on me within a year.

[Sajamba]

Me too! Runs everything I throw at it, no problem!

[evanx11]

The Phenom II is a trusty CPU. I had mine for four years before replacing it with an Intel, and only because I wanted to upgrade more performance. The processor did its job wonderfully.

[FacelessBureaucrat]

Yes - especially since I have the three-core one with the locked fourth core that was easily unlockable. I have been itching to upgrade, but haven't been able to justify it, since it's still working so well.

[wiseoracle]

I think you just ran into a bad batch of CPUs/GPUs in the AMD line. Around 2007-2009, any laptop that had an Nvidia chip would overheat and fail. It was a very common issue I dealt with while working at a computer repair shop on a college campus. The only fix at the time was to either do a warranty repair or purchase a refurbished/repaired motherboard from a 3rd party seller. The other option, which was a 50/50 fix, was to put the motherboard on a cookie tray and bake it in the oven. The theory was that this would get the solder joints fluid again to rejoin the broken connection. But neither option really fixed the problem, because it would come back. I ran AMD when Intel was lagging behind with the Pentium 4. I had several AMD chips then and never had any reliability issues. Recently my Nvidia GTX 470 failed out of the blue. No clue what's wrong with it, but it failed. All hardware has its faults and bad production batches with failure rates.

[vettewiz]

AMD is absolutely garbage. I ran Nvidia for years and years without a shred of a problem. Switched to 2x AMD cards to run 5 monitor setup. Their drivers were written by children, they crash at the hint you're going to do something strenuous. And the cards themselves are impossible to keep cool without jacking their fans up to 100% and running like a rocket ship blast off. Never again AMD, never again.

[40_Minus_1]

You should be aware that your tone makes your argument far less persuasive.

[vettewiz]

I think you're underestimating how painfully bad the experience has been.

[Mavericgamer]

That's entirely anecdotal; I've had a rig I built for my brother run through 3 NVidia cards due to heating issues and then run fine on a 2x AMD setup. I have a laptop *right now* whose Nvidia drivers won't stop acting up. I've also had a pair of Nvidia cards running in my home rig for some 4 years now without a hint of an issue. Hardware can be weird like that, but what you're describing is an oddity of statistics.

[vettewiz]

It might be, but I'm pretty sure I've seen votes about the driver stability between the two, and Nvidia won substantially. Unfortunately I wouldn't know where to find that anymore.

[0pyrophosphate0]

Votes? You mean a popularity contest?

[DoodleVnTaintschtain]

I have no opinion on the two (I have an AMD R9 290x, and an Intel 4790k, and they're both champs, but sample size of one and all that...). That said, I think he meant more along the lines of consumer survey than popularity contest.

[DaneboJones]

I bet you're talking about Hp dv6000s

[wiseoracle]

Correct. Also the Dell XPS line too. And the Compaq's too.

[MrF33]

[STA-CITE]> No clue what's wrong with it, but it failed. [END-CITE]Other than it being based on something that was released more than 5 years ago :) That's a pretty long lifespan for a piece of hardware that gets as much hard use as a graphics card.

[wiseoracle]

True.. But dammit now I can't play any games. Been too lazy/cheap to buy a new card.

[MrF33]

I went on eBay and snagged a used 780 for pretty cheap (about the cost of a 960) and now I don't even know what to think everything works so dang well. The step up from the 560 to the 780 was much bigger than I thought it would be. Probably try to convince my wife that I need to upgrade the rest of my computer too :)

[Shammyhealz]

[Microsoft did a study](http://www.extremetech.com/gaming/131739-microsoft-analyzes-over-a-million-pc-failures-results-shatter-enthusiast-myths) on CPUs, but failed to identify the companies (they referred to them as Company A and Company B). At stock speeds, the chance of a CPU crash was 1 in 400 for A and 1 in 390 for B. Not a hugely significant difference. Once you get into overclocking they diverge, but we have no way to tell who is who. [Puget Systems](https://www.pugetsystems.com/labs/articles/Video-Card-Failure-Rates-by-Generation-563/) analyzed Nvidia vs ATI GPU failure rates, and Nvidia did perform significantly better there. It's interesting to note that they did use Asus for all of their cards, so differences between manufacturers should be minimal.
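The quoted CPU numbers (a 1-in-400 vs 1-in-390 crash chance) can be sanity-checked with a plain two-proportion z-test. The per-vendor machine counts below are assumed placeholders, since the article doesn't report them; only the rates come from the study:

```python
from math import sqrt

def two_proportion_z(fail_a, n_a, fail_b, n_b):
    """Pooled two-proportion z-statistic: how many standard errors
    apart the two observed failure rates are."""
    p_a, p_b = fail_a / n_a, fail_b / n_b
    pooled = (fail_a + fail_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Assumed ~10,000 machines per vendor; 25/10000 = 1/400, 25/9750 = 1/390.
z = two_proportion_z(25, 10_000, 25, 9_750)
```

Even with ten thousand machines per vendor, |z| stays far below the 1.96 threshold for 95% significance, which supports the "not a hugely significant difference" reading.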

[NGeX]

Asus GPU hardware is usually very very good. So to pit those together would be a good test. I'm interested to know what actually failed though. If it was the chip itself or another portion of the hardware that is really unrelated.

[MrF33]

The problems you're describing are not ones which can be inherently attributed to the chipset, but instead to the packaging. AMD and Intel chips have different requirements, and generate and dissipate heat in different ways, but that does not mean that one is "better" than the other; it just means they're different. [STA-CITE]>That whole machine was AMD from top to bottom and it ran nasty hot...even after checking thermal paste and adding additional fans. [END-CITE]Just because you checked it does not mean that you did a good job of solving the problem :) [STA-CITE]>In my experience, I have not had a LGA1155 or LGA1150 socket CPU overheat...even on a stock CPU fan. [END-CITE]And I've never had a non-overclocked AMD unit overheat on the stock fan. I have had both a GTX 560 Ti and a 780 overheat on me, though. But I don't blame that on the CPU/GPU; I blame it on the cooling systems and the third party "overboost" software. [STA-CITE]>I also feel like AMD builds are too rigid. Intel and Nvidia build appear more forgiving and interchangeable. [END-CITE]Can you give a reason *why* you think this to be true? Both manufacturers use one basic socket (1150/AM3+) for their range of "consumer" chips, all with multiple manufacturers making different board chipsets. There are just as many AMD offerings as there are Nvidia in the GPU department as well, all of which will work just as well with an FX-4300 as they will with an FX-9300 as they will with an i7-4770K. The biggest differences between the manufacturers come down to cost and intent. AMD CPUs are going to be less efficient than their Intel counterparts, but also less expensive, so it's a basic tradeoff. AMD GPUs are pretty much on par in every way with their equivalent Nvidia counterparts, the difference currently being that the two are on offsetting generational cycles.

[NGeX]

This has helped here a bit. It was a great description of the differences. I've always been of the mind that when purchasing a CPU...while cost doesn't guarantee performance, it usually means a more durable product. This is why I've chosen Intel over AMD for CPUs. As for GPUs...I've just seen a lot of failures over the past 2 years from AMD GPUs. Nvidia ones do fail...but I've seen them fail a lot less. It's not just third party hardware. I've seen top shelf cards die nearly just as often as the basic ones...both AMD and Nvidia.

[MrF33]

[STA-CITE]> it usually means a more durable product. [END-CITE]Not really; it means a product has had more put into making it *perform* better. I've found [THIS SOURCE](https://www.pugetsystems.com/labs/articles/Most-Reliable-Hardware-of-2013-528/) where a company kept track of their product repairs for 2013, and while it's dated, it both supports and contests your claims. According to this group, AMD desktop processors had lower failure rates than Intel (though the difference is pretty negligible), while AMD graphics cards had a higher failure rate than their Nvidia counterparts, by a non-negligible amount. This only holds true for the consumer versions of the products. The professional versions (Nvidia Quadro and AMD FirePro) are both pretty much bulletproof. So, overall, Nvidia *is* better than AMD when it comes to GPUs (or at least it was in 2013; I think that AMD has been making a lot of strides since its 79xx series came out), and as for Intel vs AMD, there is absolutely no difference. It just comes down to what you want out of your CPU and what your budget is.

[NGeX]

∆ View changed on AMD CPUs. You have brought me solid, still fairly up to date data on the matter. I still think Intel is slightly superior, likely because Intel has been around a long time and their CPUs are usually better performing (hence higher cost), but you have convinced me AMD CPUs are not crap. Still not convinced on the AMD GPUs, but sometimes failure is the fault of the hardware, not the chip itself.

[DeltaBot]

Confirmed: 1 delta awarded to /u/MrF33. [^MrF33's ^delta ^history](/r/ChangeMyView/wiki/user/mrf33) ^| [^delta ^system ^explained](/r/ChangeMyView/wiki/DeltaBot)

[eikons]

Okay, we have a thread full of anecdotes now - let's look at some actual data: http://research.microsoft.com/pubs/144888/eurosys84-nightingale.pdf If you don't want to read a boring paper, [here's an article that explains it.](http://www.extremetech.com/gaming/131739-microsoft-analyzes-over-a-million-pc-failures-results-shatter-enthusiast-myths)

[Figure 3](http://www.extremetech.com/wp-content/uploads/2012/06/AMD-vs-Intel.png) indicates that there is a difference between the two vendors (they don't indicate which is which), but it's a small one. Certainly not big enough for one individual to notice based on their experience. I'm not too concerned about the "OC" columns, as the number of consumers who actually overclock is statistically negligible. Whatever conclusions you draw from the tens, hundreds? thousands? of "samples" you encountered in your career, they should not reflect any real difference.

For graphics cards, there seems to be a very legitimate reason for your prejudice - [if this article](https://www.pugetsystems.com/labs/articles/Most-Reliable-Hardware-of-2014-616/) is accurate. They report a failure rate of 2.34% on NVIDIA cards and 17.99% for AMD cards in 2014. AMD had "only" a 10% failure rate the year before, so it's been a bad year. The data may be somewhat biased, because they sell a lot more NVIDIA cards and they discard results for which they don't have a large enough sample size. It's possible that the AMD rate is so high because only high end AMD cards sold enough units to make the list, and high end cards have a higher failure rate. If this is true, NVIDIA got their incredibly low rating from the many budget cards they sell. Still, their most reliable cards seem to be the GTX980 and GTX780 - so take my hypothesis above with a grain of salt.

What I would conclude from my armchair research is that there's no reason to worry about CPUs from a failure rate point of view. As for GPUs, there are as many reasons to dislike Nvidia as there are to love them - but they do seem to be the more reliable recommendation at this time. I'm hesitant to straight out recommend everyone to only buy Nvidia products, because nothing halts innovation like the lack of competition. But there you have it.
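On the "not big enough for one individual to notice" point, a quick binomial calculation shows why a handful of personal machines can't reliably distinguish vendors. The per-card rates are the 2014 Puget figures quoted above, and the 4-unit sample loosely mirrors the OP's anecdote (a rough illustration only, since the OP's failures accumulated over years while these are single-year rates):

```python
from math import comb

def p_at_least(k, n, p):
    """Probability of seeing at least k failures among n units when
    each fails independently with probability p (binomial tail)."""
    return sum(comb(n, m) * p**m * (1 - p)**(n - m) for m in range(k, n + 1))

# Rates quoted above: 17.99% (AMD, 2014) vs 2.34% (Nvidia).
p_amd_all4 = p_at_least(4, 4, 0.1799)  # all four AMD units fail: ~0.1%
p_nv_any = p_at_least(1, 4, 0.0234)    # at least one Nvidia failure: ~9%
```

Even at the much higher AMD rate, a 4-for-4 failure streak is roughly a one-in-a-thousand event, which is why the thread keeps calling the OP's experience bad luck rather than evidence.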

[NGeX]

So far this is the only post that I see that offers actual data. Part of what I want is for people to show me data that AMD is reliable. So far people are convincing me that AMD CPUs do well, but no one has addressed actual build quality. Every other post about GPUs has yielded no clear facts, other than experience, that Nvidia is better or worse. It seems it differs between releases. Currently, it seems Nvidia outlasts Radeon.

[eikons]

Your initial post was both about CPUs and GPUs, with a little more emphasis on CPUs. I'm not gonna try to change your view on the GPU part, since your view on that seems justified. Hopefully I did change your mind about the CPUs. Especially because AMD CPUs are well priced, making them a good recommendation. We also need AMD to stay afloat to offer competition to Intel/Nvidia. I do a lot of PC build recommendations for people I know, and while I only build Intel/Nvidia systems for myself (3D workstation use, many compatibility issues with AMD) I'm happy to recommend the AMD APUs for people with a budget. I've heard no complaints. But then again that is anecdotal. ;)