
RE: CPU & GPU Mining from your laptop

in #altcoin · 7 years ago

I'm not speculating, I'm relating my experience. I have decades-old computers that were used that way (BOINC, and SETI@home before that; mining came later, of course, but it's the same kind of GPU and CPU load either way). I've been doing this sort of thing since at least 1997 and have yet to kill a single GPU or CPU. There's nothing to compare unless one actually dies. How many GPUs and CPUs have you managed to kill with heat? And are you sure that's what died? In my experience it is a rare event (never, in my case) for heat to kill a CPU or GPU unless there is a cooling failure, and even then the chip is usually fine because the computer will either throttle or shut off. Again, I don't doubt that heat can reduce lifespan; I just don't think it's significant in most cases. The capacitors and/or VRMs on the motherboard, the power supply, or the hard drive will die long before a CPU or GPU will.

Just some more anecdotal evidence: I have a GeForce GTX 260 Core 216 that ran flat out, mining or running BOINC projects, 24 hours a day, 7 days a week, for 7 years. I upgraded that machine last year; I still have the card, and there's nothing wrong with it. If shortening its lifespan means it will only last 10 years instead of 20, it doesn't really matter. The CPU in this stupid MacBook Pro runs at 97-98 degrees Celsius using only 75% of its cores (which is why it throttles as soon as the GPU kicks in; I believe the max safe temp is 100 C), yet it still runs the same as it did four years ago, and it's on 24 hours a day most days. The Pentium III Dell laptop I mentioned was used for VoIP and was on for a year straight (minus the occasional reboot), and that was about a year ago, when it was already ancient. BOINC ran in the background the whole time at 100% CPU usage. That laptop still works fine, though it hasn't seen much use since.
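
If you want to watch temperatures like these yourself, here's a minimal sketch, assuming a Linux system that exposes sysfs thermal zones (macOS, as on the MacBook Pro above, needs a separate tool such as `powermetrics` instead; the zone name is an assumption and varies by machine):

```python
from pathlib import Path

def read_cpu_temp_c(zone="thermal_zone0"):
    """Read one sysfs thermal zone, reported in millidegrees Celsius.

    Returns the temperature in degrees C, or None if the zone
    isn't available on this platform (e.g. macOS, containers).
    """
    path = Path("/sys/class/thermal") / zone / "temp"
    try:
        return int(path.read_text().strip()) / 1000.0
    except (FileNotFoundError, OSError, ValueError):
        return None

temp = read_cpu_temp_c()
```

Polling this in a loop while mining is a quick way to see whether a laptop is sitting near its throttle point the way the MacBook Pro above does.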

I agree that it doesn't hurt to add additional cooling, and that it will probably decrease your chances of hardware failure (though I believe those failures aren't likely to be the CPU or GPU). However, I stand by my belief that laptops should be built with cooling adequate to run flat out indefinitely without throttling or failing in an unreasonably short time (i.e. before the hardware is obsolete anyway).

Computers are designed to compute, after all (at least they are supposed to be). As a bit of trivia: before the Pentium generation there wasn't much in the way of power-saving features in CPUs. A 486 ran at 100% all the time, even if it was executing no-ops, so you saved no energy or heat even when the CPU was "idle". I believe this changed with the first Pentiums, though perhaps it was later. Of course, a 486 used far less energy than a modern CPU anyway...


I won't continue to argue with you about this after this comment, but every study ever done, and the laws of thermodynamics themselves, say you are wrong.

Because heat increases electrical resistance, it increases wear and tear on the CPU and GPU and further degrades them, meaning they will require more voltage over time to run at the same frequency. This is not simply an assumption, it is a fact: a cooling fan will increase the lifespan of a CPU.
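
The claim above can be put in rough quantitative terms with the Arrhenius acceleration model, a standard rule of thumb in electronics reliability (this sketch is my addition, not from either post; the 0.7 eV activation energy is a commonly assumed value, not a measured one):

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_use_c, t_stress_c, ea_ev=0.7):
    """How many times faster wear-out proceeds at t_stress_c vs t_use_c.

    Arrhenius model: failure rate scales as exp(-Ea / (k * T)) with
    absolute temperature T, so the ratio between two temperatures is
    exp((Ea/k) * (1/T_use - 1/T_stress)).
    """
    t_use = t_use_c + 273.15      # convert Celsius to Kelvin
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

# A chip held at 95 C wears out roughly 5x faster than one at 70 C
# under these assumptions -- a big multiplier, but applied to a
# multi-decade baseline it can still leave many years of life,
# which is roughly the shape of both sides of this argument.
af = acceleration_factor(70, 95)
```

Note the model only covers steady-state thermal wear-out; it says nothing about fan bearings, capacitors, or the other components mentioned earlier.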

Here is a great post about the subject. http://www.overclockers.com/forums/showthread.php/723980-Truth-about-CPU-degradation

I don't know what it is exactly you think I am wrong about. My experience is my experience, I'm not wrong about it.

That chart has no specific temperatures associated with it and uses a completely made-up timeline (it even says so at the bottom). I'll take my real-world experience over a made-up timeline.

Like I said, I've been using some chips for 10 years or more, using them hard, and they still work fine. All modern CPUs have cooling fans, or they would burn up almost immediately; the point was whether laptops have SUFFICIENT cooling. I simply said that if they don't, I consider them defective. Based on my experience and that chart, I must conclude that every laptop I've ever used has sufficient cooling to live a long life without anything added. Even my MacBook Pro, which has run 24x7 most days at 97+ degrees Celsius for the last 4+ years, hasn't degraded yet (though it's definitely not sufficiently cooled, or it wouldn't throttle under GPU usage; and because of the way it is cooled, those cooling pads don't really help anyway). Again, I'm not arguing that heat doesn't matter. I'm saying notebooks should ship with sufficient cooling or they are defective. Having to add extra fans to your laptop just because you use the CPU for long periods is like having to add extra cooling fans to your car's radiator just because you're going on a long trip. I wouldn't buy such a car, or such a laptop.

I make sure desktop systems I build have sufficient cooling to actually be used. I expect laptop manufacturers to do the same but maybe you think I'm setting the bar too high?

Yes, heat can and will reduce the lifespan of chips, but the reality is that they will still last many years even running hot, as long as you don't push them past their maximum safe operating temperature. If a chip dies in 10 years instead of 20, I really don't care.

I'm not disagreeing with you about heat shortening the life of chips. I'm disagreeing with you about whether laptops should be sufficiently cooled as designed (I believe a decent laptop SHOULD be), and perhaps about the timeline on which you believe chips will fail when subjected to higher-than-average temps. The oldest CPU I have in operation is a Pentium II. The oldest laptop CPUs I have are a Pentium III and a PowerPC G3. I haven't managed to kill those yet. Granted, they see little use now, but they ran hard for many years and still get fired up occasionally.

Here's a better (though old) article on the subject: http://www.overclockers.com/overclockings-impact-on-cpu-life/

One of the takeaways is that thermal cycles have a far greater impact than just running hot.