Hacker News

Another kind of bug that costs money: processes pegging the CPU at 100% for abnormal amounts of time. Back in the day when we deployed Windows Small Business Server, the WSUS process would sometimes get stuck at 100% CPU. In one particular instance this went on for about a week before anyone noticed.

I wondered how this affected power consumption. For this particular server we had power consumption metrics, and sure enough, consumption was roughly 30 watts above normal for that week.
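A quick back-of-the-envelope sketch of what that measurement works out to. The 30 W excess is from the observation above; the electricity price is an assumed illustrative value, not from the original comment:

```python
# Estimate the energy wasted by a process stuck at 100% CPU for a week.
# EXCESS_WATTS is the measured draw above baseline; PRICE_PER_KWH is
# an assumed illustrative price, not a figure from the comment.
EXCESS_WATTS = 30
HOURS = 7 * 24          # roughly one week
PRICE_PER_KWH = 0.30    # assumed price per kWh

wasted_kwh = EXCESS_WATTS * HOURS / 1000
cost = wasted_kwh * PRICE_PER_KWH

print(f"{wasted_kwh:.2f} kWh wasted, ~{cost:.2f} per incident")
```

For one server the cost is trivial, which is exactly why such bugs linger; the waste only becomes significant multiplied across many machines.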

So who pays for that? Electricity was cheap in those days, so for the customer it didn't make much difference. But think about it globally: how much electricity, and thus money, has been wasted on stuck processes?
