We often get a question from our customers, especially those planning to use their server as a personal lab or run it out of their home: "How much will this server impact my electricity bill?"
We took some of our most popular servers and measured their power consumption. We recorded the energy used in kilowatt-hours (kWh) and multiplied it by the average cost of electricity in the United States.
The models we tested were all 8-bay rack servers from HP and Dell – 2x 2.60GHz 8-core processors, 96GB of RAM, and 8TB of SAS HDD storage – since they reflect mid-level specs. We also measured each server at both Peak and Idle. A device will never sit at Idle or Peak all the time, but you can use these figures to make the most informed decision based on your anticipated usage.
(If you live in an area where the price of electricity differs from the national average, you can calculate how much it will cost to run a server by multiplying the kWh by the rate listed on your electricity bill.)
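To make that math concrete, here's a quick Python sketch. The daily kWh figure and the electricity rate below are placeholder numbers, not our measured results – substitute the figures from the charts and from your own bill.

```python
# Re-pricing a measured energy figure at your local electricity rate.
# Both numbers below are placeholders -- substitute your own.

measured_kwh_per_day = 2.4    # hypothetical daily energy use taken from a chart
local_rate_per_kwh = 0.18     # the $/kWh rate printed on your electricity bill

daily_cost = measured_kwh_per_day * local_rate_per_kwh
print(f"${daily_cost:.2f} per day, about ${daily_cost * 30:.2f} per month")
```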
Dell & HP 2U Rack Servers
Dell & HP 1U Rack Servers
Want a more exact number for your device's energy consumption?
You can use this online tool here to calculate the total wattage drawn by your machine (you'll need to know the exact model and specs of your device's components; the more detail you provide, the more accurate the output will be).
You can then convert that output from watts to kWh with this conversion calculator here.
Take that kWh number and multiply it by the rate on your electricity bill.
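If you'd rather script those three steps than use the calculators, here's a minimal Python sketch. The wattage, running hours, and rate below are assumptions for illustration only – plug in your own numbers.

```python
# Three steps in code: total watts -> kWh over a period -> cost at your rate.
# All figures below are placeholders; replace them with your own.

total_watts = 350            # total wattage of your machine (hypothetical)
hours_running = 24 * 30      # e.g., running 24/7 for a month
rate_per_kwh = 0.14          # the $/kWh rate from your electricity bill

kwh_used = total_watts * hours_running / 1000   # step 2: watt-hours -> kilowatt-hours
estimated_cost = kwh_used * rate_per_kwh        # step 3: kWh times your rate

print(f"{kwh_used:.1f} kWh over {hours_running} hours, about ${estimated_cost:.2f}")
```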
Have a question? Want to see other guides? Leave your comments below to let us know what you think and what else you'd like us to measure!
4 comments
Thanks for reaching out, Sam. I do not agree with that statement. Using 1000 servers for one hour does cost more to run than one server for 1000 hours.
The price difference comes from the specifications of each server used to achieve the desired results. 1000 servers of even the simplest configuration would cost more to run for one hour than a single high-end server.
As for performance, spreading tasks across 1000 servers for one hour ensures that no single server is under too much load, and it allows one server to fail while others pick up the slack.
Having one server run everything paves the way for a critical failure bringing down an entire business, which would result in lost productivity and lost revenue depending on what is running on the server.
In conclusion, running one server for 1000 hours versus 1000 servers for one hour does come with cost differences, both initially and in the long run, in terms of failure prevention and productivity loss.
Thank you so much for the positive feedback, Greg. We’re so glad you’re enjoying TechMikeNY and really appreciate you taking the time to share your thoughts with us!
So, how accurate do you think this statement is:
“…Moreover, companies with large batch-oriented tasks can get results as quickly as their programs can scale, since using 1000 servers for one hour costs no more than using one server for 1000 hours.”
https://www2.eecs.berkeley.edu/Pubs/TechRpts/2009/EECS-2009-28.pdf
I just wanted to drop a comment to say that you guys are doing a phenomenal job. Your prices are very competitive and your website is top notch. Keep up the good work!