I am very bad at this so would welcome any help:

Assuming 8 hours of 4K gaming a day, at 27p/kWh, what is the daily/monthly/yearly cost?

  1. You’d need to know the power draw of the card, but let’s assume 300 W as I think that’s about right; call it 333 W to make the maths easy.

    At 333 W it’ll use a kWh every 3 hours, so at 27p/kWh the card costs about 9p an hour to run. Multiply that out for whatever timescale you like.

    In reality that ignores the rest of the system, so it’s probably more like 15p an hour.
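    The arithmetic in this comment can be sketched as below; the 333 W figure and 27p/kWh tariff come from the thread, while the 550 W whole-system draw is my guess to make the "15p an hour" figure work out:

    ```python
    # Rough per-hour running cost; 333 W card and 27p/kWh are the comment's assumptions
    tariff_p_per_kwh = 27
    gpu_kw = 0.333

    gpu_cost_per_hour = gpu_kw * tariff_p_per_kwh       # ~9p/hour for the card alone
    whole_system_kw = 0.55                              # my guess to match "15p an hour"
    system_cost_per_hour = whole_system_kw * tariff_p_per_kwh

    print(f"GPU only: {gpu_cost_per_hour:.0f}p/hour")
    print(f"Whole system: {system_cost_per_hour:.0f}p/hour")
    ```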

  2. A quick Google gives me 250 W average (350 W max), so over 8 hours you’ll use 2 kWh per day, plus a little extra because your power supply won’t be 100% efficient.

    About 60p per day.

    £18 a month.

    £219 a year.

    That’s just the graphics card, not the rest of the system.
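    The figures in this comment fall out almost exactly if you pencil in roughly 90% PSU efficiency (my assumption; the comment only says "a little extra"):

    ```python
    tariff = 0.27         # £ per kWh, from the question
    avg_kw = 0.25         # 250 W average draw, from the comment
    hours = 8
    psu_efficiency = 0.9  # my assumption; the comment only says "a little extra"

    daily_kwh = avg_kw * hours / psu_efficiency   # ~2.2 kWh at the wall
    daily = daily_kwh * tariff                    # ~£0.60 per day
    monthly = daily * 30                          # ~£18 per month
    yearly = daily * 365                          # ~£219 per year
    print(f"£{daily:.2f}/day, £{monthly:.0f}/month, £{yearly:.0f}/year")
    ```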

  3. For simplicity’s sake we’ll model your entire computer as 750 W, including a 32″ monitor.

    In 8 hours you’ll use 8 × 0.75 kWh of energy, which is 6 kWh.

    Per day that is 6 × 27p ≈ £1.62, or around £600/year.
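    The same whole-system estimate as a quick sketch (750 W and 27p/kWh are this comment’s assumptions):

    ```python
    tariff = 0.27      # £ per kWh
    system_kw = 0.75   # whole-system estimate from the comment, monitor included
    hours = 8

    daily_kwh = system_kw * hours        # 6 kWh per day
    daily_cost = daily_kwh * tariff      # £1.62 per day
    yearly_cost = daily_cost * 365       # ~£591, i.e. "around £600" a year
    print(f"{daily_kwh} kWh/day -> £{daily_cost:.2f}/day, £{yearly_cost:.0f}/year")
    ```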

  4. Are you planning on gaming for 8 hours solid a day at 4K? If so then some of the estimates given already are probably about right. If not, the power draw is likely to be much less. I have the same card, and under heavy load it currently draws between 150 and 200 W. It really depends on how hungry the game is; with it being 4K I’d expect it to be pulling a fair bit.

    Edit – it should kick out a bit of heat too, every little helps!

  5. It’s worth mentioning that these figures will be somewhat of an overestimate in the winter, because you’ll be able to turn the heating down thanks to the heat the PC puts out. All but a rounding error of that electricity turns into heat inside the house, so you’re effectively only paying the difference between the cost of your heating and the cost of electricity.

  6. A few people have hinted at it, but a PC turns all its energy into heat (well, some becomes sound or light, but that degrades into heat too), so at this time of year you could argue it is 100% efficient.
