Is there a way to find the actual cost of computing a given task in terms of electricity consumption, or is this a solved problem where hosting companies just add their proprietary sauce on top and charge for the end result?
I’m asking because I’d love to be able to host for clients in my geographical area, but I don’t want to charge more than what it actually costs to run the infrastructure.
At the core of my question is understanding the basic economics of hosting in relation to electricity consumption and hardware wear. The outer layer of my question is really about knowing the true cost of hosting, rather than how to extract the most from clients for simple services.
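
For what it’s worth, my naive mental model so far is just average wattage × runtime × my local electricity rate, plus a linear amortization of the hardware’s purchase price over its expected lifetime to cover wear. Here’s a rough sketch of that idea; every number in it is made up for illustration, and in practice I’d measure the actual wall-socket wattage with a power meter:

```python
# Back-of-envelope cost model for running one task on my own hardware.
# All values below are placeholders: a measured average wall-socket
# wattage, my local electricity rate, and a guessed hardware lifetime
# used to amortize wear linearly.

def task_cost(avg_watts: float, hours: float, price_per_kwh: float,
              hardware_cost: float, lifetime_hours: float) -> float:
    """Estimate the cost of a task: electricity plus hardware amortization."""
    electricity = (avg_watts / 1000.0) * hours * price_per_kwh  # kWh used * rate
    wear = (hardware_cost / lifetime_hours) * hours             # linear amortization
    return electricity + wear

# Example: a server averaging 150 W for 24 h at $0.12/kWh,
# amortizing a $2,000 machine over 5 years of continuous operation.
print(task_cost(avg_watts=150, hours=24, price_per_kwh=0.12,
                hardware_cost=2000, lifetime_hours=5 * 365 * 24))
# -> roughly $0.43 electricity + $1.10 wear per day
```

Is it really this simple, or am I missing major cost factors (cooling, bandwidth, failure rates, idle overhead)?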
Appreciate any and all input!