Imagine if your fridge could do this: freeze extra cold when the sun is shining (or the wind is blowing), skip running the compressor when it isn't, run the blower only after you open the door to circulate that extra cold from the freezer, allow a slightly larger temperature range, and of course run as necessary to avoid spoilage. It's not a simple algorithm; it has to handle various timeframes, such as solar being a daily cycle, except there's less of it in winter, and it can go for a week or more with very little (storm/overcast). Maybe it could also use a bit of "learning", like the Nest thermostats, to optimize predicted usage.
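A minimal sketch of that idea, assuming entirely made-up sensor inputs and thresholds (no real fridge exposes this API): widen the setpoint band, bank extra cold when solar is abundant, coast when grid power is expensive, and always run before food safety is at risk.

```python
# Hypothetical solar-aware freezer controller. All names, thresholds,
# and units are illustrative, not any real product's interface.

def target_temp_c(solar_kw: float, grid_price: float) -> float:
    """Pick a freezer setpoint inside a widened safe band (-24..-15 C)."""
    if solar_kw > 2.0:      # abundant solar: bank extra cold now
        return -24.0
    if grid_price > 0.30:   # expensive grid power: coast on stored cold
        return -15.0
    return -18.0            # normal operation

def compressor_on(current_temp: float, setpoint: float,
                  max_safe_temp: float = -12.0) -> bool:
    """Run if above setpoint; always run before spoilage becomes a risk."""
    return current_temp > setpoint or current_temp > max_safe_temp
```

The "avoid spoilage" clause is the hard override: whatever the price signal says, the safety bound wins.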
I know of one commercial product that sort of does this: the Zappi electric car charger. If you have grid-tied solar, it measures the current being fed back to the grid and adjusts the charging current to match. So if a cloud goes over your house, or you turn on a big appliance, the charger reduces the power to the car by the same amount. This maximizes the use of your own solar energy and minimizes the use of grid energy.
I've been posting for years that an effective grid "battery" is internet connected refrigerators, water heaters, A/C, car chargers, etc., that only run when power is cheap, i.e. when solar/wind is providing excess power.
A great deal of our demand for electricity is elastic and shiftable, which will eliminate a huge chunk of the need for grid batteries.
Glad to see this finally gaining some traction!
I think people have a strong bias toward thinking the things they see and touch during the day are environmentally important. But most "pollution"/energy use happens out of sight from our everyday lives.
Or so I think. Happy to be disproven!
Does this exist anywhere in the world?
Also, freezing it more during the day doesn't let me freeze less at night. If I don't open the door at all, the insulation is already taking care of this. What am I missing?
Basically they've plugged into the ISO as a "generation" resource, and when the price of power goes high enough, they say "okay we can produce that much power", and they have all their users reduce that much power, which has the same effect. They get paid for the power, and pass some of it along to their users.
The article says they are “shifting the timing of our compute tasks”, so if they think that there will be cheap electricity later in the day (because it’s going to be especially windy or something) it would make sense for them to schedule some of their heavy compute tasks at that time, rather than right now.
If you have the space, you can put thermal mass (e.g. water) between the cooler and produce, acting as a cheap thermal battery dampening temperature oscillations. It's often done in off-grid situations.
There’s a huge lie (by omission) about renewables: nobody has explained how to convert the world to 100% renewable energy without a coal backup.
Given nuclear’s inflexibility, shifting deferrable work to the hours when less electricity is needed for time-critical uses is also a win.
I live in Scotland. We now get around 90% of our electricity from renewables. We closed our last coal power station in 2016.
Perhaps you should consider checking your theories against reality before accusing others of lying?
One way to do it would be to assign various jobs a value (which could be dynamic, e.g. a job might become more important as its information goes stale) and have them bid on compute power. The value could be virtual.
Or you could use real money. This is the premise behind EC2's spot instances. So when power is abundant, your prices drop and the relevant jobs kick off.
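A toy sketch of that bidding model, with made-up job names and dollar figures: each job carries a base bid that rises as its output goes stale, and it runs only when the bid clears the current spot price.

```python
# Illustrative job-bidding sketch. Prices and the staleness multiplier
# are invented for the example, not taken from any real spot market.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    base_bid: float          # value per CPU-hour, in dollars (made up)
    hours_stale: float = 0.0 # staleness raises urgency, hence the bid

    def bid(self) -> float:
        return self.base_bid * (1 + 0.1 * self.hours_stale)

def runnable(jobs: list[Job], spot_price: float) -> list[Job]:
    """Jobs whose current bid clears the spot price get to run now."""
    return [j for j in jobs if j.bid() >= spot_price]
```

When power is abundant the spot price drops, more bids clear, and the deferred work kicks off on its own.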
Using real market prices makes sense especially if you're renting out computing power, most customers will be happy to adjust workloads to save money.
Even if it's entirely internal, it's good to have a facility to "optimize for cost" and then report the savings. That's helpful to get the engineering resources devoted towards it, because "I saved $X" is a great bullet point to put in anyone's promotion packet or to base a bonus on.
It's not the value of the outcome of that job that you're interested in, but rather its sensitivity to delay in executing it.
For example, preemptively converting Youtube videos to a lower resolution with optimum compression to avoid having to do it in real-time (when video is played) at a crappy compression (to be fast), is valuable for sure. It's just that it can be postponed for 24 hours without real impact. Executing a search for a single user is less valuable in terms of overall impact but much more latency-sensitive.
(you can think of value and latency-sensitive in terms of two dimensions that are independent between them.)
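Those two independent axes can be sketched as a tiny decision rule (the threshold and labels are illustrative): a latency-sensitive job like a user search runs immediately regardless of power price, while a tolerant job like batch transcoding waits for a cheap window until its deadline looms.

```python
# Sketch of scheduling on two independent axes: latency-sensitivity
# and deadline. Value doesn't appear at all, which is the point:
# a high-value batch job can still wait; a low-value search cannot.

def schedule(latency_sensitive: bool, deadline_hours: float,
             cheap_power_now: bool) -> str:
    if latency_sensitive:
        return "run_now"        # user-facing: delay is the real cost
    if cheap_power_now:
        return "run_now"        # tolerant job, but power is cheap anyway
    if deadline_hours > 1:
        return "defer"          # wait for a cheaper/greener window
    return "run_now"            # deadline looming: run even on dirty power
```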
This idea helps save the planet for sure, but it requires cloud-providers to build APIs that enable devs to switch from the "here's the SSH to the server, do what you want with it" to a model where it's the devs that say instead "here's a lambda function and its desired latency execution, please schedule to run it for me and let me know when the result is ready" ( https://en.wikipedia.org/wiki/Inversion_of_control )
Google was able to do that because it owns a large part of the jobs executed in their datacenters. Hence they could build this adaptive scheduling for their own jobs quickly without necessarily passing through a cloud-based API that inverts the control of job scheduling.
As a customer I think you could configure something like this using spot instances on AWS, but that’s it; you’re going to save some small amount of dollars in a year, and if you account for the engineering hours needed to set it up, maybe it’s not really worth it.
As a cloud provider you could juggle your clients between datacenters depending on the load and the price of energy there. A flat rate for a cloud region means there’s an opportunity for arbitrage between datacenters that could mean thousands in extra profit on their side.
The future doesn't always need to be as "webscale" as Google; sometimes, scaling down is the smart thing to do. The minimal approach of LTM [0] is the technology equivalent of riding a bicycle (or electric velomobile [1]) to work instead of driving.
[0]: https://solar.lowtechmagazine.com
[1]: https://solar.lowtechmagazine.com/2012/10/electric-velomobil...
LTM's approach required producing, assembling and shipping a full 2GHz/1GB computer, plus PV, PV-controller and router, all to serve a single site. And it's even turned off some of the time!
Google, on the other hand, is more like a fleet of trains; sure, each one is a honkin' beast, but it also transports thousands of passengers/sites at once, possibly millions in its lifetime.
The bicycle analogy doesn't really work, because a bicycle is just a performance attachment to the real vehicle: the human.
This way you can have 24/7 fully green content delivery to consumers.
Although, that being said, we could just try to cover the earth with generators everywhere and then fully connect the grids.
I guess it's okay as long as the people making the rules have good monitoring and are watching out for weird exploits and fixing them. The flexibility to change the rules tends to be more common internally than externally where customers want more guarantees.
As we've seen, there also needs to be a balance between cost-optimization and preparedness. If the wind patterns don't match the prediction then you need to be ready for that.
Also, as we've seen with cryptocurrency, real money attracts theft. A human-adjusted credit system is better. In the real world, this looks like support having the discretion to forgive big bills. But to do that they need to know their customers. It's hard to automate.
Our hypothesis is that market signals combined with the right tools (friendly app and home automation) can help households shift demand into less carbon intensive periods.
So far it's working pretty well.
It would also be pretty neat to integrate processing power markets with the wholesale energy markets. Energy prices are quite volatile and making load responsive to that would actually be quite helpful to stabilize them.
I've wanted to have realtime pricing like that for a while, it seems to be becoming available again.
I honestly thought that was what the advanced electricity meter roll-out was going to do; but it seems not.
More direct energy cost to service price charged seems like a good thing in general.
It's goofy, but another one is situations where you have a lot of stored heat energy: pools, hot tubs, water heaters, etc. All of those could be activated in response to spot pricing with pretty simple policies (I want a shower of at least X degrees at 7am, I want the hot tub at at least Y degrees by 9pm, etc.).
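A sketch of one of those simple policies, with invented numbers throughout (heating rate, prices): heat when spot power is cheap, but always heat in time to satisfy the user's deadline constraint.

```python
# Toy price-responsive water-heater policy. The cheap-price threshold
# and heating rate are illustrative assumptions, not real appliance data.

def should_heat(temp_c: float, target_c: float, hours_to_deadline: float,
                spot_price: float, cheap_threshold: float = 0.10,
                heat_rate_c_per_hour: float = 5.0) -> bool:
    if temp_c >= target_c:
        return False                  # constraint already satisfied
    if spot_price <= cheap_threshold:
        return True                   # cheap power: bank heat now
    # Expensive power: heat anyway only if waiting would miss the deadline.
    hours_needed = (target_c - temp_c) / heat_rate_c_per_hour
    return hours_needed >= hours_to_deadline
```

The user states only the outcome ("at least X degrees by 7am"); the timing is left entirely to the price signal.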
Texas energy markets are where all the fun on this front is really happening.
Best thing. Then you incentivize a cleaner grid overall and you don’t even have to worry eventually about this kind of thing.
If there is also a dynamic price for using the grid, that usage will also spread.
This Dell paper[0] suggests that 16% of the carbon over a typical server's lifecycle comes from manufacturing, so you probably don't want a server sitting there unused for 23 hours per day, since the overall carbon/compute ratio would be worse.
The post doesn't mention this metric, but it would be really nice to see something more detailed in time - especially with this overall efficiency of the server/datacentre lifecycle in mind, rather than just energy consumed from use.
[0]: https://i.dell.com/sites/csdocuments/CorpComm_Docs/en/carbon...
Assuming the server is "sitting unused for 23 hours a day" is the wrong model for what this work changed. You're assuming the server could be running at a 50% duty cycle vs. a 100% duty cycle. It isn't: since we're talking about the batch load, there's a roughly fixed amount of low-priority work to be done, and doubling the amount of CPU active-duty time allotted to that work doesn't get it done faster (the details are complicated, but that's the right model for what Google's describing here). One should model the duty cycle as fixed relative to the processor (i.e. "this global datacenter architecture, over the course of its life, will do a fixed N units of work on these batch tasks") and then ask whether that work should be done using coal to power the electrons, or wind.
If Y = 2 and only 16% of the carbon in a typical coal-powered computer's lifetime is from the manufacture, then solar makes sense - solar is 2*16% = 32% of the carbon of coal. But if Y = 10 - so it's running 10% of the time, meaning there need to be 10x as many computers built - and 16% of the carbon is from the manufacture, then solar power is actually worse for the environment than coal power: solar takes 60% more carbon than coal power.
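The comment's arithmetic, made explicit (same simplification as above: solar's operating carbon is treated as roughly zero, so only manufacturing counts). With m the fraction of a coal-powered server's lifecycle carbon that comes from manufacturing, and Y the multiplier on how many solar-only servers you need, the solar fleet costs Y*m of the coal baseline, and the breakeven sits at Y = 1/m.

```python
# Worked version of the Y = 2 vs. Y = 10 comparison in the comment.
# m = manufacturing fraction of a coal server's lifecycle carbon.
# Y = how many part-time solar servers replace one always-on server.

def solar_vs_coal_carbon(m: float, y: float) -> float:
    """Solar fleet carbon as a fraction of the coal server's lifecycle."""
    return y * m

m = 0.16
print(solar_vs_coal_carbon(m, 2))   # ~0.32: solar wins
print(solar_vs_coal_carbon(m, 10))  # ~1.6:  solar loses
print(1 / m)                        # ~6.25: breakeven duty-cycle factor
```

So with a 16% manufacturing share, solar-only operation stays ahead of coal as long as you need fewer than about 6x the servers.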
Of course, this is a vastly simplified situation, but it points to the idea that we need to at least consider the carbon cost of manufacturing.
Part of that calculation should be the amount of compute capacity headroom you'd choose to have anyway even if you didn't care about carbon.
Compute demands can vary from one day to the next. Maybe tomorrow people uploaded 3 times as many YouTube videos as they did today. Maybe load varies based on day of the week or day of the month. To some extent, you can smooth that out by delaying jobs, but there are practical limits.
You also want some spare capacity just for safety. Efficient utilization is important, but things like performance regressions or spikes in demand can happen.
- Spawn nightly regressions when wind power starts to pick up, instead of at some arbitrary wall clock time
- Dispatch compute-heavy jobs during low energy cost times; dispatch IO-heavy or memory-limited jobs during high cost times.
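The two dispatch rules above can be sketched roughly like this, assuming a hypothetical renewable-supply forecast as the trigger instead of a fixed wall-clock time (the 0.6 threshold and job kinds are invented for the example):

```python
# Sketch of renewable-triggered dispatch: compute-heavy jobs launch only
# in green windows; IO-heavy jobs are allowed to run anytime.

def pick_jobs(renewable_fraction: float, queue: list[dict]) -> list[str]:
    """queue items look like {'name': str, 'kind': 'compute' or 'io'}."""
    if renewable_fraction > 0.6:
        kinds = {"compute", "io"}   # green window: run everything
    else:
        kinds = {"io"}              # dirty window: only low-power jobs
    return [j["name"] for j in queue if j["kind"] in kinds]
```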
Am I crazy or is this website capturing down-button clicks and ignoring them? I typically use down and up to slowly scroll as I read an article. This page is driving me nuts.
Surprised to see that get through QA. (Up arrow works just fine.)
There's one other keyboard event listener, but it doesn't appear to do that.
Regardless of this change, I wonder if they share their forecasted non-renewable energy needs with their energy supplier so that the energy supplier can prepare for changes to the expected base load.
Do any factories or other energy intensive operations do this?
Aluminum Foundries in particular are extremely power intensive and have been run during off-peak times (or are built in areas with cheap plentiful electricity like nearby hydro-electric dams).
Still, I'd love to see this concept made a lot easier for the average consumer. Many people already have smart thermostats; why can't mine talk to my power company and over-heat/cool when the impact is lowest? Why can't my dishwasher run automatically when it would impact the world the least? Why can't my EV automatically charge when power is most available?
I know most of those things are possible, but they sure as hell aren't easy, and IMO they won't truly have an impact until they're on by default and don't require the user to do much of anything.
These things seem like they are easily doable, but we just need the different industries to work together to come up with ways to have all of this stuff interoperate.
Fun fact: this is a big reason why aerospace congregated in the pacific northwest during WWII (eg Boeing in Seattle).
Aluminum is key to aircraft because it's lightweight, and at the turn of the century the US went on a dam-building spree with a lot of hydro (ie large consistent baseload) being located in the pacific northwest.
That's the great thing about standards -- there are so many to choose from!
If you're a large consumer of energy and can turn that consumption on or off at short notice (on the order of seconds) then the grid operator will pay you to allow them to scale your consumption up or down.
The classic example of this is cold storage. If you have a warehouse full of freezers which need to be kept within a certain temperature threshold then it doesn't really matter when you run the freezers and you could switch off at several points during the day.
Given how much electricity a datacenter consumes, Google surely must have a direct support contact at the electricity provider, and they had better start working both ways, if they don't already.
Quick math: a datacenter is 60,000 servers, so 6 MW of consumption at 100 W per server (moderate load). That's roughly 1% of the peak output of a nuclear reactor. You bet the electric company wants to know when they need to adjust their reactors.
Seems like getting rid of the middleman at a basic, physical level. Power is available for very low cost at certain times. So let's time-shift computation that doesn't have to be done at a specific time! Really, it's just the same trick as time-shifting your EV charging and other power draws. It costs money to run battery banks and inverters. Let's just take them out of the process, where we can.
"The best part is no part." -- Elon Musk
It sounds like this project is for their own operations. Have you guys thought about how to offer this as something closer to a turnkey cloud-ops SaaS/API? What kinds of abstractions would you present to developers building non-time-critical compute loads?
Could be a great differentiator for GCP vs. AWS (I have heard of some companies choosing GCP over AWS due to Google's green energy cloud). And for you guys, the only thing better than Google being a customer is all of Google's customers being your customers.
If a data center has (roughly, to a first-order approximation) fixed compute capacity at any point in time, and we assume that any capacity not being used by Google themselves is made available to GCP, then wouldn't Google reserving the "green" hours for themselves drive the remaining "dirty" hours onto the GCP spot markets?
Is there a cloud market design that addresses this tension between maximizing utilization and having desirable or 'premium' compute hours?
I wish there would be more countries covered (in particular Switzerland), but I guess you depend on the live data being provided in these countries.
Unfortunately, the crux is data availability and reliability from system/transmission operators. For example, there is no online data available for the Northern Territory of Australia, so you can't build a parser for it. Some data providers have frequent data outages for different regions (ENTSOE), as there is no SLA or contractual obligation for providing data reliably.
If you're aware of a region that has live data available and is not yet live on electricitymap.org, please consider contributing a parser [1]! If you live in a region without live data, please consider politely requesting such data be made available through utility and system operator contacts, or explore requiring such data be made available by law (if public policy is your thing).
[1] https://github.com/tmrowco/electricitymap-contrib#adding-a-n...
Would you mind commenting on what your tech stack is like? Looking at your github repo it seems like you're combining a lot of data sources. Can you comment on your approach? Also, considering this collaboration, are you running on GCP?
Also, feel free to join our Slack at https://slack.tmrow.com to ask your questions!
I personally think the documentary doesn't go far enough on the cultural aspect. I'm speaking specifically of the social norms of conformity, sociopathy and narcissism in the decision-making classes. We have fucked future generations for short-term gains, and the only hope the tech community can come up with is to pretend that we will build rockets to go to Mars. When the kids of the billionaires pushing this horseshit get preventable cancers or tangled up in a class war, maybe then things will change (though I doubt it). Until then, it's all bean bags, idiocy, shiny tech toys and fluff. Enjoy it while it lasts. The magic bullet of nuclear fusion doesn't seem likely, and neither does the social organization to leverage what we have in a responsible manner. We will keep doing what we are doing until nature bitch-slaps us and we have to change. Hopefully I'm wrong. I want to be.
Solar and wind on the grid increase supply, which should drive down the price per kWh (of course, the equation isn't quite that simple, since demand in most of the world near human population centers is also highest during the day).
Big users such as Google with its datacenters will of course negotiate their own electricity contracts. I think renewables are the cheapest to buy right now, so by moving load around to maximize use of cheap renewable electricity, they will definitely save money.
What I mean by this is that instead of deciding "I want to drive 200 miles to the beach" and buying a tank of petrol, you would instead wait for favourable wind/solar conditions in order to "save up" the energy you need such that you can afford to drive to the beach. If you are unfortunate one year you might only end up with half of what you need, but you'll still be able to do something.
This goes for things like food too. Stop demanding the same food year round. Instead work with the seasons and eat what is available locally at that time of year.
This would be such a huge boost to happiness. You can't see light if it's light all the time. We just don't know how great our lives are because we simply expect it to all be available at all the time. Expectations are simply assimilated and become invisible very quickly. Not only that but it turns out that meeting these expectations comes at a huge price. Let's instead take what nature gives us, but no more.
Seems some people at Google still haven't gotten the memo that the "not evil" days are now a thing of the past. This looks amazing, and more like something I would expect from the old Google.
A more interesting measure would be the actual reduction in CO2 emissions.
Yes, there are valid criticisms. Wind turbines are made of unrecyclable fiberglass. It takes energy to build them (truck rolls to the site, concrete for the foundations), and it's important to make sure the energy return on energy invested is net positive. We use fossil fuels to produce these renewables technologies. That's all true, but not insurmountable.
They say battery storage makes up only a tiny percent of the capacity needed to overcome renewable intermittency. Sure, but that omits how solar has dropped two orders of magnitude in price over the last few decades as we've built more of it and gotten better at making panels (the "learning curve").
It follows a group of Vermont hikers hiking to a wind turbine site and then being NIMBY about it, but none of them talk about where their energy SHOULD come from.
Look, it raises a lot of critical questions. But it also seems to expect a single magic pill that just doesn't exist. Two-thirds of the way through, they talk about the misrepresentations in biomass and point out how many organizations seem to be both for it and against it. "Which side are they really on?" says the classic accusatory documentary voiceover with scary music. Well, it's complicated! Clearly you don't want to burn all the forests at once. And yes, if you burn pressure-treated wood, those chemicals go into the local community. At the same time, wood does grow back. The nuance that's missing in this documentary is questions like "how many acres of rotationally harvested woodlands are required to power a 1 MW biomass plant sustainably in perpetuity? And can such projects exist in practice?"
Biomass isn't a panacea, and the HN startup mindset of "can I scale up a technology to dominate everything" doesn't apply, because biomass has limits to its scalability. It's just one of many tools, and the problem with this documentary is that it can't envision a future where many tools are used together. When a Sierra Club exec is questioned about biomass, they kept the part where she says their "position is nuanced", but then they cut to something else without explaining that nuance. That's lazy documentary filmmaking.
The complicated thing about energy is there is no silver bullet. This documentary finds the bad in each technology without considering how all the pieces could fit together. It presents the bad sides of each technology as if that should disqualify the tech instead of asking how can we improve each over time. There aren't easy answers to these questions, but this documentary just wallows in how bad everything is without asking the hard questions about how things can be made to work or what the alternative of doing nothing is.
These turbine blades can be broken down into pellet insulation or used as feedstock for cement kilns. It's a supply chain and economic incentive issue, not an unsolved technology issue.
I can't speak to Moore's beliefs, but his documentaries (IMHO) are designed to inflame, not to have an intelligent discussion about complex problems that require complex solutions. They are "clickbait" disguised as objective information.
This is beyond depressing. All cope and hope peddling.
"Here's a barge full of coal. Maybe you can fix it with that."
"Google: Data centers now perform LESS when the sun is not shining or the wind is not blowing"
My point is that by tying performance to environmental factors, you get a boost when things are great but then can have troubles when things are not great. Anyone familiar with solar panels already knows this, but if the correlation is obscured, it could be surprising. The article didn't mention a specific performance gain, but if we say you get an X% performance gain when the sun is out, it also means you get a similar X% performance loss when the sun is not out. Users of the system will get used to the improvement, which becomes the new standard, and then a particularly dreary season comes in with weeks of cloud cover, and suddenly there is concern about the degradation of service.
(Like I said, it's still a good idea, it's efficient use of resources, but the PR is funny, that's all.)
They've worked so hard to sell their AI solutions to the fossil fuel industry, lately, so they can help them extract and burn more oil and gas[0].
[0] https://www.vox.com/recode/2020/1/3/21030688/google-amazon-a...
Who said they'd burn it for fun? Necessity suffices.
>In Russia, one of the world’s top producers, the industry is considering resorting to burning its oil to take it off the market, sources told Reuters
Source: https://www.reuters.com/article/us-global-oil-turmoil/when-o...
How can Google, while helping others extract as much oil as possible, still pretend it may not end up being burned?
It's like a drug dealer saying "I'm definitely not helping anyone take drugs. I'm only selling them; they can store them somewhere if they please. Don't blame me."
Unfortunately, an unintended side consequence of these kinds of efforts (unless you're very conscientious about maintaining the correct incentives, generally through pricing) is that the gains in energy efficiency and savings are sometimes clawed back by an increase in overall energy consumption, because it's gotten effectively cheaper to run the same number of compute cycles.
Just like with energy efficient LED light bulbs, although the overall energy use goes down, often it doesn't go down as much as it could have ideally, because people start lighting places that didn't have light before, because it's gotten so much more affordable to do so!
Or like when you add highway lane capacity, traffic gets worse...
Or in this case, the Google video engineers come up with new useless filters and resolutions to occupy the newly freed-up compute capacity.
Just something to be aware of. The people who do this have to monitor and put in place controls so that the outcome is what they intended. Otherwise people are more clever than you think.
LEDs have to be one of the worst possible examples for claiming induced demand is a bad thing, given that the efficiency gains outstripped the proliferation of additional always-on devices and a cellphone per person.
While induced demand may exist, it too saturates, with diminishing returns.