When COVID-19 started wreaking havoc around the world and forced governments to implement lockdown measures in an attempt to slow its spread, daily life changed significantly for most people. Business operations of all kinds were disrupted — offices closed, non-essential brick-and-mortar stores shut down, and jobs were lost — and panic spread as a result.
On the whole, though, it’s remarkable how well organisations everywhere (private or public) have adapted to these difficult circumstances, and it’s mostly due to the convenience and ubiquity of the cloud. Without it, the sudden move to remote working would have been too much for many: if this pandemic had struck 15 years ago, the effects would have been very different.
Once remote working had become standard practice and road traffic had dropped enormously across the board, you probably saw the stories about pollution levels falling drastically and air quality improving correspondingly. People started to focus on the positives of the lockdown: once the pandemic is eventually beaten, surely we should stick with this new approach to business.
There is one major concern that the new business world has highlighted — and that’s the energy use of cloud technology. In this post, we’re going to consider how sustainable the cloud really is, and what can be done (or is already being done) to make things better:
The drain of early blockchain technology
Concerns about the energy use of technology really hit the mainstream when cryptocurrencies started to pick up steam and people everywhere invested heavily in crypto mining. High-powered GPUs were set up to run at full capacity around the clock, all in the hope of turning a profit, and environmentalists were understandably frustrated. The proofs being generated weren't valuable outside of finding and allocating coins: it was purely about making money.
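To see why that mining burned so much electricity, here is a minimal sketch of the proof-of-work idea behind it: the miner brute-forces a nonce until a hash meets a difficulty target. The function name and difficulty scheme here are illustrative, not taken from any real mining client.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force a nonce so that SHA-256(block_data + nonce) begins
    with `difficulty` zero hex digits. Every failed attempt is wasted
    computation -- this loop is what kept those GPUs running 24/7."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # the "proof": worthless outside the coin protocol
        nonce += 1

# Each extra zero of difficulty multiplies the expected work by 16.
print(mine("example block", difficulty=4))
```

The hash output is effectively random, so the only strategy is trial and error; the energy cost scales directly with the difficulty, which is exactly the property environmentalists objected to.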
Over time, things began to change.

View Entire Article on ComparetheCloud.com