🍃 adding environment protection KPI and audits (like CO2 footprint) #12548

Closed
willi84 opened this issue May 23, 2021 · 19 comments

willi84 commented May 23, 2021

Feature request summary
Adding one or more KPIs to audit the environmental footprint.
Measuring, for example:

  • the footprint of executing the website
  • the footprint of using CDNs, compared against usage analytics where possible
  • reducing battery, connection, and other resource usage
  • reducing connections to large-scale backends with a bad footprint
  • analyzing bad deployment and cloud setups
  • unnecessary use of scripts where server-side rendering or caching would be a better option (and how much that would save in cost)
  • ...

What is the motivation or use case for changing this?
The Web is scaling exponentially, so our footprint is scaling with it. A lot of people don't know what kind of footprint we have.

How is this beneficial to Lighthouse?
Contributing to a green web.

ref: #10269 as possible duplicate

@mgifford

This is an incredibly important issue.

Leveraging the https://www.thegreenwebfoundation.org/green-web-check/?ip would be useful.
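
For anyone who wants to experiment, here is a minimal sketch of wiring that check into a Node script. The endpoint path and response fields are assumptions based on the Green Web Foundation's public greencheck API, so verify them against the current API docs before relying on this:

```ts
// Hypothetical sketch: ask the Green Web Foundation whether a host is "green".
// Endpoint and response shape are assumptions; check the current API docs.
interface GreenCheckResult {
  url: string;
  green: boolean;
  hosted_by?: string;
}

async function checkGreenHosting(hostname: string): Promise<GreenCheckResult> {
  const res = await fetch(
    `https://api.thegreenwebfoundation.org/api/v3/greencheck/${encodeURIComponent(hostname)}`
  );
  if (!res.ok) throw new Error(`greencheck failed: ${res.status}`);
  return (await res.json()) as GreenCheckResult;
}

// Usage:
// const result = await checkGreenHosting('example.com');
// console.log(result.green ? 'green host' : 'no evidence of green hosting');
```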

Other examples include https://www.websitecarbon.com/

The CDN issue is an interesting one, @willi84. So often they can hide the servers used to run the service.

@drydenwilliams

The upcoming version of EcoPing gives loads of wonderful information like this. We've created an EcoScore which is built like the different Lighthouse audits (Performance, SEO, etc.) to give your websites a score out of 100. Along with this we show you the Performance, Best Practices, and top tips to improve them. These audits run every day on the sites you add to your dashboard, so you can see your site changing/improving over time.

We also use live electricity grid information to see how renewable your website's resources are:

[screenshot: EcoPing renewable-energy view]

You can even add different pages for a website and compare them too:

[screenshot: page comparison]

@mgifford

Pretty neat that you can compare where the different files are loaded from.

willi84 (Author) commented May 24, 2021

Nice stuff. It would be awesome if we could also compare more aspects, not just CO2 or electricity.
Also:

  • use of an old, inefficient OS on the server
  • use of a dedicated server when traffic/analytics says there is no need for one
  • use of scripts where there is no need for them (needs better code analysis) and they could be replaced with server-side rendering
  • bad deployment files (if publicly accessible via a referenced GitHub/GitLab project)

As far as I know we can't read the server from the outside, so we don't know anything about:

  • old, inefficient servers
  • way too many server changes without recycling old components
  • unhealthy production standards

But if I could make a wish, I would love it if Google could provide a protocol that lets the user analyze the server setup (maybe in collaboration with big hosting/cloud providers), so that:

  • you know your server is too old
  • it tells you that your cloud setup is inefficient for your needs
  • inefficient deployments can be analyzed better

@mbarker84

Having widespread visibility of these metrics through a tool like Lighthouse is really important for building a greener web. In my experience, many clients (as well as developers) use Lighthouse to see how their site is performing, so it could trigger conversations and motivate change that might not otherwise happen. A top-level eco score (alongside the ones for Performance, Best Practices, etc.) along with a breakdown of areas for improvement (like the ones mentioned in the other comments here) would be great.

radum (Contributor) commented Dec 9, 2021

As far as I can see, most of those tools use the same study from 2017, and the main metric is the number of MB over the wire. I think this is flawed: you can't really measure the carbon footprint using that metric. My home router uses the same electricity whether I watch Netflix or do normal browsing, and watching movies consumes way more data (I checked).
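
For context, this is roughly what the bytes-over-the-wire model behind tools like websitecarbon computes. The constants below are illustrative assumptions, not authoritative figures, which is exactly the weakness being pointed out here:

```ts
// Rough sketch of the common "bytes transferred" carbon proxy.
// Both constants are illustrative assumptions, not authoritative figures.
const KWH_PER_GB = 0.81;        // assumed whole-system energy per GB transferred
const GRID_G_CO2_PER_KWH = 442; // assumed average grid carbon intensity

function estimateTransferCO2Grams(transferredBytes: number): number {
  const gigabytes = transferredBytes / 1e9;
  const kWh = gigabytes * KWH_PER_GB;
  return kWh * GRID_G_CO2_PER_KWH;
}

// e.g. a 2 MB page load:
// estimateTransferCO2Grams(2_000_000) ≈ 0.7 g CO2 per view under these assumptions
```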

Also, more and more new mobile devices are coming online every day and the global network has not imploded. Networking equipment also improves constantly, at various adoption rates.

So I don't think it is that easy to measure this. It also matters a lot how a website is hosted and how it works, with caching and edge caches being very important things to consider.

Netflix, for example, works with ISPs all over the world to host their content near the location of their users, dramatically reducing the impact on the network.

What I am trying to say is that this is a problem we have yet to properly understand and be able to measure.

willi84 (Author) commented Dec 10, 2021

It's not just about electricity or MBs ... there are also other resources like the network. And when Netflix delivers near your house, they are using a CDN (Content Delivery Network), which means they have a copy of their content in multiple locations that has to be stored, backed up, and so on.

radum (Contributor) commented Dec 13, 2021

That is the example I gave as well, @willi84. My point is that if Netflix uses your ISP's CDN today and not tomorrow, your ISP will use roughly the same amount of energy. So data transmission volume alone is not a good indicator.

@drydenwilliams I am curious how the CO2 emissions for EcoPing are actually calculated. Perhaps there is something smarter than volume of data and its energy impact.

@paulirish (Member)

I've looked at the various efforts to quantify website energy use/carbon footprint but currently they're rather under-developed compared to web performance or accessibility testing.
They primarily use proxy metrics like network request count or byteweight. But relying on these is far too indirect.

An example use case: I've seen some bot/human-detection JavaScript code that gets embedded alongside various 3rd-party scripts. On mobile, it can be 500ms of pegged CPU, so from a webpage POV, that's a non-negligible amount of watts being spent. Any metric/tool that's quantifying this stuff should not only identify this cost but attribute it correctly to the entity that included that script.
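
To put rough numbers on that (the wattage and traffic figures below are assumptions purely for illustration):

```ts
// Illustrative only: assumed mobile CPU draw and page-view count.
const MOBILE_CPU_WATTS = 2;            // assumed draw while a core is pegged
const PEGGED_SECONDS_PER_LOAD = 0.5;   // the 500ms figure from above
const DAILY_PAGE_VIEWS = 1_000_000;    // assumed traffic

const joulesPerLoad = MOBILE_CPU_WATTS * PEGGED_SECONDS_PER_LOAD;  // 1 J per load
const dailyKWh = (joulesPerLoad * DAILY_PAGE_VIEWS) / 3_600_000;   // ≈ 0.28 kWh/day
// Small per load, but attribution matters: the cost belongs to the entity
// that shipped the script, summed across every page that embeds it.
```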

Energy consumption is also very different on mobile connectivity versus desktop. Polling a network resource every 5 seconds is asking a lot of your phone's radio resource controller, but it's trivial for cable/fiber internet.

The existing energy consumption tooling for mobile development is quite mature, so I'd recommend folks investigate how Android/iOS and Mac/Windows quantify these things.

On desktop, I also wouldn't be surprised if a typical website ends up costing significantly more power draw from the monitor displaying the non-darkmode pixels than from its CPU activity. From some limited research, it looks believable. 🤯

I think the next steps in this area are quantifying these costs a bit more. What are the rules of thumb to estimate tradeoffs? (e.g., X seconds of a white background at normal brightness on desktop ~= Y seconds of high CPU load)
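
A hedged back-of-envelope sketch of what such a rule of thumb could look like; both wattage figures are loose assumptions for illustration only, not measured values:

```ts
// Back-of-envelope tradeoff estimate; all wattages are assumptions for illustration.
const EXTRA_DISPLAY_WATTS = 5;   // assumed extra draw of a bright/light page on a desktop monitor
const HIGH_CPU_LOAD_WATTS = 10;  // assumed extra draw of sustained high CPU load

// X seconds of a white background ~= how many seconds of high CPU load?
function equivalentCpuSeconds(whiteBackgroundSeconds: number): number {
  const joules = whiteBackgroundSeconds * EXTRA_DISPLAY_WATTS;
  return joules / HIGH_CPU_LOAD_WATTS;
}

// equivalentCpuSeconds(60) === 30: under these assumptions, a minute of bright
// background costs about as much energy as 30 seconds of pegged CPU.
```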

radum (Contributor) commented Dec 14, 2021

@paulirish Even so, it greatly depends on the device; some are very good, some are not. The variation will be so great that it will be hard to establish a baseline, even at the monitor or CPU level.

Cisco did a study on old routers' energy consumption (I will find the results) under heavy load vs. mostly idle, and the differences were insignificant. The main improvements for those devices come at the hardware level, targeting the energy consumption of their parts. Data volume had nothing to do with it. Network equipment improves almost every 2 years, and the number of devices that come online grows exponentially each year, while the network still holds without any signs that it might crash under heavy load.

Also, talking about testing a particular website brings up lots of questions:

  • How do you include the energy and materials required to produce that site?
  • What about the energy required to host it, cloud containers, and content delivery networks?
  • How do you measure the energy requirements of end-users interacting with your product or service across devices over time?

A website has many components across multiple systems (most of them closed), each of which has its own energy and resource requirements. Finding a blanket solution that fits all is elusive.

I can have a 10 KB website hosted on servers powered by a nuclear plant or a 100 MB SPA hosted on a Raspberry Pi powered by solar energy. Which one is better?

The scientific community has not yet reached a consensus on how specifically to measure emissions from any digital product or service. And I'm not expecting this to happen anytime soon.

Better proxies are what we already have: web performance and, very importantly, UX. I can have the most performant site in my domain, but if my users spend loads of time with their phones in their hands because they can't find the content they need, I have negated the performance impact.

As you said, screens are by far the biggest energy draw. Smartphones have more efficient screens, but it's hard to put a number on that and fit it into a metric.

I think web engineers today need to focus on web performance and UX.

willi84 (Author) commented Dec 14, 2021

Maybe it's useful not to create what we Germans would call the "eierlegende Wollmilchsau" (the thing that can do everything) ... and to start small instead. Google already has a lot of good data, I am sure. How about simple rules like these (a rough sketch of such an audit follows after the list):

  • if my users are only in Europe (according to Google Search or analytics), then a CDN is overkill
  • if I just have a simple static web page, it doesn't need a big JS framework on top or big bundles to load
  • while my cookie consent screen is showing, I don't need to load all the images in the background, for example
  • also, the Chrome team probably knows a lot about efficient JS code structures and which ones are not performant
  • with analytics you know how far down your page users are reading and whether it really makes sense to deliver big content pages
  • you can read which server you are using and whether it is up to date and efficient
  • don't duplicate websites for desktop and mobile
  • if you don't need to support IE for your users, you might not need polyfills .. welcome solutions that push users toward a modern browser
  • is all of your requested data used in the application? maybe GraphQL or similar is more efficient, or just inline it in the HTML document
  • do you have tons of trackers with a big impact? maybe just use two
  • are you doing fancy stuff with state management instead of using cookies, variables, local storage or other options to store temporary data
  • ...
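
As mentioned above, here is a rough sketch of how one of these rules could be packaged as a Lighthouse custom audit. It follows the documented custom-audit shape (`meta` + `audit`), but the way it sums bytes from the DevTools log and the 500 KiB budget are simplifications assumed for illustration:

```ts
// Sketch of a custom Lighthouse audit (not an official Lighthouse audit).
// Assumes the documented custom-audit API; byte counting here is simplified.
import {Audit} from 'lighthouse';

const BUDGET_BYTES = 500 * 1024; // arbitrary illustrative budget

class TransferWeightFootprintAudit extends Audit {
  static get meta() {
    return {
      id: 'transfer-weight-footprint',
      title: 'Page transfer size stays within the footprint budget',
      failureTitle: 'Page transfer size exceeds the footprint budget',
      description:
        'Large transfer sizes are a rough proxy for network and device energy use.',
      requiredArtifacts: ['devtoolsLogs'],
    };
  }

  static audit(artifacts: any) {
    const log = artifacts.devtoolsLogs[Audit.DEFAULT_PASS];
    // Sum encoded bytes reported when each network request finishes loading.
    const totalBytes = log
      .filter((entry: any) => entry.method === 'Network.loadingFinished')
      .reduce((sum: number, entry: any) => sum + (entry.params.encodedDataLength || 0), 0);

    return {
      score: totalBytes <= BUDGET_BYTES ? 1 : 0,
      numericValue: totalBytes,
      numericUnit: 'byte',
      displayValue: `${Math.round(totalBytes / 1024)} KiB transferred`,
    };
  }
}

export default TransferWeightFootprintAudit;
```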

willi84 (Author) commented Dec 14, 2021

Maybe @jawache from MS also has some ideas?

@mrchrisadams

Hi folks,

I've been collecting some numbers in this issue on CO2.js, where I've been collating relevant links and figures:

thegreenwebfoundation/co2.js#2

In there, you'll see linked some existing projects for Lighthouse that might be of use.

jawache commented Dec 22, 2021

Thanks @willi84, you might be interested in the Software Carbon Intensity (SCI) Standard that's being worked on in the Green Software Foundation. It's currently in draft, but we've recently released an alpha version for public comment, and there is also a more layperson-friendly article about it on our blog.

From our article: “a method for scoring a software system based on its carbon emissions. The SCI is a tool that enables developers to easily account for software carbon intensity in their day-to-day work in the same way they consider cost, performance, security, accessibility and other concerns today.

The SCI was created through the Standards Working Group of the Green Software Foundation, the international coalition of nonprofits, academia, and industry leaders.”

The standards working group is now working on some case studies, so we should have a case study for how to calculate an SCI score for a website in the new year.
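
For readers who don't want to dig through the draft, the alpha formulation boils down to operational plus embodied carbon per functional unit, roughly SCI = ((E × I) + M) per R. The sketch below is only a paraphrase of that idea; check the spec for the exact definitions:

```ts
// Paraphrased sketch of the draft SCI calculation: SCI = ((E * I) + M) per R.
// Variable meanings follow the alpha spec as summarised above; verify against the standard.
interface SciInputs {
  energyKWh: number;         // E: energy consumed by the software
  carbonIntensity: number;   // I: gCO2e per kWh of the electricity used
  embodiedEmissions: number; // M: embodied (hardware) emissions amortised to this software
  functionalUnits: number;   // R: e.g. page views, API calls, users
}

function softwareCarbonIntensity(
  {energyKWh, carbonIntensity, embodiedEmissions, functionalUnits}: SciInputs
): number {
  const operational = energyKWh * carbonIntensity;                // gCO2e from running the software
  return (operational + embodiedEmissions) / functionalUnits;     // gCO2e per functional unit
}

// e.g. per page view:
// softwareCarbonIntensity({energyKWh: 0.002, carbonIntensity: 442, embodiedEmissions: 0.1, functionalUnits: 1})
```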

@paulirish from the people I've spoken to, the monitor power draw is significant, as you suggest. Windows also has a command-line energy measurement tool that captures display power consumption at a task level.

NOTE: The Green Software Foundation is different from the Green Web Foundation, although they are members and we work closely, hey @mrchrisadams!

willi84 (Author) commented Dec 27, 2021

Just a note: a talk about "Blauer Engel" certification at the #rc3 congress:
https://invent.kde.org/teams/eco/be4foss/-/blob/master/conferences-workshops/presentations/2021-12-27_ccc-r3s_presentation.pdf

It's just about desktop software, but maybe there is also stuff to learn from it.

@paulirish (Member)

my earlier response captures most things I want to say: #12548 (comment)

I think there's potential for folks to quantify energy use on the client (monitor & CPU energy, etc.) and also, separately, to quantify energy use of the servers involved. I urge folks to attempt higher-precision measurement rather than proxy metrics.

thegreenwebfoundation/co2.js#2 looks like a decent place to continue the conversation. g'luck all!

willi84 (Author) commented Dec 2, 2022

One more thought:
While listening to a talk about GraphQL I was thinking about some kind of data-coverage tool, measuring how much of the data from an endpoint is actually used and/or how many API calls happen.
The fewer API calls and the smaller the data footprint, the less the wire is used to transport data.
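
One way to prototype that idea on the client (purely a hypothetical sketch, not an existing tool): wrap an API response in a Proxy, record which fields the application actually reads, and compare that against what was fetched:

```ts
// Hypothetical data-coverage sketch: track which fields of an API response are read.
function trackFieldUsage<T extends object>(data: T): {proxy: T; usedKeys: Set<string>} {
  const usedKeys = new Set<string>();
  const proxy = new Proxy(data, {
    get(target, prop, receiver) {
      if (typeof prop === 'string') usedKeys.add(prop);
      return Reflect.get(target, prop, receiver);
    },
  });
  return {proxy, usedKeys};
}

// Usage sketch: fetch once, then report how much of the payload was ever touched.
async function reportDataCoverage(url: string): Promise<void> {
  const payload = await (await fetch(url)).json();
  const {proxy, usedKeys} = trackFieldUsage(payload as Record<string, unknown>);

  // ... hand `proxy` to the application code instead of `payload` ...

  const totalKeys = Object.keys(payload).length;
  console.log(`data coverage: ${usedKeys.size}/${totalKeys} top-level fields used`);
}
```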

@paulirish (Member)

WPT landed a new feature around this yesterday:
catchpoint/WebPageTest#2867

(AFAICT, it's unannounced, so... best to let them be first to share the news. ;) though I do spot a screenshot in that PR for the curious...)

@paulirish (Member)

WPT landed a new feature around this yesterday

It's launched: https://blog.webpagetest.org/posts/carbon-control/
