My submission for the UKRI Sustainable Digital Society workshop

A friend of mine, Eirini Maliaraki, recently suggested I apply to take part in a virtual workshop on challenges for a Sustainable Digital Society. This summary explains what it was about:

This is a call for expressions of interest to take part in a one-day scoping workshop to explore how research can tackle the challenges in establishing a Sustainable Digital Society. The outputs of the workshop will be used to scope a Digital Economy Theme call to be funded by EPSRC for up to £5m.

I applied, and was accepted! One part of the preparation work was to answer the following question, before the workshop:

"What is the most important research challenge that should be addressed by the Digital Economy Theme in the next 5 years in order to achieve a Sustainable Digital Society?"

They're using a tool called Well Sorted to help with gathering and synthesising key themes before, during and after a workshop. I hadn't come across it before, but their website has a handy playlist explaining how it works.

My answer

Our answer needed to be delivered in two parts - the title and the actual text - and there was a word limit on the submission. This was my submission. I've shared it here so I can unpack the answer, for my own thought processes as much as anything else.

Factoring projected emissions into decision making

We must get better at factoring the environmental consequences of future activity into decision making, to reduce and avoid future emissions.

We need to be able to do this when we have incomplete information about said activities, and allow for recourse.

I'll break this down in more detail.

The "We must get better at" part

The science literally dictates that we need to get better at this if we want to have a habitable planet and avoid millions of needless deaths.

We need to be grown-ups about this, and we have to get better at internalising what's at stake, and what our continued inaction says about us if we don't move fast enough.

Speed is justice.

the "factoring the environmental consequences of future activity into decision making" bit

Where we do this at all, it's still extremely rare.

There is some promising work around explicitly internalising the cost of emitting carbon into the decisions we make. The software giant Microsoft is now probably the highest-profile proponent of this approach: they attach a 15-dollar-per-tonne carbon fee to all of their activity. When it comes to carbon emissions, we're talking about the full three-scopes approach described by the GHG Protocol.

This is good for internalising the cost of an organisation's operations that emit carbon directly and indirectly, but it hasn't been so effective at dissuading the same company from signing contracts to help Exxon extract even more fossil fuels out of the Permian Basin.

Microsoft's carbon footprint in 2019 was around 16 million tonnes. By 2025, this project is supposed to add an extra 50,000 barrels a day of production - an estimated 3.4 million tonnes of carbon emissions from just this one project.

The worrying thing is that, in the tech industry, Microsoft is one of the leaders in this space right now. Also, the figure Microsoft - a leader - is using is nowhere near the 100 EUR per tonne carbon price proposed for the EU, as outlined in some good work by the New Climate Institute.
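
To make that gap concrete, here's a rough back-of-the-envelope sketch in Python, using only the figures quoted above. Treating the dollar and euro prices as directly comparable is a simplification for illustration, and this isn't how Microsoft actually applies its fee - it's just the arithmetic.

```python
# Rough back-of-the-envelope figures, taken from the paragraphs above.
MICROSOFT_FOOTPRINT_TONNES = 16_000_000   # approximate 2019 footprint
PROJECT_EMISSIONS_TONNES = 3_400_000      # estimated emissions tied to the Permian Basin project
INTERNAL_FEE_USD_PER_TONNE = 15           # Microsoft's internal carbon fee
PROPOSED_EU_PRICE_EUR_PER_TONNE = 100     # carbon price proposed for the EU

# How big is this single project relative to the whole 2019 footprint?
share_of_footprint = PROJECT_EMISSIONS_TONNES / MICROSOFT_FOOTPRINT_TONNES
print(f"Project emissions vs 2019 footprint: {share_of_footprint:.0%}")

# What would those emissions 'cost' at each carbon price?
print(f"At the $15/tonne internal fee:  ${PROJECT_EMISSIONS_TONNES * INTERNAL_FEE_USD_PER_TONNE:,.0f}")
print(f"At the proposed 100 EUR price:  €{PROJECT_EMISSIONS_TONNES * PROPOSED_EU_PRICE_EUR_PER_TONNE:,.0f}")
```

That's roughly a fifth of the company's entire 2019 footprint from one project, and a price signal more than six times weaker than the one proposed for the EU.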

The "reduce and avoid future emissions" bit.

There's a trap we need to avoid here. There is already money flowing into carbon removal solutions, and I think we'll see more flowing into them over time - if only because we've left things so late that it's tempting to focus on removal, to avoid thinking about the scale of the changes the science spells out for us. I tried to summarise this in a short thread on Twitter, partly as a response to the otherwise good news that Stripe, another leader in tech, was investing in a portfolio of carbon removal.

Carbon removal solutions are absolutely necessary given how late we've left it, but the problem I have with the carbon removal discourse is that it feels a bit like saying: "Let's save the person bleeding out by pumping more blood into them!"

While ignoring the fact that they're still bleeding.

So, research on reducing and avoiding is needed too.

the "We need to be able to do this when we have incomplete information about said activities" bit

We can't wait until we have all the data - every day, countless decisions are made that commit millions of tonnes of emissions to the atmosphere.

And while I find much of the discourse around putting dollar amounts on the services nature provides uncomfortable, I am aware that many decisions are ultimately based on the consequences of a particular action in terms of how much money it makes, or how much it costs.

Crucially, these figures are ultimately what many organisations use to allocate resources and attention. Pretty much every other nice statement will be trumped by them, unless there is a law backing it - and even then, if the cost of paying the fines is low enough, breaking the rules can be treated as a cost of doing business.

I also think you don't need to be part of every decision to influence it - if you have a decent set of heuristics, you can use them to nudge decisions so that people make more informed trade-offs. This is partly what the carbon fee example I refer to above with Microsoft is designed to do: it provides guidance that enables more informed trade-offs across the countless decisions that have an impact, as the sketch below tries to show.
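
As a sketch of what that kind of heuristic might look like in practice, here's a small, hypothetical example: fold a carbon fee into the cost of each option, so the projected emissions show up in the same units the decision is already being made in. The helper, the option figures and the fee value (echoing the $15/tonne mentioned above) are all made up for illustration.

```python
from dataclasses import dataclass

# Hypothetical fee, in dollars per tonne, echoing the internal fee mentioned above.
CARBON_FEE_PER_TONNE = 15

@dataclass
class Option:
    name: str
    monetary_cost: float                # direct cost in dollars
    projected_emissions_tonnes: float   # emissions the choice is estimated to commit us to

    def loaded_cost(self, fee_per_tonne: float = CARBON_FEE_PER_TONNE) -> float:
        """Monetary cost plus the internal carbon fee on projected emissions."""
        return self.monetary_cost + self.projected_emissions_tonnes * fee_per_tonne

# Two made-up options: without the fee, A looks cheaper; with it, B comes out ahead.
options = [
    Option("A: cheaper but dirtier", monetary_cost=90_000, projected_emissions_tonnes=1_000),
    Option("B: pricier but cleaner", monetary_cost=100_000, projected_emissions_tonnes=100),
]

for option in sorted(options, key=lambda o: o.loaded_cost()):
    print(f"{option.name}: loaded cost ${option.loaded_cost():,.0f}")
```

The point isn't the specific numbers - it's that a simple, consistently applied rule like this lets people who were never in the room still shape the trade-off.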

the "and allow for recourse" bit

This refers to the fact that it's difficult to have a meaningful discussion about making reductions in the environmental impact of technology, and through technology, without talking about governance.

If we haven't reduced the environmental impact of technology faster than it's grown over the last few decades, it's not because we don't know how - it's because we've been okay with shifting the costs outside of the organisations doing the growing. The same applies to the outward-facing decisions we make, like the Microsoft example above, albeit usually at a much smaller scale.

There need to be clear consequences for not factoring the non-monetary impacts of our actions into decisions like the ones I've referred to, and we need to make it easier and more 'normal' to talk through their consequences, using language that allows us to be mindful of the deliberate trade-offs we are making, and, where necessary, to change course.

A note on taking part in UKRI workshops

I think I was the person with the fewest formal qualifications at the workshop, and I was really, really intimidated after reading all the profiles ahead of the event - I wasn't sure how I'd be able to contribute in a meaningful sense on the day. In the end, everyone was really nice and welcoming. I realised that in cases like this, having a broad base of knowledge, even if it isn't so deep, is useful for finding common language across domains and identifying similar ideas. I ended up enjoying it much more than I was expecting, and felt like I was able to contribute in a useful way.

I'd absolutely recommend taking part to anyone who has a cross-disciplinary background and has shied away from these workshops before.


