Data for Children Collaborative

DCC Dictionary: What do we mean when we talk about "ethics washing"?

In recent years there has been a growing appetite, and a real need, for putting ethics at the core of emerging technologies. Why? Because it makes technologies better: not just better suited to their purpose, and therefore better for their users, but crucially, better for the world, both for humans and for the planet. After all, ethics invites us on a continuous journey where we stop and ask the right questions (more on ethics in our Dictionary entry here).

But sometimes that journey can be confusing, or might seem like a barrier to some, perhaps those in the race to raise profits, especially if it is unclear what the process of asking the right questions involves, or even what those questions should be. Well-designed regulations and robust ethical frameworks are hugely helpful in providing this guidance and keeping data- and AI-rich projects on the ethical track, but creating them is not without its challenges. So it may be tempting to treat ethics as a less demanding alternative to developing regulatory frameworks or being subject to legal restrictions. Using ethics in this way is often called “ethics washing”.



Similar to greenwashing on environmental issues, ethics washing is a term coined to capture a set of practices that, on the surface, make a company or organisation look as though it is doing things ethically but, on closer inspection, turn out to be a mere façade. Sometimes it is unintentional: after all, we are all still learning, and things do not always work as intended when we try them out. Many data scientists and developers genuinely want to combat the hidden biases that data is often riddled with, and to make sure their projects benefit society. In other cases, however, it can be more calculated.

One issue ethics washing poses, whether intentional or not, is that the appearance of ethics work, however questionably a company may be undertaking it, can seem to excuse skipping more robust work on potential data issues, detracting from the urgency of creating appropriate regulatory and ethical frameworks. It creates an illusion of sufficient self-regulation. Some might think that is great because regulation gets in the way of innovation, but many would disagree.


What is innovation good for if it does not help close inequality gaps in some way, or at least does not add fuel to the fire? After all, as Karen Hao points out, the ultimate aim of technology, AI included, is to help humans prosper; or, as some philosophers put it, it should help people not only prosper but thrive, growing and evolving. If we think about innovation in this way, perhaps we can also understand ethical frameworks not as policing tools but as enablers, helping data innovators fulfil the ultimate goals of technology.


Not every innovation needs to engage with ethics deeply, and drawing the line between one that does and one that does not is a tricky business. It is not always clear, so creating a culture of ethical reflection can help start the process. Enabling ethical literacy helps identify challenges, and ethical frameworks support the steps needed to prevent harm and create benefit.

A vast number of projects operate under the Data for Good umbrella, which starts from the idea of creating tangible benefits for the most disadvantaged members of society and protecting the most vulnerable people. Many of these projects do amazing work supporting the UN’s Sustainable Development Goals. For Data for Good initiatives, it is easy to conclude that the first steps on the journey to ensuring technology supports human flourishing have been taken; that is, after all, the very ambition of the Data for Good banner. Unfortunately, this is not always the case, and the label can soon become another tool for ethics washing. Some initiatives take off under the pretence of “doing good” and may produce some good outcomes, but they miss the bigger picture and simultaneously create harm.

So practical ethical frameworks for third-sector data-driven projects are just as important as those for commercially oriented ones. They improve ethical literacy. They help everyone involved, from data scientists to academic experts and local actors, feel well equipped to ask difficult questions about the motivations and methods of the projects they work on.

The Data for Children Collaborative with UNICEF recognises the need to develop ethical literacy and responsibility. Our work on Responsible Innovation is one of many ways we hope to help others design and deploy practical ethical frameworks. When these are implemented properly, there is little room for ethics washing. But improving ethical literacy is a process with many moving parts, and it never ends: ethics is all about continuous growth and learning from one another.

Want to find out more about any of our services?

Drop us a line: Hello@dataforchildren.ed.ac.uk