In the midst of yet another tech scandal, there has been a lot of talk on the Northcoders campus this week about what went wrong at Facebook and Cambridge Analytica. And that got us thinking about our own ethical obligations.
Almost anybody whose work affects the general public must abide by some set of ethical tenets. Doctors have been swearing the Hippocratic Oath since the fifth century BC, and they have since been joined by codes of conduct for those who work in construction, oil, horticulture, law, franchising and many other fields.
Yet while specific tech organisations have their own ethical codes, the tech industry is not itself subject to any centralised code of conduct. So for our latest Northcoders Discuss, we thought we would ask our team to tell us about an ethical obligation that we might have as developers...
Sally Hale - Northcoder In Residence
With another tech scandal in the headlines, developers might be asking themselves how they would respond if asked to design or develop something they felt was unethical. Many designers adopt a personal code of ethics, balancing the needs of the business with those of the customer without becoming coercive or deceptive. Are the terms and conditions deliberately difficult to find, or the opt-out buttons confusingly worded? Are your users achieving what they came to your app to do, or are you persuading them to do something else?
Advertising and consumer law can protect against the most extreme attempts to obscure risk, deny customers’ rights or target the vulnerable, but up to that limit, it is creators themselves who must ask who really benefits from their product, and whether anyone is being harmed in the process.
Jonny Rathbone - Tutor
The argument over whether science is amoral, and whether it is the user or applier who gives it its ethical character, has existed for at least as long as the criticism of Darwin’s theories for inspiring Nazi eugenic practices. It is unfair to criticise the person who merely observes and extrapolates in place of the one who extrapolates further and acts on it, but in tech we perhaps live in a murkier world, where our job is to provide bridges between scientific advancement on one side and public opportunity on the other. The notion is housed in the word ‘developer’: not necessarily starting from nothing, but inevitably guiding the tendrils of possibility for the user.
Google’s restructure into Alphabet in 2015 saw its motto change from ‘Don’t be evil’ to ‘Do the right thing’. There’s a rich semantic world to explore here. The former may feel more pithy and sardonic but it is arguably richly permissive - is it evil to collect detailed profiles of your users and allow others to exploit them? Maybe not, but do you share responsibility when things go wrong? Surely, because you provided that bridge. And if you share responsibility, you have an obligation to at least act to mitigate risk. The latter motto asserts a transition from negative to positive (it should be ‘right’) and from passive to active (‘I swear I’m not evil, I just sometimes do evil things’ should now be ‘I may not be perfect, but I’ve got a choice here’).
Developers are usually cogs in a machine, and it can be hard and perhaps detrimental to see yourself as capable of having too wide an impact. But we should always be enabling, otherwise what’s the point? Whatever Google was intended to be, it wasn’t to be benign, and that is progress, in all its unruly glory.
Josh Gray - Talent & Partnerships Coordinator
Developers have a responsibility to behave ethically, but this obligation does not stem from being developers. As with any citizen in a democratic society, there is a certain standard of behaviour that forms part of our shared values.
I feel technology is not unique in the prevalence of questionable decisions and poor transparency, but is merely a high-profile example of an industry whose conditions enable these behaviours. Those conditions are miseducation and obedience to authority.
Recent data privacy scandals have reinforced my suspicions that there is a fundamental misunderstanding about the information we share, and the privileges of those we give this to. Looking at Facebook’s terms of service, a couple of lines jump out:
“...you grant us a non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content that you post on or in connection with Facebook (IP License)”
Another recent example is how surprised people were by the amount of location data mobile apps gathered about them, having never made the connection with the targeted advertising they received and the suggestions made by apps such as Google Maps.
As a society, we have a responsibility to educate people of all ages about how their data is used and the revenue streams of tech corporations.
We also have to look at the environment around developers, which enables them to justify the ethically ambiguous decisions they are asked to make. I think the detachment from your user that a code editor provides is harmful, and the lack of cultural diversity in development teams creates echo chambers and wide-scale groupthink.
Finally, questions need to be asked about the people we place our trust in, and why. I feel many people in my industry need to look at the work they do, and the companies they do it for, in the context of the Milgram Experiment, a study that explored the conflict between obedience to authority and personal conscience. Milgram concluded that obedience to authority is ingrained in us all by the way we are brought up, making it easy for any of us to make poor ethical decisions when encouraged by authority figures.
Lastly, here is a link to one of many examples of how the people we trust most often share characteristics, specifically being white, nerdy and male; I hope the reader considers how these preconceived notions harm society.
Ruth Ng - Head of Growth
As Jonny says, it is easy to feel like a cog in a machine as a developer, particularly if you work for an enterprise. And to reiterate what Josh has said, it is in the very nature of what it means to be human to defer to authority.
I believe that we as developers have a moral obligation to tease out the ethical issues that pertain to the work that we do, and to be vocal about what we believe to be right.
And that should include speaking up when asked to create, or even approve, software that doesn’t meet consumers’ needs and requirements or that isn’t robust or safe.
Nobody should ever feel obliged to perform a task or work on a project that they believe is detrimental to the economy, individuals, our culture or the environment or else that they deem unethical for any reason.
And if your employer asks it of you, you’re working for the wrong company.
Finally, to follow on from Josh in commenting on the importance of diversity in tech, I'd like to remind you of an excerpt from the Northcoders pledge to diversity:
"Diversity and inclusion are a big part of who we are. At a time where technology will define what it means to be human, it’s crucial that we ensure that our future is built by people from all walks of life, and that people from all walks of life have the opportunity to be a part of it."
We must be vocal about this and we must act! It is our moral duty to ensure diversity in tech wherever it's possible for us to do so. Our future must be built by people from all walks of life, lest it become a tool to serve the few.
With tech ethics in the spotlight once again, it is an opportune moment to re-examine the way we think about our ethical obligations as developers and take a more active approach towards creating tech that exists to do good.