The Ethics of Code and Design

In the tech community we talk a lot about two topics: privacy and security. That’s a good thing. But there is another important topic we rarely talk about: ethics.

by Thomas Schinabeck | November 2017
A while ago I stumbled over an essay by Jonathan Harris titled Modern Medicine. It’s not about medicine; it’s about technology and its makers. It is a piece about the responsibility of software engineers and designers.
 
Here are some quotes I highlighted back then:
 
„Through the software they design and introduce to the world, these engineers transform the daily routines of hundreds of millions of people. Previously, this kind of mass transformation of human behavior was the sole domain of war, famine, disease, and religion, but now it happens more quietly, through the software we use every day, which affects how we spend our time, and what we do, think, and feel.“
 
„If a given drug is found to harm more than it heals, we’re encouraged not to use it. But sometimes a drug is so addictive that we use it anyway — even if it hurts us — and we go to extraordinary lengths to obtain another dose.“
 
„A lot of software is designed to be addictive. In Silicon Valley, the addictivity of a given piece of software is considered an asset.“

Compulsive Usage

Jonathan uses the word addictive, and he is not exaggerating. If you analyse the user experience of certain digital products, you can see close similarities to products that are already regulated for being addictive. Sometimes you even wonder whether certain concepts and pricing strategies had their archetypes in those kinds of products. The similarity in design and experience between certain apps and slot machines is just one obvious example.
 
Some of you will now think this is a very dark picture I am painting here. If you think so, please read the paper „The Top F2P Monetization Tricks“ by Ramin Shokrizade, published on Gamasutra, an established online publication covering all aspects of the gaming business. Just to be clear: this paper is not a critical piece, it’s much more of a „how to“. Here are some quotes from the paper:
 
„While it is possible to make commercially competitive games without using coercive methods, this is a lot more work. In the current market, especially with most adults and children not familiar with the nature of these products, the environment is still ripe for fast profits, and likely will continue to be so for a few more years.“
„This involves putting the consumer in a very uncomfortable or undesirable position in the game and then offering to remove this “pain” in return for spending money. This money is always layered in coercive monetization models, because if confronted with a “real” purchase the consumer would be less likely to fall for the trick.“
„Thus consumers under the age of 25 will have increased vulnerability to fun pain and layering effects, with younger consumers increasingly vulnerable.“ 
„Note that while monetizing those under 18 runs the risk of charge backs, those between the age of 18 and 25 are still in the process of brain development and are considered legal adults. (…) Thus this group is a vulnerable population with no legal protection, making them the ideal target audience for these methods. Not coincidentally, this age range of consumer is also highly desired by credit card companies.“
„The above mechanics are not meant to be exhaustive, but give a basic overview of key techniques used in coercive monetization model based games to defeat a customer’s ability to make informed choices about the costs and values in these products.“
 
The downsides of compulsive usage don’t seem to be something these people worry about.
 
Compulsive usage is not just a topic in games. We all experience how we sometimes get lost on certain websites for hours, or check services several times a day without any real urgency. We realize the limits of our control over our own behavior.
 
If designers at Facebook use words like „serotonin“ to describe their design process, then we can assume they know a few things about neuroscience and behavioral economics.
„The company shorthand for this is ‘serotonin’, the neurotransmitter that sparks feelings of happiness. A sticky note with the word scrawled on it is tacked on the wall of a design meeting I sit in on. ‘That’s our term for those little moments of delight you get on Facebook,’ explains Julie Zhuo, a design manager. And Cox clearly understands this as well: ‘It’s the science of things you can’t reason about, that you just feel,’ he says. ‘So when we’re going off to create something new, it’s important to be iterating in that mind-set.’“
– Source: Fast Company – April 2012
 
The ability to combine neuroscience and the limits of human behavior with technology is a huge responsibility. We have the power to design for compulsive usage like never before. Not every application succeeds in building something that people need or want to use on a regular basis. But if we do get into that lucky situation, we should be aware that behind those analytics data points are (most of the time) humans.
“Indifference towards people and the reality in which they live is actually the one and only cardinal sin in design”
— Dieter Rams
I am not saying that it’s easy to build such products or that everybody can do it; it’s not. But more and more people are experimenting in these fields, and the more who try, the more will succeed. There are huge incentives to misuse this power. It’s a billion-dollar business: the estimated daily (!) revenue of a game like Candy Crush is over $2,173,000.

Product Design Decisions

At the Webstock Conference 2013, Mike Monteiro gave a talk titled „How Designers Destroyed the World“. At the beginning of his talk (04:50 – 09:05) he introduces an interesting example that shows the consequences of certain product decisions.
He tells the story of a young woman who was unintentionally outed as gay to her parents via Facebook. The reason: a new Facebook feature that enabled every user to add any other user to a group. The main problem: the group’s privacy settings could override the user’s privacy settings. If someone created a public group and added you to it, everyone could see that you were a member of this public group, including through a post on your own Facebook profile. An organizer of a gay support group, who had no bad intentions, added this young woman to his group. Her life changed radically.
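To make the mechanism concrete, here is a minimal sketch of that precedence flaw in Python. All names in it (User, Group, membership_visible) are hypothetical illustrations, not Facebook’s actual implementation; the point is only that a group-level setting can silently win over a user-level one.

from dataclasses import dataclass, field
from typing import List

@dataclass
class User:
    name: str
    profile_private: bool  # the user's own privacy preference

@dataclass
class Group:
    name: str
    public: bool
    members: List[User] = field(default_factory=list)

def add_member(group: Group, user: User) -> None:
    """Anyone can add anyone; the added user is never asked for consent."""
    group.members.append(user)

def membership_visible(group: Group, user: User) -> bool:
    # The flaw described above: the group's setting wins,
    # and the user's own privacy preference is never consulted.
    return group.public
    # A safer precedence would respect the stricter of the two settings:
    # return group.public and not user.profile_private

alice = User("Alice", profile_private=True)
support_group = Group("Support Group", public=True)
add_member(support_group, alice)
print(membership_visible(support_group, alice))  # True, despite Alice's setting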
 
No question, this is a complex case, and you could argue for hours about who is responsible for this situation. However, it shows that designing a feature can have a bigger impact than we might think. The team at Facebook is without doubt one of the best in its field. I am sure they are not careless. The question is: how did they make their decisions? Which priorities did they have? Which tradeoffs did they make?
 
If you compare the first example about designing for compulsive usage with this last one, they seem quite different. In the end, though, they have the same roots.
If you design something like the Facebook group feature, then user retention, virality and many other business goals are a big part of the decision process. If the team’s priorities are on growth, you can start to speculate about which tradeoffs they are willing to make. The fact that Facebook allowed people to add friends to groups without their consent was surely a decision that didn’t hurt their business goals.
 
Let’s face it: as software engineers and designers, we regularly find ourselves in situations where business goals conflict with the interests of individuals or of society at large.
This is also the case in many other industries. So what’s the difference? The same thing that fascinates so many of us about our industry: the scale, the ability to influence millions of people from our living room. We don’t even have to ask for permission.

Who should feel responsible?

The designer of the UI? The engineer of the algorithm? The product manager? They are all just part of a team, only trying to solve the company’s problems, right?
So the CEOs are responsible? Well, they have to make sure the next payroll is safe. They just act in the interest of investors and their employees.
So the investors are responsible? They will tell us that they have no clue about the day-to-day business of the company. They just wanted to help the company grow.
 
So, the user is responsible? You should know best which products you are using, right? If you don’t use the product, the product won’t exist. Easy.
The problem: humans have limitations. Neuroscientists and behavioral scientists prove this over and over again. The work of Nobel Prize laureates Daniel Kahneman and Richard Thaler has given these topics broader attention outside the field of psychology in recent years. We are often not able to make the best decision in our own interest. There is a reason why gambling is regulated in many countries.
 
So who should feel responsible for helping others, and ourselves, make good decisions? Most of the time, when no one seems responsible, we all are. We all have a choice.
 
So, what choices do we have as software engineers and designers?
First of all, there is the easy one: Find an excuse.
 
Even if you work for a company that runs a modern version of a Skinner experiment, you could argue something like this:
 
A) „I serve them. It’s entertainment.“
B) „Come on, nobody is dying here, and if I don’t do it, someone else will.“
C) „We are living in a free country. They have to know best themselves. If they used our product responsibly, no harm would be done.“
 
I don’t want to sound sarcastic at this point, but you will always find a good reason to argue this way. Need a credible quote from an expert to justify building an addictive game? Here you go:
 
„Meanwhile, for all those skeptics out there who believe video games have little to offer education, I’ll leave you with a challenge. Find one – that’s right, just one – video game that is not about learning.“
– Dr. Keith Devlin, a mathematician at Stanford University
There is another choice. The hard one.
You feel accountable for the behavioral outcomes of the software you build.
You recognize responsibility for building a decent, useful, powerful and ethical tool.
You try to build tools that appear briefly when you need them, and then disappear, leaving you free to get on with your life.
 
As Jonathan Harris writes in his piece:
 
„There is an ancient pact between tools and their users which says that tools should be used by their users, and not the other way around.“
 
Anyone who now thinks this is an obvious choice judges too easily. There are often huge incentives to take the easy choice, and most of the time these are not black-and-white decisions; the borders are blurred. Take a product like Twitter. Should you work on the introduction of a new timeline, with rich media previews, for more user engagement? Or should you work on smarter filter options to fight the „Fear of Missing Out“ phenomenon? Optimise for “time well spent” and make users feel better? Or optimise for more advertising opportunities?
 
It is hard to make these decisions. If you are a public company, or if you took a lot of venture capital, there is a lot of pressure. You are part of the attention economy, which operates by convincing users to spend large amounts of time online, clicking many things and viewing many ads. There is a big conflict of interest.
 
„[In the publishing business] the readers are the product, and the customers are the advertisers.“
– Dave Winer, Scripting News

How can we make the right decision more often?

To be honest, I don’t know. As with any ethical discussion, it’s very hard to find answers here. There are many people much more qualified than me to find a proper solution. All I hoped to do with this post is start a conversation at a few lunch tables out there.
 
Jonathan Harris had an interesting thought: the idea of a Hippocratic Oath.
He compares our industry with the healthcare and pharmaceutical industries and suggests:
 
„We could ask our engineers to take a Hippocratic Oath, as medical students are required to do before we call them doctors. The basic tenets of an Engineering Oath could mirror the medical ones“
„Technological innovation will always outpace any legislation that tries to constrain it; regulating technology tends not to work. So the ESA [Ethical Software Administration] would need a different approach, perhaps an open online forum where anyone can post their concerns, and where every company receives an aggregated “ethics” score, based on their actions.”
„From a young age, engineering students could be taught to speak up for what they believe. Too many engineers remain silent, leaving decisions to “management,” and simply writing code as they’re told.“
Is this a realistic thing to do?
There is currently no legal obligation for medical students to swear an oath upon graduating. Yet 98% of American medical students swear some form of this oath.
Other industries are thinking about such an oath, too. In his acceptance speech for the Nobel Peace Prize, Sir Joseph Rotblat suggested a Hippocratic Oath for Scientists.
 
There are also industries where companies themselves are proactive in building trust in their products. In journalism, companies like the New York Times or Reuters have these kinds of codices, as do many others.
 
Sure, people break these standards and not every company lives up to them, but at least we all have a good picture of what the right thing to do would look like.

What’s next?

All of us want to build successful products. Companies have to be sustainable; they have to make money with their products. And yes, users shouldn’t be patronized, and regulation is often the wrong choice.
 
No matter what the individual definition of success looks like, a big part of it is the “how”.
How do we want to reach our personal goals? The process by which you reach something will always be part of the result.
 
“The whole purpose of planning something like Everest is to effect some sort of spiritual and physical gain and if you compromise the process, you’re an asshole when you start out and you’re an asshole when you get back.”
– Yvon Chouinard (Founder of Patagonia)
Users and customers, and everyone who shapes their opinions, have a huge impact.
What do we expect from a car engineer who designs the next engine?
What do we expect from a chemist at a food company?
What do we expect from a geologist who works for a big energy company?
They all have a hard and an easy choice.
What expectations do we have from designers and software engineers?
 
Taking the hard choice sometimes also comes with inconveniences for users, but if you communicate well, it’s also an opportunity; it could become a big asset. Look at companies like Patagonia. Take the hard choice, tell your story, make it your strength. It’s interesting to see frameworks like the B Corporation becoming more popular.
 
The influence of our industry on our everyday lives, our bodies and our behavior will rise dramatically in the near future. There are amazing times ahead, but let’s have this conversation about ethics more often. We need to encourage developers to speak up. We have to celebrate the companies that make the right decisions.
 
Does this make sense? Are we having this discussion already? Am I missing something? Please let me know.
 


Author: Thomas Schinabeck

Digital product designer based in Munich who loves to help build products and grow brands. Otherwise trying to spend as much time as possible in the mountains. - @digitalwaveride