Not everyone sets out to build a digital product that’s used irresponsibly, raising all sorts of ethical issues. But that’s what can happen when we don’t prioritize ethics, measure how people are using the products we build, and make ourselves “as familiar as possible with the best and worst of humanity,” said digital ethics expert Claire Woodcock.
Woodcock, who’s also an AI product manager based in the UK, talked about why it’s on product managers to champion ethics in products — and the cost to them and the businesses they work for if they don’t — at the May ProductTank Waterloo.
“Not only is making responsible decisions the right thing to do, but ethical issues can cause your business to fail, the business you work for to fail, and your career to fail. This can really impact you if you make the wrong call,” said Woodcock.
She pointed to case studies of clear-cut ethical failures, such as the blood-testing startup Theranos. But ethical issues can be nuanced as well.
Fitness app Strava found itself in ethical trouble when it built its app to turn on location sharing and heat maps by default. The intention wasn’t a bad one, Woodcock pointed out; Strava was, after all, simply trying to build a fitness community. But not everyone used the app as intended. The heat maps caused national security issues when, posted online, they revealed military bases and soldiers’ movements. And location sharing caused problems when some women discovered that men were using the app to stalk them.
“There was no one in that room to challenge the groupthink and say, ‘How might this data be used and misused?’ ” said Woodcock.
(Read more: Designing for digital safety.)
It’s easy to throw stones at a big company with power, but there are many ways that ethical issues can crop up. Imagine searching online for professional hairstyles for work, but the results only show white women’s hairstyles, said Woodcock. That could be because of algorithmic bias. Or maybe it wasn’t the algorithm, said Woodcock, but that the images were pulled from the only available data: magazines that so often feature white beauty.
Or maybe it was something else, said Woodcock.
“It could be that we as users are to blame,” she said. “Maybe we kept clicking on Caucasian content when we were looking for professional hairstyles, so we trick-taught the algorithm that this is what good looks like.”
That’s just one example of how problems in society can combine to create outcomes that we didn’t intend, she said.
“When we work with digital products we often don’t know what’s wrong with a product unless we know what to look for,” said Woodcock.
“We have quite a lot of power when we build a digital product. It influences how we think, how we act, what we purchase, and the data that we let others have access to.”
Often with digital products, the proposed solution is to add a human to the loop to fix everything. But humans make mistakes too, Woodcock said. She pointed to studies showing that women are more likely to die from heart attacks because doctors treat them differently than men, and that when racialized people say they’re in pain, doctors don’t always believe them.
“Imagine I didn’t know that and was building a medical product, and I used all that information and hardcoded it into my product. That’s why product managers need to be aware of what’s good and what’s bad about the information sources that you’re pulling into your product,” she said.
It’s on product managers to champion digital ethics because they’re in the best position to do so, said Woodcock. Product managers operate at the intersection of UX, business, and tech and understand how those areas interact, she said. So it’s on them to understand how ethical issues might arise.
“You are the only people with a big-picture vision who understand how choices in a product could combine to create unforeseen impact,” she told ProductTank Waterloo attendees, many of whom are product managers.
There are ways to bring conversations about ethics into your workplace, said Woodcock. First, do the newspaper headline test: take two newspapers with politically opposite views and ask yourself, if I do this, what will they write about me, my business, or my product two years from now?
To do those thought experiments, you have to be well-versed in product, so read more broadly, she added.
Amplify unheard voices, especially people who are challenging groupthink. Challenge tech saviour culture. And advocate upwards if you discover an issue. But don’t start with the C-suite — they’re busy and under loads of pressure, she said. Instead, find allies on the policy team, because they’re thinking about all of these issues too.
And give your team permission to ask about the consequences of their work and the company’s work. Finally, set OKRs around ethics and measure them so ethics stays top of mind. As soon as you see strong adoption of your product or its technology, pay attention to how people are using it.
“Be as familiar as possible with the best and worst of humanity,” said Woodcock. “Humans are really human, so you have the best of humans and the worst of humans.”
The next ProductTank Waterloo virtual meetup is June 11 when Teresa Torres will discuss the what and why of continuous discovery. Sign up now.
Missed a ProductTank Waterloo and want to get caught up? Read all of our recaps.