Responsibility, Transparency, Privacy

12 Monkeys (1995), directed by Terry Gilliam

A few years back, I was talking with a friend who was managing a team at Google Search at the time. He asked me: "How can you design trust?" He spent his days thinking about sponsored results and people's perceptions of them. I could not give him a straight answer.

Over the last few years, differentiating ads from content and fake information from real has become dramatically more difficult. Meanwhile, brands are working hard on building a trustworthy, charming, approachable image. (The Kawaiization of product design - DESK Magazine by Tobias van Schneider, 2020)

When we talk about trust on the user's side and responsibility on the platform's and content creator's side, we often end up talking about child safety. It has become a litmus test for how well account and content moderation work: how random strangers can Zoombomb classrooms, and how fake accounts can inappropriately interact with those who are just learning about the complexities of the world.

A recent video from The New York Times, "Would You Let Your Kids Play With These Toys?", offers a great thought experiment comparing digital products with tangible toys for kids.

Responsibility

Unfortunately, the digital world is inherently more complex and harder to control than physical consumer products, and regulating it is correspondingly harder. The scale of digital platforms makes manual curation and moderation impossible and leaves us with an auto-generated hellscape of content. Harmful content and the scandals around it damage the reputation and overall perception of these platforms, yet the platforms still do not commit to taking full responsibility for the content that can be accessed on them.

"Recommendations drive a significant amount of the overall viewership on YouTube, even more than channel subscriptions or search. (...) But all too often, recommendations are seen as a mysterious black box. (...) Someone may report that they’re very satisfied by videos that claim “the Earth is flat,” but that doesn’t mean we want to recommend this type of low-quality content. That’s why recommendations play such an important role in how we maintain a responsible platform. They connect viewers to high-quality information and minimize the chances they’ll see problematic content." (On YouTube's recommendation system by Cristos Goodrow - VP of Engineering At YouTube, 2021)

I think a clearer separation of creator and platform would go a long way toward distinguishing who is responsible and accountable for inappropriate (or, as YouTube calls it, "borderline") content. Designing safer, more user-centered recommendation systems should not rely on people diving into advanced settings and following robust Community Guidelines.
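To make the tension in Goodrow's quote concrete, here is a deliberately toy sketch of a quality-weighted ranker. Every name and weight below is invented for illustration; this is in no way YouTube's actual system. The point is only that a responsible recommender cannot rank on predicted satisfaction alone: it needs an independent quality signal with enough weight that "satisfying" low-quality content cannot win.

```swift
// Toy model of the problem: predicted satisfaction alone would surface
// "flat earth" videos, so the ranker also blends in a separate quality
// signal. All names and weights are hypothetical.
struct Video {
    let title: String
    let predictedSatisfaction: Double // 0...1, engagement-style signal
    let qualityScore: Double          // 0...1, e.g. from human raters
}

func score(_ video: Video, qualityWeight: Double) -> Double {
    // Blend engagement with quality; the platform's responsibility lives
    // in how heavily qualityWeight counts against raw satisfaction.
    (1 - qualityWeight) * video.predictedSatisfaction
        + qualityWeight * video.qualityScore
}

func rank(_ videos: [Video], qualityWeight: Double = 0.7) -> [Video] {
    videos.sorted {
        score($0, qualityWeight: qualityWeight) > score($1, qualityWeight: qualityWeight)
    }
}
```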

Elsagate is a neologism referring to the controversy surrounding videos on YouTube and YouTube Kids that are categorized as "child-friendly" but contain themes inappropriate for children.
"Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatise, and abuse children, automatically and at scale, and it forces me to question my own beliefs about the internet, at every level." (Something is wrong on the internet by James Bridle, 2017)

Transparency

Just because you can read what is written on a cereal box doesn't mean you are aware of the impact that cereal will have on your health. But it is a decent start.

Apple introduced App Tracking Transparency, another consent form, aiming to give people the power to ask apps not to track them. Facebook replied with a full-page ad in The New York Times: "We're standing up to Apple for small businesses everywhere." Apple's app tracking policy has reportedly cost social media platforms nearly $10 billion so far. For anyone interested, there is a "paper" by Apple titled "A Day in the Life of Your Data" that shows how a $227-billion-a-year industry harvests our personal data, told through a father-daughter day at the playground.

A day in the life of your data by Apple
Why Apple’s new privacy feature is such a big deal - The Verge
The pre-modal page Meta showed to Facebook and Instagram users to persuade them to allow tracking (source)
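Part of what makes App Tracking Transparency interesting as a design artifact is how small the actual mechanism is. The sketch below, assuming iOS 14 or later and an NSUserTrackingUsageDescription string in the app's Info.plist, shows roughly how an app triggers the system prompt; everything else, including Meta's pre-modal page, is persuasion layered on top of this one call. The wrapper function name is mine.

```swift
import AppTrackingTransparency
import AdSupport

// Ask the user for permission to track. iOS shows the system consent
// prompt only once; the text in it comes from the app's Info.plist
// key NSUserTrackingUsageDescription.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Tracking allowed: the advertising identifier is available.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted:
            // The user (or a device policy) said no; the IDFA is zeroed out.
            print("Tracking denied")
        case .notDetermined:
            // The prompt has not been shown yet.
            print("Tracking not determined")
        @unknown default:
            break
        }
    }
}
```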

So did these App Tracking Transparency modal windows help us become more conscious of how platforms sell our behavioral and taste data? Do we trust the apps we love to use more, or less?

I don't think so. Most people don't understand how applications and websites work, or the hidden mechanisms underneath them that sell behavioral information about what we read, watch, and search for. These mechanisms enable many small businesses to exist, and they also make very big businesses even bigger. Cute illustrations, icons, and carefully crafted copywriting did not solve the trust problem Meta has.

"I believe people are smart and some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of you asking them. Let them know precisely what you're going to do with their data." (Steve Jobs - All Things Digital Conference, 2010)

I also believe people are instinctively curious: they want to learn, and they like to have a sense of control over what they are doing (paraphrasing "A view of the user" from the 1987 Human Interface Guidelines). But I don't believe continuous displays, alerts, and confirmations will help regain our trust in technology. Mitigating the impact on people's lives will require strong regulatory steps and education. The change will not come from commercial companies, just as smoking regulations and prevention programs do not come from tobacco companies.

Privacy

So here is my naive, designerly view on privacy: "privacy mode" should not feel like a spy feature or something that only a cyber-security engineer can use fluently. It should be the default. Not a cookie-consent quiz on every single website. Not something that people have to pay for.

"Online privacy isn’t just something you should hope for — it’s something you should expect." (Safari - Apple, 2021)
"Privacy is a fundamental human right." (Privacy - Apple, 2021)