
UX Australia 2019: Ethics and Privacy

by richard.sison


A recurring theme in the tech industry is ethics in design. It's often a point of discussion online and was a prevalent topic at the conferences I've attended recently. There is still a lot of work to be done to make design and product teams more aware, responsible and proactive about ethical issues as technology shapes society and defines culture.

The Big Theme: Privacy

A focus throughout several of this year's talks was privacy. Aral Balkan spoke largely about surveillance capitalism: the commodification of our privacy and personal data by large tech companies.

The issue he highlighted stems from the fundamental nature of large companies whose business models revolve around money and growth, a mindset he likens to the "ideology of the cancer cell".

Aral used the example of IBM and its dirty laundry. An often forgotten saga is IBM's role in actively working with Nazi Germany: it helped develop a system which collected population data, enabling the Nazis to identify ethnic groups they deemed undesirable. The point of this story wasn't to single out IBM, but to illustrate that when a company cares more about money than anything else, its "values" don't seem to matter much; it will do anything if the price is right.

Aral went on to advocate for stricter regulations and ownership of our data because our privacy needs to be respected and protected. After all, "data about us is us"; we should be the ones who hold the keys and have control over who has access to it.

Aligning with Expectations

We live in a society where it's normal to willingly give away data because we trust that companies are being responsible with it.

Tim Kariotis demonstrated just how easily we give away personal information. My key takeaway from his talk was largely around aligning with user expectations; namely, being respectful and mindful about what is an appropriate amount of data to collect based on the context.

Tim encouraged us to question what data we actually need, and to ask our customers how they'd expect it to be used. We should use that as a basis to inform our decisions and align with our customers' expectations.

Technology and Intent

While a lot of discussion around ethics in design paints the evolution of technology in a bad light, we have to remember that technology isn't inherently bad; it's the intent behind it (or lack thereof).

Andy Polaine, in his presentation, demonstrated many examples of the creative possibilities of artificial intelligence, like "deepfakes" (a technique where machine learning generates people's faces and superimposes them onto video). The results are becoming increasingly realistic and harder to distinguish from reality; therein lie the ethical concerns. In the right hands and with the right intent, creative examples can enrich an experience without confusing the audience. In the example "Dali Lives", Salvador Dali breaks the fourth wall and interacts with the audience. It doesn't attempt to blend fact and fiction; instead, it embraces the context and makes for an immersive experience for viewers.

But it's not hard to see what this technology could do in the wrong hands. The popular YouTube channel Ctrl Shift Face shows just how convincing it can get. While examples like Tom Cruise's face on Bill Hader's head are entertaining and relatively harmless, the potential impact of this technology on more serious and sensitive subjects (such as manufactured videos of world leaders) raises significant ethical issues.

Summary

When we're talking about the evolution of technology, whether around privacy or emerging technologies, a lot of it comes down to trust and meeting expectations.

It's true that, as a society, we have to be more vigilant about what we trust on the internet. But as an industry, we play a significant role in helping bridge that gap, and we can't be complacent about that responsibility.

We need to use our knowledge to guide and educate people beyond the design and tech community. That starts with advocating for transparency, honest intent, and principles that value the needs of people over profits.