The digital advertising platforms don’t just want our data. They also want us to spend our lives online, an “addiction” created for the primary purpose of serving advertisers. Manal al-Sharif reports on the pitfalls of “digital dictatorship” in this latest of her Tech4Evil series.
According to a 2016 study, we touch our phones around 2,617 times a day, while another showed that 79 percent of phone owners check their device within 15 minutes of waking up. Our attention has been hijacked so thoroughly that one in four car accidents in the United States is believed to be caused by texting and driving.
As a result, we have shorter attention spans, take our phones everywhere, and become anxious when they’re out of sight.
According to research in the Journal of Consumer Research, targeted ads don’t just make us more likely to buy – they can change how we think about ourselves.
Former Google employee Tristan Harris compared the persuasive techniques of Big Tech apps to doses of dopamine, like a slot machine in our pockets. It’s all designed to keep us hooked and coming back for more:
Technology steers what 2 billion people are thinking and believing every day…Religions and governments don’t have that much influence … But we have three technology companies who have this system that frankly they don’t even have control over…which is governing what people do with their time and what they’re looking at.
Moreover, Big Tech trains us to underplay the importance of privacy; after all, what is there to fear if we have nothing to hide? They give us a map to get us to our destination faster, and record every place we have been. (Google doesn’t offer an option to stop storing location history; it can only be paused, and Google doesn’t say for how long the pause lasts.)
Being in a constant state of surveillance creates a complacent society, but also a populace afraid of expressing its thoughts. When our every move is monitored and analysed, the curiosity to search, learn and engage in the discussions that make for healthier societies may be jeopardised.
Facebook’s guinea pigs
Once you accept the Facebook Data Use Policy – a prerequisite to using Facebook at all – you sign up for a mass psychological experiment. The policy states that Facebook “may use the information we receive about you … for internal operations, including … research”. Facebook might not be a research institute, but it has your consent and your data, and is thus indemnified against misuse.
In 2012, Facebook ran a mass psychological experiment on 689,003 randomly selected Facebook users, who were divided into two groups. For a week, Facebook reduced the positive content in the first group’s newsfeed and the negative content in the second’s, monitoring each group’s behaviour. Two years later, the published results concluded that:
When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.
In other words, our moods and behaviours can be influenced by our online interactions, which can be controlled by whoever runs the algorithms responsible for what newsfeed we read and what ads we see.
This experiment was done without the users’ informed consent, didn’t allow them to opt out, and didn’t have the approval of an ethics board, as any other experiment involving humans would require.
Can we stop it?
Professor Shoshana Zuboff – author of “The Age of Surveillance Capitalism” – states that surveillance capitalism is “an assault on human autonomy”. What began as advertising is now a threat to freedom and democracy.
Behavioural modification is no longer aimed only at commercial ends; the ends become political. Voting instead of buying.
And it is not just advertisers who want to use our data this way. Employers, health insurance providers, law enforcement agencies, the tax department and pretty much anyone else who can pay the price for access to our profiles can do so.
Zuboff’s nightmare of manipulated democracies was realised, but only after the damage was done.
In 2019, the American Federal Trade Commission (FTC) hit Facebook with the most significant privacy-violation penalty in the commission’s history. The $US5 billion fine was issued in the wake of consulting firm Cambridge Analytica gaining access to the information of, and micro-targeting, more than 87 million Facebook users. The data was used to interfere in the 2016 US presidential election.
Cambridge Analytica used big data analysis and a process called psychometric targeting to change audience behaviour, with results that were astonishing and almost guaranteed. Cambridge Analytica’s involvement in Brexit is another well-known example of the power of this method.
The Norwegian Consumer Council recently published a report calling for a ban on surveillance-based advertising. It argues that the harms outweigh the benefits, which mostly go to a few Silicon Valley billionaires. The report states that the industry has shown little desire to abandon its questionable practices, despite repeated warnings and hefty fines. The report suggests that the last resort is a full ban.
The report also pointed out that successful alternatives to surveillance-based advertising exist. When The New York Times stopped serving surveillance-based advertising to European users, its advertising revenue kept growing as its advertising partners purchased ad space regardless of the targeting capabilities.
The business model on which this industry is built is unregulated and, as we have seen, risks eroding the fabric of democracies.
Governments, wanting to avoid obstructing innovation and technological advancement, end up giving the big tech companies the luxury of operating unregulated. Governments – such as Australia’s – are also not keeping up with the speed of the tech world.
What we need is for governments to create regulatory bodies that monitor highly manipulative, highly addictive technology the same way they regulate drugs and tobacco, and to make ethics a mandatory subject for marketing, computer science and software engineering students.
Can you protect yourself?
In the words of Yuval Noah Harari:
If you dislike the idea of living in a digital dictatorship… then the most important contribution you can make is to find ways to prevent too much data from being concentrated in too few hands… These will not be easy tasks. But achieving them may be the best safeguard of democracy
The simple answer is that you can’t fully protect yourself unless you go back to your 1997 Nokia or stop using the internet. Once you go online, nothing is private. The best you can do is minimise the harm by making informed clicks and trying to understand what you are actually consenting to on the sites you visit – an almost impossible task, especially if you are not a lawyer specialising in privacy.
But there are some other simple steps you can take to avoid online tracking:
- Use search engines that won’t save every search you make, such as DuckDuckGo.
- Remove privacy-invasive apps such as Facebook and TikTok from your phone (Apple’s App Store shows a privacy report for apps before you download them).
- Use privacy-focused browsers such as Tor, Brave and Safari.
- Reject all non-essential cookies when visiting websites.
- Create an online persona to sign up for services and mailing lists.
- Use an ad blocker and Privacy Badger while browsing.
Still, none of these ways will evade the fingerprinting technique mentioned in our previous article.
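To see why fingerprinting is so hard to evade, here is a rough sketch of the idea (the attribute names below are illustrative, not any real tracker’s code): a tracker combines ordinary, freely readable browser attributes into a single stable identifier, with no cookie stored on your device at all.

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Combine browser attributes into one stable identifier.

    The same set of attributes always produces the same hash,
    so the visitor is recognisable across visits without cookies.
    """
    # Sort keys so the order of attributes never changes the result.
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Hypothetical visitor: each attribute is innocuous on its own.
visitor = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
    "screen": "2560x1440",
    "timezone": "Australia/Sydney",
    "language": "en-AU",
    "fonts": "Arial,Helvetica,Times New Roman",
}

print(fingerprint(visitor))  # same visitor, same identifier, every visit
```

Each attribute alone reveals little, but the combination is often unique among millions of users; clearing cookies or browsing in private mode changes nothing, because nothing was saved on your device in the first place.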
In acknowledgment of the invasive nature of cookies, the EU’s General Data Protection Regulation (GDPR) classifies cookies as “online identifiers”, subject to rules that require websites to gain your consent before issuing cookies to your browser. So one way to enjoy better privacy rights is to use a VPN service and browse as if you are coming from the EU.
This will make websites think you are covered by those protections and show you the “We respect your privacy” option to opt out. (California has stronger privacy laws too.)
For now, until legislation starts to catch up with the enormous power and reach of Big Tech, the best we can all do is be aware and share that awareness with others.
Manal al-Sharif is an author, speaker, human rights activist and a regular contributor to international media. She has written for Time, The New York Times and The Washington Post. Her bestselling memoir, Daring to Drive: A Saudi Woman's Awakening, is an intimate story of her life growing up in one of the most masculine societies in the world.
Manal is a cybersecurity expert and host of the tech4evil.com podcast that discusses the intersection of technology and human rights.