
COMMENT | Ubiquitous and mysterious, algorithms are ruling our lives

COMMENT | Automated decision-making systems have crept into every corner of our lives — they impact the news we see, the songs we’re recommended, the products we buy and the way we travel.

At the heart of these systems lie algorithms: computerised sets of instructions that process data to produce controlled outcomes. Until recently, these algorithms operated with very little scrutiny.

When it comes to news, algorithms can determine what content appears at the top of your search results, what advertising is targeted at you, and what is and isn't allowed to exist on a platform through automated moderation.

Ubiquitous as they are, algorithms can also cause harm. Automated decision-making can discriminate on the basis of race, sex, age, class and more, and these systems have been exploited by individuals and groups to spread misinformation.

Many news algorithms operate in closed proprietary systems shrouded in secrecy — aptly described as “black boxes”. To best assess the potential and risks of automated decision-making in news and media, the community would need to access information about how these systems work in practice. That requires transparency.

Most of us would have encountered collaborative filtering, a prevalent content recommendation algorithm popularised by Netflix.

Collaborative filtering makes recommendations by extrapolating from shared qualities between items and/or users, nudging audiences with messages along the lines of "people similar to you enjoyed this film, so you should enjoy it too". The more preference data these systems collect, the more accurate their predictions become.
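The intuition can be sketched in a few lines of code. What follows is a minimal, hypothetical illustration of user-based collaborative filtering, not any platform's actual recommender: it uses an invented ratings matrix and cosine similarity to score the films a user has not yet seen according to how similar users rated them.

# A minimal, hypothetical sketch of user-based collaborative filtering.
# The ratings matrix is invented for illustration: rows are users,
# columns are films, and 0 means "not yet rated".
import numpy as np

ratings = np.array([
    [5, 4, 0, 1],   # user 0
    [4, 5, 0, 2],   # user 1 (tastes very close to user 0)
    [1, 0, 5, 4],   # user 2 (quite different tastes)
], dtype=float)

def cosine_similarity(a, b):
    # How alike two users' rating vectors are (1.0 means identical direction).
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

def recommend(user, ratings):
    # Weight every other user's ratings by how similar they are to this user,
    # then suggest the highest-scoring film the user has not yet rated.
    sims = np.array([cosine_similarity(ratings[user], r) for r in ratings])
    sims[user] = 0.0                        # ignore the user's own ratings
    scores = sims @ ratings / (sims.sum() + 1e-9)
    scores[ratings[user] > 0] = -np.inf     # only recommend unseen films
    return int(np.argmax(scores))

print(recommend(0, ratings))   # 2: the only film user 0 has not yet rated

In a real system the ratings matrix would have millions of users and items and be mostly empty, which is one reason more preference data tends to sharpen the predictions.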

The volume of user preference data collected by platforms such as Spotify, Facebook, and YouTube is now so vast that serendipity has been all but eliminated.

These platforms instead resemble what marketing researchers Aron Darmody and Detlev Zwick describe as “highly personalised worlds that are algorithmically emptied of irrelevant choice”.

As algorithmically-enabled social media...
