Code Dependent by Madhumita Murgia review – understanding the human impacts of AI

What is inside the box? It is the question on the minds of many of the people Madhumita Murgia has spoken to. If an algorithm’s “black box” is making consequential decisions about our health, education or human rights, it would help to know what it contains. And as long as it remains opaque, what recourse do you have if your child is unfairly flagged as a potential future criminal on the basis of insufficient or biased data, as happened to hundreds of families in the Netherlands in the late 2010s?

It’s these Kafkaesque absurdities, and how they play out on a human level, that interest Murgia, the Financial Times’s first artificial intelligence editor. Code, she reminds us several times in this troubling book, is not neutral.

This is not a book about ChatGPT and other large language models, or their potential impact on everything from the entertainment industry to school assignments. It is instead a story about the transformation wrought by the everyday algorithms that now permeate our lives: the challenges faced by the people tasked with labelling vast amounts of data, the unintended effects of the biases embedded in that data, and how the AI systems built on it benefit some of us (when you order a McDonald’s on Uber Eats, say) at the expense of others, particularly individuals and communities already marginalised (the young migrant worker paid a pittance to deliver your Big Mac).

Murgia’s reporting ranges widely, as befits her day job. She takes us from the surprisingly rudimentary methods used to train AI systems (workers in a Kenyan office labelling road signs so driverless cars can recognise them) to the ways such flaws feed through to the finished product (delivery riders paid less because the app fails to account for roadworks or an uphill cycle).

Murgia also argues that a new form of data colonialism is on the rise. Some AI workers are lifted out of poverty by subcontracted work, but the profits are not shared fairly. The work is repetitive and tightly scripted, and pay and job security vary sharply by location. The toll on mental health cannot be overlooked either: these workers are exposed to disturbing content precisely so that the rest of us are spared from viewing it. And because the industry is fragmented into small tasks, many workers have no idea what their labour is for, or who they are working for.

One Kenyan lawyer Murgia speaks to likens algorithm training to the Bangladeshi garment industry, which manufactures clothing for western fast-fashion brands; it might equally be compared to high-end designer goods, whose factory workers have no idea their handiwork sells for thousands of dollars.


There are glimmers of hope. Murgia, who previously worked for Wired, acknowledges AI’s potential to improve health outcomes. We meet people like Hiba, whose family fled Falluja in Iraq and who turned to data labour to support their new life in Bulgaria. There are heartening accounts, too, of gig economy workers who have pushed back against exploitation by quietly banding together – even in China – to regain some of the control they had surrendered to “the algorithm altar”.

Still, the prevailing mood is bleak. We are long past the techno-optimism of the early 2000s, and while government ministers may tout AI’s potential to improve healthcare and social services, countless individuals are left wondering whether it will destroy their livelihoods. Worse is already visible in the Chinese state’s use of facial recognition systems and pre-emptive detention lists in Xinjiang. The dystopia is not on its way; we are living in it.

Source: theguardian.com