Are algorithms making us W.E.I.R.D.?
Western, educated, industrialised, rich and democratic (WEIRD) norms are distorting the cultural perspective of new technologies
From shaping our internet search results to managing our investments, travel routes and love lives, algorithms have become a ubiquitous part of society. Nor are they just an online phenomenon: they are having an ever-increasing impact on the real world. Children are being born to couples matched by dating-site algorithms, whilst the navigation systems for driverless cars are poised to transform our roads.
Last month, Germany announced ethical guidelines for how driverless cars should react in unavoidable collisions. Where a collision cannot be avoided, the car will be expected to protect people rather than property or animals.
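Written down as code, the guideline is simply an explicit priority ordering, which makes its cultural assumptions easy to see. The sketch below is a deliberately simplified, hypothetical encoding in Python: the protection ranking, the names and the scoring rule are all illustrative assumptions, not any manufacturer's actual decision logic.

```python
# Hypothetical sketch of a rule set like Germany's guideline: in an
# unavoidable collision, protect people first, then animals, then property.
# All names and values are illustrative assumptions.
from dataclasses import dataclass

# Lower rank = more strongly protected under this (assumed) ordering.
PROTECTION_RANK = {"person": 0, "animal": 1, "property": 2}

@dataclass
class Obstacle:
    kind: str  # "person", "animal" or "property"

def choose_path(paths: dict[str, list[Obstacle]]) -> str:
    """Pick the manoeuvre whose worst affected obstacle is least protected.

    `paths` maps each candidate manoeuvre to the obstacles it would hit.
    """
    def severity(obstacles: list[Obstacle]) -> int:
        # Hitting nothing is the best possible outcome.
        if not obstacles:
            return len(PROTECTION_RANK)
        # A path is judged by the most protected thing it would hit.
        return min(PROTECTION_RANK[o.kind] for o in obstacles)

    # Prefer the path that harms only the least protected category.
    return max(paths, key=lambda p: severity(paths[p]))

# Example: swerving hits a parked car, braking hits a pedestrian.
paths = {
    "swerve": [Obstacle("property")],
    "brake":  [Obstacle("person")],
}
print(choose_path(paths))  # -> "swerve"
```

Swap the ranking for one that places animals above property, as some belief systems might, and the same code chooses differently: the "ethics" live entirely in that one dictionary.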
The problem is that we live in an incredibly diverse world, where different nations have their own morals and culture, so algorithms that are acceptable to one group of people may not be acceptable to another. “We do not really have a consensus on morals,” says Dr Sandra Wachter, a researcher in data ethics at the Oxford Internet Institute and a research fellow at the Alan Turing Institute. “Even regionally in Europe we greatly differ in what is important to us and what values we need to protect. Privacy is one of the major issues in Europe, but if you look how Germany perceives privacy – what they think of privacy and why it is important – that is not necessarily reflected in the UK.”
Ethics are influenced by personal, cultural and religious factors. These unique perspectives stem from the societal experiences of each country. “Even though we talk about the same thing, they are flavoured in a different colour, because we have different experiences,” says Wachter.
The ethical guidelines for driverless cars may be broadly suitable in Germany, but they would not be universally acceptable. For example, in areas of the world where cows are widely considered to be sacred, drivers would naturally avoid hitting a cow at all costs. Likewise, there are some religions, such as Jainism, that place great importance on the lives of animals.
“We often talk about US norms, but really what we are talking about are WEIRD norms that are being imposed through these things,” says researcher Matthew Blakstad, author of Lucky Ghost. WEIRD, in this case, stands for western, educated, industrialised, rich and democratic.
This highlights a wider issue: although algorithms can be deployed around the world, they invariably carry the cultural background and perspective of their programmers. Wachter says problems occur when these technologies are developed by people from a single demographic, because the results will lack the personal, cultural and religious nuances needed to reflect a global population. “It will be very one-sided, when it needs to be culturally diverse,” she warns.
The cultural perceptions of coders can subconsciously influence the decisions they make during development. Choices about the rule sets, and about the data sets used to train an algorithm's machine-learning components, inevitably shape its behaviour. “The data they collect and the way they test is combined to be inward looking, so it works for them and their friends,” says mathematician Cathy O’Neil, author of Weapons of Math Destruction.
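To see how an inward-looking data set translates into lopsided behaviour, here is a minimal sketch in plain Python. The task, the feature values and the threshold rule are invented for illustration; it stands in for any model fitted only on data from one group.

```python
# A minimal sketch of how an unrepresentative training set skews behaviour:
# a one-feature threshold classifier fitted only on data from "group A".
# All numbers are illustrative assumptions.

def fit_threshold(positives: list[float], negatives: list[float]) -> float:
    """Place the decision boundary midway between the two class means."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(positives) + mean(negatives)) / 2

# Training data collected only from group A (inward-looking sampling).
group_a_pos = [0.80, 0.85, 0.90]   # feature values for the positive class
group_a_neg = [0.20, 0.25, 0.30]
threshold = fit_threshold(group_a_pos, group_a_neg)  # 0.55

classify = lambda x: x > threshold

# For group A, the classifier "works for them and their friends".
print([classify(x) for x in group_a_pos])   # [True, True, True]

# Group B's positive cases sit in a range the model never saw in training.
group_b_pos = [0.40, 0.45, 0.50]
print([classify(x) for x in group_b_pos])   # [False, False, False]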
A prime example is the video game developer that inadvertently created a racist game: its motion-tracking algorithm could not detect darker skin tones, because the camera relied on reflected light to register movement. That this went unnoticed throughout development and testing points to an unfortunate bias in who was building and testing the product.
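The mechanics of that failure are easy to reproduce in miniature. The following sketch assumes a naive frame-differencing detector with a fixed brightness threshold; the reflectance figures and the threshold are assumptions made for illustration, not the actual algorithm the game used.

```python
# A hedged sketch of why reflectance matters to a camera-based motion
# detector: frame differencing with a fixed brightness threshold.
# Reflectance values and threshold are illustrative assumptions.

MOTION_THRESHOLD = 25  # minimum pixel-brightness change (0-255) counted as motion

def detect_motion(prev_frame: list[int], next_frame: list[int]) -> bool:
    """Report motion if any pixel's brightness changes enough between frames."""
    return any(abs(a - b) > MOTION_THRESHOLD
               for a, b in zip(prev_frame, next_frame))

def render(positions: set[int], reflectance: float, width: int = 8) -> list[int]:
    """Toy 'camera': the subject reflects a fraction of full brightness (255)."""
    return [int(255 * reflectance) if i in positions else 0 for i in range(width)]

# A highly reflective subject moving one pixel to the right: easily detected.
light = (render({2, 3}, reflectance=0.6), render({3, 4}, reflectance=0.6))
print(detect_motion(*light))   # True  (brightness swing of 153)

# The same movement at low reflectance falls under the fixed threshold.
dark = (render({2, 3}, reflectance=0.08), render({3, 4}, reflectance=0.08))
print(detect_motion(*dark))    # False (brightness swing of just 20)
```

A threshold tuned only on highly reflective test subjects simply never fires for low-reflectance ones, which is exactly the kind of flaw a more diverse testing pool would have exposed immediately.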
Snapchat, meanwhile, courted controversy last year when it deployed its Coachella festival filter. Designed to create a ‘summer festival vibe’ with a garland of flowers, the filter also automatically lightened skin tones, making it the first ‘beautification’ filter to do so. And unlike other photo-sharing platforms, Snapchat applied this predefined effect without the user’s specific consent. It means somebody, somewhere, assumed that lightening an individual’s skin tone would make them more appealing.