Algorithms and Blues: Decision-Making Is Being Blindly Outsourced
They have benign, if exotic, nicknames: Penguin, Panda, Hummingbird. These are the monikers given to computer algorithms that dictate what results we get when we enter text into Google’s search engine.
We are increasingly subject to algorithms in our online lives, such as when we browse Amazon’s book recommendations or ask Apple’s Siri voice assistant what the weather is going to be like today.
Recently, I bought a T-shirt that appeared in an advert in my Facebook newsfeed. It had a quote on its front from The Martian, one of my favourite sci-fi movies of recent years: “Let’s science the shit out of it!”
The line summed up a stranded astronaut’s approach to getting off Mars and I’d been thinking about how great it would look on a T-shirt when Facebook’s algorithm served it up to me: it knew that I’d “liked” the page of Andy Weir, who wrote the book, and that I’d discussed a review of the movie with my Facebook friends. Click and purchase: $24.95.
Millions of these transactions take place every day, as algorithms offer us the things they calculate we want and take the effort out of information transactions.
But, as internet expert Michele Willson argues in a new research paper, “Algorithms and the Everyday”, published recently in the journal Information, Communication and Society, we are blindly outsourcing decision-making to algorithms more and more.
Nowhere is that more evident than in the profusion of devices such as Fitbits and smartwatches that track our footsteps, heart rate and sleeping patterns, crunching the data to suggest ways of improving our quantified selves. “The process from biology (heart) and practice (walking) to data becomes unquestioned, normalised and invisible,” Willson writes.
In the next decade, algorithms will power the development of the so-called internet of things, including driverless cars and robots. Elon Musk, a founder of electric-car maker Tesla, claims that millions of lives will be saved when algorithms, rather than drivers, control cars.
But algorithms are designed by humans and so they are subject to biases, flaws of logic and the social, political and commercial priorities of their designers. This was brought into stark relief recently with reports that in parts of the US, software is being used to predict whether people will commit crimes.
The software collects several data points about a person and calculates a risk score. Judges then use that score to inform sentencing decisions for people convicted of crimes. But a ProPublica investigation revealed that the risk-scoring algorithms were racially biased: an analysis of 7,000 risk scores of people in Broward County, Florida, tested their predictive accuracy and found the algorithm was wrong 40% of the time and tended to assign higher risk scores to black defendants than to white defendants.
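A loose sketch of what such an audit involves (the records and field names below are hypothetical stand-ins, not ProPublica's data): compare each predicted risk level against the actual outcome, broken down by group.

```python
from collections import defaultdict

# Hypothetical records: (group, predicted_high_risk, reoffended).
# Illustrative stand-in data only; not ProPublica's dataset.
records = [
    ("black", True, False), ("black", True, True),
    ("white", False, True), ("white", False, False),
    ("black", True, False), ("white", True, True),
]

stats = defaultdict(lambda: {"n": 0, "wrong": 0, "high": 0})
for group, predicted_high, reoffended in records:
    s = stats[group]
    s["n"] += 1
    s["high"] += predicted_high            # how often this group is flagged
    s["wrong"] += (predicted_high != reoffended)  # misclassifications

for group, s in stats.items():
    print(f"{group}: error rate {s['wrong']/s['n']:.0%}, "
          f"flagged high-risk {s['high']/s['n']:.0%}")
```

A real audit would use thousands of records and distinguish false positives from false negatives, but the shape of the check is the same.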
Most algorithms are proprietary in nature – Google is not going to give away the magic code that makes its search engine so effective – so they are a black box to most of us. They are, writes Willson, “opaque and inaccessible to outside critique … their parameters, intent and assumptions indiscernible. And yet the working of algorithms has wide-ranging consequences for the shape and direction of our everyday.” That will have to change if algorithms are to decide access to state houses or priorities for health programmes.
As is often the case when it comes to moderating the effects of new technology, the Europeans are leading the way. The European Union has adopted a requirement that decisions based solely on automated processing, where they have a significant impact on people, must be clearly explained. From May 2018, EU citizens will have the right to an explanation of automated decision-making and to challenge such decisions.
Some type of algorithm auditing, perhaps overseen by our Privacy Commissioner, will eventually be needed here. It’s possible to opt out of Facebook, even Google. But it’s a different matter when your government is run by algorithms.
Source: New Zealand Listener, September 3-9, 2016
http://www.pressreader.com/new-zealand/new-zealand-listener/20160826/281569470149024
Not citing sources is plagiarism, and copy-pasting articles without permission is copyright infringement. If you want to share a news story, simply link to the source and include your original commentary, possibly with small quotes from the source.
Copy-pasting is discouraged by the community and may result in action from the cheetah bot.
Creative Commons: If you are reposting under a Creative Commons license, please attribute and link according to the specific license. If you are reposting under CC0 please consider noting that at the end of your post.
If you are actually the original author, please do reply to let us know!
Thank You! ☙
Entire account is plagiarized content.
!cheetah ban
Okay, I have banned @mione.
Discussed in chat.
!cheetah unban
Okay, I have unbanned @mione.
It's a moot point if you're talking about such systems integrating into a platform like Steem. You could train the system yourself, run multiple algorithms, and tweak their parameters. The problem you describe comes from these systems being untrustworthy and opaque. It's a powerful tool, but it should be fully in the hands of the user. You could even share decision-algorithm parameters (neural-net weights and the like), so you could see how other people's systems are configured and even integrate them into your own. A peer-to-peer sharing system could be built for this: not the data being sorted, but the configuration each user sets to scrape their own data and surface resources. You could add and weight the configurations of the people you follow, for example, or even automatically query another person's configuration and get the result they would get (or compile results from several configurations, weighted by confirmation feedback). A rough sketch of the idea is below.
It could be bigger and more important than Google, and competitive and diverse. This is much the sort of thing discussed in Charlie Stross's book Accelerando. Such a system can learn to anticipate your movements, adapt to your moods, and act as a filter that helps you locate information more quickly and spend less time looking for it.
But we must control it, and teach it, or it's a potential psywar weapon.
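To make the idea concrete, here's a minimal sketch in Python of peer-weighted filtering. Everything here (UserConfig, the term-weight format, the trust scores) is a hypothetical toy invented for illustration, not any existing Steem API:

```python
# Minimal sketch of the peer-weighted filtering idea above.
from dataclasses import dataclass, field

@dataclass
class UserConfig:
    """A user's shareable filter configuration: term -> weight."""
    term_weights: dict[str, float] = field(default_factory=dict)

    def score(self, text: str) -> float:
        # Score a piece of content by summing the weights of matching terms.
        words = text.lower().split()
        return sum(w for term, w in self.term_weights.items() if term in words)

def peer_weighted_score(text: str, own: UserConfig,
                        peers: list[tuple[UserConfig, float]]) -> float:
    """Blend your own config with followed users' configs,
    each scaled by how much you trust them."""
    total = own.score(text)
    for config, trust in peers:
        total += trust * config.score(text)
    return total

# Example: rank two posts using your config plus one trusted peer's.
me = UserConfig({"algorithms": 2.0, "privacy": 1.5})
friend = UserConfig({"steem": 1.0, "algorithms": 0.5})
posts = ["New privacy rules for algorithms", "Steem price update"]
ranked = sorted(posts,
                key=lambda p: peer_weighted_score(p, me, [(friend, 0.7)]),
                reverse=True)
print(ranked)
```

The point is that the ranking logic and all its parameters stay inspectable and user-owned, rather than locked inside a proprietary black box.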
I'm waiting for the day when they start wanting to plant the Fitbit into our skulls. Come on, now you won't have to wear that thing on your wrist all the time!