Hidden bias in technology, design, and human nature
From our affinity for “people like us” to algorithms for personalization, the ways we use and design technology tend to amplify our conscious and unconscious biases, reinforcing self-segregating filter bubbles and fueling increasingly destructive polarization throughout society. This isn’t just an explanation for why other people do things we don’t understand; it’s also why we have so much trouble truly understanding other people. If we don’t share context, it’s almost impossible to cultivate shared meaning.
Technology can serve a higher purpose. It has the potential to help us see our biases, understand more about how and why they develop, and perhaps even choose how to counter bias’s negative influence on our own and others’ actions. This is, at its core, a design question. But first we have to recognize that the intrinsic design flaw is, and perhaps always will be, human nature.
In a networked world, people connect to people like themselves. What flows across the network flows through edges of similarity … Prejudice, intolerance, bigotry, and power are all baked into our networks … Information can and does flow in ways that create and reinforce social divides.
Streams of content, limited attention: The flow of information through social media | danah boyd
As web companies strive to tailor their services (including news and search results) to our personal tastes, there’s a dangerous unintended consequence: We get trapped in a “filter bubble” and don’t get exposed to information that could challenge or broaden our worldview. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy.
Beware online “filter bubbles” | Eli Pariser
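The feedback loop Pariser describes can be made concrete with a toy model. The sketch below is purely illustrative (the catalog, topics, and scoring function are all hypothetical, not any real recommender): items similar to a user’s past clicks are ranked higher, the user clicks what the feed surfaces, and within one round the feed collapses to a single viewpoint.

```python
# Minimal sketch (hypothetical data and scoring) of a personalization loop
# narrowing exposure: similarity to click history drives ranking, ranking
# drives clicks, and the feed drifts toward one topic.
from collections import Counter

ITEMS = [  # (item_id, topic) -- toy catalog with two viewpoints
    ("a1", "left"), ("a2", "left"), ("a3", "left"),
    ("b1", "right"), ("b2", "right"), ("b3", "right"),
]

def rank(items, history):
    """Score each item by how often its topic appears in the click history."""
    topic_counts = Counter(topic for _, topic in history)
    return sorted(items, key=lambda item: topic_counts[item[1]], reverse=True)

history = [("a1", "left")]        # one initial click on a "left" item
feed = rank(ITEMS, history)[:3]   # top-3 "personalized" feed
history.extend(feed)              # the user clicks what the feed surfaces
feed = rank(ITEMS, history)[:3]   # re-rank with the reinforced history

topics = {topic for _, topic in feed}
print(topics)  # after one feedback round, only one topic survives
```

Nothing here is malicious: each step is a reasonable-looking optimization for engagement, which is exactly why the resulting bubble is an unintended consequence rather than a deliberate design choice.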