Extremism and fake news: The dark side of too much information
Regardless of where one sits on the political or ideological spectrum, we can probably all agree that the world seems more polarized than it has ever been. An intriguing new study from Thomas Hills, at the University of Warwick, suggests that the recent acceleration in the prominence of fringe extremism and misinformation around the world is partly due to the mass proliferation of information we have at our fingertips and our inability to effectively process that influx of information.
Over the past decade, digital technologies have allowed billions of people virtually instant access to more information than we can imagine. In some ways this has resulted in miraculous opportunities, democratizing knowledge in a way that humanity has never before experienced. But the dark side of this technological advance is that it highlights a fundamental problem with how humans process information.
When we are faced with information-rich environments, we naturally have to filter out certain things. Decades of research have produced many models explaining how our finite "attention resources" are allocated in real-world scenarios, but a new article from Professor Thomas Hills, of the University of Warwick's Department of Psychology, suggests we are now facing "attentional bottlenecks" due to the torrent of information we have to wade through thanks to the influx of digital technologies.
The crux of Hills' thesis is that when faced with an overwhelming volume of information, we fundamentally lean on a series of biases in order to wade through the torrent of data. Hills refers to this process as "cognitive selection," suggesting it drives the evolution of information in much the same way natural selection drives biological evolution. He identifies four primary forces driving cognitive selection, each of which amplifies the most harmful effects of information proliferation.
The first force is a bias towards negative information. Hills suggests this increased sensitivity to negative information is an evolutionary tool that probably served us quite well in the past, making our prehistoric selves a little more cautious when weighing up the pros and cons of a given scenario. In the 21st century, though, this tendency often causes us to amplify risks to disproportionate levels at the expense of more balanced information. Bad news travels fast indeed.
The second factor driving cognitive selection is the commonly known idea of confirmation bias. This process pushes us to filter out information that is inconsistent with our current beliefs. The tendency for humans to favor belief-consistent information is amplified through modern information technologies, according to Hills. Social media allows us to more readily organize ourselves into like-minded groups, and the algorithms dominating these systems work to amplify exposure to agreeable information while blocking out that which is disagreeable.
The natural human preference for relying on a general social consensus is the third factor at play in cognitive selection. Essentially, this process describes our tendency toward herd-like agreement; it evolved as a useful way to survive unfamiliar environments by imitating the behavior of others. But in the 21st-century world of social media, this process results in people choosing suboptimal solutions to problems without individually evaluating masses of information.
The final force Hills suggests drives cognitive selection is our human predilection for pattern recognition. Simply put, the more information we have at our disposal, the more likely we are to lock into spurious patterns and correlations that suit our pre-defined purposes.
Furthering the analogy to natural selection, Hills suggests information that can best exploit these cognitive selection processes will inevitably be more successful in spreading and influencing a collective. All of this somewhat helps explain many of the extreme divisions apparent in the world today, from fake news to the mainstreaming of fringe conspiratorial beliefs.
"There are well-understood psychological limits on our capacity to process information," says Hills. "The unfortunate reality is that these limits are forcing us down an evolutionary relationship with information that is losing sight of our best interests. People didn't evolve in an information environment anything like the one we currently experience. And the evidence suggests that things are rapidly moving beyond our control."
Hills doesn't conclude that we are entirely doomed, but he offers no immediate solutions. Of course, the first step to overcoming the negatives associated with an information-rich world is to understand why humans lean into certain ideas over others.
The new study was published in the journal Perspectives on Psychological Science.
Source: University of Warwick
I think Thomas Hills from Warwick Uni is trying to statically fix what is in effect a dynamic cause and effect loop where these elements become constantly interchangeable.
I do not think, from what I have read here, that Hills demonstrates that the increasingly extreme bifurcations of opinion he notes are a function of mass psychological reaction to information flow and density per se.
A much simpler explanation is that the post-WW2 democratic consensus, which was underpinned by American power, together with an indulgence-based economic and social system, driven by a deregulatory/privatization agenda, run by a social/economic libertarian ascendency, has run its course, reached its use-by date, and is presenting us all with a massive damage bill for wrecked existential, social, economic and ecological governance and infrastructure.
And that means everyone is scrambling for higher ground from which to prosecute the inevitable wars of succession and toleration that always succeed the end of an ancien (old) regime.
And like a lot of social psychology, Hills' work is as much driven by social ideology as science. And it will get traction precisely because it depoliticizes/psychologizes the problem and shifts blame away from the real causes of the phenomenon it studies: the politics of decline and regime disintegration.