via Rensselaer Polytechnic Institute
Computer scientists propose systemic changes in automatic content curation
As the volume of available information expands, the fraction any one person can absorb shrinks. People end up retreating into a narrow slice of thought, becoming more vulnerable to misinformation, and polarizing into isolated enclaves of competing opinions. To break this cycle, computer scientists say we need new algorithms that prioritize a broader view over catering to consumer biases.
“This is a call to arms,” said Boleslaw Szymanski, a professor of computer science at Rensselaer Polytechnic Institute. “Informed citizens are the foundation of democracy, but the interest of big companies, who supply that information, is to sell us a product. The way they do that on the internet is to repeat what we showed interest in. They’re not interested in a reader’s growth; they’re interested in the reader’s continued attention.”
Szymanski and colleagues at the University of Illinois at Urbana-Champaign, the University of California, Los Angeles, and the University of California, San Diego, explore this troubling "paradox of information access" in a paper published on arXiv.org.
“You would think that enabling everybody to be an author would be a blessing,” said Szymanski, an expert in social and cognitive networks, with previous work that includes findings on the power of a committed minority to sway outcomes. “But the attention span of human beings is not prepared for hundreds of millions of authors. We don’t know what to read, and since we cannot select everything, we simply go back to the familiar, to works that represent our own beliefs.”
Nor is the effect entirely unprecedented, said Tarek Abdelzaher, a professor at the University of Illinois at Urbana-Champaign and lead on the project there.
“It’s not the first time that affordances of connectivity and increased access have led to polarization,” said Abdelzaher. “When the U.S. interstate freeway system was built, urban socioeconomic polarization increased. Connectivity allowed people to self-segregate into more homogenous sprawling neighborhoods. The big question this project answers is: how to undo the polarizing effects of creating the information super-highway?”
The effect is exacerbated when our own human limitations are combined with information curation systems that maximize "clicks."
To disrupt this cycle, the authors contend that the algorithms that provide a daily individualized menu of information must be changed from systems that merely “give consumers more of what these consumers express interest in.”
The authors propose adapting a technique long used in conveying history, which is to provide a tighter summation for events further back from the present day. They call this model for content curation “a scalable presentation of knowledge.” Algorithms would shift from “extractive summarization,” which gives us more of what we consumed in the past, to “abstractive summarization,” which increases the proportion of available thought we can digest.
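The idea of a "scalable presentation of knowledge" can be sketched as a budget-allocation rule: a fixed reading budget is split across topics so that older material receives a tighter summary. This is an illustration of the general principle only, not the authors' algorithm; the `1 / (1 + age)` decay weight is an assumption chosen for simplicity.

```python
# Illustrative sketch (not the authors' algorithm): split a fixed word
# budget across topics so that more distant knowledge is covered in
# much less space, in the spirit of "scalable presentation of knowledge".

def allocate_budget(items, total_words=1000):
    """Allocate a word budget across (topic, age_in_years) pairs.

    The weight decays as 1 / (1 + age), so recent material gets a
    fuller treatment while older material gets a tighter summation.
    """
    weights = [1.0 / (1.0 + age) for _, age in items]
    scale = total_words / sum(weights)
    return {topic: round(w * scale) for (topic, _), w in zip(items, weights)}

if __name__ == "__main__":
    items = [("this week", 0.0), ("last year", 1.0), ("a decade ago", 10.0)]
    print(allocate_budget(items))
```

Any decreasing weight function would serve; the point is only that the allocation compresses, rather than discards, material from further back.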
"As long as you balance content, you can cover more distant knowledge in much less space," said Szymanski, who is also the director of a Network Science and Technology Center at Rensselaer. "Although readers have a finite attention span, they still gain a slight knowledge of new areas, and then they can choose to shift their attention in a new direction or stay the course."
Few analytical models exist to measure the trend toward what the authors call “ideological fragmentation in an age of democratized global access.” But one, which the authors considered, treats individuals as “particles in a belief space” — almost like a fluid — and measures their changing positions based on the change in content they share over time. The model “confirms the emergence of polarization with increased information overload.”
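The flavor of such a belief-space model can be conveyed with a toy bounded-confidence simulation. This is our illustration under stated assumptions, not the model the authors considered: agents are particles on a one-dimensional belief axis, and a pair of agents drifts toward each other only when their beliefs already lie within a "familiarity" radius. Shrinking that radius, as limited attention does under information overload, freezes the population into separated clusters.

```python
import random

def simulate(n_agents=50, radius=0.25, steps=50000, mu=0.3, seed=1):
    """Bounded-confidence toy model: agents as particles in [-1, 1].

    Each step, a random pair interacts; if their beliefs are within
    `radius` of each other, both drift toward the other by fraction `mu`.
    A small radius (narrow attention) leaves multiple isolated clusters;
    a large radius lets the population converge to consensus.
    """
    rng = random.Random(seed)
    beliefs = [rng.uniform(-1, 1) for _ in range(n_agents)]
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i != j and abs(beliefs[i] - beliefs[j]) <= radius:
            shift = mu * (beliefs[j] - beliefs[i])
            beliefs[i] += shift   # i moves toward j
            beliefs[j] -= shift   # j moves toward i
    return beliefs

def n_clusters(beliefs, gap=0.1):
    """Count groups separated by more than `gap` on the belief axis."""
    s = sorted(beliefs)
    return 1 + sum(1 for a, b in zip(s, s[1:]) if b - a > gap)
```

Running `simulate` with a wide radius typically yields a single consensus cluster, while a narrow radius leaves several, mirroring the emergence of polarization as each agent engages with an ever-smaller slice of the belief space.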
The more ideologically isolated and polarized we are, the more we are vulnerable to disinformation tailored to reinforce our own biases. Szymanski and his colleagues offer a slew of technical solutions to reduce misinformation, including better data provenance and algorithms that detect misinformation, such as internal consistency reasoning, background consistency reasoning, and intra-element consistency reasoning tools.
"The very sad development discussed in this paper is that today, people are not conversing with each other. We are living in our own universe created by the data coming from these summarization systems, data that confirms our innate biases," Szymanski said. "This is a big issue which we face as a democracy, and I think we have a duty to address it for the good of society."
Szymanski and his co-authors are working on mathematical models that both measure the extent of polarization in various media and predict how trends would change under various mitigation strategies.