YouTube showing election-fraud videos to users skeptical about 2020 US polls

By IANS | Updated: September 2, 2022 11:40 IST


New York, Sep 2 YouTube is showing more election-fraud videos to users already skeptical about the legitimacy of the 2020 US presidential election, a study has revealed, showing how its algorithms perpetuate existing misperceptions.

The study, published in the Journal of Online Trust and Safety, found that participants most skeptical of the election's legitimacy were shown three times as many election-fraud-related videos as the least skeptical participants (roughly eight additional recommendations out of approximately 400 videos suggested to each study participant).

The findings expose the consequences of a recommendation system that provides users with the content they want.

"For those most concerned about possible election fraud, showing them related content provided a mechanism by which misinformation, disinformation, and conspiracies can find their way to those most likely to believe them," said the authors of the study.

Importantly, these patterns reflect the independent influence of the algorithm on what real users are shown while using the platform.

"Our findings uncover the detrimental consequences of recommendation algorithms and cast doubt on the view that online information environments are solely determined by user choice," said James Bisbee, who led the study at New York University's Center for Social Media and Politics (CSMaP).

Nearly two years after the 2020 presidential election, large numbers of Americans, particularly Republicans, don't believe in the legitimacy of the outcome.

"Roughly 70 per cent of Republicans don't see Biden as the legitimate winner," despite "multiple recounts and audits that confirmed Joe Biden's win," the Poynter Institute said earlier this year.

While it's well-known that social media platforms, such as YouTube, direct content to users based on their search preferences, the consequences of this dynamic may not be fully realised.

"Many believe that automated recommendation algorithms have little influence on online 'echo chambers' in which users only see content that reaffirms their preexisting views," said Bisbee, now an assistant professor at Vanderbilt University.

"This highlights the need for further investigation into how opaque recommendation algorithms operate on an issue-by-issue basis," said Bisbee.

Disclaimer: This post has been auto-published from an agency feed without any modifications to the text and has not been reviewed by an editor.

