In the fight against online misinformation, falsehoods have key advantages: They crop up fast and spread at the speed of electrons, and there is a lag period before fact checkers can debunk them.

So researchers at Google, the University of Cambridge and the University of Bristol tested a different approach that tries to undermine misinformation before people see it.

The researchers found that psychologically “inoculating” internet users against lies and conspiracy theories - by pre-emptively showing them videos about the tactics behind misinformation - made people more skeptical of falsehoods afterward, according to an academic paper published in the journal Science Advances on Wednesday.

But effective educational tools still may not be enough to reach people with hardened political beliefs, the researchers found.

Since Russia spread disinformation on Facebook during the 2016 election, major technology companies have struggled to balance concerns about censorship with fighting online lies and conspiracy theories. Despite an array of attempts by the companies to address the problem, it is still largely up to users to differentiate between fact and fiction.

The strategies and tools being deployed during the midterm vote in the United States this year by Facebook, TikTok and other companies often resemble tactics developed to deal with misinformation in past elections: partnerships with fact-checking groups, warning labels, portals with vetted explainers, and post removal and user bans.

Social media platforms have made attempts to pre-bunk before, though those efforts have done little to slow the spread of false information. Most have also not been as detailed - or as entertaining - as the videos used in the studies by the researchers.

Twitter said this month that it would try to “enable healthy civic conversation” during the midterm elections in part by reviving pop-up warnings, which it used during the 2020 election. Warnings, written in multiple languages, will appear as prompts placed atop users’ feeds and in searches for certain topics.

The new paper details seven experiments with almost 30,000 total participants. The researchers bought YouTube ad space to show users in the United States 90-second animated videos aiming to teach them about propaganda tropes and manipulation techniques. A million adults watched one of the ads for 30 seconds or longer.

“This is one of the few misinformation interventions that I’ve seen at least that has worked not just across the conspiratorial spectrum but across the political spectrum,” Ms. Goldberg said.

Jigsaw will start a pre-bunking ad campaign on YouTube, Facebook, Twitter and TikTok at the end of August for users in Poland, Slovakia and the Czech Republic, meant to head off fear-mongering about Ukrainian refugees who entered those countries after Russia invaded Ukraine. It will be done in concert with local fact checkers, academics and disinformation experts.

The researchers don’t have plans for similar pre-bunking videos ahead of the midterm elections in the United States, but they are hoping other tech companies and civic groups will use their research as a template for addressing misinformation.

However, pre-bunking is not a silver bullet. The tactic was not effective on people with extreme views, such as white supremacists, Ms. Goldberg said. She added that elections were tricky to pre-bunk because people had such entrenched beliefs. The effects of pre-bunking last for only between a few days and a month.