Facebook Adds Tools To Combat Misinformation In Groups

Facebook said Wednesday it's adding new tools that could make it easier to combat the spread of misinformation in Groups. 

Facebook Groups, which can be public or private, are online spaces where people can chat about various topics including hiking, parenting and cooking. But people have also used Groups to spread misinformation about the coronavirus, elections and vaccines. False claims and propaganda remain a big problem on Facebook, especially after Russia's invasion of Ukraine. In some cases, people have used old footage or photoshopped images to misrepresent what's happening in the conflict.

Facebook will let administrators who manage Groups automatically decline any posts that have been rated false by fact-checkers. (Image: Facebook)

One new feature will allow administrators who run Facebook Groups to automatically decline any incoming posts that have been rated false by the company's third-party fact-checkers. The social network said that will help reduce how many people see misinformation. 

Facebook didn't say whether posts typically get fact-checked before they're shared in a Group. A company spokeswoman said the social network is also working on a way for administrators to remove posts that are flagged as containing false claims after they've been posted to a Group.

Facebook partners with more than 80 fact-checking organizations, such as PolitiFact, Reuters and The Associated Press, to help identify false claims. Users who try to share a post that's been rated false see a warning that it contains false information, but they can still share the content if they want. Facebook doesn't share data about how much content gets fact-checked on its platform.

The release of the new tools shows how Facebook is trying to ramp up efforts to combat misinformation. There have been questions, though, about how well labeling misinformation on social media works. In 2020, a study by MIT found that labeling false news could lead users to believe unlabeled stories even if those stories contained misinformation. The MIT researchers call this consequence the "implied truth effect." Facebook said that more than 95% of the time when people see a fact-checking label, they don't end up viewing the original content.

The social network also announced other features meant to make it easier for administrators to manage and share Groups. Administrators, for example, will be able to send invites via email and share QR codes that direct people to a Group's About page, where they can learn about the community and join. More than 1.8 billion people use Facebook Groups every month.

Social media sites have also been used to spread scams, so users should be wary about clicking on links or scanning QR codes they don't recognize. Facebook said the QR codes for Groups include the social network's logo.

