Just as the FDA officially approved Pfizer's COVID-19 vaccine for kids between the ages of 5 and 11, Meta, Facebook's new corporate identity, announced that it's rolling out stricter policies for vaccine misinformation targeted at children (via Engadget). The platform previously put restrictions on COVID-19 vaccine misinformation in late 2020, but didn't have policies specific to children.
Meta says in a new blog post that it's partnering with the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO) to take down harmful content related to children and the COVID-19 vaccine. This includes any posts that imply the COVID-19 vaccine is unsafe, untested, or ineffective for children. Additionally, Meta will show in-feed reminders in English and Spanish that the vaccine has been approved for kids, and will also provide information about where it's available.
Meta notes that it's taken down a total of 20 million pieces of COVID-19 and vaccine misinformation from both Facebook and Instagram since the beginning of the pandemic. These numbers are at odds with what we've seen in the leaked internal documents from Facebook: the Facebook Papers made it clear just how unprepared the platform was for misinformation related to the COVID-19 vaccine. If Facebook had been more prepared, it could have rolled out campaigns to combat misinformation earlier in the pandemic, both for children and adults, possibly removing more false content as a result.