Pizza slices, cupcakes, and carrots are just a few of the emojis that anti-vaccine activists use to speak in code and continue spreading COVID-19 misinformation on Facebook. Ars Technica reports: Bloomberg reported that Facebook moderators have failed to remove posts shared in anti-vaccine groups and on pages that would ordinarily be considered violating content, if not for the code-speak. One group that Bloomberg reviewed, called “Died Suddenly,” is a meeting ground for anti-vaccine activists supposedly mourning loved ones who died after getting vaccinated — which they refer to as having “eaten the cake.” Facebook owner Meta told Bloomberg that “it’s removed more than 27 million pieces of content for violating its COVID-19 misinformation policy, an ongoing process,” but declined to tell Ars whether posts relying on emojis and code-speak were considered in violation of the policy.
According to Facebook community standards, the company says it will “remove misinformation during public health emergencies,” like the pandemic, “when public health authorities conclude that the information is false and likely to directly contribute to the risk of imminent physical harm.” Pages or groups risk being removed if they violate Facebook’s rules or if they “instruct or encourage users to employ code words when discussing vaccines or COVID-19 to evade our detection.” However, the policy remains vague regarding the everyday use of emojis and code words. The only policy Facebook seems to have on the books that directly addresses the improper use of emojis as coded language falls under its community standards on sexual solicitation. It seems that while anti-vaccine users can expect their emoji-speak to remain unmoderated, anyone using “contextually specific and commonly sexual emojis or emoji strings” does risk having posts removed if moderators determine they are using emojis to ask for or offer sex.
In total, Bloomberg reviewed six anti-vaccine groups created in the past year in which Facebook users employ emojis like peaches and apples to suggest that people they know have been harmed by vaccines. Meta’s apparent failure to moderate this emoji-speak suggests that blocking coded language is not currently a priority. Last year, when the BBC discovered that anti-vaccine groups were using carrots to mask COVID-19 vaccine misinformation, Meta immediately took down the groups identified. However, the BBC reported that the same groups soon popped back up, and more recently, Bloomberg reported that some of the groups it tracked seemed to change names frequently, possibly to avoid detection.
Read more of this story at Slashdot.