Graphic images from a Texas mass shooting on Saturday that killed nine people (including the gunman) and wounded seven are still circulating on Twitter after spreading virally all weekend. Critics told The New York Times that, unlike other platforms, Twitter isn't doing enough to remove or label these "unusually graphic" images, particularly footage in which the dead bodies of some victims, including a young child, appear to be identifiable, Reuters reported.
Family members do "not deserve to see the dead relatives spread across Twitter for everybody to see," photojournalist Pat Holloway told the Times. Over the weekend, Holloway joined others in tweeting directly at Twitter CEO Elon Musk, urging the platform to step up its content moderation.
Twitter’s policy on sharing content after a violent attack acknowledges that “exposure to these materials may also cause harm to those that view them.” That policy is primarily focused on banning the distribution of content created by perpetrators of attacks, but it also places restrictions on “bystander-generated content” depicting “dead bodies” or “content that identifies victims.”