
Are Social Media Companies Removing War Crime Evidence?

Busy Food Market in Syria: Over 30 People Killed

By Shadrack Kalama · Published 11 months ago · 3 min read
Photo by Daniel Stuben. on Unsplash

In Syria, a bustling food market became the scene of an attack in which more than 30 people lost their lives. In Ukraine, disturbing videos and images posted online showed civilians being shot and burned by Russian soldiers, footage that could serve as evidence of war crimes. Yet search the major social media platforms for this material today and you will find nothing.

These graphic images are being removed for violating rules on explicit content. In the process, crucial evidence for prosecuting war criminals may be permanently lost. Whether or not the platforms keep such content public, it must not be allowed to disappear altogether: however distressing these videos and images are, they matter, because the world needs to know about the atrocities taking place in foreign lands.

Imad, a pharmacist from Aleppo, Syria, recalls the horrifying moment a barrel bomb dropped by the Syrian government struck the local market where he was working, killing his friends and customers. Local television stations documented the attack, and he had seen footage of it online. Years later, while seeking asylum in the EU, Imad searched for the video again, only to find it had been removed by Facebook, Twitter, and YouTube for violating their graphic-content guidelines. In the chaos of a war-torn country the original footage had also been lost, so images that could have served as evidence of potential war crimes are now missing forever.

Here lies the predicament. Social media companies have been criticized in the past for making distressing content too easy to find, and they now remove such material far more aggressively. But by deleting it, they risk irretrievably losing evidence of crimes and abuse. Stung by that earlier scrutiny, the companies err on the side of caution, and the result is an overzealous approach to removal. Most takedowns are carried out by artificial intelligence systems that automatically detect and flag violence, pornography, or child abuse.

To understand the process better, we approached Hive, a company that provides AI moderation software to platforms such as Reddit. Hive gave us access to its system, and we uploaded test videos. The software looks for cues such as blood, corpses, or nudity and assigns each video a score between zero and one; the higher the score, the more confident the model is that it has correctly identified objectionable content. It detected the dead bodies in our footage and returned scores above 0.9, a level that would likely lead to automatic removal without human review.
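To make that threshold logic concrete, here is a minimal, hypothetical sketch in Python of how confidence-score moderation of this kind can work. The class names, labels, and threshold values are assumptions for illustration only; they do not represent Hive's actual API or any platform's real policy.

```python
# Illustrative sketch only: labels and thresholds below are hypothetical,
# not Hive's real interface or any platform's actual rules.
from dataclasses import dataclass

@dataclass
class ModerationScore:
    label: str    # e.g. "graphic_violence", "corpse", "nudity"
    score: float  # model confidence between 0.0 and 1.0

# Hypothetical cut-offs: above AUTO_REMOVE the video is taken down with no
# human review; between the two values it is queued for a human moderator.
AUTO_REMOVE_THRESHOLD = 0.9
HUMAN_REVIEW_THRESHOLD = 0.5

def decide(scores: list[ModerationScore]) -> str:
    """Return a moderation decision based on the highest confidence score."""
    top = max(scores, key=lambda s: s.score, default=None)
    if top is None:
        return "keep"
    if top.score >= AUTO_REMOVE_THRESHOLD:
        return f"auto_remove ({top.label}: {top.score:.2f})"
    if top.score >= HUMAN_REVIEW_THRESHOLD:
        return f"human_review ({top.label}: {top.score:.2f})"
    return "keep"

# A video scored 0.93 for "corpse" would be removed automatically, with no
# human ever judging whether it documents a potential war crime.
print(decide([ModerationScore("corpse", 0.93), ModerationScore("nudity", 0.12)]))
```

The point of the sketch is the failure mode described above: once the score clears the automatic threshold, context such as evidentiary or public-interest value never enters the decision.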

Although the major social media platforms say that graphic content from war zones can stay online if it serves the public interest, our experiment suggests otherwise. We tested how the platforms would handle the videos we uploaded. Instagram took down three of the four videos within a minute. YouTube initially age-restricted the same three, then removed all four within ten minutes, accompanied by a warning message. Our later attempts to re-upload the videos failed altogether, and our appeal to have them reinstated was rejected.

The videos we attempted to upload were filmed by a Ukrainian named Eagle. He used to work as a travel journalist but has been documenting attacks on civilians since the Russian invasion. When he shared his videos on Facebook and Instagram, they were taken down almost immediately because of their content. His aim was to counter the Kremlin's claim that no civilians were being killed.

One organization, Mnemonic, based in Berlin, has taken it upon itself to preserve videos like these. Mnemonic is an archiving group that has developed a tool capable of quickly and automatically downloading and preserving footage from war zones before it can be taken down.
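As a rough illustration of what such preservation involves, here is a hypothetical Python sketch of one archiving step: downloading a piece of footage and recording a fixity hash and basic metadata so the copy keeps evidentiary value after the original post disappears. This is not Mnemonic's actual tool or code; the URLs, paths, and field names are placeholders.

```python
# Hypothetical archiving step, in the spirit of tools like Mnemonic's but not
# their actual pipeline. All names and paths are placeholders.
import hashlib
import json
import pathlib
from datetime import datetime, timezone

import requests

ARCHIVE_DIR = pathlib.Path("archive")

def preserve(url: str) -> dict:
    """Download footage and record a fixity hash plus metadata, so the copy
    retains evidentiary value even if the original post is removed."""
    ARCHIVE_DIR.mkdir(exist_ok=True)
    response = requests.get(url, stream=True, timeout=60)
    response.raise_for_status()

    sha256 = hashlib.sha256()
    local_path = ARCHIVE_DIR / pathlib.Path(url).name  # simplistic filename choice
    with open(local_path, "wb") as f:
        for chunk in response.iter_content(chunk_size=1 << 16):
            f.write(chunk)
            sha256.update(chunk)

    record = {
        "source_url": url,
        "archived_at": datetime.now(timezone.utc).isoformat(),
        "sha256": sha256.hexdigest(),  # fixity hash shows the file is unaltered
        "local_path": str(local_path),
    }
    local_path.with_suffix(".json").write_text(json.dumps(record, indent=2))
    return record
```

Recording the hash and capture time alongside the file is what lets investigators later argue that an archived copy matches what was originally posted, even after the platform has deleted the source.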

About the Creator

Shadrack Kalama

Shadrack is a passionate writer with a creative spirit and a love for storytelling. With a pen in hand and a mind full of imagination, he weaves words into captivating narratives that transport readers to new worlds and evoke feelings.


