War TikTok Is a Mess [Content Made Simple]
Issue #259: Facebook content report, sharing bad info, and more.
TOP OF THE WEEK:
War TikTok Is a Mess
Quote:
It is not novel to remark that the experience of scrolling through TikTok feels like emotional whiplash. Upon opening the app you might be greeted with a DIY project from Dollar Tree, followed by a manifesto on the power of friendship as a network for mutual aid. Scroll: a puppy eating peanut butter; scroll: news that a famous cat is dead. On February 24th it took me three swipes to land upon a video purporting to be a livestream of a city in the dark, filmed from an apartment window. Air raid sirens blared in the background, and the only other audible noise was the terrified whimpers of the person holding the camera.
I have no idea whether the footage was filmed by a real person in Ukraine, observing what was happening outside their window in real time, but I am almost certain that the person filming was not the same one who uploaded it to TikTok. Watch it for long enough and you’ll notice it’s a loop on repeat, and if not, one of the commenters will point it out to you: “SCAM!” they write in between the thoughts and prayers from other TikTokers. “Staged for money!”
Commentary:
Rebecca Jennings writes what I wanted to write earlier this week about the mess of social media amid the war in Ukraine. Great, insightful work here.
HITTING THE LINKS
Link #1: Great thoughts on the latest “transparency report” from Meta/Facebook.
What fuels engagement online? Emotionally-charged posts, with anger and joy being among the most highly shareable emotions. As any social media marketer knows, trigger these responses in your audience and you’ll generate engagement, because more emotional pull means more comments, more reactions – and in Facebook’s case, more reach, because the algorithm will give your content more exposure based on that activity.
Link #2: How to avoid sharing bad information about Russia’s invasion of Ukraine
Brilliant and helpful as usual from Abby Ohlheiser.
The fast-paced online coverage of the Russian invasion of Ukraine on Wednesday followed a pattern that’s become familiar in other recent crises that have unfolded around the world. Photos, videos, and other information are posted and reshared across platforms much faster than they can be verified.
The result is that falsehoods are mistaken for truth and amplified, even by well-intentioned people. This can help bad actors to terrorize innocent civilians or advance disturbing ideologies, causing real harm.
Link #3: Nick Clegg has the power now to right Facebook’s wrongs. This is how he should do it
Frances Haugen, the Facebook Papers whistleblower, has a message for the new head of global affairs at Facebook.
In 2020, Clegg claimed Facebook merely “holds up a mirror to society,” while ignoring that Meta designs its algorithm to reward the most extreme and polarising content. My disclosures to US Congress and the Securities and Exchange Commission confirmed that political parties across Europe – on the right and on the left – found Facebook algorithm changes in 2018 forced them into more extreme political positions. In democratic societies, one could say Facebook votes before we do. And in war zones and fragile societies with weak law and order, Facebook can get people killed.
THE FUNNY PART
If you like this, you should subscribe to my free newsletter of funny content I find online. It’s called The Funnies, and it arrives on Saturday mornings.
You can subscribe to The Funnies here. (It is and will always be free.)