How Facebook Ads Are Designed to Polarize [Content Made Simple]
Issue #196: Surviving this week, what social media companies learned in 2016, and more.
FACEBOOK’S POLITICAL AD SYSTEM IS DESIGNED TO POLARIZE
Facebook wants to deliver “relevant ads” to its users, but how does that affect political advertising? It makes those ads more polarizing, driving people deeper into their trenches.
Quote:
It all comes down to Facebook’s desire to show users “relevant” ads. When you target an ad to a certain Facebook audience, you’re actually bidding against other advertisers in an auction for that group’s attention. And Facebook openly tells businesses that the platform will “subsidize relevant ads,” meaning an ad can win an auction even against higher bidders if the algorithm deems it more relevant to a given user. Why? Because to keep selling ads, Facebook needs to keep users on the platform.
“I talk about this all the time in my trainings for campaigns and operatives: Facebook’s objectives are not aligned with your campaign objectives,” said Wilson. “Facebook wants to make more money, and they make more money by getting people to spend more time on the site.” That, in turn, gives the platform an incentive to show users what they’re already interested in. That might seem benign when it comes to an ad for detergent, but it has different implications for democratic politics, which depends to some extent on the possibility of candidates getting their messages in front of people who aren’t already in their camp. And it raises the question of whether the platform gives an advantage to established politicians; an unknown candidate, after all, won’t show up in any user’s list of preexisting interests.
Commentary:
This is a great piece in WIRED. Facebook wants to deliver “relevant ads” to its users because relevant ads are more likely to be clicked, which makes Facebook’s advertising more successful and lets the company keep making billions of dollars a month. The problem is that this has massive sociological side effects that Facebook doesn’t consider. When you only deliver “relevant” content to people, you drive them further into their trenches, alienating them from people who think or live differently than they do. The sketch below shows how that auction dynamic can play out.
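To make the WIRED quote concrete, here is a minimal, hypothetical sketch of a relevance-weighted auction. The advertiser names, bids, relevance scores, and the simple bid-times-relevance ranking are illustrative assumptions, not Facebook’s actual system, but they show how an ad can beat a higher bidder when the algorithm predicts the user will find it more relevant.

```python
# Illustrative sketch only: invented numbers and a simplified ranking rule,
# not Facebook's real auction code.
from dataclasses import dataclass


@dataclass
class Ad:
    advertiser: str
    bid: float        # dollars the advertiser is willing to pay
    relevance: float  # hypothetical 0-1 score of predicted user interest


def total_value(ad: Ad) -> float:
    # Rank ads by bid weighted by predicted relevance, so a highly
    # relevant ad can outrank a higher bidder.
    return ad.bid * ad.relevance


ads = [
    Ad("Challenger campaign", bid=5.00, relevance=0.10),  # unknown to this user
    Ad("Incumbent campaign", bid=3.00, relevance=0.60),   # matches existing interests
]

winner = max(ads, key=total_value)
print(winner.advertiser)  # "Incumbent campaign" wins despite the lower bid
```

Under this toy ranking, the candidate who already matches a user’s interests wins the slot even with a lower bid, which is exactly the advantage for established politicians that the article describes.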
ON THE POD
No podcast this week!
HITTING THE LINKS
Link #1: What Social Media Companies Have (and Haven’t) Fixed Since Election 2016
This is a great explanation of how the major social media platforms have improved over the last four years.
It’s almost hard to believe now, but the last time the United States elected a president, Facebook, Twitter, and YouTube had essentially no policies on misinformation, conspiracy theories, or foreign election interference. Now, all the major platforms have such policies — but they are constantly evolving, inconsistently applied, and often hotly contested, both on the national stage and within the companies themselves. Together they amount to a convoluted patchwork in which the same post, ad, or user might be banned on one platform, slapped with a warning label on another, and fully permitted on a third.
Link #2: QAnon Coopted the Anti-trafficking Movement, Now Facebook Takes Action
Good explanation of some of the action Facebook is taking against QAnon to prevent its spread into mainstream culture.
QAnon's pivot to "Save the Children" has been integral to the once-fringe conspiracy theory's recent and rapid entrée into mainstream culture.
With "Save the Children," also called "Save Our Children," QAnon has spread with fluffy aesthetics on Facebook-owned Instagram, in what researcher Marc-André Argentino has dubbed "Pastel QAnon." Many women, including lifestyle influencers, yoga instructors, and other Instagram personalities, have joined in spreading QAnon messaging.
Internal Facebook documents previously obtained by NBC News found that QAnon groups had millions of members before the company announced a ban on the conspiracy theory in early October. An August analysis by First Draft, a nonprofit tracking misinformation online, found that of 3.5 million Facebook users discussing "Save the Children" and "Save Our Children" hashtags, "the most engaging conversations were happening in Facebook groups and on Instagram accounts related to QAnon," NBC News reported.
Link #3: The Election, Truth, and Some Advice
Be wise this week.
The primary goal of those who wish to undermine the 2020 U.S. Presidential Election is not to disqualify votes or tamper with actual results. The primary goal is to “sow confusion and doubt.” Much of the effort of these foreign manipulators is expressed through social media and other forms of digital communication.
The easiest way they can do that is simply by posting content that is shocking, spectacular, or just downright entertaining.
THE FUNNY PART
If you like this, you should subscribe to my free newsletter of funny content I find online. It’s called The Funnies. It delivers on Saturday mornings.
You can subscribe to The Funnies here. (It is and will always be free.)