Facebook wants to know: How are you feeling?
There’s a water crisis on the other side of the planet. Donald Trump tweeted his latest offensive screed. Your old friend’s brother unexpectedly and tragically died. Do you like it? Better yet, do you love it? Does it make you sad or angry? Does it make you say “wow”?
Facebook has upgraded the ubiquitous Like button with a host of new emotions. You can now “love,” “haha,” “wow,” “sad” or “angry” something with cute emojis. You can “wow” a friend’s appalling Tinder screengrab, send them a “sad” for emotional support during a period of anxiety or just “haha” their wack memes in sympathy.
Why? Likes have always felt a little callous in situations that aren’t exactly positive, but we’ve never had a better option. The new emojis give us a range of emotions to react with. A “wow” feels natural for a friend’s vacation album, and “love” is perfect for a relationship update.
But just remember this: For every little inch of emotional nuance we gain from these buttons, Facebook gains a mile in the ways it can manipulate and keep tabs on us.
Facebook confirmed to Mic that it will use data gathered when you use the new emojis to alter your News Feed and learn more about what you like.
Facebook is constantly trying to figure out what will keep you glued to your News Feed longer. Every like, every share and every click or tap is more data to feed the Facebook algorithms. It’s like watering a tree that sinks its roots deeper and deeper. And with each interaction, Facebook knows you better. Do you prefer Bernie Sanders to Donald Trump, or fashion to green living?
Facebook knows this about you, and it will use the information to tailor your News Feed to things Facebook thinks you want to see.
This tailoring of your feed affects how news organizations report and distribute valuable information, and it’s influencing how political campaigns shape their messages. It can even have an impact on which candidates reach a bigger audience.
Your Facebook feed could be a source of inspiration. It could inform you, challenge your political beliefs or expose you to new art and ideas. But Facebook’s main interest is to grab your attention and keep you scrolling and clicking. The feed is meant to keep you engaged, but more often than not, you end up scrolling endlessly in search of something interesting.
Notice all of the videos you see lately? Facebook figured out that video holds people’s attention, so now it feeds you more punchy videos than articles. This is why your News Feed is now filled with food porn clips and one-minute segments summarizing breaking news.
Messing with your emotions
We’ve known for years that Facebook is interested in manipulating emotions in order to alter user behavior.
In 2014, Facebook quietly published a research paper revealing that the company had intentionally altered the News Feeds of about 700,000 users to see if showing them more uplifting or upsetting content could affect their emotional state. The study drummed up outrage, and it showed only a small effect: tweaking someone’s feed measurably changed the tone of what they posted.
Now, Facebook is getting a deeper insight than ever. Measuring likes is old-school; Facebook wants to know how you feel. It wants to know if politics are a downer, if open expressions of anxiety are a turnoff. Facebook wants to know what gets that dopamine pumping. Wow! Love!
If Facebook’s goal is to change your behavior, it needs to know your psychological response to photos and videos the same way the tobacco industry needs to know people’s brain chemistry when choosing additives in cigarettes. For Facebook to compete with Snapchat and Twitter, it needs to water its algorithms with something better, and your emotions are a superfood.
Another day, another emoji. Slight cosmetic updates to your social networks happen all the time. In August, Instagram unveiled rectangular photos instead of giving us the gallery feature it offers to advertisers. Twitter changed “faves” to “likes” instead of rolling out what users had been demanding: more powerful tools for harassment prevention. With all this competition for attention, our time has become the most valuable asset.
Is this another form of surveillance?
The cynical might say no, because you opted in when you created your account and approved the terms and conditions. If you want to resist Facebook, though, there’s a better way than deleting your account: Just don’t use the new emojis.
Keep tapping “like.” Binary yes/no choices are good enough to say “I want to see more of this” without giving an algorithm a qualitative measure of your emotions. Or, when you see that a co-worker’s loved one has died, or that a friend’s new selfie is fire, you can drop them a few kind words to show that their feelings are worth more than a vector drawing.
A picture may be worth a thousand words, but to Facebook, it’s worth so much more.
Feb. 24, 2016, 4:17 p.m.: Mic received clarification Wednesday afternoon that for now, an expressive emoji is still counted by the News Feed algorithm as a “like,” although that could change in the future.
This content first appeared on Mic