DECRYPTED
Editor’s Note: For most of us, the wide world of technology is a wormhole of dubious trends with a side of jargon soup. If it’s not a bombardment of startups and tech trends (minimum viable product, Big Data, billion-dollar IPO!) then it’s unrelenting feature mongering (Smart Everything! Siri!). What’s a level-headed guy with a few bucks in his pocket supposed to do? We’ve got an answer, and it’s not a ⌘+Option+Esc. Welcome to Decrypted, a new weekly commentary about tech’s place in the real world. We’ll spend some weeks demystifying and others criticizing, but we promise it’ll all be in plain English. Continuing on from his work on the first two issues (let’s call those a beta) is writer Darren Murph, the former Managing Editor of Engadget and a Guinness World Record holder for number of blog posts published. So take off your headphones, settle in for something longer than 140 characters and prepare to wise up.

Congratulations to the fifteen of you who are still without a presence on Facebook — you’ve nothing to fear. For the other billion or so folks who have chosen to generate a profile on the planet’s most notable social network, it’s probably time you paid attention to what’s really going on behind the scenes. Recently, it was revealed that Facebook conducted an experiment in 2012, whereby it intentionally (though temporarily) altered the news feeds of around 700,000 users.

Of course, Facebook tweaking the news feed is hardly news — it’s been doing as much since its inception. The issue is that the users had their feeds manipulated to show an atypically high or low amount of positive or negative content. The study concluded that folks who saw more positive stuff from their friends were more apt to post positive stuff, and vice versa. It’s also worth noting that the research truly was conducted with academia in mind. Two members of the academic community authored the eventual results in a respected journal alongside a member of Facebook’s Core Data Science Team, and the conclusions were always meant for public consumption and debate.

As you’ve probably seen, the reaction has been less than beautiful for Facebook. But, if I’m honest, I sort of feel for the company. It doesn’t owe any of us much of an apology.

Nothing you touch in the digital world is truly free from prying eyes.

You see, the knee-jerk reaction of outrage feels entirely misplaced. As Nilay Patel outlined exquisitely over at Vox, Facebook’s entire business model revolves around the manipulation of the news feed. In a reactionary (and terse) apology, COO Sheryl Sandberg noted that the experiment was part of “ongoing research... to test different products.” While she went so far as to confess that Facebook never intended to “upset” anyone, she didn’t allude to any upcoming alterations in procedure. After all, every single person who creates a Facebook profile agrees to its terms of service (which states that user data can indeed be used for research), and, as we’ve seen with the recent NSA scandal, nothing you touch in the digital world is truly free from prying eyes.

That’s a haunting statement, but it leads into what should be learned from Facebook’s finagling.

For starters, you should recognize that everything you place online about yourself could potentially be viewed elsewhere. There’s a reason they’re called social, not private, networks. You don’t pay Facebook to host your secrets; Facebook allows you to use its services gratis in exchange for data that it can sell ads against. If you find such a notion abhorrent, you’ve no business on Facebook in the first place — and, might I add, this may very well serve as encouragement for the skittish among us to simply deactivate and delete their social profiles altogether. Contrary to popular belief, it is entirely possible to eat, sleep, breathe, and function without broadcasting your emotions online.

Second, it’s important to realize that these kinds of experiments aren’t new. Ethnographers have been studying cultures for ages, gauging reactions as best they could with the surveying tools of their time. The only real change is in efficiency. The advent of the Internet has made it infinitely easier to research the actions and reactions of the human race. Instead of phoning us up or coaxing us to join a focus group, outfits like Facebook can simply analyze our click and comment patterns online. The heart of what’s happening is the same as it has ever been; we’re just a little freaked out at how easy we’re making it for the folks who scour the data.

Contrary to popular belief, it is entirely possible to eat, sleep, breathe, and function without broadcasting your emotions online.

Finally, it’s about time we called Facebook what it is: tightly controlled entertainment. According to an ongoing study from EdgeRank, you’re only seeing 10 to 20 percent of the content that your friends post or share. As with Google’s ever-mysterious search engine algorithm, Facebook is also toying with its own mechanisms behind the scenes, well beyond the purview of the average user. Those brands, bands, artists, and activists that you “liked” have suffered a 44 percent decline in reach just since December, according to a study by Ignite Social Media. Facebook suggests that this culling and curation is what keeps engagement high and exhaustion low, and judging by the steady growth of the site, it’s tough to argue with its effectiveness. Of course, the specifics here are closely guarded secrets — brands have been playing a perpetual game of trial and error to figure out how to use Facebook to get their message out. Much like a primetime comedy, Facebook’s programming has never been raw and unfiltered, though one could argue that its communication of such programming has been. (One would think that after being backed into a number of public apologies over the years, the company would learn to err on the side of transparency.)

Twitter, as of now, enables users to turn the informational water hose directly at their faces, unleashing an unending torrent of material that can only ever be slowed by unfollowing your besties. Facebook hasn’t played this card for years, and while there’s no scheming human behind the scenes handpicking what shows up in your feed, there’s a computer algorithm doing precisely that. It is what it is — a selected batch of quips, links, photos, and videos — that serves to entertain its ballooning user base. Your information is being used by Facebook. Be aware of that, and use the social network as you will.
