When Evansville, Indiana-based philosophy professor Dr. Anthony Beavers spoke at Cal Poly in 2009, it was to share his optimistic view of what social media, and Facebook in particular, could mean for the next generation. Hailing it as "a wonderfully open world of information exchange," Beavers saw it as a platform for the immediate exchange of news, opinions, and ideas.
But in the two years since his last Cal Poly talk, the professor has become dubious about the control of information on the Internet by a few corporations: Google, Yahoo, Microsoft.
Facebook.
And when Facebook began to direct more web traffic than Google, Beavers became alarmed. In a May 13 talk on the college campus, part of the school's Ethics and Emerging Sciences lecture series hosted by Patrick Lin, Beavers will address the topic from a more critical angle: How is Facebook challenging our sense of right and wrong, and what does it intend to do with the information garnered from its 700 million users?
"I've become patently aware of agendas in the background," Beavers told the Sun in a phone interview. "From the user's perspective, people interact, and Facebook looks like a social networking site. From the business perspective, it's the largest social science database that's ever been compiled in the history of the world."
But while social scientists are bound by ethical codes concerning the use of that information, Beavers went on, the corporate sector is not.
By population, the social networking platform would rank as the third largest country in the world. But it's a country without geographical borders, and the power it has amassed doesn't fall under any set of political laws.
Last August, Beavers attended a UCLA event at which representatives from Facebook spoke candidly about how their data was used. The company, for example, is able to divine the political leanings of users in its data set, even if those users don't provide that information outright.
"They can do ghost-node identification, which means that if you put a couple of your favorite movies and your favorite books and a few of your friends, I can find out what political party you belong to," Beavers said. "You don't have to tell me that."
The statistical reliability of such inferences, he said, is upward of 90 percent.
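In spirit, what Beavers describes is attribute inference: predicting a hidden label from the parts of a profile that are visible. The sketch below is only an illustration of that general idea under assumed data, not Facebook's actual model or schema; the feature names, sample profiles, and classifier choice are invented for the example.

```python
# Illustrative sketch of attribute inference ("ghost-node identification" in
# spirit): train on users whose political affiliation is known, then predict
# it for users who never stated one. All names and data here are made up.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

# Each profile lists a few visible items; "party" is the hidden attribute.
labeled = [
    ({"movie:Recount": 1, "book:Audacity of Hope": 1, "friends_left": 4}, "left"),
    ({"movie:An American Carol": 1, "book:Going Rogue": 1, "friends_right": 5}, "right"),
    ({"movie:W.": 1, "book:Dreams from My Father": 1, "friends_left": 3}, "left"),
    ({"movie:Team America": 1, "book:Liberty and Tyranny": 1, "friends_right": 2}, "right"),
]

vec = DictVectorizer()
X = vec.fit_transform([features for features, _ in labeled])
y = [party for _, party in labeled]

model = LogisticRegression().fit(X, y)

# A user who never stated a party still gets a guess from the visible items.
unlabeled = {"movie:Recount": 1, "friends_left": 2}
print(model.predict(vec.transform([unlabeled])))  # e.g. ['left']
```

The point of the sketch is simply that a handful of seemingly harmless profile fields can be enough signal to recover an attribute the user never disclosed.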
Using this information, Facebook can have a significant impact on political elections: simply identify which users are on the fence, politically speaking, and sell advertising spots to political candidates, guaranteed to reach swing voters. And since Facebook is different for every user, it's hard for anyone to tell if he or she is being singled out by advertisers, and impossible for those paying for advertising to tell whether their ads are actually being delivered. This is where the ethical issues arise.
"Every FB user sees what they see, and nobody sees it all, except for Facebook," Beavers said. "So this is the interesting thing. They can promise you privacy from other people, but they're watching everything. I mean, they've got these massive network analysis computers that are calculating demographics and profiles and pitching advertisements."
That's Facebook on a macro scale. But the platform poses ethical questions on a micro level as well, by becoming a breeding ground for interpersonal misunderstandings.

Beavers saw this play out in his own family: "Some people don't realize that when they're mad and they say something, it has a ripple effect to create drama. My mom is 68; my dad is in his 70s. And I'm reading the news feed, and my mother has changed her relationship status to my dad from 'married' to 'it's complicated.'"
When the professor asked what had happened, his mother replied that they'd just had a fight. But now she had unwittingly told everyone on Facebook that she was having marital difficulties. And without the proper context to explain her change in relationship status, friends and relatives were immediately alarmed and assumed the worst.
It's a classic example of the problem of half-information that Facebook often perpetuates: users give just one part of the story without a full explanation, leading to misunderstandings, unnecessary worry, and hurt feelings.
If there's one common thread in all of Beavers' issues with the site, it's the politics of information flow. Your newsfeed can't show you everything, so it shows the information pertaining to the friends you interact with most, thus surrounding users with thoughts and ideas they like and agree with, leading them to believe there are more people like them than there actually are. (This doesn't sound terribly harmful, until you ask yourself, as Beavers did: what if a pedophile surrounds himself with only pedophiles? Wouldn't he begin to feel that his actions were more acceptable?)
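The feed logic Beavers is criticizing can be boiled down to a very small ranking rule: favor posts from the people you already engage with, and drop the rest. The snippet below is a toy sketch of that rule only, not Facebook's actual ranking system; the names, interaction counts, and cutoff are invented for illustration.

```python
# Toy sketch of engagement-based feed ranking: posts from the friends the
# viewer interacts with most float to the top, so the feed keeps showing
# the people (and views) the viewer already agrees with. Data is invented.
from typing import NamedTuple

class Post(NamedTuple):
    author: str
    text: str

# How often the viewer has recently liked or commented on each friend.
interaction_counts = {"alice": 42, "bob": 3, "carol": 17}

posts = [
    Post("bob", "Rare update from Bob"),
    Post("alice", "Alice posts again"),
    Post("carol", "Carol's photo album"),
]

def rank_feed(posts, counts, limit=2):
    """Return the top posts, favoring authors the viewer engages with most."""
    return sorted(posts, key=lambda p: counts.get(p.author, 0), reverse=True)[:limit]

for post in rank_feed(posts, interaction_counts):
    print(post.author, "-", post.text)
# Bob's post never makes the cut, even though it might be the most novel.
```

Even in this stripped-down form, the filtering effect is visible: whatever falls below the cutoff simply never reaches the viewer.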
In a less extreme way, The New York Times and Washington Post websites have begun to catch on to Facebook's logic, showing readers first those news stories their friends have "liked" on Facebook, as though individuals need only be exposed to news they like and agree with.
Of course, the meaning of the word "like" has been redefined in the land of Facebook, another issue Beavers finds deserving of exploration. The professor himself, though he quit the site at one point, now regularly interacts on Facebook. He believes today's Facebook users are part of something of great historical significance, but that we're too deep into it to understand what that something is.
In fact, he said, "I don't think we're going to understand what's going on for another hundred years. ... We're going to look back and say, 'What a crazy mess that was.'"
But right now, according to Beavers, Facebook is too large a power to slow down. People often ask him if they should delete their accounts, a small way to starve the beast. But he only shrugs: "Quit or don't quit. There's nothing we can do about it."
Anna Weltner is arts editor at New Times, the Sunās sister paper to the north. Contact her at aweltner@newtimesslo.com.
This article appears in the May 12-19, 2011, issue.