MEDIA & COMMUNICATION

In today’s technological world, I believe there is hardly anyone who has not heard of the largest online social network, Facebook. Founded by Mark Zuckerberg in 2004, Facebook has become the biggest website in the world, with 1.15 billion monthly active users. People use Facebook every day; many of us depend on it to get through the day, for leisure, business, finding things, connecting with friends, academic purposes, sharing feelings and everything else. For many people it has become a lifestyle habit to spend their time checking the Facebook News Feed (though for them it is not a waste of time). The News Feed has become the new place for ‘my story’, ‘your story’ and ‘our story’: people share their emotions, comment on the emotions others have shared, and like what their friends have posted.

People often share depressing stories, such as the loss of a loved one, a severe injury, or counting down a mother’s last days in a hospital ICU; happy stories, such as promotions, weddings and unexpected birthday gifts; and disturbing stories, such as brutal fighting in some countries, deaths in war, or horrific injuries after a natural disaster. Some of these are emotionally depressing or disturbing to the point where we do not know whether to press ‘like’ or not. We are rarely conscious of the impact these negative and positive stories in the News Feed have on us, even though they affect our mood and emotions tremendously.

It is hard to accept that the News Feed content influences our emotions as we scroll through Facebook from time to time. Consequently, it affects a person’s concentration and involvement in whatever group they are with at that particular moment, leaving them completely lost in their emotional world regardless of where they are physically. To investigate this issue, Facebook conducted an experiment on 689,003 users to test whether emotional contagion occurs outside of in-person interaction between individuals, by reducing the amount of emotional content in the News Feed (Kramer, Guillory & Hancock 2014).

The research was carried out in 2012 without the users’ knowledge that their data were being used. According to Kramer, Guillory and Hancock (2014), the experiment ran for one week, with participants randomly selected by User ID, resulting in a total of approximately 155,000 participants per condition who posted at least one status update during the experimental period. The researchers examined these data by comparing each emotion condition to its control. The results showed that emotions expressed by others on Facebook influence our own emotions: when positive expressions were reduced, people produced fewer positive posts and more negative posts, and when negative expressions were reduced, the opposite occurred.
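To make “comparing each emotion condition to its control” a little more concrete, here is a minimal illustrative sketch. It is not the authors’ code: the word lists and posts below are invented placeholders standing in for the kind of emotion dictionaries and real status updates used in the study. It only shows the basic idea of computing per-user emotion-word rates and comparing group means.

```python
# Illustrative sketch only, not the study's actual code or data.
# Idea: for each user, compute the percentage of words in their status
# updates that are "positive", then compare the mean rate in an
# experimental condition against its control group.

# Tiny hypothetical word lists standing in for real emotion dictionaries.
POSITIVE = {"happy", "great", "love", "good"}
NEGATIVE = {"sad", "angry", "bad", "hate"}

def emotion_word_rate(posts, emotion_words):
    """Percentage of a user's words that appear in emotion_words."""
    words = [w.lower().strip(".,!?") for post in posts for w in post.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in emotion_words)
    return 100.0 * hits / len(words)

def mean(values):
    return sum(values) / len(values) if values else 0.0

# Placeholder posts for two made-up groups of users (not real data).
reduced_positivity_group = [
    ["feeling sad today", "more bad news"],
    ["what an angry morning", "hate waiting in traffic"],
]
control_group = [
    ["such a great day", "love this song"],
    ["good coffee, happy start"],
]

cond_pos = mean([emotion_word_rate(p, POSITIVE) for p in reduced_positivity_group])
ctrl_pos = mean([emotion_word_rate(p, POSITIVE) for p in control_group])
print(f"Positive-word rate: condition {cond_pos:.1f}% vs control {ctrl_pos:.1f}%")
```

In the real experiment the comparison was of course done at a far larger scale and with proper statistical testing, but the basic shape of the analysis is the same: measure emotional expression per user, then compare each condition with its control.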

However, the issue of ethical behaviour in research has been raised regarding the use of people’s information without consent after the experiment was conducted. The research cannot simply be described as an observational study; it was also a form of violation, as it manipulated the News Feed content, and thereby the emotions, of a large number of Facebook users.

Although Facebook tried to defend the research by pointing to its Data Use Policy, which states that the information it receives can be used for internal operations such as testing and research, it can still be accused of breaching the ethical requirement of consent.

While the paper presents a façade of informed consent through agreement to Facebook’s Data Use Policy, we have now found out, thanks to Forbes, that the clauses regarding research were only added to this policy four months after the experiment took place (Hunter 2014).

When we open a Facebook account, we entrust the company with our personal information, sharing the important (or not so important) moments of our lives and making the habit of scrolling through Facebook a basic daily routine. In exchange for the full dedication of its users, trust is one of the big words, along with accessibility and reliability.

“The trust you place in us as a safe place to share information is the most important part of what makes Facebook work.” – Kathy H. Chan

Facebook manipulating its users without their consent, even for a positive purpose, is a breach of its users’ trust. It may not be breaking the law, since Facebook’s Data Use Policy states that the company can use the information to improve its services. Nevertheless, everyone knows that hardly anyone reads those long policies and agreements, and in the end such policies exist mainly to protect the company in situations like this. So the question is: is it ethical for Facebook to use the data of hundreds of thousands of users without their consent?

References

Hunter, D 2014, ‘Facebook puts ethics of research by private companies in spotlight’, The Conversation, viewed 29 April 2015.

Kramer, ADI, Guillory, JE & Hancock, JT 2014, ‘Experimental evidence of massive-scale emotional contagion through social networks’, Proceedings of the National Academy of Sciences, vol. 111, no. 24, pp. 8788–8790, viewed 29 April 2015.
