Engineering Ethics: Facebook Emotional Contagion Case Study
Name of Student
Institutional Affiliation
Engineering Ethics: Facebook Emotional Contagion Case Study
Executive Summary
In 2014, Facebook published a paper on the use of social media to study emotional contagion in society. The article reported research carried out by the organization in 2012 that involved editing the news feeds of about 689,000 Facebook users with the aim of determining how access to information through social media affects the transfer of emotions. These activities by a multinational technology firm represent gross violations of the engineering code of ethics. Various parties have pointed to the lack of informed consent, the manipulation of information accessed by the firm's users, and the unsettling revelation that social media can be used to shape public perception of issues ranging from politics to other social phenomena. Facebook's actions violate several fundamental pillars of the engineering code of conduct, including the duty to hold paramount the safety, health, and welfare of the public, which the company breached by manipulating users' psychological well-being and deceiving the public. In addition, failure to obtain informed consent when carrying out research involving human subjects amounts to an infringement of the fundamental right to voluntary participation (Metcalf & Crawford, 2016). The case therefore represents a significant ethical, moral, and legal issue.
It is evident that the Facebook case calls into question the honor, reputation, and usefulness of the engineering profession. As a technology firm that relies heavily on engineering practice, Facebook is expected to conduct itself with honor, integrity, and responsibility and to confine its actions to the applicable legal and ethical standards. The ethical case at hand points to an issue within the engineering practice that requires immediate resolution. In resolving the issue, Facebook faced minimal legal hurdles, as it maintained that it had acted in line with its user data policies. However, a diminished brand reputation and increased negative publicity forced the company to offer an apology and to promise stricter guidelines for future related activities. The emotional contagion case is relevant to the study of engineering ethics because it shows how a breach of the relevant code of conduct can lead to a reputational loss for the parties involved and for the engineering profession as a whole. It is critical that all concerned parties acknowledge the need to behave responsibly towards the public and to maintain strict adherence to the moral, ethical, and legal expectations of the profession.
Introduction
Engineering ethics is a critical concept for all parties involved in the engineering practice. Professionals in the sector are required to conduct themselves responsibly, remaining mindful of the social, cultural, economic, political, and environmental dimensions of society while displaying the utmost adherence to legal and ethical standards. The Code of Ethics for Engineers, published by the National Society of Professional Engineers (NSPE), provides the ethical guidelines by which individuals involved in the practice must abide. These guidelines primarily protect the public from professional malpractice arising from the unethical or illegal conduct of engineers or parties involved in engineering activities. Failure to comply with the stipulated ethical and legal guidelines may harm the public while also damaging the brand reputation of the organizations involved. One company affected by such an ethical issue is Facebook, a multinational technology giant operating in the social media sector and offering internet-related products. In 2014 the company published a paper that provided empirical evidence of the contagion of emotions through social media networks (Kramer, Guillory, & Hancock, 2014). This action led to a massive public outcry, with stakeholders from the social, legal, and political spheres pointing out several legal and ethical violations.
In 2012, Facebook performed a study that involved manipulating the news feeds of over 689,000 of its users with the aim of determining the capability of social media networks to facilitate the transfer of emotions from one person to another (Booth, 2014). The organization then published findings supporting the hypothesis that emotions are indeed contagious and that the social media platform can be used to manipulate how individuals feel without their awareness. By making its findings public, the company admitted to various ethical lapses, including manipulating the emotions of members of the public (which amounts to influencing a person's psychological well-being) and performing research on human subjects without their informed consent. It also breached its users' trust by intentionally controlling the information accessible to them. The company defended its activities by citing compliance with Facebook's data use policy, agreed to by all users when joining the platform, and argued that the purpose of the research was to improve its services and overall customer experience. Since Facebook is a technology company that relies heavily on engineering professionals, its conduct falls under, and violates, several provisions of the engineering code of ethics.
Facebook’s activities violate at least three fundamental canons of the ethics document: the engineers’ responsibility to hold paramount the safety, health, and welfare of the public; the duty to refrain from deceptive acts; and the requirement to conduct oneself honorably, responsibly, ethically, and lawfully in a manner that upholds the reputation and usefulness of the engineering profession. According to Harris Jr, Pritchard, Rabins, James, and Englehardt (2013), parties involved in the engineering practice should behave morally and avoid engagements that compromise ethical and legal institutions and thus result in a reputational loss for the discipline. Runeson and Höst (2009) acknowledge the increasing popularity of empirical research in the engineering and information technology sectors, but they argue that such research must follow the relevant available guidelines. Facebook’s behavior calls for effective mitigation strategies, which may include updating the code of ethics to address privacy and the ambiguity of company-client agreements, stricter legal policies, and the introduction of punitive measures for parties involved in such violations. This paper provides an analysis of Facebook’s emotional contagion case, including the ethical issues involved and their resolution.
Ethical Concerns
The research conducted by Facebook raises concerns because of the nature of the experiment and the involvement of unaware human subjects. The main aim of the research was to manipulate individuals’ emotions by altering their news feeds and then to gauge the impact on their mood by monitoring their subsequent Facebook posts. In doing so, Facebook was interfering with the psychology of its subjects, since the manipulation affected their emotions. Some news feeds were filled with positive messages while others were filled with negative news. The study revealed that the type of messages in the news feed influenced individuals’ posts, a clear indication that the manipulation did affect the psychology of individuals, to the extent of shaping their Facebook posts up to seven days later. According to Grimmelmann (2014), Facebook was running sad psychology experiments on its users. Kramer (2014) reports that those affected “produced an average of one fewer emotional word, per thousand words,” and argues that critics are unaware of the actual magnitude of the experiment’s effect on the subjects. Moreover, the management defends its actions by stating that the user agreement allows Facebook users to be used as research subjects. Although Facebook’s management states that the impact was minimal, there is no conclusive evidence to support this claim; the effect of the study on particular individuals might have been severe. Facebook did not follow up with the subjects to ascertain that the study did not have a more severe effect than intended.
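To put the quoted figure in perspective, the short Python sketch below illustrates how an emotional-word rate per thousand words could be computed from post text. The word lists, posts, and function are hypothetical illustrations only; the published study classified words with the LIWC2007 software rather than with code of this kind.

# Illustrative sketch only: computes an emotional-word rate per 1,000 words
# for a set of posts. The word lists below are hypothetical stand-ins; the
# published study used the LIWC2007 dictionaries to classify words.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "worried"}

def emotional_rate_per_thousand(posts, word_set):
    """Return the number of emotional words per 1,000 words across all posts."""
    total_words = 0
    emotional_words = 0
    for post in posts:
        tokens = post.lower().split()
        total_words += len(tokens)
        emotional_words += sum(1 for token in tokens if token in word_set)
    if total_words == 0:
        return 0.0
    return 1000.0 * emotional_words / total_words

# Hypothetical before/after comparison. Kramer described an effect of roughly
# one fewer emotional word per thousand words; these toy posts only show how
# such a rate is computed, not the reported effect size.
baseline_posts = ["I love this wonderful day", "so excited and happy today"]
followup_posts = ["just another day", "nothing much happening lately"]

print(emotional_rate_per_thousand(baseline_posts, POSITIVE_WORDS))
print(emotional_rate_per_thousand(followup_posts, POSITIVE_WORDS))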
Facebook violated the first NSPE canon, which requires engineers to hold paramount the safety, health, and welfare of the public. Instead of upholding this fundamental canon, Facebook subjected its users to psychological experiments that might have harmed them. In manipulating the news feed to influence users’ emotions, Facebook did not promote the welfare of the public but rather the contrary. The corporation put its users in harm’s way, as the negative news feeds might have had undesired consequences: seeing only negative things happening around them and to their friends could, for instance, contribute to anxiety disorders or cause individuals to lose hope in humanity. The full extent of the experiment’s impact is unknown, despite the corporation’s claim that it was minimal.
The Facebook research has raised various ethical concerns relating to the “informed consent” of individuals. Critics argue that Facebook failed to acquire the informed consent of the subjects used in its research. Informed consent is defined as obtaining an individual’s permission before using their data; it is applied chiefly in health care, with modern technology companies introducing the concept into their terms and conditions. Facebook argues that users agreed to the use of their data for research purposes, as outlined in the company’s terms and conditions. However, the issue of informed consent is debatable. According to Flick (2016), a majority of Facebook users do not comprehend the corporation’s data use policy because most do not read the terms closely enough to understand them fully. Such individuals may therefore give consent without the requisite information to do so, because they are not well informed. Grady (2015) argues that the consent users are asked to give is usually imprecise: the terms are typically vague and merely state that information may be used in accordance with the organization’s terms and policies. The vagueness does not specify how the information will be used, and an organization might adjust its policies to its liking to allow it to use personal information as it wishes. For example, Jouhki, Lauk, Penttinen, Sormanen, and Uskali (2016) point out that the research use of Facebook users’ personal data was introduced into the corporation’s policies after the experiment sparked public outrage. This incident shows that organizations often use vague terms and conditions as a legal shield against litigation.
The Facebook user agreement raises the further question of whether the user truly consents. The agreement has been likened to the forms signed by subjects of psychological experiments, as they share some characteristics. By having users accept the terms and conditions, whether or not they understand them, the service provider usually absolves itself of legal liability. However, this still leaves open the question of the service provider’s honor and responsibility. Engineers should ensure that they act responsibly to protect the honor of the profession, and requiring individuals to accept terms and conditions they do not understand is not ethical. Facebook thus violated the NSPE canon that requires engineers to conduct themselves honorably, responsibly, ethically, and lawfully so as to enhance the honor, reputation, and usefulness of the profession. The failure to adhere to this canon sparked public outrage that led to a poor reputation for Facebook and damaged the standing of the engineers, management, and staff at the company.
Facebook has also been accused of deceiving its users by withholding information about the experiment and its intended purposes. Research norms require that subjects be informed about an experiment and what it aims to achieve, and that those not interested be allowed to opt out. The opt-out option usually affects research experiments, as a majority of individuals may choose not to participate. According to Kahn, Vayena, and Mastroianni (2014), many researchers use data collected by third parties such as Facebook because of the various challenges of collecting data themselves. Facebook has been accused of deception because it did not inform its users how their information would be used, perhaps because it did not wish to offer an opt-out option, which would have required it to reveal the intended use of the personal data. Opting out of Facebook’s terms and conditions means opting out of the social media platform, which has become, to a large extent, part of individuals’ social lives. Jouhki, Lauk, Penttinen, Sormanen, and Uskali (2016) therefore argue that the matter of consent is the most controversial. The authors note that a person might be deceived about an experiment to which they have given informed consent, as in the case of Facebook. Facebook thus violated the NSPE canon that requires engineers to avoid deceptive acts.
Resolution
Facebook did not expect the negative public reaction to its research, which was published in the Proceedings of the National Academy of Sciences. According to a statement by the chief technology officer, Mike Schroepfer, the company was unprepared for the backlash and regretted carrying out the experiment in the manner it did. Schroepfer gave a public apology in which he admitted that the corporation takes responsibility for its actions and acknowledged that it could have used non-experimental methods instead. He also conceded that Facebook failed to communicate the primary aim of the study properly to its users, which made them feel deceived, and pointed out that the research could have benefited from review by more senior people. Facebook’s apology was among the resolution measures adopted by the company.
According to Rushe (2014), Mike Schroepfer also outlined two other measures to be undertaken by Facebook to ensure that it avoids a similar ethical dilemma in the future. First, the corporation has introduced a new set of guidelines governing such research. Any research activity that involves the use of personal information or that may have an emotional impact on users will undergo a thorough review before the company embarks on the experiment. The review will involve an academic expert as well as a committee made up of engineers, researchers, and legal, privacy, and policy teams. The main aim of the new guidelines is to ensure that future research conforms to both ethical and legal requirements. Second, Facebook will train its staff on ethical research practices by putting employees through a six-week “boot camp.” The boot camp aims to ensure that employees are conversant with what is expected of them and thus avoid ethical controversies in their research work. According to Stappenbelt (2013), it is vital to train engineers in research ethics so that they are aware of ethical expectations and concerns while conducting research. Facebook opted for these strategies because it recognizes the importance of research to the continued improvement of its services. Schroepfer reiterated the importance of extensive research to the organization, as well as the importance of Facebook users, and pointed out that the company wants to build trust with its users to ensure a good relationship between the two. The resolution strategies adopted by Facebook will be essential in ensuring that it conforms to the NSPE canons for engineers and will help rebuild its reputation through reduced litigation and fewer ethical controversies.
Conclusion
In conclusion, Facebook’s decision to perform research without the informed consent of the test subjects, to manipulate the public’s emotional and psychological state, and to manipulate the information accessed by its users represents a violation of various ethical and legal policies. The case study is extremely important to the engineering community because it helps identify the repercussions of non-compliance with the profession’s legal and ethical expectations while also illustrating how the fundamental tenets of the code of conduct may be violated. As discussed herein, Facebook’s conduct contradicts three fundamental canons of the engineering ethics document: the firm did not hold paramount the safety, health, and welfare of the public; it engaged in deception; and its conduct was not honorable, responsible, or compliant with applicable legal and ethical requirements. These actions put in question the honor, reputation, and usefulness of the company, the engineers involved, and the profession as a whole. Facebook lost not only its clients’ trust but also its overall reputation, as highlighted by the public outcry and reproach from political, social, and legal stakeholders. As discussed herein, the resolution process faced few legal challenges; nevertheless, Facebook issued a public apology regretting its actions, restructured its guidelines for future research, and mandated training for professionals involved in the company’s research activities. From a thorough review of Facebook’s ethical case, it is imperative that professionals, including those in the engineering field, always act in a lawful, ethical, responsible, and honorable manner.
References
Booth, R. (2014). Facebook reveals news feed experiment to control emotions. The Guardian. Retrieved 2nd November 2017, from https://www.theguardian.com/technology/2014/jun/29/facebook-users-emotions-news-feeds
Flick, C. (2016). Informed consent and the Facebook emotional manipulation study. Research Ethics, 12(1), 14-28.
Grady, C. (2015). Enduring and emerging challenges of informed consent. The New England Journal of Medicine, 372(9), 855-862.
Grimmelmann, J. (2014). As flies to wanton boys. The Laboratorium. Retrieved from http://laboratorium.net/archive/2014/06/28/as_flies_to_wanton_boys
Harris Jr, C. E., Pritchard, M. S., Rabins, M. J., James, R. W., & Englehardt, E. (2013). Engineering ethics: Concepts and cases. Cengage Learning.
Jouhki, J., Lauk, E., Penttinen, M., Sormanen, N., & Uskali, T. (2016). Facebook’s emotional contagion experiment as a challenge to research ethics. Media and Communication, 4(4).
Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences of the United States of America, 111(24), 8788-8790.
Metcalf, J., & Crawford, K. (2016). Where are human subjects in big data research? The emerging ethics divide. Big Data & Society, 3(1), 2053951716650211.
Runeson, P., & Höst, M. (2009). Guidelines for conducting and reporting case study research in software engineering. Empirical Software Engineering, 14(2), 131.
Rushe, D. (2014). Facebook sorry – almost – for secret psychological experiment on users. The Guardian. Retrieved 2nd November 2017, from https://www.theguardian.com/technology/2014/oct/02/facebook-sorry-secret-psychological-experiment-users
Stappenbelt, B. (2013). Ethics in engineering: Student perceptions and their professional identity development. Journal of Technology and Science Education, 3(1), 3-10.