Facebook knew that changes it made to its platform to increase audience engagement were pushing users to spend more time sharing viral content, years before the platform helped trigger a wave of fake news and hate speech online.
In an internal Facebook presentation from 2014 that was leaked to journalists from Britain and shared with The Globe and Mail, employees discuss how the company’s shift from desktop to mobile over the previous two years had changed the way users interacted with the platform.
Facebook altered its platform in 2012 and 2013 to make it easier for people to share content from their phones and to encourage users to spend more time liking, commenting on and re-sharing posts.
The changes came at a time when Facebook was going public and was looking for ways to improve its business model, internal company e-mails show. The social-media firm started running ads in its news feed in 2012. To encourage users to spend more time on Facebook, the Silicon Valley company said it would focus on making its news feed more personalized and would prioritize the most engaging content – posts that had received large numbers of likes, shares and comments “from the world at large and from your friends in particular.” Several online publishers, including sites such as Buzzfeed, saw a surge in referral traffic from Facebook as the social-media company made it easier to share viral posts.
Critics and researchers have long warned that the social-media company’s efforts to encourage people to spend more time on the platform – in order to show them more advertisements – have helped encourage the spread of misinformation and shaped the opinions of online audiences.
The details of the company’s internal discussions are contained in documents sealed by a California court as part of a lawsuit involving Facebook and app developer Six4Three. Some of the documents were seized by a British parliamentary committee investigating the company, which published 250 pages of the court file late last year.
Facebook did not immediately respond to a request for comment on the latest release of leaked documents Sunday. However, the company has previously said that the court documents, which date from 2012 through 2015, had been “cherry-picked” from the lawsuit and did not reflect the full scope of internal discussions at the company. “The set of documents, by design, tells only one side of the story and omits important context.”
Years later, Facebook finds itself at the centre of a widespread backlash over the content-ranking and recommendation algorithms used by social-media platforms, which have helped undermine democratic elections and spread hate speech and misinformation online.
Aiming to combat the spread of hate speech, disinformation and political extremism online, Facebook chief executive Mark Zuckerberg announced a sweeping overhaul of Facebook’s news-feed ranking system last year to prioritize posts from friends over publishers and brands.
But the social-media giant had early warning signs that changes to its news feed were influencing user behaviour. According to the internal presentation, when employees studied the effects of the changes in 2014, they noticed people were re-sharing dramatically more posts, but were writing fewer status updates and other text posts.
“When we turned up re-shares in the system, text posts fell,” the presentation says. “Correlation does not always imply causation – but it does in this case.” The changes had been only partly rolled back by 2014, the document notes, though it does not specify which changes were reversed.
Employees also noticed that the posts people were writing seemed to be influenced by the kind of content they were seeing on their feed.
The data suggested that while people often decided in advance whether they wanted to post a photo to Facebook, written posts tended to happen more spontaneously, with people deciding to write a message only after reading something they had come across on Facebook.
“Unlike photos, text posts are malleable and can be ‘triggered’ by what users see in feed,” the company wrote.
About 7,000 documents from the court file were first reported by digital magazine Computer Weekly, NBC and freelance investigative journalist Duncan Campbell. The Globe viewed more than 300 pages of those documents.
The social-media firm has also been a flashpoint in the global regulatory debate over data privacy after political consulting firm Cambridge Analytica improperly accessed personal information on tens of millions of users.
On Friday, the U.S. Federal Trade Commission reportedly approved a record US$5-billion fine against Facebook over how it handled the personal information of its more than two billion users. The settlement is also expected to come with closer regulatory oversight of Facebook’s data privacy practices. Facebook had previously said it anticipated being fined between US$3-billion and US$5-billion.
TAMSIN MCMAHON
U.S. CORRESPONDENT
The Globe and Mail, July 14, 2019