Confessions of an ex-Facebook trending news curator: ‘They are just going to get rid of the product altogether’

This article is part of our Confessions series, in which we trade anonymity for candor to get an unvarnished look at the people, processes and problems inside the industry.

Barely two days after Facebook made changes to its Trending Topics section, a fake article about Fox News anchor Megyn Kelly began trending on the site, before disappearing entirely on Monday morning.

The incident came on the heels of Facebook’s announcement on Friday that human curators would no longer write the short descriptions that accompany trending topics on the site. Its trending news team was then shown the door. The Trending Topics section has been shrouded in controversy since May, when a Gizmodo report alleged an inherent bias in its prioritization of news articles, triggering a Congressional inquiry as well as an internal Facebook investigation.

The changes, clearly, are not devoid of problems. For our latest Digiday Confessions, we talked to a member of Facebook’s now-defunct trending news team, who reflected on the recent developments and on their time working on the team. Digiday has also reached out to Facebook for comment but has not yet received a response.

Were trending topics “biased”?
I wouldn’t say that it was a systemic problem with bias per se, but there were things in that Gizmodo article that were accurate. Ninety percent of the team identified as liberal, including the copy editors, who essentially had final approval on topics. If a source came up that might have been less credible to a liberal reviewer — like Breitbart or another publication like that — it would require more extensive secondary sourcing. But if an article came from a more liberal-slanted publication, it essentially received less scrutiny and was a more viable topic from the get-go.

Did anything change once the article came out?
Facebook actually put more checks in place to make sure that wouldn’t happen. And it did kind of stop. They balanced it out in that those liberal publications came under more scrutiny. If there was a topic that could potentially be biased, they would require more eyeballs on it. They made changes to the algorithm itself, so less credible topics wouldn’t pop up in the feed. People paid a lot more attention. The writing style also got drier. With any headline that had loaded adjectives, like ‘Hillary Clinton attacked for emails’ or something like that, they were more focused on changing the verbiage.

Was anything exaggerated?
There were things that were bullshit, like how they said we weren’t treated like other Facebook employees. That’s not true, we were treated like anybody else. We could go to the happy hours, participate in the events, people talked with us in the office. We weren’t tucked away in some corner. We got three free meals. But my biggest problem was that while all these perks were great, they pampered us into complacency.

What do you mean?
Most newsrooms have discussions about what’s going on. You don’t just sit at your computer and write with your headphones on all day. When we were working on these topics, and I had a question or wanted to get an editor’s opinion, I always felt like I was bothering them. The push toward quotas and producing content didn’t allow for that. You never felt like you were able to voice any concerns. Like, for instance, there were problems with the tool’s tagging feature. There were pre-set keywords, but they were sometimes inaccurate, and there was nowhere for us to flag that these topics were insufficient.

Was this symptomatic of a bigger communication problem between the trending team and Facebook’s broader culture?
It never seemed like anyone in the company actually understood what we did or how the topics were curated. There were times when another team was working with a client who happened to be trending, and they would ask if we could add a video or something because the client expected it. They didn’t get that it would mean breaking the wall between editorial and business. We would sometimes end up acquiescing to their requests and adding that video, and I just felt like that broke journalism ethics.

Wait, sales could influence trending topics?
No. Sometimes we’d get a request saying, “Hey, our client did this and it isn’t trending, can we make it trend?” And the answer to that would be no, because that’s not how the algorithm worked. You couldn’t just inject a topic after the Gizmodo article came out. In the past, we had the ability to inject news topics, but I didn’t see it happen with any sales requests. The requests we got from them were more like, if a topic was already trending and there was related media or articles, they would ask us to add that. But I still felt like that was a breach of journalism ethics.

Can an editorial function live within Facebook?
You would essentially have to have them be a completely independent team, with full control over the editorial process and no obligation to answer to anybody at Facebook. It would have to function like a newsroom. Had that gap existed between editorial and the rest of the company, it would have been a more legitimate product. We never felt the support of Facebook behind the product. It was just a little tab; there was nowhere you could go, like facebook.com/trending, to read all these topics in a feed.

Did that make you feel disposable?
Yes, I expected that we were going to get laid off and had already started applying elsewhere about a month and a half ago. You know how it looks now? With just a simplified topic and the number of people talking about it? We saw that before anybody else did, and a few of us put two and two together and figured that was probably how it was going to look; otherwise, they wouldn’t be testing it on Facebook employees.

So the purpose of the trending team was just to teach the algorithm how to eventually filter the news itself?
I would like to believe that, because that would mean that we actually served a purpose and did something good. But if you’ve used the tool in the last few days, you’d realize that the algorithm didn’t learn shit. The topics are just wrong — they have bad articles and insufficient sources. I think they are just going to get rid of the product altogether, because there is going to be backlash when people who do use the tool realize that the quality has gone down — unless there are severe algorithmic changes that improve the quality of the topics.
