Mai Rosner is a campaigner for Global Witness who, through the use of an online alias, investigated the extent of climate disinformation on Facebook. In this episode of Planet v. Profit, she shares the biggest takeaways from that experience.
Global Witness campaigner Mai Rosner and host Kirsty Lang discuss how Facebook’s algorithm serves users posts that can reinforce existing beliefs and amplify inaccurate reporting about the climate crisis.
If a user joins the platform with a tendency toward skepticism about climate change and, for example, “likes” a post that denies the climate crisis, Facebook will present that user with posts that reinforce that position. On the other hand, if a different user wants to know more about the changes the climate emergency will bring and “likes” a post from the IPCC (Intergovernmental Panel on Climate Change), Facebook will serve that user posts that reinforce the opposite position.
To study Facebook’s tendency to serve climate disinformation to users who “like” disinformation posts, Mai created an online “persona.” This persona, called “Jane,” was a blank slate at first, with no biographical background to offer to Facebook’s algorithm. But then, Mai directed “Jane” to “like” one climate disinformation page, and this single action set in motion a series of climate disinformation posts for Jane to read and be influenced by.
Mai and Kirsty also discuss how social media companies are disincentivized from reducing disinformation on their platforms, and what regulatory change it might take for them to stop this harmful behavior. The algorithmic design that reinforces users’ existing beliefs, and potentially amplifies climate disinformation, is present not only on Facebook but also on other social media platforms such as TikTok and Twitter.
Planet v. Profit is a podcast from Global Witness that holds power to account. Join us every month for new episodes as we take you into the heart of our investigations. Find all episodes of Planet v. Profit on Apple Podcasts, Spotify, Google Podcasts, as well as on the Global Witness website and wherever you like to listen to podcasts. To stay up-to-date with all of our work, you can also join our mailing list, and follow us on Twitter and Instagram.
FIND THE FACTS | EXPOSE THE STORY | CHANGE THE SYSTEM
Read the full Climate Disinformation Report: https://gwitness.org/ClimateDisinformation
Find out about Global Witness’ Digital Threats campaign: https://gwitness.org/DigitalThreats
Sign up to our newsletter: https://gwitness.org/Newsletter
Website: https://globalwitness.org
Kirsty Lang:
Our climate is in crisis, from the palm oil industry's human rights abuses to climate disinformation spreading online, from fossil fuels and illicit mining propping up violent regimes, to big banks financing tropical deforestation. And always the defenders and activists risking their lives to stand up for their land against profiteering corporations and repressive governments. We'll be hearing their stories. I'm Kirsty Lang. This is Planet v. Profit.
Mai Rosner: (COLD OPEN)
I think it's very concerning because this is the kind of information or disinformation that really changes the contours of political debate. And we know that these conversations and these narratives don't stay online, they come into our kind of political lives and shape public discourse around these issues.
Kirsty Lang: (HOST INTRO)
Well, in this episode, we're going to be diving into the murky world of climate disinformation on Facebook, following one woman's journey online. My guest today is Mai Rosner. She's a campaigner for Global Witness. Mai went undercover on Facebook in search of climate truths and climate falsehoods. To do this, she assumed a false identity online. Her cover was known as Jane. So Mai, who is Jane? I mean, where did Jane begin on her journey?
Mai Rosner:
So Jane is a young woman living in the UK. And we set out to create her persona, because in our monitoring of climate conversations we realized that people were occupying really different spaces when it came to conversations around climate. These conversations were really divided and it seemed like these narratives were increasingly getting tangled up in kind of culture wars. And from that observation, we wanted to see what the experience of a climate skeptic user might be on the platform. And that's when we got the idea to create Jane.
Mai Rosner:
In order to ensure that the recommendations and experience Jane was having online weren't corrupted by other data points that the platform would collect on her, like her interests or her lifestyle, we kept Jane's profile intentionally bare. The reason we kept it bare bones was that we wanted to isolate the experience of a climate skeptic user. If we added lots of other data about her, the recommendations would also be shaped by her other preferences. So in order to control for other confounding data, we isolated the climate conversation by making that her sole interest. And so Facebook's recommendations to her were based initially on just a single like. All that Facebook knew about Jane was that she was a young woman who lived in the UK and who expressed an interest in climate skeptic content. This was indicated to Facebook through liking a single page, Net Zero Watch.
Kirsty Lang:
So you gave Jane, from the start, some doubts about whether there's a climate emergency.
Mai Rosner:
So we began by directing Jane to like the Global Warming Policy Forum, which recently changed its name to Net Zero Watch, which is a group that regularly publishes climate disinformation. So here's a clip that Net Zero Watch posted to Facebook.
Chancellor Nigel Lawson:
The government clearly prioritizes net zero over anything else, whether it's national security or energy security or economic impact, net zero comes first. I very much doubt that government will survive that kind of dogmatism.
Mai Rosner:
And we chose this as a starting point because it's not a fringe group that exists on the corners of the internet. It's a relatively well connected group with ties to the UK Parliament.
(LOCATION SOUND) Order. Order.
Mai Rosner:
That has a large following online. So it's a more likely beginning point for someone than a small fringe group.
Kirsty Lang:
Wasn't that the group that was set up by the former Chancellor, Nigel Lawson, about 30 years ago? He famously used to be on the BBC as the resident climate denier, back when whether there was a climate emergency was still treated as up for debate, until the BBC stopped allowing him on. So I think it's that group, isn't it?
Mai Rosner:
Yes, it is that group. And like you say, Lord Lawson, who founded the group, is a prominent climate change denier. The group has now pivoted away from outright climate denial, which is something that we've seen many groups do. Researchers who track these narratives have noticed a concerted shift away from climate change denial to more distract and delay narratives, which are just as pernicious because they're watering down the conversation.
Kirsty Lang:
What do you mean by distract and delay? I mean, what is Jane actually reading?
Mai Rosner:
So distract and delay narratives are ones that attack measures to mitigate against the climate crisis. They attack climate policies. Or they try to paint environmentalists and scientists as alarmists and biased by saying things like the effects of climate change are not that serious, or renewables are unreliable, or they'll cost you your job and they're bad for the economy. And things like that aren't outright saying that climate change isn't real, but they are kind of distorting the truth around efforts to mitigate against the worst effects of climate change. And they kind of cherry pick science to portray different inaccurate images of what climate change really means.
Speaker 1:
Many climate change alarmists seem to claim that all climate change is worse than expected, but the facts don't support this. This does not mean global warming is not real or a problem, but the one-sided story of alarmism makes us lose focus. If we want to help the world's poor, who are the most threatened by natural disasters, it's less about cutting carbon emissions than it is about pulling them out of poverty.
Kirsty Lang:
I mean, I've noticed that some conservative politicians in Britain have recently started to talk about the affordability aspect, sort of implying: it's all very well saying we've got to achieve net zero, but it's going to be really hard for a lot of working people, because it'll mean even higher fuel bills and electricity bills.
Mai Rosner:
Yeah. So there's a group of Conservative MPs called the Net Zero Scrutiny Group, and they are very committed to whipping up as much opposition to net zero efforts as possible. And they're really couching this in the perception that net zero is an elitist construct and an attack on the working class. And this has the effect of really polarizing debate around climate change. It is a way of distracting the conversation, a way of encouraging inactivism and making climate change a very hot-button issue, in the same way that politicians scapegoat around issues like migration or identity politics. There is a campaign now to really make net zero the next front of the culture war.
Kirsty Lang:
And so Jane likes something on Net Zero Watch. And then what happens to her next? And what sort of content does she find there?
Mai Rosner:
Jane likes the page Net Zero Watch, and immediately Facebook's algorithm offers her a menu of recommendations of other pages that it believes, based on her existing activity, which is this single like, she would enjoy. Here's another example from their page.
Speaker 2:
The idea that there's any source of energy that we can derive that's not going to produce some pollutant as a consequence, that's the kind of nonsense you hear from people who say things like net zero. We're going to hit net zero by 2050. It's like no we're not.
Kirsty Lang:
They sort of seed doubts in people's minds, just little doubts, rather than saying it's not happening.
Mai Rosner:
Yes, exactly. And in doing so, they change the shape of discussions around climate change. So it's not about whether it's real or not. It's about attacking measures to mitigate against it, or confusing these conversations about what net zero really means and how it's going to impact people in their daily lives. And so the content primarily promoted to Jane on their page is these distract and delay tactics, which are really focused on attacking climate change mitigation policies.
Kirsty Lang:
So their tactic is to concentrate, say, on the cost of living crisis. Is that somehow to imply that maybe worrying about the climate emergency is something the metropolitan elite or the middle class can afford to do, but ordinary people can't afford it? Is that the sort of message she was getting?
Mai Rosner:
That is exactly the kind of message that they're pushing. And concerningly, we're seeing an increase in culture war tactics used in climate conversations in the UK. This kind of discourse has existed for much longer in the United States, but increasingly pundits and politicians and influencers who traffic in culture wars have been using climate as the new frontier, posing net zero as something dreamed up by metropolitan elites that will, in fact, keep the working class cold and poor. That, effectively, is what they're trying to argue.
Kirsty Lang:
How worried are you by this kind of output?
Mai Rosner:
I think it's very concerning because this is the kind of disinformation that really changes the contours of political debate. And we know that these conversations and these narratives don't stay online; they come into our political lives and shape public discourse around these issues. So already in the United Kingdom, we see connections in Parliament with this group, Net Zero Watch. One of their board members is Steve Baker, a Conservative member of Parliament. And there's the creation of the Net Zero Scrutiny Group, a coalition of about 20 Conservative MPs who are calling for fracking to be resumed and have been criticizing government efforts to decarbonize industry. Here's an example from Net Zero Watch's Facebook page that spreads misinformation about fracking.
Speaker 3:
Fracking.
Speaker 10:
Yeah, fracking.
Speaker 3:
Really?
Speaker 10:
This thing that environmentalists hate. It's like don't frack.
Speaker 3:
But it's a double edged sword, right? Because fracking has definitely polluted some water supplies.
Speaker 10:
Not really.
Speaker 3:
No?
Speaker 10:
No.
Speaker 3:
It hasn't polluted any water supplies.
Speaker 10:
Look-
Speaker 3:
Did you ever see that-
Speaker 10:
Everything pollutes something.
Mai Rosner:
We've also recently had Nigel Farage announce his campaign to get a referendum on net zero.
Nigel Farage:
I think we need a referendum on the whole net zero proposal. Why? Because it's been imposed upon people without any public discussion.
Mai Rosner:
This is really concerning because these kinds of online disinformation campaigns have the ability to derail progress towards solving our most grave collective issue, which is the climate crisis.
Kirsty Lang:
So if Jane starts on Net Zero Watch, what path does she then take down the Facebook rabbit hole?
Mai Rosner:
So the first page that Jane was offered by Facebook's algorithm after she liked Net Zero Watch was a page called Climate Depot. This next clip is an example from the Climate Depot Facebook page.
Speaker 4:
An A to Z of all the global warming claims: you'll find out about all the extreme storms that aren't happening. In other words, as CO2 has risen in the atmosphere, extreme weather has actually declined. Droughts, floods, tornadoes, hurricanes, all on climate time scales, are down. There was a survey by the American Meteorological Society a few years back of US meteorologists, and up to 75% were skeptical of global warming.
Mai Rosner:
Climate Depot is an outlet run by Marc Morano, who is a very high profile climate denier based in the US. He runs communications for a for-profit think tank called CFACT, which has received funding from ExxonMobil and from Chevron. And they are a climate denial group.
Kirsty Lang:
So that's the Facebook algorithm that's taking her onto, if you like, more extreme sites.
Mai Rosner:
Yes, that is the Facebook algorithm. And we decided to take the simulation a step further. So we then liked that page, Climate Depot, and saw, again, Facebook offer a menu of recommendations. The pages that we looked at through that series of recommendations were mostly conspiracy pages or climate denial pages that had very small followings. And that just shows the funneling effect of the information becoming more extreme, the radicalization, and the communities becoming smaller as you go further and further down the rabbit hole. So what we found was that Jane was not only directed to more disinformation of the same degree, which just affirmed her existing beliefs, but often that information got worse. What began on a page full of distract and delay narratives ended on conspiratorial pages about chemtrails and climate change being a hoax to control the population.
Speaker 5:
I mean the plans, when you look out there, the plans the environmentalists want is for us to give up our way of life, when they're not going to get China or India to give up their way of life. So they've gotten so extreme about it, they're encouraging ecoterrorism now.
Kirsty Lang:
Do you know how the algorithm has changed over time? Are we now seeing Facebook using algorithms that have a greater tendency to send people like Jane down these ever more extreme rabbit holes, if you'd like?
Mai Rosner:
So what this ultimately comes back to is Facebook's business model. Facebook gets 97% of its revenue from advertising. And the way they continue to generate profit is by keeping users on their platform for as long as possible. So this is an engagement based model for profit. Facebook tries to keep you on its platform for as long as possible to show you as many ads as possible, so that they can generate revenue. But all the while that you're on the platform, they are also amassing data points about you: your characteristics, your likes, your dislikes. They use that data to sell more targeted advertising. And it is not just the advertising that is targeted to your profile, it's also the information. Facebook is exploiting the principle that individuals like to consume content that affirms their existing beliefs. So they could serve two users with different profiles completely different information on exactly the same topic. And that drives users into filter bubbles, where they're only encountering information that affirms their existing beliefs.
Speaker 5:
I think we are involved in causing the climate to change. I do. I also don't care. I know you're supposed to care passionately about this issue. I don't care. I think our role is overstated by a bunch of antagonistic hysterics who think that we have to scare ourselves into carrying out their agenda so that they can get government funding.
Mai Rosner:
But what the engagement model does further is that, in order to keep you on the platform, they continuously serve you information that is more extreme, more likely to elicit a reaction, more likely to get you to share, or to be angry, or to like, or to comment. And so it's the engagement based business model, the optimizing for user engagement, that is causing this filter bubble effect, which is pushing users down rabbit holes.
Kirsty Lang:
It's kind of like a drug addiction really, isn't it? They have to kind of sort of up the dose all the time, as you get more used to it.
Mai Rosner:
So I think now is probably a good time to introduce our other user, John. What we noticed with Jane was that she was continuously served information that affirmed her beliefs and that at times became more extreme. If it's true that the way we consume information is pushing us into these filter bubbles, then it should also be true that someone who cared about the climate crisis and was seeking out good information about it would be served more good information.
Mai Rosner:
And so in order to test this, we set up a second account and gave him an identity. His name was John. And John, similar to Jane, did not have many identifying features other than that he was a man living in the UK who expressed an interest in good climate information by liking the IPCC's Facebook page. And as soon as he did that, Facebook's algorithm continued to serve him more good information about the climate crisis. We traced his trajectory through multiple page recommendations, and John was continuously served more reliable climate science information. And this split screen existence between two people from the same place, on the very same platform, dealing with the same issue, exemplifies the polarizing nature of not just big tech, but the way that we consume information.
Speaker 6:
Hundreds of scientists issued a new report, warning our world is rapidly warming.
Speaker 7:
And we begin this hour with breaking news out of Geneva, a dire message from climate scientists.
Speaker 6:
Authors of the Intergovernmental Panel on Climate Change report say drastic cuts to greenhouse gas emissions across the globe are desperately needed.
Mai Rosner:
Contrary to Jane's experience, John was consistently served more reliable climate science information. And that kind of split screen reality between the two experiences, on the very same platform, shows that radicalizing effect, where two people are being served completely different information on the same topic.
Kirsty Lang:
Now this starts to explain why we live in such polarized societies. The Facebook algorithm is just reinforcing our bubbles all the time.
Mai Rosner:
Exactly. We saw the division between the two realities very starkly in the kind of information that Jane and John were offered. For example, in Jane's world, she was being served information that said that there was more Arctic ice than ever. Whereas John was shown accurate information, which detailed how Arctic ice is melting at an alarming rate and contributing to sea level rise. Jane was served information that extreme weather events like hurricanes, tornadoes and wildfires are not increasing. Whereas John was accurately shown information that these events are increasing. And that they're not only increasing in frequency, but that they're increasing in severity.
Kirsty Lang:
I mean, do we know what this does to somebody, like Jane, as they start being led down more extreme rabbit holes?
Mai Rosner:
Yeah. So what it does is weaken support for climate action and really muddle conversations around climate change. So we're no longer speaking about the best ways to tackle the climate crisis, but instead end up bogged down debating the proven reality of the climate crisis. That in turn hinders the ability of policy makers to take meaningful climate action.
Kirsty Lang:
Now, Facebook, or I guess we should call them Meta these days, admit that climate disinformation is a big issue on their platform and their solution has been to set up a climate science center. Do you know whether that'll have any impact?
Mai Rosner:
Yeah, so they set up this climate science center, which was designed to connect users with reliable information on climate science. And they also announced this initiative where they flag climate content with these banners that direct users to the climate science center. Of the content that we analyzed, less than 25% of climate disinformation had a flag. Only 34% of outright climate denial had a climate science flag. And at the same time that Jane was sporadically and infrequently receiving these flags on the content she was viewing, she was also being encouraged to actively like and follow pages that almost exclusively espoused climate change denial or disinformation. So those two efforts are completely at odds.
Kirsty Lang:
And why is that? I mean, that sounds kind of crazy. Is that because Facebook just doesn't have enough moderators? Or, as you say, is it just the profit motive? They're making so much money out of advertising and directing people down these rabbit holes that they don't have an incentive to tackle it.
Mai Rosner:
Yeah. Facebook does know how to solve these problems. That is a point that was really emphasized by whistleblower Frances Haugen in her testimonies to the UK government and to Congress in the US: they know how to fix these problems, but they do not have the will to do so, because they're unwilling to put user safety ahead of their profits. So even alongside the climate science center, another initiative that they launched was a $1 million investment into groups who are fighting climate disinformation. $1 million is equivalent to about 30 minutes of Facebook profit. So they're just not committing the requisite resources to tackle the scale of this problem, and they have shown time and again that they are unwilling to self-regulate.
Kirsty Lang:
Well, I think we can hear a clip now of Mark Zuckerberg, the CEO and founder of Facebook. He has been challenged on this issue and this is one of his responses.
Mark Zuckerberg:
At Facebook, we do a lot to fight misinformation.
Speaker 8:
Mr. Zuckerberg, yes or no. Do you acknowledge that there is disinformation being spread on your platform?
Mark Zuckerberg:
Sorry I was muted. Yes. There is. And we take steps to fight it.
Speaker 8:
Thank you. Yes or no. Do you agree that your company has profited from the spread of disinformation?
Mark Zuckerberg:
Congressman, I don't agree with that. Our approach to fighting misinformation, of which climate misinformation I think is a big issue, so I agree with your point here, is that we take a multi-pronged approach. One is to try to show people authoritative information, which is what the climate information center does. And then we also try to reduce the spread of misinformation around the rest of the service through this broad third party fact checking program that we have, in which one of the fact checkers is specifically focused on Science Feedback and Climate Feedback types of issues.
Kirsty Lang:
Mai, in your view, what does Facebook need to do to show that it's serious about tackling this problem?
Mai Rosner:
I think that Facebook has proven that they are unable to self-regulate, and it really is up to governments to step in and legislate against the power of big tech to shape our realities in these really dangerous ways. So far, the European Union has gone the farthest in doing so with its flagship legislation, the Digital Services Act, and we are campaigning to get the most robust version of it passed. If passed, it will require these platforms to conduct risk assessments of their impacts on fundamental rights, and it will also require transparency measures that give independent auditors access to information about these algorithms.
Kirsty Lang:
This is basically about regulation. I mean, you are saying that the only way really to tackle the behavior of big tech is to regulate it. I mean, I know in this country, for instance, there's regulation for broadcasters and newspapers. You can't just publish hate speech and lies. Would you like to see these platforms treated as though they were publishers or media platforms, and be subject to similar types of regulation?
Mai Rosner:
So one of the issues with algorithms is that we really don't understand how they work. And there are very few people in the world who do. It is thanks to the good work of many researchers that we even have a basic understanding of how these algorithms are shaping our conversations. In large part, they remain a black box, and even many people within these companies don't understand exactly how the algorithms work and how they're serving us different kinds of information. And for something that has such a large impact on our lives, on our public discourse, on the health of our democracies, we need more transparency and more information about how these algorithms are working and how they're shaping our behaviors. These are platforms that we engage with every day. 50% of adults in the UK and in the US say that they get their news from social media.
Mai Rosner:
So there needs to be greater responsibility placed on these platforms about how they shape our realities in these really dangerous ways. And like I said, the European Union is making strides towards doing so, but other jurisdictions, notably the United States, need to follow suit because these companies have been allowed to run unaccountable for far too long. And this issue of climate disinformation is not the only one that Facebook has admitted to playing a role in. They admitted that they played a role in the genocide in Myanmar, and even the scale of that kind of atrocity has not been enough to get them to change their behavior.
Kirsty Lang:
How scary did you find it being Jane online?
Mai Rosner:
I found it concerning and alarming, because we are at a really critical precipice in the fight to mitigate against the worst impacts of climate change. And what it emphasized to me is that the [inaudible 00:27:12] kind of cooperation and shared understanding that we really need right now is being eroded. And we just don't have time to be debating the proven reality of the climate crisis, whether or not it's real, whether or not its impacts are serious. We don't have time. We really are at a critical moment. And I am concerned that our conversations, both online and, as a consequence, in the halls of power, are getting eroded around this issue.
Kirsty Lang:
And what do you think is driving these people to put out this climate disinformation? Do they seriously believe it? What's their motivation?
Mai Rosner:
It depends on who you're looking at. As I mentioned, in the United States there's a network of for-profit think tanks that are very closely affiliated with the energy industry and the likes of the Koch brothers and the Mercers. And they have profiteering and industry interests in changing the conversation.
Mai Rosner:
Others who are sharing this content further down the rabbit hole, on the kind of corners of the internet, might not have a motive. They might just truly believe this stuff. But the important thing is that this content not be amplified. And that's what Facebook is doing, is that it's taking these kind of fringe ideas that don't actually land with a vast majority of the electorate and they are amplifying them and they're giving outsized voice and reach and power to climate skeptics.
Kirsty Lang:
Now, I know your case study with Jane focused on a Facebook user, but presumably this kind of stuff is on other platforms, right? Like YouTube and TikTok.
Mai Rosner:
Yes, absolutely. This investigation focused specifically on Facebook, but we know that climate change disinformation spreads on Twitter and on YouTube and across many other social media platforms. It is not solely Facebook's responsibility. It is also just the radicalizing effect of big tech in general, and its effect on dividing and polarizing conversation.
Kirsty Lang:
How difficult is this to fix? Because these platforms have millions and millions of users. I mean, you can't moderate what we're all looking at, right? So is it actually quite a challenge for big tech to control this misinformation? Or is there an easy fix?
Mai Rosner:
So I wouldn't say that there's an easy fix. In the case of Facebook, you're right, they actually have billions of users around the world. And part of the reason is that they've been focused so much on growth and expansion, on becoming this gargantuan platform, but they have not invested in the kind of responsibility that you need to steward a platform of that size. So they could sink far more resources into content moderation, specifically in other languages and in other jurisdictions. Because the Facebook that we have, for example, in the United States and the United Kingdom is actually the best version. And this is the version that gave us January 6th.
Mai Rosner:
So the amount of resources that the platform sinks into English language moderation is so much higher than in any other language. And in other jurisdictions Facebook has millions and millions, billions of users, in fact, whose content is not being moderated. But beyond content moderation, there is also the issue of the business model and of optimizing for engagement. When you put engagement and profit over people's safety, it's not about every individual piece of disinformation. It's about how these narratives get amplified, how they spread, and how they are encouraging users to engage with radicalizing content.
Kirsty Lang:
So, I mean, just finally, how can we equip people like Jane to recognize disinformation?
Mai Rosner:
In some ways I just don't think that it should be left to the individuals to mitigate against these overwhelming systems that are pushing them in particular directions.
Kirsty Lang:
You'd encourage people to do what?
Mai Rosner:
I would encourage people to consume more diverse news sources, and to challenge their assumptions, and where possible to always refer to, especially on climate science, to reputable sources like the IPCC, and to refer back to the science.
Kirsty Lang:
Can you just remind people what the IPCC is?
Mai Rosner:
Yes. The IPCC is the UN body on climate change. It is a UN convened group of scientists that work specifically on the climate crisis.
Kirsty Lang:
So these are kind of the top scientists from all around the world. The most trustworthy source of information on the climate emergency.
Mai Rosner:
Yes, exactly.
Kirsty Lang:
Mai Rosner, Global Witness, thank you very much. I'm Kirsty Lang. You've been listening to Planet v. Profit. Join us next time, wherever you listen to your podcasts.
Announcer:
Planet v. Profit is executive produced by Amy Richards, Louis Wilson, Anna Zaraga, and Rachel Taylor. Music from Epidemic Sound and The Blue Dot Sessions. Edited and mixed by Brendan Welch. Ekemini Ekpo is the assistant producer. Series produced by Lee Schneider. Planet v. Profit is a production of Red Cup Agency for Global Witness. Visit globalwitness.org for more information, and to join our mailing list.