Calling Bullshit

Hey Facebook: What’s that Smell? Part 1

Calling Bullsh!t, February 9, 2022


Background

Our guests

Sinan Aral

@sinanaral

MIT Professor & Author of “The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health–and How We Must Adapt”

Lucie Greene

@lucieluxury

Strategist & Author of “Silicon States: The Power and Politics of Big Tech and What It Means for Our Future”

Rosemarie Ryan

@rosemarieryan

CEO of co:collective

Kamran Asghar

@kamasghar

Founder & CEO of Crossmedia

Is the company claiming to bring us together actually tearing us apart?

Stated purpose: Give people the power to build community and bring the world closer together.

January 6th, 2021. For the first time in U.S. history, an armed mob at the U.S. Capitol attempted to disrupt the peaceful transition of power from one presidential administration to the next. In the aftermath, it became clear that Facebook had played an outsized role in inciting the event.

How did Facebook go from a place for people to share silly cat videos to a space for bad actors to make the world considerably worse? In this episode we look at Facebook, starting all the way back when it began as a dorm-room prank, and trace its development into one of the world’s most powerful and dangerous companies.

We’ve got to transcend this debate about whether social media is good or evil. The answer is yes.

-Sinan Aral

Facebook’s BS score is 72

Show notes

Episode Transcript

MUSIC: “Shit Got Real” by Jess Fenton

[SOT] False news is an ever-growing virus that threatens to corrode the very spirit of democracy. Facebook is front and center.

[SOT Roger McNamee] There’s more than one problem going on here, right? I mean, the election problem, the media problem, that’s one class of things. The data privacy is another class.

[SOT President Obama] If we are not serious about facts, and what’s true and what’s not, if we can’t discriminate between serious arguments and propaganda, then we have problems.

[SOT Mark Zuckerberg] It is our mission to connect everyone around the world and to bring the world closer together.

TY MONTAGUE 

Facebook says that they’re bringing us together… 

LUCIE GREENE

Ha!

THEME MUSIC: “In Passage” by Migration

TY MONTAGUE (VO)  

Welcome to Calling Bullshit, the podcast about purpose-washing…the gap between what companies say they stand for and what they actually do — and what they would need to change to practice what they preach. 

I’m your host, Ty Montague. I’ve spent over a decade helping companies define what they stand for — their purpose — and then helping them use that purpose to drive transformation throughout their business.

Unfortunately, at a lot of companies and organizations today, there’s still a pretty wide gap between word and deed. That gap has a name: we call it Bullshit. 

But — and this is important — we know that Bullshit is a treatable disease because we’ve helped countless companies close that gap. So when the Bullshit detector lights up, we’re going to explore things that a company should do to fix it. 

TY MONTAGUE (VO)   

We’re devoting two episodes to taking a hard look at Facebook. Like you, I’ve been thinking a lot about the company since January 6th, 2021, when…

[SOT President Trump] We will never give up, we will never concede…

 TY MONTAGUE (VO)

…an angry, armed mob stormed the United States Capitol building.

President Trump and others had fed them the big lie…  that the election had been stolen.

 [SOT President Trump] All of us here today do not want to see our election victory stolen by emboldened radical left democrats….

 TY MONTAGUE (VO) 

And it spread like wildfire on social media, including Facebook — ultimately leading to a frenzied mob out for blood.

 [SOT Crowd] Hang Mike Pence!

 TY MONTAGUE (VO) 

Glued to the news, watching the chaos unfold, I couldn’t stop thinking: how the hell did we get here?

MUSIC: “Yes Indeed” by Jess Fenton

TY MONTAGUE (VO) 

Watching the Capitol riots that day, I could clearly see how much damage the gap between word and deed can actually do.

So I decided to take a harder look at as many so-called purpose-led organizations as I could, to try to understand how many of them are just purpose-washing. What I discovered led to the creation of this show.

 Each week, I feature a company or an organization and invite experts to help me investigate the potential gaps between what they say they stand for and the actions that they’re taking. We’ll explore why these gaps exist, and, more importantly, what the organization needs to do to close them.

So let’s dig in to Facebook, episode one, starting with a quick rewind.

TY MONTAGUE (VO) 

…back to when Facebook was just an online network for college kids. Born in Mark Zuckerberg’s dorm room at Harvard in 2004, Facebook took off fast. In those early days, the company’s motto was “move fast and break things.”

[SOT Mark Zuckerberg] People are just too careful. I think it’s more useful to, like, make things happen and then apologize later than to make sure you dot all your i’s now and just not get stuff done.

TY MONTAGUE (VO)

Soon, Facebook was for everyone. It was where we all reconnected with old friends and shared pictures of our dogs, and by 2015…

[SOT] If you used Facebook this past Monday, you were in good company. You and about a billion other people… that’s one in seven people on earth…

TY MONTAGUE (VO)

As it grew, some things definitely got broken, and the company came under increased scrutiny — which really reached a boil around the 2016 Presidential election.

 [SOT] We’re learning more about how groups believed to be linked to Russia used Facebook to meddle in the 2016 election…

 [SOT]  Facebook’s newest scandal revolves around a data analysis firm called Cambridge Analytica… it raises some troubling questions, including about Facebook’s role in targeting voters… 

 TY MONTAGUE (VO)

So in 2017, with Americans more polarized than ever, Mark Zuckerberg announced a new, kinder, gentler mission:

[SOT Mark Zuckerberg] The thing that I think we all need to do right now is work to bring people closer together. And I think that this is actually so important, we’re going to change Facebook’s whole mission as a company to focus on this.

 TY MONTAGUE (VO)

“Give people the power to build community and bring the world closer together,” which brings us to the key question of this episode — is it all just… bullshit?

 So get out your BS detector and join me on a quest to find out.

TY MONTAGUE (VO) 

To get to the bottom of this, I talked to some people who’ve spent a lot of time thinking and writing about Facebook. First up: Sinan Aral, director of the MIT Initiative on the Digital Economy, founding partner of Manifest Capital, and author of The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health–and How We Must Adapt.

TY MONTAGUE

Sinan, thank you so much for joining us. 

SINAN ARAL

Thanks for having me.

TY MONTAGUE

I got a strong sense from the book that you believe that there is both great good that has flowed from the creation of social platforms like Facebook, as well as great harm. Can you first talk about some of the positive aspects that you see?

SINAN ARAL

We’ve got to transcend this debate about whether social media is good or evil, because the answer is yes.

I mean, I call this the promise and the peril. So we went through a decade of techno-utopianism, which was related to this mantra: Facebook was going to connect the world and provide life-saving health information, access to jobs, and meaningful human connection. Then we went through a decade of techno-dystopianism, where Facebook was destroying democracy and polarizing society and spreading misinformation and so on.

The question that we need to be asking is how do we achieve the promise and avoid the peril? What is the promise? Well, when Nepal had its greatest earthquake in a hundred years, Facebook spun up a “donate now” button and raised more money than Europe and the United States combined for relief efforts.

TY MONTAGUE (VO) 

Sinan also reminded me that Facebook is crucial to the success of important community efforts, like the Ice Bucket Challenge, which raised a quarter billion dollars for ALS research, and the Black Lives Matter movement, which relies on social media to spread awareness and information.

SINAN ARAL

Imagine that globally. And, in some countries, Facebook is the entire internet, like in the Philippines and parts of Africa. Imagine the value that’s being created worldwide. 

But I’m also realistic about all of the peril: false news, election integrity, erosion of privacy. Our real challenge now is, how do we achieve the promise and avoid the peril? 

TY MONTAGUE

Right. You do a great job of unpacking the promise. Could you spend another minute just unpacking the peril a little bit?

SINAN ARAL

Think back to the 2016 US presidential election. Russia sent manipulative messages to 126 million people on Facebook and 20 million people on Instagram, and sent 10 million tweets from accounts with 6 million followers on Twitter. Erosions of privacy, our lack of control over our own social networks, live streaming of mass murders like the Christchurch, New Zealand mass murder, lots of bullying and hate speech, potential effects on depression, loneliness, and isolation, which we’re still sorting out scientifically. All of these effects are very important, and then there’s the polarization of society. Does the algorithm polarize us into political factions that hate each other?

TY MONTAGUE 

Yeah. Yeah. In the book you use the term “filter bubbles.” Could you go into what you mean by that term?

SINAN ARAL 

There are two main algorithms that run social media. One is the “people you may know,” or friend recommendation, algorithm, which really guides the structure of the human social network online. And the other is the set of feed algorithms that control the flow of information over this network.

The more these algorithms (a) connect us to people who are like ourselves, who believe what we believe, and (b) feed us information that is more likely to be what we already believe and what we want to see, the more we get trapped in our own way of thinking and the less access we have to diversity. And this has been shown in large-scale experimental research to polarize society and to create affective polarization, which is hatred of the other side. That is essentially what I mean by filter bubbles: being trapped in an information environment where you’re only seeing things that comport with what you already believe.

TY MONTAGUE 

And the algorithms are designed to do this? 

SINAN ARAL 

Well, the algorithms are actually designed to maximize engagement, primarily, which means to keep you engaged with the platform because that’s what the business model runs on.

TY MONTAGUE (VO)

This is how Facebook works: it’s an attention economy. Facebook needs attention to sell to advertisers, who use it to persuade users, whether that’s to support a political campaign, back a good cause, or buy new shoes. Attention is essential to its business model.

SINAN ARAL 

So how do you keep the attention? By giving people things that either rile them up, that are emotionally charged, or that comport with what they already believe.

TY MONTAGUE 

…And so, though the algorithms are not necessarily designed to separate us, by being designed to maximize our interaction with them, they essentially pull us apart as a result.

SINAN ARAL 

Exactly. 

So, when it comes to polarization, we know through large-scale experimental evidence that the algorithms pull us apart. Now, is that the sole cause of polarization in America, holding constant news media like CNN versus Fox, holding constant politicians, who are themselves polarizing?

It’s very difficult to know what is primarily or solely responsible for polarization. But we do know that the algorithms have an important role in that.

TY MONTAGUE (VO) 

Sinan says that even though most of us want truth, what really makes us click is a strong emotional reaction. And it’s here that my bullshit detector is going off, because Facebook knowingly manipulates users’ emotional reactions by optimizing the algorithm to reward the inflammatory and the ridiculous.

TY MONTAGUE 

You point out that one of the, you know, primary drivers of attention is novelty. 

I think you said “novelty is the short game and authenticity is the long game.” And I just wonder if you would talk more about that, those two ideas. 

SINAN ARAL 

Yeah, absolutely. So, we conducted a study, which was, at the time it was published, the largest longitudinal study of the spread of true and false news online; it ran on the cover of Science in 2018.

And it was a 10-year study of verified true and false news stories spreading on Twitter. What we found in that study was that false news traveled farther, faster, deeper, and more broadly than the truth in every category of information.

And when we looked into why, novelty was the primary explanation. People were shocked and awed by things that were surprising to them, and false news was much more novel, shocking, and surprising to them than true news. So, in a sense, novelty and surprise, shock and awe, are engaging in the short term.

What we also found in that study was that these stories are emotionally charged. They’re blood-boiling, they’re anger-inducing, and that gets your attention. The only problem is that I don’t want to see that all the time. When I think about what I want out of the information that I consume online, the types of people that I’m drawn to, in terms of influencers and so on, are people who are authentic, people who are real, and people who are relatable. So what I mean is that the short-term, shock-and-awe, anger-inducing, blood-boiling, surprising information might spike engagement in the short term, but it’s not sustainable.

And so, the true leaders of the new social age are going to be the ones that realize that the long-term shareholder value of their companies really should align with society’s values, not just the shock and awe of short-term spikes in attention.

TY MONTAGUE

Ooh, preach, love that. 

So do you think that there’s a world in which the algorithms could actually be redesigned to do a better job of bringing us together?

SINAN ARAL

Absolutely. I think that’s really part of the promise. Now, two things that are essential to achieving that are algorithmic transparency and choice. We need to know how these algorithms work. We need to give consumers choices between different algorithms, and be upfront about which algorithm is going to deliver what type of result.

TY MONTAGUE

So Sinan, on the Calling Bullshit podcast, we are always trying to discover how wide the delta is between word and deed in a company. Fundamentally, Facebook says that they are bringing us together. So, bottom line, how wide do you think Facebook’s delta is?

SINAN ARAL

Currently their delta is very wide. You know, they have espoused a tremendous amount of the promise, and they haven’t paid enough attention to the peril. So I do believe that they would prefer to be achieving the promise and avoiding the peril, but it takes effort, and it sometimes costs, to do that. And I think they haven’t invested enough in addressing the problems that are being created in society.

They’re realizing that their current path is not sustainable. The question is, what is the path to closing the delta? Right now we’re at a crossroads: between truth and falsity, between meaningful human connection and polarization, between democracy and authoritarianism. And the next 18 to 24 months are absolutely critical, because right now the world’s attention is on these platforms. We have to lean in, we have to make the right decisions, and we have to have science under the hood of all of these conversations. We need those transparent discussions to take place, and we need them to take place now.

MUSIC: “Introducing” by Jess Fenton

TY MONTAGUE (VO)  

Sinan believes in the promise of Facebook, but because they haven’t sufficiently addressed the peril, we, as a society, are at a crossroads. How Facebook responds in the next 18 to 24 months will be crucial for closing that gap.

And because of this prognosis, I wanted to get a second opinion. My next guest, Lucie Greene, is a futurist, strategist, and author of the book Silicon States: The Power and Politics of Big Tech and What It Means for Our Future.

I first asked her why she thought Facebook changed its mission to focus on “bringing people together” in the first place.

LUCIE GREENE

I think it’s partly to do with the scale of these companies and how quickly they grew. They epitomized, let’s say, in the post-bubble era, this very glamorous idea of startups and hacking and all of that stuff. And at a certain point, with the scale that Facebook had reached, it becomes disingenuous.

You see the rise of this idea of purpose branding generally becoming a bigger thing. So Facebook is not in a vacuum here. You have Google removing the whole “Don’t be evil” thing, but also making big claims to solve disease and solve aging, and Airbnb being about belonging. This was led by big tech in a major way; the lexicon of brand messaging really shifted during this moment to something bigger.

LUCIE GREENE

In the bigger context, you have Facebook, which can no longer claim to be a plucky startup, moving into international markets like India, like Africa, like South America, and having a sort of reckless image is not the way that you do that, right?

TY MONTAGUE 

That makes sense. What I did notice in the timeline is that right after he made this announcement, or very soon after, the Cambridge Analytica scandal broke. Do you think that’s a coincidence, or do you think those might be connected?

LUCIE GREENE

I think it’s interesting in Facebook’s trajectory. They are quite responsive to these major scandals, depending on how major they perceive them to be.

So, for example, the previous big change that you saw in Facebook’s messaging, maybe after move-fast-and-break-shit, is after the Aaron Sorkin movie comes out, which is hugely unflattering.

TY MONTAGUE  

Oh, that’s right. The Social Network. 

LUCIE GREENE

Right. So The Social Network comes out, and suddenly Mark Zuckerberg presents himself as a major philanthropist, right? He was sort of using major philanthropy as a balm for the negative imagery about him being an immature brat who created a women-face-comparison website. So you do see that as a sort of pattern, and I wonder also the degree to which Lean In might have been somewhat strategic about the lack of diversity at Facebook.

After the Cambridge Analytica scandal, I listened to the Facebook shareholder call, and the shareholders were saying: even if this was incompetence or lack of foresight, you should be investing more in looking at potential fallout from policies like this, because they could have a real knock-on effect on the share value of this company. Well, they were extremely dismissive of that. But it became very clear that Facebook’s notion of foresight, be it strategic or ethical or whatever (to me, it should be all of those things), is woefully underinvested in. And I also think it’s structurally not empowered. I speak to people from Facebook, I have friends who work for Facebook, and it’s very much a “yes” culture. People don’t like it if you disagree or question or debate anything.

TY MONTAGUE

What would you say their real mission is? What are they really trying to do? 

LUCIE GREENE

I think they’re trying to own more and more aspects of the total communication experience, in terms of dialogue and expression, and not just individual communication but business communication too. They’re ramping up their B2B communication and integrating hugely, and I haven’t seen much about the privacy of B2B messaging. They’re just trying to own more share of human and business communication, and to monetize that through targeted advertising.

TY MONTAGUE

So, just following the thread of the Cambridge Analytica scandal for a second: after it broke, there was a consumer backlash, #DeleteFacebook. I had a bunch of friends who deleted their Facebook accounts at the time. But it didn’t seem to have much effect.

LUCIE GREENE

I think there’s a real cognitive dissonance between what is becoming more discussed in the public sphere and what is totally embedded in people’s consumer behaviors. I think people are becoming more and more aware of the way Amazon treats its employees, but they’re still using Amazon; we’ve seen that from clear data points. More and more aware of the gig economy and how predatory it is, and yet we’re all still catching Ubers and ordering from Caviar and so on.

I think it would be very easy for people to leave Facebook itself. It’s where your mom hangs out, right? It’s become more extreme, it’s become more right-wing. But they’re acquiring much more with every acquisition (for now, anyway; as we saw this week, they might be broken up). It’s very easy to leave Facebook. It’s very difficult to leave WhatsApp.

I think by diversifying in the way that they have, they’ve managed to make sure that, at least to some degree, they’re capturing some part of your life in a way that is sort of inescapable.

TY MONTAGUE

Right, yeah. So you can leave Facebook, but you can’t ever really leave Facebook is the idea, right? They’ve got you surrounded.

LUCIE GREENE

Yes.

TY MONTAGUE

Losing a few eyeballs during Delete Facebook didn’t seem to slow them down. But more recently, in the wake of the George Floyd murder in Minnesota, a new outcry arose from organizations like Stop Hate for Profit, asking advertisers to boycott Facebook because the platform has continued to essentially condone and spread hate speech. A lot of advertisers responded and signed up. Big ones: Coca-Cola, The North Face, REI, Unilever, giant global corporations. But even that seems to have petered out. I thought maybe they would listen, because ad dollars are ultimately the thing they care about at the end of the day. But even that didn’t seem to have an effect.

LUCIE GREENE

I think it speaks to how much they own the consumer experience, right? 

TY MONTAGUE

Even advertisers can’t avoid it.

LUCIE GREENE

I think advertisers have literally no power in this scenario. So if you are an advertiser, there isn’t really much alternative. And if you were a CMO, accountable for that stuff, I think you ultimately just have to vote with your KPIs.

TY MONTAGUE

Right. So, okay, Lucie, we are always trying to discover how wide the delta is between word and deed in a company. Facebook says that they’re bringing us together. How wide do you think Facebook’s delta is?

LUCIE GREENE

Ha. I mean, it could not be more polar, right? There’s just a huge, huge gap.

TY MONTAGUE

So just to put a fine point on it, Lucie: is Facebook a bullshitter?

LUCIE GREENE

Yes.

TY MONTAGUE

Thank you.

BS RATING THEME MUSIC 

TY MONTAGUE (VO) 

Folks, it is time to make the call: is Facebook “giving people the power to build community and bringing the world closer together”? Based on what I’ve heard so far, I’m calling bullshit.

But remember: Bullshit is a treatable disease — so after the diagnosis, we always discuss the cure. I’ve invited three authorities on the topic to join us for a round-table discussion of the positive actions Facebook should take to sort themselves out. This conversation, and Facebook’s official BS score, after a quick break.

INTERSTITIAL MUSIC 

TY MONTAGUE (VO)  

Welcome back. Since we have concluded that there is a pretty big gap between what Facebook says it stands for and what it actually does, what should Mark Zuckerberg and his leadership team do to solve it? In this podcast, we don’t just curse the darkness; we also like to light a few candles. So I’ve assembled my own panel of experts and asked them to propose some concrete things Facebook should do to get back on track.

TY MONTAGUE  

Joining us, we have Sinan Aral, who you have already met and who graciously agreed to join us for the group discussion here as well. Thank you for that, Sinan, and welcome.

SINAN ARAL

Thanks for having me.

TY MONTAGUE  

And we are joined by two new guests. First, Kamran Asghar, founder and CEO of Crossmedia. Founded in 2000, Crossmedia is a forward-thinking, 100% transparent, highly analytical, totally integrated creative media agency with over 160 employees. Thanks so much for being here today, Kamran.

KAMRAN ASGHAR

You’re very welcome.

TY MONTAGUE  

Last, but certainly not least, we have Rosemarie Ryan, founder and co-CEO of co:collective, and also my business partner. Rosemarie is a renowned leader, brand builder, and rabble rouser in the marketing industry. Thank you for being here, Rose.

ROSEMARIE RYAN

Delighted to be here.

TY MONTAGUE  

Alright. So let’s get right into it. Sinan, I’m going to ask you to go first. What are some things that Facebook could do differently to truly live its mission to empower people to build community and bring us all closer together? 

SINAN ARAL

Well, I think it’s a good mission, if we can live up to it. And I think there are three things that Facebook could do to more closely align with their mission.

First, move from short-term thinking to long-term thinking. Second, listen to their employees. And third, truly embrace transparency.

And let me tell you what I mean. So right now Facebook’s business model is based on attention and engagement, and what they’ve focused on so far is a very short-term understanding of attention, trying to maximize engagement. But I think the true leaders of the new social age will be the ones that realize that long-term shareholder value is maximized by aligning the company’s profit incentive with society’s values. Because when you have a short-term engagement model that creates fake news, bullying, and hate speech, and that creates crowd action like we saw in the Capitol riot, as well as stock market actions like we saw with GameStop, that might maximize short-term advertising profit, but it risks a regulatory and public-opinion backlash that is not good for Facebook’s bottom line in the long term.

They should give consumers choice between algorithms one through ten, and have a dropdown menu on my feed that says, I want the more-diversity algorithm, or I want the such-and-such algorithm, with much more background on how each of them works. And they need much more comprehensive, transparent, precise, and detailed content moderation policies. They currently don’t have that.

The one last thing I’ll say is that they’re going to have to thread what I call the transparency paradox, which is that we have been asking them to be simultaneously more transparent and, by the way, more secure at the same time. The way they do that is with technical and policy-oriented approaches like differential privacy, which allows you to reveal data without revealing individuals’ private information, so that you can be more transparent and more secure at the same time.

TY MONTAGUE  

Thank you, Sinan. Rosemarie, what are your thoughts on ways that Facebook might better do their story?

ROSEMARIE RYAN

So I guess the first thing I would say is that I think they have an excellent story and an excellent purpose, one that is not just critical for them to fulfill, but one that we as a world, as a community, need to get behind. I think they now need to step back, slow down, and fix things, because it can no longer be about growth for growth’s sake, but growth in service of community. And that involves not just doing the odd thing in service of this, but really taking stock, focusing, and investing their time and their considerable talent. They have some of the best talent in the world.

For me, there are two key issues that I think they need to tackle. The first is polarization. The second is misinformation. They talk a lot about empowering people, giving people the power. They use people’s data as a way to sell advertising, but they don’t actually share it with the people they’re taking it from. If we were able to give people the tools to see what they’re engaging with, it might actually help them be more mindful and behave slightly differently.

So, you know, one thought is that they could share a usage report at the end of each week that shows your exposure to different types of content. Think about the kind of Apple model: raising your awareness of what you’re seeing and how much time you’re spending with it. With conspiracy theories, with puppy content, with fake news, with family news, you name it.

Maybe show how much of that content is labeled fake or untrue (we can talk about how we actually get to that), and how much is negative in tone versus positive. There are lots of different ways to break down your data, and they have all of it. Once they have that, they can maybe tie it to some kind of accountability: help you watch less hate content, lower your usage of things that are more conspiracy-driven, give people actual tips on how to navigate the content they’re consuming.

I mean, there are a lot of different apps right now that use nudges as a way to encourage healthier behavior. So how can they use more nudges on the platform? And then maybe take it one step further: rather than just a usage report that you get every week, actually build it into the platform and your usage in real time. And then, last but not least, they should be sharing all of that data with the world. Make it global. Show everybody what’s actually really happening.

TY MONTAGUE

Yeah, love it. Okay, Kamran, bring us home. Let’s hear a few thoughts from you, and then we can get into the discussion.

KAMRAN ASGHAR

Well, I’m probably in the camp that maybe it’s too late for Facebook to change. And if they were to change, then really I think a rewrite of the mission entirely is in order. Trying to bring the world together is bold and ambitious, but it’s also a bit naive, and it’s what’s led to some of the issues that they have today. I would bring it down to a much simpler version. I felt compelled to rewrite it, and it goes something like this. It’s more of a purpose statement:

“Our purpose is to make it easy for people to connect. Our duty is to protect your privacy. You have a choice to join our community. The platform is free for you to use. If you join, you’re allowing Facebook and your friends to see whatever you share about yourself. You can control who sees your information and you should manage your settings the way you might manage anything extremely important to you, like your finances, your car, or your house. We strive to do the right thing. As a technology company, we are always in beta and as a result, we sometimes get things wrong. When we make a mistake, we vow to be open, transparent, and communicative to you about exactly what is happening and the potential impact to you.”

I think you turn the mission statement a little bit on its head and create more purpose behind what you’re actually aiming to do: “connect” and, at the same time, “protect the privacy of its users.” A rewrite of the mission is in complete order.

TY MONTAGUE  

All right, some great ideas. Sinan, I loved several of the things that you talked about there, in particular the move from short-term thinking to long-term thinking. It sounds like it’s sort of a fantasy that they are currently maximizing shareholder value, because what they’re doing today is threatening shareholder value by thinking short term and not taking stakeholders into consideration.

How do we convince them of that? You know, because, in a way, you have to convince a publicly traded company to forgo some short-term income in service of thinking longer term. That’s a hard thing. 

SINAN ARAL

Yeah. I have some thoughts on that. So in my book, I describe four levers that we have to sort of fix the social media crisis that we find ourselves in. And those levers are money, code, norms, and laws.

Money is the business models of the platforms, which govern the incentives for how advertisers and consumers behave on the platforms. Code is the way the platforms and the algorithms are designed. Norms are how we adopt and use social media, and the zeitgeist about what we think of it in society. And laws, of course, are regulation.

So money and code are under the purview of the platforms, in a sense. In terms of norms, things like the Delete Facebook movement and the Stop Hate for Profit movement express a public sentiment about the unhappiness that people have with the direction Facebook has taken. The employees whistleblowing and leaving in droves express an opinion from the inside about how happy employees are with that direction. That pressure needs to continue and be stepped up.

Secondly, I think we can make some meaningful inroads from the perspective of regulation. The reason that Facebook doesn’t have any incentive to change is that it doesn’t have competition. So it can continue doing what it’s doing, and its short-term view of profits is profit-maximizing. People can’t leave Facebook easily, because they’re technically locked in by network effects and so on. We need to look at regulation. When I say that, people immediately think, “Oh, you mean break up Facebook.” But that’s not what I mean. The social media economy runs on network effects, which means that the value of a platform is a function of how big it is.

And markets that run on network effects tend toward concentration; they tend toward monopoly. So if you break up Facebook, it’s just going to tip the next Facebook-like company into market dominance. What we need is structural reform of the social media economy itself. We need to be able to connect across platforms in a way that preserves the value created by the network effects, and allows me to switch from one platform to another and still keep my social network. That will enable us to vote with our feet when we don’t like the policies, or the privacy policies, of a given platform, and that kind of switching is what’s really going to enable competition.

ROSEMARIE RYAN

I was just going to jump in and say that I think people already are voting with their feet. Facebook is becoming an increasingly toxic environment, and the audience they have there is not actually the most desirable audience for attracting advertisers. Younger generations have already decided that it is not a place where they want to spend their time, and new platforms like TikTok are starting to fill that space. So beyond the long-term question, I think there are some immediate short-term growth issues that they’re probably starting to feel, and those will only increase if they don’t start to address some of these problems.

TY MONTAGUE  

And Kamran, I see you nodding there. I’d love to hear your perspective, because you’re at the coal face with clients, helping them decide whether or not to invest in Facebook. What kind of conversations are you having with them?

KAMRAN ASGHAR

Yeah, it runs the gamut. Most clients understand the value Facebook can bring, but they’re very, very wary, like all of us are. And I think the idea of competition is starting to emerge. Some of the things you’re seeing with Apple iOS 14 and the limiting of data access are really going to start to chip away at Facebook’s power, and make it less of a tool for advertisers to extract certain kinds of data and more of a general advertising platform that does what advertising is intended to do, which is help move minds and behaviors, not manipulate.

On the idea of regulation, though, I’d like to go back to that for one second, because I completely agree there needs to be some sort of regulatory body. And, frankly, I’d like to see the founders and owners of these platforms step up and create a council of self-regulation amongst themselves, and decide that they’re going to put some standards and practices in place, not unlike the alcohol industry, where you have to market messaging around drinking responsibly, bringing full transparency to how they operate.

But right now, because Facebook and other social platforms are technologies or platforms, not media companies or consumer brands, they’ve completely skirted all the regulation that is typically applied to other industries. You would never send spoiled food into a market; the FDA would never allow that. The same way, you would never put a toy in a kid’s hand that could explode in their face. And essentially, Facebook and social platforms have been allowed to take data and use it in unintended, harmful ways. I would like to see some self-regulation occur first, before the government steps in. And if they can’t, then I really do feel the government needs to decide once and for all that for social platforms there is a way to operate and behave that is mandated by law.

ROSEMARIE RYAN

Yeah. They are media platforms. They’ve been arguing for years that they’re not, but that’s exactly what they are.

KAMRAN ASGHAR

I wish that Facebook would come out and just say they are a media platform, but unfortunately they don’t have writers, they don’t have editors, and so they don’t have the kind of code of ethics that most media abide by. At least, good, trusted media abide by one, right? So users can say and do whatever they want, and Facebook has completely washed their hands of any kind of obligation around editing what is on the platform, or what kind of code of ethics should be used on it. It would be great if they considered themselves a media company, but they simply don’t.

TY MONTAGUE  

Rosemarie, you talked about this idea of a daily report that just reflects your usage back to you: what you’ve been exposed to and why. Why would Facebook not do that today? Why does that not exist?

ROSEMARIE RYAN

I think it would cost them money, and they would have to expose just how much data they probably have on people. To your point about not being a media company, Kamran, I don’t think they want to feel like they’re stepping in between people and how they’re communicating on the platform. Although I think it’s a moral obligation for them to help people understand. This is a new tool; people would argue that our brains are being rewired. If they now understand what the consequences of this are, people themselves should get to see them. And I think if you make it more personal, so that I personally understand what I’m engaging with, I become much more intentional about what I do moving forward. So, I don’t know. Perhaps they think it will slow down that engagement.

TY MONTAGUE  

Yeah, or expose things that they don’t want exposed. Sinan, you had a similar idea: a dropdown that allows me to choose the algorithm I want, to maximize for diversity of thought or whatever else I actually want to be exposed to. I think that’s awesome. Why would they not do that?

SINAN ARAL

If you combine Rosemarie’s idea with this idea of choice and algorithmic transparency, giving users feedback about the choices they’ve been making and then giving them those choices, as well as transparency around those choices, you might see them making healthy choices rather than unhealthy ones. Basically, you’d have a dropdown menu of algorithms one, two, and three, and then there would be published papers written by non-Facebook scientists, researchers, academics, and experts who were given access and reported back, from an objective standpoint, about what they found and how each of these algorithms behaves.

And this is also part of the movement around healthy eating, where you put up a digital mirror in front of people that reflects their choices and behaviors back to them. That is a quote-unquote nudge, as Rosemarie was talking about, to behave in healthier ways and to create a healthier communications ecosystem.

KAMRAN ASGHAR

I could envision a hundred percent opt-in model where maybe you don’t even need to see the algorithm, but just inform Facebook and advertisers what you’re interested in and when you’re interested in it, and then have brands actually pay customers for their data. Now, that’ll cost Facebook money. But the idea of control and incentive aligned with transparency, I think, could be a game changer for Facebook, for any platform, and for digital advertising as a whole.

TY MONTAGUE  

So I want to shift gears for a second and talk about what happens if Facebook decides not to do any of this and continues whistling past the graveyard, pretending everything is fine. If they refuse to change the way they’re operating today, what would be a better way of describing what Facebook is truly up to? So Rosemarie, if you were going to rewrite Facebook’s mission to align with the way that they actually behave right now, what would you say?

ROSEMARIE RYAN

I mean, I think I would just abbreviate it. So I would say it’s: give people the power to build community and support their own worldview, and damn everybody else.

TY MONTAGUE  

Right. 

ROSEMARIE RYAN

That’s the mission they’re currently on. 

TY MONTAGUE  

“To capture and hold the attention of the world in service of advertisers everywhere” would be another way to explain what they’re doing. And maybe a meaner way to put it would be “to capture and hold your attention in order to extract the maximum amount of your personal data, in service of any company, organization, or individual who is able to pay us for it.”

ROSEMARIE RYAN

Pretty accurate. Part of the reason why they came up with this mission statement, I imagine, is because they realized the unintended consequences of some of what they had built, and decided they had to take a step back and really think through what their responsibility was and what their real intention was. So I think they did the work of getting to this; they don’t seem to have done the work of actually making it real. Going back to storydoing: it’s an excellent story, one that we all want to get behind. One that their employees, to Sinan’s point, want to get behind; that most of the users on their platform want to get behind; and certainly, I think, that the advertisers would like to get behind. But they have not put in the energy and the effort to really make it a reality. I think Mark Zuckerberg, when he sets his mind to something, can do pretty much anything.

And he’s got brilliant people surrounding him; actually, some lovely people work for him. So what is stopping them from actually taking this very powerful and compelling mission and putting it to work? That’s the question. I’m kind of confused, and I don’t think it’s just economic. They’ve got a ton of money.

SINAN ARAL

I think this notion of connecting the world is aspirational and I agree that they are just not aspiring hard enough to get there. I also don’t really understand what prevents them, because I think that they could be enormously financially successful and still truly aspire to achieve that mission.

KAMRAN ASGHAR

I’ve heard you talk about this, Sinan. They’re so focused on the likes and the shares, which are obviously what they monetize, both in the social sense and in the advertising sense. Everything in the organization points not to the mission but to those metrics, and so the mission doesn’t stand higher than the metrics. But if they really took a hard look at the business and said, we could diversify and offer pretty unique features and products to certain segments, then I think they could start to think about themselves a bit differently than just focusing on those two metrics that drive advertising.

SINAN ARAL

I think that’s a really, really good point. It brings up a pet peeve of mine, I’ve got to say, which is the metaphorical concept of the like button. Okay: the like button. What is the like button?

It says that what we prioritize most is popularity: the more people like something, the more valuable it is on our platform. And I’ve always thought about that like button, because it’s so arbitrary.

KAMRAN ASGHAR

So arbitrary.

SINAN ARAL

So let me give you a different hypothetical reality. What if there was a knowledge button? Or a truth button? Or a “this taught me something” button? Or a wellness button?

So, right now we’re all incentivized to be as popular as we possibly can, to get the most likes. What if we got the most truths, or the most “taught me somethings”? Would we then be incentivized to teach people things, to bring truth to the discussion, and to encourage wellness? And if that were the case, could we make the mission and the metrics match, by designing the platforms to have metrics that match the mission?

ROSEMARIE RYAN

I love that. I think that would really help with true community building when you’re bringing people together. 

KAMRAN ASGHAR

That’s awesome. The number one thing that I wrote down, in terms of what Facebook could do to change their behavior entirely, and people’s behavior, is to remove the like button. But I didn’t have an answer for what would replace it. And I think a series of diagnostics based on truth and real opinion would open it up into a whole new stratosphere in terms of understanding what people really feel and think. That’s actually a useful tool.

ROSEMARIE RYAN

Yeah. A “made me think” button.

TY MONTAGUE  

Okay, folks, we solved it.

ROSEMARIE RYAN

Does that mean you’re not going to fire me, Ty?

TY MONTAGUE  

You get to keep your job too, Rose, which, you know, was an unexpected outcome, but I’ll go with it. Thank you, everybody.

MUSIC: “Shit Got Real” by Jess Fenton

TY MONTAGUE (VO)

I’d like to end the show today by giving Facebook an official BS score. On this show, BS exists on a scale from zero to one hundred: zero being the best, zero bullshit; one hundred being the worst, total bullshit…

It’s fairly obvious Facebook isn’t living up to their mission of “giving people the power to build community and bringing the world closer together.”

So I’m giving them a BS score of 72. That’s pretty high. If you disagree, visit our website, callingbullshitpodcast.com, to weigh in with your own score. We’ll also track their behavior over time to see if they can bring that score down, and you’ll be able to see where Facebook ranks on BS compared to other companies we feature on this show.

And if you’re running a purpose-led business, or you are thinking of beginning the journey of transformation to become one, here are three things you should take away from this episode: 

1) Because young consumers are demanding it, there is a lot of pressure today on business leaders to declare the positive purpose that they are pursuing. But your purpose isn’t something you use for a press release and then stick in a drawer. In Facebook’s case, the purpose of “empowering everyone to build community and bring the world closer together” is great. Mark just needs to prove that he really means it. Which brings us to takeaway number two.

2) Once you’ve aligned on your purpose, it’s all about action. We’ve talked today about a number of actions that Facebook could take. Actions like giving people a daily report on what content the algorithm has exposed them to, and why. Or giving people choice: a dropdown menu that allows you to choose the kind of content you want to see, so you can tune the platform to your interests. The actions for your company will undoubtedly be different. The point is that taking action is a vital part of being purpose-led.

And 3) Transparency is key. No company is perfect. That’s okay. A good purpose should articulate a vision that the company has to work toward over time. As a leader, you need to be honest about where you’re succeeding and where you’re failing. That builds trust. In Facebook’s case, their leadership needs to do a much better job of letting us know what they are really committed to doing to change the platform.

Speaking of which, Mark Zuckerberg: if you ever want to come on our show to discuss any of these ideas, or any other aspect of today’s episode, you have an open invitation.

TY MONTAGUE (VO)

OK, so that was our original ending, back in August 2021. But Facebook, true to form, kept right on moving fast and breaking things, which put them right back in the news:

[SFX] News dial

MUSIC: “Chaos” by Jess Fenton

[SOT Haugen] The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook.

[SOT] Facebook has known for years that Instagram is toxic to teenage girls.

[SOT Haugen] The only way we can move forward and heal Facebook is we first have to admit the truth… The way we’ll have reconciliation and we can move forward is by first being honest and declaring moral bankruptcy.

TY MONTAGUE (VO)

Whistleblower allegations, lawsuits, Wall Street Journal revelations, and right in the middle of all of that, the name change to Meta. The BS definitely got deeper, and we decided that we had to keep digging. 

So please join us for episode two, where I speak with Ramesh Srinivasan, an engineer working at the intersection of technology and humanity. We’ll give Facebook a new score and, as ever, try to find a few more candles to light. I’m sure there’s a candle in there somewhere.

MUSIC: “Expectations” by Dylan Sitts, Megan Woffard via Epidemic Sound

TY MONTAGUE (VO) 

Thanks to our guests today, Sinan Aral, Lucie Greene, Kamran Asghar, and Rosemarie Ryan. You can find them, ironically, on social media.  

We’ve got all their handles on our website: callingbullshitpodcast.com. 

If you have ideas for companies or organizations we should consider for future episodes, you can submit them on the site too. 

And check out Sinan’s book, The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health–and How We Must Adapt, and Lucie’s book, Silicon States: The Power and Politics of Big Tech and What It Means for Our Future.

If this discussion hit the “made me think” button for you, subscribe to the Calling Bullshit podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.

Thanks to our production team: Susie Armitage, Hannah Beal, Amanda Ginzburg, Andy Kim, D.S. Moss, Mikaela Reid, Lena Bech Sillesen, Jess Fenton, and Basil Soper.

Calling Bullshit was created by co:collective and is hosted by me, Ty Montague. Thanks for listening.

Don’t agree with our Bullshit Score?
Give us your take.

