Steve Clemons: Hi, I'm Steve Clemons and I have a question. Is there a way to protect free speech online while fighting misinformation and disinformation? Let's get to the bottom line.

A lie can travel halfway around the world while the truth is putting on its shoes. The saying might be a few centuries old, but it certainly rings true today. Anyone anywhere can publish anything and if it catches on, there's no stopping what it can do either for good or for evil. So, how can people be protected from the nefarious manipulation of information and the disruptive promotion of falsehoods by governments and organisations? Or should fake news be allowed to compete in an open market with real news?

Fortunately, we have three people in the room who have the answers to these questions. Melissa Ryan, a digital strategist who studies online toxicity and extremism and writes the weekly blog, Ctrl Alt-Right Delete. I love that name.

Jody Westby, a legal expert on privacy and computer crime and the founder of Global Cyber Risk.

And Jennifer Brody, who has worked on programs combating online disinformation and works at the digital rights group Access Now. Thank you all for joining us today. We're going to have some fun with this one I think.

So, let me just start with you, Melissa. Censorship or free speech. How do we draw that line between what we see as my right to say what I want and the question of basically becoming a source of conspiracy theories and falsehoods?

Melissa Ryan: I think it's really important that we consider the frame of speech and who social media empowers. The tech companies have consistently made policies that empower the already powerful. So, what happens is, for the vast majority of us, not only is our freedom of expression limited, but our right to be safe and free from hate and harm online is curbed.

Clemons: But do we have to have a judge for that? Do we have some judge who says, "Oh, the less powerful people get a certain safe space to do this, and you more powerful people don't get that"? I mean, I'm just sort of interested in how you draw the line in a tangible way, in an understandable way, between those who have, at least in the United States, a guaranteed right of free speech, and those who are behaving irresponsibly. And there's a lot of clamour to shut some of that down.

Ryan: Yeah, I mean, you really have to think about how many people don't have free speech because they've been driven offline entirely or they have been the victims of violence that started online. I think Facebook and the genocide in Myanmar is the most prominent example, but, you know, here in the US, we certainly have folks who have been driven offline because they were harassed to the point where they had to radically alter their lives. The focus of this free speech conversation always seems to centre on white men of a certain age, and it leaves out women and people of colour, who have been disproportionately the victims of harassment. No one's really talking about their speech.

Clemons: I want to go to something, because this is a big issue. This is a big issue, politically, in the United States right now. And I think it's a big issue in the rest of the world, where I think no matter where you're at, you're trying to find that tipping point, that inflexion point. Fascinating exchange between Mark Zuckerberg, the CEO of Facebook and Congresswoman Maxine Waters. Let's play that.

Maxine Waters: Two-point-seven billion people use your products. That's over a third of the world's population. That's huge. That's so big that it's clear to me and to anyone who hears this list that perhaps you believe that you're above the law. And it appears that you are aggressively increasing the size of your company, and are willing to step on or over anyone, including your competitors, women, people of colour, your own users, and even our democracy to get what you want. You have opened up a serious discussion about whether Facebook should be broken up.

Clemons: Now, before I get to Mark Zuckerberg's response, Jody, what she's basically saying is Facebook, you're acting like you're your own country. You have your own rules. You have your own gravitational forces and you don't have to behave in an accountable way. Let me give you all the power for a minute and you're going to fix Facebook and you're going to fix this. What should we be doing?

Jody Westby: Well, the first thing we should be doing is the SEC should start taking a look at Mark Zuckerberg. They are the only ones who have any power to do anything about him: to remove him from office, to check what he does within the company. And so far, all they've done is chase Elon Musk and leave Mark Zuckerberg alone, when the company is notorious for a pattern. The first thing he wants to do is make money. The second thing he wants to do is say, "Oh, we're sorry, we apologise. We recognised this later than we should have." And then, "Now, we'll do something about it," and make a promise.

So, when he has a third of the world's population, he does have a huge amount of content coming over his network. And as we know well from the Senate Intelligence Committee reports and the investigations, that social network, particularly Facebook, has been abused.

And so now, we are seeing him in the crosshairs. One thing that he should be doing is he should be informing people when they see things. He could first of all be monitoring against their terms and conditions and informing people. And he should be removing content when it violates the terms and conditions, not only from Facebook but his other sites like Instagram, so something doesn't carry forward on another site. But he is in a huge power position and so far our government, the regulatory agency that could do something, has done nothing.

Clemons: I want to hear what Mark Zuckerberg said, but I want to just come back in a moment after we do this. When you talk about money it's important because he's the CEO of a public company. Elon Musk is the CEO of a public company, as well. And so part of that is, I find it interesting is, how do we look at CEOs, have a certain responsibility. That responsibility traditionally hasn't necessarily been for the public good, though they may want to pretend that's the case. They have a bottom-line responsibility, but let's listen to Mark Zuckerberg.

Mark Zuckerberg: Chairwoman. Our policy is that we do not fact-check politicians' speech. And the reason for that is that we believe that in a democracy it is important that people can see for themselves what politicians are saying. Political speech is already some of the most scrutinised speech in the world.

Clemons: So, Mark Zuckerberg is making the case that he wants political speech out there. Political speech for which he's being paid to put on his sites, of course, but he wants to create an open market so that the public can look at what a politician is saying and that people can judge it. Whether it's true, whether it's falsehood, and have that competitive arena. But we've got some news that just recently - Jack Dorsey, the CEO of Twitter, sees it differently.

He writes, "We've made the decision to stop all political advertising on Twitter globally. We believe political message reach should be earned and not bought. Why? A few reasons." And I encourage folks to go take a look at Jack Dorsey's tweet and then the many links there. So Melissa, tell me what you think about Jack Dorsey's message versus Mark Zuckerberg.

Ryan: Well, I mean, it's interesting. With Mark Zuckerberg, the frame of that hearing was: we know we've messed up everything else, but please trust us with your money, because Facebook wants to start a cryptocurrency. All of the tech platforms have broken the public trust in such a big way. I don't think Facebook realises how big a problem they have with ads. People have an expectation that when they see a political ad, the information will be true and that the network or the platform will be a gatekeeper and keep false information out. Facebook doesn't seem to understand this.

As for Twitter, which I've been a huge critic of, I'm not always a fan of the way they do things. But I think they see the writing on the wall. I think they've seen how much terrible press Facebook has gotten for its ads decision and decided that they want none of this. It's just easier not to take that ad money than to constantly be in the position of fact-checking political ads.

Clemons: Jody?

Westby: Well, Facebook has pulled off a very clever, genius tactic here, when it invited the real media to come onto its site and offered to pay them, because most people were starting to look at Facebook with a bit of a question about the legitimacy of what they were reading.

Clemons: We should note for the global viewing audience that Facebook has started a Partners Program for America's leading media.

Westby: They have. They started a Partners Program where they invited the Wall Street Journal, USA Today, the Los Angeles Times, a number of the leading newspapers and press, as well as some smaller ones, and they are paying them to put their content on Facebook's site. The effect is that it legitimises the Facebook site as a legitimate news source, and therefore, I think, it will be harder to weed out all the fake news that's in there and all the misinformation.

I think it was a genius move on his part, even paying them, because he will now continue to make money because people will still be coming there for legitimate content. But it's going to be harder now, I think, for people to discern between what's real and what is not.

Ryan: And that decision is already controversial, too, when you consider that Breitbart News was one of the outlets that Facebook chose. Breitbart has a long history of anti-Muslim propaganda and hate speech. We know from a BuzzFeed exposé in 2017 that they were in constant communication with notorious alt-right figures about coverage. So, Facebook is already facing intense criticism for this new partnership.

Clemons: There is a hilarious story here; I find it very funny. Somebody went out and advertised on Facebook with some fake ads, and this fellow's name is Adriel Hampton. Originally, the campaign of Mike Gravel, the former senator from Alaska who used to run for everything, he was running for president, had gone out with an ad saying that Senator Lindsey Graham, a staunch conservative, certainly not supportive of the Green New Deal, was a big advocate of the Green New Deal. And Facebook took that ad down. So what this fellow Adriel Hampton, who had raised the money, did was file to run for office, to run for governor, and he is now challenging the Facebook norms and rules, and he's got that ad up there again.

So what do you think about that? Is there a kind of civil resistance to basically take on ... I mean I don't know what Ctrl Alt-Right Delete does, but is this sort of one of the shenanigans that you guys might hatch?

Ryan: Well, we are a weekly newsletter, so I write about all of these issues. One thing that we -

Clemons: Are you going to write about this?

Ryan: Absolutely. We've covered it a lot already, and one of the most interesting criticisms was that Facebook wouldn't define who counted as a politician. When they announced this policy, someone made a joke on Twitter, I think it was a New York Times reporter: "That's the sound of a thousand edgelords running for school board." Because if anyone can be a politician, you can actually abuse the system to make it so that your ads won't be fact-checked.

We know there are already some [inaudible 00:12:25] platform alt-right figures running for Congress. There are three QAnon believers running for Congress. Facebook is now in the position of determining, "Is this person a politician, or are they not?" You mentioned Adriel Hampton. Facebook said, "Oh, you're not really a politician, because you're only running for office to prove a point." They're going to be forced to decide who is and isn't a valid politician on Facebook over and over again, as long as this policy stands.

Clemons: In your world where you're trying to provide access to those that don't often have it, what's happening there? Because I know you work globally.

Brody: Sure. So yeah, Access Now where I work, we're a global nonprofit defending and extending the digital rights of users most at risk online. We do policy work, we do advocacy work. We also have a digital security helpline. It's a pro bono service. Human rights defenders, civil society can reach out if they feel that they're threatened online.

Clemons: So, I know in some cases, with the Rohingya and with others, you see internet blackouts happen. Is there a case to be made that maybe we need to live with the chaos, the false news and the real news, rather than empower a government that could itself begin lying to us? Jody?

Westby: I think Zuckerberg is doing two things right, to give him credit. One is, he's standing strong against government intrusions on encryption, and that's a good thing. Second is, when he says we have freedom of expression and, if we flood the marketplace with thoughts and ideas, then there's more out there and the truth rises to the top, he's right. That's always how we've viewed democracy. We've never shut down the speechmaker. We've just given all the people speaking the truth a voice as well.

But there's an interesting nuance here, and people get confused about it. And that is what Trump did with Cambridge Analytica and the data that they gathered. How they did it was perfectly legal. It would have been perfectly illegal for a foreign entity or a government to do it to try to influence our election. So, we can have political candidates who try to sway the opinions and views of a voter, and that is okay in our country.

So, when we start talking about someone lying, well, sorry, but all politicians lie. We know that. So, part of it is sorting that out, but it's also realising: what is the difference between our politicians lying, and how do we know it's our politicians and not actually a misinformation campaign sponsored by another country? It gets very confusing for the public to sort this out, I think.

Clemons: So, who regulates that?

Westby: Well, we have federal law that prohibits it; the federal election laws prohibit foreign interference.

Clemons: And this is your comment about the SEC and others. As I listen to you -

Westby: No, the SEC has to do with Zuckerberg's voting power and the fact that no one, not even his board, can control him. The only entity in the world that can control him is the Securities and Exchange Commission, because his is a public company. The Federal Election Commission has the regulatory power over the laws that govern foreign interference.

Ryan: Although I would point out that the Federal Election Commission right now is not functioning. Because the commissioner [crosstalk 00:17:28] so when people ask who's -

Westby: [inaudible 00:17:30] no one's home.

Ryan: Who's watching the store. [crosstalk 00:17:31]

Clemons: But Jody put something on the table that is really important, which is the nefarious actions of foreign governments. How do we mitigate that, particularly the state-based players?

Westby: Well, yeah, 47 countries have been identified that have used state-sponsored groups to attack political opponents or activists. And that's twice the number from last year. So, we have a steep upward curve of people using this to exploit. But what do we do about it? We have a big problem, because President Trump and Senator McConnell in the Senate don't want to do anything about election security or about regulating any of these activities. So, unfortunately, I don't see anything happening in the US from lawmakers. It's either going to be a regulator or a court.

Clemons: To your knowledge, what is the United States doing about it? Are we disrupting anyone?

Westby: Of course.

Clemons: I mean, I use "we" in the American sense. I know we have a lot of global watchers, so excuse that for a moment. But look at NotPetya, this Russian virus that was released and created tens of billions of dollars of damage. [EDITOR'S NOTE: White House officials reportedly estimated the damage at more than $10 billion]. It's a fascinating story to look at what this computer malware achieved, but what was fascinating about it is that this malware was based upon American malware, [crosstalk 00:20:06] it was tweaked American malware. So, I guess my question is, to the best of your knowledge, where does the United States stand in terms of being one of the malicious players in other countries?

Westby: Well, look, I think the New York Times did a really good article on this. We have played in and interfered with elections around the world, as intelligence activities and as covert activities, for decades. That doesn't necessarily mean that it's OK. But maybe we're doing it in our national interest, because we're trying to overthrow an authoritarian regime, or we're undertaking actions that we feel are in the best interest of international law and democratic principles. So, you have to look at it in that whole context. But I just set that aside, because all governments conduct intelligence activities.

Clemons: But when you say that, I get goosebumps, Jody, because what comes back to mind is that what we begin to do internationally can begin to happen domestically. I remember J. Edgar Hoover and the FBI and the wiretaps, which we're learning about more and more, and the various dimensions we saw of wiretaps without warrants, which are part of our history, and very recent history at that. Do you worry about a national security state here, where every digital moment in our lives becomes a risk?

Ryan: I worry about every moment in our lives becoming a risk. I also worry about domestic actors here who have learned from malicious foreign actors and have adopted those tactics. I think the rhetoric -

Clemons: This is the Alt-Right crowd.

Ryan: The Alt-Right crowds.

Clemons: Who are your biggest villains in that story?

Ryan: In that story? I mean, honestly, in this cycle, I'm really worried about what the Trump campaign and the various independent actors, Democratic and Republican, are -

Clemons: Do you see the Trump campaign as the villain?

Ryan: I see them as one of the villains. Yeah.

Clemons: And who else?

Ryan: I worry about all of the various independent expenditure groups. Again, we have an FEC that's not functioning. We have tech companies that, for the most part, have said that you're allowed to say and do whatever you want online. There are no rules, and when there are no rules, bad actors are going to be bad. It's not -

Clemons: Jody, what is your experience with this? You've been watching the nexus between technology, cyber-behaviour and the national interest for a long time. I'm really interested in the question of how we begin to frame this in a more mature way. I mean, we would always like to wave a magic wand and say, "Everybody be respectful. Everyone play nice." But that's not going to happen. The real world is not a kumbaya world. The real world is one where groups pursue their interests. So how can we really put things in place and move the needle to get a more responsible, or more accountable, set of behaviours here? And then there's the global dimension, because what is happening here today is happening in elections in Europe, in Asia, in Africa and in Latin America, and we see the same pattern of players. No one seems to have risen above that yet. So, what's the secret pathway to a better environment?

Westby: If we take the situation as it is and say we're on the eve of the 2020 election, then I think the first thing is to have an awareness campaign on recognising fake news, helping people understand. When I was in school, it was: well, you've got to consider the source, and people would challenge us, "Is that a reliable source for your reporting?" So, one thing is to better inform the public about fake news, and another is to get the social media companies to take socially responsible actions.

For example, the Trump campaign went on Facebook and bought ads, and they were getting all of this voter registration information by claiming that the person hadn't registered or that his registration was going to expire. And then they would finally kind of send them to a voter registration site, but it was a scam to get people's information. So, that kind of tactic should not be allowed. That should clearly be a violation of Facebook's policies. They should be aggressively monitoring what advertisers are doing. If they don't want to monitor what's happening to meet their terms and conditions, then do what Twitter did and just ban the ads.

But then there's another part here that the public doesn't understand, and that is there's a huge private-sector role in our elections. Private-sector companies are largely the ones involved in voter registration. They're the ones that program most of the voting machines for every single election. They're the ones that tally most of the votes. They're the ones that do the polling and they're the ones that do the reporting of the votes. People don't understand that. They think all the votes are within our election officials in these towns and communities. And they aren't. Those are very serious points of exploitation and so I think there are some things we can do about that. There are things we can do in this cycle to just ... this is a prime moment. We're on the learning curve with technology. We have to step up and teach our public.

Clemons: Thank you for that. And Jennifer, in the last couple of minutes we have, I'd like to hear from each of you: what is the most important lever we could push to get to a healthier balance between expression and responsibility?

Brody: Great question. I think a big piece of this puzzle that we haven't touched on today is addressing the lack of data protection laws, right? Disinformation and misinformation are so malicious and problematic because they are hyper-micro-targeted. If companies were only collecting the amount of data that is reasonable for them to collect, that targeting would be blunted. Because disinformation has always existed. It's not new. What's new is the tools.

Clemons: So, that would be your area, Jody.

Westby: Yeah. I think the SEC should begin an investigation into Mark Zuckerberg and consider removing him from office, from his company. That would send the most powerful message to every social media platform around the world.

Clemons: That's a big zinger. Melissa, what's your zinger?

Ryan: I think we need a new social contract with tech companies. I think we have to push them. Use every regulatory and legislative solution we have to push them on political ads, to push them on speech and push them on, again, protecting the vast majority of their users.

Clemons: I really appreciate this conversation. If I had my own zinger, I would probably say, I want to look at the consumption and I want people to become better critical thinkers. To look at everything sceptically and learn how not to get their passions and emotions hijacked. We can debate that later.

Clemons: I want to thank you all for being with us. Jody Westby, CEO and founder of Global Cyber Risk. Jennifer Brody, legislative manager of Access Now. And Melissa Ryan, editor of Ctrl Alt-Right Delete and CEO of Card Strategies. Thank you so much for being with us. Great conversation. Thank you.

So what is the bottom line?
More than two decades into the information revolution, it's easier than ever to hijack people's emotions and inflame their passions. Human nature hasn't changed just because technology has.
Education and critical thinking have never been more crucial. But all of us are going to need some new rules. And private companies are going to need guidance.

There is no easy way around it: we all share the responsibility of protecting speech and penalising those who would lie and manipulate. And that is the bottom line.

Source: Al Jazeera News