
The Communicators: Social Media and Free Speech  CSPAN  January 30, 2021 6:30pm-7:03pm EST

6:30 pm
unfiltered view of government. c-span was created by america's cable television companies in 1979. today, we are brought to you by these television companies who provide c-span to viewers as a public service.
6:31 pm
>> i think a lot of people perceive that there are problems. they come from the left and the right, and they don't always agree on what the problems are, and i'm sure we're going to discuss that. a lot of people agree that there are a lot of issues with the social media companies, the so-called giant platforms having too much market power, so there are competition issues that need to be thought about and addressed. there are perhaps regulatory issues. it's a topic that is deserving of serious public discussion that ought to be thoughtful when
6:32 pm
discussing particular incidents about what one company or another happens to have done. >> i think we have learned a lot in the last decade. we have gone from a tech utopianism, where it was, isn't this great, we're going to have this great interactive discussion where everybody will be able to speak their mind, where reporting will be able to discuss things like police brutality and these injustices. we have seen that it empowers all of that. but it also empowers all of the bad things: the harassment, the would-be dictators,
6:33 pm
deliberate disinformation, or just plain humans disagreeing with each other. it has gone from some lofty debate to an endless thanksgiving dinner where your most left-wing and right-wing relatives are spending the time arguing. >> you mentioned disinformation. could you define that in your view, and also, is there a solution? harold: it is important to distinguish between disinformation and misinformation and just plain information. disinformation is taking something you know is not true and pushing it out there as if it were true. deliberately spreading a fake news story is disinformation, where you are attempting to undermine true information or
6:34 pm
trust in the reporting of true information. misinformation is where people in all honesty and good faith are repeating things that are just wrong. sometimes, they are wrong because we have learned better. initially, we thought with covid we should not have everybody try to wear a mask; we should just leave that for the folks on the front lines. then we learned that was wrong, we needed everybody to have a mask. now, if you are spreading the information you heard, that is misinformation. if you are doing it deliberately to try to get people to stop wearing masks, that is disinformation. we do have ways in which we are trying to deal with that because, as we have seen, this is not just harmless. people die, people can be
6:35 pm
incited to violence through manipulation of misinformation and disinformation. we are working on a number of solutions, both to identify disinformation and misinformation and to try to address it. peter: do you have an issue with this issue of disinformation and misinformation? and what about controlling it? randolph: it is difficult. i don't have a substantial quibble with this definition. i would say sometimes you have both; it is hard to separate the two. i am going to use another example to illustrate how we can go astray with too quickly labeling something,
6:36 pm
why we have to be careful and why people get upset about the actions of twitter and facebook and others. i don't want to relitigate the election; i want to be careful about that because i don't want to be canceled in some other place. if we take the example of the hunter biden laptop and the new york post story, and the way the new york post story was banned and relegated to secondary status and so forth, we are familiar with that. it was labeled disinformation, and that was the rationale for it at the time. everybody knows that. at the core of that was the claim by many people, including
6:37 pm
former national security advisors, a raft of them, that the russians were behind the hunter biden investigation. i think it later turned out, we know, that was largely not true. the point is not to resolve all of the facts surrounding that. one person's misinformation and disinformation is seen in a different light by another person. there are obviously certain things that are true and other things that are false. they are not always alternative facts; that is not what i am suggesting here. what i'm suggesting, as someone who has always been a strong proponent of first amendment rights and free speech rights, is that we really have to be
6:38 pm
careful when we talk about the solutions. i know we are going to get to that. one thing i will make clear up front: i disagree with a lot of the actions that twitter, facebook, and google take. from my own perspective as someone that is right of center, a free-market conservative, to my way of thinking there is a bias in their actions against those that have that perspective. but nevertheless, that said, i have always said that these companies have a first amendment right to put up on their platforms what they wish. they have a first amendment right to take down what they
6:39 pm
wish. there is a different question, though, which i assume we will get to here at some point, of their section 230 immunity from liability for being sued. i don't think they necessarily have a first amendment right to be immune from all suits. all that is a way of saying, when you get to the question of solutions and what to do about it, it is really difficult. i will throw this out before turning it back to harold; we may ultimately end up in the same place. i am beginning to believe that it is really a competition problem. the way to ultimately improve the situation really is to have more platforms and to make sure the existing platforms that we
6:40 pm
have are not acting in anticompetitive ways to suppress the emergence of new platforms. we have the example of parler. i am not suggesting i am convinced there is an antitrust solution, but i think that is a place we need to look for a solution. harold: let me respond to a couple of things. i have to say there has not been any showing of any kind of bias with regard to liberal or conservative.
6:41 pm
there have been some studies that showed there has been a moderation bias against people of color and minority communities; language used in those contexts is frequently judged more harshly by content moderators than the same language is when spoken or posted by predominantly white people and conservative organizations. really, what is going on is just the market. to the extent there is a bias, it is a bias that is market driven. they used to say in journalism, if it bleeds, it leads; therefore, the headlines were always kind of dramatic crime stories and that kind of thing. the same thing is true with social media. in place of if it bleeds, it leads, it is if it enrages, it engages. we have algorithms weighted towards giving us the most
6:42 pm
extreme views first. we are seeing that is very problematic, not just for having the kind of discussions that we need to have about events of the day, civic engagement, but for the way they radicalize people and bring them ultimately to violent action. i think antitrust is a part of this. i think we want to have a proliferation of different kinds of content moderation and engagement. as we have also seen, that is not going to solve all the
6:43 pm
problems. the concern has been, great. now that you have driven the most extreme speakers off of twitter and facebook, where the police and everybody else can see them, you have driven them to apps like telegram where they can coordinate more explicitly through encryption. a lot of this has to do with, what are the problems we are trying to get at? what are the motivations of
6:44 pm
various speakers? how do we have a policy that is respectful of the first amendment? there are very real dangers with regard to acting in a rush and trying to suppress a wide range of speech as dangerous. we have seen in the past that that goes poorly. there is actually a role here for federal regulation to provide appropriate guidelines and solutions that are narrowly focused on imminent danger. peter: you warned against a digital democracy task force set up by the government. randolph: i did. there was not much attention paid to a proposal by, i think there were ultimately eight, democratic congressmen and women who signed onto a proposal to create this task force. this would be a cross-agency task force with its own resources and budget. essentially, when you get right down to it and do a little reading between the lines, it would be tasked with trying to make determinations about what is harmful speech. that, ultimately, i think is where the effort would go under this task force. my response to that was that the cure is worse than the disease; it goes back to what i said about the first amendment. as many problems as i have with the actions of some of the social media platforms, and
6:45 pm
i know we are not going to resolve here the question of the leftward biases of those platforms; i think we have a difference on that. nevertheless, i don't want the government ultimately to be in the business of being the arbiter of what is true or not true, or harmful or not harmful. the key word, when you look at their terms of service, is harmful. they talk about obscene speech
6:46 pm
and hateful speech and so forth, but a lot of the actions that are ultimately taken are based on claims the speech is harmful. not all, but that is a lot of it. a lot of the terms of service contain language concerning the ability to remove speech as harmful. your version of what is harmful and my version, and peter's, are not likely to be the same. while we would all agree that there is
6:47 pm
certain speech that is intended to incite violence, i think you used the word extremism. we might even have large agreement about what is extreme. we would not necessarily agree on what is harmful, or when you invoke other terms like that. it gets really difficult. i am wary of the government, and i think you ought to be, of the government making those types of determinations. >> i want to pose a final question to you. were you troubled by the way twitter and facebook handled the story about hunter biden's laptop? whether you agree with any part of it in terms of being true or not, in retrospect, were you troubled by the handling of that?
6:48 pm
harold: i would say i am troubled, because i have these concerns, and because they arise with or without any sort of government intervention. the way these companies handled it was in response to a public uproar by one loud segment of the community, and then of course, when they did it, you had all the people that did not like it making their own loud response. companies are stuck in this position. here is a newspaper that is protected under the first amendment, that is giving its views. it is not a russian operation or something like that. at the same time, there are other ways they can handle it and there are things we can do.
6:49 pm
first of all, like anybody who has written a book, i am going to promote my book, "the case for the digital platform act," which is available for free online or through amazon, where i spend two chapters discussing this, because these are complicated and difficult issues. one of the things i urge is that we focus on certain types of objective behavior. there are a bunch of things we could identify as essentially forms of fraud in terms of disinformation that we have acted on in other electronic media. for example, bots designed to artificially promote stories. that is an action that can be
6:50 pm
noticed, guarded against, acted against, and made illegal in the same way we make harassment by telephone or fraud by wire illegal, because it is a concrete action being used to deceive. there is an argument for what we might call a duty to report red flag knowledge, where we don't want the companies to make an analysis of who is a good guy and who is a bad guy, but to the extent we can identify patterns that point to potentially illegal activity, there may be a responsibility to report those to the fbi or potential enforcement authorities so they can go to the courts, get a warrant, and do the due process things that are the role of government to do,
6:51 pm
so that is detected early by those who are in the best position to detect it. i agree with you that these issues raise troubling questions. we are going to have to balance a lot of different interests. i don't think these are showstoppers; i think these are things we need to work through, not wait until the pressure builds to make decisions out of panic, or leave these to a handful of private companies to act as they see best. peter: a twofold question. do these private companies have the right to ban the then-president of the united states
6:52 pm
from their platforms? do you agree that was a smart decision? harold: they certainly have the right. i do think it is important for us to understand and respect the rights of these companies to make their decisions. i do think in this case it was justified. we saw a direct correlation between the president's activities on his social media platforms and the response from people to take up, to have a violent response. the number of rioters who were there on january 6 who said, apparently in all sincerity, we were responding to our president. our president told us to fight for the constitution and for freedom.
6:53 pm
ok, there really is an imminent danger. it is responsible to take the president offline if he is the source of these inciting, in the technical legal sense, statements. i have some sympathy for the companies; you don't lightly remove the president of the united states from speaking to the public. while it was justified here, i don't think it was necessarily justified before january 6, even though i know a lot of people who thought they should have done it years earlier. peter: randy may, same question. randolph: i agree with harold they have the right to do it. again, that is based on their first amendment right. i don't think they should have taken the actions they have taken, to the extent that they made these suspensions more than short-term. the tweets surrounding the january 6 riot at the capitol, i think that is obviously a difficult case.
6:54 pm
people parsed the tweets in different ways. i think one could perhaps justify taking down certain tweets. what really is troublesome to me, and i think it illustrates the problem, is they go to the extremes, and as far as i know, even today, ex-president trump is suspended from twitter and facebook. i don't think there is a justification for that. what makes people really wonder about their actions and troubled
6:55 pm
by their actions is that they do leave up tweets from ayatollah khamenei. i know that example is used a lot, but i personally think it is a good one. he routinely tweets death to jews and so forth. i understand they can come up with justifications for why that is different, that he is one type of world leader, i suppose, and trump was a different type. i think often their rationales seem to be too much seat of the pants, to put it one way, or post hoc, to invoke the latin term, rather than consistent and principled. what i would hope for would be consistency as much as possible, and principled judgments.
6:56 pm
it is hard to get there, and congress is in the thick of it, considering changes to section 230. peter: randy may, what would you like to have done to section 230, if anything? randolph: i would like, at least, for congress to consider changes. i don't think it should be repealed completely, even though president biden and ex-president trump thought it should be repealed. it is worth looking at whether the types of things harold talked about, looking at actions rather than words or expressions, perhaps might
6:57 pm
be a solution that would lend itself to some sort of legislative language that would not trample on first amendment rights. i just want to add, finally, on antitrust: congressional committees are looking at the market power of these companies. i think that is a worthwhile exercise that needs to continue, irrespective of the outcome. peter: section 230? harold: i think we have a real problem of section 230 myopia. some parts should be excluded. for example, paid advertisements should not necessarily be
6:58 pm
entitled to the same type of protection as traditional speech. that is more of a business transaction and should be judged under business speech rules. what we are really talking about are rules of the road that don't apply without congress taking some affirmative step. 230 says you are not liable for things you would be liable for without the protection. most of the things people are upset about, like taking people off when they should not be taken off or leaving people up when they should not, are not
6:59 pm
things you are normally liable for anyway. if the problem is we need these to be more transparent and consistent, we should have a law that focuses on that. if the problem is people are being harassed or there is deception, we ought to focus on that. if the problem is misinformation is being pushed out because of the algorithms, let's start with the right policy, then think about whether we should amend 230, rather than saying 230 is the answer, what's the question. >> harold is the senior vice president of public knowledge. gentlemen, we appreciate your giving some context to this national conversation on the communicators. >> you are watching c-span, your unfiltered view of government. c-span was created by america's
7:00 pm
television companies in 1979. today, we are brought to you by these television companies who provide c-span to viewers as a public service. >> sunday night on "q&a," lawrence roberts talks about his book "mayday 1971," which examines the spring offensive when tens of thousands of anti-vietnam war protesters, including vietnam war veterans, came to washington dc in an effort to shut down the federal government. >> the story it tells is a much larger one. it's a story about how we as a nation, as a people, as individuals dealt with those periodic emergencies in american democracy. does the justice system deliver justice? can people stick by their principles or
7:01 pm
their fears? and it's the story of the clash between an embattled president, in this case, richard nixon, who confronts a social movement in the streets, in this case, the antiwar movement, just as he is fighting to get reelected. what constitutional lines did his administration cross in an effort to stay in power? >> lawrence roberts, sunday night at 8 p.m. eastern on c-span's "q&a." >> former president trump became the first president to be impeached twice. this week, the impeachment managers delivered the articles of impeachment to the senate, with maryland democratic representative jamie raskin reading the article before the senate. >> donald john trump does warrant impeachment, trial, removal from office, and disqualification to hold and enjoy any office of honor, trust
7:02 pm
or profit under the united states. >> so help you god. >> the following day, senators were sworn in as jurors in the trial. republican kentucky senator rand paul requested a point of order to dismiss the charge as unconstitutional. >> therefore, i make a point of order that this proceeding, which would try a private citizen and not a president, vice president or civil officer, violates the constitution and is not in order. >> the motion was tabled. afterward, the senate approved the rules of the trial and adjourned until tuesday, february 9, marking the start of the senate impeachment trial. watch the senate impeachment trial live at 1 p.m. eastern on c-span2, stream live, or listen on the c-span radio app. >> former federal reserve chair janet yellen was sworn in by vice president kamala harris as the new treasury secretary. miss yellen is the first woman


