Is having bad information worse than having no information?

Silenced, for now

The moves by Twitter, Facebook, and Snapchat to remove or suspend Donald Trump’s accounts, and decisions by Google, Apple, and Amazon that led to the shutdown of Parler, raise questions about the unchecked power of social media and the future of the platforms. University of Michigan experts weigh in.

Bans, restrictions have mixed impact following Capitol riot

The amount of potential misinformation declined on at least one social media platform after actions to suspend or shut down thousands of accounts, including President Donald Trump’s, in the wake of the Capitol riot Jan. 6, according to a University of Michigan measure of “iffy content.”


Between Jan. 5 and Jan. 13, the U-M Center for Social Media Responsibility’s Iffy Quotient on Twitter fell from 14.8 percent to 11.5 percent, while on Facebook it rose slightly from 10.9 percent to 11.6 percent.

This means that fewer URLs from iffy sites appeared among the 5,000 most popular URLs on Twitter in the days immediately after the platform permanently banned the president and suspended some 70,000 user accounts.

The center’s Iffy Quotient, produced in partnership with NewsWhip and NewsGuard, measures the fraction of the most popular URLs on Facebook and Twitter that come from iffy sites that often publish misinformation. NewsWhip determines the most popular URLs each day, while NewsGuard provides website ratings, with Media Bias/Fact Check providing ratings for sites unrated by NewsGuard.
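
As a rough illustration of the arithmetic behind the measure, here is a minimal Python sketch of how such a quotient might be computed from a day’s list of popular URLs and a set of sites rated as iffy. The function names and data structures are illustrative assumptions, not the center’s actual pipeline.

    from urllib.parse import urlparse

    def site_of(url):
        """Return the host (e.g., 'example.com') so a URL can be matched against site ratings."""
        host = urlparse(url).netloc.lower()
        return host[4:] if host.startswith("www.") else host

    def iffy_quotient(top_urls, iffy_sites):
        """Fraction of the day's most popular URLs (e.g., the top 5,000) that come from iffy sites."""
        if not top_urls:
            return 0.0
        iffy = sum(1 for url in top_urls if site_of(url) in iffy_sites)
        return iffy / len(top_urls)

    # Example: 2 of 5 popular URLs come from rated-iffy sites -> quotient of 0.4 (40 percent)
    print(iffy_quotient(
        ["https://iffy-example.com/story1",
         "https://www.reliable-example.org/report",
         "https://another-iffy.example.net/post",
         "https://reliable-example.org/analysis",
         "https://www.newsroom-example.com/article"],
        {"iffy-example.com", "another-iffy.example.net"},
    ))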

“We shouldn’t overlook the fact that Facebook’s Iffy Quotient was already lower than Twitter’s on Jan. 5 and, on average, has actually been lower than Twitter’s over almost the last two years,” says Paul Resnick, director of the center and the Michael D. Cohen Collegiate Professor of Information. “Still, it is encouraging to see a marked drop in Twitter’s Iffy Quotient after they very publicly intervened on their platform.”

In addition to the decline in iffy sites among the 5,000 most popular URLs on Twitter, relative engagement with iffy content was also down, albeit barely so on Facebook. On Jan. 5, the engagement share of iffy content on Twitter was 24.3 percent, but by Jan. 13 it was down to 9.5 percent. On Facebook, the engagement share was 16.9 percent on Jan. 5 and 16.8 percent on Jan. 13.

“What this means is that over this eight-day period on Twitter, the URLs that were most engaged with were less and less often from iffy sites. In other words, there was more robust engagement with iffy sites’ URLs on Twitter before they announced that they were taking some specific actions,” says James Park, assistant director of the center. “Naturally these things fluctuate, but it’s noteworthy to clearly see this sort of result on Twitter after they’ve taken some direct action.”
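
The engagement share weights each URL by how much engagement it received, rather than counting URLs equally, so a few heavily shared iffy stories can dominate it. A companion sketch under the same illustrative assumptions (again, not the center’s actual code) might look like this:

    from urllib.parse import urlparse

    def site_of(url):
        """Return the host so a URL can be matched against site ratings."""
        host = urlparse(url).netloc.lower()
        return host[4:] if host.startswith("www.") else host

    def engagement_share(engagement_by_url, iffy_sites):
        """Share of total engagement (shares, likes, comments) going to URLs from iffy sites."""
        total = sum(engagement_by_url.values())
        if total == 0:
            return 0.0
        iffy = sum(n for url, n in engagement_by_url.items() if site_of(url) in iffy_sites)
        return iffy / total

    # Example: 900 of 3,000 total engagements go to an iffy-site URL -> share of 0.3 (30 percent)
    print(engagement_share(
        {"https://iffy-example.com/a": 900, "https://reliable-example.org/b": 2100},
        {"iffy-example.com"},
    ))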

“One particular value we believe the Iffy Quotient has — illustrated by recent events — is to help assess whether there are measurable effects that follow the platforms making announcements or taking actions,” Resnick says.

Since it was launched in 2018, the Iffy Quotient has measured content around elections, the COVID-19 pandemic, and incidents of racism, protests, and riots.

A formative moment

Cliff Lampe, professor of information, studies the social and technical structures of large-scale technology-mediated communication, working with sites like Facebook, Wikipedia, Slashdot, and Everything2. He has also been involved in the creation of multiple social media and online community projects. His current work looks at how the design of social media platforms encourages moderation, misinformation, and social development.

“This is a formative moment for social media companies,” he said. “They have the obligation and right to police their platforms for the type of content they want to host. Still, many people feel a lack of agency, since the power of the platform can feel overwhelming to the individual and group. How social media platforms navigate this over the next few months could define the industry for a decade.”


[Read the full Q&A with Lampe.]

‘Deplatforming Parler was a good start’

Libby Hemphill, associate professor of information and associate director of the Center for Social Media Responsibility, is an expert on political communication through social media, as well as civic engagement, digital curation, and data stewardship.

“Finally deplatforming Trump was a big move for social media platforms,” she says. “Coupled with other actions like shuttering QAnon groups and propaganda accounts ahead of the elections in Uganda, I hope that we’re seeing platforms step up to meet their public obligations. However, I don’t expect them to continue holding folks accountable unless extremists and disinformation campaigns stay bad for business.

“We should definitely consider whether three companies ought to have this much power over our communication networks, but Apple, Google, and Amazon finally flexed their market muscles. They could do more to root out apps and customers who violate their terms, but deplatforming Parler was a good start.”

Considering the broader social and cultural context

Sarita Schoenebeck, associate professor of information, specializes in research on social computing, social media, and human-computer interaction. Several of her studies have focused on online harassment.

“For years, platforms have evaluated what kinds of content are appropriate or not by evaluating the content in isolation, without considering the broader social and cultural context it takes place in,” she says. “This means harmful content persists on the site, and content that should be acceptable may be removed. We need to revisit this approach. We should rely on a combination of democratic principles, community governance, and platform rules to shape behavior.

“We also should center commitments to equity and justice in how platforms regulate behavior. Allowing some people to engage in hate speech and violence simply means that others can no longer participate safely or equitably, and that is not the kind of society — whether online or offline — that we should aspire to.”

The potential for regulation

Josh Pasek is an associate professor of communication & media and political science, faculty associate in the Center for Political Studies, and core faculty for the Michigan Institute for Data Science at U-M. His current research explores how both accurate and inaccurate information might influence public opinion and voter decision-making and evaluates whether the use of online social networking sites such as Facebook and Twitter might be changing the political information environment.

“Most critically, tech companies are afraid of the potential for regulation,” he says. “When they were accused of spreading misinformation, they reacted by providing minimal fact-checking on claims that might mitigate that criticism. When they were being criticized for supposedly stifling voices on the political right, they bent over backward to ensure their policies would not have a disproportionate political impact even though the violations of those policies were far from politically neutral. And when they read the tea leaves, it had become clear that the most likely source of future regulation was from a Democratic Congress that would be more worried about dangerous information and incitement than ensuring that even the extremes of the political spectrum had a platform.

“Social media companies really don’t want to play a police role, but they are far more worried about regulation than about playing that role. That said, it may be a good thing, because it is really not clear that there are any other actors that we would trust more to do it.”

 

Comments

  1. Diane Corba - 1982 BS Pharmacy

    Ms. Hemphill claims de-platforming Parler was a good start? On what basis, for what reason did they block Parler other than to block their competitor? Has Ms. Hemphill ever been on Parler? I was and enjoyed following conservative and reasonable commentators/politicians without the concern that those in charge of Facebook or Twitter were blocking content. Censorship is wrong and un-American, and taking Parler down was censorship. They have looked at which platforms were involved in planning the horrific attack on the Capitol, and it was not Parler leading that plan; Facebook and Twitter were involved. I have removed most every email thread I get from the University of Michigan, although I am a proud graduate. Where I object is how far left the University I WAS proud to attend has gone. Where is freedom of thought? Where is curiosity of opinions? Where is Free Speech? Where is balanced journalism? This all truly saddens me as a patriotic American.

    Reply

    • Vincent Tiseo - 1988

      I agree with Ms. Corba! U of M should be championing free speech, not cheering censorship. What will the professors say when liberal voices are silenced? Will they then complain that the social media companies are a “threat to democracy?” You cannot have it both ways. This nation has a 1st Amendment for a reason; let’s listen to both sides so that we can find some common ground.
      If you are going to silence hate speech (however that is vaguely defined), then it has to be equal. Antifa and others used, and still use, many social platforms to advance their chaos. Why is that not an issue? Why are those organizations not silenced?

      Reply

      • Stephen Charles - 1979

        What is scarier still is that the UofM has become a cheerleader for Soviet style censorship. It is frightening to see that UofM has become a repository advocating the memes of national socialists.

        Reply

        • Julian Stienon - 1962

          For Stephen Charles, I agree w/ your comment. Not sure if it indicates thin ice or open water. Both are dangerous. I do not need memes of “national socialists”. Translate “national socialists” to German, take the first two letters of each German word and it spells “NAZI”. That’s why. I lived through the days of the 3rd Reich and have no desire to revisit any form of its rebirth, “neo” or not! Thanks!

          Reply

      • Carl Stein - 1982

        Yes, the nation has a First Amendment for a reason. Reading the initial words of the amendment, “Congress shall make no law,” tells us the reason for the First Amendment: to restrict the power of Congress. The First Amendment restricts the power of Congress to censor, “Congress shall make no law … abridging the freedom of speech, or of the press.” The First Amendment does not restrict the power of private entities, such as Facebook and other social-media companies. Please learn about the First Amendment before you write about the First Amendment.
        In addition, Antifa is not an organization. Antifa is short for anti-fascist. You can be anti-fascist or pro-fascist, Antifa or Fascist; that is your choice. Antifa is an idea. You ask why is Antifa not silenced. Who do you propose shall silence an idea? Please remember that the First Amendment restricts the federal government from silencing an idea. In addition, the Fourteenth Amendment extends the First Amendment restrictions on governmental power to the state governments.

        Reply

    • Val Edwards - 1971 AMLS

      Well said, Diane C. You are not alone.
      As a librarian who has had to fight against censorship at various levels, I believe we have a freedom to read and decide for ourselves. Until now? Or are we living in another McCarthy era? Journalists have fought for freedom of speech, but now they seemingly censor it.

      Reply

    • Gerald Matthews - 1969

      I agree with Diane Corba completely. Free speech seems to be a thing of the past. I hope that donors begin to speak with their pocketbooks in order to be heard.

      Reply

    • Keith Mieczkowski - 2000

      I completely agree with Diane Corba’s sentiments. What has happened to the University of Michigan we knew?

      Reply

  2. Kenneth Gourlay - 2000 BS Education

    Pasek suggests that corporations policing their own platforms may be the lesser of two evils since “it is really not clear” that any other entity has public trust. Obviously, there is a trust problem with the way our governments police us, but for us to give up on their responsibility and turn it over to a private corporation still seems like a terrible idea. Corporations like Facebook and Twitter, driven by a profit motive, will never be the ideal source of policing. We need to fix our government, not hope for benign governance from a corporation.

    That said, unpopular or harmful political opinions are not and should not be illegal. The intentional publication of libel or hate speech is one thing, but we need to learn how to effectively respond to misinformation and political unrest. Whether that is through individual critical thinking skills, an authoritative fact-checking organization, or something else is an open question.

    Reply

  3. Stephen Charles

    “We also should center commitments to equity and justice in how platforms regulate behavior. Allowing some people to engage in hate speech and violence simply means that others can no longer participate safely or equitably, and that is not the kind of society — whether online or offline — that we should aspire to.”

    Wow. This anti-First Amendment stance is more at home in a Soviet Gulag than at an American university. And this comes from a “professor”? Who is she to determine what is “hate speech”? I find her call for censorship “hate speech” and think she should be banned. So there!

    Reply

  4. Connie Boyd - 1967

    What’s truly frightening is that a major university views censorship as a best choice. President Trump was banned while ISIS continues recruitment. Who should be the judge? Societal values in a free society? The “Iffy patrol”? Really? Teach correct principles and society will govern itself.

    Reply

  5. Lynn Broniak-Hull - 1976

    I agree that social media companies should not censor nonviolent differences in political opinion or stories and reports of any form. It is frightening that any US citizen would think that is acceptable.

    Reply

  6. Barry Zajac - 1989

    I don’t understand why people seem to think that freedom of speech somehow equates to the right to a platform or an audience, particularly one that they do not pay for (if you’re not paying for a service, you are the product). It makes perfect sense to me that there is a limit to how much hostility and disinformation a platform can tolerate before they start losing eyeballs (and clicks) and social media quickly turns into antisocial media. They also want to avoid having the government become their “curator,” which could easily lead to a more concerning censorship.

    Reply

  7. Gary Barber - 1968

    As a once-proud graduate of U of M, I have looked on its relentless leftward drift with alarm for years. Lately, the banning of certain words by the IT department and the praise by the people interviewed for this article for suppression of speech they disagree with have raised that concern even higher. When will the people who value the hard-won freedoms of thought and expression start pushing back against these leftist/Marxist authoritarians who want to dominate and control every aspect of social and political life?
    A no longer proud wolverine who will never donate to U of M again.

    Reply

  8. R B - 1984

    There is no value in free hate speech. On the contrary, hate speech occupies more oxygen and employs fewer brain cells than any level of the political discourse and criticisms that the 1st Amendment was designed to protect.
    The comments here about imaginary “leftists” are misguided in that the voices of hate speech and disinformation are not being silenced or discriminated against. On the contrary, they have enjoyed disproportionate amplification and influence relative to their numbers.
    And always, Go Blue.

    Reply
