Kate Starbird – UW News /news

UW Center for an Informed Public co-authors report on mis- and disinformation surrounding the 2020 U.S. election /news/2021/03/02/uw-center-for-an-informed-public-co-authors-report-on-mis-and-disinformation-surrounding-the-2020-u-s-election/ Wed, 03 Mar 2021 00:20:22 +0000

Illustration of a person putting a ballot in a ballot box, with the text “The Long Fuse: Misinformation and the 2020 Election”

The Election Integrity Partnership (EIP), a nonpartisan coalition of research institutions, including the University of Washington, that identified, tracked and responded to voting-related mis- and disinformation during the 2020 U.S. elections, released its final report, “The Long Fuse: Misinformation and the 2020 Election,” on Tuesday, March 2. The report is the culmination of months of collaboration among approximately 120 people working across four organizations: the Stanford Internet Observatory, the UW Center for an Informed Public, Graphika and the Atlantic Council’s Digital Forensic Research Lab.

A handful of EIP researchers, including the UW’s Kate Starbird, associate professor of human centered design and engineering, will discuss key findings, insights and recommendations from the final report at an event hosted by the Atlantic Council from noon to 1:30 p.m. PST on Wednesday, March 3. The event is free and open to the public.

The EIP’s “Long Fuse” final report expands upon the coalition’s rapid-response research and policy analysis surrounding the November 2020 U.S. election and will detail how misleading narratives and false claims about voting coalesced into the metanarrative of a “stolen election,” which propelled the Jan. 6 insurrection at the U.S. Capitol.

The EIP’s final report will also include a set of policy recommendations, share insights about how the coalition of researchers carried out their work, and explain how this model may be expanded to combat future large-scale misinformation events.

Among the key findings:

  • Misleading and false claims and narratives coalesced into the metanarrative of a “stolen election,” which later propelled the Jan. 6 insurrection
  • Narrative spread was cross-platform: Repeat spreaders leveraged the specific features of each platform for maximum amplification
  • The primary repeat spreaders of false and misleading narratives were verified, blue-check accounts belonging to partisan media outlets, social media influencers, and political figures, including President Trump and his family
  • Many platforms expanded their election-related fact-checking and moderation policies during the 2020 election cycle, but application of moderation policies was inconsistent or unclear

The 2020 federal election demonstrated that actors — both foreign and domestic — remain committed to weaponizing viral false and misleading narratives to undermine confidence in the U.S. electoral system and erode Americans’ faith in our democracy, according to the report. Mis- and disinformation were pervasive throughout the campaign, the election, and its aftermath, spreading across all social platforms, the report found. The EIP was formed out of a recognition that the vulnerabilities in the current information environment require urgent collective action.

For more information, contact CIP Communications Manager Michael Grass at megrass@uw.edu.

University of Washington to create UW Center for an Informed Public with $5 million investment from Knight Foundation /news/2019/07/22/university-of-washington-to-create-uw-center-for-an-informed-public-with-5-million-investment-from-knight-foundation/ Mon, 22 Jul 2019 15:30:44 +0000
Jevin West teaches a class in “Calling BS.” Photo: Quinn Russell Brown/University of Washington

The University of Washington today announced a $5 million investment from the John S. and James L. Knight Foundation to create the UW Center for an Informed Public, led by an interdisciplinary group whose mission is to resist strategic misinformation, promote an informed society, and strengthen democratic discourse. The Center is also funded by a $600,000 award from the William and Flora Hewlett Foundation.

The Center brings together existing areas of excellence at the UW and builds upon the university’s ability to better understand how and why fake news, misinformation and disinformation are created. The Center will combat what researchers call the “misinformation epidemic.”

“We really see the Center as a university-wide effort,” said Jevin West, principal investigator and inaugural director of the Center. “Misinformation touches everything.”

Kate Starbird teaches a class in misinformation. Photo: Mark Stone/University of Washington

The UW Center is one of five institutions receiving major investments from the Knight Foundation nationally and is the only recipient in the Western United States.

“A functioning democracy is an informed democracy,” said Sam Gill, Knight Foundation vice president for communities and impact. “The UW is bringing together leading scholars in computer science, sociology and law to equip our democracy with the right tools to navigate the digital age.”

Knight Foundation’s support of the UW is part of a $10 million effort announced in 2018 to examine and combat digital disinformation’s impact on U.S. democracy and elections.

Recent decades have seen a profound shift in the ways people, groups, and organizations produce and consume information and participate in public discourse. While many positive advancements have emerged from new technologies and platforms, the new information environments also have opened the door to misinformation, disinformation and fake news.

“It’s one of the most important problems of our time that we as a society need to solve,” West said. “This is not a left or right issue. This is an issue that transcends political boundaries. Everyone wants to get this right.”

The principal investigators at the Center are a who’s who in this field of research, widely recognized for their respective expertise. In addition to West, co-director of DataLab, who is known for his Information School class “Calling B.S.: Data Reasoning in a Digital World,” there are four researchers who will lead various initiatives for the Center:

  • Emma Spiro, co-director of the Social Media Lab, Information School;
  • Chris Coward, director of the Technology & Social Change Group (TASCHA), Information School;
  • Ryan Calo, co-director of the Tech Policy Lab, School of Law; and,
  • Kate Starbird, director of the Emerging Capacities of Mass Participation Lab (emCOMP), Human Centered Design & Engineering.

The Center will be devoted to educational efforts, research, policy and community outreach around misinformation and disinformation campaigns. Additionally, researchers will establish a network of Community Labs in public libraries and other institutions to co-create and assess research-based interventions.  

Housed within the Information School, the Center for an Informed Public is scheduled to officially open in fall 2019.

###

About the John S. and James L. Knight Foundation

Knight Foundation is a national foundation with strong local roots. We invest in journalism, in the arts, and in the success of cities where brothers John S. and James L. Knight once published newspapers. Our goal is to foster informed and engaged communities, which we believe are essential for a healthy democracy. For more, visit the foundation’s website.

About the UW

The University of Washington was founded in 1861 and is one of the pre-eminent public higher education and research institutions in the world. The UW has more than 100 members of the National Academies, elite programs in many fields, and annual standing since 1974 among the top five universities in receipt of federal research funding. Learn more at uw.edu.

 

Information School to welcome high school students March 19 for ‘MisInfo Day’ – from ‘Calling BS’ faculty duo /news/2019/03/18/information-school-to-welcome-high-school-students-march-19-for-misinfo-day-from-calling-bs-faculty-duo/ Mon, 18 Mar 2019 21:38:56 +0000

What is misinformation, and how — and why — does it spread? The University of Washington is taking a leading role in helping people better navigate this era of increasing online fakery and falsehood.

On March 19, the iSchool will welcome more than 200 Seattle-area high school students for “MisInfo Day,” a daylong workshop on navigating the misinformation landscape from Jevin West and Carl Bergstrom, the faculty duo who created the “Calling BS in the Age of Big Data” class.

“MisInfo Day” will be from 9:30 a.m. to 2:30 p.m. in the Husky Union Building’s North Ballroom.

West is an iSchool assistant professor, Bergstrom a professor of biology. Their most recent creation is Which Face Is Real?, a website that helps users learn to tell real from fake images online.

The students — many of whom are studying government — will come from Nathan Hale, Franklin, Bellevue and Toledo high schools. Discussions will include defining misinformation and why we find it so compelling, as well as “tips and tricks” for determining whether news reports and social media posts are legitimate.

The afternoon session will be an “Ask the Experts” panel, where the students will hear from professionals from the Seattle Public Library, Snopes.com and the UW about their work. The students are asked to “come with questions about misinformation, fact-checking, confirmation bias and more.”

Other faculty and staff involved are:

  • , iSchool assistant professor
  • , 91Ě˝»¨librarian who manages the Information Science collection
  • , assistant professor in the Department
  • , assistant professor in the
  • Liz Crouse, one of several students involved from the iSchool’s Master of Library and Information Science (MLIS) program, who assisted West in coordinating the event and will conduct pre- and post-program surveys of students for an ongoing research project. Other MLIS students will lead breakout sessions during the event.

Bergstrom and West’s “Calling BS” work has drawn wide attention from the press as well as from other institutions, some of which have already expressed interest in holding events modeled on “MisInfo Day.”

###

For more information, contact Maggie Foote, iSchool communications director, at 206-250-5992 or m2foote@uw.edu.

‘Trump in the World’: Jackson School faculty give public talks through spring quarter /news/2018/03/08/trump-in-the-world-jackson-school-faculty-give-public-talks-through-spring-quarter/ Thu, 08 Mar 2018 18:11:07 +0000

The presidency of Donald Trump continues to have significant impacts on international affairs, global alliances and the role of the United States in the world.

Faculty at the UW’s Jackson School of International Studies and Department of Human Centered Design & Engineering will explore these issues in a series of public lectures and discussions through spring quarter.

The series, “Trump in the World,” will be moderated by Reşat Kasaba, professor and director of the Jackson School.

The lectures will be held Tuesdays from 4:30 to 6 p.m. in Room 220 of Kane Hall, starting March 27, and all are open to the public. For students, the series is a 2-credit lecture class.

The lectures are as follows:

March 27: Japan.
April 3: Two Koreas.
April 10: Indo-Pacific strategy challenges.
April 17: Migration.
April 24: Global energy challenges.
May 1: Online disinformation, with Kate Starbird.
May 8: Israel/Palestine.
May 15: The European Union.
May 22: Putin and Russia.
May 29: The Kurds, and a general discussion with Kasaba.

All the speakers are faculty members in the Jackson School except Starbird, who is a professor of human centered design and engineering.

###

For more information about the series, call 206-543-6001 or write to jsisadv@uw.edu.

The Twittersphere does listen to the voice of reason — sometimes /news/2016/04/04/the-twittersphere-does-listen-to-the-voice-of-reason-sometimes/ Mon, 04 Apr 2016 16:53:33 +0000

In the maelstrom of information, opinion and conjecture that is Twitter, the voice of truth and reason does occasionally prevail.

University of Washington researchers have found that tweets from “official accounts” — the government agencies, emergency responders, media or companies at the center of a fast-moving story — can slow the spread of rumors on Twitter and correct misinformation that’s taken on a life of its own.

This tweet from WestJet’s official account quelled online rumors that one of its planes had been hijacked. Photo: @WestJet, Twitter

The researchers documented the spread of two online rumors that initially spiked on Twitter — alleged police raids in a Muslim neighborhood during a hostage situation in Sydney, Australia, and the rumored hijacking of a WestJet flight to Mexico — that were successfully quashed by denials from official accounts.

The research team from the Emerging Capacities of Mass Participation (emCOMP) Lab in the UW Department of Human Centered Design & Engineering and the Information School presented their findings in a paper at the Association for Computing Machinery’s Conference on Computer-Supported Cooperative Work and Social Computing in March.

“A lot of emergency managers are afraid that the voice of the many drowns out the official sources on Twitter, and that even if they are part of the conversation, no one is going to hear them,” said co-author Elodie Fichet, a UW doctoral candidate in the Department of Communication. “We disproved that and showed that official sources, at least in the cases we looked at, do have a critical impact.”

The case studies also offer lessons for organizations that may have plans in place to deal with an actual crisis, but haven’t considered how to handle online rumors and communicate before they have complete information or know what is true.

“Oftentimes in a crisis, the person operating a social media account is not the person who makes operational decisions or who even decides what should be said,” said senior author and emCOMP Lab director Kate Starbird, a UW assistant professor of human-centered design and engineering.

“But that person still needs to be empowered to take action in the moment because if you wait 20 minutes, it may be a very different kind of crisis than if you can stamp out misinformation early on,” she said.

The 91Ě˝»¨researchers found that the vast majority of the tweets both affirming and denying the two rumors were retweets of a small number of Twitter accounts, demonstrating that a single account can significantly influence how information spreads. Much of the online rumoring behavior was driven by “breaking news” accounts that offer the veneer of officialdom but don’t necessarily follow standard journalistic practices of confirming information.

“Avoiding social media channels because you don’t want to be confronted with misinformation is a real danger for an organization. You’re essentially opening up a space for information to be spreading without your voice being a part of it.” – Kate Starbird, UW assistant professor of human-centered design and engineering

The first rumor was one of many that spread during the “Sydney Siege” of December 2014, in which a gunman took 18 hostages at a chocolate cafĂ© in Australia. A radio talk show host reported that federal police were raiding homes in the largely Muslim Lakemba neighborhood when, in fact, officers were on a previously scheduled tour of a local mosque.

Over a period of several hours, Twitter users posted 1,279 tweets related to the rumor. Of those, 38 percent affirmed the rumor, and 57 percent eventually denied it.

Nearly all of the affirmations happened in the first hour and 20 minutes, before police responded to the rumor, and the bulk of these stemmed from just five Twitter accounts that were widely retweeted.

Once the Australian Federal Police issued a single tweet — “@AFPMmedia: Reports that the APF is conducting search warrants in the Sydney suburb of Lakemba are incorrect” — the tweet volume related to the rumor increased to one per second. Ninety percent were retweets of the single police account source, and all were denials. Affirmations of the rumor never resurfaced in a significant way.
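The before-and-after dynamic described here is simple to quantify once tweets have been hand-coded. Below is a minimal Python sketch, assuming an invented record shape (the `time` and `code` fields are illustrative, not the researchers' actual data schema):

```python
from datetime import datetime

def rumor_shares(tweets, correction_time):
    """Split coded rumor tweets at the time of an official correction and
    return the (affirm, deny) share of tweets in each half."""
    def shares(subset):
        if not subset:
            return (0.0, 0.0)
        n = len(subset)
        return (sum(t["code"] == "affirm" for t in subset) / n,
                sum(t["code"] == "deny" for t in subset) / n)

    before = [t for t in tweets if t["time"] < correction_time]
    after = [t for t in tweets if t["time"] >= correction_time]
    return shares(before), shares(after)
```

Applied to the Sydney data, the first pair would be dominated by affirmations and the second almost entirely denials.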

The second rumor the team tracked was a possible hijacking of a WestJet flight from Vancouver, British Columbia, to Mexico in January 2015, which generated more than 27,000 related tweets. It surfaced on Twitter after flight-tracking websites picked up what they believed was a “hijacked” code coming from the plane, which was likely caused by an instrument error on the ground.

Because it was a Saturday afternoon, no WestJet communications employee was officially on duty. But one member of the company’s social media team caught the rumor from home about 20 minutes after it surfaced.

For the next 10 minutes, a growing crowd of users from “breaking news” accounts, aviation enthusiasts and others began tweeting about the signal code and a possible hijacking. While WestJet was close to certain that the signal was an error, company officials did not yet know for sure, because the plane was in final descent and direct communication was not allowed due to security protocol. As a WestJet employee explained in a later interview with the research team:

“The biggest question for us was: ‘Do we respond now with almost-confirmed information, or do we wait five minutes to get confirmed info?’ We chose, ‘Let’s get it out now,’ and then five minutes later confirmed.” The two WestJet denial tweets corresponded with a rapid drop in online chatter, and everything was back to normal within a couple of hours.

The volume of tweets denying the WestJet hijacking rumor ultimately surpassed those that affirmed it (left). Many of those were retweets of just a handful of accounts (right). Image: University of Washington

After that experience, WestJet decided to expand its inventory of precrafted tweet templates that do not require managerial approval and would be tweeted according to a specific protocol depending on how the issue is trending. This allows social media managers to respond to a fast-moving story and issue some type of official statement — even if complete information is lacking — before a situation escalates.
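A protocol like the one WestJet describes can be sketched as a simple trend-threshold lookup. The thresholds and template wording below are invented for illustration; the article does not publish the company's actual playbook.

```python
# Pre-approved templates keyed by a mentions-per-hour threshold (hypothetical).
TEMPLATES = [
    (100, "We are aware of reports about {topic} and are looking into them. Updates to follow."),
    (1000, "We are actively investigating {topic} and will share confirmed information as soon as we have it."),
]

def pick_template(mentions_per_hour, topic):
    """Return the strongest pre-approved template the current trend level
    triggers, or None if the issue is not trending enough to warrant a post."""
    chosen = None
    for threshold, text in TEMPLATES:
        if mentions_per_hour >= threshold:
            chosen = text
    return chosen.format(topic=topic) if chosen else None
```

The point of the design is that the social media manager never has to wait for wording approval: escalation is decided by the trend level, not by a meeting.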

In today’s information economy, it’s important for emergency response agencies and other organizations to invest in personnel and maintain an engaged social media presence before a crisis hits, Starbird said. And these two examples of online rumoring behavior demonstrate how that investment can pay off.

“Being online is really important, even if you don’t want to be,” Starbird said. “Avoiding social media channels because you don’t want to be confronted with misinformation is a real danger for an organization. You’re essentially opening up a space for information to be spreading without your voice being a part of it.”

The research was funded by the National Science Foundation.

Co-authors are former UW Master of Digital Communication and Media/Multimedia student Cynthia A. Andrews, UW human-centered design and engineering undergraduate student Yuwei Ding and UW Information School assistant professor Emma Spiro.

For more information, contact Starbird at kstarbi@uw.edu and Fichet at 949-878-6049 or efichet@uw.edu.

Hold that RT: Much misinformation tweeted after 2013 Boston Marathon bombing /news/2014/03/17/hold-that-rt-much-misinformation-tweeted-after-2013-boston-marathon-bombing/ Mon, 17 Mar 2014 15:34:04 +0000

It takes only a fraction of a second to hit the retweet button on Twitter. But if thousands of people all retweet at once, a piece of information 140 characters long can go viral almost instantly in today’s Internet landscape.

If that information is incorrect, especially in a crisis, it’s hard for the social media community to gain control and push out accurate information, new research shows.

This network graph shows relationships among the 100 most prevalent hashtags used on Twitter after the Boston Marathon bombing. The connecting lines represent hashtags that appeared in the same tweet. #boston was dropped from the graph because it connected with every other tag. Image: University of Washington
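The graph described in this caption is a hashtag co-occurrence network, which can be built from raw tweet text with a few lines of standard-library Python. The sketch below uses invented sample tweets; a full pipeline would also restrict nodes to the 100 most prevalent hashtags.

```python
from collections import Counter
from itertools import combinations

def hashtag_cooccurrence(tweets, drop=("#boston",)):
    """Count pairs of hashtags that appear together in the same tweet,
    skipping over-connected tags (like #boston) that would link to everything."""
    edges = Counter()
    for text in tweets:
        # Deduplicate tags within a tweet, drop over-connected ones, and
        # sort so each pair has a canonical order.
        tags = sorted({w.lower() for w in text.split() if w.startswith("#")} - set(drop))
        for pair in combinations(tags, 2):
            edges[pair] += 1
    return edges
```

The resulting edge counts are exactly what a graph-drawing library needs: each key is an edge, each count a weight.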

University of Washington researchers have found that misinformation spread widely on Twitter after the 2013 Boston Marathon bombing, despite efforts by users to correct inaccurate rumors. The researchers presented their findings at iConference 2014 in Berlin March 4-7, where they received a top award for their paper.

On April 15, 2013, two explosions occurred near the finish line of the Boston Marathon, killing three people. Three days later, the FBI released photos and video surveillance of two suspects, enlisting the public’s help in identifying them. Massive speculation broke out on mainstream media and social media sites, particularly Twitter. After a shooting on the Massachusetts Institute of Technology campus and a manhunt, one of the suspects was shot dead and the other arrested the evening of April 19.

The entire time, a flurry of tweets was published on Twitter using hashtags such as #boston, #prayforboston, #mit and #manhunt. A number of incorrect rumors surfaced that spread rapidly before corrections started appearing. And when they did, corrective tweets were minimal when compared with the volume of tweets that spread incorrect information.

“We could see very clearly the negative impacts of misinformation in this event,” said Kate Starbird, a UW assistant professor in the Department of Human Centered Design & Engineering. “Every crisis event is very different in so many ways, but I imagine some of the dynamics we’re seeing around misinformation and organization of information apply to many different contexts. A crisis like this allows us a chance to see it all happen very quickly, with heightened emotions.”

Starbird, whose research looks at the use of social media in crisis events, began recording the stream of tweets about 20 minutes after the finish-line bombing. Her team, with the help of collaborator Robert Mason in the UW’s Information School, later got the complete dataset – 20 million tweets – to fill in gaps when the sheer volume of tweets coming in was too great to capture in real time.

Researchers from the UW and Northwest University in Kirkland, Wash., analyzed the text, timestamps, hashtags and metadata in 10.6 million tweets to first identify rumors, then code tweets related to the rumors as “misinformation,” “correction” or “other.”

For example, they analyzed the rumor that an 8-year-old girl had died in the bombings. The researchers first identified tweets containing the words “girl” and “running,” then whittled that down to roughly 92,700 that were related to the rumor. They then found that about 90,700 of these tweets were spreading misinformation, while only about 2,000 were corrections. While the Twitter community offered corrections within the same hour the rumor appeared, the misinformation still persisted long after correction tweets had faded away.
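The two-step procedure described above, keyword filtering followed by tallying hand-applied codes, can be sketched in a few lines of Python. The field names and sample tweets below are invented for illustration, not taken from the study's dataset.

```python
from collections import Counter

def filter_and_tally(tweets, keywords=("girl", "running")):
    """Keep tweets whose text contains every rumor keyword, then tally the
    hand-applied codes ("misinformation", "correction", "other")."""
    related = [t for t in tweets if all(k in t["text"].lower() for k in keywords)]
    return related, Counter(t["code"] for t in related)
```

On the real data, this tally is where the roughly 90,700-to-2,000 imbalance between misinformation and corrections becomes visible.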

“An individual tweet by itself is kind of interesting and can tell you some fascinating things about what was happening, but it becomes really interesting when you understand the larger context of many tweets and can look at patterns over time,” said Jim Maddock, a UW undergraduate student in Human Centered Design & Engineering and history who did most of the computational data analysis for this project.

A previous study analyzing the spread of misinformation on Twitter during the 2010 Chile earthquake found that Twitter users actually crowd-corrected the rumors before they gained traction. But the earlier study excluded all retweets, which the 91Ě˝»¨team found to be a significant portion of the tweets spreading misinformation.

The UW researchers hope to develop a tool that could let users know when a particular tweet is being questioned as untrue by another tweet. The real-time tool wouldn’t try to glean whether a tweet is true or untrue, but it would track instances where a tweet is contested by another.

“We can’t objectively say a tweet is true or untrue, but we can say, ‘This tweet is being challenged somewhere, why don’t you research it and then you can hit the retweet button if you still think it’s true,'” Maddock said. “It wouldn’t necessarily affect that initial spike of misinformation, but ideally it would get rid of the persisting quality that misinformation seems to have where it keeps going after people try to correct it.”
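The proposed tool can be sketched as a predicate that checks whether any reply challenges a tweet, without judging truth. The cue-word list and data shape below are hypothetical, chosen only to make the idea concrete.

```python
# Hypothetical cue words that suggest a reply is challenging a claim.
CUE_WORDS = {"false", "fake", "incorrect", "debunked", "not true"}

def is_contested(tweet_id, replies):
    """Return True if any reply to tweet_id contains a correction cue.
    Makes no judgment about whether the original tweet is actually true."""
    return any(
        r["in_reply_to"] == tweet_id
        and any(cue in r["text"].lower() for cue in CUE_WORDS)
        for r in replies
    )
```

A client could surface this flag next to the retweet button, matching Maddock's "research it before you retweet" framing.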

The team currently is looking at the relationship between various website links within tweets and the quality of information spread during the Boston crisis. They also are conducting interviews with people who were close to the scene in 2013 to see what effect proximity had on information sharing.

Paper co-authors are Robert Mason and Mania Orand of the UW and Peg Achterman of Northwest University.

The research was funded by the National Science Foundation.

###

For more information, contact Starbird at kstarbi@uw.edu and Maddock at maddock@uw.edu.

Grant numbers: NSF 1342252, 1243170.
