Carl Bergstrom – UW News /news Wed, 25 Feb 2026 00:31:34 +0000 Q&A: Researchers discuss potential solutions for the feedback loop affecting scientific publishing /news/2026/02/24/researchers-discuss-potential-solutions-for-the-feedback-loop-affecting-scientific-publishing/ Tue, 24 Feb 2026 20:30:01 +0000 /news/?p=90702
Scientific publishing relies on the work of unpaid peers to assess the validity of the science in manuscripts. But this process has reached a critical point where there are too many manuscript submissions and not enough peer reviewers. Photo: iStock

Scientists share their work by publishing articles in journals, such as Nature, Science or PLOS Biology. One major part of the publishing process involves having these manuscripts reviewed by unpaid peers. These scientists specialize in the same topic and volunteer to make sure the science is sound and the authors haven’t missed anything critical in their data analysis.

The peer review process has reached a critical point where there are too many manuscript submissions and not enough peer reviewers. Carl Bergstrom, UW professor of biology, and Kevin Gross, North Carolina State University professor of statistics, used mathematical modeling to demonstrate this crisis in the form of a self-perpetuating cycle. The team describes this cycle and potential interventions in PLOS Biology.

UW News reached out to Bergstrom and Gross to learn more about this cycle and how the potential interventions could mitigate this crisis.

“Social trust in science can wax and wane, and even a little slippage has real consequences for scientists, their livelihoods and society as a whole.”

Carl Bergstrom and Kevin Gross

Why is the process of peer review important for science?

Carl Bergstrom Photo: Carl Bergstrom

Carl Bergstrom and Kevin Gross: Peer review helps scientific literature maintain its credibility. The system of peer review guarantees that published research has been scrutinized by experts in the relevant field. While peer review is not, and never has been, a watertight seal of approval – peer reviewers are human, too! – it has proven to be a system that, by and large, helps ensure the reliability of the scientific literature.

What is happening to create and perpetuate this cycle you describe in your paper?

CB and KG: The basic insight that drives our paper is that when peer review functions effectively, it helps journals select the science most worthy of their readers' attention and creates a strong motivation for scientists to be selective about where they submit their work. After all, a scientist gains little by having their paper rejected by a top journal. So high-quality reviewing encourages scientists to choose where they submit their work carefully, and to submit only their very best work to the most prestigious outlets. Thus, effective peer review reinforces itself through a virtuous cycle.

Kevin Gross Photo: Kevin Gross

The cycle can spin in the other direction too. If peer reviewers have to dilute their efforts over a larger volume of submitted manuscripts, then each manuscript may receive less scrutiny and editors' decisions consequently become less predictable. This encourages authors to try their luck at journals that might otherwise have been a stretch, increasing the volume of manuscripts that need to be reviewed even further and making editorial decisions even less predictable, and so on.

Why are we seeing this crisis now?

CB and KG: To be fair, scientists have been bemoaning the fragile state of peer review for decades. So we are far from the first to observe that using the goodwill of volunteers as a lynchpin of the scientific enterprise may not be a robust model.

But there is reason to believe that the situation is more dire now. There isn't one single cause driving this more recent turn – many factors contribute. For example, over the past few decades, scientific communities have become larger and looser knit, and the willingness to volunteer tends to decline as groups become more diffuse.

Large commercial publishers have also discovered that scientific publishing can be a lucrative business – especially when they can dip into a tradition of free peer-review labor. Drawn by the sizable profits they could make, these publishers have launched countless new journals, crowding the journal landscape. Scientists, in turn, now have more options for what to do with a paper that has been rejected once or numerous times. There's always another journal to send it to. And each time a paper is resubmitted, a new set of peer reviewers must be found.

The pandemic also shocked the system by compelling many researchers to reassess their time commitments. It seems that we have collectively yet to fully rebound to pre-pandemic levels of willingness to review.

Should people be concerned about the science described in current peer-reviewed papers?

CB and KG: Well, to back up a bit, the primary responsibility for the integrity and accuracy of the scientific literature rests squarely with the authors, as it always has. And, thankfully, most authors have strong reputational incentives to make sure that their work is solid and will stand the test of time. But authors have their blind spots.

Peer review isn't going to suddenly collapse and take the literature down with it, but as the system becomes stressed, we might start to see a few more cracks emerge. While that isn't catastrophic, it isn't good for science, either. Social trust in science can wax and wane, and even a little slippage has real consequences for scientists, their livelihoods and society as a whole.

What about this crisis concerns you?

CB and KG: Perhaps our biggest concern is that journal editors who become frustrated with the inability to find willing peer reviewers will turn to AI for machine review instead. There may be ways in which machine review could complement human peer review, but we think it's important that human review continues to be the engine of editorial deliberations at scientific journals.

Peer review is not just a process for making an accept-or-reject decision. Peer reviewers also provide commentary and feedback for the authors. These reports provide a venue for honest dialogue that helps researchers hone their ideas and grow in their careers. Outsourcing manuscript review to robots risks collapsing a discourse that is crucial to scientific progress.

One solution you discuss is to pay reviewers. Is this a viable solution?

CB and KG: Paying reviewers isn't as crazy as it may sound. The landscape of scientific publishing includes both nonprofit and for-profit journals, and all sorts of business models in between. It seems especially reasonable that scientists who review for for-profit journals should be remunerated for their efforts, since they provide a service on which the viability of the journal depends.

Perhaps the most compelling argument for paying reviewers is that, of all the possible interventions one could propose, it requires the least amount of coordination among different stakeholders to succeed. As soon as one journal figures out a working model for paying reviewers, then everyone will notice that paying reviewers is viable, and there will be market pressure on other journals to follow suit.

Another idea that we quite like is for journals to offer substantial monetary awards for the most constructive or helpful reviews. This idea has its drawbacks too. Editors would have to spend a little bit of time choosing the prizewinning reviews, and editors could always select their friends for the prize. But every alternative is going to have its drawbacks, and it鈥檚 important to focus on the net effect, especially when the viability of the status quo seems so tenuous.

If we want to keep peer review voluntary, what are other possible solutions?

CB and KG: There are lots of possible interventions. But the intervention that probably would enjoy the broadest support would be for university hiring and promotion committees to prioritize quality of publications instead of quantity. Most academic scientists today are working in a system that rewards a researcher for the number of publications above all else. This obviously creates incentives for researchers to submit lots of manuscripts, which puts lots of pressure on peer review. If the norms changed so that hiring and promotion hinged on a candidate's top two or three papers instead, then researchers' incentives would change and the pressure on peer reviewers would diminish.

This research was funded by the National Science Foundation and the Templeton World Charity Foundation.

For more information, contact Bergstrom at cbergst@uw.edu and Gross at krgross@ncsu.edu.

Cloaked in color: UW-led research finds some female hummingbirds evolve male plumage to dodge aggression /news/2025/03/20/cloaked-in-color-uw-led-research-finds-some-female-hummingbirds-evolve-male-plumage-to-dodge-aggression/ Thu, 20 Mar 2025 16:00:01 +0000 /news/?p=87796 A small, bright blue bird hovers in front of a flower.
A white-necked jacobin hummingbird. Credit: Lukas Hummel

Why do humans wear clothes? One reason is that changing outfits allows people to tailor their look in hopes of attracting or avoiding attention. New research led by the University of Washington found that hummingbirds may take a similar approach.

It's been known for some time that some – but not all – female white-necked jacobin hummingbirds take on the brightly colored plumage worn by males. In a newly published study, a team of researchers from the UW and Carnegie Mellon University discovered the reason: They're mimicking males. That trickery results in reduced aggression from other hummingbirds and increased access to nectar resources.

"This research takes a mental model we've been describing for a while in our papers and gives it a mathematical backbone," said Falk, an evolutionary biologist who led this research while working as a postdoctoral scholar in the UW biology department. "It can be easy to think of natural selection as a force that is constantly choosing one single optimum. But this model adds to our understanding of how diversity, especially diversity within sexes, can be a stable endgame."

Relying on the principles of game theory and incorporating previously collected behavioral data, the researchers developed a model of hummingbird behavior to better understand how some birds choose their colors. The findings showed that "hybrid signals" – an equilibrium that can occur when signalers in a given situation may be dishonest – likely exist in nature.

"In these hummingbirds, females want to mimic males," said Kevin Zollman, a co-author of the study and director of Carnegie Mellon's Institute for Complex Social Dynamics. "If they all did that, then they would end up being disbelieved. So, they end up settling into an equilibrium where some of them 'lie,' and they are sometimes 'believed.'"

This mimicry helps explain how female polymorphism – in which females of a species can take many forms – persists among white-necked jacobins and other hummingbird species.

"This model elegantly explains this puzzling female polymorphism in one species, but also offers a framework to study testable predictions of plumage differences, or lack thereof, between sexes across hummingbirds," said Rico-Guevara, a UW assistant professor of biology and co-author of the study. "We can use this to understand signals beyond plumage coloration, like different behaviors or body parts, such as long tails or auditory signals, which would entail different costs for the signalers and different model outcomes."

Carl Bergstrom, a UW professor of biology, is also a co-author. This research was funded by a National Science Foundation Postdoctoral Research Fellowship in Biology grant, awarded to Falk, as well as by the Walt Halperin Endowed Professorship and a Washington Research Foundation Distinguished Investigator award, held by Rico-Guevara.

This article is adapted from a press release by Carnegie Mellon University.

Researchers prefer same-gender co-authors, UW study shows /news/2023/08/29/researchers-prefer-same-gender-co-authors-uw-study-shows/ Tue, 29 Aug 2023 18:10:45 +0000 /news/?p=82438 A group of people at a table with papers and water bottles.
Research from the University of Washington and Cornell University suggests a behavioral component is in play when scientists seek out collaborators. Photo: Pixabay

Researchers are more likely to write scientific papers with co-authors of the same gender, a pattern that can't be explained by varying gender representations across scientific disciplines and time.

A new, recently published study from the University of Washington and Cornell University finds consistent gender homophily – the tendency of authors to collaborate with others who share their gender – in a digital collection of 560,000 published research articles over a 50-year period. While this observation is not new, researchers also used novel methods to rule out seemingly logical explanations for the pattern, such as a field's gender balance or authorship norms for writing research papers.

The findings suggest a behavioral component is in play when scientists seek out collaborators.

"Researchers use social discretion when choosing their collaborators," said Lee, co-author and associate professor of philosophy at the UW. "Do they express this by choosing same-gender co-authorship teams? How can we study this at a scale that includes multiple fields while also respecting the diversity of authorship demographics and practices at finer-grained levels?"

The research team, composed of scholars in statistics, information science, biology and philosophy, mined articles published between 1960 and 2011 from the online repository JSTOR. To help link genders to more than 800,000 author names, the team relied on Social Security records and crowdsourced data. Because of limitations in the data, this research was restricted to those who identify as men and women and didn't include nonbinary and intersex identities.

The team then grouped authors from the same fields and eras, creating 50,000 hypothetical reconfigurations of authors.

"We re-simulated hypothetical datasets. Our thinking was: How different is what we actually observed versus these hypothetical scenarios that we constructed?" said Wang, co-author and assistant professor at Cornell who was a doctoral student in statistics at the UW when he started this research. "Very different, it turns out. This suggests that some other source of homophily is occurring in the data we observed."

The team can't say definitively why researchers tend to collaborate with those of the same gender. Data science methods can't measure intent, but Wang said the findings suggest consideration of gender may be a factor.

Other co-authors from the UW were Jevin West, associate professor in the UW Information School; Carl Bergstrom, professor of biology; and Elena Erosheva, professor of statistics and of social work. This research was supported by the National Science Foundation and the UW Royalty Research Fund.

Adapted from a Cornell University press release.

For more information, contact Lee at c3@uw.edu and Erosheva at erosheva@uw.edu.

Communication technology, study of collective behavior must be 'crisis discipline,' researchers argue /news/2021/06/14/communication-technology-study-of-collective-behavior-must-be-crisis-discipline-researchers-argue/ Mon, 14 Jun 2021 19:33:40 +0000 /news/?p=74654

Our ability to confront global crises, from pandemics to climate change, depends on how we interact and share information.

Social media and other forms of communication technology restructure these interactions in ways that have consequences. Unfortunately, we have little insight into whether these changes will bring about a healthy, sustainable and equitable world. As a result, researchers now say that the study of collective behavior must rise to a "crisis discipline," just like medicine, conservation and climate science have done, according to a paper published the week of June 14 in the Proceedings of the National Academy of Sciences.

"We have built and adopted technology that alters behavior at global scales without a theory of what will happen or a coherent strategy for reducing harm," said Joe Bak-Coleman, the lead author and a postdoctoral researcher at the University of Washington's Center for an Informed Public.

Social media and other technological developments have radically reshaped the way that information flows on a global scale. These platforms are driven to maximize engagement and profitability, not to ensure sustainability or accurate information – and the vulnerability of these systems to misinformation and disinformation poses a dire threat to health, peace, global climate and more.

No one, not even the platform creators themselves, has much understanding of how their design decisions impact human collective behavior, the authors argue.

"We urgently need to understand this and move forward with focus on developing social systems that promote well-being instead of creating shareholder value by commandeering our collective attention," said co-author Carl Bergstrom, a UW professor of biology and faculty member at the Center for an Informed Public.

Collective behavior and other complex systems are fragile. "When perturbed, complex systems tend to exhibit finite resilience followed by catastrophic, sudden, and often irreversible changes," the authors write.

While there are studies and disciplines that focus on complex systems in the natural world, "we have a far poorer understanding of the functional consequences of recent large-scale changes to human collective behavior and decision making," the authors write.

Averting catastrophe in the medium term (e.g., coronavirus) and long term (e.g., climate change, food security) will require rapid and effective collective behavioral responses – yet it remains unknown whether human social dynamics will yield such responses.

"We have seen individual studies about how climate-change disinformation gets over-represented even in the mainstream media, and studies show that in digital media that problem only gets worse," said one co-author, an associate professor of environmental studies at New York University.

Lacking a developed framework, tech companies have also fumbled their way through the ongoing coronavirus pandemic, unable to stem the "infodemic" of misinformation that impedes public acceptance of pandemic control measures such as wearing masks, widespread testing for the virus and vaccinations.

The situation parallels challenges faced in conservation biology and climate science, where insufficiently regulated industries optimize profits while undermining the stability of ecological and Earth systems.

"If we have a decade or so to act on climate change, we have far less time to sort out our social systems," Bak-Coleman said.

Historically, collective behavior has been understood as coordinated action that animals or people exhibit without an obvious leader. This includes how fish school to evade predators or when a crowd spontaneously breaks into applause or becomes silent.

That thinking has evolved over the past decade, the authors write, from the description of a phenomenon to a contemporary framework that reveals how interaction among individuals gives rise to collective action.

Additional co-authors on the paper include Rachel Moran at the UW; Mark Alfano at Delft University of Technology and Australian Catholic University; Wolfram Barfuss at the University of Tübingen; Miguel A. Centeno, Andrew S. Gersick, Daniel I. Rubenstein and Elke U. Weber at Princeton University; Iain D. Couzin at the University of Konstanz; Jonathan F. Donges at Stockholm University; Mirta Galesic and Albert B. Kao at the Santa Fe Institute; Pawel Romanczuk at Humboldt-Universität zu Berlin; Kaia J. Tombak at Hunter College of the City University of New York; and Jay J. Van Bavel at New York University.

Funding came from the UW eScience Institute, the John S. and James L. Knight Foundation, the UW Center for an Informed Public, the Deutsche Forschungsgemeinschaft, the National Science Foundation, the Max Planck Society, the Baird Society, the Emmy Noether Program, the Santa Fe Institute and the U.S. Navy's Office of Naval Research.

For more information, contact Bak-Coleman at joebak@uw.edu.

 

Q&A: It’s not just social media 鈥 misinformation can spread in scientific communication too /news/2021/04/21/qa-its-not-just-social-media-misinformation-can-spread-in-scientific-communication-too/ Wed, 21 Apr 2021 21:04:25 +0000 /news/?p=73926
Academia is not immune to spreading misinformation, write UW researchers Jevin West and Carl Bergstrom in a recent paper. Photo: University of Washington

When people think of misinformation, they often focus on popular and social media. But in a paper published April 12 in the Proceedings of the National Academy of Sciences, University of Washington faculty members Jevin West and Carl Bergstrom write that scientific communication – both scientific papers and news articles written about papers – also has the potential to spread misinformation.

The researchers note that this doesn't mean that science is broken. "Far from it," write West, an associate professor at the UW Information School and inaugural director of the Center for an Informed Public (CIP), and Bergstrom, a UW biology professor and a CIP faculty member. "Science is the greatest of human inventions for understanding our world, and it functions remarkably well despite these challenges. Still, scientists compete for eyeballs just as journalists do."

UW News asked West and Bergstrom to discuss misinformation in and about science. Their emailed responses are below:

UW News: Many of us are familiar with the idea of fake news or misinformation on social media. Can you explain how some of these same concepts – such as hype and hyperbole, bias, filter bubbles and echo chambers and data distortion – also pop up in science and science communication? Why does this happen?

Jevin West

Science is run by humans, and humans respond to incentives. Scientists have strong incentives to be first to a result and to have their work noticed. Attention is a scarce resource. This creates an environment where scientists, universities, funders and journalists often hype work more than the results warrant. One example is an eye-catching paper title or a headline from a science journalist: "Muons upend all of physics."

Carl Bergstrom

Researchers used to visit libraries and browse printed journals to keep up on the latest scientific research, but this is largely a thing of the past. Today most researchers access the literature through search engines, recommender systems and, to some degree, social media platforms. That creates the same kind of filter bubble problems that we see in society more broadly. Platforms optimize engagement, and the best way to engage a person is to deliver content that grabs their attention. Although the effects are less pronounced in science, it is still an issue that is not well understood and requires more attention.

 

West and Bergstrom are co-authors of "Calling Bullshit: The Art of Skepticism in a Data-Driven World," which came out in paperback this week.

 

How does a crisis like COVID-19 further fuel these issues?

The COVID-19 crisis, like any major crisis, involves high levels of uncertainty especially at first. As we tried to understand what was happening with SARS-CoV-2 early in 2020, we were looking at a virus about which we had very little prior knowledge – it had never been in humans until just a few months before. In uncertain environments, people are especially eager for answers. This creates an uncertainty vacuum into which all sorts of nonsense flows.

While scientists take their time to understand the origin of the virus, conspiracy theorists provide ready-made answers. Those with specific agendas cherry-pick from the range of research results. Scientists strive to accelerate research by sharing work prior to peer review, but reporters and others do not always treat that work with due caution. Journals try to hasten the peer review process, but sometimes this results in low quality work slipping through.

Despite all these challenges, science has come through remarkably well. Within 15 months, 10 vaccines already have been developed, with more on the way. Scientists sequenced the genome in a matter of days, worked out the structure of the virus and its proteins in exquisite detail, and are using sequence data from around the globe to track the spread and evolution of the virus and its many variants. Despite the challenges noted in our article, science remains among the greatest human inventions for understanding our world.

The term “significant” has a unique meaning to the scientific community. Can you describe that difference? How does the push for significance affect scientific results and papers?

In the science community, "significant" generally refers to statistical significance – the idea that a research result is statistically unlikely under some null hypothesis. This is a tricky concept, not only for the public, but also for scientists. Statistical significance does not necessarily mean that the effect is of a meaningfully important size. The cutoffs for deciding statistical significance differ based on the type of data and the discipline. And once a threshold level of statistical significance becomes entrenched, humans find ways to game the system to reach it – trying different methods until something works, for example. These are major topics of discussion in science today, and researchers look for better ways to report the degree of statistical support that their results carry. Again, as with the other topics discussed in this article, it doesn't mean science is broken. It just means that science is in an ongoing process of refinement and improvement.
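The gap between statistical and practical significance can be sketched with a toy calculation (the numbers here are invented for illustration, not drawn from any study): with a large enough sample, even a negligible difference between two group means passes the conventional 0.05 cutoff.

```python
from math import sqrt
from statistics import NormalDist

def two_sample_z_p(mean_a: float, mean_b: float, sd: float, n: int) -> float:
    """Two-sided p-value for a two-sample z-test with known, equal sd and equal n."""
    se = sd * sqrt(2.0 / n)          # standard error of the difference in means
    z = (mean_a - mean_b) / se
    return 2.0 * (1.0 - NormalDist().cdf(abs(z)))

# Hypothetical example: group means differ by 0.2 on a scale where sd = 10,
# i.e. Cohen's d = 0.02 -- far below even a conventionally "small" effect (0.2).
effect_size = (100.2 - 100.0) / 10.0

# With a million observations per group, the p-value is vanishingly small:
# "statistically significant," yet the effect is practically meaningless.
p_large_n = two_sample_z_p(100.2, 100.0, sd=10.0, n=1_000_000)

# With 50 observations per group, the identical effect is nowhere near significant.
p_small_n = two_sample_z_p(100.2, 100.0, sd=10.0, n=50)

print(effect_size, p_large_n, p_small_n)
```

The same effect flips from "not significant" to "significant" purely by growing the sample, which is why significance alone says nothing about whether an effect matters.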

Can you talk about what happens when scientists find negative or non-significant results? Why could this be a problem?

Negative results tend to be boring: This drug doesn't cure a disease, this sensor does not detect its target, this chemical reaction fails to proceed, this explanation for a phenomenon is unfounded. As a result, people are less interested in reading them, journals are less interested in publishing them and consequently scientists often cut their losses and don't bother submitting negative results for publication. But this creates problems of its own. If scientists preferentially publish positive results, the scientific record is not an unbiased picture of scientific discovery. The positive results are in journals for everyone to read, while the negative results are hidden away in file cabinets or, more recently, on file systems. Indeed, false claims can even become established as fact, as Bergstrom and colleagues showed in 2016.

Fortunately, science has recognized this problem over the last decade and has proposed some solutions. For example, some publishers encourage the publication of negative results. Some fields have adopted a system known as "registered reports," where researchers submit their experiment for peer review before the results are available, and publishers agree before the work is done to publish the results regardless of whether the results end up positive or negative.

What are some interventions that can help reduce misinformation both in science and in communications about science?

The most important intervention is teaching the public what science is and what it is not. This includes teaching about the history and philosophy of science. It requires scientists themselves engaging with the public. It involves calling out predatory journals (non-peer-reviewed journals), being cautious with preprint papers, understanding the tactics of those pushing purposeful and disingenuous doubt about science, and paying special attention to health misinformation that looks like science but is often anything but.

With more people paying attention to science and preprints right now thanks to the COVID-19 pandemic, what are some steps the general public can take when looking at preprints or news stories about science?

The rise of preprints is a good thing for science. Instead of waiting years for results, research findings can be made available immediately. During the pandemic this has been critical. But this shortened time scale comes at a cost. Preprints are not peer-reviewed. Peer review can take months and even years, and it doesn't guarantee foolproof results. But it does a reasonably good job at filtering out the crackpot papers and those with obvious problems.

The public and journalists have to be extra careful with preprints. Some preprints during the pandemic spread across the media landscape even though they had major problems and were later debunked by more credible experts. When referencing newly posted preprints, readers should invest more time in investigating the author, lab and institution behind the results. When sharing results from preprints, it is important to tag the paper as non-peer-reviewed.

That said, some of the worst and most damaging papers published during the pandemic have gone through peer review, including a paper in The Lancet that led to the cancellation of clinical trials – and later turned out to be fraudulent – so we have to be careful not to let our guard down on the peer-reviewed literature, either.

For more information, contact West at jevinw@uw.edu or Bergstrom at cbergst@uw.edu.

Information School to welcome high school students March 19 for 'MisInfo Day' – from 'Calling BS' faculty duo /news/2019/03/18/information-school-to-welcome-high-school-students-march-19-for-misinfo-day-from-calling-bs-faculty-duo/ Mon, 18 Mar 2019 21:38:56 +0000 /news/?p=61275 What is misinformation, and how – and why – does it spread? The University of Washington is taking a leading role in helping people better navigate this era of increasing online fakery and falsehood.

On March 19, the iSchool will welcome more than 200 Seattle-area high school students for "MisInfo Day," a daylong workshop on how to navigate the misinformation landscape from Jevin West and Carl Bergstrom, the faculty duo who created the "Calling BS in the Age of Big Data" class and its companion website.

“MisInfo Day,” will be from 9:30 a.m. to 2:30 p.m. in the Husky Union Building’s North Ballroom.

West is an iSchool assistant professor, Bergstrom a professor of biology. Their most recent creation is WhichFaceIsReal, a website that helps users learn to tell real from fake images online.

The students – many of whom are studying government – will come from Nathan Hale, Franklin, Bellevue and Toledo high schools. Discussions will include defining misinformation and why we find it so compelling, as well as "tips and tricks" for determining if news reports and social media posts are legitimate.

The afternoon session will be an "Ask the Experts" panel, where the students will hear from professionals at the Seattle Public Library, Snopes.com and the UW about their work. The students are asked to "come with questions about misinformation, fact-checking, confirmation bias and more."

Other faculty and staff involved are:

  • , iSchool assistant professor
  • , 91探花librarian who manages the Information Science collection
  • , assistant professor in the Department
  • , assistant professor in the
  • Liz Crouse, one of several students involved from the iSchool's Master of Library and Information Science program, who assisted West in coordinating the event and will conduct pre- and post-program surveys of students for an ongoing research project. Other MLIS students will lead breakout sessions during the event.

Bergstrom and West’s “Calling BS” work has drawn wide attention from press as well as other institutions, some of whom have already expressed interest in holding events modeled on “MisInfo Day.”

###

For more information, contact Maggie Foote, iSchool communications director, at 206-250-5992 or m2foote@uw.edu.

Fake faces: UW's 'Calling BS' duo opens new website asking 'Which face is real?' /news/2019/03/04/fake-faces-uws-calling-bs-duo-opens-new-website-asking-which-face-is-real/ Mon, 04 Mar 2019 22:52:03 +0000 /news/?p=61086
Which of these two realistic renderings of faces is real, and which is a computer-generated fake? Biology professor Carl Bergstrom and Information School professor Jevin West – creators of the “Calling BS” class and site – now have a website to help you better discern between fake and real images online. Here, the image on the right is real. Check your skills at their site, WhichFaceIsReal.com.

Go ahead, give it a try. Look closely, study the context and click your answer, choosing which of two realistic headshots is actually a real photograph – and which is complete fakery.

How did you do? Don’t worry – read the site’s advice and try again.

WhichFaceIsReal.com is the new website from Jevin West of the UW Information School and Carl Bergstrom of the biology department, the duo who have drawn wide attention since 2017 for their innovative Information School class, “Calling Bullshit in the Age of Big Data.”

As with their Calling BS class, Bergstrom and West seek to address – and help people navigate – the increasing amount of misinformation and deception they see online. When the two saw the artificial intelligence-powered website – which renders extremely realistic portraits of utterly nonexistent people – they wanted to spread the word that the technology to generate believable, lifelike faces now exists and is proliferating online.

“We did not create the technology,” Bergstrom stressed. “We wanted to get the word out that this is now possible. Generally up to this point, we have trusted faces in photos: If it’s a photo, it’s a real person – at least up to this point.”

WhichFaceisReal in the news:

The Verge
CBC Radio
Buzzfeed

As Bergstrom and West explain on WhichFaceisReal, the “phenomenal” algorithm used to create realistic fake faces was developed by software engineers at NVIDIA Corporation and uses what is called a generative adversarial network, in which two neural networks “play a game of cat and mouse” – one trying to create artificial images and the other trying to tell them apart from real ones. “The two networks train one another,” they write. “After a few weeks, the image-creating network can produce images like the fakes on this website.”
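The cat-and-mouse dynamic described above can be sketched in a few dozen lines. This is a deliberately tiny toy, not NVIDIA’s actual face-generating network: the “images” are just numbers drawn from a normal distribution, the generator and discriminator are single affine units, and the gradients are derived by hand. All parameter choices here are illustrative assumptions.

```python
import numpy as np

# Toy 1-D GAN: "real photos" are samples from N(4, 1), the generator is
# G(z) = a*z + b, and the discriminator is a logistic unit
# D(x) = sigmoid(w*x + c). The two take alternating gradient steps --
# the adversarial game the article describes. A one-parameter
# discriminator can only police the mean, so this toy learns location,
# not the full distribution.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

a, b = 1.0, 0.0   # generator parameters (starts far from the data)
w, c = 0.0, 0.0   # discriminator parameters
lr, steps, batch = 0.05, 3000, 64

for _ in range(steps):
    z = rng.normal(0.0, 1.0, batch)
    x_real = rng.normal(4.0, 1.0, batch)   # "real photos"
    x_fake = a * z + b                     # "generated photos"

    # Discriminator ascent: push D(real) toward 1 and D(fake) toward 0.
    s_r, s_f = sigmoid(w * x_real + c), sigmoid(w * x_fake + c)
    w += lr * np.mean((1 - s_r) * x_real - s_f * x_fake)
    c += lr * np.mean((1 - s_r) - s_f)

    # Generator ascent on the non-saturating loss: push D(fake) toward 1.
    s_f = sigmoid(w * x_fake + c)
    grad_x = (1 - s_f) * w        # d log D(x_fake) / d x_fake
    a += lr * np.mean(grad_x * z)
    b += lr * np.mean(grad_x)

fakes = a * rng.normal(0.0, 1.0, 10000) + b
print(f"mean of generated samples: {fakes.mean():.2f} (real data mean: 4.0)")
```

After training, the generator’s output mean should have drifted from 0 toward the real data’s mean of 4 – the image-creating network has learned to fool its critic, which is the same mechanism, at toy scale, that produces the faces on the site.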

And as with their Calling BS work, the website was immediately popular, with about 4 million “plays” of the game in about two weeks.

Are people guessing well? Mostly, yes, West said. Overall, so far, about 70 percent of players choose correctly when trying to distinguish fake from real – and the site may be helping them learn to do better.

“In our initial analysis this appears to be the case, but we need to verify this with more rigorous analysis,” West said. “We鈥檙e also looking into what kinds of images are the most difficult to discern. For example, are fake images of younger or older people more difficult to identify?”

Misinfo Day – March 19
Jevin West will welcome over 200 area high school students for a daylong workshop discussing fake news and information online. Students will hear from a panel of experts from Snopes.com, the Seattle Public Library and the UW.

There are a few hints and “tells,” however, that help one choose more wisely. Bergstrom and West offer advice in a tab on their site labeled “learn.” Look for inconsistencies in the background of the photo, or how the hair or eyeglasses are rendered.

And there is what they call the “silver bullet” for spotting fakes online: The algorithm used is unable to generate multiple fake images of the same faux-person from different perspectives. So their advice is, to verify, look for a second photo of the same person.

Or as Bergstrom said, “How do you know if your next Tinder match is real? See if he has a nice headshot as well as a nice shot of himself petting a tiger. Always look for that tiger photo.”

West and Bergstrom plan to continue adding features to their site to add to the challenge of choosing between real and fake – they may remove backgrounds (which are among the “tells”) to make the choice harder, and will be asking users to view a single photo and say whether it’s real or generated.

“We want to bring public awareness to this technology,” said West. “Just like when people started to realize that you could Photoshop images, we want the public to know that AI can replicate human faces.

“That will hopefully make people start to question things they see in different ways. It will hopefully force us all to corroborate evidence even when we see a photo that looks human.”

###

For more information, contact West at jevinw@uw.edu or Bergstrom at cbergst@uw.edu.

How economic theory and the Netflix Prize could make research funding more efficient /news/2019/01/02/contest-theory-research-funding/ Wed, 02 Jan 2019 19:26:12 +0000 /news/?p=60379 As scientific funding becomes increasingly scarce, professors in STEM fields spend more time in their offices writing grant applications: by some estimates, as much as one-fifth of their research time. That takes time and energy away from teaching students, training young researchers and making discoveries that boost our collective knowledge and well-being.

Two scientists believe that, with professors vying for such a small pool of funds, the grant-application process has become a competition not over who has the best ideas, but over who is best at writing grant applications. In a paper published Jan. 2 in the journal PLOS Biology, co-authors Carl Bergstrom, a professor of biology at the University of Washington, and Kevin Gross, a professor of statistics at North Carolina State University, use the economic theory of contests to illustrate how this competitive system has made the pursuit of research funding inefficient and unsustainable. They show that alternative methods, such as a partial lottery to award grants, could help get professors back in the lab where they belong.


To receive a grant today, professors apply to funding agencies like the National Science Foundation or the National Institutes of Health. Reviewers evaluate and rank the applications, and the highest-ranking applications receive grant funding.

But over time, the percentage of proposals that receive funding has dropped dramatically. This is largely because the pool of available funds has not grown to keep pace with the number of STEM researchers.

“Back in the 1970s, the top 40 to 50 percent of applications to agencies were funded,” said Bergstrom. “Agencies merely had to separate the good research plans from the bad based on the grant applications.”

Funding thresholds for grant applications have tightened steadily since the 1970s. In 2003, only the top 20 percent of research project grant applications to the National Institute of Allergy & Infectious Diseases received funding. By 2013, the success rate had plummeted to 8 percent. Gross and Bergstrom argue that the funding pool has grown so small relative to the number of applicants that the nature of the grant-application process has changed.

“When agencies only fund the top 10 or 20 percent, they aren’t just separating bad ideas from good ideas,” said Bergstrom. “They’re also separating good from good.”

“This has two effects on the grant-application process,” said Gross. “First, professors must apply for more and more grants before they’re awarded one. Second, the application process becomes a contest to determine who can write the best grant proposals – so professors spend more and more time trying to perfect each individual application.”
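The first effect Gross describes can be made concrete with a back-of-the-envelope sketch: if each application succeeds independently with probability p, the number of submissions needed per award follows a geometric distribution with mean 1/p. The independence assumption and the funding rates below are illustrative, not figures from the paper.

```python
# Expected submissions per funded grant under an independent-success
# model: geometric distribution with mean 1/p. Rates are illustrative
# scenarios (roughly: 1970s, 2003, and 2013 NIAID success rates
# mentioned in the article).
for p in (0.40, 0.20, 0.08):
    expected = 1.0 / p
    print(f"funding rate {p:.0%}: ~{expected:.1f} submissions per award")
```

Under this simple model, dropping the funding rate from 40 percent to 8 percent multiplies the expected number of applications per award fivefold – from about 2.5 to about 12.5.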

Gross and Bergstrom realized that today’s grant-application process can be described using the economic theory of contests. In contest theory, teams compete to produce a product or complete a task for an agency; the agency picks a winner and retains the fruits of the team’s efforts, while the winning team receives a prize such as cash. In the Netflix Prize, for example, teams competed to produce an algorithm that would predict how users would rate films on its service. Netflix received the winning algorithm, while the winning team pocketed $1 million.

“If we were to apply contest theory to grants, then professors are the ones competing to create a product 鈥 the best grant application 鈥 for the agency,” said Gross. “That’s not a particularly good system, though, because the funding agency doesn’t want grant applications for their own sake. They want to fund research.”

In their paper, Bergstrom and Gross illustrate how the grant-application process is consistent with economic contest models. They show how funding a relatively small fraction of grant applications – such as the top 10 or 15 percent – makes the practice of science inefficient: The costs associated with trying to produce the best grant application can outweigh the economic value of the science produced.

If agencies funded a higher percentage of applications, professors could spend less time trying to write the perfect grant application. In addition, funding agencies wouldn’t have to subjectively choose winners among high-quality proposals that are all based on sound science. But this option would require significantly expanding funding to agencies like the NIH and the NSF, a politically difficult task.

Using the economic theory of contests, Gross and Bergstrom modeled a controversial alternative: awarding grants instead by partial lottery. Under a partial lottery system, funds are awarded by random draw among a pool of high-ranking grants – the top 40 percent, for example. Since applicants would be aiming to clear a lower bar for a smaller prize – a shot at the lottery instead of a guaranteed payout for winning proposals – the contest theory model predicts that applicants would spend less time trying to perfect their applications, Bergstrom said.
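The selection step of the two award rules is easy to sketch. The code below is a minimal illustration, not Gross and Bergstrom’s actual model: the reviewer scores, the 40 percent shortlist, and the 10-award budget are all hypothetical parameters chosen for the example.

```python
import random

random.seed(1)

def partial_lottery(scores, top_frac=0.40, n_awards=10):
    """Shortlist every proposal in the top `top_frac` by reviewer score,
    then choose the n_awards winners by uniform random draw."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    shortlist = ranked[: max(n_awards, int(len(scores) * top_frac))]
    return random.sample(shortlist, n_awards)

def top_k(scores, n_awards=10):
    """Status quo: fund exactly the n_awards highest-scoring proposals."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return ranked[:n_awards]

# 200 proposals with noisy reviewer scores.
scores = [random.gauss(0.0, 1.0) for _ in range(200)]

winners_lottery = partial_lottery(scores)
winners_topk = top_k(scores)

# Every lottery winner still cleared the quality bar (the 80th-highest
# score here), but the draw no longer splits hairs among good proposals.
bar = sorted(scores, reverse=True)[int(len(scores) * 0.40) - 1]
print(all(scores[i] >= bar for i in winners_lottery))  # True
```

The design difference is exactly the one the article describes: `top_k` must rank-order the many strong proposals against each other, while `partial_lottery` only asks whether a proposal clears the bar, leaving the rest to chance.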

Partial lotteries have been proposed by others, including a UW professor of laboratory medicine and a Johns Hopkins professor. They’re also used by two funding agencies in New Zealand and by the Volkswagen Foundation. Gross and Bergstrom simply use contest theory to show how this system could also free professors from the seemingly endless cycle of grant applications.

But partial lotteries aren’t the only viable solution, they say. Funding agencies could also award grants based on merit, such as a professor’s past record of excellence in research. But that system would also need mechanisms to help early-career faculty and professors from underrepresented groups obtain grants, Bergstrom said. Hybrid systems are another option, such as a partial lottery for early-career faculty and merit-based grants for later-career faculty.

“There are many potential routes out of the current hole,” said Bergstrom. “What doesn’t change is our conclusion that the current grant-application system is fundamentally inefficient and unsustainable.”

###

For more information, contact Bergstrom at cbergst@uw.edu and Gross at krgross@ncsu.edu.

After much media attention, UW Information School’s ‘Calling BS’ class begins /news/2017/03/28/after-much-media-attention-uw-information-schools-calling-bs-class-begins/ Tue, 28 Mar 2017 19:55:49 +0000 /news/?p=52575 The very name of the class, when proposed, seemed to fire imaginations nationwide and beyond. Now with the beginning of spring quarter, the University of Washington Information School’s new course “Calling Bullshit in the Age of Big Data” is getting started.

The class was conceived and is taught by iSchool assistant professor Jevin West with biology professor Carl Bergstrom. It’s a one-credit course offered through the iSchool as INFO 198 but also listed in the biology department as BIOL 106B. The class will meet on Wednesday afternoons, with its first session on March 29.

When registration opened, the class reached its 160-student ceiling within one minute, like a rock concert selling out.

Full-length lectures are now available on YouTube.

Each session will take up a different topic related to, well, BS. As the online syllabus shows, West and Bergstrom will start with definitions and “spotting BS,” followed by a session on “the natural ecology of BS.” Subsequent topics include causality, statistical traps, big data, publication bias and predatory publishing, fake news, and the ethics of calling and refuting BS.

In a lengthy online FAQ, the instructors say the class is not about any one party or politician – despite being “particularly timely today” – and will not seek to comment on the current political situation in the country or the world.

“This class is about how to spot bullshit and how to call it. It’s not about cataloguing all the bullshit out there, telling students what we think is bullshit in contemporary science and society, or calling bullshit on the most egregious cases,” said Bergstrom.

Examples to be used in class, he said, are those “that serve a pedagogical purpose” by showing ways BS is spread and demonstrating effective ways to refute it. Basically, they say, the course is about how numbers, statistics, data visualization models and algorithms are increasingly used to propagate BS, and how people can detect it and avoid being taken in.

West and Bergstrom plan to hold a three-credit version of the class in fall quarter.

And though the class is for UW students, Bergstrom and West plan to edit their lectures into video clips and make them publicly available on the UW’s YouTube channel.

“We don’t care whether our students agree with our world views,” Bergstrom said, “but we do want them to have the skills to see through nonsense, form well-founded beliefs based on evidence and make their best arguments for those beliefs.”

West added, “Now, we need to make sure it’s as fun as a rock concert.”

###

For more information, contact West at 206-543-2646 or jevinw@uw.edu, or Bergstrom at 206-685-3487 or cbergst@uw.edu.

‘Overwhelming’ response, global press attention for new UW Information School course, ‘Calling BS’ /news/2017/02/06/ovewhelming-response-global-press-attention-for-new-uw-information-school-course-calling-bs/ Mon, 06 Feb 2017 17:01:26 +0000 /news/?p=51941 It’s almost unheard-of for a university class to spark global press attention – and offers of book deals – before instruction even begins. But such is the case with the UW Information School’s new course, “Calling Bullshit in the Age of Big Data.”

The class was proposed by Information School assistant professor Jevin West and biology professor Carl Bergstrom, and has been approved as a one-credit special topics seminar for about 150 students in spring quarter, which starts on March 27.

“The response has been overwhelming. It seems to have struck a chord across the world,” said West. “We had tens of thousands of users to the class in just a few days of releasing the course.”

Full-length lectures are now available on YouTube.

West told the Seattle Times the course webpage went live at midnight on Jan. 11. “We woke up the next morning and it was all over the whole planet. I’ve never seen anything like it – the response has been insane.”

Bergstrom and West say they hope to expand to a 3- or 4-credit class in the 2017-2018 school year. Those not enrolled will be able to watch as well; the two say they will have the lectures videotaped and made freely available on the internet.

West said, “It has clearly struck a chord with hundreds of thousands of people around the world.”

The course’s brief description, atop the online syllabus, is as blunt and compelling as its title: “Our world is saturated with bullshit,” it says. “Learn to detect and defuse it.”

Read recent coverage:

Seattle Times
The Chronicle of Higher Education
NPR Morning Edition
KOMO radio
KING 5
