To Advance Science, It's Time to Tackle Unconscious Bias (Op-Ed)

Advances in science can enhance human knowledge and health, but implicit bias by even the most well-meaning journal editors, science funders and peer-reviewers can undermine innovative ideas. (Image credit: Plufflyman / Shutterstock.com)

Geraldine Richmond chairs the board of directors for the American Association for the Advancement of Science. She is a professor of chemistry at the University of Oregon, where she holds the Presidential Chair in Science, and is the U.S. Science Envoy to the Southeast Asian Lower Mekong River countries. Richmond contributed this article to Live Science's Expert Voices: Op-Ed & Insights.

Over the past year, science has revealed the chirping song of gravitational waves (ripples in space-time that confirmed Einstein's theory of general relativity), advances in using a person's own immune system to treat cancer, new insights into climate-change impacts, and findings from the first flyby of the dwarf planet Pluto and its moon, Charon.

As the world celebrates such advances, and the power of science to enhance human knowledge as well as human lives, people should also consider the opportunities that may have been missed. Deeply ingrained biases, which scientists often deny having, can creep into our otherwise objective evaluation of a project or individual. Even among the most well-meaning journal editors, science funders and peer-reviewers, this "implicit bias" can have consequences that undermine innovative ideas, the importance of discoveries and valuable contributions from the full talent pool.

Let's talk about unconscious bias

It is time for scientists to talk openly about this problem.

Peer review — in which other experts in a field check one another's research to ensure it meets certain standards — is a time-honored process for evaluating scientific merit, performance and new discoveries. It is the backbone of modern science, and is used in a multitude of ways, such as for judging which papers should be published and which projects should get funded, decisions that ultimately shape scientific and career advancement. As highly as scientists regard the importance of peer review, however, we must also admit that it is a human endeavor. Improvements can always be made, especially when many concerns have been validated by data on journal submissions and grant applications. Such was the topic at a recent panel discussion titled "Implicit Bias in Scientific Peer Review," organized by the American Association for the Advancement of Science (AAAS).

We have known about implicit bias for some time now. In 2012, for example, Yale University researchers provided a group of male and female scientists with a paper attributed either to "John" or "Jennifer," and asked, "Would you hire this student as a lab manager?" The results, detailed in the journal Proceedings of the National Academy of Sciences, were troubling. John was more likely to be hired than Jennifer, and he was likely to be paid 15 percent more. Clearly, a gender bias was in play, even though the scientists evaluating the files believed their decisions were completely objective. Implicit bias affects everyone, no matter how objective and fair-minded they aspire to be.

Try these for a quick bias check: What if John or Jennifer were replaced by Tyrone and Andrew, or by Tulinagwe and Caroline, or by Hussein and Michael? What if a peer-reviewer Googled the author of a proposal and found her to have a physical disability? Would that alter the reviewer's thinking about the proposal? The human brain uses past experiences and surroundings to make mental shortcuts in decisions that, in ancient times, could have meant the difference between life and death. It is no wonder, then, that people's inherent biases are more prevalent when they make snap decisions instead of putting some time into the decision process.

Limited data about the authors of grant applications and journal submissions has so far made it difficult to understand the impact of implicit bias in peer review. Research presented at the recent AAAS panel suggested that publishers have made progress in addressing potential gender bias: male and female authors have papers accepted at about the same rate in many top journals. Some journals, particularly in the social sciences, have for a number of years conducted double-blind reviews, in which authors and reviewers are unaware of each other's identities. But most of the natural sciences have yet to adopt this practice, or even to experiment with it. As for research funders, a 2015 report from the U.S. Government Accountability Office called for better data and information-sharing on the gender demographics of proposal submitters and grant recipients.

Although race and gender are often the focus of implicit biases, institutional and country biases can also cloud scientists' objectivity. This consequently undermines the visibility of critical ideas and discoveries that the world sorely needs in order to solve global challenges.

What can be done?

Simply making reviewers aware of the roots of implicit bias can backfire, causing some to believe that there is no way to avoid the problem. Training can help to reduce implicit bias, but the positive impacts of such interventions tend to be short-lived. Brian Nosek, an expert in this area from the University of Virginia, has recommended structuring processes for reviewing journal articles and grant proposals to help minimize bias. At the same time, he said, reviewers must be encouraged to acknowledge the problem and become more mindful of it. Panel participants discussed a range of other potential creative solutions, such as double-blind review and certifying peer-reviewers worldwide to overcome the U.S.-centric focus of many elite journals.

More uniform data collection and data sharing will be critical next steps toward understanding and minimizing implicit bias in peer review. But at the same time, scientists simply must be willing to talk about the issue. It's time to tackle implicit bias in peer review, to ensure that the best science is funded and published.

Follow all of the Expert Voices issues and debates — and become part of the discussion — on Facebook, Twitter and Google+. The views expressed are those of the author and do not necessarily reflect the views of the publisher.

This article was originally published on Live Science.
