
Remedy for Reproducibility: Opening a Dialog to Explore the Complexities

Scientists have contemplated the importance of validating research for decades. At each point along the path toward publication of preclinical research, there are both checkpoints for accuracy and challenges to proving what has been discovered. Many things inhibit the task of validation, from funding limitations and the challenging task of recreating experiments that require specialized knowledge of the technology in another environment, to the pressure to move on to new research and publish.

Reproducibility is such an important topic in life sciences discovery and technology that SLAS has visited it several times, on occasions that include the inaugural 2015 SLAS Leadership Forum and now an SLAS2017 Special Session. Entitled “Whose Responsibility is Research Reproducibility?”, the panel discussion, to be held Tuesday, Feb. 7, 2017, during the SLAS International Conference and Exhibition in Washington, DC, focuses on a collaborative approach to finding solutions. Richard Harris, NPR science correspondent, moderates the panel.

“Tension between academia and industry often surfaces when this problem is described,” says Special Session Co-Chair Lenny Teytelman, Ph.D. “This panel does not represent one side or another. SLAS2017 is a beautiful opportunity to mix because it’s not a conference where you have academics talking with each other in one corner and big pharma researchers talking in another. Everyone mingles.”

Teytelman and Co-Chair Cathy Tralau-Stewart, Ph.D., have each held positions in industry and academia. The co-chairs understand that these paths naturally diverge when answering the question of who should take responsibility. Specific proposals for improving reproducibility are frequently contentious and laden with fears of unintended consequences for the research enterprise. Teytelman and Tralau-Stewart hope that exploring the topic collaboratively in this panel discussion will ease tensions and reveal the multi-layered challenges faced by each perspective. They predict that the audience will walk away inspired, not panicked.

“Reproducibility is an industry and an academic problem,” says Tralau-Stewart. “I have seen many industry labs have exactly the same problems as academic labs have.” In addition to her co-chair duties for this session, Tralau-Stewart will join the discussion as an industry panelist. She previously worked as associate director, pharmacology at GlaxoSmithKline (GSK) before launching academic pursuits and is currently an associate professor in therapeutics at University of California, San Francisco (UCSF) and associate director for UCSF’s Catalyst, a role in which she moves UCSF’s academic projects toward industry development.

Teytelman shares a similar crossover background. Five years ago, the former postdoctoral researcher at the Massachusetts Institute of Technology (MIT) co-founded a start-up company in Berkeley, CA, that offers an open-access, crowd-sourced protocol repository for the life sciences, giving scientists the ability to share and discover detailed research methods.

“The panel discussion is a wonderful occasion to blend industry, academia, funders and publishers,” says Teytelman, who is also the SLAS2017 Data Analysis and Informatics track chair. “There are good things we can do that aren’t going to pause or defund research or create regulatory headaches but are steps to improve reproducibility.”

Addressing Issues

A core issue to be addressed is how to define reproducibility. Teytelman comments that definitions became a focal point of discussion after an Amgen study published several years ago. The paper concluded that researchers were unable to replicate results of 47 out of 53 papers that were seminal to launching drug discovery programs.

“When you take a published paper from a high-profile journal and try to reproduce the results, sometimes you can’t replicate it on the first attempt. Sometimes it requires another attempt or it turns out that the results were not robust and the effect disappears,” Teytelman explains. “There are lots of reasons for it but the core problem depends on the study and how you define reproducibility versus robustness.”

Tralau-Stewart agrees. “The definition problem is massive. When you look at the Amgen studies, there were many issues with definitions that challenged the interpretations,” she explains. “In order for research to translate, you have to be able to replicate and reproduce it later. You also want to build the strength of the hypothesis by replicating things in different systems. That is not the same as asking: ‘If I did exactly the same experiment, would I get exactly the same results?’ The more we learn about our complex world, the more variables we encounter that need to be controlled in experimentation. And we still don’t know everything.”

According to Tralau-Stewart, who is the SLAS2017 Assay Development and Screening track chair, the reproducibility issue continues to increase the costs of the entire research process while decreasing the efficiency and likelihood of translating anything to the clinic. “You tackle what you can in the research process. The things that you can undertake are founded in reproducibility issues, in my opinion. There is definitely a need for tightening how we do science and the scientific methodology,” she says.

Adding to the problem is increasing attention from the mass media. Soon after the replication studies appeared in scientific journals a few years back, media outlets outside the science community picked up the story, expressing concern about how broadly the statistics applied to preclinical studies as a whole. The question was raised: Is this a crisis?

Teytelman says no. “There’s been more attention to it in the media lately; it’s been in The Economist and The Wall Street Journal. But I believe that we have the same problem that we have always had. We definitely need to improve reproducibility, but I’m not convinced that the situation is actually worse now than it has ever been before. How does reproducibility today compare to reproducibility rates of the same biomedical research 30 years ago? It’s difficult to determine if the problem is worse, the same or better because we didn’t have studies challenging results 30 years ago,” he comments.

“When you call something a ‘crisis,’ it’s easy to blame and point fingers at someone,” Teytelman continues. “It’s dangerous and unproductive to panic. We don’t want Congress to think we are wasting taxpayer money. We are in a phase in which we are talking about what causes it and what methods will improve reproducibility. We first need to resolve who conducts that next step and discuss the situation openly.”

Panelists Discuss Potential Solutions

Exploring those next steps and the appropriate entities to execute them will be the panel of experts at the SLAS2017 Special Session: Ivan Oransky, M.D., co-founder of Retraction Watch, New York, NY, a blog that reports on retractions of scientific papers; Veronique Kiermer, Ph.D., executive editor for Public Library of Science (PLoS), San Francisco, CA, a nonprofit, open-access scientific library of journals and other scientific literature; Elizabeth Iorns, Ph.D., co-founder and CEO of the Science Exchange, Palo Alto, CA, an online marketplace of scientific service providers and technologies; and Tara A. Schwetz, Ph.D., senior advisor to the Principal Deputy Director at the National Institutes of Health (NIH), Bethesda, MD.

The session panelists represent comprehensive backgrounds in life sciences discovery and technology. Oransky earned an M.D. at the New York University School of Medicine, where he holds an appointment as clinical assistant professor of medicine. Kiermer, in addition to her previous editing and publishing roles with the science journal Nature, holds a Ph.D. in molecular biology from the Université libre de Bruxelles, Brussels, Belgium. Iorns, previously a breast cancer researcher and assistant professor at the University of Miami, joined forces with PLoS and the data repository figshare in 2012 to launch the Reproducibility Initiative, hosted on Science Exchange, which helps scientists, institutions and funding agencies validate their critical research findings. Tralau-Stewart serves on the advisory board of the Initiative.

Rounding out the panel is Schwetz, who holds a Ph.D. in biomedical sciences from the University of South Florida and has served in a variety of roles at NIH. She currently works under NIH Principal Deputy Director Lawrence A. Tabak, D.D.S., Ph.D., and represents the funding arm of research among the panelists. Tralau-Stewart anticipates that Schwetz will have an interesting perspective on problem-solving resulting from Tabak’s office’s involvement in launching certain peer review policies and practices for funding seekers, as well as his team’s work to improve statistical and experimental design training.

Tralau-Stewart hopes the audience will see how fundamental good science is to the reproducibility solution. “All scientists need to be clear about how they do their work,” she says. “The ongoing conversation is the need and importance of good scientific method with a greater emphasis on strong statistical and experimental design training.”

She comments that it was important to include funders and publishers in the session as well. “The future is more complex, and the issue requires everyone’s involvement – from the scientists in academia and industry to the funders and publishers of their work,” Tralau-Stewart says. “Only experiments that meet higher standards will get published.”

Teytelman agrees, adding: “No single entity is in the position to fix this, but I think we are at a stage to discuss possible solutions. The special session panel will be a reasonable discussion of some of the best ways to address reproducibility.”

Teytelman doesn’t think that one big answer for reproducibility issues will emerge from the session, but “I hope that the audience will leave with a deeper appreciation for the complexity of this issue,” he concludes.

December 12, 2016