Open Access
Promoting Transparency in Evolutionary Biology and Ecology
28 July 2016
James F. Smith, T. H. Parker, S. Nakagawa, J. Gurevitch, TTEE (Tools for Transparency in Ecology and Evolution) Working Group

A hallmark of effective science is transparency. If results are not openly shared, or if others do not know how we derived those results, the progress of science is impeded. Most of us understand this core principle, but the benefits of transparency are broader than is always recognized. They include not only the ability to interpret results accurately, but also reduced bias, greater capacity to include results in data syntheses, and easier updating and replication of studies. However, without institutional support, practices that promote transparency are not nearly as common as they should be, despite the commitment of many individuals in the scientific community.

Science is a uniquely effective way of understanding the world, and the disciplines of ecology and evolutionary biology have made and continue to make progress in resolving important questions. However, inadequate transparency can slow progress. For instance, papers often fail to report basic information such as sample sizes, directions of effects, and measures of uncertainty for at least a subset of the results they report (e.g. Fidler et al. 2006; Zhang et al. 2012; Parker 2013; Ferreira et al. 2015). Further, this under-reporting of results has been found to be more likely for weak and non-significant relationships (Cassey et al. 2004; Parker 2013). Such reporting bias can alter the interpretation of conclusions and undermine the validity of reviews and future research syntheses. Similarly, various sources of evidence suggest that weaker effects are more likely to go entirely unreported (Csada et al. 1996; Møller and Jennions 2002; Fanelli 2010), again presenting a misleading picture of scientific outcomes in the published literature. Important methodological details, such as experimental design, study location, and statistical models, are also often missing (Mislan et al. 2016), further hindering interpretation, evaluation, and replication. Bias can be introduced not just when reporting methods and data, but also when collecting data. For instance, when observers are not ‘blinded’ to treatment or expected outcome, their studies report larger effect sizes on average and a higher proportion of significant p values than do studies with blinding (van Wilgenburg and Elgar 2013; Holman et al. 2015). Although blinding is not possible in all studies in ecology and evolution, it is unfortunately quite rare even when feasible (Kardish et al. 2015).

How can we more effectively promote transparency? Journals are institutions that are well poised to play a pivotal role. Journal articles already include methods and results sections, and authors are held to strict standards as conditions for publication. Journals can thus easily ask that authors adhere to specific standards of transparency. In the digital era, there are relatively few intrinsic barriers to sharing the additional information required for improved transparency. Recognition of the role of journals in promoting transparency has led to widespread adoption of data-sharing policies by journals in ecology and evolutionary biology in recent years (Whitlock et al. 2010). Although promoting data sharing has been a strong step towards transparency, here we advocate adoption of more comprehensive transparency guidelines in ecology and evolutionary biology.

In November 2015, representatives (mostly editors-in-chief) of more than 20 journals in ecology and evolution joined researchers and funding agency panelists to identify ways to improve transparency in these disciplines. This workshop (funded by the U.S. National Science Foundation and the Laura and John Arnold Foundation, and hosted by the Center for Open Science) identified general principles and specific tools that journals can adopt to encourage greater transparency in the science they publish. Most of the ideas that emerged from the workshop fit well within the recently developed Transparency and Openness Promotion (TOP) framework ( https://cos.io/top/; Table 1; Nosek et al. 2015). The TOP framework contains eight separate editorial guidelines for journals, each designed to be useful across the breadth of empirical disciplines. Some of these general guidelines require additional discipline-specific explanation. Accordingly, the workshop produced a document called ‘Tools for transparency in ecology and evolution’ (TTEE) designed to help journals in our discipline adopt the TOP guidelines; this content is now posted publicly and available for use by any journal ( https://osf.io/g65cb/). Further, both the general TOP guidelines and the discipline-specific TTEE interpretation are living documents that will be updated through formal review processes. Journals that implement the TOP framework can choose to adopt any combination of the eight guidelines, select a level of stringency (from 1, most lenient, to 3, most stringent) for each guideline adopted, and modify the requirements at a given level to meet journal standards. Journals can also choose to award badges ( https://osf.io/tvyxz/; Kidwell et al. 2016) to individual papers to acknowledge open practices, indicating that a paper conforms to one of three specific transparency standards: open data, open materials, or preregistration.

Some of the eight TOP guidelines (Table 1) will be more immediately familiar to ecologists and evolutionary biologists than others. The most familiar is likely “data transparency,” which encourages data archiving, a practice now suggested or required by many journals in our field and by a growing number of funding agencies (Whitlock 2011). In addition to archiving of data, TOP guidelines promote archiving of both analysis code and a set of detailed materials and methods. The arguments in favor of data archiving have been well made elsewhere (Tenopir et al. 2011; Whitlock 2011), and the arguments in favor of archiving analysis code (Mislan et al. 2016) and materials are similar. Furthermore, to help those who archive useful content obtain recognition for their contributions, the first TOP guideline encourages citation of archived content.

Table 1.

A list, with brief explanations, of each of the eight existing transparency and openness promotion (TOP) guidelines ( https://cos.io/top/).


Setting standards for thorough reporting of methods and results has clear benefits, and this is the purpose of the fifth TOP guideline, ‘design and analysis transparency.’ What qualifies as thorough design and analysis transparency varies among disciplines, and so this guideline requires substantial disciplinary interpretation to be useful. Providing this disciplinary interpretation for ecology and evolution is one of the primary purposes of the TTEE document posted online ( https://osf.io/g65cb/) as a supplement to TOP. The TTEE consists largely of questions that journals can provide to authors, reviewers, and/or editors as checklists to foster adherence to this and other TOP guidelines.

One of the major goals of the November 2015 workshop was to tailor transparency standards for the implementation and reporting of meta-analyses in ecology and evolution. The design and analysis transparency standards described in the previous paragraph will help make data from original studies more useful for meta-analysis, but the workshop also developed design and analysis transparency standards for meta-analyses themselves. To this end, discipline-specific checklist questions are available in the TTEE document to guide the conduct and publication of meta-analytic syntheses ( https://osf.io/g65cb/). The meta-analysis questions are presented as a separate checklist because meta-analysis is such a distinct and important undertaking. Meta-analysis is a tool used across much of ecology and evolutionary biology to assess the generality of phenomena (Koricheva et al. 2013), and so encouraging a rigorous and transparent meta-analytic process should lead to more robust inferences about generality in these disciplines.
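Transparent reporting is what makes such syntheses possible: the core meta-analytic computation needs, at minimum, each study's effect estimate and its sampling variance, exactly the quantities that the reporting biases described above tend to omit. As an illustration only (this sketch is not part of the TTEE checklists, and the function name is hypothetical), a fixed-effect inverse-variance pooled estimate can be computed as:

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect (inverse-variance) meta-analytic pooling.

    effects   -- per-study effect estimates
    variances -- per-study sampling variances (must be > 0)
    Returns (pooled estimate, standard error, 95% confidence interval).
    """
    weights = [1.0 / v for v in variances]           # inverse-variance weights
    est = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))               # SE of the pooled estimate
    ci = (est - 1.96 * se, est + 1.96 * se)          # normal-approximation 95% CI
    return est, se, ci

# Two equally precise studies: the pooled estimate is their simple mean.
est, se, ci = pooled_effect([1.0, 3.0], [1.0, 1.0])
```

The weighting step is also why under-reporting of non-significant results is so damaging to synthesis: a study whose variance or sample size goes unreported cannot be weighted and simply drops out, biasing the pooled estimate toward the effects that were reported.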

The TOP guidelines also encourage replication of previously published studies. Not all studies merit replication, replication is sometimes impractical, and it may not be appropriate for all journals to publish replications, but in many circumstances replications can make valuable contributions to empirical progress (Nakagawa and Parker 2015). Replication is a useful tool for exploring effects of environmental variability, and along with meta-analysis, can play an important role in building confidence in our inferences.

The concept of pre-registration, which is central to two TOP guidelines, is new to our discipline, though it has existed in biomedical research for well over a decade. Pre-registration involves publicly archiving (with the option of an embargo) a study design or an analysis plan prior to initiating the research. Pre-registering the plan for a study, including the questions or hypotheses to be addressed, reduces several types of bias. For example, it can identify studies on a particular topic that were initiated but never published. This can aid interpretation of the published range of effect sizes and help locate unpublished results for meta-analysis (i.e. combating the “file drawer” problem, sensu Rosenthal 1979). Pre-registering an analysis plan is even more useful, especially when working within a hypothesis-testing framework (as opposed to a hypothesis-generating framework). When interpreting tests of hypotheses, confidence in individual tests depends on the number of alternative hypotheses tested simultaneously, and on whether the hypothesis was developed a priori rather than generated by examining the current data (Nosek et al. 2015). By providing this information, pre-registered analyses substantially increase confidence in hypothesis tests. Pre-registration does not prevent changes in study design or analyses; it just makes these changes more transparent. Journals can opt to award a badge to identify papers that include pre-registered content.

The fields of ecology and evolutionary biology stand to derive major benefits as journals move to adopt transparency standards. Relying on individuals to define and adhere to transparency standards leads to inconsistent outcomes. A deliberate, institutional approach from this journal and others to promote transparency will facilitate clearer interpretation of published methods and results, reduced bias in results available to the scientific community, more effective meta-analytical synthesis, and improved opportunities to update and replicate studies. These outcomes will be an important legacy for the future of ecology and evolutionary biology.

The field of systematics has a long history of meeting many of these guidelines, and adopting them will not bring major changes for the journal or its authors. The earliest systematics studies were themselves publications of data (species descriptions still qualify as data). We have long required citation of the specimens examined for a study, publication of DNA sequences, and citation of vouchers for the sources of those data. We also now make use of Dryad, where authors can deposit data, supplementary files, and code for analyses that were not performed with standard published software. Additionally, although not explicitly stated in earlier versions of the instructions to authors, it is general policy that methods and results describe sampling explicitly. We recognize that sampling in systematics faces many practical limitations and will continue to allow authors to state where and why certain samples were not included, and to allow reviewers and editors to evaluate whether such sampling gaps are “fatal.” Replication is a part of nearly all studies to some extent, in that specimens are re-examined in new taxon descriptions and revisions and form the basis for sampling in molecular studies. Likewise, many current molecular studies re-evaluate earlier studies that used more limited data or alternative methods. The TOP guidelines likely to seem most alien to Systematic Botany are guidelines 6 and 7 (Table 1), which will probably be unfamiliar to most ecologists and evolutionary biologists. At present we are uncertain how these may be interpreted or used and are thus leaving the option open for authors. Authors wishing to make use of pre-registration in advance of initiating a study should contact the Systematic Botany Editor-in-Chief to make arrangements.

We will adopt the following TOP guidelines:

  1. Citation standards: Level 3 — Article is not published until appropriate citation for data and materials is provided, following the journal's author guidelines.

  2. Data transparency: Level 2 — Data must be posted to a trusted repository.

  3. Analytic methods (code) transparency: Level 2 — Code must be posted to a trusted repository.

  4. Research materials transparency: Level 2 — Materials must be posted to a trusted repository.

  5. Design and analysis transparency: Level 3 — Journal requires and enforces adherence to design transparency standards for review and publication.

    To facilitate adherence, we will post a checklist for authors on the journal website and provide checklists to reviewers.

  6. Pre-registration of studies: Level 1 — Journal encourages preregistration of studies and provides link in article to preregistration if it exists.

  7. Pre-registration of analysis plans: Level 1 — Journal encourages pre-analysis plans and provides link in article to registered analysis plan if it exists.

  8. Replication: Level 2 — Journal encourages submission of replication studies and conducts results-blind review.

Acknowledgments

The workshop on Improving Inference in Evolutionary Biology and Ecology that produced the Tools for Transparency in Ecology and Evolution was funded by the Laura and John Arnold Foundation and the US National Science Foundation (DEB: 1548207). The Center for Open Science hosted the meeting and provided valuable logistical support.

Literature Cited

1. Cassey, P., J. G. Ewen, T. M. Blackburn, and A. P. Møller. 2004. A survey of publication bias within evolutionary ecology. Proceedings of the Royal Society of London. Series B, Biological Sciences 271: S451–S454.

2. Csada, R. D., P. C. James, and H. M. E. Richard. 1996. The “file drawer problem” of non-significant results: Does it apply to biological research? Oikos 76: 591–593.

3. Fanelli, D. 2010. “Positive” results increase down the hierarchy of the sciences. PLoS One 5: e10068.

4. Ferreira, V., B. Castagneyrol, J. Koricheva, V. Gulis, E. Chauvet, and M. A. S. Graça. 2015. A meta-analysis of the effects of nutrient enrichment on litter decomposition in streams. Biological Reviews of the Cambridge Philosophical Society 90: 669–688.

5. Fidler, F., M. A. Burgman, G. Cumming, R. Buttrose, and N. Thomason. 2006. Impact of criticism of null-hypothesis significance testing on statistical reporting practices in conservation biology. Conservation Biology 20: 1539–1544.

6. Holman, L., M. L. Head, R. Lanfear, and M. D. Jennions. 2015. Evidence of experimental bias in the life sciences: Why we need blind data recording. PLoS Biology 13: e1002190.

7. Kardish, M. R., U. G. Mueller, S. Amador-Vargas, E. I. Dietrich, R. Ma, B. Barrett, and C.-C. Fang. 2015. Blind trust in unblinded observation in ecology, evolution and behavior. Frontiers in Ecology and Evolution 3: 51.

8. Kidwell, M. C., L. B. Lazarević, E. Baranski, T. E. Hardwicke, S. Piechowski, L. S. Falkenberg, C. Kennett, A. Slowik, C. Sonnleitner, C. Hess-Holden, T. M. Errington, S. Fiedler, and B. A. Nosek. 2016. Badges to acknowledge open practices: A simple, low-cost, effective method for increasing transparency. PLoS Biology 14: e1002456.

9. Koricheva, J., J. Gurevitch, and K. Mengersen. 2013. The Handbook of Meta-analysis in Ecology and Evolution. Princeton, New Jersey: Princeton University Press.

10. Mislan, K. A. S., J. M. Heer, and E. P. White. 2016. Elevating the status of code in ecology. Trends in Ecology & Evolution 31: 4–7.

11. Møller, A. P. and M. D. Jennions. 2002. How much variance can be explained by ecologists and evolutionary biologists? Oecologia 132: 492–500.

12. Nakagawa, S. and T. H. Parker. 2015. Replicating research in ecology and evolution: Feasibility, incentives, and the cost-benefit conundrum. BMC Biology 13: 88.

13. Nosek, B. A., G. Alter, G. C. Banks, D. Borsboom, S. D. Bowman, S. J. Breckler, S. Buck, C. D. Chambers, G. Chin, G. Christensen, M. Contestabile, A. Dafoe, E. Eich, J. Freese, R. Glennerster, D. Goroff, D. P. Green, B. Hesse, M. Humphreys, J. Ishiyama, D. Karlan, A. Kraut, A. Lupia, P. Mabry, T. Madon, N. Malhotra, E. Mayo-Wilson, M. McNutt, E. Miguel, E. L. Paluck, U. Simonsohn, C. Soderberg, B. A. Spellman, J. Turitto, G. VandenBos, S. Vazire, E. J. Wagenmakers, R. Wilson, and T. Yarkoni. 2015. Promoting an open research culture. Science 348: 1422–1425.

14. Parker, T. H. 2013. What do we really know about the signalling role of plumage colour in blue tits? A case study of impediments to progress in evolutionary biology. Biological Reviews of the Cambridge Philosophical Society 88: 511–536.

15. Rosenthal, R. 1979. The “file drawer problem” and tolerance for null results. Psychological Bulletin 86: 638–641.

16. Tenopir, C., S. Allard, K. Douglass, A. U. Aydinoglu, L. Wu, E. Read, M. Manoff, and M. Frame. 2011. Data sharing by scientists: Practices and perceptions. PLoS One 6: e21101.

17. van Wilgenburg, E. and M. A. Elgar. 2013. Confirmation bias in studies of nestmate recognition: A cautionary note for research into the behaviour of animals. PLoS One 8: e53548.

18. Whitlock, M. C. 2011. Data archiving in ecology and evolution: Best practices. Trends in Ecology & Evolution 26: 61–65.

19. Whitlock, M. C., M. A. McPeek, M. D. Rausher, L. Rieseberg, and A. J. Moore. 2010. Data archiving. American Naturalist 175: 145–146.

20. Zhang, Y., H. Y. H. Chen, and P. B. Reich. 2012. Forest productivity increases with evenness, species richness and trait variation: A global meta-analysis. Journal of Ecology 100: 742–749.
© Copyright 2016 by the American Society of Plant Taxonomists
James F. Smith, T. H. Parker, S. Nakagawa, J. Gurevitch, and TTEE (Tools for Transparency in Ecology and Evolution) Working Group "Promoting Transparency in Evolutionary Biology and Ecology," Systematic Botany 41(3), 495-497, (28 July 2016). https://doi.org/10.1600/036364416X692262