From MFKP_wiki


Selection: with tag reproducible-research [100 articles] 

 

Software simplified

  
Nature, Vol. 546, No. 7656. (29 May 2017), pp. 173-174, https://doi.org/10.1038/546173a

Abstract

Containerization technology takes the hassle out of setting up software and can boost the reproducibility of data-driven research. [Excerpt] [...] Containers are essentially lightweight, configurable virtual machines — simulated versions of an operating system and its hardware, which allow software developers to share their computational environments. Researchers use them to distribute complicated scientific software systems, thereby allowing others to execute the software under the same conditions that its original developers used. In doing so, containers can remove one source of variability in ...

 

Willingness to share research data is related to the strength of the evidence and the quality of reporting of statistical results

  
PLOS ONE, Vol. 6, No. 11. (2 November 2011), e26828, https://doi.org/10.1371/journal.pone.0026828

Abstract

The widespread reluctance to share published research data is often hypothesized to be due to the authors' fear that reanalysis may expose errors in their work or may produce conclusions that contradict their own. However, these hypotheses have not previously been studied systematically. We related the reluctance to share research data for reanalysis to 1148 statistically significant results reported in 49 papers published in two major psychology journals. We found the reluctance to share data to be associated with weaker evidence ...

 

Ten simple rules for making research software more robust

  
PLOS Computational Biology, Vol. 13, No. 4. (13 April 2017), e1005412, https://doi.org/10.1371/journal.pcbi.1005412

Abstract

[Abstract] Software produced for research, published and otherwise, suffers from a number of common problems that make it difficult or impossible to run outside the original institution or even off the primary developer’s computer. We present ten simple rules to make such software robust enough to be run by anyone, anywhere, and thereby delight your users and collaborators. [Author summary] Many researchers have found out the hard way that there’s a world of difference between “works for me on my machine” and “works for ...

 

A manifesto for reproducible science

  
Nature Human Behaviour, Vol. 1, No. 1. (10 January 2017), 0021, https://doi.org/10.1038/s41562-016-0021

Abstract

Improving the reliability and efficiency of scientific research will increase the credibility of the published scientific literature and accelerate discovery. Here we argue for the adoption of measures to optimize key elements of the scientific process: methods, reporting and dissemination, reproducibility, evaluation and incentives. There is some evidence from both simulations and empirical studies supporting the likely effectiveness of these measures, but their broad adoption by researchers, institutions, funders and journals will require iterative evaluation and improvement. We discuss the goals ...

 

Position paper for the endorsement of Free Software and Open Standards in Horizon 2020 and all publicly-funded research

  
In Free Software Foundation Europe (January 2017)

Abstract

The Free Software Foundation Europe (FSFE) is a charity that empowers users to control technology by advocating for Free Software. In a digital world, Free Software is the foundation of Open Knowledge, Open Innovation and Open Science. [\n] Software is an integral part of today’s society. Our daily interactions, transactions, education, communication channels, work and life environments rely heavily on software. "Free Software" refers to all programs distributed under terms and licences that allow users to run the software for any purpose, ...

 

Running an open experiment: transparency and reproducibility in soil and ecosystem science

  
Environmental Research Letters, Vol. 11, No. 8. (01 August 2016), 084004, https://doi.org/10.1088/1748-9326/11/8/084004

Abstract

Researchers in soil and ecosystem science, and almost every other field, are being pushed—by funders, journals, governments, and their peers—to increase transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists however lack experience in, and may be unsure of the benefits ...

 

Five selfish reasons to work reproducibly

  
Genome Biology, Vol. 16, No. 1. (8 December 2015), 274, https://doi.org/10.1186/s13059-015-0850-7

Abstract

And so, my fellow scientists: ask not what you can do for reproducibility; ask what reproducibility can do for you! Here, I present five reasons why working reproducibly pays off in the long run and is in the self-interest of every ambitious, career-oriented scientist. [Excerpt] [::Reproducibility: what's in it for me?] In this article, I present five reasons why working reproducibly pays off in the long run and is in the self-interest of every ambitious, career-oriented scientist. [::] Reason number 1: reproducibility helps to avoid ...

 

Enhancing reproducibility for computational methods

  
Science, Vol. 354, No. 6317. (09 December 2016), pp. 1240-1241, https://doi.org/10.1126/science.aah6168

Abstract

Over the past two decades, computational methods have radically changed the ability of researchers from all areas of scholarship to process and analyze data and to simulate complex systems. But with these advances come challenges that are contributing to broader concerns over irreproducibility in the scholarly literature, among them the lack of transparency in disclosure of computational methods. Current reporting methods are often uneven, incomplete, and still evolving. We present a novel set of Reproducibility Enhancement Principles (REP) targeting disclosure challenges ...

 

Research software sustainability: report on a knowledge exchange workshop

  
(February 2016)

Abstract

[Excerpt: Executive summary] Without software, modern research would not be possible. Understandably, people tend to marvel at results rather than the tools used in their discovery, which means the fundamental role of software in research has been largely overlooked. But whether it is widely recognised or not, research is inexorably connected to the software that is used to generate results, and if we continue to overlook software we put at risk the reliability and reproducibility of the research itself. [\n] The adoption of software is accompanied by new risks - many of ...

 

Social software

  
Nature Methods, Vol. 4, No. 3. (01 March 2007), pp. 189-189, https://doi.org/10.1038/nmeth0307-189

Abstract

Software that is custom-developed as part of novel methods is as important for the method's implementation as reagents and protocols. Such software, or the underlying algorithms, must be made available to readers upon publication. [Excerpt] "An inherent principle of publication is that others should be able to replicate and build upon the authors' published claims. Therefore, a condition of publication in a Nature journal is that authors are required to make materials, data and associated protocols available to readers promptly on request." ...

 

Why linked data is not enough for scientists

  
Future Generation Computer Systems, Vol. 29, No. 2. (February 2013), pp. 599-611, https://doi.org/10.1016/j.future.2011.08.004

Abstract

[Abstract] Scientific data represents a significant portion of the linked open data cloud and scientists stand to benefit from the data fusion capability this will afford. Publishing linked data into the cloud, however, does not ensure the required reusability. Publishing has requirements of provenance, quality, credit, attribution and methods to provide the reproducibility that enables validation of results. In this paper we make the case for a scientific data publication model on top of linked data and introduce the notion of Research ...

 

Scientists behaving badly

  
Nature, Vol. 435, No. 7043. (9 June 2005), pp. 737-738, https://doi.org/10.1038/435737a

Abstract

To protect the integrity of science, we must look beyond falsification, fabrication and plagiarism, to a wider range of questionable research practices, argue Brian C. Martinson, Melissa S. Anderson and Raymond de Vries. [\n] Serious misbehaviour in research is important for many reasons, not least because it damages the reputation of, and undermines public support for, science. Historically, professionals and the public have focused on headline-grabbing cases of scientific misconduct, but we believe that researchers can no longer afford to ignore ...

 

The natural selection of bad science

  
Royal Society Open Science, Vol. 3, No. 9. (01 September 2016), 160384, https://doi.org/10.1098/rsos.160384

Abstract

Poor research design and data analysis encourage false-positive findings. Such poor methods persist despite perennial calls for improvement, suggesting that they result from something more than just misunderstanding. The persistence of poor methods results partly from incentives that favour them, leading to the natural selection of bad science. This dynamic requires no conscious strategizing—no deliberate cheating nor loafing—by scientists, only that publication is a principal factor for career advancement. Some normative methods of analysis have almost certainly been selected to further ...

 

ePiX tutorial and reference manual

  
(2008)

Abstract

[Excerpt: Introduction] ePiX, a collection of batch utilities, creates mathematically accurate figures, plots, and animations containing LaTeX typography. The input syntax is easy to learn, and the user interface resembles that of LaTeX itself: You prepare a scene description in a text editor, then “compile” the input file into a picture. LaTeX- and web-compatible output types include a LaTeX picture-like environment written with PSTricks, tikz, or eepic macros; vector images (eps, ps, and pdf); and bitmapped images and movies (png, mng, and gif). [\n] ePiX’s strengths include: [::] Quality of ...

 

The hard road to reproducibility

  
Science, Vol. 354, No. 6308. (07 October 2016), pp. 142-142, https://doi.org/10.1126/science.354.6308.142

Abstract

[Excerpt] [...] A couple of years ago, we published a paper applying computational fluid dynamics to the aerodynamics of flying snakes. More recently, I asked a new student to replicate the findings of that paper, both as a training opportunity and to help us choose which code to use in future research. Replicating a published study is always difficult—there are just so many conditions that need to be matched and details that can't be overlooked—but I thought this case was relatively straightforward. ...

 

Why scientists must share their research code

  
Nature (13 September 2016), https://doi.org/10.1038/nature.2016.20504

Abstract

'Reproducibility editor' Victoria Stodden explains the growing movement to make code and data available to others. [Excerpt] [...] [::What does computational reproducibility mean?] It means that all details of computation — code and data — are made routinely available to others. If I can run your code on your data, then I can understand what you did. We need to expose all the steps that went into any discovery that relies on a computer. [::What’s the scientific value of running the same data with the ...

 

Transparency in ecology and evolution: real problems, real solutions

  
Trends in Ecology & Evolution, Vol. 31, No. 9. (September 2016), pp. 711-719, https://doi.org/10.1016/j.tree.2016.07.002

Abstract

To make progress scientists need to know what other researchers have found and how they found it. However, transparency is often insufficient across much of ecology and evolution. Researchers often fail to report results and methods in detail sufficient to permit interpretation and meta-analysis, and many results go entirely unreported. Further, these unreported results are often a biased subset. Thus the conclusions we can draw from the published literature are themselves often biased and sometimes might be entirely incorrect. Fortunately there ...

 

Stop ignoring misconduct

  
Nature, Vol. 537, No. 7618. (1 September 2016), pp. 29-30, https://doi.org/10.1038/537029a

Abstract

Efforts to reduce irreproducibility in research must also tackle the temptation to cheat, argue Donald S. Kornfeld and Sandra L. Titus. [Excerpt: Preventing misconduct] To diminish the threat that misconduct poses to science, scientists and society: [::] Authorities should acknowledge that deliberate misconduct is an important contributor to irreproducibility. [::] Mentors should be evaluated to assure quality; those who contribute to misconduct should be penalized. [::] Institutions and government agencies should have procedures to protect whistle-blowers from retaliation. [::] Senior faculty members who are found guilty of ...

 

Filesystem Hierarchy Standard

  
(2015)

Abstract

This standard consists of a set of requirements and guidelines for file and directory placement under UNIX-like operating systems. The guidelines are intended to support interoperability of applications, system administration tools, development tools, and scripts as well as greater uniformity of documentation for these systems. ...

 

1,500 scientists lift the lid on reproducibility

  
Nature, Vol. 533, No. 7604. (25 May 2016), pp. 452-454, https://doi.org/10.1038/533452a

Abstract

Survey sheds light on the ‘crisis’ rocking research. [Excerpt] More than 70% of researchers have tried and failed to reproduce another scientist's experiments, and more than half have failed to reproduce their own experiments. Those are some of the telling figures that emerged from Nature's survey of 1,576 researchers who took a brief online questionnaire on reproducibility in research. [\n] The data reveal sometimes-contradictory attitudes towards reproducibility. Although 52% of those surveyed agree that there is a significant 'crisis' of reproducibility, less than ...

 

Gotchas in writing Dockerfile

  
(2014)

Abstract

[Excerpt: Why do we need to use Dockerfile?] Dockerfile is not yet-another shell. Dockerfile has its special mission: automation of Docker image creation. [\n] Once, you write build instructions into Dockerfile, you can build the same image just with docker build command. [\n] Dockerfile is also useful to tell the knowledge of what a job the container does to somebody else. Your teammates can tell what the container is supposed to do just by reading Dockerfile. They don’t need to know login to the ...
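One of the most frequently cited gotchas of this kind is the stale package cache: if `apt-get update` sits in its own `RUN` instruction, Docker may reuse its cached layer while the subsequent install step changes, yielding outdated or broken installs. A minimal sketch of the recommended pattern (the base image tag, package, and file names here are illustrative assumptions, not taken from the cited post):

```dockerfile
# Pin the base image to a specific tag instead of a floating "latest"
FROM ubuntu:22.04

# Combine update and install in ONE RUN instruction, so the package index
# is never reused from a stale cached layer; clean the apt lists afterwards
# to keep the image small.
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 \
    && rm -rf /var/lib/apt/lists/*

# Illustrative payload: copy the analysis script and run it by default
COPY analysis.py /opt/analysis.py
CMD ["python3", "/opt/analysis.py"]
```

The same image can then be rebuilt identically by anyone with `docker build`.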

 

An introduction to Docker for reproducible research, with examples from the R environment

  
ACM SIGOPS Operating Systems Review, Vol. 49, No. 1. (2 Oct 2014), pp. 71-79, https://doi.org/10.1145/2723872.2723882

Abstract

As computational work becomes more and more integral to many aspects of scientific research, computational reproducibility has become an issue of increasing importance to computer systems researchers and domain scientists alike. Though computational reproducibility seems more straightforward than replicating physical experiments, the complex and rapidly changing nature of computer environments makes being able to reproduce and extend such work a serious challenge. In this paper, I explore common reasons that code developed for one research project cannot be successfully executed or extended by subsequent researchers. I review current ...
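For the R environment specifically, this approach is commonly realized through the Rocker project's versioned base images. A hedged sketch (the image tag, package, and version pins below are illustrative assumptions, not from the paper itself):

```dockerfile
# Start from a Rocker image pinned to an exact R release,
# so collaborators rebuild against the same interpreter.
FROM rocker/r-ver:4.3.1

# Pin dependency versions explicitly; 'remotes' and the dplyr
# version shown here are placeholders for the project's real needs.
RUN R -e "install.packages('remotes'); remotes::install_version('dplyr', version = '1.1.2')"

# Copy the analysis script into the image and run it by default
COPY analysis.R /home/analysis.R
CMD ["Rscript", "/home/analysis.R"]
```

Freezing both the R version and the package versions in the image is what lets a later researcher execute the code under conditions matching the original run.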

 

Using Docker to support reproducible research

  

Abstract

Reproducible research is a growing movement among scientists, but the tools for creating sustainable software to support the computational side of research are still in their infancy and are typically only being used by scientists with expertise in com- puter programming and system administration. Docker is a new platform developed for the DevOps community that enables the easy creation and management of consistent computational environments. This article describes how we have applied it to computational science and suggests that it could ...

 

Reality check on reproducibility

  
Nature, Vol. 533, No. 7604. (25 May 2016), pp. 437-437, https://doi.org/10.1038/533437a

Abstract

A survey of Nature readers revealed a high level of concern about the problem of irreproducible results. Researchers, funders and journals need to work together to make research more reliable. [Excerpt] Is there a reproducibility crisis in science? Yes, according to the readers of Nature. Two-thirds of researchers who responded to a survey by this journal said that current levels of reproducibility are a major problem. [\n] [...] [\n] What does ‘reproducibility’ mean? Those who study the science of science joke that the definition ...

 

Badges to acknowledge open practices: a simple, low-cost, effective method for increasing transparency

  
PLoS Biology, Vol. 14, No. 5. (12 May 2016), e1002456, https://doi.org/10.1371/journal.pbio.1002456

Abstract

Beginning January 2014, Psychological Science gave authors the opportunity to signal open data and materials if they qualified for badges that accompanied published articles. Before badges, less than 3% of Psychological Science articles reported open data. After badges, 23% reported open data, with an accelerating trend; 39% reported open data in the first half of 2015, an increase of more than an order of magnitude from baseline. There was no change over time in the low rates of data sharing among ...

 

Promoting research resource identification at JCN

  
Journal of Comparative Neurology, Vol. 522, No. 8. (01 June 2014), pp. 1707-1707, https://doi.org/10.1002/cne.23585

Abstract

[Excerpt] [\n] [...] [\n] The attention of scientists, editors, and policymakers alike has turned recently to the issue of reproducibility in scientific research, focusing on research spanning from the pharmaceutical industry (Begley and Ellis, 2012) to the highest levels of government (Collins and Tabak, 2014; see also McNutt, 2014). While these commentaries point out that scientific misconduct is quite rare, they do point to a confluence of factors that hinder the reproducibility of scientific findings, including the identification of key reagents, such ...

 

Equations and FORTRAN program for the Canadian Forest Fire Weather Index System

  
Vol. 33 (1985)

Abstract

Improved official equations are presented for the 1984 version of the Canadian Forest Fire Weather Index System. The most recent mathematical refinements serve to further rationalize the Fine Fuel Moisture Code and render it more compatible with other developments in the Canadian Forest Fire Danger Rating System. The effect of these changes is so slight that no problems are anticipated in converting from the previous version to this new one. Also given is a FORTRAN program intended as a standard for ...

 

The Resource Identification Initiative: a cultural shift in publishing

  
Neuroinformatics, Vol. 14, No. 2. (2016), pp. 169-182, https://doi.org/10.1007/s12021-015-9284-3

Abstract

A central tenet in support of research reproducibility is the ability to uniquely identify research resources, i.e., reagents, tools, and materials that are used to perform experiments. However, current reporting practices for research resources are insufficient to identify the exact resources that are reported or to answer basic questions such as “How did other studies use resource X?” To address this issue, the Resource Identification Initiative was launched as a pilot project to improve the reporting standards for research resources in ...

 

(INRMM-MiD internal record) List of keywords of the INRMM meta-information database - part 30

  
(February 2014)
Keywords: inrmm-list-of-tags   receptivity   record-to-update-or-delete   red-list   redd   redistributable-scientific-information   reference-manual   reforestation   refugia   regeneration   regional-climate   regional-climate-models   regional-scale   regression   regression-tree-analysis   regulating-services   reinforcement   reinforcement-learning   reinventing-weels   reiteration   relative-distance-similarity   relative-distance-similarity-ancillary   remote-sensing   renewable-energy   renewable-energy-directive   repeatability   repellent-species   replicability   reporting   representative-concentration-pathways   reproducibility   reproducible-research   reproduction   reproductive-effort   resampling   research-funding   research-funding-vs-public-outcome   research-management   research-metrics   research-team-size   reservoir-management   reservoir-services   resilience   resin   resistance   resources-exploitation   respiration   restoration   resurvey-of-semi-permanent   retraction   review   review-publication   review-scopus-european-biodiversity-indicators   revision-control-system   rewarding-best-research-practices   rhamnus-cathartica   rhamnus-catharticus   rhamnus-frangula   rhamnus-saxatilis   rhamnus-spp   rhizophora-apiculata   rhizophora-mangle   rhododendron   rhododendron-arboreum   rhododendron-ferrugineum   rhododendron-periclymenoides   rhododendron-ponticum   rhododendron-spp   rhododendron-viscosum   rhopalicus-tutela   rhus-spp   rhus-typhina   rhyacionia-buoliana   rhyacionia-frustrana   rhyssa-persuasoria   rhytisma   ribes-alpinum   ribes-rubrum   ribes-uva-crispa   ring-analysis   ring-width-chronologies   ringspot-virus   riparian-ecosystem   riparian-forest   riparian-zones   risk-analysis   risk-assessment   risk-reduction   river-flow   river-networks   river-restoration   roads   robert-hooke   robinia-pseudoacacia   robinia-spp   robust-modelling   rockfalls   rodent   romania   root-deterioration  

Abstract

List of indexed keywords within the transdisciplinary set of domains which relate to the Integrated Natural Resources Modelling and Management (INRMM). In particular, the list of keywords maps the semantic tags in the INRMM Meta-information Database (INRMM-MiD). [\n] The INRMM-MiD records providing this list are accessible by the special tag: inrmm-list-of-tags ( http://mfkp.org/INRMM/tag/inrmm-list-of-tags ). ...

 

European atlas of forest tree species

  
Keywords: bioeconomy   chorology   classification   climate   constrained-spatial-multi-frequency-analysis   data-heterogeneity   data-integration   data-uncertainty   disasters   disturbances   ecological-zones   ecology   ecosystem-services   europe   floods   forest-fires   forest-pests   forest-resources   free-software   geospatial   geospatial-semantic-array-programming   gis   gnu-bash   gnu-linux   gnu-octave   habitat-suitability   integrated-modelling   integrated-natural-resources-modelling-and-management   integration-techniques   knowledge-integration   landslides   mastrave-modelling-library   modelling-uncertainty   open-data   paleoecology   relative-distance-similarity   reproducible-research   review   science-policy-interface   science-society-interface   semantic-array-programming   semantic-constraints   semantics   semap   software-uncertainty   soil-erosion   soil-resources   species-distribution   tree-species   uncertainty   water-resources   windstorm  

Abstract

[Excerpt] The European Atlas of Forest Tree Species is the first comprehensive publication of such a unique and essential environmental resource, that is, our trees. Leading scientists and forestry professionals have contributed in the many stages of the production of this atlas, through the collection of ground data on the location of tree species, elaboration of the distribution and suitability maps, production of the photographic material and compilation of the different chapters. The European Atlas of Forest Tree Species is both ...

 

Reproducibility: a tragedy of errors

  
Nature, Vol. 530, No. 7588. (3 February 2016), pp. 27-29, https://doi.org/10.1038/530027a

Abstract

Mistakes in peer-reviewed papers are easy to find but hard to fix, report David B. Allison and colleagues. [Excerpt: Three common errors] As the influential twentieth-century statistician Ronald Fisher (pictured) said: “To consult the statistician after an experiment is finished is often merely to ask him to conduct a post mortem examination. He can perhaps say what the experiment died of.” [\n] [...] Frequent errors, once recognized, can be kept out of the literature with targeted education and policies. Three of the most common are ...

 

The integration of land change modeling framework FUTURES into GRASS GIS 7

  
In Free and Open Source Software for Geospatial - Open innovation for Europe, Vol. 12 (2015), pp. 21-24

Abstract

Many valuable models and tools developed by scientists are often inaccessible to their potential users because of non-existent sharing infrastructure or lack of documentation. Case in point is the FUTure Urban-Regional Environment Simulation (FUTURES), a patch-based land change model for generating scenario-based regional forecasts of urban growth pattern. Despite a high- impact publication, few scientists, planners, or policy makers have adopted FUTURES due to complexity in use and lack of direct access. We seek to address these issues by integrating FUTURES into GRASS GIS, a free and open source ...

 

Raising the bar for reproducible science at the U.S. Environmental Protection Agency Office of Research and Development

  
Toxicological Sciences, Vol. 145, No. 1. (01 May 2015), pp. 16-22, https://doi.org/10.1093/toxsci/kfv020

Abstract

Considerable concern has been raised regarding research reproducibility both within and outside the scientific community. Several factors possibly contribute to a lack of reproducibility, including a failure to adequately employ statistical considerations during study design, bias in sample selection or subject recruitment, errors in developing data inclusion/exclusion criteria, and flawed statistical analysis. To address some of these issues, several publishers have developed checklists that authors must complete. Others have either enhanced statistical expertise on existing editorial boards, or formed distinct statistics ...

 

Statistics: P values are just the tip of the iceberg

  
Nature, Vol. 520, No. 7549. (28 April 2015), pp. 612-612, https://doi.org/10.1038/520612a

Abstract

Ridding science of shoddy statistics will require scrutiny of every step, not merely the last one, say Jeffrey T. Leek and Roger D. Peng. [Excerpt] There is no statistic more maligned than the P value. Hundreds of papers and blogposts have been written about what some statisticians deride as 'null hypothesis significance testing' (NHST; see, for example, go.nature.com/pfvgqe). NHST deems whether the results of a data analysis are important on the basis of whether a summary statistic (such as a P value) ...

 

Interactive comment (reply to Anonymous Referee 3) on Modelling soil erosion at European scale: towards harmonization and reproducibility - by Bosco et al

  
Natural Hazards and Earth System Sciences Discussions, Vol. 2 (2014), pp. C1786-C1795, https://doi.org/10.6084/m9.figshare.1379902

Abstract

Throughout the public discussion of our article Bosco et al. (Nat. Hazards Earth Syst. Sci. Discuss., 2, 2639-2680, 2014), the Anonymous Referee 3 provided (Nat. Hazards Earth Syst. Sci. Discuss., 2, C1592-C1594, 2014) a variety of insights. This work presents our replies to them. ...

 

Interactive comment (reply to Dino Torri) on Modelling soil erosion at European scale: towards harmonization and reproducibility - by Bosco et al

  
Natural Hazards and Earth System Sciences Discussions, Vol. 2 (2014), pp. C671-C688, https://doi.org/10.6084/m9.figshare.1379901

Abstract

During the public discussion of our article Bosco et al. (Nat. Hazards Earth Syst. Sci. Discuss., 2, 2639-2680, 2014), D. Torri provided numerous insights (Nat. Hazards Earth Syst. Sci. Discuss. 2, C528-C532, 2014). This work offers our replies to them. ...

 

Top tips to make your research irreproducible

  
(8 Apr 2015)

Abstract

It is an unfortunate convention of science that research should pretend to be reproducible; our top tips will help you mitigate this fussy conventionality, enabling you to enthusiastically showcase your irreproducible work. [Excerpt] [...] Irreproducibility is the default setting for all of science, and irreproducible research is particularly common across the computational sciences. [...] By following our starter tips, you can ensure that if your work is wrong, nobody will be able to check it; if it is correct, you will make everyone else do disproportionately ...

 

Nine simple ways to make it easier to (re)use your data

  
Ideas in Ecology and Evolution, Vol. 6, No. 2. (2013), https://doi.org/10.4033/iee.2013.6b.6.f

Abstract

Sharing data is increasingly considered to be an important part of the scientific process. Making your data publicly available allows original results to be reproduced and new analyses to be conducted. While sharing your data is the first step in allowing reuse, it is also important that the data be easy to understand and use. We describe nine simple ways to make it easy to reuse the data that you share and also make it easier to work with it yourself. ...
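Two of the paper's general recommendations — share data in a plain-text rectangular format, and document each column in a machine-readable data dictionary distributed alongside it — can be sketched as follows (all file contents, column names, and values here are invented for illustration):

```python
import csv
import io

# Illustrative observation records: a tidy, rectangular layout with one
# row per observation and one column per variable.
observations = [
    {"site": "A", "date": "2013-06-01", "species": "Quercus robur", "count": 12},
    {"site": "B", "date": "2013-06-01", "species": "Fagus sylvatica", "count": 7},
]

# A machine-readable data dictionary describing every column of the
# observations table; it is shared as a second plain-text file.
data_dictionary = [
    {"column": "site", "type": "text", "description": "Site identifier"},
    {"column": "date", "type": "ISO 8601 date", "description": "Survey date"},
    {"column": "species", "type": "text", "description": "Latin binomial"},
    {"column": "count", "type": "integer", "description": "Individuals observed"},
]

def to_csv(rows):
    """Serialize a list of uniform dicts to CSV text with a header row."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

print(to_csv(observations))
print(to_csv(data_dictionary))
```

Using ISO 8601 dates, consistent identifiers, and a header row makes the file directly loadable by standard tools without undocumented conventions.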

 

What is the question?

  
Science, Vol. 347, No. 6228. (20 March 2015), pp. 1314-1315, https://doi.org/10.1126/science.aaa6146

Abstract

Over the past 2 years, increased focus on statistical analysis brought on by the era of big data has pushed the issue of reproducibility out of the pages of academic journals and into the popular consciousness (1). Just weeks ago, a paper about the relationship between tissue-specific cancer incidence and stem cell divisions (2) was widely misreported because of misunderstandings about the primary statistical argument in the paper (3). Public pressure has contributed to the massive recent adoption of reproducible research ...

 

Reproducibility in ecological research

  

Abstract

[Excerpt] The editorial by M. McNutt (“Journals unite for reproducibility,” 7 November, p. 679, published online 5 November) describes an updated version of the solution from journals, including Science and Nature, for reproducibility in biomedical research. If the new policy is to be widely implemented by scientific journals, then the changes must be consistent and mandatory. Reproducibility is not just relevant for biomedical research. Ecology and biodiversity scientists are also increasingly concerned about issues of reproducibility and data sharing (1–3). Reproducibility ...

 

Reproducible research can still be wrong: adopting a prevention approach

  
Proceedings of the National Academy of Sciences, Vol. 112, No. 6. (11 February 2015), pp. 1645-1646, https://doi.org/10.1073/pnas.1421412111

Abstract

[Excerpt] Reproducibility—the ability to recompute results—and replicability—the chances other experimenters will achieve a consistent result—are two foundational characteristics of successful scientific research. Consistent findings from independent investigators are the primary means by which scientific evidence accumulates for or against a hypothesis. Yet, of late, there has been a crisis of confidence among researchers worried about the rate at which studies are either reproducible or replicable. To maintain the integrity of science research and the public’s trust in science, the scientific community ...

 

Modelling soil erosion at European scale: towards harmonization and reproducibility

  
Natural Hazards and Earth System Science, Vol. 15, No. 2. (4 February 2015), pp. 225-245, https://doi.org/10.5194/nhess-15-225-2015

Abstract

Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water-holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite the efforts, the ...

 

Data, eternal

  
Science, Vol. 347, No. 6217. (02 January 2015), pp. 7-7, https://doi.org/10.1126/science.aaa5057

Abstract

[Excerpt] During 2014, Science worked with members of the research community, other publishers, and representatives of funding agencies on many initiatives to increase transparency and promote reproducibility in the published research literature. Those efforts will continue in 2015. Connected to that progress, and an essential element to its success, an additional focus will be on making data more open, easier to access, more discoverable, and more thoroughly documented. My own commitment to these goals is deeply held, for I learned early in ...

 

Minimal make - A minimal tutorial on make

  
(2014)

Abstract

[Excerpt] I would argue that the most important tool for reproducible research is not Sweave or knitr but GNU make. Consider, for example, all of the files associated with a manuscript. In the simplest case, I would have an R script for each figure plus a LaTeX file for the main text. And then a BibTeX file for the references. Compiling the final PDF is a bit of work: [::] Run each R script through R to produce the relevant figure. [::] Run latex and ...
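The workflow sketched in the excerpt (one R script per figure, a LaTeX file for the main text, a BibTeX file for the references) can be captured in a short Makefile. The following is an illustrative sketch, not the tutorial's own code: the file names (paper.tex, refs.bib, fig1.R, fig2.R) are hypothetical placeholders, and it assumes each R script writes the correspondingly named PDF figure.

```make
# Hypothetical example files; substitute your own manuscript sources.
FIGS = fig1.pdf fig2.pdf

# Rebuild the manuscript whenever the text, references, or figures change.
# latex/bibtex are run repeatedly to resolve citations and cross-references.
paper.pdf: paper.tex refs.bib $(FIGS)
	pdflatex paper
	bibtex paper
	pdflatex paper
	pdflatex paper

# Pattern rule: each figure PDF depends on the R script of the same name;
# $< is the automatic variable holding that prerequisite (e.g. fig1.R).
%.pdf: %.R
	R CMD BATCH $<

clean:
	rm -f $(FIGS) paper.pdf *.aux *.bbl *.blg *.log *.Rout
```

Running `make` then re-executes only the steps whose inputs have changed, which is precisely the reproducibility benefit the excerpt argues for.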

 

Facilitating reproducibility in scientific computing: principles and practice

  
In Reproducibility: Principles, Problems, Practices (2015)

Abstract

The foundation of scientific research is theory and experiment, carefully documented in open publications, in part so that other researchers can reproduce and validate the claimed findings. Unfortunately, the field of scientific and mathematical computing has evolved in ways that often do not meet these high standards. In published computational work, frequently there is no record of the workflow process that produced the published computational results, and in some cases, even the code is missing or has been changed significantly ...

 

Knowledge Freedom in computational science: a two stage peer-review process with KF eligibility access review

  

Abstract

Wide scale transdisciplinary modelling (WSTM) growingly demands a focus on reproducible research and scientific knowledge freedom. Data and software freedom are essential aspects of knowledge freedom in computational science. Therefore, ideally published articles should also provide the readers with the data and source code of the described mathematical modelling. To maximise transparency, replicability, reproducibility and reusability, published data should be made available as open data while source code should be made available as free software. Here, a two-stage peer review process ...

 

Journals unite for reproducibility

  
Nature, Vol. 515, No. 7525. (5 November 2014), pp. 7-7, https://doi.org/10.1038/515007a

Abstract

[Excerpt] Consensus on reporting principles aims to improve quality control in biomedical research and encourage public trust in science. Reproducibility, rigour, transparency and independent verification are cornerstones of the scientific method. Of course, just because a result is reproducible does not make it right, and just because it is not reproducible does not make it wrong. A transparent and rigorous approach, however, will almost always shine a light on issues of reproducibility. This light ensures that science moves forward, through independent verifications ...

 

An average soil erosion rate for Europe: Myth or reality?

  
Journal of Soil and Water Conservation, Vol. 53, No. 1. (01 January 1998), pp. 46-50

Abstract

A recent proposal for an average erosion rate for Europe is challenged. The proposed figure is shown to be derived by a tortuous route from plot experiments in Belgium. The original researcher, A. Bollinne, was careful not to extrapolate his results; subsequent workers have not been so circumspect. Further, the concept of an average rate for any continental-size area—Europe or North America—is unsound, because rates vary in time and space. There is also a shortage of reliable data for anything ...

 

A swan in the making

  
Science, Vol. 345, No. 6199. (22 August 2014), pp. 855-855, https://doi.org/10.1126/science.1259740

Abstract

Reproducibility is the ugly duckling of science. It provokes distress, denial, and passionate calls for action. With $1.5 trillion spent globally each year on R&D,* the idea that 80% of it is irreproducible† can cause downright dread. It threatens the foundations and credibility of the scientific enterprise. But look past the surface, and reproducibility may well be a swan in the making. ...

 

Announcement: reducing our irreproducibility

  
Nature, Vol. 496, No. 7446. (24 April 2013), pp. 398-398, https://doi.org/10.1038/496398a

Abstract

[Excerpt] Over the past year, Nature has published a string of articles that highlight failures in the reliability and reproducibility of published research (collected and freely available at go.nature.com/huhbyr). The problems arise in laboratories, but journals such as this one compound them when they fail to exert sufficient scrutiny over the results that they publish, and when they do not publish enough information for other researchers to assess results properly. From next month, Nature and the Nature research journals will introduce editorial ...

This page of the database may be cited as:
Integrated Natural Resources Modelling and Management - Meta-information Database. http://mfkp.org/INRMM/tag/reproducible-research



Meta-information Database (INRMM-MiD).
This database integrates a dedicated meta-information database in CiteULike (the CiteULike INRMM Group) with the meta-information available in Google Scholar, CrossRef and DataCite. The Altmetric database with Article-Level Metrics is also harvested. Part of the provided semantic (machine-readable) content is also made human-readable via the DCMI Dublin Core viewer. Digital preservation of the meta-information indexed within the INRMM-MiD publication records is provided by the Internet Archive.
The library of INRMM-related publications may be quickly accessed with the following links.
Full-text and abstracts of the publications indexed by the INRMM meta-information database are copyrighted by the respective publishers/authors and are subject to all applicable copyright protection. The conditions of use of each indexed publication are defined by its copyright owner. Please be aware that the indexed meta-information relies entirely on voluntary work and constitutes an incomplete, heterogeneous work-in-progress.
INRMM-MiD was experimentally established by the Maieutike Research Initiative in 2008 and has since been improved with the help of several volunteers (with a major technical upgrade in 2011). The current integrated interface has been operational since 2014.