From MFKP_wiki


Selection: with tag scientific-communication [at least 200 articles] 


Opinion: on being an advisor to today’s junior scientists

Proceedings of the National Academy of Sciences, Vol. 114, No. 21. (23 May 2017), pp. 5321-5323,


[Excerpt] Young scientists often have the same long-term goal: use one’s smarts and drive to gain insights into a problem of interest. Typically, these scientists draw upon a long-standing and time-tested scientific process: formulate a hypothesis, design experiments to test this hypothesis, collect data, interpret the data, revisit and modify the hypothesis, and so on. [\n] Unfortunately, the reality isn’t quite so straightforward. The hours are long and the rewards short. And the challenges for fledgling scientists seem to be growing. Attractive ...


Escape from the impact factor

Ethics in Science and Environmental Politics, Vol. 8, No. 1. (2008), pp. 5-7


As Editor-in-Chief of the journal Nature, I am concerned by the tendency within academic administrations to focus on a journal’s impact factor when judging the worth of scientific contributions by researchers, affecting promotions, recruitment and, in some countries, financial bonuses for each paper. Our own internal research demonstrates how a high journal impact factor can be the skewed result of many citations of a few papers rather than the average level of the majority, reducing its value as an objective measure ...


A concise review on the role of author self-citations in information science, bibliometrics and science policy

Scientometrics, Vol. 67, No. 2. (2006), pp. 263-277,


The objective of the present study is twofold: (1) to show the aims and means of quantitative interpretation of bibliographic features in bibliometrics and their re-interpretation in research policy, and (2) to summarise the state of the art in self-citation research. The authors describe three approaches to the role of author self-citations and possible conflicts arising from the different perspectives. From the bibliometric viewpoint we can conclude that there is no reason for condemning self-citations in general or for removing them from macro ...


A climate policy pathway for near- and long-term benefits

Science, Vol. 356, No. 6337. (05 May 2017), pp. 493-494,


The Paris Climate Agreement under the United Nations Framework Convention on Climate Change (UNFCCC) explicitly links the world's long-term climate and near-term sustainable development and poverty eradication agendas. Urgent action is needed, but there are many paths toward the agreement's long-term, end-of-century, 1.5° to 2°C climate target. We propose that reducing short-lived climate pollutants (SLCPs) enough to slow projected global warming by 0.5°C over the next 25 years be adopted as a near-term goal, with many potential benefits toward achieving Sustainable ...


Unmask temporal trade-offs in climate policy debates

Science, Vol. 356, No. 6337. (04 May 2017), pp. 492-493,


Global warming potentials (GWPs) have become an essential element of climate policy and are built into legal structures that regulate greenhouse gas emissions. This is in spite of a well-known shortcoming: GWP hides trade-offs between short- and long-term policy objectives inside a single time scale of 100 or 20 years (1). The most common form, GWP100, focuses on the climate impact of a pulse emission over 100 years, diluting near-term effects and misleadingly implying that short-lived climate pollutants exert forcings in ...


Building confidence in climate model projections: an analysis of inferences from fit

WIREs Clim Change, Vol. 8, No. 3. (1 May 2017), n/a,


Climate model projections are used to inform policy decisions and constitute a major focus of climate research. Confidence in climate projections relies on the adequacy of climate models for those projections. The question of how to argue for the adequacy of models for climate projections has not gotten sufficient attention in the climate modeling community. The most common way to evaluate a climate model is to assess in a quantitative way degrees of ‘model fit’; that is, how well model results ...


The importance of free and open source software and open standards in modern scientific publishing

Publications, Vol. 1, No. 2. (26 June 2013), pp. 49-55,


In this paper we outline the reasons why we believe a reliance on the use of proprietary computer software and proprietary file formats in scientific publication has negative implications for the conduct and reporting of science. There is increasing awareness and interest in the scientific community about the benefits offered by free and open source software. We discuss the present state of scientific publishing and the merits of advocating for a wider adoption of open standards in science, particularly where it ...


Good colour maps: how to design them

(12 Sep 2015)


Many colour maps provided by vendors have highly uneven perceptual contrast over their range. It is not uncommon for colour maps to have perceptual flat spots that can hide a feature as large as one tenth of the total data range. Colour maps may also have perceptual discontinuities that induce the appearance of false features. Previous work in the design of perceptually uniform colour maps has mostly failed to recognise that CIELAB space is only designed to be perceptually uniform at very low spatial frequencies. The most ...


Why we use bad color maps and what you can do about it

Electronic Imaging (February 2016), pp. 1-6,


We know the rainbow color map is terrible, and it is emphatically reviled by the visualization community, yet its use continues to persist. Why do we continue to use this perceptual encoding with so many known flaws? Instead of focusing on why we should not use rainbow colors, this position statement explores the rationale for why we do pick these colors despite their flaws. Often the decision is influenced by a lack of knowledge, but even experts that know better ...


Diverging color maps for scientific visualization

In Advances in Visual Computing, Vol. 5876 (2009), pp. 92-103,


One of the most fundamental features of scientific visualization is the process of mapping scalar values to colors. This process allows us to view scalar fields by coloring surfaces and volumes. Unfortunately, the majority of scientific visualization tools still use a color map that is famous for its ineffectiveness: the rainbow color map. This color map, which naïvely sweeps through the most saturated colors, is well known for its ability to obscure data, introduce artifacts, and confuse users. Although many alternate ...
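One recurring criterion in the colour-map critiques collected here is that a colour map intended for ordered scalar data should vary monotonically in perceived lightness, which the rainbow map does not. A minimal sketch of that check in plain Python follows; the small colour lists are illustrative samples (not any vendor's actual colour maps), and the Rec. 709 luma weights are used only as a rough proxy for perceived lightness.

```python
def relative_luminance(rgb):
    """Approximate perceived lightness of a linear RGB triple
    using the Rec. 709 luma weights (a rough proxy, not CIELAB)."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def is_lightness_monotonic(cmap):
    """True if luminance never decreases along the colour map."""
    lum = [relative_luminance(c) for c in cmap]
    return all(b >= a for a, b in zip(lum, lum[1:]))

# Rainbow-like sample: blue -> cyan -> green -> yellow -> red.
rainbow_like = [(0, 0, 1), (0, 1, 1), (0, 1, 0), (1, 1, 0), (1, 0, 0)]
# Sequential dark-to-light sample.
sequential = [(0.1, 0.1, 0.3), (0.2, 0.3, 0.5),
              (0.4, 0.6, 0.6), (0.8, 0.9, 0.7)]

print(is_lightness_monotonic(rainbow_like))  # False: luminance rises, then falls
print(is_lightness_monotonic(sequential))    # True
```

The rainbow-like sequence fails because cyan and yellow are far lighter than the blue and red that surround them, producing the perceptual flat spots and false features the papers above describe.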


Colour schemes

SRON Technical Note, No. 2.2. (December 2012), SRON/EPS/TN/09-002


[Excerpt: Introduction] Graphics with scientific data become clearer when the colours are chosen carefully. It is convenient to have a good default scheme ready for each type of data, with colours that are distinct for all readers, including colour-blind people. This document shows such schemes as a function of the number of colours needed, with some examples. It also gives a conversion of colour coordinates to simulate approximately how any colour is seen if you are colour-blind. [\n] [...] ...


Post-normal institutional identities: quality assurance, reflexivity and ethos of care



[Highlights] [::] Given the current crises of legitimacy and quality in mainstream science, institutions that produce and govern science and those that provide scientific advice to policy need to change their modi operandi; we advocate for an ethos of care. [::] Post-normal science and other frameworks of scientific knowledge production may inspire trustfulness in institutions that provide scientific advice to policy. [::] In Europe, the Joint Research Centre of the European Commission has the necessary scaffolding to advise policy in view of public interest, ...


Communication: science censorship is a global issue

Nature, Vol. 542, No. 7640. (08 February 2017), pp. 165-165,


[Excerpt] [...] Regrettably, suppression of public scientific information is already the norm, or is being attempted, in many countries [...]. We fear that such gagging orders could encourage senior bureaucrats to use funding as a tool with which to rein in academic freedoms. [...] The response of scientists to this type of coercion has been to share scientific information widely and openly using such legal means as social media to defend facts and transparency [...] ...


Keep it complex

Nature, Vol. 468, No. 7327. (23 December 2010), pp. 1029-1031,


When knowledge is uncertain, experts should avoid pressures to simplify their advice. Render decision-makers accountable for decisions, says Andy Stirling. ...


When a preprint becomes the final paper



A geneticist's decision not to publish his finalized preprint in a journal gets support from scientists online. [Excerpt] Preprint papers posted on servers such as arXiv and bioRxiv are designed to get research results out for discussion before they are formally peer reviewed and published in journals. But for some scientists, the term is now a misnomer — their preprint papers will never be submitted for formal publication. [...] One of the major services of traditional journals is that papers are peer ...


Model-based uncertainty in species range prediction

Journal of Biogeography, Vol. 33, No. 10. (October 2006), pp. 1704-1711,


[Aim]  Many attempts to predict the potential range of species rely on environmental niche (or ‘bioclimate envelope’) modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions, identify key reasons why model output may differ and discuss the implications that model uncertainty has for policy-guiding applications. [Location]  The Western Cape of South Africa. [Methods]  We applied nine of the most widely used modelling techniques to model potential ...


Five selfish reasons to work reproducibly

Genome Biology, Vol. 16, No. 1. (8 December 2015), 274,


And so, my fellow scientists: ask not what you can do for reproducibility; ask what reproducibility can do for you! Here, I present five reasons why working reproducibly pays off in the long run and is in the self-interest of every ambitious, career-oriented scientist. [Excerpt] [::Reproducibility: what's in it for me?] In this article, I present five reasons why working reproducibly pays off in the long run and is in the self-interest of every ambitious, career-oriented scientist. [::] Reason number 1: reproducibility helps to avoid ...


The mismeasurement of science

Current Biology, Vol. 17, No. 15. (07 August 2007), pp. R583-R585,


[Excerpt: Impact factors and citations] Crucially, impact factors are distorted by positive feedback — many citations are not based on reading the paper but on reading other papers, particularly reviews. One study even suggested that, of cited articles, only some 20% had actually been read. [...] Nevertheless, citations are now being used to make quantitative comparisons between scientists. [...] [Changes in behaviour] Unfortunately, the use of these measures is having damaging effects on perceptions and on behaviour; these I list below. Please note that ...


Is grey literature ever used? Using citation analysis to measure the impact of GESAMP, an international Marine scientific advisory body

Canadian Journal of Information and Library Science, Vol. 28, No. 1. (2004), pp. 45-65


Citation analysis was used to measure the impact of GESAMP, the Joint Group of Experts on the Scientific Aspects of Marine Environmental Protection, which since 1969 has published reports for the United Nations and seven of its agencies. Web of Science was used to search for citations to 114 publications, of which 15 are journal articles or books. Citations to grey literature can be difficult to locate and interpret, but two-thirds of the 1436 citations, in 1178 citing papers, are to ...


Statistical analysis

In Science: editorial policies (2016)


[Excerpt: Statistical analysis] Generally, authors should describe statistical methods with enough detail to enable a knowledgeable reader with access to the original data to verify the results. [::] Data pre-processing steps such as transformations, re-coding, re-scaling, normalization, truncation, and handling of below detectable level readings and outliers should be fully described; any removal or modification of data values must be fully acknowledged and justified. [::] [...] [::] The number of sampled units, N, upon which each reported statistic is based must be stated. [::] For continuous ...


Rainbow color map critiques: an overview and annotated bibliography

MathWorks Technical Articles and Newsletters, Vol. 25 (2014), 92238v00


A rainbow color map is based on the order of colors in the spectrum of visible light—the same colors that appear in a rainbow. Rainbow color maps commonly appear in data visualizations in many different scientific and engineering communities, and technical computing software often provides a rainbow color map as the default choice. Although rainbow color maps remain popular, they have a number of weaknesses when used for scientific visualization, and have been widely criticized. [\n] This paper summarizes the criticisms of ...


Take the time and effort to correct misinformation

Nature, Vol. 540, No. 7632. (6 December 2016), pp. 171-171,


Scientists should challenge online falsehoods and inaccuracies — and harness the collective power of the Internet to fight back, argues Phil Williamson. [Excerpt] [...] Most researchers who have tried to engage online with ill-informed journalists or pseudoscientists will be familiar with Brandolini’s law (also known as the Bullshit Asymmetry Principle): the amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it. Is it really worth taking the time and effort to challenge, correct and clarify ...


Theory of citing

In Handbook of Optimization in Complex Networks, Vol. 57 (11 Sep 2012), pp. 463-505,


We present empirical data on misprints in citations to twelve high-profile papers. The great majority of misprints are identical to misprints in articles that earlier cited the same paper. The distribution of the numbers of misprint repetitions follows a power law. We develop a stochastic model of the citation process, which explains these findings and shows that about 70-90% of scientific citations are copied from the lists of references used in other papers. Citation copying can explain not only why some misprints become popular, but also why some ...


Editorial: evidence-based guidelines for avoiding the most prevalent and serious APA error in journal article submissions - The citation error

Research in the Schools, Vol. 17, No. 2. (2010), pp. i-xxiv


In a previous editorial, Onwuegbuzie, Combs, Slate, and Frels (2010) discussed the findings of Combs, Onwuegbuzie, and Frels (2010), who identified the 60 most common American Psychological Association (APA) errors—with the most common error being incorrect use of numbers that was committed by 57.3% of authors. However, they did not analyze citation errors, which stem from a failure “to make certain that each source referenced appears in both places [text and reference list] and that the text citation and reference list ...


Errors in bibliographic citations: a continuing problem

The Library Quarterly, Vol. 59, No. 4. (1 October 1989), pp. 291-304,


Bibliographic references are an accepted part of scholarly publication. As such, they have been used for information retrieval, studies of scientific communication, collection development decisions, and even determination of salary raises, as well as for their primary purpose of documentation of authors' claims. However, there appears to be a high percentage of errors in these citations, seen in evidence from the mid-nineteenth century to the present. Such errors can be traced to a lack of standardization in citation formats, misunderstanding of ...


Accuracy of cited references: the role of citation databases

College & Research Libraries, Vol. 67, No. 4. (01 July 2006), pp. 292-303,


The nature and extent of errors made by Science Citation Index Expanded™ (SCIE) and SciFinder® Scholar™ (SFS) during data entry have been characterized by analysis of more than 5,400 cited articles from 204 randomly selected cited-article lists published in three core chemistry journals. Failure to map cited articles to target-source articles was due to transcription errors, target-source article errors, omitted cited articles, and reason unknown. Mapping error rates ranged from 1.2 to 6.9 percent. SCIE and SFS also were found to ...


Characteristics of doctoral students who commit citation errors

Library Review, Vol. 55, No. 3. (March 2006), pp. 195-208,


[Purpose] The purpose of this study was to investigate the citation error rate and quality of reference lists in doctoral dissertation proposals. This research also sought to examine the relationship between perfectionism and frequency of citation errors and the adherence of the reference list to the fidelity of the chosen citation style among doctoral students. Also of interest was to determine which demographic variables predict citation errors and quality of the reference list. [Design/methodology/approach] Participants were 64 doctoral students from various disciplines enrolled in ...


Copyright contradictions in scholarly publishing

First Monday, Vol. 7, No. 11. (04 November 2002), 1006,


This paper examines contradictions in how copyright works with the publishing of scholarly journals. These contradictions have to do with the protection of the authors’ interest and have become apparent with the rise of open access publishing as an alternative to the traditional commercial model of selling journal subscriptions. Authors may well be better served, as may the public which supports research, by open access journals because of their wider readership and early indications of greater scholarly impact. This paper reviews ...


Ethics among scholars in academic publishing

In 2012 Proceedings of the Information Systems Educators Conference (2012), 1948


This paper offers a survey of the contemporary and common-place ethical breaches concerning authorship, research, and publishing in today’s scholarly production, as juxtaposed with some of the predominant standards and guidelines that have been developed to direct academic publishing practices. While the paper may suggest the need for an updated and comprehensive set of guidelines for multiple discipline areas, the purpose here is to prepare the theoretical framework for a future computing discipline-specific study of ethical authorship and related concepts in ...


Programmers, professors, and parasites: credit and co-authorship in computer science

Science and Engineering Ethics, Vol. 15, No. 4. (2009), pp. 467-489,


This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be ...


Post-truth: a guide for the perplexed

Nature, Vol. 540, No. 7631. (28 November 2016), pp. 9-9,


If politicians can lie without condemnation, what are scientists to do? Kathleen Higgins offers some explanation. [Excerpt] The Oxford Dictionaries named ‘post-truth’ as their 2016 Word of the Year. It must sound alien to scientists. Science’s quest for knowledge about reality presupposes the importance of truth, both as an end in itself and as a means of resolving problems. How could truth become passé? [\n] [...] [\n] Post-truth refers to blatant lies being routine across society, and it means that politicians can lie without ...


The false academy: predatory publishing in science and bioethics

Medicine, Health Care and Philosophy (2016), pp. 1-8,


This paper describes and discusses the phenomenon ‘predatory publishing’, in relation to both academic journals and books, and suggests a list of characteristics by which to identify predatory journals. It also raises the question whether traditional publishing houses have accompanied rogue publishers upon this path. It is noted that bioethics as a discipline does not stand unaffected by this trend. Towards the end of the paper it is discussed what can and should be done to eliminate or reduce the effects ...


ePiX tutorial and reference manual



[Excerpt: Introduction] ePiX, a collection of batch utilities, creates mathematically accurate figures, plots, and animations containing LaTeX typography. The input syntax is easy to learn, and the user interface resembles that of LaTeX itself: You prepare a scene description in a text editor, then “compile” the input file into a picture. LaTeX- and web-compatible output types include a LaTeX picture-like environment written with PSTricks, tikz, or eepic macros; vector images (eps, ps, and pdf); and bitmapped images and movies (png, mng, and gif). [\n] ePiX’s strengths include: [::] Quality of ...


JRC data policy

Vol. 27163 EN (2015),


[Executive summary] The work on the JRC Data Policy followed the task identified in the JRC Management Plan 2014 to develop a dedicated data policy to complement the JRC Policy on Open Access to Scientific Publications and Supporting Guidance, and to promote open access to research data in the context of Horizon 2020. [\n] Important policy commitments and the relevant regulatory basis within the European Union and the European Commission include: the Commission Decision on the reuse of Commission documents, Commission ...


Encourage governments to heed scientific advice

Nature, Vol. 537, No. 7622. (28 September 2016), pp. 587-587,


To stop evidence-based policy losing its clout, researchers need to engage with policymakers and understand their needs, says Bill Colglazier. [Excerpt] [...] Most governments do want to consider and harness science, technology and innovation. [...] Why, then, is science losing its clout in the current political debates? In my view, the explanation is relatively simple. In the short term, politics, or more precisely value judgements, trump science. This is especially true when there are scientific uncertainties. [\n] Value judgements come in three varieties. ...


More accountability for big-data algorithms

Nature, Vol. 537, No. 7621. (21 September 2016), pp. 449-449,


To avoid bias and improve transparency, algorithm designers must make data sources and profiles public. [Excerpt] [...] Algorithms, from the simplest to the most complex, follow sets of instructions or learn to accomplish a goal. In principle, they could help to make impartial analyses and decisions by reducing human biases and prejudices. But there is growing concern that they risk doing the opposite, and will replicate and exacerbate human failings [...]. And in an era of powerful computers, machine learning and big data, ...


Social semantics: altruism, cooperation, mutualism, strong reciprocity and group selection

Journal of Evolutionary Biology, Vol. 20, No. 2. (1 March 2007), pp. 415-432,


From an evolutionary perspective, social behaviours are those which have fitness consequences for both the individual that performs the behaviour, and another individual. Over the last 43 years, a huge theoretical and empirical literature has developed on this topic. However, progress is often hindered by poor communication between scientists, with different people using the same term to mean different things, or different terms to mean the same thing. This can obscure what is biologically important, and what is not. The potential for ...


Scientific advances: fallacy of perfection harms peer review

Nature, Vol. 537, No. 7618. (31 August 2016), pp. 34-34,


[Excerpt] [...] The history of science has taught us that most progress has come from exploring flawed hypotheses and imperfect models. We must always strive for the better study, the better model, the better analysis. As experienced reviewers, however, we contend that seeking ultimate perfection is not the same as accepting nothing less here and now. Scientific progress depends on such compromise — provided that potential caveats are recognized. [\n] If a model is the most technically and ethically feasible approach available, ...


Opinion: science in the age of selfies

Proceedings of the National Academy of Sciences, Vol. 113, No. 34. (23 August 2016), pp. 9384-9387,


[Excerpt] [\n] [...] [\n] Here there is a paradox: Today, there are many more scientists, and much more money is spent on research, yet the pace of fundamental innovation, the kinds of theories and engineering practices that will feed the pipeline of future progress, appears, to some observers, including us, to be slowing [...]. Why might that be the case? [\n] One argument is that “theoretical models” may not even exist for some branches of science, at least not in the ...


Define the Anthropocene in terms of the whole Earth

Nature, Vol. 536, No. 7616. (17 August 2016), pp. 251-251,


Researchers must consider human impacts on entire Earth systems and not get trapped in discipline-specific definitions, says Clive Hamilton. [Excerpt] The Anthropocene was conceived by Earth-system scientists to capture the very recent rupture in Earth’s history arising from the impact of human activity on the Earth system as a whole. Read that again. Take special note of the phrases ‘very recent rupture’ and ‘the Earth system as a whole’. Understanding the Anthropocene, and what humanity now confronts, depends on a firm grasp of ...


Why doesn't your model pass information to mine?

In Workshop on Digital Mapping Techniques 2009 (2009)


For several decades geologists have been making three-dimensional (3D) models. Various proprietary and open software tools have been developed which allow geoscientists to produce reasonable 3D representation of the geological system that they are studying. The model they produce is quite often an ‘island’ of independent information. For a long time this didn't matter as there were so few models that there were unlikely to be any adjacent models forming islands in the same sea area. However, that is changing, the ...


The past, present and future of the PhD thesis

Nature, Vol. 535, No. 7610. (6 July 2016), pp. 7-7,


Writing a PhD thesis is a personal and professional milestone for many researchers. But the process needs to change with the times. [Excerpt] According to one of those often-quoted statistics that should be true but probably isn’t, the average number of people who read a PhD thesis all the way through is 1.6. And that includes the author. More interesting might be the average number of PhD theses that the typical scientist — and reader of Nature — has read from start ...


The battle lines are drawn

Science, Vol. 353, No. 6294. (30 June 2016), pp. 38-38,


[Excerpt] [\n] [...] In his new book, The War on Science, Shawn Otto documents the modern clash between what he calls the “authoritarians” (governments, large corporations, and religious groups) and the “antiauthoritarians” (scientists and other liberal thinkers). Drawing on recent examples ranging from the evolution debate to vaccine skepticism, Otto describes the emergence of an antiscience movement whose focus is to disrupt the creation of evidence-based policy for the sake of preserving profitable business models or entrenched religious dogma. [\n] Otto is at his ...


Bring climate change back from the future

Nature, Vol. 534, No. 7608. (21 June 2016), pp. 437-437,


The ‘shock’ over an Australian extinction shows that we still don’t accept that global warming is a problem for now, says James Watson. [Excerpt] Climate change has claimed its first mammal casualty, with the reported extinction of the Bramble Cay melomys (Melomys rubicola). The last of these Australian marsupials is thought to have disappeared around 2009, but the release last week of a report by the Queensland government stating the probable extinction of the species and the cause — sea-level rise induced ...


Sailing from the seas of chaos into the corridor of stability: practical recommendations to increase the informational value of studies

Perspectives on Psychological Science, Vol. 9, No. 3. (01 May 2014), pp. 278-292,


Recent events have led psychologists to acknowledge that the inherent uncertainty encapsulated in an inductive science is amplified by problematic research practices. In this article, we provide a practical introduction to recently developed statistical tools that can be used to deal with these uncertainties when performing and evaluating research. In Part 1, we discuss the importance of accurate and stable effect size estimates as well as how to design studies to reach a corridor of stability around effect size estimates. In ...


Promoting research resource identification at JCN

Journal of Comparative Neurology, Vol. 522, No. 8. (01 June 2014), pp. 1707-1707,


[Excerpt] [\n] [...] [\n] The attention of scientists, editors, and policymakers alike has turned recently to the issue of reproducibility in scientific research, focusing on research spanning from the pharmaceutical industry (Begley and Ellis, 2012) to the highest levels of government (Collins and Tabak, 2014; see also McNutt, 2014). While these commentaries point out that scientific misconduct is quite rare, they do point to a confluence of factors that hinder the reproducibility of scientific findings, including the identification of key reagents, such ...


The Resource Identification Initiative: a cultural shift in publishing

Neuroinformatics, Vol. 14, No. 2. (2016), pp. 169-182,


A central tenet in support of research reproducibility is the ability to uniquely identify research resources, i.e., reagents, tools, and materials that are used to perform experiments. However, current reporting practices for research resources are insufficient to identify the exact resources that are reported or to answer basic questions such as “How did other studies use resource X?” To address this issue, the Resource Identification Initiative was launched as a pilot project to improve the reporting standards for research resources in ...


Tales of future weather

Nature Climate Change, Vol. 5, No. 2. (28 January 2015), pp. 107-113,


Society is vulnerable to extreme weather events and, by extension, to human impacts on future events. As climate changes weather patterns will change. The search is on for more effective methodologies to aid decision-makers both in mitigation to avoid climate change and in adaptation to changes. The traditional approach uses ensembles of climate model simulations, statistical bias correction, downscaling to the spatial and temporal scales relevant to decision-makers, and then translation into quantities of interest. The veracity of this approach cannot ...


On the cutting edge: teaching help for geoscience faculty

Science, Vol. 327, No. 5969. (25 February 2010), pp. 1095-1096,


In contrast to science, which makes progress at the level of the community and where individual work builds on all that has come before, teaching science has often been an individual enterprise. Typically, faculty create courses in isolation, without the benefit of knowledge of others' classroom experiences or research on how students learn (1, 2). Building a culture of sharing and communal improvement in support of undergraduate geoscience teaching is the goal of the On the Cutting Edge professional development program. ...


Education: animating possible worlds

Science, Vol. 308, No. 5718. (01 April 2005), pp. 29e-29e,


Global warming's future impact depends on factors such as human population growth and fossil fuel use. High school and introductory college classes can learn how these and other variables might influence temperatures, sea levels, and more at a new tutorial hosted by California State University, Los Angeles. The Java applet helps students work through scenarios for the future sketched by the Intergovernmental Panel on Climate Change. For example, animations illustrate flooding in areas such as Florida and Indonesia under different sets ...

This page of the database may be cited as:
Integrated Natural Resources Modelling and Management - Meta-information Database.


Publication metadata is available as BibTeX, RIS, RSS/XML feed, JSON, and Dublin Core.

Meta-information Database (INRMM-MiD).
This database integrates a dedicated meta-information database in CiteULike (the CiteULike INRMM Group) with the meta-information available in Google Scholar, CrossRef and DataCite. The Altmetric database with Article-Level Metrics is also harvested. Part of the provided semantic content (machine-readable) is made even human-readable thanks to the DCMI Dublin Core viewer. Digital preservation of the meta-information indexed within the INRMM-MiD publication records is implemented thanks to the Internet Archive.
The library of INRMM-related publications may be quickly accessed with the following links.
Full-text and abstracts of the publications indexed by the INRMM meta-information database are copyrighted by the respective publishers/authors. They are subject to all applicable copyright protection. The conditions of use of each indexed publication are defined by its copyright owner. Please be aware that the indexed meta-information relies entirely on voluntary work and constitutes an incomplete and heterogeneous work-in-progress.
INRMM-MiD was experimentally established by the Maieutike Research Initiative in 2008 and then improved with the help of several volunteers (with a major technical upgrade in 2011). This new integrated interface has been operational since 2014.