
Selection: with tag bibliometrics [31 articles] 

 

Citations must default to the online publication date

  
Nature, Vol. 558, No. 7711. (27 June 2018), p. 519, https://doi.org/10.1038/d41586-018-05387-4

Abstract

[Excerpt] With online delivery increasingly dominating scientific publishing, most long-established journals run papers in both print and online formats — but not necessarily simultaneously. This can affect how researchers are given scientific priority. [...] In our experience, the time lag between the two can be as long as 6 months. This might be crucial for annual research evaluations, for instance, when a paper is published online at the end of one year and in print the year after [...] ...

 

Reviewers are blinkered by bibliometrics

  
Nature, Vol. 544, No. 7651. (26 April 2017), pp. 411-412, https://doi.org/10.1038/544411a

Abstract

[Excerpt] [...] Although journal impact factors (JIFs) were developed to assess journals and say little about any individual paper, reviewers routinely justify their evaluations on the basis of where candidates have published. [...] As economists who study science and innovation, we see engrained processes working against cherished goals. Scientists we interview routinely say that they dare not propose bold projects for funding in part because of expectations that they will produce a steady stream of papers in journals with high impact ...

 

Escape from the impact factor

  
Ethics in Science and Environmental Politics, Vol. 8, No. 1. (2008), pp. 5-7

Abstract

As Editor-in-Chief of the journal Nature, I am concerned by the tendency within academic administrations to focus on a journal’s impact factor when judging the worth of scientific contributions by researchers, affecting promotions, recruitment and, in some countries, financial bonuses for each paper. Our own internal research demonstrates how a high journal impact factor can be the skewed result of many citations of a few papers rather than the average level of the majority, reducing its value as an objective measure ...

 

Academia’s never-ending selection for productivity

  
Scientometrics, Vol. 103, No. 1. (15 February 2015), pp. 333-336, https://doi.org/10.1007/s11192-015-1534-5

Abstract

[Excerpt] Over the last decade, a debate has been emerging on “Academia’s obsession with quantity” (Lawrence 2007; Fischer et al. 2012a) and the subsequent Impact Factor Race, an unhealthy game played by scientists (Cherubini 2008; Brischoux and Cook 2009). Despite being widely despised by the scientific community (but see Loyola et al. 2012), the “publish or perish” dogma and the use of productivity indices (e.g., journal’s impact factor, number of published articles) to assess a researcher’s output seem to hold on, ...

 

Theory of citing

  
In Handbook of Optimization in Complex Networks, Vol. 57 (11 Sep 2012), pp. 463-505, https://doi.org/10.1007/978-1-4614-0754-6_16

Abstract

We present empirical data on misprints in citations to twelve high-profile papers. The great majority of misprints are identical to misprints in articles that earlier cited the same paper. The distribution of the numbers of misprint repetitions follows a power law. We develop a stochastic model of the citation process, which explains these findings and shows that about 70-90% of scientific citations are copied from the lists of references used in other papers. Citation copying can explain not only why some misprints become popular, but also why some ...
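
The copying mechanism sketched above can be illustrated with a toy stochastic simulation (an illustrative sketch, not the authors' exact model; the copy and misprint probabilities below are hypothetical): each new citing paper either transcribes the reference from the original article or copies it, misprint included, from a randomly chosen earlier citer, so popular misprints arise naturally.

import random

def simulate_citations(n_citers=1000, p_copy=0.8, p_misprint=0.02, seed=1):
    """Toy citation-copying model: with probability p_copy a new citer copies the
    reference (and any misprint) from a random earlier citer; otherwise it cites
    from the original article and may introduce a fresh misprint."""
    random.seed(seed)
    references = []          # one entry per citer: 0 = correct, >0 = misprint id
    next_misprint = 1
    for _ in range(n_citers):
        if references and random.random() < p_copy:
            ref = random.choice(references)         # copied: misprint propagates
        elif random.random() < p_misprint:
            ref, next_misprint = next_misprint, next_misprint + 1
        else:
            ref = 0
        references.append(ref)
    return references

repeats = {}
for r in simulate_citations():
    if r:
        repeats[r] = repeats.get(r, 0) + 1
print(sorted(repeats.values(), reverse=True))   # heavy-tailed misprint repetition counts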

 

Editorial: evidence-based guidelines for avoiding the most prevalent and serious APA error in journal article submissions - The citation error

  
Research in the Schools, Vol. 17, No. 2. (2010), pp. i-xxiv

Abstract

In a previous editorial, Onwuegbuzie, Combs, Slate, and Frels (2010) discussed the findings of Combs, Onwuegbuzie, and Frels (2010), who identified the 60 most common American Psychological Association (APA) errors—with the most common error being incorrect use of numbers that was committed by 57.3% of authors. However, they did not analyze citation errors, which stem from a failure “to make certain that each source referenced appears in both places [text and reference list] and that the text citation and reference list ...

 

Errors in bibliographic citations: a continuing problem

  
The Library Quarterly, Vol. 59, No. 4. (1 October 1989), pp. 291-304, https://doi.org/10.1086/602160

Abstract

Bibliographic references are an accepted part of scholarly publication. As such, they have been used for information retrieval, studies of scientific communication, collection development decisions, and even determination of salary raises, as well as for their primary purpose of documentation of authors' claims. However, there appears to be a high percentage of errors in these citations, seen in evidence from the mid-nineteenth century to the present. Such errors can be traced to a lack of standardization in citation formats, misunderstanding of ...

 

Influence of omitted citations on the bibliometric statistics of the major Manufacturing journals

  
Scientometrics, Vol. 103, No. 3. (2015), pp. 1083-1122, https://doi.org/10.1007/s11192-015-1583-9

Abstract

Bibliometrics is a relatively young and rapidly evolving discipline. Essential for this discipline are bibliometric databases and their information content concerning scientific publications and relevant citations. Databases are unfortunately affected by errors, whose main consequence is represented by omitted citations, i.e., citations that should be ascribed to a certain (cited) paper but, for some reason, are lost. This paper studies the impact of omitted citations on the bibliometric statistics of the major Manufacturing journals. The methodology adopted is based on a ...

 

Accuracy of cited references: the role of citation databases

  
College & Research Libraries, Vol. 67, No. 4. (01 July 2006), pp. 292-303, https://doi.org/10.5860/crl.67.4.292

Abstract

The nature and extent of errors made by Science Citation Index Expanded™ (SCIE) and SciFinder® Scholar™ (SFS) during data entry have been characterized by analysis of more than 5,400 cited articles from 204 randomly selected cited-article lists published in three core chemistry journals. Failure to map cited articles to target-source articles was due to transcription errors, target-source article errors, omitted cited articles, and reason unknown. Mapping error rates ranged from 1.2 to 6.9 percent. SCIE and SFS also were found to ...

 

Characteristics of doctoral students who commit citation errors

  
Library Review, Vol. 55, No. 3. (March 2006), pp. 195-208, https://doi.org/10.1108/00242530610655993

Abstract

[Purpose] The purpose of this study was to investigate the citation error rate and quality of reference lists in doctoral dissertation proposals. This research also sought to examine the relationship between perfectionism and frequency of citation errors and the adherence of the reference list to the fidelity of the chosen citation style among doctoral students. Also of interest was to determine which demographic variables predict citation errors and quality of the reference list. [Design/methodology/approach] Participants were 64 doctoral students from various disciplines enrolled in ...

 

Beat it, impact factor! Publishing elite turns against controversial metric

  
Nature, Vol. 535, No. 7611. (8 July 2016), pp. 210-211, https://doi.org/10.1038/nature.2016.20224

Abstract

Senior staff at leading journals want to end inappropriate use of the measure. [Excerpt] [...] Calculated by various companies and promoted by publishers, journal impact factors (JIFs) are a measure of the average number of citations that articles published by a journal in the previous two years have received in the current year. [\n] They were designed to indicate the quality of journals, but researchers often use the metric to assess the quality of individual papers — and even, in some cases, their ...
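
The JIF definition in this excerpt reduces to a simple ratio: citations received in year Y to items the journal published in years Y-1 and Y-2, divided by the number of citable items published in those two years. A minimal sketch (the journal figures below are made up for illustration):

def impact_factor(citations_received, citable_items):
    """JIF for year Y = citations received in Y to items from Y-1 and Y-2,
    divided by the number of citable items published in Y-1 and Y-2."""
    return citations_received / citable_items

# Hypothetical journal: 412 citations in 2018 to its 2016-2017 items, 130 citable items
print(round(impact_factor(412, 130), 3))   # -> 3.169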

 

The unsung heroes of scientific software

  
Nature, Vol. 529, No. 7584. (4 January 2016), pp. 115-116, https://doi.org/10.1038/529115a

Abstract

Creators of computer programs that underpin experiments don’t always get their due — so the website Depsy is trying to track the impact of research code. [Excerpt] For researchers who code, academic norms for tracking the value of their work seem grossly unfair. They can spend hours contributing to software that underpins research, but if that work does not result in the authorship of a research paper and accompanying citations, there is little way to measure its impact. [\n] [...] Depsy’s creators hope that their ...

 

Resource disambiguator for the web: extracting biomedical resources and their citations from the scientific literature

  
PLoS ONE, Vol. 11, No. 1. (5 January 2016), e0146300, https://doi.org/10.1371/journal.pone.0146300

Abstract

The NIF Registry developed and maintained by the Neuroscience Information Framework is a cooperative project aimed at cataloging research resources, e.g., software tools, databases and tissue banks, funded largely by governments and available as tools to research scientists. Although originally conceived for neuroscience, the NIF Registry has over the years broadened in the scope to include research resources of general relevance to biomedical research. The current number of research resources listed by the Registry numbers over 13K. The broadening in scope ...

 

Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact

  
Journal of Medical Internet Research, Vol. 13, No. 4. (16 December 2011), e123, https://doi.org/10.2196/jmir.2012

Abstract

[Background] Citations in peer-reviewed articles and the impact factor are generally accepted measures of scientific impact. Web 2.0 tools such as Twitter, blogs or social bookmarking tools provide the possibility to construct innovative article-level or journal-level metrics to gauge impact and influence. However, the relationship of these new metrics to traditional metrics such as citations is not known. [Objective] (1) To explore the feasibility of measuring social impact of and public attention to scholarly articles by analyzing buzz in social media, ...
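
The kind of relationship the study examines between social-media counts and later citations can be sketched with a rank correlation (the per-article counts below are hypothetical, and the original analysis is not necessarily limited to this statistic):

from scipy.stats import spearmanr

# Hypothetical per-article counts: (tweets shortly after publication, citations later on)
tweets    = [0, 2, 5, 11, 30, 1, 0, 8, 17, 3]
citations = [1, 3, 4, 12, 25, 2, 0, 9, 14, 5]

rho, p_value = spearmanr(tweets, citations)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")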

 

Impact, not impact factor

  
Proceedings of the National Academy of Sciences, Vol. 112, No. 26. (30 June 2015), pp. 7875-7876, https://doi.org/10.1073/pnas.1509912112

Abstract

[Excerpt] When the English philosopher Herbert Spencer introduced the phrase “survival of the fittest” in 1864, he could not have imagined that it would summarize the plight of young scientists years later (1). As competition for coveted faculty appointments and research funding continues to intensify, today’s researchers face relentless pressure to publish in scientific journals with high impact factors. But only a few decades ago, when I began my scientific career as a virologist in the 1970s, the common outlets in ...

 

Data reuse and the open data citation advantage

  
PeerJ, Vol. 1 (01 October 2013), e175, https://doi.org/10.7717/peerj.175

Abstract

[Background] Attribution to the original contributor upon reuse of published data is important both as a reward for data creators and to document the provenance of research findings. Previous studies have found that papers with publicly available datasets receive a higher number of citations than similar studies without available data. However, few previous analyses have had the statistical power to control for the many variables known to predict citation rate, which has led to uncertain estimates of the “citation benefit”. Furthermore, ...

 

The measure of research merit

  
Science, Vol. 346, No. 6214. (5 December 2014), p. 1155, https://doi.org/10.1126/science.aaa3796

Abstract

Each year, $1.4 trillion is invested in research by governments, foundations, and corporations. Hundreds if not thousands of high-profile prizes and medals are awarded to the best researchers, boosting their careers. Therefore, establishing a reliable predictor of future performance is a trillion-dollar matter. Last month, the Alexander von Humboldt Foundation convened an international assembly of leaders in academia, research management, and policy to discuss “Beyond Bibliometrics: Identifying the Best.” Current assessment is largely based on counting publications, counting citations, taking note ...

 

On the Shoulders of Giants: The Growing Impact of Older Articles

  
(2 Nov 2014)

Abstract

In this paper, we examine the evolution of the impact of older scholarly articles. We attempt to answer four questions. First, how often are older articles cited and how has this changed over time. Second, how does the impact of older articles vary across different research fields. Third, is the change in the impact of older articles accelerating or slowing down. Fourth, are these trends different for much older articles. To answer these questions, we studied citations from articles published in 1990-2013. We computed the fraction of citations ...
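
The central quantity of the study — the share of citations going to older articles — can be computed directly from (citing year, cited year) pairs; a minimal sketch with made-up data and a 10-year age threshold (the paper considers several article-age cutoffs):

def fraction_to_older(citation_pairs, age_threshold=10):
    """Fraction of citations whose cited article is at least age_threshold years
    older than the citing article."""
    older = sum(1 for citing, cited in citation_pairs if citing - cited >= age_threshold)
    return older / len(citation_pairs)

# Hypothetical (citing_year, cited_year) pairs
pairs = [(2013, 1995), (2013, 2011), (2013, 2001), (2013, 2012), (2013, 1990)]
print(fraction_to_older(pairs))   # -> 0.6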

 

Global scientific production on GIS research by bibliometric analysis from 1997 to 2006

  
Journal of Informetrics, Vol. 2, No. 1. (January 2008), pp. 65-74, https://doi.org/10.1016/j.joi.2007.10.001
Keywords: bibliometrics   geospatial   gis  

Abstract

A bibliometric analysis was applied in this work to evaluate global scientific production of geographic information system (GIS) papers from 1997 to 2006 in any journal of all the subject categories of the Science Citation Index compiled by Institute for Scientific Information (ISI), Philadelphia, USA. ‘GIS’ and ‘geographic information system’ were used as keywords to search parts of titles, abstracts, or keywords. The published output analysis showed that GIS research steadily increased over the past 10 years and the annual paper ...

 

Google Scholar pioneer on search engine’s future

  

Abstract

[Excerpt] As the search engine approaches its 10th birthday, Nature speaks to the co-creator of Google Scholar. [...] By 'crawling' over the text of millions of academic papers, including those behind publishers' paywalls, it has transformed the way that researchers consult the literature online. In a Nature survey this year, some 60% of scientists said that they use the service regularly. Nature spoke with Anurag Acharya, who co-created the service and still runs it, about Google Scholar's history and what he ...

 

Citations and the h index of soil researchers and journals in the Web of Science, Scopus, and Google Scholar

  
PeerJ, Vol. 1 (22 October 2013), e183, https://doi.org/10.7717/peerj.183

Abstract

Citation metrics and h indices differ using different bibliometric databases. We compiled the number of publications, number of citations, h index and year since the first publication from 340 soil researchers from all over the world. On average, Google Scholar has the highest h index, number of publications and citations per researcher, and the Web of Science the lowest. The number of papers in Google Scholar is on average 2.3 times higher and the number of citations is 1.9 times higher ...
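
The h index whose database-dependence is measured here is the largest h such that the researcher has h papers with at least h citations each; a minimal sketch (the citation counts are made up):

def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([25, 8, 5, 4, 3, 1]))   # -> 4 (four papers with at least 4 citations)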

 

Bibliometrics: the citation game

  
Nature, Vol. 510, No. 7506. (25 June 2014), pp. 470-471, https://doi.org/10.1038/510470a

Abstract

Jonathan Adams takes the measure of the uses and misuses of scholarly impact. ...

 

Context-aware Citation Recommendation

  
In Proceedings of the 19th International Conference on World Wide Web (2010), pp. 421-430, https://doi.org/10.1145/1772690.1772734

Abstract

When you write papers, how many times do you want to make some citations at a place but you are not sure which papers to cite? Do you wish to have a recommendation system which can recommend a small number of good candidates for every place that you want to make some citations? In this paper, we present our initiative of building a context-aware citation recommendation system. High quality citation recommendation is challenging: not only should the citations recommended be relevant ...

 

What's in a number?

  
Journal of Applied Physiology, Vol. 111, No. 4. (01 October 2011), pp. 951-953, https://doi.org/10.1152/japplphysiol.00935.2011

Abstract

[Excerpt] the scientific publishing world is being influenced by the Impact Factor (IF) just as the wine industry has been, and continues to be, influenced by a certain Robert Parker. There is little doubt that, just as Parker's personal taste in wine has caused winemakers in droves to change their procedures and wine styles, the IF has driven at least some journals to considerably alter their publications practices to raise IF. The tail is wagging the dog big time, and this ...

 

China's publication bazaar

  
Science, Vol. 342, No. 6162. (29 November 2013), pp. 1035-1039, https://doi.org/10.1126/science.342.6162.1035

Abstract

Science has exposed a thriving academic black market in China involving shady agencies, corrupt scientists, and compromised editors—many of them operating in plain view. The commodity: papers in journals indexed by Thomson Reuters' Science Citation Index, Thomson Reuters' Social Sciences Citation Index, and Elsevier's Engineering Index. ...

 

(INRMM-MiD internal record) List of keywords of the INRMM meta-information database - part 5

  
(February 2014)
Keywords: betula-populifolia   betula-potamophila   betula-psammophila   betula-pubescens   betula-raddeana   betula-recurvata   betula-skvorsovii   betula-spp   betula-sunanensis   betula-szaferi   betula-utilis   betula-zinserlingii   betulaceae   bias   bias-correction   bias-disembodied-science-vs-computational-scholarship   bias-toward-primacy-of-theory-over-reality   bibliometrics   bifurcation-analysis   big-data   binomial-distribution   bio-based-economy   biochemical-product   bioclimatic-envelope-models   bioclimatic-predictors   biocontrol-agents   biodiversity   biodiversity-hotspot   biodiversity-impacts   biodiversity-indicator   biodiversity-offsets   bioeconomy   bioenergy   bioethanol   biofilm   biofiltration   biofuel   biogenic-volatile-organic-compounds   biogeography   bioinformatics   biological-control   biological-invasions   biology   biomass   biomass-burning   biomass-production   biomass-to-energy   biome   biomonitoring   bioscience   biosecurity   biotechnology   biotic-effects   biotic-factors   biotic-homogenization   biotic-interactions   birches   bird-conservation   bird-dispersal   bird-pollination   birds   biscogniauxia-atropunctata   biscogniauxia-mediterranea   biscogniauxia-nummularia   bismarckia-nobilis   bison-bonasus   bixa-orellana   black-aphid   black-carbon   black-pine   black-poplar   black-sea-region   blechnum-spicant   blitz   blue-tits   blue-water   bogs   boiss   bombacopsis-quinata   bombax-malabaricum   bone-attachment   boolean-expressions   bootstrap   bootstrapping   borassus-flabellifer   borch-forest   border-effect   boreal-continental-forest   boreal-forest   boreal-forests   boreal-mountain-system   bosnia-herzegovina   boswellia-sacra   botanical-macro-remains   botany   botryosphaeria-spp   bottom-up   brachylaena-huillensis   brachylaena-rotundata   inrmm-list-of-tags  

Abstract

List of indexed keywords within the transdisciplinary set of domains which relate to the Integrated Natural Resources Modelling and Management (INRMM). In particular, the list of keywords maps the semantic tags in the INRMM Meta-information Database (INRMM-MiD). [\n] The INRMM-MiD records providing this list are accessible by the special tag: inrmm-list-of-tags ( http://mfkp.org/INRMM/tag/inrmm-list-of-tags ). ...

 

Expert Failure: Re-evaluating Research Assessment

  
PLoS Biol, Vol. 11, No. 10. (8 October 2013), e1001677, https://doi.org/10.1371/journal.pbio.1001677

Abstract

It is unlikely that there is any single objective measure of merit, so research assessment therefore requires new multivariate metrics that reflect the context of research, regardless of discipline. ...

 

IEEE statement on correct use of bibliometrics

  
IEEE PSPB Quarterly Newsletter, Vol. 6, No. 3. (October 2013), 1

Abstract

[excerpt] An increasing number of voices in the scientific community have recently expressed concerns on the inappropriate use of journal bibliometric indicators — mainly the well-known Impact Factor (IF). More specifically, they are being used as a proxy: (a) to judge the impact of a single paper published in a journal; (b) to evaluate the scientific impact of a scientist for hiring, tenure, promotion, salary increase and even project evaluations. As is well documented in the bibliometric literature, journal bibliometric indicators are simply not ...

 

How journal rankings can suppress interdisciplinary research: A comparison between Innovation Studies and Business & Management

  
Research Policy, Vol. 41, No. 7. (September 2012), pp. 1262-1282, https://doi.org/10.1016/j.respol.2012.03.015

Abstract

This study provides quantitative evidence on how the use of journal rankings can disadvantage interdisciplinary research in research evaluations. Using publication and citation data, it compares the degree of interdisciplinarity and the research performance of a number of Innovation Studies units with that of leading Business & Management Schools (BMS) in the UK. On the basis of various mappings and metrics, this study shows that: (i) Innovation Studies units are consistently more interdisciplinary in their research than Business & Management Schools; ...

 

Publishing: Open citations

  
Nature, Vol. 502, No. 7471. (16 October 2013), pp. 295-297, https://doi.org/10.1038/502295a

Abstract

Make bibliographic citation data freely available and substantial benefits will flow, says David Shotton, director of the Open Citations Corpus. ...

 

Quantifying Long-Term Scientific Impact

  
Science, Vol. 342, No. 6154. (04 October 2013), pp. 127-132, https://doi.org/10.1126/science.1237825

Abstract

[Editor's Summary] Citation Grabbers. Is there quantifiable regularity and predictability in citation patterns? It is clear that papers that have been cited frequently tend to accumulate more citations. It is also clear that, with time, even the most novel paper loses its currency. Some papers, however, seem to have an inherent “fitness” that can be interpreted as a community's response to the research. Wang et al. (p. 127; see the Perspective by Evans) developed a mechanistic model to predict citation history. The model ...
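
The mechanistic model summarized here combines a paper-specific fitness with preferential attachment and a lognormal aging curve; in the paper's closed form the cumulative citation count grows roughly as c(t) = m·[exp(λ·Φ((ln t − μ)/σ)) − 1], with Φ the standard normal CDF. A sketch of that closed form (the parameter values below are arbitrary illustrations, not fitted ones):

from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def cumulative_citations(t, fitness, mu, sigma, m=30.0):
    """Closed-form citation history c(t) = m * (exp(fitness * Phi((ln t - mu)/sigma)) - 1),
    as summarized from the model; t is in years since publication and m is a constant
    related to the typical reference-list length."""
    return m * (exp(fitness * norm_cdf((log(t) - mu) / sigma)) - 1.0)

# Arbitrary illustrative parameters: higher fitness -> faster, larger citation accumulation
for years in (1, 5, 10, 20):
    print(years, round(cumulative_citations(years, fitness=2.0, mu=1.0, sigma=0.8), 1))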

This page of the database may be cited as:
Integrated Natural Resources Modelling and Management - Meta-information Database. http://mfkp.org/INRMM/tag/bibliometrics

Publication metadata

BibTeX, RIS, RSS/XML feed, JSON, Dublin Core

Meta-information Database (INRMM-MiD).
This database integrates a dedicated meta-information database in CiteULike (the CiteULike INRMM Group) with the meta-information available in Google Scholar, CrossRef and DataCite. The Altmetric database with Article-Level Metrics is also harvested. Part of the provided semantic content (machine-readable) is also rendered human-readable via the DCMI Dublin Core viewer. Digital preservation of the meta-information indexed within the INRMM-MiD publication records is provided through the Internet Archive.
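
As an illustration of the CrossRef side of this integration, bibliographic metadata for a DOI can be harvested from the public CrossRef REST API (api.crossref.org); this is a generic sketch, not the actual INRMM-MiD harvesting code.

import json
import urllib.request

def crossref_metadata(doi):
    """Fetch basic bibliographic metadata for a DOI from the CrossRef REST API."""
    url = f"https://api.crossref.org/works/{doi}"
    with urllib.request.urlopen(url) as response:
        record = json.load(response)["message"]
    return {
        "title": (record.get("title") or [""])[0],
        "container": (record.get("container-title") or [""])[0],
        "year": record.get("issued", {}).get("date-parts", [[None]])[0][0],
        "doi": record.get("DOI"),
    }

# Example: the "Publishing: Open citations" record indexed above
print(crossref_metadata("10.1038/502295a"))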
The library of INRMM-related publications can be searched either across the whole INRMM meta-information database or only within the INRMM-MiD publication records.
Full-text and abstracts of the publications indexed by the INRMM meta-information database are copyrighted by the respective publishers/authors and are subject to all applicable copyright protection. The conditions of use of each indexed publication are defined by its copyright owner. Please be aware that the indexed meta-information relies entirely on voluntary work and constitutes an incomplete, heterogeneous work-in-progress.
INRMM-MiD was experimentally established by the Maieutike Research Initiative in 2008 and has since been improved with the help of several volunteers (with a major technical upgrade in 2011). The current integrated interface has been operational since 2014.