Scientific Journals: Challenges & Trends
JASIST & ARIST. Editor-in-Chief, Journal of the American Society for Information Science and Technology (2009- ); Editor, Annual Review of Information Science and Technology (2001-2011).
Stakeholders: Subscribers (Libraries); Authors; Readers; Editorial Board; JASIST Editor-in-Chief; Reviewers; ASIS&T (The Society); Managing Editors, Production Editor, Copy Editors & Proofreaders; Infrastructure Provider (ScholarOne); Wiley-Blackwell (The Publisher)
Slow Publishing. Policing to prevent salami-slicing, least publishable units (LPUs), and self-plagiarism. Plagiarism, fraud, and error detection; retraction.
Peer Review. Expertise mapping, referee fatigue, system stress. Single vs. double blind vs. open vs. hybrid. Bias?
Language. Linguistic hegemony; vernacular vs. vehicular language; the effort and cost of maintaining standards.
Data. Accessibility of data for checking, replication, re-use, and exploitation; accessibility of instruments, protocols, software, and primary data.
How to cite data; how to reward data creators; how to peer review data sets; the need for shared data management plans; how to reward secondary analysis of data.
Journal Impact Factor. "High IF journals are losing their stronghold as the sole repositories of high-quality papers, so there is no legitimate basis for extending the IF of a journal to its papers, much less to individual researchers." Lozano, Larivière & Gingras, JASIST (2012)
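The point about journal-level averages can be made concrete. A minimal sketch of the two-year impact factor calculation, using invented per-paper citation counts, shows how one heavily cited paper can dominate the mean while the typical paper sits near zero:

```python
# Hedged sketch: the two-year Journal Impact Factor for a toy journal.
# Illustrates why a journal-level average says little about any single
# paper: citation distributions are highly skewed. All data are made up.

def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """JIF(Y) = citations received in year Y to items published in years
    Y-1 and Y-2, divided by the citable items published in those years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical per-paper citation counts for 10 papers: one blockbuster,
# many near-zero.
per_paper = [120, 3, 1, 0, 0, 2, 1, 0, 4, 0]

jif = impact_factor(sum(per_paper), len(per_paper))
median = sorted(per_paper)[len(per_paper) // 2]
print(jif)     # 13.1, driven almost entirely by one paper
print(median)  # 1
```

The gap between the mean (13.1) and the median (1) is the statistical heart of the Lozano, Larivière & Gingras argument.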
Coercive Citation Editors force authors to cite articles from their journal
Big Business (~US$15 billion)
arXiv: From Strength to Strength
[Figure: Proportion of WoS papers on arXiv, by specialty (2010-2011), ranging from roughly 70% in Astronomy & Astrophysics and Nuclear & Particle Physics down to under 10% in Applied Physics and Optics. Inset: proportion of WoS papers on arXiv in Nuclear & Particle Physics and Astronomy & Astrophysics, 1995-2011.] Larivière, Macaluso, Sugimoto, Milojević, Cronin, & Thelwall (2013)
Open Access. Removal of tolls & filters; bibliodiversity; APCs: cost shifting; open peer review / wisdom of the crowds; visible impact measures.
Open Access (human-readable PDFs) vs. Open Data (machine-readable, mineable, reusable, licensable)
Workers' Revolt? http://thecostofknowledge.com/ 600,000 authors; ~$1 billion profits; ~13,000 signatures
Article Processing Charges (APCs). "I think publishing is a cost of research in the same way as buying a centrifuge is a cost of research." Sir Mark Walport (Wellcome Trust), April 2012
Perceptions & Reality. About two-thirds of (~9,000) OA journals do not charge APCs (Dallmeier-Tiessen et al., 2011). APCs are most common in the medical sciences, least common in the arts & humanities (M. Kozak).
OA Journal Publishing, 1993-2009 (M. Laasko et al., 2011: http://www.plosone.org/article/info%3adoi%2f10.1371%2fjournal.pone.0020961)
$9 million start-up support; 7 Gold OA journals; efficiencies of scale. Costs down, charges up? PLoS ONE revenues provide a cross-subsidy (cash cow?)
PLoS: A Publishing Phenomenon. Peer-reviewed mega-journal; ~2,000 papers per month; APC = $1,350; ~3% of the STM literature; 65% acceptance rate.
BioMed Central: STM publisher of 220 OA, online, peer-reviewed journals; acquired by Springer. Revenues = $20M (?). Authors pay an APC, retain copyright, and license their work under CC-BY. Membership options.
Hindawi: from hybrid (2004) to fully OA (2007). APCs range from $0 to $1,500; 10% of journals charge no APCs. 500+ peer-reviewed journals; institutional membership. Rejection rate ~60%. Accelerated peer review (4-8 weeks).
Hindawi Revenues. Source: R. Poynder (2012)
SAGE Open: peer-reviewed Gold OA journal covering the social & behavioral sciences and the humanities. Discounted APC: $99. 70% of authors pay the APC personally.
eLife (http://www.elifesciences.org/): funded by Howard Hughes Medical Institute + Wellcome Trust + Max Planck Society. Senior scientists on the editorial board. Free for now. Seeks out high-quality, high-impact papers.
F1000Research is an original open access publishing program for life scientists, offering immediate publication, transparent peer review (post-publication) and full data deposition and sharing. All scientifically sound articles are accepted, including single findings, case reports, protocols, replications, null/negative results and more traditional articles.
Predatory Journals: low-quality journals; dubious editorial practices; scholarly vanity presses. ~4,000?
Open Access Books: OA book publisher; not-for-profit; rigorous peer review; free online; small charge for the PDF.
Elsevier's Article of the Future
PLOS One
Make tables, graphs, and stats interactive; make code, data, etc. reader-verifiable. From (fixed) paper to (dynamic) process to (multiple) products.
Identifying great research in biology and medicine. F1000 is an in-depth directory of the top articles in biology and medicine, as recommended by its Faculty of over 5,000 expert scientists and clinical researchers, assisted by 5,000 associates. Post-publication review & ratings.
Liquid Publications: "blogs, scientific experiments, comments on someone else's paper, reviews, slides, videos, demos, even data"; evolutionary, collaborative. Baez & Casati (n.d.)
Force 11 Manifesto (http://force11.org/white_paper): "We see a future in which every claim, hypothesis, argument (every significant element of the discourse) can be explicitly represented, along with supporting data, software, workflows, multimedia, external commentary, and information about provenance."
What Is a Nanopub? (http://www.nanopub.org/) "A nanopublication is the smallest unit of publishable information: an assertion about anything that can be uniquely identified and attributed to its author." Find, connect, aggregate, curate.
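The nanopub model pairs an atomic assertion with provenance and publication information. A minimal sketch, rendered here as plain Python dicts rather than RDF (all URIs and identifiers are illustrative, not real nanopub data):

```python
# Hedged sketch of a nanopublication's three parts, per the nanopub model
# (assertion + provenance + publication info). Identifiers are invented.

nanopub = {
    "assertion": {
        # The smallest publishable claim, as a subject-predicate-object triple.
        "subject": "ex:gene-BRCA1",
        "predicate": "ex:associated-with",
        "object": "ex:breast-cancer",
    },
    "provenance": {
        # How the assertion was derived (hypothetical method and source).
        "derived-from": "ex:experiment-42",
        "method": "ex:microarray-analysis",
    },
    "publication-info": {
        # Attribution and a timestamp make the claim citable and aggregable.
        "attributed-to": "orcid:0000-0000-0000-0000",
        "created": "2012-12-01",
    },
}

# "Find, connect, aggregate": once claims are atomic, collecting every
# assertion about a given subject is trivial.
claims = [nanopub["assertion"]]
about_brca1 = [c for c in claims if c["subject"] == "ex:gene-BRCA1"]
print(len(about_brca1))  # 1
```

The design point is that attribution travels with the assertion itself, so individual claims can be found, connected, and curated independently of the article they came from.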
Nano Publication http://www.nanopub.org/
Status Quo (Mons et al., 2011)
Nano-centric Publication (Mons et al., 2011)
An Open Access Evidence Rack (Peter Suber, December 2012) 1. Identify basic propositions in a paper/subfield/field 2. Create separate OA webpage for each proposition 3. Fill each webpage with evidence supporting the proposition
Force 11 Manifesto (http://force11.org/white_paper): "Notions such as journal impact factor are poor surrogates for measuring the true impact of scholarship ... we need to derive new mechanisms that will allow us more accurately to measure true contributions."
Assessing Research Quality. Impact is defined as "an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia" (UK Research Excellence Framework).
Beyond Bibliometrics. Citations miss important traces and are lagged; reference managers, blogs, bookmarking, slide-sharing services, and social media are real-time.
Scholars read, use, and then cite. Non-scholars read and use, but don't cite. Citations measure a particular kind of use; they only partially capture impact.
Invoked on the Web (Cronin et al., 1998): genres of invocation; polymorphous mentioning; presence density; the diverse ways in which academic influence is exercised and acknowledged.
Alternative Metrics: acknowledgments; data citation counts; micro-attributions for data curation; social media mentions; recommendations; downloads; links/hits/click-throughs; mentions in extrascientific texts; press coverage.
Alt.metrics Priem (2011)
Elsevier's Scopus
Platforms Transparency? Usability? Persistence? Cost?
Complements or Correlatives? Citations in Wikipedia correlate with JCR data (Nielsen, 2007). Articles in the top quartile of tweets were 11 times more likely to be in the top quartile of citations two years later (Eysenbach, 2012). High positive correlation between F1000 score and JIF (Nature Neuroscience, 2005). Positive correlations between inclusion in reference managers and citation counts (Bar-Ilan, 2012; Li et al., 2011; Priem et al., 2012). Downloads predict/correlate with subsequent citation (Brody et al., 2006).
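Studies like these typically report rank correlations between an early signal (downloads, tweets, bookmarks) and later citations. A self-contained sketch of a Spearman rank correlation, with invented data standing in for the real datasets (only the method is standard, not the numbers):

```python
# Illustrative Spearman rank correlation between hypothetical early
# downloads and later citations. Data are made up for demonstration.

def ranks(xs):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg_rank
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation computed on the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

downloads = [500, 120, 80, 40, 10]  # hypothetical early downloads
citations = [30, 12, 9, 2, 1]       # hypothetical citations two years on
print(round(spearman(downloads, citations), 2))  # 1.0: identical rank order
```

A rank correlation is the natural choice here because both downloads and citations are heavily skewed, which would distort a plain Pearson correlation on raw counts.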
Downloads vs. Citations in ScienceDirect (source: Elsevier)
Quantified Control (Lock & Martins, 2011); "an Orwellian surveillance net" (Sosteric, 1999); the contemporary metricization of the academy (Burrows, 2012).
The data speak for themselves. Or do they?
Manipulating Google Scholar Citations and Google Scholar Metrics: Simple, Easy and Tempting. Emilio Delgado López-Cózar, Nicolás Robinson-García, Daniel Torres-Salinas. arXiv.org, submitted 4 Dec 2012. "We created six documents authored by a faked author and we uploaded them to a researcher's personal website under the University of Granada's domain. ... The result of the experiment meant an increase of 774 citations in 129 papers (six citations per paper), increasing the authors' and journals' h-index."
The Holy Grail of Holism: multidimensional indicators of impact, influence, worth, and trends. A matrix of established & alternative metrics. A unified measure/composite score?
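One way such a "matrix of metrics" could be collapsed into a composite score is z-score normalization followed by a weighted sum. A minimal sketch, with invented papers, metrics, and weights (none of this is a proposed standard, just an illustration of the mechanics):

```python
# Hedged sketch: combine heterogeneous indicators (citations, downloads,
# social-media mentions) into a composite score via z-score normalization
# and illustrative weights. All values and weights are invented.

def zscores(values):
    """Standardize a list of values to mean 0, standard deviation 1."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [(v - mean) / sd if sd else 0.0 for v in values]

# Columns: one list per metric; rows: three hypothetical papers.
matrix = {
    "citations": [40, 5, 12],
    "downloads": [900, 3000, 400],
    "mentions":  [2, 80, 5],
}
weights = {"citations": 0.5, "downloads": 0.3, "mentions": 0.2}

normalized = {metric: zscores(vals) for metric, vals in matrix.items()}
composite = [
    sum(weights[m] * normalized[m][i] for m in matrix)
    for i in range(3)
]
print([round(c, 2) for c in composite])
```

The fragility is visible in the code itself: the ranking of the three papers depends entirely on the chosen weights, which is exactly why a single unified score remains contested.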
Trends: liberation theology; incremental, interactive, evolutionary products; atomization of effort and outputs; multiple impact measures; fetishization of metrics; transparency vs. triviality.