Metrics 2.0 for a Science 2.0
Isidro F. Aguillo (isidro.aguillo@csic.es)
Hamburg, 25-26 March 2015
AGENDA
- Science 2.0: integrating and opening the whole research cycle for everyone
- Diagnostics for a new accountability of Science: DORA, bad bibliometrics and a few suggestions
- A new level: from author profiling to rankings and the Ranking Web (Webometrics)
- Metrics 2.0
SCIENCE 2.0
INTEGRATING the whole research cycle and its stakeholders
OPENING the whole set of data, tools, results and metrics
Keywords: globalization, sharing everything with everybody, digital-native scientists, transparency, Open Access, blurring the lines between formal and informal communication, Open Data (Big Data = Open Data), data-intensive science (Big Open Data), new stakeholders, Citizen Science, accountability
A humble suggestion
Big Science is no longer the only provider of Big Data
Hint: easy access to more powerful computing infrastructures is badly needed
Evolving from bibliometrics: current scenario (SWOT analysis)
- Strengths: bibliometrics (tested tools and indicators)
- Weaknesses: altmetrics (trustworthiness, meaning, interpretation)
- Opportunities: Open Access, Big Data, social networks
- Threats: bad bibliometrics, DORA Declaration
Requirements for a quantitative-based assessment
- Multisource: new bibliometric sources, especially from developing countries; non-bibliometric sources too
- Multidimensional: involving all aspects of the research cycle and all facets of communication and impact
- Better attribution: of author roles and contributions; of the resources, funding and policies involved
- Excellence-guided: more relative indicators and ranks, from 25% (quartiles) to Top 1% (see the sketch below)
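A minimal sketch of what "from quartiles to Top 1%" could mean in practice: the share of a unit's papers that reach a global citation threshold. The function name, the threshold logic and the toy numbers are illustrative assumptions, not the slide's own method.

```python
# Illustrative sketch (not from the slides): an "excellence-guided" indicator
# as the share of a unit's papers at or above the world's top-N% citation cutoff.

def top_share(unit_citations, world_citations, top_fraction=0.01):
    """Fraction of the unit's papers reaching the world's top-N% citation threshold."""
    ranked = sorted(world_citations, reverse=True)
    cutoff_index = max(int(len(ranked) * top_fraction) - 1, 0)
    threshold = ranked[cutoff_index]            # citations needed to enter the top N%
    in_top = sum(1 for c in unit_citations if c >= threshold)
    return in_top / len(unit_citations)

# Toy data: 1 of the unit's 5 papers reaches the world top-25% (quartile) cutoff,
# none reach the much stricter top-1% cutoff advocated on the slide.
world = [120, 95, 60, 44, 30, 22, 15, 9, 5, 3, 2, 1]
unit = [95, 30, 9, 5, 1]
print(top_share(unit, world, top_fraction=0.25))   # quartile-style indicator -> 0.2
print(top_share(unit, world, top_fraction=0.01))   # Top 1% indicator -> 0.0
```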
Changing the focus
- Journal-level metrics: the infamous IF, SJR, SNIP
- Article-level metrics: CPP, MNCS, PLoS ALM
- Author profiles: GS Citations, ResearcherID, ResearchGate
- University rankings: Leiden, Scimago, NTU, URAP
I-Metrics
Document-level metrics: title; source (journal/book/patent); publication year; citations; identifier (URL/DOI/handle); subject/tags
Author-level metrics: author(s); institution (affiliation); publication year; discipline/tags; web profiles; Web 2.0 profiles
- BIBLIOMETRICS: number of times the bibliographic record (identifier) of a paper/book is cited in another, similar, formally published paper
- WEBOMETRICS: number of times the URL of a document/author webpage is linked from another webpage
- ALTMETRICS: number of times these elements are mentioned (shared) in websites, wikis, blogs, social bookmarks and networks (incl. Twitter) or search engines
- USAGE METRICS: number of times the document is read/visited/downloaded from its publishing place (incl. websites)
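A rough data-model sketch of the distinction drawn on this slide: the same four count families (bibliometrics, webometrics, altmetrics, usage metrics) can be attached either to a document-level or to an author-level record. All field and class names here are illustrative, not a published schema.

```python
# Hypothetical data model for the slide's "I-Metrics" view; field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class MetricCounts:
    citations: int = 0    # bibliometrics: citations to the bibliographic record
    links: int = 0        # webometrics: links to the document/author URL
    mentions: int = 0     # altmetrics: mentions/shares on the social web
    downloads: int = 0    # usage metrics: reads/visits/downloads at the source

@dataclass
class DocumentRecord:
    title: str
    source: str                 # journal / book / patent
    year: int
    identifier: str             # URL / DOI / handle
    tags: list = field(default_factory=list)
    metrics: MetricCounts = field(default_factory=MetricCounts)

@dataclass
class AuthorRecord:
    name: str
    affiliation: str
    discipline: str
    web_profiles: list = field(default_factory=list)   # incl. Web 2.0 profiles
    metrics: MetricCounts = field(default_factory=MetricCounts)
```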
About changing the altmetrics organization model
Problems:
- An ugly name (webtwometrics?)
- Even worse: a confusing, tangled set of tools of mixed value
A rose is a rose:
- Bibliometrics to bibliometrics: Mendeley, ResearchGate, F1000, ...
- Tool-related naming: Twittermetrics, wikimetrics, ...
Segregate:
- Usage is a very rich environment (visits, visitors, downloads, ...): Usagemetrics!
Another humble suggestion
Bibliometrics issues: (Incomplete) Diagnostics
Bibliometrics issues: The (lack of) Context
Geopolitical biases and discipline biases across papers, citations, patents and altmetrics
Hint: check the ACUMEN EU project proposal for a Personal Portfolio
Bibliometric Issues: The Sources
Bibliometric Issues: The Coverage
Bibliometric Issues: The Authorship Attribution
Three suggestions for the bibliometric community
- Two sets: size-dependent indicators, but add size-independent indicators too!
- Authorship: kill the full count! Pure fractional count? Or move to one size fits all: from 100% to 50% (?) (see the counting sketch below)
- Ranks: composite indicators? From publish-or-perish to stick and carrot
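A minimal sketch of the counting schemes contrasted on the slide: full counting gives each co-author a credit of 1 per paper, pure fractional counting gives 1/n. The "from 100% to 50%" compromise is rendered here as a floor of half a credit on co-authored papers, which is only one possible reading of the slide, not its stated method.

```python
# Illustrative author-credit schemes; the "compromise" rule is my interpretation.

def full_count(n_authors: int) -> float:
    return 1.0                      # every co-author gets full credit

def fractional_count(n_authors: int) -> float:
    return 1.0 / n_authors          # credit is split equally among co-authors

def compromise_count(n_authors: int, floor: float = 0.5) -> float:
    # assumed reading of "from 100% to 50%": never less than half a credit
    return 1.0 if n_authors == 1 else max(1.0 / n_authors, floor)

for n in (1, 2, 5, 50):
    print(n, full_count(n), fractional_count(n), compromise_count(n))
```

The point of the comparison: with full counting a 50-author paper generates 50 credits, with fractional counting exactly 1, which is why the slide calls the full count misleading for hyper-authored papers.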
A controversial suggestion (* papers signed by 2 or more authors)
And perhaps even more controversial (* papers signed by 2 or more organizations)
Adding author-level metrics
- Public author profiles
- Better author attribution
- Citation (& mention) based metrics
A model?
A preliminary assessment
The good:
- One-stop shop
- Bibliometric, altmetric and usage-metrics indicators
- Relative indicators
The ugly:
- Composite indicator
- No API
A second model?
Another assessment
The good:
- Public, free
- Huge coverage
The ugly:
- Incomplete (voluntary)
- Only bibliometric indicators
- Free text for disciplines and affiliations
- No API
- Easy to manipulate
Institutional CRIS as another candidate model
CRIS
The good:
- Public, free
- Complete institutional (self-sourced) coverage
The ugly:
- No standards for comparison purposes
- Basic indicators
- For bureaucrats (and librarians)
Stick and carrot
Adding relative indicators
Example (Isidro F. Aguillo): 56 · 432 · 7.71 · 2 (h >= 38)
Author / discipline-country: 2.18% · 3.78% · 105% · 5.26%
Source: http://www.scimagojr.com/countryrank.php
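A hedged sketch of what a "relative indicator" computes: an author's raw value expressed as a percentage of a discipline/country baseline such as those published in the SCImago country ranks. The baseline value below is made up for illustration, and pairing it with the slide's 105% figure is my assumption.

```python
# Relative indicator: author value as a percentage of a reference baseline.

def relative(value: float, baseline: float) -> float:
    """Express the author's value relative to the discipline-country baseline."""
    return 100.0 * value / baseline

author_cpp = 7.71        # citations per paper, a figure shown on the slide
baseline_cpp = 7.34      # hypothetical discipline-country average (illustrative)
print(f"{relative(author_cpp, baseline_cpp):.0f}%")   # prints 105%
```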
About composite indicators: the rankings
- High-school exercise
- Cut & paste exercise
- Bad bibliometrics
- No composite indicator
Ranking Web: mission and vision
- PERFORMANCE: web presence of the whole university
- VISIBILITY: web impact on huge, global and diverse audiences
- IMPACT: on colleagues and beyond
- COMMITMENT: a feasible way to measure quality in the first mission
- OPENNESS: institutional repositories are the treasure of the university
- EXCELLENCE: top research
- TRANSPARENCY: fighting unethical practices
- INTERNATIONALIZATION: attracting talent
- MULTISOURCE: bibliometrics, altmetrics, webometrics
- A MODEL (1:1) for the composite indicator
- True GLOBAL coverage: 24,000 HEIs
- OPEN: governance, society, industry; access, data
- ONE RANKING: a single ranking for all the missions
Current coverage of the Ranking Web (January 2015 edition): 23,887 institutions worldwide, broken down by region
Sources for the Ranking (2.0) Web
Webometrics:
- GOOGLE: number of webpages under the common web domain
- MAJESTIC / AHREFS: number of backlinks and referring domains to the common domain
Bibliometrics:
- SCIMAGO excellence indicator: top 10% cited papers in 21 disciplines
- GOOGLE SCHOLAR: number of (open) documents
Altmetrics:
- Mentions in Academia, Facebook, LinkedIn, Mendeley, ResearchGate, Scribd, SlideShare, Twitter, Wikipedia (all editions), Wikipedia (English) and YouTube
All combined into a composite indicator (see the sketch below)
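A hedged sketch of how a rank-based composite indicator can be built from several sources. The component names loosely follow the source groups listed above, and the weights are purely illustrative; the slide does not state the Ranking Web's actual weighting scheme.

```python
# Composite indicator as a weighted combination of per-source ranks (lower is better).
# Component names and weights are illustrative assumptions, not the published method.

def composite_rank(ranks: dict, weights: dict) -> float:
    """Weighted sum of a university's ranks across the component sources."""
    return sum(weights[s] * ranks[s] for s in weights)

weights = {"presence": 0.10, "visibility": 0.50,      # illustrative only
           "openness": 0.10, "excellence": 0.30}

universities = {
    "U1": {"presence": 12, "visibility": 3, "openness": 40, "excellence": 8},
    "U2": {"presence": 5,  "visibility": 9, "openness": 12, "excellence": 15},
}

# Final ordering: sort by the composite score (U1 beats U2 with these toy ranks).
for name, ranks in sorted(universities.items(),
                          key=lambda kv: composite_rank(kv[1], weights)):
    print(name, composite_rank(ranks, weights))
```

Combining ranks rather than raw values keeps wildly different sources (backlinks, documents, top-cited papers) on a comparable scale, which is one common design choice for composite indicators of this kind.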
More facts about the Ranking Web (Webometrics)
- Authority: edited by a publicly funded research body, not a private for-profit company
- Scientific: an evolving methodology, not one guided by stability
- Target: it ranks universities, not websites; popularity (visits, visitors) is not taken into account; web design (usability) is irrelevant for the ranking
- Currency: ranks are based on current data, collected every 6 months
Metrics 2.0: recovering the prestige of metrics
- Transparency
- Glocal customisation
- More metrics
- Better profiles
Hint: the Leiden Declaration (in prep) by the European Network of Indicator Designers
Questions? Thank you!
Author identifiers: SaCSbeoAAAAJ (Google Scholar), A-7280-2008 (ResearcherID), 0000-0001-8927-4873 (ORCID), 6507380175, 124106, 3481717, @isidroaguillo (Twitter), IsidroAguillo, isidro-f-aguillo