1. Altmetrics for Measuring
Research Output
Robin Featherstone, MLIS
Research Librarian,
Alberta Research Centre for Health Evidence,
Department of Pediatrics, University of Alberta
@rmfeatherstone
http://www.slideshare.net/featherr
http://www.ualberta.ca/ARCHE/
@arche4evidence
SOHLIN, Fall 2014
2. Conflict of interest disclosure
• I have had free access to the Altmetric.com Explorer since September 2013
3. Image by Nik Papageorgiou: http://theupturnedmicroscope.com
4. Altmetrics (AKA Social Media Metrics,
Alternative Assessment Metrics) …
[…] capture ways in which articles are
disseminated throughout the expanding
scholarly ecosystem, and reach beyond the
scope of traditional trackers and filters.
[…] measure research impact by including
references outside of traditional scholarly
publishing.
1. Public Library of Science (PLOS). Altmetrics. [29 April 2014].
Available from http://article-level-metrics.plos.org/alt-metrics/
2. Baynes G. Scientometrics, bibliometrics, altmetrics: Some introductory advice for the lost and
bemused. Insights. 2012;25(3):311-5. doi: 10.1629/2048-7754.25.3.311.
5. Altmetric tools measure data
collected from:
• Tweets
• Blog mentions
• Facebook posts
• Presentations
• News articles
• Shared citations (e.g., Mendeley, CiteULike)
• Data uploads
• Etc.
8. What's your impact1?
1. Emerald Group Publishing. Impact of Research [11 April 2014].
Available from: http://www.emeraldgrouppublishing.com/authors/impact/index.htm
http://blogs.lse.ac.uk/impactofsocialsciences/2011/07/14/publishers-measuring-impact/
9. Agenda
• Researcher-level vs. article-level altmetrics
• Product overview
• Measurable mentions & unique IDs
• Gaming the system
• Altmetric research
• Altmetrics for biomedical and health sciences
• Future directions: standards, research, assessment
10. Researcher Case Study – Heather Piwowar
• h-index: 8
– 23 articles cited 258 times
http://www.scopus.com/authid/detail.url?authorId=25122628200
11. Researcher Case Study –
Heather Piwowar
• SlideShare followers: 85
http://www.slideshare.net/hpiwowar
• ImpactStory research products: 117
http://impactstory.org/HeatherPiwowar
• Figshare views: 1384
http://figshare.com/authors/Heather_Piwowar/96399
• GitHub contributions: 1410
https://github.com/hpiwowar
• Google Scholar h-index: 13 (51 publications cited 678 times)
http://scholar.google.ca/citations?user=1YLq0XMAAAAJ&hl=en
• ResearchGate score: 14.33
• ResearchGate Impact Points: 50.29
https://www.researchgate.net/profile/Heather_Piwowar2
12. Article Case Study –
Marshall 2013
• Web of Science times cited: 1
• Google Scholar times cited: 21
• Scopus times cited: 6
20. Q: Which Twitter post will be counted by altmetric measurement instruments?
A. Great review on pet ownership for children with
autism by Gretchen Carlisle
B. Great review by @gretchgretch15 on pet ownership
- #autism
C. Great review -
http://dx.doi.org/10.1016/j.pedn.2013.09.005
21. What is gaming?
• Alice has a new paper out. She asks those grad students of hers who blog to write about it.
• Alice has a new paper out. She believes that it contains important information for diabetes patients and so signs up to a "100 retweets for $$$" service.
5. Case examples from: Adie, E. Gaming Altmetrics. [29 April 2014].
Available from http://www.altmetric.com/blog/gaming-altmetrics/
23. Spectrum of social media self-promotion
6. Adie, E. Gaming Altmetrics. [29 April 2014].
Available from http://www.altmetric.com/blog/gaming-altmetrics/
24. Altmetrics research
• PLOS Altmetrics Collection:
http://www.ploscollections.org/altmetrics
• Many studies compared altmetrics with citation counts and Impact Factors
• Positive but weak correlations found
25. Altmetrics in the health sciences
• "Biomedical and health sciences" showed the highest share of publications with Altmetric.com scores7
• Increasing numbers of PubMed citations were being tweeted8:
2010: 2.4% | 2011: 10.9% | 2012: 20.4%
7. Costas R, Zahedi Z, Wouters P. Do altmetrics correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective. arXiv preprint arXiv:1401.4321. 2014.
8. Haustein S, Peters I, Sugimoto CR, Thelwall M, Larivière V. Tweeting biomedicine: An analysis of
tweets and citations in the biomedical literature. Journal of the Association for Information Science
and Technology. 2013.
26. Using Altmetric.com rankings – Pediatric journals
Rank | Score | Article Title | Year | Journal
1 | 1350 | Early Television Exposure and Subsequent Attentional Problems in Children | 2004 | Pediatrics
2 | 558 | Do Television and Electronic Games Predict Children's Psychosocial Adjustment? Longitudinal Research Using the UK Millennium Cohort Study | 2013 | Archives of Disease in Childhood
3 | 521 | Microbial Contamination of Human Milk Purchased via the Internet | 2013 | Pediatrics
4 | 457 | Effective Messages in Vaccine Promotion: A Randomized Trial | 2014 | Pediatrics
5 | 383 | Gun Violence Trends in Movies | 2013 | Pediatrics
28. Using Altmetric.com rankings – ARCHE publications
Rank | Score | Article Title | Journal | Notes
1 | 152 | How confidence intervals become confusion intervals | BMC Medical Research Methodology | One of the highest ever scores in this journal (ranked #1 of 364); article was mentioned by 219 tweeters
2 | 148 | Lifestyle Interventions for Patients With and at Risk for Type 2 Diabetes: A Systematic Review and Meta-analysis | Annals of Internal Medicine | Among the highest ever scored in this journal (ranked #37 of 2470); article was mentioned by 128 tweeters
3 | 122 | Social media use among patients and caregivers: a scoping review | BMJ Open | Among the highest ever scored in this journal (ranked #22 of 1501); article was mentioned by 150 tweeters
31. Future Directions – Product development
• Increase transparency of self-promotion activities
• Implement anti-spamming/anti-gaming software solutions
32. Future Directions – Standards development
• NISO Alternative Assessment Metrics (Altmetrics) Initiative
33. Future Directions – Research
• Is there a correlation between altmetrics rankings and…
– full-text downloads?
– qualitative measures of interest?
– …
34. Future Directions – Assessment
• Will be used by:
– funding agencies
– research centres
– libraries
– Knowledge Translation (KT) researchers
36. Robin Featherstone, MLIS
@rmfeatherstone
feathers@ualberta.ca
Slides available: http://www.slideshare.net/featherr
The Alberta Research Centre for Health Evidence
Edmonton, Alberta, Canada
Editor's Notes
Speaker Bio: Robin Featherstone is an embedded librarian for the Alberta Research Centre for Health Evidence (ARCHE) in the Department of Pediatrics at the University of Alberta. Robin was an academic health sciences librarian at McGill and Western Universities, and a hospital librarian for the McGill University Health Centre. Robin's interest in alternative metrics developed through her work promoting and assessing the impact of her research centre's publications.
Because it is relevant to this presentation, I'll point out my two Twitter handles: one for my research centre (@arche4evidence), and one for myself as a researcher (@rmfeatherstone). It is somewhat fitting that I'm including both handles on this introductory slide. I'll be speaking today about altmetrics as both a researcher who is working to establish my own impact within a community of information scientists, and as an embedded health research librarian who is responsible for the social media presence of my employer and for assessing our centre's social media communication strategies.
The free access Altmetric.com granted me to their Explorer justifies a conflict of interest disclosure. Because of that access, the Explorer is the primary altmetric tool that I use for my work and is (likely) over-represented in the presentation today.
Before I begin, I just want to ask the audience what's missing from this cartoon? Which personality is now a regular presence at conferences?
The tweeters. They might be commenting on the content of the presentation, sharing notes with a wide audience of followers, or just scheduling lunch plans for after the session. But they are a part of the researcher community, and the comments they post are being collected and analyzed. The output and activity of those tweeters is part of what I'm here to discuss this afternoon.
Altmetrics measure research impact by including references outside of traditional scholarly publishing (4). These social web metrics were first proposed in 2010 as a response to scholars moving their work online.
There are limits to what IF and h-index figures (traditional research metrics) can tell us. A junior faculty member may have created and shared hundreds of captivating lectures online but only published a few articles. That teaching is not reflected in their h-index. They may author a widely-followed blog in which they engage with an audience of academic peers, but there is no IF for the blog. As numbers of Twitter followers or Facebook friends quantify social media activity, altmetrics measure and rank researcher output, impact and influence from the social web.
In these definitions of altmetrics, there is an emphasis on getting "beyond" or "outside" the traditional ways of measuring research impact.
Altmetrics synthesize data collected from tweets, blogs, presentations, news articles, comments, or any social commentary about a diverse group of scholarly activities that are captured on the web.
The obvious caveat about altmetrics is that they are only valid and valuable for the most recent publications.
Social media mentions are rare for articles published prior to 2011, and altmetric products often exclude older datasets in their analysis.
The fast pace of social media, with its exponential output, makes it difficult to collect data sets from information telecommunications networks. The numbers are enormous and part of a Big Data challenge. Academic computing gives us better data collection measures, and big storage solutions make analysis possible.
But how else do we measure information that facilitates change? Surveys. Resources permitting, we should always ask human beings. But altmetrics are more cost-effective, and administrators and accountants value numbers they can readily pull from the web.
Nanopublication: an assertion about anything that can be uniquely identified and attributed to its author.
I first heard about the concept of impact zones from Lisa Given, an information scientist and expert on qualitative methodology who studies research impact. Impact zones have also been discussed on the London School of Economics Impact Blog.
The model was developed by Emerald Group Publishing and describes six zones where your research can have impact. Traditional citation metrics really only tell us about one zone: knowledge. One of the reasons that I'm excited about alternative metrics is that I believe they can help us measure impact in some of the other five zones.
Aside from simply defining and justifying alternative metrics for measuring research impact, I'll be presenting case studies that illustrate how different altmetric products measure the impact of researchers and articles. I'll share some tips for making sure your social media mentions are counted by altmetric tools. I'll discuss a particular challenge for altmetric tools: dealing with instances of "gaming" (or manipulating) rankings, and the spam and pay-for-promotion services that inflate online mentions. I'll provide a brief overview of altmetrics research and how I use altmetrics in health research. Finally, I'll describe some future priorities for altmetric products and the opportunities for librarians to use and educate colleagues about altmetrics.
Heather, who is a computer scientist and one of the developers of the altmetric tool ImpactStory, was generous in giving me permission to use her researcher profiles for this case example.
Heather's h-index, one of the recognized impact measurement tools, is eight. That number tells us about the 23 selectively-indexed academic journal articles that she's published and the number of times they've been cited (in selectively-indexed academic journals). But it's not a complete picture of her academic output.
Heather also posts her conference presentations to SlideShare and has 85 followers. She has an impressive number of "research products" that incorporate publications from both inside and outside of traditional academic journals. Her uploads to Figshare have been viewed over 1000 times, and her contributions to data repositories as measured by GitHub are equally impressive. Heather has a Google Scholar profile that showcases citations and publications that are not calculated as part of her h-index. She is part of an online academic community through ResearchGate and has been evaluated highly by her peers. These different altmetric measurements create a very different picture of Heather than her h-index does.
We can see a similar disconnect between traditional and alternative metrics for this article by Joanne Marshall and her colleagues. In the world of health librarianship, this was a highly anticipated publication. It was discussed in my journal club as soon as it was available. But the citation counts don't reflect that high level of interest. These citation counts need time to grow before they can tell us anything meaningful about this article or how it compares to other publications within health librarianship.
The Altmetric.com ranking for the Marshall paper tells a different story. Since the article was published in 2013, 4 people have blogged about it and 63 tweets have mentioned it. Clearly people are discussing and sharing this article. Just a reminder that these are dynamic rankings; the scores for the Marshall paper may be higher today than when I prepared these slides.
As you can probably tell from these two case studies of a researcher and a publication, different altmetric tools tell different stories.
Since we're looking at some Altmetric.com data, I'll start by talking about them. And then move to some of the other major players.
Altmetric.com scores articles with embeddable, donut-shaped badges. Subscription costs vary and the company has been generous to librarians (like myself) in offering free accounts during their start-up phase. Their application programming interface (API) is available for free to any web developer who wants to embed Altmetric.com badges on their site. The company has published extensively on the subject of altmetrics and their data sets are contributing to bibliometric/sociometric/webometric research. Altmetric.com collects and analyzes mentions on social media sites, particularly Twitter and Facebook. The Altmetric.com Explorer searches datasets by keywords and subject headings, but works best with PubMed Identifiers (PMIDs), International Standard Serial Numbers (ISSNs), or Digital Object Identifiers (DOIs).
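As a sketch of what working with that API looks like, the snippet below builds a request URL for the free Altmetric.com DOI endpoint and summarises a response. The endpoint path and field names follow Altmetric's public v1 API documentation, but treat them as assumptions to verify; the sample record is abridged and illustrative, not real data.

```python
# Minimal sketch of querying the free Altmetric.com API by DOI.
# Endpoint and field names are assumptions based on the public v1 API docs.
API_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi):
    """Build the request URL for an article's Altmetric record."""
    return API_BASE + doi

def summarise(record):
    """Pull a few headline numbers out of an API response dict."""
    return {
        "score": record.get("score"),
        "tweeters": record.get("cited_by_tweeters_count", 0),
        "blogs": record.get("cited_by_feeds_count", 0),
    }

# A live call would be roughly:
#   import json, urllib.request
#   record = json.load(urllib.request.urlopen(altmetric_url("10.1016/j.pedn.2013.09.005")))
# Here we use an abridged, invented response instead:
sample = {"score": 152.0, "cited_by_tweeters_count": 219, "cited_by_feeds_count": 4}
print(summarise(sample))
```

Because the endpoint is keyed on DOIs, this also illustrates why the Explorer works best with unique identifiers rather than free-text keywords.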
There are many products, some of which existed (like F1000) before the term "altmetrics" was coined. Currently, the landscape is full of start-ups positioning themselves as investment-worthy knowledge providers. Some altmetric tools tell us about individual articles, others tell us about researchers, and some tell us about institutions. Researcher-focused products, like ImpactStory and ResearchGate, resemble familiar social networking sites in that they rely on contributors creating and maintaining personal profiles.
The evolution from Facebook to LinkedIn to ImpactStory makes logical sense. User-contributed profiles became online resumes and then dynamic curricula vitae with embedded metrics for research products. For researcher-focused altmetric tools, older publications, presentations and products can be manually added. Because these researcher-focused products are more likely to include contributions prior to 2011, they are better suited than article-focused altmetric tools for analyzing research output over time.
ImpactStory.org is an open-source product that connects PMIDs, DOIs, Google Scholar citations, ORCID identifiers (unique researcher identifiers) and SlideShare profiles to count "Research Products." ImpactStory.org creates a free public profile for the individual researcher that includes their wikis and blogs, and praises their Open Access publications with a medal ranking. ImpactStory helps scholars create and disseminate online resumes, in a similar way to LinkedIn.
ResearchGate.net also claims to measure impact in a new way, and ranks "scientific reputation" through their RG Score. ResearchGate hosts an open platform for researchers to share and discuss their work. Products from researcher profiles contribute to RG Scores, as do evaluations of those products by ResearchGate peers. Aggregated RG Scores are also presented for institutions based on member contributions. I find this view of the comparative rankings of academic and scientific institutions based on the cumulative scores of their individual members fascinating. There are Chinese, Brazilian, American, Russian, Swedish, and French institutions in their top ten rankings. ResearchGate really gives you a sense of the international landscape for academic research impact. The results are also customized for Canadian audiences, and will also show me national and North American institutions' rankings.
From researcher- and institution-specific metrics, the altmetric product landscape extends to producers of article-level metrics. One of these article-level providers is Altmetric.com, which I have already described. But another worth mentioning comes from the publisher PLOS.
PLOS Article-Level Metrics examines the overall "performance and reach" of articles, and is available for every article published by PLOS (Public Library of Science). PLOS Article-Level Metrics aggregate usage data (i.e., downloads), citations, ratings, social networking mentions, blogs and media mentions. Like Altmetric.com, PLOS distributes a free API to share their article metrics on third-party websites.
The development and growth of these online academic communities can be seen in the recent launch of PubMed Commons. Not only does it provide a forum for discussion about research, it also gives us a quantitative measurement: we can count the number of comments PubMed citations have received.
One of the criticisms of efforts to measure research impact is that they are overly reliant on quantitative assessments. What we're missing are easily accessible qualitative measures. It's easy to count someone's articles or datasets or the comments they've received on a paper, but it is much more difficult to gather evidence to support the argument that research has had a positive impact on a community.
Anyone in academia knows that the environment is changing and that we're constantly having to justify the importance of our research. In some countries, like Australia and the UK, there are formal mechanisms from government agencies to measure the impact of research funded by taxpayers. For a new generation of researchers, it will be imperative that they share their work. And they will want to make sure citations, mentions and uploads are counted.
As expert searchers of grey literature will recognize, it is a nightmare task to capture every social media mention, tweet, blog comment, SlideShare upload, etc. on a particular researcher or publication. The lack of standardization in social media communication results in questionable data accuracy by altmetric providers, and bibliographic analyses using altmetrics include lengthy discussions of the limitations of their data sources.
To achieve accurate records of scholarly output, altmetric products rely on PMIDs, DOIs and ORCID identifiers. One lesson to take away from this presentation: smart self-promoters include a PMID or DOI when they tweet or blog about their research publications. Including unique identifiers is the best way to ensure that social media output is counted by altmetric products.
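The quiz on slide 20 can be checked mechanically. This sketch uses a simplified DOI pattern of my own (real harvesters also resolve shortened links and match PMIDs, which plain pattern matching does not):

```python
import re

# A pragmatic, simplified DOI pattern: "10.", a registrant prefix, "/", a suffix.
DOI_PATTERN = re.compile(r"10\.\d{4,9}/[-._;()/:a-zA-Z0-9]+")

def has_trackable_id(tweet):
    """Return True if the tweet text contains a DOI an altmetric tool can match."""
    return bool(DOI_PATTERN.search(tweet))

tweets = [
    "Great review on pet ownership for children with autism by Gretchen Carlisle",
    "Great review by @gretchgretch15 on pet ownership - #autism",
    "Great review - http://dx.doi.org/10.1016/j.pedn.2013.09.005",
]
print([has_trackable_id(t) for t in tweets])  # [False, False, True]
```

Only the third tweet carries an identifier a collector can unambiguously match to the article, which is the point of the slide-20 answer.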
Ethically, is there a difference between Alice's two self-promotion activities? I would say yes. The first scenario shows intent to self-promote, but it also adds value to the research community in that the graduate students have an opportunity to comment on and build upon Alice's research through their publications. The second scenario shows intent, but adds no value. Alice is flooding Twitter with empty self-cites.
Instances of "sock puppet" (or fabricated) social media profiles are an inevitable consequence of open platforms that rely on user contributions. Since altmetrics rely on social networking sites, their data are vulnerable to gaming.
Auto-spamming is another case to consider. This is an automated Twitter "bot" set up to tweet the water level in a reservoir in South Africa. Each time it does so, it includes a link to a paper.
The mentions are unintentional (the authors of the papers may be unaware that this program is tweeting the same link to their publication over and over). It's not quite as bad as paying a company to set up "sock puppet" accounts, but it is an example of the kind of spamming which is far more common.
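A crude way to surface this kind of repetition, offered purely as an illustration and not as Altmetric.com's actual anti-gaming logic, is to flag accounts that post the identical mention many times:

```python
from collections import Counter

def flag_repetitive_mentions(mentions, threshold=3):
    """Flag accounts that post the same mention text `threshold` or more times."""
    counts = Counter((account, text) for account, text in mentions)
    return {account for (account, text), n in counts.items() if n >= threshold}

# Invented data mimicking the reservoir-bot scenario
mentions = [
    ("reservoir_bot", "Water level update. See doi:10.1000/example"),
    ("reservoir_bot", "Water level update. See doi:10.1000/example"),
    ("reservoir_bot", "Water level update. See doi:10.1000/example"),
    ("real_reader", "Interesting methods paper: doi:10.1000/example"),
]
print(flag_repetitive_mentions(mentions))  # flags only the bot account
```

Real providers use more sophisticated signals, but even this heuristic separates the bot from the genuine reader in the toy data.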
Self-citing/tweeting/blogging, intentional gaming and unintentional spamming are all challenges for altmetric product developers. There are definitely some grey areas between good and bad social media promotion. Producers, like Altmetric.com, have to draw lines and determine if and how to present questionable data.
I'm very impressed by Altmetric.com's efforts to acknowledge gaming and to develop methods for dealing with it transparently. To be credible, producers must gain trust by weeding out suspicious data. A comparison can be made to researchers using data quality control mechanisms and accounting for those measures in the methods and discussion sections of their papers.
An excellent collection of research using altmetrics data is maintained by PLOS. Numerous investigations highlighted in the collection have focused on the relationship between altmetrics, citation counts and IF scores. Findings from these studies suggest a positive but weak correlation between altmetrics and traditional impact measures. Owing to the variety of altmetrics rankings and the different methods of comparison, these analyses are limited. And while proponents of altmetrics compare new tools against the standard citation measures, the evidence does not support replacing traditional citation metrics with altmetrics. However, information scientists agree that citations and altmetrics measure different types of impact. In a society where social media is pervasive, it means something for a scholarly article to receive a million likes on Facebook, regardless of any eventual citation count.
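To make "positive but weak correlation" concrete, here is a self-contained Spearman rank correlation, the statistic commonly used in these comparisons, run on illustrative altmetric-score and citation-count pairs. The citation counts are invented for the example, not taken from any study.

```python
def rank(values):
    """1-based ranks; tied values share the average of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation computed on the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented pairs: Altmetric-style scores vs. citation counts
scores = [1350, 558, 521, 457, 383, 12, 5, 0]
citations = [8, 120, 1, 40, 0, 6, 3, 15]
print(round(spearman(scores, citations), 2))  # 0.21 -- positive but weak
```

A value near 0.2 is exactly the kind of "positive but weak" association the PLOS collection studies report, and it shows why a high altmetric score does not predict a high citation count for any individual article.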
Altmetrics are not limited to any particular field of study, but they are particularly relevant for the biomedical and health sciences. The article-level altmetrics data show that social media mentions in this area are high and increasing steadily.
I wanted to share a few of the uses of article-level altmetric data for our research centre. One example was a data collection exercise to assist with a research prioritization project. By analyzing the articles with the highest altmetrics scores within pediatric journals, our centre saw that research about television exposure to children was of particular interest during the previous year.
This kind of information provides us with topic areas for knowledge synthesis.
With the Altmetric.com Explorer, we are able to see the dissemination patterns via news media and social networks for a particular article. We can analyze the discourse around that article.
In this case, I was interested in understanding why an outlier article from 2004 continued to be mentioned on social media. I can tell that news agencies were citing and writing about this article long after the original publication date. A news story from 2011 inspired a cascade of media mentions that resulted in people tweeting and blogging and talking about early television exposure through social media. This particular article inspired a lot of discussion around the world.
Whatever you want to call it, I believe this information suggests real impact for researchers, parents, and health providers.
Another example of how we use Altmetrics is to chart the performance of our own publications. This can give us a sense of which areas of research are of particular interest and which journals promote greater social media activity.
In the case of this article about confidence intervals, analysis of the altmetric data shows us high interest from Great Britain and helps us identify communities of researchers around the world who are discussing our research. Based on profile information, we can learn what percentage of our audience for a particular topic are members of the public, scientists, practitioners, or science communicators (i.e., journalists). We can look at individual tweets, blog articles or Facebook mentions and learn more about how research is received by these audiences.
As illustrated by the promotion of impact zones by Emerald Group Publishing, and by PLOS Article-Level Metrics, publishers are among the most ardent altmetrics adopters. Altmetrics' application programming interfaces (APIs) are now a common feature on journals' table of contents pages, as in this example from the Cochrane Library. Publishers will increasingly promote these metrics.
As far as promotion goes, altmetrics are a cost-effective way of showing a publication's value. APIs are also an easy way of making websites dynamic and appealing.
No one blames the researcher who wants to give their publication a little boost with a tweet or a Facebook mention, but the methods for collecting and displaying evidence of researcher output ought to make such activities transparent. Just as self-citing does not qualify as unethical, neither does self-tweeting; but those self-tweets should carry less weight, or be regarded in a separate context from legitimate sharing by unbiased experts. In addition, spamming of data collectors needs to be detected and eliminated by altmetrics providers through transparent methods. Automated programs that "game" the analysis and inflate rankings hamper producers. Anyone using altmetrics should recognize this gaming phenomenon and the potential for research rankings to be artificially inflated.
Among future challenges for altmetrics start-ups will be standardizing methods of counting the online artifacts of research output. An analogy can be made to the need for COUNTER-compliant statistics for journal usage. Without standardization in altmetrics, these tools are just comparing apples to oranges (or pears). A researcher could have over 1000 "products" in one altmetric measurement, but a low "rank" and fewer "artifacts" in another. Only when an altmetric product achieves recognition equivalent to that of the gold-standard IF will deans be able to review tenure dossiers with these values.
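One way to make numbers from different providers comparable, offered here only as my own illustration and not as a NISO recommendation, is to report each value as a percentile within its own provider's distribution rather than as a raw count:

```python
def percentile_rank(value, cohort):
    """Percentage of cohort values at or below `value` -- a provider-neutral scale."""
    at_or_below = sum(1 for v in cohort if v <= value)
    return 100.0 * at_or_below / len(cohort)

# Hypothetical distributions for two providers with incompatible raw units
print(percentile_rank(117, [3, 12, 45, 80, 117, 150]))     # "products" count
print(percentile_rank(14.33, [2.1, 5.0, 9.8, 14.33, 20.0]))  # provider score
```

On this scale, "117 products" and "score 14.33" both become positions within a peer group, which is closer to how the IF is interpreted in practice.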
A more fruitful approach to webometrics research will combine IF and other available datasets with altmetrics to examine social media use and knowledge dissemination strategies. For example, using altmetrics with publisher data could tell us how many times an article was tweeted compared to how many times the full-text was downloaded. We can compare downloads for high and low altmetrics scores, or between and across fields for different social media sites.
Research can also help us to understand what altmetrics are measuring. If they are measuring researcher interest, then qualitative research can confirm that assertion.
Among the potential users of altmetrics for assessment will be funding agencies. Altmetrics are readily available (i.e., cost-effective) quantitative indicators of a return on funding investments. In the case of research centres, altmetrics can help us target promotion activities or even prioritize future research. Librarians can use altmetrics to track the performance of particular journals, or to help make acquisitions decisions. Alternative metrics also provide insight into the results of social media engagement strategies and deserve to be integrated into Knowledge Translation (KT) assessment.
I hope efforts to measure research output will take into consideration an expanded view of what it means to make a positive impact. I also hope that more qualitative measurement methods will complement the overly quantitative landscape of simply counting "mentions," "products," or "activities."
And I hope too that you've enjoyed this introduction to altmetrics. Thank you for your attention, and I'm happy to answer any questions.