Abstract
This paper presents three case studies describing the use of altmetrics across three research-intensive higher education institutions in the UK and USA. We conducted interviews to learn more about how altmetrics are applied by various teams in a university, with a particular focus on the use of tools built by Altmetric, including the Altmetric for Institutions platform. We interviewed a Research Support Services Librarian, a Head of Research Information and a Principal Investigator. This is an exploratory study that seeks to find out more about the general awareness, application and potential of altmetrics across a range of university departments. We identify exemplar use cases for wider adoption in institutions and uncover potential future use cases. The findings of this study are that, although the use of altmetrics is at a relatively early stage compared to traditional bibliometrics, a number of professional groups are using the tools and underlying qualitative data to enhance existing processes. This includes monitoring attention to research papers in previously difficult-to-track sources, reporting to funders, inclusion in grant applications and offering altmetrics alongside existing bibliometrics services. Potential limitations of the study are the small sample size drawn from three large institutions and the use of a single tool. Interviewing a range of researchers across a number of institutions in order to highlight further altmetric use cases would develop the research further. The following case studies, however, offer a detailed insight into current applications of altmetrics and Altmetric for Institutions.
Digital scholarship and the relocation of research dissemination from paper to the web have led to a growing body of conversation about research taking place online in a range of non-traditional sources. The momentum of the alternative metrics or ‘altmetrics’ movement – tracking attention to research papers across sources such as policy documents, news sources, blogs and social media – signals a demand to demonstrate the potential impact of a research paper much more quickly than is possible with traditional citations. Limitations of bibliometrics such as the Journal Impact Factor have also contributed to the need for researchers and higher education institutions to understand online attention to research. As a result, techniques for measuring attention to scholarly papers online are developing fast, with the introduction of altmetrics tools such as Altmetric for Institutions, Impactstory (impactstory.org) and PlumX (plu.mx) providing software that enables users to extract data and evidence of online attention to research. Studies have demonstrated the ways in which bibliometric analysis, particularly the Journal Impact Factor, is limited (Kurmis 2003; Ogden and Bartley 2008) and there have been calls to improve the evaluation process, notably in the San Francisco Declaration on Research Assessment (DORA) (ASCB 2012). In addition, altmetrics open up the possibility of tracking attention to other research outputs such as data sets – providing researchers with an indicator of engagement with research outputs beyond the journal article.
Research funders are increasingly calling on institutions to engage with the public about their research and subsequently demonstrate societal impact outside academia (Wellcome Trust 2014; National Science Foundation 2014; Research Councils UK 2014). Institutions must as a result design effective ways to illustrate and gather evidence about research engagement. The Higher Education Funding Council for England (HEFCE) introduced an impact element to the Research Excellence Framework (REF), which was conducted across UK higher education institutions in 2014. This assessment is fundamental for institutions in securing quality-related (QR) funding over a period of several years. REF 2014 incorporated this analysis of impact by requiring all institutions to provide impact case studies in proportion to the number of staff submitted for assessment. This element accounted for 20% of the overall assessment score. The Wellcome Trust has also discussed the potential of altmetrics to ‘understand the impact of the work they support’, complement existing processes such as peer review and traditional bibliometric analysis, and help inform funding strategy (Dinsmore, Allen, and Dolby 2014). Inclusion of metrics in future iterations of the REF is also under discussion as part of the HEFCE task force undertaking the ‘Independent review of the role of metrics in research assessment’ (HEFCE 2014).
It is important to emphasise that altmetrics signal online activity surrounding research rather than the quality of an output. Data collected by altmetrics providers act as an indicator of attention, not of the quality of the research. We should always dig deeper into the underlying qualitative data to understand the type of attention a scholarly output has received and to gauge the significance and potential impact of that attention.
Altmetric for Institutions
Altmetric for Institutions launched in July 2014, enabling institutions to monitor and report on the reach of, and attention to, research publications. The platform builds on the Altmetric Explorer and enables institutions to measure online conversations about publications by researchers affiliated with their institution. Altmetric is an aggregator of online attention to scholarly papers and each week captures hundreds of thousands of tweets, blog posts, news stories, policy documents and other content that mention scholarly articles, resulting in over 3.5 million scholarly outputs tracked across a variety of sources in the database as of March 2015. Altmetric for Institutions aims to fill the ‘evaluation gap’ (Wouters 2014) in order to demonstrate the broader societal impact of research and present this attention all in one place. It also addresses a number of institution-wide issues, including monitoring non-traditional attention to publications to inform impact analysis and investment decisions. Altmetric for Institutions also supports the gathering of evidence for data-driven assessments both internally and externally. A number of researchers have already discussed using altmetrics data in CVs to ‘tell the whole story about a paper’s influence’ (Kwok 2013).
We conducted three interviews to uncover practical approaches to using altmetrics currently in application in three universities, with a particular focus on the use of Altmetric tools, including the Altmetric for Institutions platform. We interviewed Dr Juergen Wastl from the Research Information Team at the University of Cambridge for the research office perspective. Scott Taylor, Research Services Librarian at the University of Manchester provided the librarian’s perspective. Professor Terrie Moffitt from Duke University was interviewed to present the researcher use case. Interviews were conducted over the phone and via email.
By sharing our findings we demonstrate the use of altmetrics in practice, in order to highlight current use cases and potential new developments for both the application and the development of altmetric tools. The results of this paper are not intended to be exhaustive, given the nature of an exploratory case study across a small sample size. This study takes into account three use cases and focuses on the use of the Altmetric for Institutions tool. Conducting case studies with institutions of varying sizes across a number of tools would develop the research further. In addition, it would be important to highlight altmetrics use cases from additional groups such as communications offices, knowledge exchange professionals and alumni teams, and researchers of different levels and disciplines, from early career researcher to tenured professor.
Case Study 1: The Research Office Perspective
Institution profile: University of Cambridge
The University of Cambridge is one of the oldest institutions in the UK, with more than 18,000 students, 9,000 staff, 1,500 tenured academics, 3,500 contracted research staff, 31 Colleges, and 150 departments, faculties, schools and other institutions. The university received £332m in research grants and contracts in 2012/13 (University of Cambridge 2014) and is the highest UK recipient of European Commission research and technological development funding.
User profile: Dr Juergen Wastl, Head of Research Information, University of Cambridge
Dr Juergen Wastl is Head of Research Information at the University of Cambridge. The Research Information team is part of the Research Strategy Office, in the Academic Division within the institution. Working across all disciplines and faculties, and operating closely with researchers, departmental and research group administrators, network coordinators and research facilitators, the Research Information team is tasked with providing information and metadata relating to submissions for governmental returns. The team also provides administrative support for tools related to managing information about research funding, impact and publications. These include Altmetric for Institutions; Research Professional, a funding opportunities database; and Symplectic Elements, the research publications management system.
In addition, the team provides training and support for administrators and academics, with an initial focus on enabling knowledge transfer within the institution. For example, the team runs train-the-trainer sessions for department administrators, who are then able to cascade further training down to researchers. The team also facilitates reporting at the institution to funders such as Research Councils UK and the Wellcome Trust on the outcomes of research funding grants. As discussed above, funders are increasingly requiring institutions to engage with the public about their research and demonstrate societal impact outside academia. This is an important element of the service provided by the Research Strategy Office.
Awareness of altmetrics
Researchers and staff at the University of Cambridge were introduced to altmetrics via the Altmetric donut and accompanying altmetrics details that appeared alongside each article in the research information management system, Symplectic Elements, used by Cambridge for the REF 2014. Users also came across the donut visualisations on a number of publisher sites, and were keen to learn how they might be able to use altmetric data across the institution. Dr Juergen Wastl also reported there were a number of ‘early adopter’ researchers at various levels who were familiar with altmetrics, and had discussed the potential uses at Cambridge with the Research Information team.
Due to the importance of reporting outcomes to research funders, a key strategy at the University of Cambridge was to improve the way information about research impact could be collected and reported upon. Tasked with gathering and providing evidence surrounding the broader impact of the institution’s research, it was essential there was a process by which to identify and report impact activities to funders using reliable and auditable data. In the early stages, the team found that bringing together this impact evidence was a largely manual process and it was often difficult to locate, track and combine across sources and platforms. It was unclear how much valuable data was being missed, such as mentions of University of Cambridge research in policy documents. This data would then need to be reported to the appropriate funder in the required format. As a result, the team were motivated to provide researchers and administrators with appropriate tools to gather and analyse research impact data effectively. This would enable the institution to present the impact of their work clearly to funders, peers and broader society.
Implementing Altmetric for Institutions
The Research Information team began by looking more closely at the Altmetric data in order to understand its coverage and establish how it could help in reaching their goals. Initially, the team worked to determine the distinction between altmetrics and traditional bibliometrics and the associated limitations, and considered the additional insight altmetrics data could offer the institution in terms of identifying early attention to research papers. The University of Cambridge signed up to be a development partner on the Altmetric for Institutions product and worked closely with the team over a period of five months, giving feedback, testing data and providing directional guidance.
Tracking the impact of research on public policy
The University of Cambridge was particularly supportive of the inclusion of mentions in public policy documents. This, Dr Juergen Wastl noted, was a missing link between research and practice, and showed a measurable impact that the institution had been unable to capture through any other system. This information can now be uncovered and collated in the Altmetric for Institutions platform, and is vital for better evidencing the tangible societal impact of publicly funded research.
The Research Strategy Office has identified over 400 mentions of University of Cambridge research papers in the policy documents tracked by Altmetric to date. In some instances, this research had never been cited or shared elsewhere. In other cases attention was extensive: a paper published in The Lancet in 2014 (Ng et al. 2014) had received 1,258 mentions from a wide range of sources at the time of writing, including a citation in an Oxfam GB Policy and Practice document (Oxfam International 2014), coverage in 66 news outlets and 18 blogs, mentions from over 1,100 tweeters, and 88 Mendeley readers, demonstrating broad attention to the paper across a range of sources. The paper has received 79 citations, according to Scopus data accessed in March 2015. Here, Altmetric data portrays an alternative view of attention to research shortly after publication.
Future planning: Collaboration and benchmarking
The University of Cambridge are keen to encourage inter-departmental working groups, and plan to use the subject filtering in the Altmetric for Institutions platform to identify where there might be opportunities for this analysis. This will help connect academics across different departments researching similar areas and enable opportunities for collaboration.
In a similar vein, the institution is also planning to conduct further network analysis and profiling: who are their departments working with globally, and who is talking about which research? This in turn will give them the opportunity to identify potential areas of development and enable them to adjust their strategy accordingly. For example, Altmetric for Institutions will help researchers to find potential collaborators by searching existing conversations about research and then forging links for research collaborations. The Research Information team intends to map these trends over time and to report on their progress in specific countries or communities.
Data held in Altmetric for Institutions will also enable the University of Cambridge to benchmark research outputs in a number of ways – for example, comparing the volume and type of mentions in policy documents versus those found in traditional journal citations. The team will also be looking to see how specific disciplines, and the institution as a whole, compare to their competitors, making use of the full Altmetric database and all research outputs with online mentions to run this analysis.
Case Study 2: The Library Perspective
Institution Profile: University of Manchester
The University of Manchester is the largest single-site university in the UK, with over 38,000 students, 4,000 academics and 1,975 teaching-only staff. The University is divided into faculties, schools, institutes and hundreds of specialist research groups, all of which undertake multidisciplinary research. The university attracted £280 million in external research funding in 2013/14 (University of Manchester 2014a), from a variety of sources including UK research councils, charities, government departments, industry and commerce, and overseas.
User profile: Scott Taylor
Scott Taylor is a Librarian within the Research Services Team at the University of Manchester Library. The team provides academic support for research tools across the entire institution, and makes it their aim to recognise and implement new innovations that could most benefit research activity. The team provides a well-developed bibliometrics reporting service for researchers with dedicated staff, including reporting on publications data using a range of traditional bibliometric indicators drawn from SciVal. Library staff also provides training and communications support to researchers to advise on best practice when sharing and promoting information about research publications. The service is offered at all levels across the institution, including individual researchers, administrative teams, whole departments and senior management groups.
Awareness of altmetrics
The Research Services team first became aware of the altmetrics movement via professional networks and through discussion on mailing lists. Scott also attended a meeting and demonstration with the Altmetric team, which prompted him to investigate Altmetric for Institutions further, and from there became an early adopter of the tool. The team were keen to extend the existing bibliometrics service and to offer an alternative view of online conversations around research.
A strategic aim of the Research Services team in the University of Manchester Library was to provide ways to identify outreach across Schools and faculty – with a particular interest in using a system that would offer insight into qualitative altmetric data alongside the quantitative numbers and scores. This would allow them to view the underlying mentions of University of Manchester research and understand the type of attention around their outputs. It was also important that the service would enable researchers to find and provide evidence of broader engagement more efficiently, particularly when submitting content for funding bids and when showcasing their research. The University of Manchester aims to position itself as a forward-thinking institution and is often an early adopter of new tools to support research activities. Key performance indicators (University of Manchester 2014b) were developed by central institutional strategy teams as a means to benchmark research activities at the institution. A significant key performance indicator is ensuring 20% of the University of Manchester’s research outputs are in the top 10% of cited papers in their field by 2020.
Table 1 University of Manchester key performance indicators for research performance. Source: University of Manchester, 2014b
Key performance indicators:
1. World ranking: to be in the top 25 of the Shanghai Jiao Tong Academic Ranking of World Universities by 2020.
2. Research grant and contract income: to increase total research income by 30% by 2015 and to double it by 2020, ensuring an increase in both international and business income as a percentage of total income and an increase in Manchester’s share of UK research grant and contract income.
3. Research output quality: to improve the quality of research outputs, ensuring that 70% of staff are judged as world-leading or internationally excellent by peer review through REF or the university’s own exercises, and to ensure that 20% of Manchester publications fall in the top 10% of cited papers in their field by 2020.
4. IP commercialisation: a weighted portfolio of measures monitoring invention disclosures, licences, spin-outs and other IP commercialisation activities, ensuring that the UMI group is also a value-for-money operation.
Table 1 demonstrates how citation counts are integral to the University of Manchester’s measurement of research performance. This key performance indicator sits alongside additional measurable outputs including world rankings, research income and commercialisation ventures. It also reinforces the importance of REF rankings for the institution, with the goal of 70% of staff producing world-leading (4*) or internationally excellent (3*) publications.
Implementing Altmetric for Institutions
Altmetric for Institutions was first launched at the University of Manchester as a pilot project involving a selected group of administrators and faculty across several disciplines, giving the Research Services team the opportunity to explore the Altmetric data and the reporting functionality offered by the platform. This also helped the team to develop an understanding of potential use cases of relevance for their institution in the future. The Altmetric team hosted a workshop to train library staff and researchers on the platform and run through use cases.
Gathering evidence and context
As discussed above, researchers and administrators are facing increasing demands from management and government reviews to provide evidence of the broader impact and value of their work. Altmetric for Institutions is being used at Manchester as a single place to access this data, collated for each of their research outputs – with an at-a-glance summary that helps them identify notable, but perhaps little-cited, items within their own portfolios – an important step in making it easier to evidence non-traditional impact. The Research Services team is considering embedding summary departmental reports from Altmetric for Institutions alongside the traditional bibliometrics reports put together for departments.
A key benefit for University of Manchester is the functionality to see all mentions for the institution, grouped by author and department, surfaced in the Altmetric for Institutions interface. It is this level of qualitative data, instead of just counts of coverage or mentions, and the capacity to search by institutional groupings, which make the information particularly valuable to them. Faculties are able to extract the material to include in impact reports or funding applications to add context and background to their submissions.
Identifying interesting content
As well as their original institutional goals, the Research Services team found the data useful for a number of other applications – including its possible use in identifying journals that see a lot of attention, complementary to citation data when reviewing titles for collection development. Scott Taylor reported: “We can instantly see which journals in our collection have generated a lot of high-impact coverage over the last year, and identify others for faculty to review for future inclusion.” Furthermore, the team plan to use Altmetric for Institutions to identify papers to make open access in their institutional repository, Manchester eScholar Services.
Driving academic engagement
The Academic Engagement Librarians team – responsible for liaison with specific Schools – intends to use the data to demonstrate to researchers the effects of their outreach activity. To this end, the Academic Engagement Librarians have set up automated summary alerts within the University of Manchester’s instance of Altmetric for Institutions, and will alert authors, departments and department heads with information about additional coverage and online attention. The team aims to identify patterns and key sources of attention to help researchers form more effective strategies for future engagement. This will help University of Manchester academics identify key contacts to work with in future to ensure their research findings reach the relevant audiences.
Future planning: Training and reporting
Scott Taylor and the Research Services team now plan to roll out the Altmetric for Institutions platform more widely across all disciplines and faculty. This will be a collaborative effort in conjunction with the Altmetric team to provide training and education to their research staff to ensure a good understanding of what the Altmetric for Institutions data can demonstrate, and will continue to explore how the data can be best incorporated into their reporting structure.
Case Study 3: The Researcher Perspective
Institution Profile: Duke University
Duke University is based in North Carolina, USA and is home to 3,398 faculty, 14,850 students and a total of 35,998 employees, and incorporates the Duke School of Medicine and the Duke School of Nursing (Duke University 2014a). The institution received over $1 billion (£700 million) in external research grants in 2013/14 (Duke University 2014b).
User Profile: Professor Terrie Moffitt
Professor Terrie Moffitt is the Knut Schmidt Nielsen Professor at Duke University, based in the Department of Psychology and Neuroscience and the Department of Psychiatry and Behavioral Sciences. Specialising in the interplay between nature and nurture in the development of problematic behaviours, she manages a team of twelve researchers at Duke and is also Associate Director of the Dunedin Multidisciplinary Health and Development Study in New Zealand. In addition, Professor Moffitt oversees the Environmental Risk Longitudinal Twin Study, which monitors the progress of over 1,000 British families with twins. Professor Moffitt is often required to submit impact statements to funders covering a number of publications from a lab or a funded project.
Awareness of altmetrics
Professor Moffitt first discovered altmetrics when a student recommended the free Altmetric bookmarklet as a way of viewing altmetrics data for individual articles. She had also seen the Altmetric badges embedded in publisher websites. After using the bookmarklet, she contacted the Altmetric team asking how she could view the data for several papers simultaneously, in order to include in a lab progress report.
Interested in demonstrating the reach of her research internationally and simplifying the process of sending data to government funders on an annual basis, Professor Moffitt used the Altmetric for Institutions platform to address some of these pain points. She has included data exported from Altmetric for Institutions in a National Institutes of Health budget report – communicating the attention her research projects have received. She has also used the platform to compare the attention to articles across different research disciplines. Altmetrics are also embedded in her CV.
In addition, Professor Moffitt has used altmetrics data to demonstrate where her research has captured the public imagination as well as having an impact in academic circles. A paper published in the Proceedings of the National Academy of Sciences of the United States of America (Meier et al. 2012) generated significant levels of engagement on Twitter, Facebook, Reddit and YouTube, in news outlets, and in academic reference management systems such as Mendeley. Exposing this data on public engagement with research alongside reference manager readership helps tell a broader story about potential impact.
By reading the underlying qualitative data held in the Altmetric mentions, such as mentions on social media sources, Professor Moffitt differentiates between members of the general public or the academic community when understanding the audience her work has attracted. She said, “we used to envisage the college professor colleague down the hall, but now that we can use altmetrics, we can see what kinds of readers are tweeting and blogging about our papers. We need to communicate with them in a different way.” She also said that being able to identify different types of demographics within an audience can allow researchers to “show evidence that their work is being noticed beyond the ivory tower.” After using Altmetric for Institutions, she was also able to see the international visibility of her research, by viewing the summary reports for articles and using the interactive Twitter demographics map, which enables users to break down Twitter data by country.
Advantages of altmetrics
Professor Moffitt also described how altmetrics provide a useful timesaving tool for researchers, who are under increasing pressure to keep full administrative records of their work and to report to funders and departments on the way in which their research is being shared and disseminated – demands that reduce the time available for research and writing papers. She explained that she finds altmetrics data useful to analyse alongside traditional metrics, as the impression of impact can be more immediate.
Early career researchers
Discussing the use of altmetrics by early career researchers, Professor Moffitt described the way in which new researchers can use the data in order to be recognised for contributions in their field. She argued that early career researchers are often at a disadvantage in terms of representation, as their recent research has not yet accrued high citation counts. Tracking altmetrics data helps maintain enthusiasm for their study programs, and can sometimes provide an ego boost through “rapid positive reinforcement”. Professor Moffitt also described how university promotion or tenure committees can use altmetrics, particularly for early career researchers whose cases may be borderline, to get a sense of how their research is gaining traction.
Future use cases
In terms of planning future use cases for altmetrics data at her institution, Professor Moffitt described how the tools could potentially benefit many other teaching and research departments at Duke University. In order to increase awareness of altmetrics and Altmetric tools, she has given several presentations to research colleagues and students. She also commented, however, that it would be useful to increase the coverage of paper mentions in policy documents released by government organisations, and that the search function could be improved for viewing this type of attention.
Professor Moffitt also recognises potential uses of altmetrics for undergraduate students. The data could be used as a teaching resource, enabling her students to discover useful papers to read using altmetrics as a filter. She plans to set tasks such as searching the Altmetric platform for specific keywords around the subjects they are studying, and encouraging students to create their own secondary reading lists from the articles they find. She explained that this process of hands-on learning would enhance student motivation.
Our case studies provide evidence of altmetrics and Altmetric for Institutions in use at various institutional levels and across professions. We see examples of how altmetrics can provide a more complete indicator of impact when used alongside existing forms of peer review and traditional bibliometric analysis. A recurring example uncovered during our study was using the tool to report to funders – an application discussed by all interviewees. Gathering evidence for funders is clearly a prevalent activity within institutions and creates a driver to improve existing systems and processes in order to report efficiently. Processes to gather, report and normalise this data are of institutional significance as funders begin to consider metrics beyond traditional citations as a means of understanding research engagement.
Extending existing services
By conducting multiple case studies, we found that institutional-level altmetrics platforms can benefit a variety of university departments and enable professional teams to augment current services. Librarians can use the data to extend existing bibliometrics support, particularly when helping faculty compare and promote the impact of research from different departments. Faculty, staff and students can filter the data down to a more individual level, and monitor the attention for specific publications. Research offices can use altmetrics to understand research engagement, report to funders, improve benchmarking activities and promote opportunities to find collaborators.
At the University of Cambridge and the University of Manchester, we saw how altmetrics are embedded in the bibliometrics offering and the funder reporting support offered by each team respectively. Both institutions conduct world-class research, and it is in part the innovative, adaptive approach taken by support services that enables the institutions to draw upon and showcase the rich underlying altmetrics data surrounding their scholarly content. Furthermore, Duke University receives the most research funding of the institutions studied, standing at over $1 billion per year. This brings Professor Moffitt’s motivation to report efficiently to funders into focus, exemplifying the need to access altmetrics data in flexible, aggregated and visualised formats.
Showcasing research engagement
A second core use case identified in the study was using the tool to showcase engagement with research across a range of non-traditional sources. This includes engagement both within academic circles, via metrics aggregated from sources such as Mendeley, and in broader society, for example in news sources, policy documents and social media. We also uncovered further use cases: Professor Moffitt’s description of embedding altmetrics data in teaching activities illustrates how students can use Altmetric for Institutions to curate data-driven reading lists within their assigned syllabus. The studies also demonstrate that altmetrics can provide immediate and relevant data to illustrate broader attention to research. This was evidenced in the University of Cambridge journal article, which attracted hundreds of tracked mentions only four months after publication.
The University of Manchester’s approach to assessing research performance and engagement using key performance indicators also suggests a potential new use case, opening the opportunity to use qualitative altmetrics data to enhance this type of internal assessment. The Manchester Research Services team runs Altmetric for Institutions reports for departments and uses the data to inform collection development, helping the library answer questions and guide purchasing decisions. A further use case identified was using the data to identify popular papers to convert to open access in the institutional repository. The tool also helps Academic Engagement Librarians demonstrate the value of conducting research outreach work to departments, thereby expanding the library’s academic liaison service, which is integral to building relationships with researchers and promoting library support.
Future research should investigate the way in which altmetrics are being applied in higher education institutions more broadly and across a number of tools. It would be valuable to conduct case studies with smaller institutions, for example those with a humanities and social sciences focus, to explore the ways in which altmetrics data and tools can be applied to uncover online attention to non-STEM research outputs. In addition, it would be beneficial to illustrate use cases from additional professional teams in higher education institutions, such as communications offices, knowledge exchange staff and alumni teams. From the researcher perspective, it is also important to understand how applications of altmetrics data and tools differ across career stages and disciplines. Finally, it would be useful to see the tangible outcomes of using altmetrics data at an institutional level, in order to measure demonstrable improvements in institutional benchmarking, researcher funding applications and the success of library research support services.
ASCB. 2012. “San Francisco Declaration on Research Assessment (DORA).” http://www.ascb.org/dora-old/files/SFDeclarationFINAL.pdf.
Dinsmore, Adam, Liz Allen, and Kevin Dolby. 2014. “Alternative Perspectives on Impact: The Potential of ALMs and Altmetrics to Inform Funders about Research Impact.” PLoS Biology 12 (11). Public Library of Science: e1002003. doi:10.1371/journal.pbio.1002003.
Duke University. 2014a. “Quick Facts About Duke.” http://newsoffice.duke.edu/all-about-duke/quick-facts-about-duke.
———. 2014b. “Financial Statements 2013/14.” https://finance.duke.edu/resources/docs/financial_reports.pdf.
HEFCE. 2014. “Independent Review of the Role of Metrics in Research Assessment.” http://www.hefce.ac.uk/whatwedo/rsrch/howfundr/metrics/.
Kurmis, Andrew P. 2003. “Understanding the Limitations of the Journal Impact Factor.” The Journal of Bone and Joint Surgery. American Volume 85-A (12): 2449–54. http://www.ncbi.nlm.nih.gov/pubmed/14668520.
Kwok, Roberta. 2013. “Research Impact: Altmetrics Make Their Mark.” Nature 500 (7463). Nature Publishing Group: 491–93. doi:10.1038/nj7463-491a.
Liu, Jean, and Euan Adie. 2014. “Realising the Potential of Altmetrics within Institutions.” Ariadne, no. 72. http://www.ariadne.ac.uk/issue72/liu-adie.
Meier, Madeline H, Avshalom Caspi, Antony Ambler, HonaLee Harrington, Renate Houts, Richard S E Keefe, Kay McDonald, Aimee Ward, Richie Poulton, and Terrie E Moffitt. 2012. “Persistent Cannabis Users Show Neuropsychological Decline from Childhood to Midlife.” Proceedings of the National Academy of Sciences of the United States of America 109 (40): E2657–64. doi:10.1073/pnas.1206820109.
National Science Foundation. 2014. “Grant Proposal Guide.” http://www.nsf.gov/pubs/policydocs/pappguide/nsf14001/gpg_3.jsp - IIIA2b.
Ng, Marie, Tom Fleming, Margaret Robinson, Blake Thomson, Nicholas Graetz, Christopher Margono, Erin C Mullany, et al. 2014. “Global, Regional, and National Prevalence of Overweight and Obesity in Children and Adults during 1980-2013: A Systematic Analysis for the Global Burden of Disease Study 2013.” Lancet 384 (9945): 766–81. doi:10.1016/S0140-6736(14)60460-8.
Ogden, Trevor L, and David L Bartley. 2008. “The Ups and Downs of Journal Impact Factors.” The Annals of Occupational Hygiene 52 (2): 73–82. doi:10.1093/annhyg/men002.
Oxfam International. 2014. “Hidden Hunger in South Africa.” http://www.oxfam.org/en/research/hidden-hunger-south-africa.
Research Councils UK. 2014. “Pathways to Impact.” http://www.rcuk.ac.uk/ke/impacts/.
University of Cambridge. 2014. “University of Cambridge: Facts and Figures January 2014.” http://www.admin.cam.ac.uk/offices/planning/information/statistics/facts/poster2014.pdf.
University of Manchester. 2014a. “Facts and Figures 2014.” http://documents.manchester.ac.uk/display.aspx?DocID=19310.
———. 2014b. “Manchester 2020: The Strategic Plan for The University of Manchester.” http://documents.manchester.ac.uk/display.aspx?DocID=11953.
Wellcome Trust. 2014. “Public Engagement.” http://www.wellcome.ac.uk/funding/public-engagement/.
Wouters, Paul. 2014. “A Key Challenge: The Evaluation Gap.” The Citation Culture. https://citationculture.wordpress.com/2014/08/28/a-key-challenge-the-evaluation-gap/.
Reviews
These are very useful case studies of how Altmetrics can be used by institutions: just the sort of thing I was looking for but could not find in the literature last year! I particularly value the case study of the researcher's use where the tool enables her to plan how to communicate further about her research, thus not only measuring engagement/impact but also possibly helping to achieve it. The researcher knows best the context of the altmetrics data.
The authors are right to suggest that further research is necessary into the use of other altmetrics tools, as it is noticeable in the paper that all three case studies are using the same company's tool.
From reading this paper, it seems clear that altmetrics have a key role in identifying broader impact of research papers. Some more discussion of how impact is variously defined by funders and evaluators of research, and of how else that impact might be measured, beyond altmetrics and other indicators based on papers (eg other activity undertaken by researchers) would add to the picture.
For those wishing to expand beyond the scope of this particular paper, Kristi Holmes wrote a great blog post here: http://libraryconnect.elsevier.com/articles/2014-05/going-beyond-bibliometric-and-altmetric-counts-u... where she discusses using Plum Analytics and identifying indicators that are beyond the things that are easy to count.
(I once co-presented a webinar with Kristi Holmes, which is where I learnt about her work.)
I hope The Winnower will more prominently feature author conflicts of interest, as this article about Altmetric for Institutions is written by at least one Altmetric employee (the second author has no affiliation that I see). I have no competing interests.
These three case studies are rather limited, but the authors appropriately recognize this and do not make unsupported claims. In addition, despite the conflict of interest, the authors mostly focus on altmetrics (with a small "a") rather than on what this particular tool can do. The authors do a good job describing the advantages of altmetrics, such as capturing inbound links to a variety of research objects (not just articles), and they emphasize that altmetrics are not a proxy for research quality. From an institutional perspective, altmetrics offer a clear advance in uncovering evidence of impact, particularly from policy documents. Some skepticism about finding potential collaborators seems in order: in many institutions this information might already be known to the researcher, and the approach also seems dependent on the subject categorization system. For libraries, the use of altmetrics in identifying open access opportunities is an interesting one, since university PR offices often fail to link to attention-getting articles, or link only to a paywalled version. On the other hand, I'm skeptical that altmetrics will be of much use in collection development. I think the researcher, Dr. Moffitt, may be an exception; I wonder how many researchers will dig into the data, and I'm even more skeptical that T&P committees will use altmetrics directly. However, the authors show that others will be helping researchers in order to save their time, particularly in the research office and library.
In relaying these case studies, the authors cannot be expected to ask the larger questions that came to my mind while reading this. For example, should all research be expected to have some public impact, and in what time frame? Is the increasing quantification of research and researchers a good thing? These are subjects for other articles, but the authors here have done a good job of giving a basic outline of altmetric uses.
This article and its reviews are distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and redistribution in any medium, provided that the original author and source are credited.