Journal of the Medical Library Association : JMLA
. 2015 Jan;103(1):63–64. doi: 10.3163/1536-5050.103.1.019

PlumX

Reviewed by: Robin Champieux 1
Plum Analytics, EBSCO Information Services, 10 Estes Street, Ipswich, MA 01938; team@plumanalytics.com; https://plu.mx; institutional pricing only; contact vendor for pricing.
PMCID: PMC4279944

PlumX is a web-based tool that provides data on the use and impact of research and scholarly products. It belongs to the small but increasingly influential community of altmetric data providers. For those unfamiliar with the term, altmetrics refers to measures of research impact based on online activity—such as saves of papers in Mendeley, downloads, and tweets—and the study and use of these measures [1]. Altmetrics also cover a wide variety of scholarly products, such as articles, patents, datasets, figures, and videos. As measures, altmetrics offer evidence about how and where research is being shared and discussed, and by whom. Increasingly, researchers, funders, and universities are using these data to understand and tell fuller stories about their scientific impact and investments. In addition to being involved in these efforts, libraries and librarians are using altmetric data and research to understand the online tools and spaces that researchers and the general public use to engage with science and scholarship.

To provide a complete overview of PlumX, especially for those unfamiliar with such tools, its main features are described below, organized by: (1) how a subscriber can add and organize its research products for metric tracking, (2) the metrics and data sources that it supplies and mines, and (3) the options and visualizations that it provides for data output and analysis.

Account administrators at the subscribing institution can create profiles in the PlumX dashboard for individual researchers and groups. Groups can represent researcher relationships within different organizations—such as a lab, department, and institute—or collections of research outputs. The associated metrics can be accessed and analyzed at these different levels, making it a relevant tool for multiple audiences. Research products in PlumX are called artifacts and include essentially any kind of research output available online with a unique identifier, such as International Standard Book Number (ISBN), digital object identifier (DOI), or PubMed ID. For example, a researcher's profile can include articles, datasets, figures, patents, and clinical trials. PlumX facilitates batch importing of research outputs through a variety of mechanisms, including ORCID, Scopus and Web of Science research information system (RIS) and BibTex files, SlideShare profile IDs, and Github profile IDs. DOIs, uniform resource locators (URLs), ISBNs, and other unique identifiers can be added to researcher or group profiles as well. Researcher and group pages can include images, biographical information, and contact information. The subscribing institution can choose to make its profile data public or private.
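Because batch import works from standard bibliographic exports, an institution can pull identifiers out of an RIS file (as exported by Scopus or Web of Science) before loading them into a profile. The sketch below is illustrative only: the RIS record is made up, and it simply harvests DOI (`DO`) tags using the standard RIS field syntax; it does not represent PlumX's actual import code.

```python
# Hypothetical RIS record; field tags (TY, TI, AU, DO, ER) follow the
# standard RIS bibliographic format.
ris_text = """TY  - JOUR
TI  - Example article title
AU  - Doe, Jane
DO  - 10.1234/example.5678
ER  -
"""

def extract_dois(ris: str) -> list:
    """Collect the values of DO (DOI) tags from an RIS-formatted string."""
    dois = []
    for line in ris.splitlines():
        if line.startswith("DO  - "):
            dois.append(line[len("DO  - "):].strip())
    return dois

print(extract_dois(ris_text))  # ['10.1234/example.5678']
```

A list of DOIs gathered this way could then be added to a researcher or group profile alongside ISBNs, URLs, and other identifiers.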

PlumX provides five categories of metrics: usage, captures, mentions, social media, and citations. Usage includes downloads, hypertext markup language (HTML) page views, and library holdings. Captures refers to bookmarks, readers, and code forks, among others. Blog posts, comments, and Wikipedia links are classified as mentions. Social media includes likes and tweets, and citations include citation counts from Web of Science and Scopus.

PlumX's data sources include online environments and activities associated with both research and general audiences, facilitating discovery about how a particular study or research finding is being informally and formally discussed in public and scientific circles. Examples of sources include Mendeley, Github, PLOS, Scopus, Web of Science, Delicious, Reddit, Goodreads, Topsy, CrossRef, and Facebook. A complete list of PlumX metrics and metric sources can be found at http://www.plumanalytics.com/metrics.html.

As previously described, researchers and collections of artifacts can be organized into meaningful groups, as defined by the subscribing institution. The artifacts and metrics associated with each group are represented and easily navigated in the PlumX dashboard. At all group levels, an artifact summary is provided graphically that includes counts of research output types, such as articles, presentations, and videos. Metrics can be viewed for each artifact type or across all types. Metrics are grouped by the categories described above (i.e., usage, captures, mentions, social media, and citations), and artifact-level data are by default presented in a clean, tabular format, which can be downloaded as a comma-separated values (CSV) file. The dashboard is interactive, allowing the user to move through different levels of metric data, from categories to specific data sources, a powerful feature for exploring questions about different types of impact. The metrics can also be viewed and similarly navigated via a “sunburst” visualization (Figure 1). This feature allows users to easily spot and investigate artifacts with high traffic across all or particular metric categories.
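The CSV download makes the artifact-level data easy to analyze outside the dashboard. As a minimal sketch, the snippet below tallies a citation column from a CSV export; note that the column names used here ("Title", "DOI", "Citations") are hypothetical, and a real export's headers may differ.

```python
import csv
import io

# Hypothetical PlumX-style CSV export; real column headers may differ.
csv_text = """Title,DOI,Citations
First article,10.1234/a,12
Second article,10.1234/b,3
"""

# Parse the export and sum citation counts across artifacts.
reader = csv.DictReader(io.StringIO(csv_text))
total_citations = sum(int(row["Citations"]) for row in reader)
print(total_citations)  # 15
```

The same approach extends to any of the metric columns, e.g., grouping downloads or Mendeley reader counts by artifact type in a spreadsheet or script.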

Figure 1. PlumX sunburst visualization.


Finally, it is worth noting that PlumX provides both an application programming interface (API) and widgets. The widgets allow institutions, individuals, and groups to embed PlumX data in websites and institutional repositories.

PlumX is a powerful tool: both the data and the user interface can support multiple use cases and research questions. For example, Oregon Health & Science University (OHSU) Library is exploring how to use PlumX to help individual researchers uncover the full impact of their work, especially among different populations. OHSU wants to enable researchers to tell more nuanced stories about their science, especially in the context of funding proposals and the new biosketch format that the National Institutes of Health is piloting. In this sense, successfully using PlumX, and altmetric data more generally, depends heavily on asking the right questions and thinking creatively and analytically about what the metrics indicate.

Reference

  • 1. Priem J. Altmetrics. In: Cronin B, Sugimoto CR, editors. Beyond bibliometrics: harnessing the multidimensional indicators of scholarly impact. Cambridge, MA: MIT Press; 2014. p. 263–81.

Articles from Journal of the Medical Library Association : JMLA are provided here courtesy of Medical Library Association
