Qualitative Data Provides Depth in Resource Evaluation and Negotiation Planning

Katharine Macy, IUPUI University Library


Why we need both quantitative and qualitative data

Much of the data typically discussed in e-resource and journal-package analysis is quantitative. The Organisation for Economic Co-operation and Development (OECD) defines quantitative data as “data expressing a certain quantity, amount or range,” usually accompanied by a unit of measurement. In collections work, quantitative data include usage, cost, cost per use, searches, and so on. The OECD goes on to define qualitative data as

“Data describing the attributes or properties that an object possesses. The properties are categorized into classes that may be assigned numeric values. However, there is no significance to the data values themselves, they simply represent attributes of the object concerned.” 

Gathering qualitative data is necessary for libraries to understand their users and how those users are engaging with scholarly communication. It allows libraries to understand how effective the collection and related services (i.e., research, instruction, outreach) are and informs decision-making and strategic planning, including stakeholder communications.

Qualitative data provide contextual depth when used in concert with quantitative data, explaining the why behind quantitative analysis. Qualitative data collection is a key component of high-quality program evaluation of collection management, which, frankly, is what libraries are doing when they evaluate journal packages and plan negotiations with publishers.

Help! Collecting and maintaining qualitative data feels overwhelming!

If you think that collecting qualitative data for your entire collection is too much and cannot be done, you are probably right. That is why it is important to set priorities and collect data on the e-resources (databases, journal publishers) that would have the most impact. One way to do this is to apply a Pareto analysis, which uses the 80/20 rule. Nisonger (2008) wrote about how this sort of analysis can be used with usage data to set priorities; it can also be applied to subscription cost data (Macy, 2018). For journal packages, libraries can conduct a Pareto analysis by publisher when setting priorities for strategic negotiation planning: determine which 20% of publishers represent 80% of your library’s spend to identify which publishers are most important to tackle first. Applying Pareto analysis to cost allows libraries to prioritize publishers and resources for deeper review, gathering qualitative data that help determine true organizational value beyond cost.
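As a quick illustration, the publisher-level Pareto analysis described above can be sketched in a few lines of Python. The publisher names and dollar amounts below are invented for illustration; in practice the spend figures would come from your acquisitions data.

```python
# Pareto (80/20) analysis of subscription spend by publisher.
# Publisher names and dollar figures are hypothetical.
spend = {
    "Publisher A": 410_000,
    "Publisher B": 260_000,
    "Publisher C": 95_000,
    "Publisher D": 30_000,
    "Publisher E": 18_000,
    "Publisher F": 12_000,
}

total = sum(spend.values())
priority = []   # publishers to review first
running = 0.0

# Sort publishers by spend, largest first, and accumulate
# until roughly 80% of total spend is covered.
for publisher, cost in sorted(spend.items(), key=lambda kv: kv[1], reverse=True):
    if running / total >= 0.80:
        break
    priority.append(publisher)
    running += cost

print(f"{len(priority)}/{len(spend)} publishers cover "
      f"{running / total:.0%} of spend: {priority}")
```

With these invented numbers, two of the six publishers account for just over 80% of spend, so those two would be prioritized for deeper qualitative review.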

Gathering qualitative information

Gathering qualitative data requires collaboration across the library, particularly with librarians who support instruction and research services, as they provide the contacts and outreach needed to gather information from key stakeholders.

When creating a plan to gather this type of data, the first step is to determine what information would be valuable to know and where you might find it. It will likely come from a variety of sources. Common methods for gathering qualitative data are open-ended survey questions and interviews.

Surveys and interviews

When creating the Collection Scorecard at Indiana University–Purdue University Indianapolis (IUPUI), the collections working group (CWG), a small group of librarians focused on collections decision-making, reached out to subject liaisons with some qualitative data already gathered on resources, including content type and competitive products (where available; not really an option with journals) (Macy & Baich, 2019). The CWG asked liaisons to verify and augment this information based on their expertise, as well as to answer, if knowledgeable:

  • How is this used by faculty? [provide context in regard to research and teaching]
  • How is this used in your library instruction? [directed at librarians, e.g., used in instruction sessions, incorporated in LibGuides, etc.]

These data were gathered in a group-maintained spreadsheet for the first iteration of the scorecard. Feedback quality from subject liaisons was mixed. The data will need to be updated periodically, likely via a survey tool in the future.

Sample Survey to Subject Liaisons:

  • What specific disciplines does this resource support?
  • Is this resource used by your faculty for research, teaching and learning, or both?
    • Please describe
  • Is this resource used in specific courses? Yes or No
    • Please list the courses
  • Do you use this resource within your library instruction?
    • Synchronous class sessions (in-person or online)
    • Asynchronous instruction 
    • Incorporated into LibGuides
    • Incorporated into learning objects (e.g., videos, tutorials)
  • [For databases] Do you know of any similar resources or competitive products?
  • What is unique about the content of this resource?
  • Is there any content that is duplicated by another resource? [Overlap analysis is also useful in uncovering this.]
  • Any other information we should know about this resource to understand the value it provides?

A key lesson from this experience was that subject liaisons may need guidance on how to gather this information. Providing tools, including sample communications, can create a positive outreach experience in which data are gathered by email or in person. Draft written communications that can be easily modified to help subject liaisons set up short coffee meetings with key faculty or send out short open-ended interview surveys. Note that in-person interviews, when well facilitated, allow liaisons to probe deeper into responses and gather high-quality information, but they can take significant time to capture. Email surveys, meanwhile, are more convenient and will likely take less time, but you trade off the ability to probe more deeply. One way to mitigate this is to ask the survey taker whether it is okay to follow up at a later date.

Document analysis

Document analysis is another useful method. With regard to teaching and learning, useful documents to examine include course sites (e.g., Canvas, Blackboard, Moodle), to understand which journals are common in reading lists, as well as course assignments and syllabi, for journal recommendations. With regard to scholarly communication and research, find out whether the supported schools/disciplines refer to any particular journal lists in their promotion and tenure processes. (This also presents an opportunity to gather data that can help in planning scholarly communication, open access, and open educational outreach later!) A third option is to ask subject liaisons to search the literature for collection-benchmarking reports on resources used by institutions in specific subject areas.


Another key piece of qualitative data that will be integrated into a future iteration of the Collection Scorecard at IUPUI is accessibility: specifically, is the content provided in an accessible format so that all students, faculty, and staff are able to use it in their scholarly endeavors? German and Hartnett (2017) provide an excellent case study of how Texas A&M University Libraries have approached accessibility and collections, including gathering VPATs (Voluntary Product Accessibility Templates) and using them in the negotiation process. Laura Delancey created a VPAT repository in 2016 to make accessibility documentation more easily findable for libraries.

Revealing price sensitivity using qualitative and quantitative data

Michael Porter (2008) defines price sensitivity as the extent to which buyers are sensitive to price increases. Qualitative data used alongside quantitative data can reveal how sensitive libraries are to price changes according to four factors (Macy, 2018).

Below is a decision matrix for librarian/faculty response (qualitative) versus use data (quantitative).

Quantitative–Qualitative Cancellation Decision Matrix

  • High use, indicated necessary: Subscribe; encourage publishing the articles as open access if it is an option (e.g., provide APC grants, deposit in institutional repositories).
  • High use, not mentioned: Educate faculty on resources that are being used, informing research and/or teaching. This is a positive outreach opportunity for subject librarians.
  • Low use, indicated necessary: May be due to prestige. A change management/communication strategy is needed to help faculty let go of the resource and find alternative access. Pursue avenues such as on-demand access, and cancel if cost per use is higher than the cost of on-demand access or ILL, depending on the level of cut.
  • Low use, not mentioned: Cancel if cost per use is higher than the cost to get articles via ILL.
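The decision logic of the matrix can be sketched as a small function. This is an illustrative sketch, not a prescribed workflow: the $25 default ILL cost and the exact wording of the recommendations are assumptions, and real decisions would weigh the contextual factors discussed throughout this chapter.

```python
# Sketch of the cancellation decision matrix: librarian/faculty
# response (qualitative) crossed with use data (quantitative).
# The ill_cost default of $25 is a hypothetical figure.
def recommend(high_use: bool, indicated_necessary: bool,
              cost_per_use: float, ill_cost: float = 25.0) -> str:
    if high_use and indicated_necessary:
        return "Subscribe; encourage open access publishing options"
    if high_use:
        # Used heavily but not flagged by liaisons/faculty:
        # a positive outreach opportunity.
        return "Keep; educate faculty on the resource"
    if indicated_necessary:
        # Low use but deemed necessary -- possibly prestige-driven.
        if cost_per_use > ill_cost:
            return "Pursue on-demand access; plan change management"
        return "Retain and monitor"
    # Low use and not mentioned by anyone.
    return "Cancel" if cost_per_use > ill_cost else "Retain and monitor"

print(recommend(high_use=False, indicated_necessary=False, cost_per_use=60.0))
```

In practice the cost-per-use figure would come from COUNTER usage and invoice data, and the ILL/on-demand comparison cost from your resource-sharing service.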


Okay, use is low, but librarians/faculty indicate they think it’s necessary…

It is time to explore opportunities for alternative access. Alternative access increases a library's bargaining power because it illuminates two factors that affect price sensitivity: the level of differentiation between products and the level of competition among end users (Macy, 2018).

To reveal alternative-access options, consider conducting an overlap analysis and analyzing the open access availability of historically used journals. If you offer an on-demand access mechanism such as the Get It Now service, explore the cost of switching.
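A back-of-the-envelope switching-cost check might look like the following. All figures (package price, download count, per-article fee) are hypothetical; actual Get It Now or ILL fees vary by publisher and agreement, and not every historical download would convert into an on-demand request.

```python
# Compare annual subscription cost with projected on-demand spend.
# All numbers below are invented for illustration.
subscription_cost = 12_000.0   # annual package price
annual_downloads = 300         # COUNTER use for the titles at risk
per_article_fee = 35.0         # hypothetical on-demand fee per article

cost_per_use = subscription_cost / annual_downloads
on_demand_cost = annual_downloads * per_article_fee

if on_demand_cost < subscription_cost:
    print(f"On-demand (${on_demand_cost:,.0f}/yr) beats subscribing "
          f"(${subscription_cost:,.0f}/yr at ${cost_per_use:.2f}/use)")
else:
    print("Subscription remains the cheaper option")
```

With these invented figures, paying per article would cost $10,500 against a $12,000 subscription, which is exactly the kind of comparison that strengthens a library's position at the negotiating table.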


References

German, E., & Hartnett, E. (2017). Disability inclusion and library collections: Initiatives for greater access for all. In K. P. Strauch, B. R. Bernhardt, L. H. Hinds, & L. Meyer (Eds.), What’s Past is Prologue: Charleston Conference Proceedings, 2017 (pp. 205–210).

Macy, K. V. (2018). Information creates relative bargaining power in vendor negotiations. The Bottom Line, 31(2), 137–149.

Macy, K. V., & Baich, T. (2019, March 3–6). Holistic e-resource analysis to support changing acquisition models [Conference presentation]. Electronic Resources & Libraries Conference, Austin, TX, United States.

Nisonger, T. E. (2008). The “80/20 rule” and core journals. Serials Librarian, 55(1/2), 62–84.

Porter, M. E. (2008). The five competitive forces that shape strategy. Harvard Business Review, 86(1), 78–93.
