Statistics of usage — Can the figures of 2 or 3 databases be useful in benchmarking?

To compare usage figures, we first compared the lists of databases available at our universities. In doing so, we realised that our lists of available databases differ considerably. It is rather surprising that three European health sciences libraries are so dissimilar even in this respect.

Many databases are available at two of our libraries but not the third, and which library lacks a given database varies, so it can be any of the three that does not have a database the other two do. There are more than the two databases we have chosen that all three of us provide access to, but for various reasons the usage figures for the others are unavailable or not comparable. PubMed, for example, is naturally widely used at all three universities, but as a freely accessible interface it cannot provide this kind of usage information.


Students using databases at UCL

The databases we have chosen are SciFinder and Scopus.

  • SciFinder is an information retrieval system for searching several databases in chemistry and related areas. It includes CAPLUS, a literature reference database containing references to more than 9,000 journals on chemistry and related subjects, as well as patents, technical reports, books, conference proceedings, and academic dissertations; CAS REGISTRY, a factual database of over 50 million chemical substances (e.g. trade names, chemical names, chemical structures); CASREACT, which contains 8 million single-step and multi-step chemical reactions; CHEMCATS, a trade catalogue of 7 million chemicals; CHEMLIST, a catalogue of 232,000 regulated chemicals; and MEDLINE, a biomedical journal article reference database.
  • Scopus is a large abstract and citation database of research literature and quality web sources. It offers nearly 19,000 peer-reviewed journals from more than 5,000 publishers, and it also contains the EMBASE and MEDLINE databases as well as book series and conference papers, for a total of 46 million records. Moreover, Scopus covers 433 million quality web sources, including 24 million patents.

This is how the figures of usage (number of searches) look for the year 2015:

2015                  NTNU      UCL      UEF
SciFinder searches    28 146    25 847    9 842
Scopus searches      131 440   182 384   68 752

For the year 2016 they look like this:

2016                  NTNU      UCL      UEF
SciFinder searches    21 097    25 090    9 473
Scopus searches      143 272   722 761   85 167

How does this compare to the number of potential users? In all, NTNU had about 22,000 students and a staff of 6,700; UCL had 24,000 students and a staff of 5,840; and UEF, being half the size of the other two, had approximately 11,000 students and a staff of 2,800. Not all of them are potential users of the rather specialised SciFinder, but all of them are potential users of the multidisciplinary Scopus. So, here is a comparison of Scopus usage in 2016 per student and staff member at each university:

2016                                 NTNU   UCL   UEF
Scopus searches per potential user      5    24     5
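As a rough check, the per-user figures can be reproduced from the totals above. This is a minimal Python sketch using the headcounts quoted in the text (students plus staff); note that the UEF result comes out slightly above the rounded value of 5 in the table, so the post presumably worked from slightly different exact headcounts.

```python
# Scopus searches in 2016 and potential users (students + staff),
# using the figures quoted in the post.
usage_2016 = {"NTNU": 143_272, "UCL": 722_761, "UEF": 85_167}
potential_users = {
    "NTNU": 22_000 + 6_700,   # 28 700
    "UCL": 24_000 + 5_840,    # 29 840
    "UEF": 11_000 + 2_800,    # 13 800
}

for uni, searches in usage_2016.items():
    per_user = searches / potential_users[uni]
    print(f"{uni}: {per_user:.1f} searches per potential user")
```

Even with this crude denominator, the roughly fivefold gap between UCL and the other two universities is clearly visible.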

The difference is significant. What could it mean? Should NTNU and UEF start marketing Scopus more? Probably. Or is there an alternative choice on their list of databases that UCL does not have?

The Web of Science (WoS) is the one that comes to mind as another multidisciplinary database. What do the WoS usage figures show?

2016                                          NTNU      UEF
Web of Science searches                    151 608   65 640
Web of Science searches per potential user       5        5

At NTNU and UEF, WoS is probably one reason for the lower usage of Scopus, but even the combined Scopus and WoS usage per user at NTNU and UEF does not come close to the figure for Scopus alone at UCL. It is difficult to know what lies behind these figures.

At UCL, as at NTNU and UEF, students are taught to use Scopus during library training sessions. Moreover, research evaluation (for international, national, regional and university funding) and individual academic promotion are largely based on bibliometrics, while the francophone universities in Belgium do not offer WoS. (Neither does the Belgian scientific fund, FNRS-FRS.)

If we could publish information about how much each university pays for these databases, we might be able to say something about value for money, but at the moment that is not possible. Maybe the paradigm of open science will progress in that direction, too, at least where the funders are the taxpayers. Once again, comparing and benchmarking internationally is not easy at all. It is complicated and time-consuming, but also interesting and thought-provoking.


Happy benchmarkers at NTNU main library building


Outcomes from ICML+EAHIL2017 workshop

At ICML+EAHIL2017 we facilitated a workshop called Cooperation and benchmarking – finding the value and impact together.

We invited the participants to take part in our benchmarking project. We asked them to help us identify more future-oriented indicators* and also to discuss how (or if) benchmarking can provide tools for creating an evidence base for health librarianship. The goal of the workshop was to find new and exciting ideas to take further. We used brainwriting as a tool to find and refine the ideas.


Preparations the day before.

The first part of the session was spent identifying new ideas for indicators to help measure impact and value for international (health) library benchmarking.

The best ideas for indicators from this brainwriting session, quoted from the post-it notes, were:

  • Number of high “grade” student essays/exam papers in relation to librarian time spent teaching/tutoring
  • How has the literature search been used to change practice?
  • Impact on national health policies index/indicator
  • When host organisation cites the library’s contribution in press releases or publicity
  • What is the new role of a librarian? Non-traditional work
  • Publications from the faculty; visibility in altmetrics
  • Can the customer get the grant he/she applies
  • Time saved by faculty e.g. lecture writing, student remediation
  • Proportion of knowledge syntheses that reach publication
  • Increase in application usage after a conference
  • Chocolate/biscuits/cards — how many gifts (you get from customers)

Identifying new indicators

During the second part of the session the participants discussed how (or if) benchmarking can provide tools for creating an evidence base for health librarianship. There were five questions; the participants came up with lots of ideas and then voted for the ones they liked best. Here are the most popular ideas and proposals, again quoted from the post-it notes:

1. How can benchmarking provide tools for creating an evidence base for health librarianship?

  • Develop indicators which are clearly articulated and detailed to enable consistent application across different services.

2. How can cooperation and benchmarking be seen as research activities?

  • Will help develop “industry standards” to be adopted by the profession.

3. How can cooperation and benchmarking be used for measuring the impact of libraries and librarians?

  • It will give you “evidence” by comparing both qualitative and quantitative measures.

4. How to inspire staff for change?

  • By empowering them, trusting them, and giving them freeway to make decisions.

5. Something else?

  • More national and international collaboration.

The best ideas

We are very grateful to all the participants for working hard and being so active. We hope everyone got something to bring back, found some food for thought, made new connections and was able to extend their professional network.

* ISO 11620 (2014) definition of indicator: Expression used to characterize activities both in quantitative and qualitative terms in order to assess the value of the activities characterized, and the associated method.

Benchmarking workshop in Dublin

ICML+EAHIL2017, Wednesday 14th June 2017, 15:00-16:30, Workshop 5: Cooperation and benchmarking – finding the value and impact together.

In this workshop we invite the participants to take part in a benchmarking project of three health libraries. We want you to help us identify more future-oriented indicators* and also to discuss how (or if) benchmarking can provide tools for creating an evidence base for health librarianship. The goal of the workshop is to find new and exciting ideas to take further. We will use different brainwriting tools to find and refine the ideas.


What is brainwriting?

Brainwriting is an idea-generating method that involves the participants in a group activity. Whereas in the more familiar brainstorming a group generates creative ideas verbally, brainwriting enables the group to generate ideas and solutions on paper. This makes it easier for less vocal people to participate. In the process, the participants build on each other’s ideas, which gives an extra dimension to the discussions.

The basic setup is a group of people sitting together and writing down ideas on index cards or Post-it notes. Participants are invited to consider out-of-the-box ideas. At the end of a set period of time (e.g. 5-10 minutes) the ideas are collected, organised into groups, and presented to the rest of the group. There can then be a second round (or more) to generate and present further ideas.

There are different variations of brainwriting – we plan to use two methods:

  • BrainWriting 6-3-5: The name comes from the process of having 6 people write 3 ideas on Post-It notes in 5 minutes.
  • BrainWriting Pool: Each person, using Post-It notes or small cards, writes down ideas, and places them in the center of the table. Everyone is free to pull out one or more of these ideas for inspiration. Group members can create new ideas, variations or piggyback on existing ideas.

Workshop on benchmarking

 In our session you will discuss and develop two themes:

  1. Identify new kinds/types of indicators – future-oriented instead of based on what has been done – in order to measure impact and value for international (health) library benchmarking.
  2. Our profession benefits from an evidence-based, research-focused foundation. We want you to discuss how (or if) benchmarking can provide tools for creating an evidence base for health librarianship.

If you will be attending our workshop session in Dublin, please introduce yourself briefly (name, organisation, main tasks) by commenting on this post before Tuesday 13th June 2017. If you want, you can also briefly explain why you chose this session.

And remember to bring your brain!


*According to ISO 11620:2014, an indicator is an expression (which can be numeric, symbolic, or verbal) used to characterize activities (events, objects, persons) both in quantitative and qualitative terms in order to assess the value of the activities characterized, and the associated method.