Statistics of usage — Can the figures of 2 or 3 databases be useful in benchmarking?

To compare usage figures, we first compared the lists of databases available at our universities. In doing so, we realised that these lists are very different. It is rather surprising that three European health sciences libraries are not very similar even in this respect.

There are many databases that two of us subscribe to but one does not, and the missing library varies from case to case; it can be any one of the three. We all provide access to more databases than the two we have chosen, but for various reasons usage statistics for the others are either unavailable or not comparable. For example, PubMed is naturally widely used in all three universities, but as a freely accessible interface it cannot provide this kind of usage data for us.

[Image: Students using databases at UCL]

The databases we have chosen are SciFinder and Scopus.

  • SciFinder is an information retrieval system for searching several databases in chemistry and related areas. It includes CAPLUS, a literature reference database with references to more than 9 000 journals on chemistry and related subjects as well as patents, technical reports, books, conference proceedings, and academic dissertations; CAS REGISTRY, a factual database of over 50 million chemical substances (e.g. trade names, chemical names, chemical structures); CASREACT, which contains 8 million single- and multi-step chemical reactions; CHEMCATS, a trade catalogue of 7 million chemicals; CHEMLIST, a catalogue of 232 000 regulated chemicals; and MEDLINE, a biomedical journal article reference database.
  • Scopus is a large abstract and citation database of research literature and quality web sources. Scopus covers nearly 19 000 peer-reviewed journals from more than 5 000 publishers and incorporates the EMBASE and MEDLINE databases. It also includes book series and conference papers, with 46 million records in total. Moreover, Scopus covers 433 million quality web sources, including 24 million patents.

This is how the usage figures (number of searches) looked in 2015:

2015                   NTNU      UCL      UEF
SciFinder Searches   28 146   25 847    9 842
Scopus Searches     131 440  182 384   68 752

For the year 2016 they look like this:

2016                   NTNU      UCL      UEF
SciFinder Searches   21 097   25 090    9 473
Scopus Searches     143 272  722 761   85 167

How does this compare to the number of potential users? In all, NTNU had about 22 000 students and a staff of 6 700, UCL had 24 000 students and a staff of 5 840, and UEF, being half the size of the other two, had approximately 11 000 students and a staff of 2 800. Not all of them are potential users of the rather specialised SciFinder, but they are all potential users of the multi-disciplinary Scopus. So, here is a comparison of Scopus usage in 2016 per potential user (students plus staff) in each university:

2016                                NTNU  UCL  UEF
Scopus Searches per potential user     5   24    5
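
For transparency, here is a minimal sketch of the arithmetic behind this table, dividing the 2016 Scopus search counts by the approximate headcounts quoted above; since the headcounts are only approximations, the results match the rounded table values only roughly.

```python
# Scopus searches per potential user in 2016, where a potential user is
# any student or staff member (approximate headcounts quoted in the text).
potential_users = {
    "NTNU": 22_000 + 6_700,
    "UCL": 24_000 + 5_840,
    "UEF": 11_000 + 2_800,
}
scopus_searches_2016 = {"NTNU": 143_272, "UCL": 722_761, "UEF": 85_167}

for university, searches in scopus_searches_2016.items():
    per_user = searches / potential_users[university]
    print(f"{university}: {per_user:.1f} Scopus searches per potential user")
```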

The difference is significant. What could it mean? Should NTNU and UEF start marketing Scopus more? Probably. Or is there an alternative choice on their list of databases that UCL does not have?

The Web of Science (WoS) is the one that comes to mind as another multi-disciplinary database. What do the WoS usage figures show?

2016                                          NTNU     UEF
Web of Science Searches                    151 608  65 640
Web of Science Searches per potential user       5       5

At NTNU and UEF, WoS is probably one reason for the lower Scopus usage, but even the combined Scopus and WoS usage per user at NTNU and UEF does not come close to the Scopus-only figure at UCL. It is difficult to know what lies behind these figures.
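
As a rough check of that claim, the same calculation can be repeated with Scopus and WoS searches counted together, again using the approximate headcounts quoted above.

```python
# Combined Scopus + Web of Science searches per potential user in 2016,
# compared with UCL's Scopus-only figure (UCL does not offer WoS).
potential_users = {"NTNU": 22_000 + 6_700, "UEF": 11_000 + 2_800}
scopus = {"NTNU": 143_272, "UEF": 85_167}
wos = {"NTNU": 151_608, "UEF": 65_640}

for university in potential_users:
    combined = (scopus[university] + wos[university]) / potential_users[university]
    print(f"{university}: {combined:.1f} combined searches per potential user")

print(f"UCL: {722_761 / (24_000 + 5_840):.1f} Scopus searches per potential user")
```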

At UCL, but also at NTNU and UEF, students are taught to use Scopus during library training sessions. Moreover, research evaluation (for international, national, regional and university funding) and individual academic promotion are largely based on bibliometrics, and the francophone universities in Belgium do not offer WoS (neither does the Belgian scientific fund, FNRS-FRS), which leaves Scopus as the bibliometric tool of choice.

If we could publish information on how much each university pays for these databases, we might be able to say something about value for money, but at the moment that is not possible. Maybe the paradigm of open science will progress in that direction too, at least where the funders are the tax-payers. Once again, comparing and benchmarking internationally is not easy at all. It is complicated and time-consuming but also interesting and thought-provoking.

[Image: Happy benchmarkers at NTNU main library building]


Choosing ISO indicators

As considering ISO (International Organization for Standardization) performance indicators seemed to make sense in a library benchmarking project, we decided to pick some of them from ISO 11620 (2014).

  1. The first step was to read the standard completely and decide, in theory, which indicators could bring useful information.
  2. The second step consisted of applying them to actual data from our libraries.
  3. The third step is to use the results to produce information.

ISO Indicators chosen and discussed

The User Area per Capita indicator stresses the importance of the library as a place for study and meeting and as a learning centre, and indicates the institution's support for these tasks. We decided to consider students only, including PhD students, for this indicator, as they are the main actual users of the physical library.

Staff per Capita assesses the number of library employees per 1 000 members of the population to be served; the amount of work to be done can be considered proportional to the size of that population. For this indicator we decided to count students, including PhD students, as the main actual users of the physical library, plus the academic staff of the faculties and hospitals. Hospital nursing staff also use the library (physically or not), though in our opinion less than academics, so we agreed to add 10 percent of this personnel to the population.
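
To make the agreed population definition concrete, here is a minimal sketch of how the Staff per Capita denominator and the indicator itself could be computed; the headcount figures below are purely illustrative placeholders of our own, not data from any of the three libraries.

```python
# Staff per Capita: library staff (FTE) per 1 000 members of the population
# to be served. The population follows the definition agreed above; all
# numbers are illustrative placeholders, not real data.
students = 11_000               # including PhD students
academic_staff = 1_500          # faculties and hospitals
hospital_nursing_staff = 2_000  # only 10 percent of these are counted
library_staff_fte = 60

population = students + academic_staff + 0.10 * hospital_nursing_staff
staff_per_capita = library_staff_fte / population * 1_000

print(f"Population to be served: {population:,.0f}")
print(f"Library staff per 1 000 capita: {staff_per_capita:.2f}")
```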

The Number of User Attendances at Training Lessons per Capita can be used to assess the library's success in reaching its users through the provision of training lessons. This performance indicator applies to libraries with a defined population to be served; as that population is impossible for us to define here, we decided not to use it.

The User Services Staff as a Percentage of Total Staff indicator can be used to determine the library's effort devoted to public services in relation to background services. User services include the following functions: lending, reference, interlibrary lending, user education, photocopying, shelving, and retrieving items. We decided to use it.
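
The calculation itself is straightforward; here is a minimal sketch, again with illustrative placeholder figures of our own rather than data from any of the three libraries.

```python
# User services staff as a percentage of total library staff (FTE).
# The FTE figures are illustrative placeholders only.
user_services_fte = 18.5  # lending, reference, interlibrary lending, user education, ...
total_staff_fte = 42.0

percentage = user_services_fte / total_staff_fte * 100
print(f"User services staff: {percentage:.1f} % of total staff")
```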

We now have to use our results, interpret them, and formulate recommendations. This will probably be communicated in a paper or a conference presentation in the coming months.