Statistics of usage — Can the figures of 2 or 3 databases be useful in benchmarking?

To compare usage figures, we first compared the lists of databases available at our universities, and we quickly realised that these lists are very different. It is rather surprising that three European health sciences libraries differ so much even in this respect.

Many databases are available at two of the libraries but not the third, and which library is the odd one out varies from database to database. We all provide access to more databases than the two we have chosen here, but usage figures for the others are not available or not comparable for various reasons. PubMed, for example, is naturally widely used at all three universities, but as a freely accessible interface it cannot provide this kind of usage data for us.

[Image: Students using databases at UCL]

The databases we have chosen are SciFinder and Scopus.

  • SciFinder is an information retrieval system for searching several databases in chemistry and related areas. It includes CAPLUS, a literature reference database with references to more than 9 000 journals on chemistry and related subjects as well as patents, technical reports, books, conference proceedings and academic dissertations; CAS REGISTRY, a factual database of over 50 million chemical substances (e.g. trade names, chemical names, chemical structures); CASREACT, which contains 8 million single-step and multi-step chemical reactions; CHEMCATS, a trade catalogue of 7 million chemicals; CHEMLIST, a catalogue of 232 000 regulated chemicals; and MEDLINE, a biomedical journal article reference database.
  • Scopus is a large abstract and citation database of research literature and quality web sources. Scopus covers nearly 19 000 peer-reviewed journals from more than 5 000 publishers and also contains the EMBASE and MEDLINE databases, as well as book series and conference papers. It holds 46 million records. In addition, Scopus covers 433 million quality web sources, including 24 million patents.

This is how the usage figures (number of searches) looked for 2015:

2015                    NTNU       UCL      UEF
SciFinder Searches    28 146    25 847    9 842
Scopus Searches      131 440   182 384   68 752

For 2016 they looked like this:

2016                    NTNU       UCL      UEF
SciFinder Searches    21 097    25 090    9 473
Scopus Searches      143 272   722 761   85 167

How does this compare to the number of potential users? In all, NTNU had about 22 000 students and a staff of 6 700, UCL had 24 000 students and a staff of 5 840, and UEF, about half the size of the other two, had approximately 11 000 students and a staff of 2 800. Not all of them are potential users of the rather specialised SciFinder, but all of them are potential users of the multi-disciplinary Scopus. So here is a comparison of Scopus usage in 2016 per potential user (students and staff combined) at each university:

2016                                  NTNU   UCL   UEF
Scopus Searches per potential user       5    24     5
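
The per-user figure is simply the total number of searches divided by the approximate number of potential users (students plus staff). As a minimal sketch of the calculation in Python, using the figures quoted above (small deviations from the table can come from rounding or from the exact user counts used):

    # Scopus searches in 2016 and approximate potential users (students + staff),
    # as quoted above in this post.
    scopus_2016 = {"NTNU": 143_272, "UCL": 722_761, "UEF": 85_167}
    potential_users = {
        "NTNU": 22_000 + 6_700,
        "UCL": 24_000 + 5_840,
        "UEF": 11_000 + 2_800,
    }

    for library, searches in scopus_2016.items():
        per_user = searches / potential_users[library]
        print(f"{library}: {per_user:.1f} Scopus searches per potential user")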

The difference is striking. What could it mean? Should NTNU and UEF start marketing Scopus more? Probably. Or is there an alternative on their database lists that UCL does not have?

The Web of Science (WoS) is the other multi-disciplinary database that comes to mind. What do the WoS usage figures show?

2016                                          NTNU      UEF
Web of Science Searches                    151 608   65 640
Web of Science Searches per potential user       5        5

At NTNU and UEF, WoS is probably one reason for the lower usage of Scopus, but even the combined Scopus and WoS usage per potential user at NTNU and UEF (roughly 10 searches per user at both) does not come close to the Scopus-only figure at UCL. It is difficult to know what lies behind these figures.

At UCL, as at NTNU and UEF, students are taught to use Scopus during library training sessions. Moreover, research evaluation (for international, national, regional and university funding) and individual academic promotion are largely based on bibliometrics, and the francophone universities in Belgium do not offer WoS (nor does the Belgian science fund, FNRS-FRS).

If we could publish information about how much each university pays for these databases, we might be able to say something about value for money, but at the moment that is not possible. Perhaps the paradigm of open science will progress in that direction too, at least when the funders are the taxpayers. Once again, international comparison and benchmarking is not easy at all. It is complicated and time-consuming, but also interesting and thought-provoking.

[Image: Happy benchmarkers at the NTNU main library building]


Comparing statistical information

At the beginning of the project we collected and shared plenty of statistical information about our libraries and universities. The plan was to compare activities and results. The areas were:

  • Library areas, facilities and equipment
  • Services for the public, including loans, ILL and user training
  • Collection management, bibliographic records
  • Institutional repository
  • Library staff, both number and staff training
  • Financial data

Statistical data gains value when compared with other libraries or with oneself over time, but which statistics can actually be compared? One example is loans. The number of loans is easy to compare, and the figures can be extracted from the library systems. For 2013 we have the following statistics for loans, visits to the libraries and the size of the collections. However, we are missing important data, e.g. on article downloads and the use of e-books.

[Table: loans, library visits and collection sizes for 2013]

*) Includes renewals; at NTNU, for example, first-time loans account for only about 50% of the figure.

**) Applies to the whole institution, not just the medical library. Medicine focuses more on articles than on books, so these numbers are not representative for medicine.

How to compare?

We can observe that NTNU and UCL have quite similar numbers of loans, almost twice as many as UEF, and at the same time NTNU and UCL have almost twice as many visits as UEF. Interlibrary loans (ILL) at NTNU are three times as high as at UCL. Can we see any correlation at all? Interlibrary loans are also easy to compare. The use of collections in medical libraries tends to be dominated by articles, and at the same time the prices of electronic journals are rising faster than the prices of books. This means that no library is able to hold all the journals its users need in its own collection. So ILL can say something about the quality of the library collection, and also about the size of the media budget. An example: the NTNU Library (BMH) spends about 2.5% of its media budget on buying copies, a very small amount when more than 90% of the budget goes to electronic resources and journals.

Many elements affect the statistics and the use of library services. The number of loans must be seen in relation to the size of the universities. The NTNU part of BMH serves 3 000 students and 11 000 staff members at NTNU and St. Olavs Hospital. UCL serves about 6 000 medical students and 760 academic and research staff, and the UEF Library serves a total of 3 000 university staff members and 15 000 students (of whom about 1 400 are medical and dental students). Other elements affecting the statistics are the size of the collection, how up to date it is, acquisitions per year, the number of users, the number of e-books and e-journals, the number of printed books replaced by e-books, and so on.

To be able to compare, we need to use indicators; these will be discussed in a separate blog post. Examples of other useful indicators could be:

  • Number of e-journals / download of articles
  • Number of e-books / number of pages read
  • Loans from library collection / ILL
  • Loans from library collection / number of students

As an example we can calculate the relation between library visits and loans. NTNU has 0.42 loans per visit, while UCL and UEF have 0.17 and 0.1 loans per visit respectively. If this is done for several years, one gets a picture of how user activity in the library develops over time, and comparing libraries then also becomes more meaningful.
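
As a minimal sketch of how such ratio indicators could be computed and tracked over several years, here is a small Python example; the yearly loan, visit, ILL and student figures below are hypothetical placeholders, not our actual statistics:

    # Hypothetical yearly figures for one library (placeholders, not real data).
    stats = {
        2013: {"loans": 42_000, "visits": 100_000, "ill": 3_500, "students": 3_000},
        2014: {"loans": 38_000, "visits": 95_000, "ill": 3_800, "students": 3_000},
    }

    for year, s in sorted(stats.items()):
        print(f"{year}: {s['loans'] / s['visits']:.2f} loans per visit, "
              f"{s['loans'] / s['ill']:.1f} loans per ILL, "
              f"{s['loans'] / s['students']:.1f} loans per student")

Following these ratios year by year would show whether user activity in the library is rising or falling, independently of the absolute size of each library.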

Follow the money

When we started collecting data, we did not know how to compare it. We have lots of data, but not necessarily the most interesting or useful data. It became clear that although all three of us are medical/health sciences libraries serving both a faculty and a university hospital as well as other users, we are neither organised nor financed in the same way. Due to these differences it is difficult to compare economic data. Yet it would be useful, and the library directors are keen to compare both financial and other data. While visiting the three libraries, we had discussions with the library directors and received suggestions for further work on statistics and data. In Trondheim we were encouraged to measure the impact of the library and to look at the connections between quantitative and qualitative indicators. In Louvain-la-Neuve we talked about library statistics and economics, and about the importance of finding at least some comparable indicators nationally and internationally. In Kuopio we discussed statistics and other data as useful background information.

The next step in our project is to find indicators for library performance. More on this topic in another blog post.